
It's ironic that Wall Street spent so many years saying Bitcoin is worthless, but is buying in now that it has lost its use case to competitors within cryptocurrency and really is useless (except as a speculative vehicle or "store of value"). I guess BTC can fork infinitely, so there's that.

It should _have_ existed, but the idea only came around in about 2015? It's evolving slowly. But if it were "built in", it would be a lot better, of course.

The price of energy would gradually increase, to the point where it is no longer feasible to mine bitcoin with it. That's how markets work: the higher the demand, the higher the price, to the point where demand meets availability*cost.
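
A back-of-the-envelope sketch of that equilibrium (all numbers here are hypothetical, just to show the mechanism): mining stays profitable only while revenue per kWh exceeds the energy price, so rising demand for power prices miners out at the margin.

```python
# Toy profitability model: revenue per kWh minus the energy price per kWh.
# btc_per_kwh (coins mined per kWh spent) is a made-up illustrative figure.
def mining_profit(btc_price, btc_per_kwh, energy_price_per_kwh):
    return btc_price * btc_per_kwh - energy_price_per_kwh

# As demand pushes the energy price up, profit crosses zero and mining stops:
for energy_price in (0.05, 0.10, 0.20):
    print(energy_price, mining_profit(10_000, 1e-5, energy_price))
```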

> to prevent unintended shutdowns and extend battery life

The cynic in me thinks it's actually to encourage new hardware upgrades.

Also a good point - no matter how much you scale "on-chain" you still end up with settlement periods, a few minutes at least. Impractical for offline payments.

Bitcoin is limited to 21 million coins. After that, no more BTC, and people will trade in BTC divisions such as satoshis. But there are hundreds of bitcoin alternatives now, some very interesting.
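
The 21 million figure isn't a constant written down anywhere; it falls out of the halving schedule. A quick sketch (not consensus code) that derives it:

```python
# The block subsidy starts at 50 BTC and halves every 210,000 blocks,
# with integer (satoshi) arithmetic, until it rounds down to zero.
subsidy_sats = 50 * 100_000_000  # initial subsidy, in satoshis
total_sats = 0
while subsidy_sats > 0:
    total_sats += 210_000 * subsidy_sats
    subsidy_sats //= 2  # integer halving, as in the reference client

print(total_sats / 100_000_000)  # just under 21,000,000 BTC
```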

> There have been successful tests of blocksizes of up to 1 GB or 1000x the current size

You don't need to be a genius to just increase the blocksize 1000x and "assume" a gigabit connection, which no one has.

The Lightning Network is a great idea in theory; in practice, however, it will just not work for the problem you mentioned - no one is interested in providing this kind of liquidity.

As for why you need a blockchain at all - it's because the 1st layer is the court and guarantees that you own the money from the channel. You just don't have to ask the court for every coffee payment (i.e. broadcast to every single laptop in the world).
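
A toy model of that idea (names and numbers are made up, and this ignores signatures and dispute handling entirely): only opening and closing a channel touch the "court", while individual payments just update a locally signed state.

```python
# Toy payment channel: off-chain payments mutate a shared balance sheet;
# only close() would be settled on the first layer.
class Channel:
    def __init__(self, alice, bob):
        self.balances = {"alice": alice, "bob": bob}
        self.state_num = 0            # each off-chain update bumps this

    def pay(self, frm, to, amount):   # off-chain: nothing is broadcast
        assert self.balances[frm] >= amount
        self.balances[frm] -= amount
        self.balances[to] += amount
        self.state_num += 1

    def close(self):                  # on-chain: settle the latest state
        return dict(self.balances)

ch = Channel(alice=100, bob=0)
for _ in range(100):                  # 100 coffees, zero broadcasts
    ch.pay("alice", "bob", 1)
print(ch.close())                     # one settlement for all of them
```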

My understanding is that if enough people stop mining, the difficulty will be lowered so that the energy costs fall. Also, there are transaction fees.
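
That's roughly how the retarget rule works. A simplified sketch (not consensus code, which works on compact targets rather than a difficulty float):

```python
# Every 2016 blocks, difficulty scales by target_time / actual_time,
# clamped to a factor of 4 in either direction. If miners leave, blocks
# arrive slower, actual_time grows, and difficulty (and energy cost) falls.
def retarget(difficulty, actual_seconds):
    target_seconds = 2016 * 600          # two weeks at one block per 10 min
    ratio = target_seconds / actual_seconds
    ratio = max(0.25, min(4.0, ratio))   # consensus clamps the adjustment
    return difficulty * ratio

# Half the hashrate leaves -> blocks take ~20 min -> difficulty halves.
print(retarget(1000.0, 2016 * 1200))  # -> 500.0
```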

It's an extremely important point when you consider the water going to things like almonds in California, the majority of which are then exported. Their almond crop alone consumes enough water to supply Los Angeles and San Francisco combined. Only 20% of California water is for non-agricultural purposes.

Or put another way, the total economic value of all exported almonds from California is about $3.5 to $4 billion. The water use is about on par with the entire city of Los Angeles for that almond export crop. The metro area of LA has a trillion dollar economy by comparison (250+ fold larger than the almond exports). That's an economically irrational imbalance.

The obvious question becomes: is that what California wants to do with its scarce water resources, export large parts of it at an extremely low value basis? At some point in the near future it's likely some hard choices will need to be made, including whether to invest extremely large sums of money into more desalination plants or to begin restricting the most voracious water consumers (cattle are another one, although I'm unsure of their export ratios).

That's pretty much what people use the stock market for anyway. Most investors are no Warren Buffett; at best they bet on educated guesses.

College textbooks.

That's not me, but I found those translations very useful a few years ago! Hope it can help someone.

Voluntary Human Extinction is unnecessary.

The Sun will be a red giant in about 5 billion years and fry the Earth anyway.

Yeah, it is funny that ML, which the media hypes as the ultimate job destroyer, would put this generation of programmers in danger as well.

However, the paradigm shift is inevitable: once discovered, people will use it, and use it anywhere possible.

> The day my first was born was the day my motivations and my reason for being shifted.

One of the many reasons I remain childfree.

Project Euler is nice for math problems. It has little real world value for developers. The first ones can be a nice hands on introduction to problem solving.

Another upgrade idea: anti-missile system. Rather than destroying, slowing or deflecting missiles, it would draw a vector in front of each missile as it gets close to you. Higher level could warn you earlier and draw more useful vectors.
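
A quick sketch of the geometry behind that upgrade idea (all names and numbers here are hypothetical, just illustrating the prediction):

```python
# Given a missile's position and velocity, draw a warning vector ahead of
# it along its direction of travel; higher levels look further ahead.
def warning_vector(pos, vel, level):
    lookahead = 0.5 * level          # seconds of prediction per upgrade level
    start = (pos[0] + vel[0] * lookahead, pos[1] + vel[1] * lookahead)
    end = (start[0] + vel[0], start[1] + vel[1])  # unit of travel direction
    return start, end

# A missile at the origin moving right at 10 units/s, warning level 2:
print(warning_vector((0.0, 0.0), (10.0, 0.0), level=2))
```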

I certainly agree with familiarizing yourself with OWASP as a resource and community. It's also helpful to think about what threats you are securing your code from. This might be a good place to start: https://www.hacksplaining.com

Very informative walk-through of a BT security analysis. Thanks for including links to tools and clear code snippets.

Yeah, I am aware of refinement types. You can do a lot of what I am envisioning with just refinement types. Expressing loop invariants with just refinement types is really awkward though.

The main reason why the language I am envisioning is not a functional language is that Idris, Liquid Haskell, ... have already explored many of the related ideas. I want to take ideas from them and bring them into an imperative language, as that hasn't been as well explored.

I don't think it needs to be changed. It's actually really interesting to wonder what Mathematics is going to be like in 2048.

I don't agree Prowler is good against Goliaths. Even if you find one that stays on the edge of the crowd and doesn't use radar, it takes so long to kill a G because of your poor damage potential. It's easy for another player to swoop in and steal your kill. And it only gets worse as Goliath upgrades Defense, Speed, and Energy.

I am the same, though I'm only 30. I no longer seek out puzzles, but rather interesting real world problems to solve. Instead of putting time into a challenging Euler puzzle, you can use React/Rails/NodeJs/Go/Rust/lang du jour to solve a real world problem, helping real people.

Also, as a liberal arts person in programming, Euler is fun in the sense that a well done proof is fun in math. If that is your thing you'll enjoy Euler; otherwise, as in most programming, you will brute force your way to a solution.

More precisely, it is possible to perfectly replicate the payoff of a futures contract (on any asset) up front, even though that payoff is uncertain at the time. Whatever the future’s payoff ends up being at the contract’s maturity, you can replicate it in advance simply by borrowing a certain amount of cash and using it to buy a corresponding amount of the asset underlying the futures contract; if you do this, the value of your holdings in cash (after paying interest on your borrowings) and in the asset are guaranteed to exactly equal the value of the futures contract at maturity, no matter what, so any deviation in the price of the futures contract from the value of the replicating portfolio of cash and asset would create a pure arbitrage. Consequently futures prices are strongly constrained to be within a tight range determined by the price of the underlying asset, the interest rate on cash and the yield on the asset (if any), the time until the maturity of the futures contract, and the cost of transacting in these markets.

This arbitrage relationship is about as hard as they get, so trading futures is tantamount to trading the underlying asset itself, since changes in price in the futures market will be translated directly via arbitrage into changes in price in the spot market for the underlying asset, and vice versa.
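
The standard textbook form of that constraint is the cost-of-carry relation, F = S * exp((r - q) * T). A sketch with made-up numbers, checking that the borrow-and-buy portfolio matches the futures payoff for any terminal price:

```python
import math

# Cost-of-carry fair futures price: spot grown at the interest rate minus
# the asset's yield, over the time to maturity.
def fair_futures_price(spot, rate, asset_yield, years):
    return spot * math.exp((rate - asset_yield) * years)

S, r, q, T = 100.0, 0.05, 0.02, 0.5     # hypothetical spot, rate, yield, maturity
F = fair_futures_price(S, r, q, T)

# Replication check: borrow S, buy the asset. At maturity the asset is worth
# S_T and the net loan balance is S*exp((r-q)*T), so the portfolio pays
# S_T - F, identical to a long futures position, whatever S_T turns out to be.
for S_T in (80.0, 100.0, 120.0):
    portfolio = S_T - S * math.exp((r - q) * T)
    assert abs(portfolio - (S_T - F)) < 1e-9
print(round(F, 4))
```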

So what happens when the entire earth's energy production is required to produce the next bitcoin and there is therefore no further bitcoin mining?

It is beyond my imagination why people would downvote you for asking a genuine question - most of us aren’t even geologists. Search doesn’t always lead us to an authoritative answer.

Anyway, you did raise a good question, although focused on the core.

I have a similar concern with the immediate extraction. We know that there is plenty of water underground we can rely on for many years - critical for drought seasons - but underground overuse (also called underground depletion) can lead to a number of environmental problems such as land subsidence.

The extraction must be done at a reasonable depth below the surface, but what is the rate of the heat influx reaching this depth? Would we deplete this energy before we can replenish it? Is this even a concern?
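
A back-of-the-envelope on the replenishment question (the figures are commonly cited approximations, not authoritative): the mean geothermal heat flux at the surface is tiny per unit area, which is why a plant that extracts heat faster than it flows in can locally cool its reservoir and then has to wait for it to recover.

```python
# Roughly 0.09 W/m^2 of geothermal heat reaches the surface on average;
# multiplied over the Earth's surface area that is a few tens of terawatts.
mean_flux_w_per_m2 = 0.09            # approximate global average flux
earth_surface_m2 = 5.1e14            # approximate surface area of Earth
total_flow_tw = mean_flux_w_per_m2 * earth_surface_m2 / 1e12
print(round(total_flow_tw, 1))       # ~46 TW flowing up globally
```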

Furthermore, if you look at Wikipedia, there are a number of environmental concerns. Although the pollution is far less than what is generated by burning fossil fuels, careful design and location are nonetheless required.

Great talk, with lots of new insights into what's happening at Google. I really think his point that ImageNet is the new MNIST now holds true. Even research labs should be buying DeepLearning11 servers (10 x 1080Ti) for $15k, and training large models in a reasonable amount of time. It may seem that Google are way ahead, but they are just doing synchronous SGD, and it was interesting to see the drop in prediction accuracy from 128 TPU2 cores to 256 TPU2 cores for ImageNet (76 -> 75% accuracy). So, the algorithms for distributed training aren't unknown, and with cheap hardware like the DL11 server, many well-financed research groups can compete with this.
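
For readers unfamiliar with the term: synchronous data-parallel SGD just means each worker computes a gradient on its own shard of the batch, the gradients are averaged, and a single update is applied. A minimal single-process sketch of the scheme (toy linear-regression problem, NumPy only, not Google's actual pipeline):

```python
import numpy as np

# Toy problem: recover true_w from exact linear observations.
rng = np.random.default_rng(0)
X = rng.normal(size=(64, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w

w = np.zeros(3)
lr, n_workers = 0.1, 4
shards = np.array_split(np.arange(64), n_workers)  # one shard per "worker"
for step in range(200):
    grads = []
    for idx in shards:                       # in reality: one GPU/TPU each
        err = X[idx] @ w - y[idx]
        grads.append(2 * X[idx].T @ err / len(idx))
    w -= lr * np.mean(grads, axis=0)         # synchronous averaged update

print(np.round(w, 3))                        # approaches [1.0, -2.0, 0.5]
```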

> stuff such as the vulnerabilities in the DNS resolver because they reimplemented things from scratch without the necessary domain knowledge on how to do that securely that has been collected and implemented by the dozens of existing implementations over the decades

By domain knowledge of how to do this securely in implementations tested over time, do you mean something like https://access.redhat.com/articles/2161461?

It doesn't look like systemd's resolver fares worse in comparison... In addition, it can be sandboxed (and is), an advantage over having a resolver that is part of libc.

These successful tests are research-level developments, just like lightning network.

Bitcoin is extremely early stage technology. It'll take years for a bunch of scaling developments to mature, and increasing the blocksize is just a temporary measure.

The argument of "why not use RStudio" was not at all convincing to me.

RStudio already has vim key bindings, so trying to make vim RStudio-like seems like spending time on the wrong problem. As in, even if you succeed beyond your wildest dreams, you're still not going to compete with RStudio.

Things that make RStudio great that are not so easy to implement include the knitr integration, the R Markdown integration, embedded graphs, and most importantly RStudio Server.

