

State of Ethereum: August Edition - panarky
https://blog.ethereum.org/2014/08/27/state-ethereum-august-edition/

======
qnr
I'm reasonably sure that Ethereum will be a failure because of a fundamental
incompatibility between speculators (enticed by the promise of getting rich as
early adopters, just like those Bitcoin folks from 2010) and developers (who
actually increase the value of the platform).

I have a novel idea for a decentralized application which I think might take
off. There is no way in hell I will build it on Ethereum, where it will
disproportionately benefit "early adopters" doing nothing and sitting on their
butts. (Instead I will likely fork an existing codebase such as NXT or one of
the lighter altcoins.)

I believe many developers interested in building actual apps (not the 10-line
code snippets shown as examples on the website) feel the same.

To build on the analogy from an earlier Ethereum thread, that's like Google
selling developer versions of Google Glass but because of speculators buying
them in bulk, the price is $100,000 per item. As a result, no one actually
develops for the platform.

~~~
abrkn
The early speculators in the cryptocurrency space are developers.

------
nl
The biggest unaddressed problem I see with Ethereum is the potential of the
blockchain growing _very_ large. There has been some discussion of this (on
the Ethereum blog and elsewhere), but these discussions focus on showing that
the blockchain will remain secure at very large sizes.

I'm more interested in the operational mechanics of having to deal with a very
large block chain. The figure of 100 TB was used in one blog post, and that is
a challenging amount of data to deal with for anyone.

What I haven't seen is any progress on any proposals to either provide
incentives to limit the size, or ideas around splitting the chain.

~~~
vbuterin
I briefly mentioned Ethereum 1.5 at the end of my blog post; I'll soon cover
some clever ways of creating quasi-scalable dapps using the ETH blockchain as
a usually-inexpensive security backend.

Believe me, consensus and scalability are the two largest problems on my mind.
We don't have nice solutions (yet :) ) because the problems are fundamentally
hard.

------
mabbo
"The block time is decreased from 60 seconds to 12 seconds"

Now this, I like. I went to an Ethereum meetup in Toronto a few months back,
and the block time was the only question I had for the speaker. Reducing the
block time to something reasonably quick will make usable end-user
applications possible.

~~~
0x0
I thought the risk of a split chain was tied mostly to the wall clock, not the
block count? Would you not still need the equivalent of 6 BTC block
confirmations aka 60 minutes of confirmations?

~~~
vbuterin
No, the block count is the only thing that matters. If an attacker has a
fraction X of the hashpower, the probability that they will succeed in a 51%
attack after N confirmations is (X / (1 - X))^N; you can derive this formula
yourself via generating functions. The block time appears nowhere in that
analysis.
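
The formula above is easy to check numerically. A minimal sketch (the
hashpower fractions and confirmation counts below are illustrative, not from
the post):

```python
def attack_success_probability(x: float, n: int) -> float:
    """Probability that an attacker with hashpower fraction x (< 0.5)
    eventually overtakes a chain that is n confirmations ahead:
    (x / (1 - x)) ** n -- note the block time appears nowhere."""
    return (x / (1 - x)) ** n

# With 30% of the hashpower, 6 confirmations are already quite safe:
print(attack_success_probability(0.30, 6))   # ~0.0062
# Doubling the confirmations squares the ratio, shrinking the risk fast:
print(attack_success_probability(0.30, 12))  # ~3.8e-05
```

Since the ratio x/(1-x) is below 1 for any minority attacker, the success
probability decays geometrically in N regardless of how fast blocks arrive.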

~~~
0x0
But with a shorter block time, they can make more attempts in the same time
span... Even if each attempt has a low probability of success on its own,
wouldn't the higher number of attempts make it more likely to succeed at least
once?

I was led to believe the purpose of long confirmation times was to reduce the
risk of an attacker mining an alternative chain in secret (to be revealed
later as a hijack and a double spend), by making it unlikely that they could
sustain a high hashrate (multiplied by "luck"?) for that long.

~~~
Bubbles99
Above and beyond that, the effect of short block times is that centralized
miners gain a significant advantage over decentralized ones. The current block
propagation latency of the Bitcoin network is on the order of several seconds.
Since the current mining pools are among the best connected (many of them have
private peering and significant numbers of edge router nodes), we can assume
that the majority of the network is even worse connected than this. In some
testing, the gap between the top 4 pools' Stratum sockets announcing a new
block was often up to 3 seconds.

With a 12-second block time in Bitcoin, the network would likely collapse down
to just the biggest pool, as everybody else would have a disproportionately
high number of orphan blocks. Sensible miners would need to move to the
biggest pool in order to avoid significant double-digit percentage losses.
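
A common back-of-envelope model makes the orphan argument concrete (the model
is my assumption, not from the comment; the 3-second delay figure is the one
quoted above): if blocks arrive as a Poisson process with mean interval T and
a miner hears about new blocks d seconds late, the fraction of their blocks
that end up stale is roughly 1 - e^(-d/T).

```python
import math

def stale_rate(delay_s: float, block_time_s: float) -> float:
    """Approximate fraction of a miner's blocks that are orphaned when their
    view of the chain lags by delay_s seconds, assuming Poisson block
    arrivals with mean interval block_time_s."""
    return 1 - math.exp(-delay_s / block_time_s)

# A 3 s propagation lag is negligible at Bitcoin's 600 s block time...
print(f"{stale_rate(3, 600):.1%}")  # ~0.5%
# ...but severe at a 12 s block time:
print(f"{stale_rate(3, 12):.1%}")   # ~22.1%
```

The same 3-second lag that costs a poorly connected miner half a percent at a
10-minute block time costs over a fifth of their revenue at 12 seconds, which
is the "double-digit percentage losses" pressure toward the biggest pool.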

> I thought the risk of a split chain was tied mostly to the wall clock, not
> the block count?

That's correct. The number of blocks considered safe is, by itself,
irrelevant; it only denotes an amount of wall-clock time. An hour for 6
confirmations in Bitcoin wasn't chosen because it was 6 blocks long, but
because it is a reasonable tradeoff between convenience and security. If the
block time had been 5 minutes, you would need 12 confirmations to get the same
level of confidence in a transaction. I don't understand why the Ethereum
developers appear to think otherwise.
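
Under this wall-clock view, the confirmation count is just however many blocks
span a fixed "safe" interval. A quick illustrative calculation (the one-hour
target is the Bitcoin convention cited above; the 12 s row assumes Ethereum's
proposed block time):

```python
def confirmations_for(target_seconds: int, block_time_seconds: int) -> int:
    """Number of confirmations needed to span a fixed wall-clock interval,
    rounded up (ceiling division)."""
    return -(-target_seconds // block_time_seconds)

HOUR = 3600
print(confirmations_for(HOUR, 600))  # 6   (Bitcoin's 10-minute blocks)
print(confirmations_for(HOUR, 300))  # 12  (hypothetical 5-minute blocks)
print(confirmations_for(HOUR, 12))   # 300 (proposed 12 s blocks)
```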

------
rthomas6
>Currently, the biggest threat to our approach is likely some kind of rapidly
switching FPGA.

Finally, a project for my area of focus. Man that would be fun.

~~~
jamoes
It's interesting how much work they're putting into making their mining
process ASIC-resistant. Their approach of building a Turing-complete scripting
system into the blockchain, which miners must be able to execute, is
definitely interesting. But it's also very complicated. I'll be interested to
see if they can successfully implement it.

In my opinion, a much more viable approach to ASIC-resistance - which is
already implemented and live today - is proof-of-capacity hard-drive based
mining. This is an approach to mining which Microsoft proposed in a research
paper last year [1]. It's recently been implemented by an altcoin called
Burstcoin [2].

I should disclose that I've been mining BURST, and therefore am financially
interested in seeing it succeed. However I also genuinely believe it is
interesting technology, and I'd like to see some sort of proof-of-capacity
mining succeed purely because of the technical merits.

The proof-of-capacity mining process basically works by pre-generating as many
"plots" as possible on your hard drive. Every plot is like a lottery ticket
for a lottery that takes place once per block. If you have the luckiest plot
for a block, you get to claim that block. After the initial plot generation,
the mining process is very energy efficient, since it only requires a small
number of hard-drive reads per block. It's also ASIC-resistant and likely to
remain very decentralized because of the wide distribution of hard-drive
storage capacity.
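
The lottery mechanic described above can be sketched as a toy (this
illustrates the general idea only; it is not Burstcoin's actual plotting or
deadline algorithm, and the account/seed values are made up):

```python
import hashlib

def plot(account_id: str, nonce: int) -> bytes:
    """Pre-generate one 'plot' (lottery ticket). In real PoC mining this is
    an expensive, disk-resident structure computed once up front."""
    return hashlib.sha256(f"{account_id}:{nonce}".encode()).digest()

def ticket_score(block_seed: bytes, p: bytes) -> int:
    """Each block, every stored plot is scored against the block's seed;
    the lowest score anywhere on the network claims the block."""
    return int.from_bytes(hashlib.sha256(block_seed + p).digest(), "big")

# Plotting: one-time CPU/disk cost, done before mining starts.
my_plots = [plot("miner-1", n) for n in range(1000)]

# "Mining" a block: just cheap reads and hashes over the stored plots.
block_seed = hashlib.sha256(b"previous-block-hash").digest()
best = min(my_plots, key=lambda p: ticket_score(block_seed, p))
```

More disk means more plots, which means more lottery tickets per block; after
plotting, the per-block work is a scan over stored data rather than brute-force
hashing, which is where the energy efficiency comes from.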

[1]
[https://research.microsoft.com/pubs/217984/permacoin.pdf](https://research.microsoft.com/pubs/217984/permacoin.pdf)

[2]
[https://bitcointalk.org/index.php?topic=731923.0](https://bitcointalk.org/index.php?topic=731923.0)

