As for changes actually being implemented, to be honest I haven't seen anything actually substantial since P2SH. The one big change that would benefit everyone now, increasing the block size limit, has been on the table for a year now with absolutely no progress toward pushing it through. If it does come close to happening, then I'll publish a Bitcoin Magazine article cheering it on. For now, though, it seems as far away as ever.
What I generally want to say when I make such statements in blog posts is "We _plan_ to do something that others have not yet _put into practice_"; of course, that just means we're equal and not better, but the point is to say that we're moving quickly and we'll get there soon. Bitcoin is currently a slow-moving target, and given the $5 billion of existing capital stored inside of it, it would be irresponsible to do things any other way; so I think it's unlikely that Bitcoin will develop second-layer scalability protocols first. If you wish to wait for actual results, then that is a philosophy that I very much respect.
I have realized over time that pretty much nothing in Ethereum is new; Turing-complete contracts were in Ripple and Qixcoin (although I was not _thinking of_ either one of those two, and I did not even realize that Ripple contracts were Turing-complete, when I came up with the idea), Patricia tries I got from Alan Reiner back in 2012, all sorts of clever blockchain designs were mulled over on bitcointalk in 2009, and that doesn't even begin to describe the legions of forgotten hackers on cypherpunk mailing lists in the 1990s. A few weeks ago I learned about the concept of "rules engines". And then of course there's Yap stones. Meanwhile, Vertcoin is coming up with a memory-hard proof of work that claims to be revolutionary and powerful but runs into a fundamental scalability issue that I solved months ago with Dagger. So perhaps I do need to tone down my "this is amazing and new" rhetoric; but at the same time I've come to realize that, since we are philosophically similar people attacking similar problems, some degree of collision, whether of the "independent discovery" form or the "heard about it, forgot it, reinvented it without realizing" form, is inevitable.
Where do you think the words "merkle-sum-trees" came from? :)
> Unless you do some crazy ugly hack like creating a separate overlay merkle tree with its root being output 1 of the coinbase, that's a hard-forking protocol change.
No crazy hack is required. You just include a commitment to a merkle-sum-tree of transaction values alongside the UTXO commitment. It doesn't have to commit to the transactions themselves; it's just a tree of values. There is no loss of efficiency, and you don't even have to transmit the data normally, since all full nodes already have it.
> As for changes actually being implemented, to be honest I haven't seen anything actually substantial since P2SH
Well, that's a step back from the position you took above that it can't ever change at all. Now the complaint is that the changes have been neither substantial nor frequent enough. ::shrugs::
> that would benefit everyone
It's highly debatable that it would benefit everyone now; we're certainly not up against the limit. People are still using the blockchain in very inefficient ways, and the ecosystem of tools to increase efficiency hasn't developed yet. At the same time, the count of full nodes is falling, and increasing the cost of running one right now may not be a good strategy.
> so I think it's unlikely that Bitcoin will develop second-layer scalability protocols first
I don't have any interest in being first. I'd much rather have a well-designed and considered approach. Unfortunately, so far, none of the alt-systems, even ones which raised millions of dollars of funding, have developed anything that turned out to be useful to implement in Bitcoin. Maybe that will change.