
Intel’s former chief architect: Moore’s law will be dead within a decade - shawndumas
http://www.extremetech.com/computing/165331-intels-former-chief-architect-moores-law-will-be-dead-within-a-decade
======
bediger4000
I'm curious about what happens to the art of programming when Moore's Law goes
away.

Pretty clearly, programmers (as a profession) have let Moore's Law carry the
ball. We've quit writing assembler, we've quit writing C, and we've moved to
interpreted languages. If you're an Arch Linux user, you'll have noticed the
10x or 100x size difference between a GUI system and a text-interface one. So
even in terms of user interface, we've let Moore's Law carry the water.

Once clock-speed gains, multi-threading, and parallel programming stop paying
off, do we go back down the abstraction chain? Do we start programming in C or
assembler again to gain additional speed? Do we all give up (at least to some
extent) on GUI programs?
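To put a rough number on the interpreted-vs-compiled gap being discussed here: a minimal sketch (plain CPython, `timeit` from the standard library; the numbers are illustrative, not a rigorous benchmark) comparing a hand-written Python loop against the same reduction done by the interpreter's C-implemented builtin:

```python
import timeit

N = 1_000_000
data = list(range(N))

def py_sum(xs):
    # Pure-Python loop: every addition goes through the bytecode interpreter.
    total = 0
    for x in xs:
        total += x
    return total

# The builtin sum() performs the same reduction in C inside the interpreter.
t_py = timeit.timeit(lambda: py_sum(data), number=10)
t_c = timeit.timeit(lambda: sum(data), number=10)

print(f"pure Python: {t_py:.3f}s  builtin sum: {t_c:.3f}s  "
      f"ratio ~{t_py / t_c:.0f}x")
```

On a typical CPython build the pure-Python loop comes out several times slower for identical results; that interpreter overhead is the headroom you reclaim by moving down the abstraction chain to C or assembler.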

~~~
conexions
Herb Sutter wrote a good article on this a few years ago. It was written in
2005 and centers mostly on multi-threading and parallel programming, but it's
definitely worth a read:
[http://www.gotw.ca/publications/concurrency-ddj.htm](http://www.gotw.ca/publications/concurrency-ddj.htm)

