
It's entirely possible, and I certainly hope so. But it hasn't happened yet. The history so far suggests that better technology will generally be ignored unless there's a clear way to make money from it now, which tends to favor incremental improvements and makes early mistakes impossible to ever correct.



I think it's happening already:

We are seeing innovations in GPUs and now TPUs/IPUs (AI coprocessors) that deviate significantly from the classic instruction-stream-based architecture, and understanding how to coordinate the parallel dataflow and computation is what it's all about.

That seems to be where all the high-performance work is trending, and there's a lot of money in it because of the level of interest in current-generation AI/ML and its many applications.

Programming those devices is kind of clunky at the moment, but 50 years is a long time, during which I'd expect significant advances in tooling, backed by all that money.
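
To give a flavour of where the tooling already is, here's a minimal sketch using JAX (just one possible toolchain, not the only way to do it): the Python function is traced into a dataflow graph and compiled by XLA for whatever accelerator happens to be present, rather than being run as an explicit stream of instructions.

    import jax
    import jax.numpy as jnp

    # jax.jit traces the function into a dataflow graph and hands it to the
    # XLA compiler, which schedules the parallel work on a GPU/TPU (or falls
    # back to CPU); there's no per-instruction stream for you to manage.
    @jax.jit
    def step(w, x):
        return jnp.tanh(x @ w)

    w = jnp.ones((512, 512))
    x = jnp.ones((8, 512))
    y = step(w, x)  # compiled on first call, cached and reused after that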

Then you have all sorts of changes going on in software architecture: mostly-declarative, mostly-functional, and reactive styles of program (or of system - serverless, microservices, pub/sub backends, etc.), and large distributed systems where it becomes increasingly necessary to combine "classic" coding techniques with transaction-carrying logic to keep them reliable.
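
As a toy sketch of that last point (all names made up, not any particular framework): the handler is a pure function of the event, and the event carries a transaction id, so a message redelivered by the broker is applied at most once.

    # Toy pub/sub consumer: pure handler + transaction id carried on the event,
    # so duplicate deliveries are harmless (the apply step is idempotent).
    processed_txns = set()

    def handle(event):
        txn_id = event["txn_id"]
        if txn_id in processed_txns:  # redelivered message: ignore it
            return None
        processed_txns.add(txn_id)
        return {"order_id": event["order_id"], "status": "confirmed"}

    handle({"txn_id": "t-1", "order_id": "o-42"})
    handle({"txn_id": "t-1", "order_id": "o-42"})  # replay has no effect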

I think that, between the drive for new architectures to support high-performance computation (especially with complex, non-linear memory access patterns) and the amount of resources going into making them more usable, the newer types of processors will become increasingly versatile. We'll find ourselves shifting more and more "conventional" workloads onto them, partly because the capacity is there, and partly because the "external" distributed-systems problems we see at data-centre scale happen to be similar to the "internal" distributed-systems problems already being solved inside the large new devices. Those devices can, incidentally, talk to each other to form larger versions of themselves with similar logical guarantees.


I wouldn't bet on current funding patterns lasting for the next 50 years. An AI winter is probably coming soon.

All your other points I agree with; however, the higher-level counterargument is that all these things have been "just around the corner", and the tyranny of the von Neumann architecture has been on its last legs, for about 50 years now, yet it is somehow still with us despite many opportunities for both hardware- and tooling-based alternatives. Anyway, here's hoping!



