Intel might eventually find itself in Kodak's shoes - when your primary business dries up, there's unlikely to be a follow-on successful enough to keep the company going.
We've hit the physical limits of our current processes, but we haven't remotely hit the physical limits of... well, physics.
It's possible that we are on the flat before the next curve starts. Whether that curve will reach the consumer space I don't know, but at the top end of computing (supercomputers) there is still massive demand for more processing power.
Next, how exactly are you going to make these atom-sized objects in quantity? You can't be manually placing atoms. But lithography has a hard physical diffraction limit, forcing the use of higher-frequency light. The higher the frequency, the greater the energy. Etching single-atom features will require hard x-rays, which we may not even have the technology to generate. Even if we do, focusing x-rays is not trivial, as you can't just slap in a glass lens. Plus, the focal lengths may be rather long. And what kind of photoresist do you use? Hard x-rays are likely to blast pretty much anything, and the etching characteristics are probably not neat little troughs. How do you ensure your one atom stays put when everything around it is being blasted by high-energy x-rays?
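To put a rough number on why shorter wavelengths get forced on you, here's a back-of-the-envelope sketch using the Rayleigh resolution criterion (minimum feature ~ k1 * wavelength / NA). The k1 and NA values are generic textbook-style figures, not a claim about any particular process:

    # Rough Rayleigh-criterion sketch: minimum printable feature ~= k1 * wavelength / NA.
    # k1 and NA are illustrative values only, not any specific tool or process.
    def min_feature_nm(wavelength_nm, k1=0.25, na=0.9):
        return k1 * wavelength_nm / na

    for name, wl in [("ArF (193 nm)", 193.0), ("EUV (13.5 nm)", 13.5), ("hard x-ray (~0.1 nm)", 0.1)]:
        print(f"{name:>20}: ~{min_feature_nm(wl):.2f} nm features")

Even with generous assumptions, getting anywhere near single-atom features puts you well past EUV and into hard x-ray territory.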
I think the physical limits are looming pretty large.
I'm optimistic because I've seen "end of progress" reports on computing power since I was a kid in the '80s.
We are a long way away from this: https://en.wikipedia.org/wiki/Bremermann's_limit
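For a sense of scale, here is a quick calculation of that limit (c^2 / h bits per second per kilogram of matter, per the linked page):

    # Bremermann's limit: maximum computation rate ~ c^2 / h bits per second per kg of mass.
    c = 2.998e8        # speed of light, m/s
    h = 6.626e-34      # Planck constant, J*s
    limit_bits_per_s_per_kg = c**2 / h
    print(f"~{limit_bits_per_s_per_kg:.2e} bits/s per kg")   # roughly 1.36e50

Current hardware is dozens of orders of magnitude short of that figure.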
We've known for ages that germanium and similar materials offer better characteristics than silicon, but the cost to improve silicon has stayed below the cost to retool for germanium until now. If we do hit the limit at 5nm silicon, the cost equation changes and alternative materials become worth the investment.
There are always good uses for more computing power, but not on the desktop where the big volumes are.
1. Force Windows 10 upgrade.
2. Windows 10 sucks on this computer!
3. Buy Windows 10 compatible computer.
Why do you say so? Do you work at Microsoft?
Anecdotally: I upgraded a somewhat older laptop to Windows 10 and found that the screen brightness was locked to full. Called up support and was told that no, there's no fix and no, there won't be one for the foreseeable future - mine isn't a "supported system", so it has to stay on Windows 7. Looked up the issue online and found many others with the same problem going back to release day.
A lot of future technologies that rely on exponential increases in silicon capability will not be possible. It's almost impossible to replicate an exponential once the technology driving it runs out.
The rate of progress will slow and look more like that of more mature industries such as aerospace.
The only thing I can see that is going to benefit a lot from more CPU power is more advanced AI or machine learning built in. But we are only seeing very gradual moves in this area now.
VR and Games are also candidates of course, but most people aren't hard core gamers. They are fine with Angry Birds ;-)
And with the bandwidth we have today, it wouldn't necessarily be a big problem to offload much of that processing to servers and do it as a service instead. In many ways we might be heading back to the dumb clients of old, where the majority of processing happens on a remote server.
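As a sketch of what that looks like from the thin-client side, here is a minimal example; the endpoint URL and the payload shape are made up purely for illustration:

    # Minimal "dumb client" sketch: ship the heavy work to a remote service and
    # just display the result. The URL and JSON shape are hypothetical.
    import json
    import urllib.request

    payload = json.dumps({"task": "render_scene", "quality": "high"}).encode()
    req = urllib.request.Request(
        "https://example.com/api/compute",   # hypothetical compute service
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        result = json.load(resp)
    print(result)

The client needs almost no local horsepower; all it does is serialize a request and render whatever comes back.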
If there were consumer-grade 16- and 32-core Intel processors at reasonable prices, I would consider buying them for the sake of certain experiments. It would save me from dealing with CUDA or OpenCL. But right now that's the domain of pricey high-end server stuff.
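For the kind of experiment hinted at here, plain CPU parallelism is attractive precisely because it needs no GPU toolchain. A minimal sketch using Python's standard multiprocessing pool, where simulate() is just a placeholder workload:

    # Sketch: spreading an embarrassingly parallel experiment across many CPU cores,
    # no CUDA/OpenCL required. simulate() stands in for real per-task work.
    from multiprocessing import Pool, cpu_count

    def simulate(seed):
        # placeholder workload: cheap pseudo-random walk
        x, total = seed, 0
        for _ in range(100_000):
            x = (1103515245 * x + 12345) % (2**31)
            total += x % 3 - 1
        return total

    if __name__ == "__main__":
        with Pool(cpu_count()) as pool:          # uses all 16/32 cores if present
            results = pool.map(simulate, range(256))
        print(len(results), "tasks done")

The same code scales from a quad-core laptop to a 32-core desktop with no changes.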
The issue is creating compelling reasons for buying a new PC. Intel/Microsoft have not been very successful at promoting those sales-drivers.
An interesting aspect of Intel culture is how it reacts to body blows, urgently seeking a path out of the darkness along many different paths simultaneously. Remember the legend of "operation crush" -- the sales force was tasked with getting 1000 design wins for the 8088, from whatever place they could find them. Some guy in Florida found a renegade group in Boca Raton working on a skunkworks project called the "PC". Meh, it didn't sound like much, but hey, only 999 more design wins left to go, eh? Bag the order and get on the road again... whoever that guy was, I sure hope he made his quota that quarter.
Plus, the last few laptops I've purchased all seem to die JUST after the warranties (extended or otherwise) expire.
Are they finally at least putting 1080p+ screens on these as a standard thing, or is everyone still stuck on that crummy 1366x768 regression?
I went to a computer store recently and told them that I was interested in dropping about 2k on a laptop, and I brought a list of specs (at least 16GB of RAM, lots of cores, etc.). The salesperson suggested I should get a gaming laptop and never offered anything from his own store.
It will be interesting to see what the future holds, because they're already getting close to the physical limits of what they can build in terms of processors. If they start slowing down financially, it may be that we'll still be using computers about as fast as today's 7-10 years from now.
No, that's wrong. They simply don't need to use 4 or 8 cores.
Let's take the segment of the market with an NVidia GPU. They have CUDA, one of the most polished development environments for GPGPU today.
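To give a flavor of how approachable GPGPU has become from a high-level language, here is a minimal vector-add kernel written with Numba's CUDA bindings in Python (it assumes an NVIDIA GPU and the numba package are available); a sketch, not a benchmark:

    # Minimal GPGPU sketch using Numba's CUDA support (needs an NVIDIA GPU + numba installed).
    import numpy as np
    from numba import cuda

    @cuda.jit
    def vector_add(a, b, out):
        i = cuda.grid(1)                 # global thread index
        if i < out.size:
            out[i] = a[i] + b[i]

    n = 1_000_000
    a = np.random.rand(n).astype(np.float32)
    b = np.random.rand(n).astype(np.float32)
    out = np.zeros_like(a)

    threads = 256
    blocks = (n + threads - 1) // threads
    vector_add[blocks, threads](a, b, out)   # Numba handles host<->device copies here
    print(np.allclose(out, a + b))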
When was the last time you wrote something in CUDA? What was it for?