And we'd be shooting ourselves in the foot to do so. If America is forced to use only clunky, corporate-owned American AI for a fee, we'll very quickly fall behind competitors worldwide who use DeepSeek models to produce better results at a fraction of the cost.
Not to mention it'd defeat the whole purpose of a "free market" economy. (Not that that means much of anything anymore)
It never meant anything. There's no such thing as a free market economy. We haven't had one of those in modern times, and arguably human civilization has never had one. Markets and their participants have chronically been subject to information asymmetry, coercion/manipulation, and regulation, among other things.
I don't think all of that is a bad thing (regulation tends to make it harder to do the first two things), but "free markets" are the economic equivalent of the "point mass" in physics: sometimes useful for building simple models and explanations, but something that will never exist in the real world.
Yes, technically and pedantically you are correct.
But restricting the trade in microchips only because the USA is afraid it will lose a technical and commercial edge is a long, long way from a free market.
It is too late anyway. China has broken out and is already ahead in many fields. Refusing to trade chips with them will just push them to build their own foundries, and in two decades they will be as far ahead there as they are in many other fields.
If the USA traded with them, the technological capacities of China and the USA would stay matched, as each would help the other: China ahead in some areas, the USA ahead in others.
That would still (probably) not be a pure free market, but it would be a freer market, and better for everybody except a few elites (on both sides).
The Nvidia export restrictions might be shooting us in the foot too, or at least shooting Nvidia. They benefit enormously from CUDA remaining the de facto standard.
The Nvidia export restrictions have already harmed Nvidia. DeepSeek-R1 is efficient on compute and was trained on older, weaker cards because DeepSeek couldn't get hold of Nvidia's most cutting-edge tech, so they were forced to innovate instead of just brute-forcing performance with faster cards. That directly contributed to Nvidia's stock crashing over the last week.
That's true, but I mean it'd be a hundred times worse if DeepSeek did this on non-Nvidia cards. Which seems like only a matter of time if we're going to keep doing this.
> Which seems like only a matter of time if we're going to keep doing this.
What's funny is that people have been saying this since OpenCL was announced, but today we're actually in a worse spot than we were 10 years ago. China too: their inability to replicate EUV advancements has left their lithography in a terrible place.
There's a reason China is literally dependent on Nvidia for competitive hardware: it's their window into export-controlled TSMC wafers and complex streaming multiprocessor designs. Losing access to Nvidia hardware isn't an option for them (or the United States, for that matter), which is why the US pushes so hard for export controls. There is no alternative.
Well, I wasn't optimistic about OpenCL in the past, because nobody would bother with it when they could pay a little (or even a lot) more for Nvidia and use CUDA. Even though OpenCL might work in theory with whatever tools someone is using, it's unpopular and therefore less supported. But this time is different.
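To be concrete about "works in theory": the kernel dialect was never the problem. Here's a minimal OpenCL vector add via the pyopencl bindings (just a sketch, assuming you have the pyopencl package and a working OpenCL driver installed):

```python
import numpy as np
import pyopencl as cl

# Grab any available OpenCL device and a command queue for it.
ctx = cl.create_some_context()
queue = cl.CommandQueue(ctx)

# The kernel dialect is nearly line-for-line CUDA C: __kernel vs __global__,
# get_global_id(0) vs blockIdx.x * blockDim.x + threadIdx.x.
program = cl.Program(ctx, """
__kernel void vec_add(__global const float *a,
                      __global const float *b,
                      __global float *c) {
    int i = get_global_id(0);
    c[i] = a[i] + b[i];
}
""").build()

n = 1 << 20
a = np.random.rand(n).astype(np.float32)
b = np.random.rand(n).astype(np.float32)

# Explicit buffer management is where OpenCL's host-side verbosity starts.
mf = cl.mem_flags
a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
c_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

program.vec_add(queue, (n,), None, a_buf, b_buf, c_buf)
c = np.empty_like(a)
cl.enqueue_copy(queue, c, c_buf)
assert np.allclose(c, a + b)
```

Porting a kernel like that is trivial; what you can't port is cuDNN, NCCL, the profilers, and a decade of libraries built on CUDA. That ecosystem is the moat, and it only gets filled in if someone with deep pockets is forced to fill it.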
Is it? Time will tell, but it wasn't "different" even during the crypto craze, when CUDA was literally printing money. We were promised world-changing ASICs, just like with Cerebras and Groq, and got nothing in the end.
When AI's popularity blows over (and it will, like crypto), will the market have responded fast enough? From where I'm standing, it looks like every major manufacturer (and even the Chinese market) is trying the ASIC route again. And having watched ASICs die a very painful and unsatisfying death in the mining pools, I'd like to avoid manufacturing purpose-made ICs that are obsolete within months. I'm simply not seeing the sort of large-scale strategy that threatens Nvidia's actual demand.
I don't have any hope for ASICs. GPUs are going to stay, but if enough big countries can only obtain new non-Nvidia GPUs, big corporations and even governments will find it worthwhile to build up the ecosystem around OpenCL or something else. Until now, CUDA has had unstoppable momentum.
Crypto mining was just about hash rates, so I don't think it really mattered whether you used CUDA or not; Nvidia cards were usually just faster. People did use AMD too, but that didn't really involve building up the OpenCL ecosystem, just making one particular algo run. They do still use ASICs for BTC in particular; I don't think that died.
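For anyone who didn't live through it, "one particular algo" meant grinding a fixed hash function as fast as possible. Here's a toy sketch of the Bitcoin-style search loop in pure Python with hashlib (the function names and the easy target are mine; real miners do this on ASICs at trillions of hashes per second):

```python
import hashlib
import struct

def sha256d(data: bytes) -> bytes:
    # Bitcoin's proof-of-work hash: SHA-256 applied twice.
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def mine(header: bytes, target: int):
    # Grind the 4-byte nonce at the end of the 80-byte block header until
    # the double hash, read as a little-endian integer, is below the target.
    base = header[:76]
    for nonce in range(2**32):
        h = sha256d(base + struct.pack("<I", nonce))
        if int.from_bytes(h, "little") < target:
            return nonce, h
    return None

# Toy demo: a target of 2**240 needs ~65k attempts on average.
# A real network target is astronomically smaller.
nonce, h = mine(b"\x00" * 80, target=1 << 240)
print(nonce, h.hex())
```

Notice there's no CUDA (or OpenCL) anywhere in the problem statement: it's one fixed-function computation repeated forever, which is exactly why it migrated to ASICs and why none of that GPU mining work left behind a general-purpose compute ecosystem.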