Doesn’t your point about video compression tech support Nvidia’s bull case?
Better video compression led to an explosion in video consumption on the Internet, leading to much more revenue for companies like Comcast, Google, T-Mobile, Verizon, etc.
More efficient LLMs lead to much more AI usage. Nvidia, TSMC, etc. will benefit.
No, because this either entirely eliminates the majority of the work or shifts it from GPU to CPU, and Nvidia does not sell CPUs.
If the AI market gets 10x bigger and GPU work gets 50% smaller, the GPU market is still 5x larger than today. But Nvidia is priced on 40% annual growth for the next ten years (about 28x today's size), so there is a price mismatch (quick arithmetic below).
It is theoretically possible for a massive reduction in GPU usage or shift from GPU to CPU to benefit Nvidia if that causes the market to grow enough - but it seems unlikely.
Also, I believe (someone please correct if wrong) DeepSeek is claiming a 95% overall reduction in GPU usage compared to traditional methods (not the 50% in the example above).
If true, that is a death knell for Nvidia's growth story after the current contracts end.
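To make the arithmetic explicit, here is a back-of-the-envelope sketch in Python. The 10x market, the 50% and 95% compute reductions, and the 40%/yr priced-in growth are all the assumed figures from above, not forecasts:

    # How demand growth and efficiency gains net out against the growth
    # already priced into the stock. All inputs are the assumptions above.
    market_growth = 10.0       # assume the AI market gets 10x bigger
    priced_in = 1.40 ** 10     # 40%/yr for ten years is ~28.9x

    for reduction in (0.50, 0.95):   # the 50% example vs DeepSeek's claimed 95%
        gpu_demand = market_growth * (1 - reduction)
        print(f"{reduction:.0%} less GPU work -> {gpu_demand:.1f}x today's demand"
              f" vs ~{priced_in:.1f}x priced in")

Under these assumptions, the 50% case still grows GPU demand 5x, while the claimed 95% case actually shrinks it below today's level.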
I can see close to zero possibility that the majority of the work will be shifted to the CPU. Anything a CPU can do can just be done better with specialised GPU hardware.
Then why do we have powerful CPUs instead of a bunch of specialized hardware? It's because the value of a CPU is in its versatility and ubiquity. If a CPU can do a thing well enough, then most programs/computers will do that thing on a CPU instead of taking on the added complexity and cost of a GPU, even if a GPU would do it better.
We have both? Modern computing devices like smartphones use SoCs with integrated GPUs. GPUs aren't really specialized hardware, either; they are general-purpose hardware useful in many scenarios (built originally for graphics, but clearly useful in other domains, including AI).
People have been saying the exact same thing about other workloads for years, and have always been wrong, mostly claiming custom chips or FPGAs would beat out general-purpose CPUs.
Yes, I was too hasty in my response. I should have been more specific that I mean ML/AI-type tasks. I see no way that we end up on general-purpose CPUs for this.
In terms of inference (and training) of AI models, sure, most things that a CPU core can do would be done cheaper per unit of performance on either typical GPU or NPU cores.
On desktop, CPU decoding is passable, but it's still better to have a graphics card for 4K. On mobile, you definitely want to stick to codecs like H.264/HEVC/AV1 that are supported by your phone's decoder chips.
CPUs have gained dedicated video decode units and wide SIMD instructions, but the idea that video decoding is now a generic CPU task is not really true.
Now maybe every computer will come with an integrated NPU and it won't be made by Nvidia, although so far integrated GPUs haven't supplanted discrete ones.
I tend to think today's state-of-the-art models are ... not very bright, so it might be a bit premature to say "640B parameters ought to be enough for anybody" or that people won't pay more for high-end dedicated hardware.
> Now maybe every computer will come with an integrated NPU and it won't be made by Nvidia, although so far integrated GPUs haven't supplanted discrete ones.
Depends on what form factor you are looking at. The majority of computers these days are smartphones, and they are dominated by systems-on-a-chip.
That's also what AVX is, just with a conservative number of threads. If you really understand your problem, I don't see why you would need 32 threads operating on much smaller data, or why you would want that running far away from your CPU.
Whether your new coprocessor or instruction set looks more like a GPU or something else doesn't really matter once we are done squinting and calling these graphics-like problems, and/or claiming they need a lot more than a middle-class PC.
It led to more revenue for the industry as a whole, but not necessarily for the individual companies that bubbled the hardest: Cisco stock is still lower today than it was at its 2000 peak, to point to a significant company that sold the actual physical infrastructure the internet needed and is still around and profitable. (Some companies that bubbled did quite well; AMZN is up roughly 75x from 2000. But that is a totally different company, one that captured an enormous amount of value from AWS that was not visible to the market in 2000, so it makes sense.)
If stock market cap is (roughly) the market's aggregated best guess of future profits integrated over all time, discounted back to the present at some rate (the market's best guess of future rates?), then increasing uncertainty about the predicted profits 5-10 years from now can have an enormous influence on the stock (toy sketch below). Does NVDA have an AWS within it now?
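To make that intuition concrete, here is a toy discounted-cash-flow sketch; every number in it (the $70B/yr profit base, the 10% discount rate, the two growth paths) is made up for illustration, not an estimate for NVDA:

    # Present value of a stream of yearly profits, discounted at a fixed rate.
    def present_value(profits, rate=0.10):
        return sum(p / (1 + rate) ** t for t, p in enumerate(profits, start=1))

    base = 70.0                                      # hypothetical profit, $B/yr
    steady = [base * 1.20 ** t for t in range(20)]   # 20%/yr growth for 20 years
    stalled = steady[:5] + [steady[4]] * 15          # same start, flat after year 5

    print(f"steady growth: {present_value(steady):7.0f}")   # ~3289
    print(f"growth stalls: {present_value(stalled):7.0f}")  # ~1067

The two profit streams are identical for the first five years, yet their present values differ by roughly 3x, which is why a shift in beliefs about the out-years can move the stock so violently.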
> Cisco stock is still lower today than it was at its 2000 peak
Cisco in 1994: ~$3.
Cisco at its March 2000 peak: ~$80.
Cisco after the dotcom bubble: ~$13.
So is Nvidia's stock price closer to 1994 or 2001?
I agree that advancements like DeepSeek, like transformer models before it, are just going to end up increasing demand.
It’s very shortsighted to think we’re going to need fewer chips because the algorithms got better. The system became more efficient, which causes induced demand.
Not only are 10-100x changes disruptive, but the players who don't adopt them quickly are going to be the ones who continue to buy huge amounts of hardware to pursue old approaches, and it's hard for incumbent vendors to avoid catering to their needs, up until it's too late.
When everyone gets up off the ground after the play is over, Nvidia might still be holding the ball but it might just as easily be someone else.
If you normalize Nvidia's gross margin and take competitors into account, sure. But its current high margin is driven by Big Tech FOMO. Keep in mind that going from a 90% margin (10x cost) to a 50% margin (2x cost) is a 5x price reduction (see below).
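Spelling out that margin arithmetic (the 90% and 50% gross margins are the hypothetical figures above, not Nvidia's actual numbers):

    # Gross margin m = (price - cost) / price, so price = cost / (1 - m).
    def price(cost, margin):
        return cost / (1 - margin)

    cost = 1.0
    print(price(cost, 0.90))                      # 10.0: 10x cost at a 90% margin
    print(price(cost, 0.50))                      #  2.0: 2x cost at a 50% margin
    print(price(cost, 0.90) / price(cost, 0.50))  #  5.0: a 5x price reduction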
Because DeepSeek demonstrates that loads of compute isn't necessary for high-performing models, so we won't need as much, or as powerful, hardware as was previously thought, which is what Nvidia's valuation is based on?
That's assuming there isn't demand for more powerful models; there's still plenty of room for improvement over the current generation. We didn't stop at GPT-3-level models when that was achieved.
Yes, over the long haul, probably. But as far as individual investors go, they might not like that Nvidia.
Anyone currently invested is presumably in because they like the insanely high profit margin, and this is apt to quash that. There is now much less reason to give your firstborn to get your hands on their wares. Comcast, Google, T-Mobile, Verizon, etc., and especially those not named Google, have nothingburger margins in comparison.
If you are interested in what they can do with volume, then there is still a lot of potential. They may even be more profitable on that end than a margin play could ever hope for. But that interest is probably not from the same people who currently own the stock, it being a change in territory, and there is apt to be a lot of instability as the stock changes hands from one group to the next.
That would be an unusual situation for an ETF. An ETF does not usually extend ownership of the underlying investment portfolio; it offers investors the opportunity to invest in the ETF itself. The ETF is what you would be invested in, so your concern as an investor would only be with the properties of the ETF, and this seems to be true in your case as well, given how you describe it.
Are you certain you are invested in Nvidia? The outcome of the ETF may depend on Nvidia, but it may also depend on how a butterfly in Africa happens to flap its wings. You aren't, by any common definition found within this type of context, invested in that butterfly.
Technically, all Nvidia stock (and virtually all stock in the US) is owned by Cede and Co., so Nvidia has only one investor.[0] There are several layers of indirection between your Robinhood portfolio and the actual Nvidia shares, even if Robinhood shows NVDA as a position in your portfolio.
You will find that the connection between ETFs and the underlying assets in the index is much more like the connection between your Robinhood portfolio and Nvidia, than the connection between butterflies and thunderstorms.
[0] At least for its stocks. Its bonds are probably held in different but equally weird ways.
> Technically, all Nvidia stock (and virtually all stock in the US) is owned by Cede and Co.
Technically, but they extend ownership. An ETF is a different type of abstraction, which you already know because you spoke about that abstraction in your original comment, so why play stupid now?