Hacker News

The shoehorning only works if there is buyer demand.

As a company, if customers are willing to pay a premium for an NPU, or are unwilling to buy a product without one, it is not your place to say "hey, we don't really believe in the AI hype, so we're going to sell products people don't want to prove a point."






Is there demand? Or do they just assume there is?

If they shove it in every single product and that’s all anyone advertises, whether consumers know it will help them or not, you don’t get a lot of choice.

If you want the latest chip, you’re getting AI stuff. That’s all there is to it.


"The math is clear: 100% of our our car sales come from models with our company logo somewhere on the front, which shows incredible customer desire for logos. We should consider offering a new luxury trim level with more of them."

"How many models to we have without logos?"

"Huh? Why would we do that?"


Heh. Yeah, more or less.

To some degree I understand it, because as we've all noticed, computers have pretty much plateaued for the average person. They last much longer. You don't need to replace them every two years anymore because the software isn't outstripping them so fast.

AI is the first thing to come along in quite a while that not only needs significant power but is also genuinely different. It's something they can say your old computer doesn't have that the new one does, other than being 5% faster or whatever.

So even if people don’t need it, and even if they notice they don’t need it, it’s something to market on.

The stuff upthread about it being the hotness that Wall Street loves is absolutely a thing too.


That was all true nearly 10 years ago. And it has only improved. Almost any computer one finds these days is capable of the basics.

There are two kinds of buyer demand: product buyers and stock buyers. The AI hype can certainly convince some of the stock buyers.

Apple will have a completely AI capable product line in 18 months, with the major platforms basically done.

Microsoft is built around Intel's broken tick/tock model of incremental improvement; they are stuck with OEM shitware that will take years to flush out of the channel. That means for AI, they are stuck with cloud-based OpenAI, where NVIDIA has them by the balls and the hyperscalers are all fighting for GPUs.

Apple will deliver local AI features as software (the hardware is “free”) at a much higher margin - while Office 365 AI is like $400+ a year per user.

You’ll have people getting iPhones to get AI assisted emails or whatever Apple does that is useful.


We're still looking for "that is useful".

The stuff they've been trying to sell AI to the public with is increasingly looking as absurd as every 1978 "you'll store your recipes on the home computer" argument.

AI text became a Human Centipede story: Start with a coherent 10-word sentence, let AI balloon it into five pages of flowery nonsense, send it to someone else, who has their AI smash it back down to 10 meaningful words.

Coding assistance, even as spicy autocorrect, is often a net negative as you have to plow through hallucinations and weird guesses as to what you want but lack the tools to explain to it.

Image generation is already heading rapidly into cringe territory, in part due to some very public social media operations. I can imagine your kids' kids in 2040 finding out they generated AI images in the 2020s and looking at them with the same embarrassment you'd see if they dug out your high-school emo fursona.

There might well be some more "closed-loop" AI applications that make sense. But are they going to be running on every desktop in the world? Or are they going to be mostly used in datacentres and purpose-built embedded devices?

I also wonder how well some of the models and techniques scale down. I know Microsoft pushed a minimum spec to promote a machine as Copilot-ready, but that seems like it's going to be "Vista Basic Ready" redux as people try to run tools designed for datacentres full of Quadro cards, or at least high-end GPUs, on their $299 HP laptop.


Cringe emo girls are trendy now because the nostalgia cycle is hitting the early 2000s. Your kid would be impressed if you told them you were a goth gf. It's not hard to imagine the same will happen with primitive AIs in the 40s.

Early 2000s??

"Bela Lugosi's Dead" came out in 1979, and Peter Murphy was onto his next band by 1984.

By 2000, Goth was already a distant dot in the rear-view mirror for the OGs.

    In 2002, Murphy released *Dust* with Turkish-Canadian composer and producer Mercan Dede, which utilizes traditional Turkish instrumentation and songwriting, abandoning Murphy's previous pop and rock incarnations, and juxtaposing elements from progressive rock, trance, classical music, and Middle Eastern music, coupled with Dede's trademark atmospheric electronics.
https://www.youtube.com/watch?v=Yy9h2q_dr9k

https://en.wikipedia.org/wiki/Bauhaus_(band)


I'm not sure what "gothic music existed in the 1980s" is meant to indicate as a response to "goths existed in the early 2000s as a cultural archetype".

That Goths in the 2000s were at best a second-wave nostalgia cycle of Goths from the 1980s.

That people recalling Goths in that period should beware of thinking that was a source and not an echo.

In 2006, Noel Fielding's Richmond Felicity Avenal was a basement-dwelling leftover from many years past.


True Goth died way before any of that. They totally sold out when they sacked Rome; the gold went to their heads, and everything since then has been nostalgia.

That was just the faux life Westside Visigoths .. what'd you expect?

#Ostrogoth #TwueGoth


There was a submission here a few months ago about the various incarnations of goth starting from the late Roman empire.

https://www.the-hinternet.com/p/the-goths



The product isn’t released, so I don’t think we know what is or isn’t good.

People are clearly finding LLM tech useful, and we’re barely scratching the surface.


I expect this sort of thing to go out of fashion and/or be regulated after "AI" causes some large loss of life, e.g. starting a war or designing a collapsing building.

> while Office 365 AI is like $400+ a year per user

And I'm pretty sure this is only introductory pricing. As people get used to it and use it more, it won't cover the cost. I think they currently rely on the gym model: many people not using the AI features much. But eventually that will change. Also, many companies have figured that out and pull the Copilot license from users who don't use it enough.


I hope that once they get a baseline level of AI functionality in, they start working with larger LLMs to enable some form of RAG... that might be their next generational shift.
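To make the RAG idea concrete, here is a minimal sketch of the retrieval step: pick the most relevant local documents and prepend them to the prompt before calling a model. All names are hypothetical, and the naive word-overlap scoring is a stand-in for the embedding search a real system would use.

```python
# Minimal RAG retrieval sketch. Scoring is naive word overlap;
# real systems would use embeddings and a vector index.

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    # Rank documents by how many query words they share.
    q_words = set(query.lower().split())
    scored = sorted(
        docs,
        key=lambda d: len(q_words & set(d.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    # Prepend retrieved context so the model can ground its answer.
    context = "\n".join(retrieve(query, docs))
    return f"Context:\n{context}\n\nQuestion: {query}"

docs = [
    "Quarterly sales rose 12% in the EU region.",
    "The office closes at 6pm on Fridays.",
    "EU sales were driven by the new product line.",
]
print(build_prompt("Why did EU sales rise?", docs))
```

The point of the generational shift would be exactly this glue layer: local documents feeding a larger model without shipping your whole corpus to the cloud.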

Who is getting $400/y of value from that?

Until AI chips become abundant, and we are not there yet, cloud AI just makes too much sense. Using a chip constantly vs using it 0.1% of the time is orders of magnitude more efficient.
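The utilization gap can be put into rough numbers. These figures are illustrative assumptions, not measurements:

```python
# Back-of-envelope utilization comparison (illustrative numbers):
# a local NPU idles most of the day, while a shared cloud
# accelerator is kept busy by multiplexing many users.
local_utilization = 0.001   # 0.1% duty cycle on a personal device
cloud_utilization = 0.70    # assumed load on a shared accelerator

ratio = cloud_utilization / local_utilization
print(f"Cloud hardware does ~{ratio:.0f}x more useful work per chip")  # ~700x
```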

Local inference does have privacy benefits. I think at the moment it might make sense to send most queries to a beefy cloud model, and send sensitive queries to a smaller local one.
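A sketch of that hybrid routing idea, with everything hypothetical: the model names are placeholders, and the keyword check stands in for a real sensitivity classifier.

```python
# Route sensitive queries to a (hypothetical) local model and
# everything else to a cloud model. The keyword check is a
# stand-in for a real sensitivity classifier.
SENSITIVE_MARKERS = {"password", "ssn", "salary", "medical"}

def is_sensitive(query: str) -> bool:
    return any(marker in query.lower() for marker in SENSITIVE_MARKERS)

def route(query: str) -> str:
    return "local-small-model" if is_sensitive(query) else "cloud-large-model"

print(route("Summarize this press release"))    # cloud-large-model
print(route("Draft an email about my salary"))  # local-small-model
```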



