
Each model seems to be becoming more efficient and effective, so maybe it's a trend that doesn't continue on the average-use side.



Thanks to most of the world being "GPU poor", there is a lot of research and engineering effort going into making models much more compute efficient. Another way that OpenAI gets to benefit from the world of open source/weight models.


I think there are still _a lot_ of use cases that are currently prohibitively expensive for which increased efficiency will immediately induce matching demand.


Fair point. I think the lagging indicator is how capable alternative models are becoming while using less horsepower.


We have an ambitious goal: mammal-brain power consumption.


That's a very tall ask lol.

There was a post about detailed imaging of a tiny piece of the brain, and it came to 1.7 petabytes of data or something.

CPUs and GPUs have a long way to go beyond nanometer-scale process shrinks.


Sure, but we know it’s possible in practice, since we each carry one of these with us and it runs on almost no energy, so it at least gives us the inspiration to try to get a little bit closer. There is a very large gap…


Maybe. But there is no maybe about the mess being left for the future.

If future people are forfeit because we refuse to sacrifice today, why preserve people today? I say bring the chaos now, so the mess-makers deal with their externalities for a change instead of waving them off to make “line go up”.


What you are saying amounts to people not learning anything new or different.

There is a lot that only humans can do, including things a sentence predictor presented as AI can't imagine. It's getting better at reasoning for sure, but net-new, novel stuff is, I think, still mostly our domain.

It's true, though, that GPT may force a lot of people to change and grow, and it might be hardest for the people who got away with BSing. I've never had that luxury and have always had to deliver, so I guess it feels a little less frightening.

Either way, I figure it's easier to learn which way the currents in this realm are flowing and recognize them better than to sit on the side of the pool looking at it with disdain.

We stop growing not only when we stop learning, but when we stop creating.


> We stop growing not only when we stop learning, but when we stop creating.

This is generic enough to leave what we should create open to debate.

It waves off the future impact of what we create.

It’s cute and poetic, but as usual it ignores externalities, since the economic models we’re taught inherently rely on ignoring externalities. You are refusing to engage in that discussion because we don’t socially normalize an obligation to do so.

Why must we create what we currently create?

Can you not create new skills and awareness any other way? Are you so unimaginative that you must simulate the galactic differential manifold inside a machine because you cannot imagine it and draw it for yourself? And at what cost to the species over the coming decades?

And I say “for us” because you cannot guarantee there are Lindy effects, measured in centuries, from the outputs of this work. It’s not unreasonable to find all this high-minded talk nearly equivalent to hearsay and religious belief in the potential for forever growth and expansion.

It’s not unreasonable to assume this is just conviction in economic memes you memorized, wrapped in overly reductive poetry to obfuscate.


Maybe it will work out if ChatGPT leads to fewer people being needed, and it ends up using less water than people do.


We didn’t need to build ChatGPT for that. The physical measure and awareness that resources are finite and that humans require resources has been in front of human eyeballs for centuries.

You’re just a contemporary worker bee following orders. The economy is built on academic BS: the older, less numerate financier generation had no ability to falsify what they had no education in, so they handed their pensions and post-war welfare-job money to their kids, who made this up and think the future just has to keep honoring decades-old contracts while ignoring that they don’t honor their own elders’ past.

Entropy will attenuate the past. Our achievements are meaningless to the next centuries as physics will force them to be rebuilt.



