Hacker News

I feel it. I'll add that I'm also rather scared of how AI's compute- and data-hungry nature is de-democratizing tech and...reality in a big way. A handful of giga-scale companies hold all the cards. The actual technology behind AI is mostly a small set of clever tricks for encoding data in a form that neural nets can work on efficiently. The nets themselves aren't magic, and the process of building them is well understood, but unless you have the money to shell out tens of thousands for hardware, and access to literally everyone's private data to train on, you're going to be very second tier for eternity.



> and...reality in a big way

This is the part that frightens me somewhat, because I don't think the public is prepared for what's coming. Although, it's of secondary concern to me versus your first point:

> de-democratizing tech

...which is certainly problematic.

I'm at a point right now where I haven't fully decided what to think. On any given day, my Facebook feed is filled with a growing collection of AI-generated images, and more and more comments express exasperation that an image is "real" when it is (to me) clearly generative "art." But I think we're at a point where the average user is going to be duped by at least some small percentage of the images out there. It isn't hard to imagine a time in the next 5-10 years (maybe less) when a majority of the population is convinced of an imagined event that never happened.

More to your point, though, even companies that have a large volume of data to train on but simply don't have the compute resources are going to be "very second tier." I've noticed this with the AI subscription for Logos Bible Software. ChatGPT generally does a better job of answering questions than Logos, which oftentimes can't even answer fairly trivial queries about one's own library. I've heard this has improved in the past couple of months, but I'm not optimistic, because of the capability mismatch.


I feel like this is one of those red lines that change society's behavior for the worse. We didn't need to invent locks until thieves were commonplace. We didn't need to invent private walled gardens until our data was being used to make our lives worse...


The data part seems to be different. With tech, it's always a race to the (cheapest) bottom, and I can definitely see hardware prices (or rather, computation prices) coming down. But data will become harder and harder to collect, because:

1. Regulators will finally come up with something.

2. The internet is becoming increasingly full of AI-generated content, which makes for bad training data.



