
I've often had the mental image of Galileo trying to order a pizza and being very disappointed at the garlic bread that turned up.


"Privacy policy" is definitely one of the most prominent Newspeak terms around today. The full term should read "how we'll violate your privacy policy".


They've apparently had a corporate philosophy, since at least the early 2000s, of obfuscating the underlying system from the end user and deliberately inhibiting their ability to learn how it fits together.

I feel like the current ignorance of the average computer user is a deliberate outcome they've been working towards for more than 20 years. As someone who has been using computers since the late 80s, I find their current offerings harder to use than ever.


> In general, I don’t really understand educators hyperventilating about LLM use. If you can’t tell what your students are independently capable of and are merely asking them to spit back content at you, you’re not doing a good job.

Sounds as though you do understand it.


Greedy and relentless though OpenAI's scraping may be, that this web-based startup didn't have a rudimentary robots.txt in place seems inexcusably naive. Correctly configuring that file has been one of the most basic steps of running a website for as long as the web has existed, and its absence doesn't speak highly of the company's technical acumen.
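For what it's worth, opting out takes two lines. GPTBot is the user-agent OpenAI publicly documents for its crawler; whether any given bot honours the file is, of course, voluntary on the crawler's side:

```
User-agent: GPTBot
Disallow: /
```

Placed at the site root as /robots.txt, that denies the whole site to that crawler while leaving everything else untouched.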

>“We’re in a business where the rights are kind of a serious issue, because we scan actual people,” he said. With laws like Europe’s GDPR, “they cannot just take a photo of anyone on the web and use it.”

Yes, and protecting that data was your responsibility, Tomchuck. You dropped the ball and are now trying to blame the other players.


OpenAI will happily ignore robots.txt

Or is that still my fault somehow?

Maybe we should stop blaming people for "letting" themselves get destroyed and maybe put some blame on the people actively choosing to behave in a way that harms everyone else?

But then again, they have so much money so we should all just bend over and take it, right?


If they ignore a properly configured robots.txt and the licence also explicitly denies them use, then I'd guess there's a viable civil action to extract compensation. But that isn't the case here at all, and while there are reports of them ignoring the file, they certainly claim to respect the convention.

As for bending over, if you serve files and they request files, then you send them files, what exactly is the problem? That you didn't implement any kind of rate limiting? It's a web-based company and these things are just the basics.
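Basic rate limiting really is a few lines of config in any common server. A sketch for nginx (the zone name and the 10 r/s figure here are arbitrary choices, not recommendations):

```nginx
# Shared 10MB zone keyed by client IP, allowing 10 requests/second
limit_req_zone $binary_remote_addr zone=perip:10m rate=10r/s;

server {
    location / {
        # Tolerate short bursts; reject the excess with 429 Too Many Requests
        limit_req zone=perip burst=20 nodelay;
        limit_req_status 429;
    }
}
```

It won't stop a determined scraper, but it stops a naive one from taking your site down.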


You don't!

That's exactly what they're aspiring to here, following on from a well-established pedigree of Australian lawmakers and their dysfunctional relationship with the Internet.


You do! It already happens - just not for everyone.

Example:

https://m.facebook.com/help/582999911881572

