> Why readPixels is not subject to anti-fingerprinting is beyond me. It does not sprinkle hardly visible typos all over the page, so that works for me.

> keep the styling and the top of the page (about 8 KiB uncompressed) in the gzipped HTML and only compress the content below the viewport with WebP

Ah, that explains why the article suddenly cut off after a random sentence, followed by an empty page. I'm using LibreWolf, which disables WebGL, and I use Chromium for the random web games that need WebGL. The article worked just fine with WebGL enabled; neat technique, to be honest.
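For anyone wondering what the trick looks like mechanically, here's a rough TypeScript sketch of the decode path as I understand it from the quoted bits. This is not the article's actual code: the lossless-WebP assumption, the function name, and leaving decompression to the caller are all mine.

    // Rough sketch: assumes the page ships its below-the-fold HTML as the raw
    // pixel bytes of a lossless WebP served from the same origin.
    async function recoverHiddenContent(url: string): Promise<Uint8Array> {
      const img = new Image();
      img.src = url;                       // placeholder URL for the content WebP
      await img.decode();                  // the browser decodes the WebP for us

      // Upload the decoded image into a WebGL texture and read it back with
      // readPixels, which (per the quote) is not noised by anti-fingerprinting.
      const canvas = document.createElement("canvas");
      canvas.width = img.width;
      canvas.height = img.height;
      const gl = canvas.getContext("webgl");
      if (!gl) throw new Error("WebGL unavailable; this is where the page stops");

      const tex = gl.createTexture();
      gl.bindTexture(gl.TEXTURE_2D, tex);
      gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, img);

      const fb = gl.createFramebuffer();
      gl.bindFramebuffer(gl.FRAMEBUFFER, fb);
      gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0,
                              gl.TEXTURE_2D, tex, 0);

      const pixels = new Uint8Array(img.width * img.height * 4);
      gl.readPixels(0, 0, img.width, img.height, gl.RGBA, gl.UNSIGNED_BYTE, pixels);
      return pixels;   // caller would strip padding/alpha and decompress to HTML
    }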

It isn't neat as long as it doesn't work in all modern web browsers (even with fingerprinting protection enabled) and has no fallback for older ones. The web should be universally accessible and progressively enhanced, starting with plain HTML.

It isn’t a serious proposal. It’s a creative hack that no one, author included, is suggesting should be used in production.

The author is using the hack in "production".

Debatable. I consider a personal blog fair game for experimentation; it’s not a paid-for project and has no customers.

This philosophy hands your content on a silver platter to AI companies, so they can rake in money while giving nothing back to the author.

I don’t support LLM companies stealing content and profiting from it without contributing back. But if you’re going to fight that by making things more difficult for humans, especially those with accessibility needs, then what even is the point of publishing anything?

There’s a saying: a bird in the hand is worth two in the bush.

An author might reasonably prefer 90% of people visiting his site to 100% of people consuming the content indirectly.

I don’t think those numbers are even close to realistic. It’s absurd to think that having an accessible website will make everyone consume it via LLM, or that having an inaccessible website (which, by the way, will only stave off scraping temporarily) will make most people visit it. We can’t have a productive conversation by making up numbers.

Wrapping the website in JavaScript also won't stop ML crawlers, as they probably all already use headless Chromium to deal with the modern web.
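For illustration, a scraper needs little more than something like this Puppeteer sketch (the URL and wait condition are just examples, not anyone's actual pipeline):

    // Headless Chromium loads the page and runs its JavaScript; recent versions
    // even include software WebGL (SwiftShader), so the WebP trick alone
    // wouldn't stop it either.
    import puppeteer from "puppeteer";

    async function scrape(url: string): Promise<string> {
      const browser = await puppeteer.launch();
      const page = await browser.newPage();
      await page.goto(url, { waitUntil: "networkidle0" }); // let scripts finish
      const text = await page.evaluate(() => document.body.innerText);
      await browser.close();
      return text;
    }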
