> Why readPixels is not subject to anti-fingerprinting is beyond me. At least it does not sprinkle barely visible typos all over the page, so that works for me.
> keep the styling and the top of the page (about 8 KiB uncompressed) in the gzipped HTML and only compress the content below the viewport with WebP
Ah, that explains why the article suddenly cut off after a random sentence, with an empty page that follows. I'm using LibreWolf which disables WebGL, and I use Chromium for random web games that need WebGL. The article worked just fine with WebGL enabled, neat technique to be honest.
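For anyone curious why disabling WebGL breaks the page: the technique presumably packs the article's text into the pixel data of a WebP image, draws it to a WebGL canvas, and reads it back with `gl.readPixels`. A minimal sketch of the unpacking step, assuming a hypothetical layout where UTF-8 bytes fill the R, G, B channels and a zero byte terminates the text (the article's actual format may differ):

```javascript
// Unpack UTF-8 text from an RGBA pixel buffer, as one might after
// gl.readPixels(0, 0, w, h, gl.RGBA, gl.UNSIGNED_BYTE, pixels).
// Assumed layout: text bytes in R, G, B; alpha is padding; a zero
// byte ends the text. This is an illustration, not the site's code.
function unpackText(pixels) {
  const bytes = [];
  for (let i = 0; i < pixels.length; i += 4) {
    for (let c = 0; c < 3; c++) {   // skip the alpha channel
      const b = pixels[i + c];
      if (b === 0) return new TextDecoder().decode(new Uint8Array(bytes));
      bytes.push(b);
    }
  }
  return new TextDecoder().decode(new Uint8Array(bytes));
}

// In a browser the buffer would come from a WebGL context:
//   const pixels = new Uint8Array(w * h * 4);
//   gl.readPixels(0, 0, w, h, gl.RGBA, gl.UNSIGNED_BYTE, pixels);
```

With WebGL disabled, `readPixels` is unavailable (or returns zeroed data under some anti-fingerprinting schemes), so everything below the gzipped top-of-page never gets decoded, which matches the sudden cut-off.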
It isn't neat as long as it doesn't work in all modern web browsers (even with fingerprinting protection enabled) and has no fallback for older ones. The web should be universally accessible and progressively enhanced, starting with plain HTML.
I don’t support LLM companies stealing content and profiting from it without contributing back. But if you’re going to fight that by making things more difficult for humans, especially those with accessibility needs, then what even is the point of publishing anything?
I don’t think those numbers are even close to realistic. It’s absurd to think that having an accessible website will make everyone consume it via LLM, or that having an inaccessible website (which, by the way, will only stave off scraping temporarily) will make most people visit it. We can’t have a productive conversation by making up numbers.