Author here - this was just a quick PoC; I'm pleasantly surprised that it seems to be handling all the HN traffic.
It's served from a python script using aiohttp, behind nginx, on a $16/year VPS.
I might make a github repo with more details, but in the meantime, here's the server script: https://pastebin.com/ykUeppqc (apologies for pastebin, I don't have access to my github account at present)
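If the pastebin goes down, the general idea looks roughly like this (a minimal sketch, not necessarily what the linked script does; the filename, slice offsets, and 2-second delay are placeholders). An Adam7-interlaced PNG gets streamed in pass-sized slices, with a pause between slices and caching disabled so the browser re-fetches (and re-animates) it each time:

    import asyncio
    from aiohttp import web

    FRAME_DELAY = 2  # seconds between "frames" (placeholder value)

    # Placeholder: an Adam7-interlaced PNG, plus byte offsets that each end on a pass boundary.
    with open("animation.png", "rb") as f:
        PNG_BYTES = f.read()
    SLICE_OFFSETS = [0, 1024, 4096, 16384, len(PNG_BYTES)]  # made-up offsets for illustration

    async def handler(request):
        resp = web.StreamResponse(headers={
            "Content-Type": "image/png",
            "Cache-Control": "no-store",  # force a fresh (slow) load on every view
        })
        await resp.prepare(request)
        for start, end in zip(SLICE_OFFSETS, SLICE_OFFSETS[1:]):
            await resp.write(PNG_BYTES[start:end])
            await asyncio.sleep(FRAME_DELAY)  # hold before revealing the next pass
        await resp.write_eof()
        return resp

    app = web.Application()
    app.router.add_get("/", handler)
    web.run_app(app, port=8080)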
It is impossible to spend any real money (as opposed to their $300 credit) without switching to a paid account, which is a multi-step process that you can't perform accidentally.
After the 30 day trial expires your account is switched to the "always free" mode. It does not allow creating any paid resources which are not marked as "always free".
I was curious and read about it, and you are correct. But just like I said, there are many pitfalls. For example, I assumed you could change your paid plan freely, but:
It's with HostUS. I got a coupon a few years ago (via LowEndBox), and it's been renewing at the same rate ever since. I have no complaints about them, but I'm not sure you can get the same pricing today.
At this price point you are barely paying for the IPv4 address. It will be an OpenVZ container.
I had a $15/yr VPS with BuyVM.net for many years and would absolutely recommend them at this price point, except that they have shut down this offering and switched to KVM (it's for the best). Ramnode.com are honest enough and still offer the "192MB SVZ" plan for $15/yr.
I would rather scrape by on the GCS/AWS/Heroku free tier, Netlify / GH Pages, etc. than go back to OpenVZ. Better to pay just a few dollars more for a proper KVM VPS.
Fun fact: similar things, but much more general, were in the HTTP/1.1 standard from 1997. However, HTTP frames relied on HTTP chunks, which were fundamentally broken in the Windows HTTP stack implementation. (Third-party browsers like Netscape Navigator or Opera implemented an HTTP stack of their own, but IE relied on the built-in implementation.)
(Because of this flaw, keep-alive was broken as well, and requests stalled until they timed out. Which is why everyone set the request timeout to just a short period, and why you couldn't type an HTTP request manually in a terminal anymore.)
I don't know of a browser that still supports it. Because of said issues, it didn't see any significant use (apart from a few enthusiastic experiments early on).
Pretty neat. That sent me on a brief hunt to see how it might work. It seems that PNG (including APNG) has a fairly easy-to-follow pattern of chunks. So this could be a fairly simple cgi-bin type script that just does a "sleep(2)" or similar when it encounters an fcTL (frame control) chunk.
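If anyone wants to poke at that, PNG chunk parsing really is simple: after the 8-byte signature, every chunk is a 4-byte big-endian length, a 4-byte type, the data, and a 4-byte CRC. A rough sketch of the sleep-before-each-fcTL idea (the filename and delay are placeholders, and a real cgi-bin script would also print a Content-Type header before the bytes):

    import struct, sys, time

    PNG_SIGNATURE_LEN = 8  # 8-byte magic at the start of every PNG/APNG

    def stream_apng(path, delay=2.0, out=sys.stdout.buffer):
        with open(path, "rb") as f:
            data = f.read()
        out.write(data[:PNG_SIGNATURE_LEN])
        pos = PNG_SIGNATURE_LEN
        while pos < len(data):
            # Each chunk: 4-byte length, 4-byte type, <length> bytes of data, 4-byte CRC.
            (length,) = struct.unpack(">I", data[pos:pos + 4])
            ctype = data[pos + 4:pos + 8]
            chunk_end = pos + 12 + length
            if ctype == b"fcTL":       # frame control chunk: pause before the next frame
                out.flush()
                time.sleep(delay)
            out.write(data[pos:chunk_end])
            pos = chunk_end

    if __name__ == "__main__":
        stream_apng("animation.png")   # placeholder filename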
Technically correct, for the majority of CURRENT releases of still maintained software.
Notably, Edge only got it in Jan 2020, on the switch to Chromium ( Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/79.0.3919.0 Safari/537.36 Edg/79.0.294.1 ) (PS: note the old KDE KHTML legacy in the UA string)
On older Android devices and locked-down corporate systems, IE and outdated versions of other software do not support APNG.
Videos are much more efficient than apngs or gifs, though, and it's just as easy to allow downloads (unless they're using variable bitrates which are a bit pointless for short animations anyway)
It's just as easy to allow downloads, but it seems a lot more common for sites to go out of their way to disallow video downloads than gif downloads. Also a bunch of them use video tags that don't point to an actual video file but get fed video content out of band.
I'm "guilty" of this on my video site https://pushups.ndn.today (does not work on iOS).
So far one person figured out how to download those videos, after spending 4 hours studying my source code.
> I would have expected them to have fixed the obvious issue with their UI by now.
Are we talking about the same Reddit that still runs three different versions of their website because the newest version is so bad that people refuse to use it? I'm not holding my breath.
That's likely just using the Blob API. Nothing stopping you from src'ing an <img> tag with one of those.
If the page you're on can survive a reload, you can often find the interesting mp4/m3u8 request in the devtools network tab. You probably knew this already.
I've kind of come to expect that you have to open the DOM inspector to find image links these days. Not sure what they think they're protecting, but the protection doesn't work.
(Reminds me of sites that disable pasting in passwords from your password manager, so you have to have xdotool type them for you. Again, not sure what they think they're saving me from. And I'm not sure why browsers have APIs that allow this sort of thing.)
Yes, this was one of the reasons I built gif.com.ai - GIFs are a special format, and my intention for the app was to retain a GIF as just that. When you generate a GIF you get an Imgur copy, and it sometimes converts the GIFs to videos, but that's not a problem because you can get the original GIF format back by adding "i" to the URL prefix and adding ".gif" to the end of the URL.
Images and videos can both be downloaded by right click. Also, both of them can be configured not to work that way by html/css structure. It's not a difference between image vs video.
But then you might be able to re-share "their" content without subjecting your friends to that website's ads and that website's tracking (but I repeat myself)
It's not a file format problem, it's a user experience problem. I can't save a Twitter "GIF" and share it without resorting to external tools like youtube-dl.
It's not a file format problem, it's a developer/company problem... They didn't implement it correctly. I can download videos on many sites by right clicking on them. They could "block" right-clicking on GIFs too if they wanted to.
This reminds me of adding an animated GIF (that's a video of someone else) as my background in Zoom or other conferencing software. The result is that people see someone else rather than "me" when my camera is turned off.
No, they expect that if you're interested in "how various image formats look as they load" and the explanation behind them, that this will be an interesting video to you.
Also, I just skimmed through it: it contains many examples throughout, and given that what's being shown is how an image changes over time as it loads, a video is not an inappropriate format I'd say.
The part where the two of you discuss how Firefox and Chrome now load progressive JPGs with blur instead of keeping it pixelated made me wonder: what happens if I set the CSS of an image to upscale it with pixelation? Does it also force progressive JPGs to stay pixelated while loading?
Good question! I don't know off the top of my head, but since the decoder and CSS are separate, I think you'll still get the blurred version, but then up/downscaled using nearest neighbour.
This is cool! It's interesting to see how programs handle this - the thumbnail in a specific gallery app on my Android phone is the first "frame", while it's the finished image in my file explorer.
I am having flashbacks to the dial-up BBS days of the late 80s, where I made a decent local living designing animated logon screens for various sites in exchange for access to said sites. Had to optimize character animations based on the baud rate of the inbound modem connection, separate versions for 1200 and 2400 baud.
Long ago I was making a Node server which served partials of progressive JPEG files, stopping at (IIRC) the SOS headers. I had an 'enhance' button that worked. :)
It would have worked even better with JPEG 2000, but browsers killed that off for some reason. I miss the old Netscape days.
The reason is patents. The JPEG people made sure everything they produced after JPEG proper was irrelevant for the open Web by patenting it to hell and back. (Even JPEG has only recently become freely implementable in full, because it has a better mode with arithmetic coding instead of Huffman coding that nobody uses, in order to avoid patents.) I've heard there's also a small issue where a naïve JPEG 2000 decoder would produce images that look less sharp even if they're nominally better (cf. similar issues with H.264, mentioned at <https://news.ycombinator.com/item?id=16615962>), but the patents are the big one.
Now browsers (specifically Firefox) making sure MNG (the better-designed animated sibling of PNG) never got off the ground is not such a good look for them.
Motion JPEGs are widely used in surveillance systems. If you can hack into the system using a GIF, you can loop the output since the website doesn't enforce the file type.
Great question. So you're limited by the 5 or so frames you have at maximum. I would just put the PNG in an HTML page that auto-refreshes every second, and then a bit of JavaScript for the controls. But at that point it's more Doom on HTML5 than PNG Doom ...
No need to refresh, you can use multipart/x-mixed-replace to keep serving new images forever over the same HTTP request. This is not specific to PNG though.
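A rough aiohttp sketch of that (the boundary name, frame filenames, and 1-second delay are placeholders); each part the server writes replaces the previously displayed image in the same <img> element. Browser support varies, but this is the same mechanism MJPEG webcam streams use:

    import asyncio
    from aiohttp import web

    BOUNDARY = "frame"  # arbitrary boundary string

    async def mixed_replace(request):
        resp = web.StreamResponse(headers={
            "Content-Type": f"multipart/x-mixed-replace; boundary={BOUNDARY}",
            "Cache-Control": "no-store",
        })
        await resp.prepare(request)
        # Placeholder frames; in practice these could be generated on the fly.
        frames = [open(f"frame{i}.png", "rb").read() for i in range(4)]
        while True:  # keep replacing the image forever over the same request
            for frame in frames:
                await resp.write(
                    f"--{BOUNDARY}\r\n"
                    f"Content-Type: image/png\r\n"
                    f"Content-Length: {len(frame)}\r\n\r\n".encode()
                )
                await resp.write(frame + b"\r\n")
                await asyncio.sleep(1)

    app = web.Application()
    app.router.add_get("/stream", mixed_replace)
    web.run_app(app, port=8080)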
Adam7[0] is an interlacing algorithm for PNG files. Without going into exactly how it works, it basically allows lower-speed connections to see an image progressively get sharper instead of loading line by line. This made sense in the days of dialup speeds, but not much anymore.[a]
The trick here involves Adam7, but modern connections are too fast, so the transfer is artificially slowed so that each subimage becomes a "frame".
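For reference, Adam7 assigns the pixels of every 8x8 tile to its seven passes in a fixed pattern (pass 1 arrives first, pass 7 last); a few lines of Python reproduce the classic diagram:

    # Adam7 pass parameters: (x_start, y_start, x_step, y_step) for each pass
    PASSES = [
        (0, 0, 8, 8), (4, 0, 8, 8), (0, 4, 4, 8), (2, 0, 4, 4),
        (0, 2, 2, 4), (1, 0, 2, 2), (0, 1, 1, 2),
    ]

    # Build the 8x8 tile showing which pass delivers each pixel.
    tile = [[0] * 8 for _ in range(8)]
    for n, (x0, y0, dx, dy) in enumerate(PASSES, start=1):
        for y in range(y0, 8, dy):
            for x in range(x0, 8, dx):
                tile[y][x] = n

    for row in tile:
        print(" ".join(str(p) for p in row))
    # First row prints: 1 6 4 6 2 6 4 6 - pass 1 alone already covers the
    # whole image at 1/8 resolution in each direction, which is why even
    # the earliest "frame" can show a recognisable picture.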
> This made sense in the days of dialup speeds, but not much anymore.[a]
I have Javascript disabled by default, and I notice a lot of websites (especially newspapers) initially have a blurry <img/>, and later replace it with the sharp image using JS. So there is clearly still a need for this feature, even if web developers don't know about it.
A bit late of a reply, but <picture> allows specifying multiple sizes of images. Which one will be downloaded depends on the criteria the developer set (generally based on the page or rendered image dimensions).
I've just been trying to understand this. I believe the trick is that with Adam7, progressively more pixels are filled in with each pass. This allows you to make big changes from one pass to the next, even from almost completely white to almost completely black.
If you then interrupt the connection at the right point - directly after one pass - you can get these animations. But unlike a gif, once loaded they are no longer animated since they are done loading.
Edit: Also important here is the cache policy ('Cache-Control: no-store').
Part of the problem with Adam7 interlacing is that it interacts poorly with compression. The compressor operates on the image data after filtering and interlacing; enabling interlacing means that there is a much weaker relationship between each pixel and the neighboring pixels in the image stream.
As a lot of these replies mention, it's using PNG interlacing to produce the frames.
Something I think is also interesting here: the server holds each frame for two seconds before it sends that chunk to the browser, so the animation looks similar regardless of connection speed (2 seconds per frame plus your browser's actual latency to the server).
It'd be nice to control the interval via a GET parameter. Sure, one could record and frame-by-frame it, but it'd be pretty easy on their side to have the URI default to something like `/interval/2/`. Maybe they thought of this, maybe it increases the attack surface... I think it'd be worthwhile for accessibility.
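Something like this would probably do it on the aiohttp side (a sketch only; the route shape, default, and clamp values are guesses, not anything the site actually exposes):

    from aiohttp import web

    MAX_INTERVAL = 10.0  # clamp so one request can't hold a connection open for ages

    def frame_interval(request: web.Request, default: float = 2.0) -> float:
        """Read an optional /interval/{seconds} path parameter, falling back to `default`."""
        try:
            interval = float(request.match_info.get("seconds", default))
        except ValueError:
            interval = default
        return max(0.1, min(interval, MAX_INTERVAL))

    # Hypothetical routes; the handler would sleep frame_interval(request) seconds per pass.
    # app.router.add_get("/interval/{seconds}", handler)
    # app.router.add_get("/", handler)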
I believe he's just taking advantage of how interlacing works in PNG to create this effect. As the higher-resolution passes load, the image changes drastically in this case.
Just in case the server dies, there's a video of it here: https://twitter.com/David3141593/status/1388602027484356614