Clicking the image button submits the form with the x and y click coordinates, and the server returns an HTTP 204, which tells the browser to stay on the current page; the server then pushes the updated JPEG out to all connected clients.
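The click-plus-204 trick above can be sketched in a few lines. This is a hedged illustration, not the author's actual code: the endpoint and the `p` parameter name are made up, but the mechanics are standard — an `<input type="image" name="p">` submits the click coordinates as `p.x`/`p.y`, and answering 204 No Content makes the browser stay on the current page, so the streamed `<img>` keeps running.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

def parse_click(path: str) -> tuple[int, int]:
    """Extract the (x, y) coordinates an image-type submit button sends."""
    q = parse_qs(urlparse(path).query)
    return int(q["p.x"][0]), int(q["p.y"][0])

class ClickHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        x, y = parse_click(self.path)
        # ...hand (x, y) to the rendering system here...
        self.send_response(204)  # 204 No Content: browser stays put
        self.end_headers()

# To run: HTTPServer(("", 8000), ClickHandler).serve_forever()
```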
It’s pretty fun, and I wanted to keep it online all the time, but it has a problem I haven’t sorted out: sometimes when a client disconnects the server doesn’t notice, and trying to push a JPEG to that client locks up the whole rendering system.
Most JS apps, in my experience, fail to give the user any feedback at all when a server error occurs.
I'm attempting to support every client in every configuration on my web-based forum, and this would be a nice little toy.
I need to revisit and update that project sometime, but it was a very neat event-loop C implementation that reads frames from a cheap consumer webcam and feeds them to connected sockets as an MJPEG stream without any re-encoding (it does support re-encoding if the camera can’t produce JPEGs). It was a really fun exercise in minimalism and efficiency, since my goal was to stream multiple cameras from a single Raspberry Pi back when they were super slow.
While it's a crude and handy way to update some content after the initial load without JS, it was also widely used with JS before Web Sockets - to push messages to clients with less latency than XHR polling.
"Content-Type: chunked" is much better because it gives you the size of each chunk upfront! But that requires .js and also was buggy in IE until version 7.
I made a multiplayer online system that relies on chunked: https://github.com/tinspin/fuse
I wonder if this includes SVG images...
EDIT: It does!
Edit° (~1 hour later): So I've been checking every few minutes, and it's consistently been at ~450, this is a cool metric to have.
°This link was posted to HN ~4 hours ago. I first checked ~1.5 hours after it'd been posted, and it was #1 on the front page. It's still #1 as of this edit. The counter has consistently been between 445 and 455 the whole time.
The application has now been up for 3 days. But I don't think the 450 number is entirely accurate, since nginx puts an artificial cap on it. It would probably be higher if I didn't hit that limit. Drats, upped the limit, hope it recovers.
Edit: #4, post is six hours old, counter up to ~190.
Edit: #5, ~6 hours, 173 points, ~220 current site readers.
Edit: Back to #4, ~6 hours, 186 points, ~300 site readers.
Edit: #12, ~11 hours, 276 points, ~445 readers
> So now we know that a #1 post on HN has about ~450 concurrent visitors
I thought you were trying to get a measure of traffic to HN. In that case, the number of visitors to the site is only an approximation. But if you were talking about how much traffic HN directs to the #1 post, then the number is exactly correct.
Indeed I was :)
This worked surprisingly well. They started with almost pure HTML, with the results split into multiple tables to produce what looked like a single table of results. The only drawback was that the columns needed fixed widths so that they would align properly.
Later it was replaced with a standard XHR polling mechanism.
It is interesting to look back at what we had to do that is trivial with today's technology.
Maybe with CSS these days you could make only the last child of a div visible, and have constant updates with no need for JS.
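A sketch of that CSS idea: keep the response open and append one `<div>` per update; a `:last-child` rule shows only the newest one. Whether every browser renders the stream incrementally is exactly the open question above, and the markup here is hypothetical.

```python
# Page prelude streamed once; all later output is just update_div() calls.
PAGE_HEAD = """<!doctype html>
<style>
  /* hide every update except the most recently streamed one */
  #updates div { display: none; }
  #updates div:last-child { display: block; }
</style>
<div id="updates">
"""

def update_div(text: str) -> str:
    """One streamed update; it becomes the new :last-child."""
    return f"<div>{text}</div>\n"
```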
Anyway. I quite like this idea of no JS. I know he said it would be evil to serve ads using MJPEG, but I'd probably prefer it if my page had less JS on it.
We had a progress bar that would show the upload percentage, then the unzipping, then the virus scan on the server, then it would be marked as done.
So crazy what you can do with server-side-rendering if you really want to. :)
Funnily enough, it also uses another evil trick with css background images, which TFA mentions as well: https://underjord.io/is-this-evil.html
- dynamically generated animated GIFs
- Content-Type: multipart/x-mixed-replace : the way to animate images on the web since 1993 - before animated GIFs
Animated GIFs have been around since 1987.
Pedantry aside, as soon as web/HTTP became The Internet (protocol of choice), you are right that the rest was bound to happen.
However, it's exactly the fact that it wasn't too constrained and allowed a lot of messing around (including with HTML, compared to e.g. Gopher, which was more semantic) that made it "win" over all the other protocols, except maybe email (and even there, >50% of people read it with web clients).
That suite of protocols is a suite of internet protocols, and "web" (from "world wide web", certainly familiar from www in websites) is a combination of HTML served over HTTP. If it has evolved to mean the internet, my apologies, but 10-20 years ago if you said "web", you meant HTTP:
There is nothing evil in analytics by itself - it's invaluable for improving usability.
Didn't work on IE.
But it's still something really interesting.
You can check it out here https://alertcamp.com/live/Apple,Google,Microsoft,Amazon,Fac...
Maybe you could build an in-memory video stream for each user and just serve it slowly, but then they would likely need to press play, or you'd have to rely on autoplay, and I've no idea how well that would interact with buffering behaviour.
This solution maps quite closely to the idea that MJPEG is intended for "live" video. And also I find it adorable that it is just an img tag.
Whilst you do have to actually trigger playback somehow, the range requests will come in controllable chunks, so you can respond in kind without trashing the stream.
MP4 was used for streaming long before WebM existed, and before HLS was established.
The table with pointers to all the frames cannot be written before all the frames are encoded and their sizes are known. You can then shuffle the file and move the table to the start (known as faststarting), in which case you can start viewing the video before it has completely downloaded.
This can't be used for live content, since in that case the encoded frames don't yet exist at the time you start viewing.
moov [moof mdat]+
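Since MP4 box layout is what makes or breaks progressive playback, here is a small sketch that scans the top-level boxes of an ISO-BMFF buffer and reports whether `moov` precedes `mdat` (i.e. the file has been faststarted). It only reads box headers, so it's cheap even on large files.

```python
import struct

def top_level_boxes(data: bytes):
    """Yield (type, size) for each top-level ISO-BMFF box."""
    off = 0
    while off + 8 <= len(data):
        size, btype = struct.unpack_from(">I4s", data, off)
        if size == 1:  # 64-bit largesize follows the type field
            size = struct.unpack_from(">Q", data, off + 8)[0]
        if size < 8:   # malformed; stop rather than loop forever
            break
        yield btype.decode("ascii"), size
        off += size

def is_faststart(data: bytes) -> bool:
    """True if the moov index comes before the mdat media data."""
    order = [t for t, _ in top_level_boxes(data)]
    return ("moov" in order and "mdat" in order
            and order.index("moov") < order.index("mdat"))
```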
Browser/firewall refusing to accept anything other than HTTP? HTTP range requests broken? Connection keepalive broken? I know, we'll segment the video into blocks and send a list of the blocks to download as individual files!