The whole reason people don't like JavaScript weighing down their pages is that it makes them take longer to load. Loading three large images on every update can't be better.
I'd like to point out that this solution is inaccessible to users with impaired vision. That's probably okay for you (since only you are interested in the data), but it makes the solution unsuitable for any public use.
I can assure you the difference in load times between this and using a battle-tested library like D3.js will be imperceptible to the user. You could even throw a thin React app around it, and it would still be blazing fast to the eye.
I'm not sure I follow. It's 3 images, so you manually refresh every 5 minutes? (Maybe that could be automated with a meta-refresh to avoid JS entirely, although I don't see much advantage over setTimeout.)
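Something like this is what I had in mind for the no-JS version. It's just a sketch: the file names and the 300-second interval are placeholders, not anything from your setup:

    <!-- reload the page every 300 s; no JavaScript involved -->
    <meta http-equiv="refresh" content="300">

    <img src="/plots/load.png" alt="system load, last 24 h">
    <img src="/plots/memory.png" alt="memory usage, last 24 h">
    <img src="/plots/network.png" alt="network throughput, last 24 h">

The alt text would also go some way toward the accessibility concern raised above.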
Ah, I should have explained better. Page load isn't really my main concern. I used to embed dynamic graphs (Metabase or D3.js) and my laptop would lag like crazy plotting all of those points and rendering hover tooltips (which were nifty but didn't add much for me). With R on the backend, it plots 10,000 points in a fraction of a second, and the web browser only has to fetch and display an image.
tl;dr I have fast bandwidth -- but an old CPU.
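For a rough idea of the server side, the shape of it is something like this. This is a sketch, not my exact code: the CSV path, column names, and data source are placeholders:

    # Sketch only: re-render the latest readings to a static PNG on each update,
    # using base R graphics. Paths and columns here are made up.
    readings <- read.csv("/var/data/readings.csv")
    readings$timestamp <- as.POSIXct(readings$timestamp)

    png("/var/www/plots/load.png", width = 1200, height = 400)
    plot(readings$timestamp, readings$value, type = "l",
         xlab = "time", ylab = "load")
    dev.off()

The page then embeds load.png like any other image; the browser never sees the 10,000 points.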
Thanks for explaining. I quite like it. For a detailed graph that updates rarely, this would perform better, but you do lose the interactivity benefits of JS.
A candlestick approach might let you reduce the amount of data sent to the client in d3, if you use R to aggregate the data first.
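Roughly along these lines; a sketch assuming dplyr and jsonlite, with made-up paths and column names:

    # Sketch: collapse raw readings into 5-minute OHLC "candles" in R,
    # so d3 only receives a few hundred rows instead of 10,000 points.
    library(dplyr)
    library(jsonlite)

    readings <- read.csv("/var/data/readings.csv")   # placeholder path
    readings$timestamp <- as.POSIXct(readings$timestamp)

    candles <- readings %>%
      arrange(timestamp) %>%
      mutate(bucket = as.POSIXct(floor(as.numeric(timestamp) / 300) * 300,
                                 origin = "1970-01-01")) %>%
      group_by(bucket) %>%
      summarise(open  = first(value),
                high  = max(value),
                low   = min(value),
                close = last(value),
                .groups = "drop")

    write_json(candles, "/var/www/data/candles.json")

That keeps the interactivity in the browser while the heavy lifting stays in R.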