But whatever this site is doing with the fixed-scroll yellow block is super weird. There's a "95%" thing next to a sun - is it somehow related to the solar battery powering the server?
The design focuses on the infrastructure at the expense of the content. Maybe that's the point, but this particular element is certainly off-putting.
The site was 95% solar, but the blue bar at the top was 100% distracting. I could barely read the content.
<div id="battery" style="height: 95%;">
<a href="https://solar.lowtechmagazine.com/weather/"><span id="sky"></span><span id="level">95%</span></a></div>
Yes, but does formatting the images this way also consume less power?
I have no concept of whether even 10% means another hour or another year, and since it's load dependent, neither do they. If the power is at 75%, what do I care?
It really belongs on a status page, not in a banner pinned to the top of the page. If anything, it motivates random people to slam the page with more hits, just so they can watch the meter move.
That was my second impression, once I realized what it was: I wanted to watch for a change. My first impression was that the styles were defective, or that it was a scripting error (until I read that extra fluff was eschewed). I just wanted to get it out of the way so I could read without the distraction, so I started looking for a "dismiss" icon to close what I assumed was a broken pop-up.
Then I noticed the Sun icon and the percentage, and realized it was a meter.
Here's my unsolicited design riff on the "battery indicator." I'm a fan of the concept, but the 100vw width of the yellow fill was distracting me from the rest of the content. I used DevTools to mock up a narrow version that fits within the right content gutter, right next to the scrollbar. I had to tweak the percentage display a bit by stacking the sun above the value and removing the "%" sign, but I feel the end result improves readability and is still understandable.
I'm just curious to know how my prediction that this won't grow any larger than it is today could be wrong.
"Less than 100% reliability is essential for the sustainability of an off-the-grid solar system."
IMO, this can only become mainstream once there is no longer an expectation of "100%" uptime for websites in general.
It's far-fetched, but I can imagine it becoming mainstream if mesh networking becomes popular, or if the Internet were to become a widely self-hosted platform (e.g. people had their own "clouds" running in their homes). In such situations, 100% uptime would no longer be an expectation, so the perceived cost of using an off-the-grid system would be reduced.
This article is a perfect example. This solar powered server which does not have 100% uptime for artistic reasons is probably sitting in someone’s kitchen next to a refrigerator which uses many times as much power and definitely has 100% uptime.
The solar webserver is another step towards that goal.
Except that the cold-making mechanism of fridges has nowhere near a 100% duty cycle. You probably wouldn't notice if it didn't even meet two "nines" of uptime.
In fact, that's how frost-free freezers work: they take advantage of that fact to run a heater(!) for some number of minutes several times a day. A defrost timer that runs 10 minutes every 8 hours caps compressor uptime at roughly 98%.
Though transmitting data that far might be against the point of the art project.
I welcome that day.
On the other hand, one of the first things people with disposable income want today is good internet access, so as income grows for the middle five billion (and it's growing rapidly), there's tremendous market pressure to supply them with faster internet - maybe not up to the standards of the top billion, but not "low tech" either.
Just my 2¢
I can see more widespread adoption when people pay $2/mo for Twitter and $3/mo for Facebook.
You could hook up a raspberry pi to a solar system, weatherproof it, then deploy it in the wild. Add a wireless "router" of some sort, and then you have a mobile broadcasting station.
Imagine a world where such systems are very cheap to build. Also -- imagine that they could be "networked" if they are in range. Finally, imagine hobbyists building out that network for fun. Maybe in neighborhoods, public parks, etc.
Imagine the technology greatly improves and gets even cheaper. These are plastered everywhere, by everyone. They can transmit and receive data very quickly, over reasonably far distances.
Maybe it becomes an off-grid anarchist internet. :)
1.4MB (most are thumbnails) and kept to stock bootstrap 3
1. If there were a way to show others: "I'm saving energy while browsing the web"
2. People using those sites could use this as a tool for telling themselves a story about themselves
3. Site builders would choose this for similar reasons
Maybe the user community that forms around this will be interesting, which would make it worthwhile to join for the sake of the people and the content.
(In this case, the tangible benefits are highly debatable, especially since we moved on from CRT monitors - although I'd be interested to see data from OLED displays)
> Can anyone here think of [...] how [...] this experiment could [be] squeezed into a form [...] as the next big [...] thing?
Right now we only think in terms of wi-fi routers and phones, but when 4 out of 5 useless gadgets (oh boy! internet-enabled shampoo bottles that also can advertise hair gel! yaayy!) have a battery and 5G internet, an RSS/JSON/XML feed of their status will be abundantly relevant.
To... someone. Probably not me, but I'm sure someone will want this.
1) Reducing website bandwidth and computational usage (static sites, small images, reducing extra resource requests). Super common topic that HN talks about all the time.
2) Reducing the electricity bill for the server and powering it with renewable resources. This part is quite neat, I think, but for a single bespoke website, the energy savings of using a single low-power server seems pretty trivial in the grand scheme of things — I could get similar savings by changing a living room lamp to a more energy efficient bulb.
This inspires me to propose a third category of energy efficiency that the article doesn't touch on: human energy costs.
Humans are quite energy intensive devices, and I suspect that the labor and human-maintenance costs of making this website dwarf the direct energy costs of running a webserver. I'd be curious how one would go about optimizing these other energy costs, and whether that can be done in a way that still preserves quality of work/life for the humans involved.
It took me time to work out where to start... Once you think it through you realise it is pretty easy and obvious. But (and this gives away my age a bit), I have not built a static webpage since school (over 10 years ago), and have become so accustomed to all the crazy shit that goes into web development today like Webpack, NPM, React, Angular etc etc that I had forgotten how to actually just serve a piece of HTML.
I have found that static generators, specifically Hugo, have made plain HTML fun again. They give you just enough power to avoid repeating yourself endlessly (e.g. maintaining each page by hand and copy-pasting every change between them) without turning into the systems we're all trying to step back from.
Neocities? It's where I occasionally dump my static-can't-be-bothered sketches.
The effort required to produce a site like this would be negligible, given that it's almost entirely plain HTML. Dithering the images would take seconds in something like GIMP. The HTML looks to have been produced by a tool, based on the formatting, so they wouldn't need to spend time working with markup.
To illustrate, I'm going to invent some numbers:
Let's say that this site on "standard" hardware would use 1 kWh per day. This means the existing site saves ~950 Wh per day, looking at just the hardware's energy use.
However, the hardware it's running on had to be produced; let's say that costs 10 kWh. The solar panels and batteries need energy to make; let's say they cost another 30 kWh.
Additionally, the system needs to be designed and built by a human being. Humans use a lot of energy. Let's say this took 1 week to design and build, and that a human "uses" 50 kWh per day (e.g. from food production, etc.). That adds another 350 kWh to design and build the infrastructure.
Adding up all these made-up numbers gives a total infrastructure cost of 390 kWh. This means that, in order to "pay back" the difference, the new system has to operate for (390,000 Wh ÷ 950 Wh/day ≈) 410 days.
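The back-of-the-envelope arithmetic above can be checked in a few lines; every figure here is hypothetical, exactly as in the comment it illustrates:

```python
# Sanity check of the (explicitly invented) numbers above.
standard_wh_per_day = 1_000          # "standard" hardware: 1 kWh/day
solar_wh_per_day = 50                # the low-power setup's daily use
savings_wh_per_day = standard_wh_per_day - solar_wh_per_day   # 950 Wh/day

hardware_wh = 10_000                 # embodied energy: server
panels_batteries_wh = 30_000         # embodied energy: panels + batteries
human_wh = 7 * 50_000                # one week of "human power"

infrastructure_wh = hardware_wh + panels_batteries_wh + human_wh  # 390 kWh
payback_days = infrastructure_wh / savings_wh_per_day
print(int(payback_days))             # -> 410
```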
These numbers are wildly inaccurate and cut a lot of corners, but this should give an idea as to what the grandparent was getting at.
Sidenote: I'm not, personally, trying to make a point about the project itself. Just, in order to fully understand the energy impact you have, you need to take a lot more into account than just "the amount of work performed by the computer".
The page was also framework piled on top of framework, so they'd be using React and jQuery at the same time, stuff like that. And so that marketing could change whatever they wanted, whenever they wanted, without breaking the entire site, everything was served from a CMS.
So it’s probably a lot of that.
At a previous gig we had a similar situation. It was so bad my coworkers and I would joke that we needed one of those "break in case of emergency" boxes with the glass front and a hammer hanging adjacent. Inside the box: a can of lighter fluid and a match. The purpose: when the crazy got to be too much, you just light your own hair on fire and then go run around in traffic.
What you just described would definitely be a break the glass kinda situation :-)
Before going static as well, I made a WP theme with Twig, removing most <?php wp_magic_function(); ?> calls. It was way faster than the usual WP themes, though obviously still nowhere close to static HTML.
That's probably it. Some beginner has written an N+1 query and now the page is slow as shit.
The example image given can be re-compressed using techniques like structural similarity (SSIM) or Multi-scale structural similarity (2003 Paper). Using jpeg-archive I resized the example image from the blog with decent results:
ls -1sh *.jpg *.png | sort -nr
Another proposal for image compression is to fall back on SVG, c.f. https://jmperezperez.com/svg-placeholders/ -- at least this does not need JS.
Incidentally, even where PNG is appropriate (images with crisp edges, such as text and icons), you can frequently get large size savings by applying zopflipng, which uses an alternative DEFLATE compressor to achieve higher-efficiency compression at the cost of much greater CPU use for encoding.
If you take this image, it's 43 KB for an 800x533 image. I'd expect a JPEG to be able to do that without too much in the way of artifacts.
It does give the page a kind of neat newspapery vibe, though.
For example, looking at their response headers, I see the image cache control expires after a day, plus has validation caching. This could be significantly improved. First, get rid of the `etag`, `expires`, and `last-modified` headers and go with expiration caching only. Increase the `max-age` from one day (86400) to one year (31536000). I get the HTML expiring after 24 hours, but images should be cached more aggressively. Validation caching always requires a round trip to the server, plus running validation rules on every single request, so it can't save nearly as much bandwidth and CPU as simple expiration caching.
Also, if they got rid of those headers, plus the pointless `server: nginx/1.10.3`, they could save over 100 bytes per HTTP response. I counted 19 requests on that article, so it may seem small, but it can add up to a significant amount.
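For the nginx-curious, here's a rough sketch of what those changes might look like in a config. This is purely illustrative; I have no idea how the magazine's server is actually set up, and the file-extension list is just an example:

```nginx
# Long-lived expiration caching for images, no validators.
location ~* \.(png|jpe?g|gif|svg)$ {
    expires 1y;                          # Cache-Control: max-age=31536000
    add_header Cache-Control "public, immutable";
    etag off;                            # drop the ETag validator
    if_modified_since off;               # skip If-Modified-Since revalidation
}

server_tokens off;                       # hide the nginx version in "Server:"
```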
However, since we were still working on the site at the time it got massively popular, we changed the caching settings in order for people to see the changes and fixes that we were pushing.
I'll look into those server headers as well!
Since it's a statically generated website, I'd favor never expiring the cache, and using cache-busting hashes instead. That way the cache is only emptied when the file actually changes.
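A minimal sketch of build-time cache busting (the function name and the idea of copying rather than renaming are my own choices, not anyone's actual build script): embed a short content hash in each asset's filename, so the URL, and therefore the cache entry, only changes when the file does.

```python
import hashlib
import pathlib
import shutil

def fingerprint(path: str) -> str:
    """Copy an asset to a content-hashed name and return that name."""
    p = pathlib.Path(path)
    digest = hashlib.sha1(p.read_bytes()).hexdigest()[:8]
    busted = p.with_name(f"{p.stem}.{digest}{p.suffix}")
    shutil.copyfile(p, busted)   # e.g. style.css -> style.3a91bc02.css
    return busted.name           # write this name into the generated HTML
```

The generator then serves the hashed names with a far-future `max-age`; unchanged files keep their old URLs, so caches never need to be invalidated by hand.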
I do agree that the consistency is nice for aesthetic reasons, but poorly rendered images do detract from the overall message.
I tried testing some variants and made a static HTML gallery on Neocities. TL;DR:
- cleaning up the photo before filtering it makes a huge difference in image quality and compression.
- quality-80 JPEG is 86.6 KiB in color, 70.9 KiB in grayscale.
- quality 40 (which is ugly but still has readable text) uses 44.8 KiB and 37.8 KiB respectively. The latter looks a lot better than the dithered image, IMO.
- similarly, the grayscale PNG with pushed shadows and positioned dithering has readable text and contains significantly more detail at 4 colors and 38.7 KiB than the 16-color, 42.4 KiB image on their server.
That does somewhat depend on whether we are talking about positioned or Floyd-Steinberg dithering. Positioned dithering is a lot more regular and predictable, and hence easier to compress.
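To make the regularity concrete, here's a toy ordered-dither in pure Python (a 2x2 Bayer matrix; real tools use larger matrices, and none of this is anyone's actual pipeline). Because each output pixel depends only on its coordinates and value, flat areas turn into perfectly periodic patterns, which DEFLATE-style coders compress far better than the scattered noise error diffusion produces:

```python
BAYER_2X2 = [[0, 2],
             [3, 1]]  # thresholds out of 4

def ordered_dither(gray, width):
    """gray: flat row-major list of 0-255 values. Returns 0/1 bits."""
    bits = []
    for i, v in enumerate(gray):
        x, y = i % width, i // width
        threshold = (BAYER_2X2[y % 2][x % 2] + 0.5) / 4 * 255
        bits.append(1 if v > threshold else 0)
    return bits

# A flat mid-gray patch becomes a strictly periodic pattern:
print(ordered_dither([128] * 16, 4))
# -> [1, 0, 1, 0, 0, 1, 0, 1, 1, 0, 1, 0, 0, 1, 0, 1]
```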
EDIT: I just pulled their RSS feed: 50 articles currently in rotation! 32.42 kB according to Firefox, for just snippets of text and links!
^ seems to have full articles.
This isn't very useful if you want your RSS reader to pull down articles for you to read while you're offline. For that to work, you want the feed itself to have the full text of every article embedded inside it.
My suggestion is to get a larger solar panel. Solar is cheap now, less than US$1 per watt. Why not get a 100W or a 200W panel? It will put out enough power to supply the server and router on a rainy day. The battery will only be necessary overnight.
The price of 18650s is low enough that building your own pack to handle 30 W continuous overnight is not costly. To help with environmental impact and cost, you can get used cells that are being recycled or thrown away.
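Rough sizing for such a pack (all numbers are illustrative assumptions, not measurements of the actual server or of any particular cell):

```python
# Back-of-the-envelope 18650 pack sizing for an overnight load.
load_w = 30                 # assumed continuous draw: server + router
night_hours = 12            # assumed worst-case time on battery
wh_per_cell = 3.6 * 2.6     # ~9.4 Wh: nominal 3.6 V x assumed 2600 mAh
usable_fraction = 0.8       # don't fully discharge; extends cell life

needed_wh = load_w * night_hours / usable_fraction   # 450 Wh
cells = -(-needed_wh // wh_per_cell)                 # ceiling division
print(int(cells))                                    # -> 49
```

So on these assumptions it's on the order of fifty cells, which is modest as DIY packs go.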
This should be an obvious choice for sites not requiring dynamic elements. Also their choice of comments through email is a good solution to a problem static-made blogs have.
That won't work for every case, but is interesting.
That's nice. I already disallow sites from choosing their own typeface. Unless a site is about design, the typeface doesn't matter, and even then I probably won't prefer their choice over mine.
>No Third-Party Tracking
This makes sense, and I don't understand why self-hosted sites use third-party trackers at all. The logs are already there; analyzing them is all that's needed.
>No Advertising Services
I'm fine with that, but some sites rely on ads. I ain't aware of a method that would both keep with their goals and provide ads.
1) Laziness. A third-party JS tracker usually comes with a complete dashboard, full of pretty (and sometimes even useful) graphs.
2) Data. Client-side trackers can spy on users more, giving you more information you can e.g. misapply in an A/B test trying to drive "engagement".
RE 1, there exist tools aimed at analyzing server logs. I played with GoAccess a bit, it's quite OK. https://goaccess.io/
Lazy-loaded 1px images can be used to track how far users have read, and A/B testing can be done by compiling multiple versions and redirecting users to the version you want them to try.
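A hypothetical sketch of the pixel trick (the path and names are made up): with native lazy loading, the per-section image is only requested once the reader scrolls it into view, so the plain server access log doubles as a scroll-depth log, no JS tracker needed.

```html
<!-- One per section; fetched only when scrolled into view. -->
<h2>Section 3</h2>
<p>…article text…</p>
<img src="/px/article-42/section-3.gif" loading="lazy"
     width="1" height="1" alt="" aria-hidden="true">
```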
Matomo looks really polished and if it provides the features you need, it seems like an obvious choice. If OTOH you're looking at rolling your own with ELK, Gravwell might make sense.
This is one I find interesting. I doubt they want to manually go through every email. So some semi-automation sounds sensible too. What would be an elegant solution to that?
But the end of article stuck out at me.
> we'll be offering print on demand versions of our blog.
How is that consistent with low energy footprint? Will the printers and the cellulose factory run on solar as well?
But the question is indeed an intriguing one.
Whether you're doing it for energy purposes or not, it's a fun project because it feels a bit like sticking it to the man, since we're used to always hosting our sites on someone else's machine.
It worked great! I eventually migrated to a VPS when the hardware started to get too flaky with age (and years of accumulated dust sucked through the fan inlet)
Forecast: today Partly cloudy in the morning and breezy until afternoon. / tomorrow Partly cloudy until afternoon. / day after tomorrow Clear throughout the day.
For that, they pulled in jQuery and made an XMLHttpRequest.
If their intent is to lower the energy consumption of their server alone, offloading this weather check is the correct choice. But surely requesting and caching that info on their end (by, perhaps, rebuilding the static site every hour with an updated forecast) would be more efficient overall than having X number of clients request the data.
Surely rebuilding the website every hour (and invalidating the cache for X number of clients) would result in more requests.
The dithering is obviously not to everyone's tastes, but I like the paper feel and think it's appropriate for a 'low tech' brand. Bravo to the lowtech team.
That being said, the point of the project is to produce a website that is 100% solar powered. They never said they were aiming for something that is feasibly sustainable.
The hardware choice is arguably overkill as well, at least if wifi is available and in use. Run it on a repurposed old phone, eliminate its share of the power draw of a network switch, etc.
I adore this, whatever low or high tech it is. As for me, all this web-font stuff seems like a huge pile of BS (except in rare cases: when a website is meant to be more a piece of art than to serve a practical function, or when you need the letters of an exotic language that are rarely found in the standard fonts on clients' computers), and I just block external fonts all the time.
I had someone complain to me that my mailto: link didn't work, but actually it was just that they had the wrong mail client set as the default on their system.
> Increasingly, activities that could perfectly happen off-line – such as writing a document, filling in a spreadsheet, or storing data – are now requiring continuous network access. This does not combine well with renewable energy sources such as wind and solar power, which are not always available.
I'm surprised no one has mentioned progressive web apps. This seems like a perfect use-case for them.
On a related note, I just found this research showing that Service Workers (on which PWAs are built) "do not have a significant impact over the energy consumption of [mobile] devices, regardless of the network conditions." 
It's the second place where I've seen such low average size values (I'd say, somewhat reasonable ones), but I'm having a hard time believing them. On a typical site, when I hit F12 and open the Network tab, I usually see values ten times greater. I fear that those statistics are either undercounting something, or that among 500k domains the long tail of lightweight sites drags the average down, making the values unrepresentative of the real situation. I wonder how those stats would look if the average were weighted by the number of visits.
I'd really hoped it would catch on.
Sadly, that lightweight concept is now so forgotten that I can't even find it in a quick search. (perhaps someone else can find a pointer?)
Seems I'd misremembered it as being more recent than it was; the last one was in 2002. They apparently 'grew' to 10K, and that site is now a 404 as well.
At this point, 50K would be a welcome respite from the mountains of cruft that download with every click...
I will build your static html page for money. Hit me up. ;)
I don't have a cellphone myself, so I never tested it.
I'm just saying the low tech and mobile compatibility aren't mutually exclusive.
That's a low-tech website. Absolutely nothing but content.
The design team of solar.lowtechmagazine.com is squarely on the side of bettermotherfuckingwebsite.com.
From there you can heavily reduce the color count and select the dithering type.
Authors probably deploy by pushing to a git repo or ssh-ing into the server.
I would be happy to have a distraction-free device to focus on my code. But I am not sure the screen is large enough to really work on it.