A solar-powered, self-hosted version of Low-Tech Magazine (lowtechmagazine.com)
235 points by hemmert on March 16, 2019 | hide | past | favorite | 66 comments

The author is concerned about the embodied energy cost of a larger battery, so has a smallish battery which only lasts about a day.

For my small solar-powered installations I generally use a lead-acid battery with zero embodied energy cost. When I pull a starter battery out of a vehicle, instead of immediately recycling it I use it as a storage battery for one of my remote solar installations, sometimes adding it in parallel to an existing one. They can then serve for years as a bad, but useful, low-current storage battery. Eventually something horrific¹ happens or a cell goes completely² and I recycle the battery.

So the lead and the sulphur spend a few years longer between reincarnations as batteries, but the embodied energy is zero for my use.

¹ A camera/weather sensor station went offline after an unusually strong storm from the south. I suspect unprecedented wave erosion dropped it off the embankment. When the site becomes accessible again I'll probably find it in a heap at the shoreline with ruined batteries from freezing. (Sometimes I lose the sulphur, but there is a world sulphur glut so I don't feel too bad.) Other causes of demise are failed solar chargers or stuck-on loads, usually from wiring failures. (darned rodents)

² I'd like to make a lead-acid charge controller which can detect a shorted cell and just call it a 10V battery instead of a loss; that would extend their lives. There is DC/DC conversion already happening at charge and discharge, so it shouldn't hurt efficiency significantly.
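
For anyone curious, the fallback logic is easy to sketch in software. This is just an illustration of the idea; every threshold below is an assumption, not a measured value:

```python
# Rough sketch: infer the usable cell count of a nominal 12 V lead-acid
# battery from its resting (open-circuit) voltage, so a charger could
# fall back to "10 V mode" when one cell shorts. The per-cell voltages
# and tolerance here are assumptions, not measured values.

NOMINAL_CELL_V = 2.1   # resting voltage of a healthy lead-acid cell
TOLERANCE = 0.25       # allowed per-cell deviation (state-of-charge spread)

def usable_cells(resting_voltage, max_cells=6):
    """Return how many cells the battery appears to have left."""
    for cells in range(max_cells, 0, -1):
        if resting_voltage >= cells * (NOMINAL_CELL_V - TOLERANCE):
            return cells
    return 0

def charge_setpoint(resting_voltage, per_cell_charge_v=2.45):
    """Absorption-charge voltage scaled to the surviving cells."""
    return usable_cells(resting_voltage) * per_cell_charge_v

# A healthy 12 V battery rests around 12.6 V -> 6 cells; a shorted
# cell drops it to ~10.5 V -> treat it as a 5-cell (10 V) pack.
```

A real controller would also want hysteresis and a sanity check under load, but the core idea is just scaling the charge setpoint to the surviving cells.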

Thanks for sharing, do you have any pointers for someone who would like to know more about building a renewable energy rig?

Inspiring! The page looks great and it brings me back to the 1990s web (in a good way).

Although I gotta say, I think the criticism of 100% uptime is an ideological position, letting the perfect be the enemy of the good. In this case having their website always up would give it wider reach, and presumably convince more people to reduce their energy usage, which is the goal. But I can appreciate those who practice what they preach.

If the authors are here, you can further optimize your page.

For images, I was able to reduce image weight by 494 bytes using lossless optimization with optipng.

For the HTML, using zopfli to compress the page, I squeezed out another 3020 bytes. And although I didn't fix it up, the HTML includes unnecessary pre-HTML5 remnants like explicit closing P tags, lots of XHTML-style end slashes, etc. I'd bet you could save a few KB (before compression) by removing them.

In all that's 3514 bytes you don't need to read from disk, cache, encrypt, and send over the network using your 1-2.5W webserver.
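
If anyone wants to estimate this kind of saving themselves: Python's stdlib zlib emits the same deflate format the server sends, and zopfli typically shaves a few more percent off the level-9 result. A quick sketch:

```python
# Quick estimate of how much gzip-style compression shaves off a page.
# zlib at level 9 approximates what the webserver sends; zopfli, which
# emits the same deflate format, usually does a few percent better.
import zlib

def gzip_savings(html: bytes):
    """Return (raw_size, compressed_size) for a page body."""
    compressed = zlib.compress(html, 9)
    return len(html), len(compressed)

# Repetitive markup compresses very well; savings of 70-90% are common.
page = b"<p>repetitive markup compresses well</p>" * 100
raw, packed = gzip_savings(page)
```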

For cryptography, you can save some cycles by using ECDSA instead of RSA, and by prioritizing aes-128-gcm over aes-256-gcm. And what about ChaCha? To me it seems logically inconsistent to spend extra cycles protecting this read-only, low-energy site powered exclusively by solar. 128 bits are enough; in practice you only need to worry about side channels, not key length.
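
For the cipher preference, an nginx sketch is only a few lines. The suite names below depend on your OpenSSL build, and TLS 1.3 negotiates its own suites separately from ssl_ciphers, so treat this as an assumption-laden example rather than a drop-in config:

```nginx
# Prefer an ECDSA certificate with AES-128-GCM; keep ChaCha20-Poly1305
# for clients without AES hardware acceleration. Suite availability
# depends on the OpenSSL build; verify with `openssl ciphers`.
ssl_protocols TLSv1.2 TLSv1.3;
ssl_ciphers ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-ECDSA-CHACHA20-POLY1305;
ssl_prefer_server_ciphers on;
```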

Just as an aside, on the topic of webpage optimization, here are some of my favorite tools: zopfli, pngquant, optipng, jhead, cwebp.

And here are some excellent resources:

  * https://webpagetest.org/
  * https://webspeedtest.cloudinary.com/
  * https://developers.google.com/speed/pagespeed/insights/
Then when you're done, check for html errors with

  * https://html5.validator.nu/
  * https://validator.w3.org/

Your optimization recommendations made me smile because they reminded me of the demoscene (I mean this: https://en.wikipedia.org/wiki/Demoscene) => are there websites, analogous to those "demos", that manage to present a lot while using as few server & network resources as possible?

(I did try to search but the query is tricky => I keep finding websites targeted at Amiga & C64 & similar stuff, but nothing about a "demo about the website itself"...)

> Although I gotta say, I think the criticism of 100% uptime is an ideological position, letting the perfect be the enemy of the good. In this case having their website always up

The web site is a mirror of a conventionally-hosted web site: https://www.lowtechmagazine.com/

I see a lot of links to https://www.lowtechmagazine.com/ on hacker news, and there's always a link in the corner of the screen to the solar server.

Plus, as a magazine, downtime incurs an SEO penalty.

> Only one weight (regular) of a font is used, demonstrating that content hierarchy can be communicated without loading multiple typefaces and weights.

This point has nothing to do with energy use (or very, very little) but is a really great graphic design tip for those looking to make a site more pleasant to browse.

The other weight that's not being used in favor of regular is "bold". Instead of bolding, the font size attribute is changed to establish hierarchy. Faux small caps are also used.

From the CSS:

  h1, h2, h3, h4, h5, h6 { font-weight: normal; }

A great graphic design tip for simple websites is stop using additional files. If you can't do it with a single HTML file, you're adding unnecessary complexity.

Any food beyond soylent is just unnecessary complexity. Any books beyond reference manuals are just unnecessary complexity. Any interior colors on your walls other than white is just unnecessary complexity. Any music beyond Bach is just unnecessary complexity...

Maybe people enjoy aesthetics other than minimalist? Maybe some people enjoy complex things?

Bach is, quite literally, famous for being baroque, not minimalist.

Soylent is way more complicated than potatoes.

Poor, overly generalized statement there. The point is the importance of identifying and addressing complexity as an "I want", not an "I need". Accept it or abhor it, but first recognize it as a complexity.

Both the parent and gp mentioned graphic design. I think you can make the case for designing something simply, over and above aesthetics.

I hope we can agree that adding unnecessary complexity to software is bad, and that simplicity should be something to strive for.

Am not saying graphic design is nearly as (inherently) complex as computer engineering, but neither benefits from complicating things that don't need to be complicated.

You provide a good example with your reference manual: the diagrams will be clear, the fonts readable, and they won't have jazzy designs, etc.

You must mean someone other than J.S. Bach. Minimalism in music is a 20th-century phenomenon.

The thread is about a solar powered site and energy use.

It's more complex than that. From a technical standpoint, for many browsers you probably want a few files but not too many, because the browser will download several files at the same time. Small files do have a bit of overhead per request, so it all depends. Of course I've oversimplified the matter myself.

To me, additional files don't always mean additional complexity. It's the way files depend on each other and how the responsibilities of certain files are expressed... Sometimes additional files can make the site's behaviour clearer and simpler.

Regarding image compression: https://homebrewserver.club/low-tech-website-howto.html#imag...

The original is 160KB. When I save it in GIMP, the re-encoding (at 90% quality by default) gives me a baseline of 164KB. If I turn the quality down to 50%, it looks almost identical but is only 72KB, less than half the size. I kept turning it down for as long as I could still make out more detail than in the dithered image (for example, I can see that the operator on the bottom left is looking at something, whereas in the dithered image I originally thought her face was a headphone and she was facing away from the camera); at 7%, that limit is finally reached. At 6%, the quality is no longer better than the dithered image. So at 7%, what's the size? 20.9KB, much less than the dithered image that is claimed to be 36.5KB.

I'm not sure why I double checked, but I did. You know what? The dithered image is not 36.5KB as the article claims, but a 76KB PNG. Which is impressive because with PNG, I can't get it below 140KB in GIMP. JPG and GIF also can't reduce it beyond 76KB. Their PNG encoder is much better than GIMP's, but I don't know where the 36.5KB claim comes from. What I do know is that JPG can get it to 72KB with virtually no quality loss, or 21KB at a quality still slightly higher than the dithered version. All you have to do is change the quality setting, an option available in almost every editor (no fancy tools or skills needed).

Dithering seems to have been chosen just for its 1950s look, not for actual power saving.

> You know what? The dithered image is not 36.5KB as the article claims, but a 76KB PNG.

Incorrect, it is 36.5K:

  $ wget https://homebrewserver.club/images/international-switchboard3.png
  --2019-03-16 12:41:11--  https://homebrewserver.club/images/international-switchboard3.png
  Resolving homebrewserver.club (homebrewserver.club)...
  Connecting to homebrewserver.club (homebrewserver.club)||:443... connected.
  HTTP request sent, awaiting response... 200 OK
  Length: 37359 (36K) [image/png]
  Saving to: ‘international-switchboard3.png’

  international-switc 100%[===================>]  36.48K  26.4KB/s    in 1.4s    

  2019-03-16 12:41:14 (26.4 KB/s) - ‘international-switchboard3.png’ saved [37359/37359]

When you tried reproducing the results using GIMP, were you creating similarly constrained indexed-color/paletted images?

You likely won't get the same compression levels with rgb pixels, you can use `pnginfo` to compare the file properties:

  $ pnginfo international-switchboard3.png
    Image Width: 800 Image Length: 655
    Bitdepth (Bits/Sample): 2
    Channels (Samples/Pixel): 1
    Pixel depth (Pixel Depth): 2
    Colour Type (Photometric Interpretation): PALETTED COLOUR (4 colours, 0 transparent) 
    Image filter: Single row per byte filter 
    Interlacing: No interlacing 
    Compression Scheme: Deflate method 8, 32k window
    Resolution: -547765670, 22093 (Unknown value for unit stored)
    FillOrder: msb-to-lsb
    Byte Order: Network (Big Endian)
    Number of text strings: 0
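
A quick back-of-the-envelope shows why the paletted encoding wins: deflate starts from a far smaller raw payload. The arithmetic below just uses the dimensions and bit depth from the pnginfo output above:

```python
# Raw pixel data for the 800x655 image above, before deflate even runs:
# 2-bit paletted (4 colours) vs 24-bit RGB (8 bits/channel).
width, height = 800, 655

paletted_bits = width * height * 2    # 4-colour palette, 2 bits/pixel
rgb_bits = width * height * 24        # full-colour, 8 bits/channel

paletted_kb = paletted_bits / 8 / 1024
rgb_kb = rgb_bits / 8 / 1024
# ~128 KB raw for paletted vs ~1535 KB for RGB: deflate starts from a
# payload twelve times smaller, which is why an RGB export from GIMP
# can't come close to the 36.5 KB file.
```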

Huh that's weird, my Firefox definitely reported that. I also checked the other image, that size matched perfectly with the claim in the text. My first thought is http-level compression, but that shouldn't be that effective on an average image.

(I'm not at my computer anymore so I can't check what you suggested right now.)

What I find really frustrating about this is that the whole premise is supposed to be about how using older technology can be more environmentally friendly, yet they used a newer codec that simply wasn't appropriate to the task they threw at it to give the impression of "old" while wasting more resources.

And to throw one final nail in the coffin, if they used a modern, but reasonably mature codec like VP8, they could have done even better on resource usage while retaining the same quality.

If you want to build something to demonstrate that using older technology has advantages, that's fine. But you should make sure that you're actually demonstrating what you intended. And I'd pick something other than energy efficiency to target, because the last decade has seen a ton of technological progress improving energy efficiency because of its importance to mobile devices running on tiny batteries.

Just a query: what about the power saving on the users' machines from easier decompression?

I posted this elsewhere in the thread, but JPEG seems to be significantly easier to decompress than PNG. Probably mostly due to hardware being optimized for it, but it might also have to do with the way the encoding works. Decompressing JPEGs mostly relies on doing some relatively fixed operations on localized 8x8 tiles, while PNG uses DEFLATE with a window size of up to 32KB.

All I noticed is that my browser scrolled less smoothly when an image was in full view. Didn't try with the jpegs but it didn't seem particularly efficient. You could be right though, this is a subjective n=1 kind of remark.

I'm confused by the choice of image format. Why use dithered grayscale PNGs for these photos when a full-color JPEG at the same bit rate looks a hundred times better? Are they just sacrificing quality for the sake of a low-tech aesthetic, without regard for actual efficiency?

It seems to be purely about aesthetics and ridicules other things said on the page. JPEG would not only be quicker to decompress but could also be much smaller for a similar "geeky" look.

The image aesthetic is emulating the appearance of a dot-matrix printer. It's just a low-tech motif to go with the publication.

That would be all well and good, but the page specifically claims that using dithered images lowers energy use.

They may have run metrics on decoding processing time, née energy usage.

I doubt it. The only study I could find on energy usage in image decoding was this


which found JPEG decoding to be much less energy intensive than PNG decoding, mostly because it's faster. Note that that's not just a function of file size, as lower quality JPEGs actually took more energy to decode.

Which all makes a lot of sense when you consider how much work has gone into optimizing frequency domain based compression and decompression.

There was a comment section with a lot of discussion of this, but it doesn't appear in the "past" link.


Just for reference, their normally powered website is also available (I prefer it, despite the solar novelty):


And likewise contains some amazing content.

Whoa the redesign is beautiful compared to this!

In contrast to the forcing function of mobile and its spatial constraints, the additional freedom desktop UX provides is simply filled up with sidebars and ads.

Designers just don't know what to do with the extra space.

Reading any news article at a browser width of greater than 720px is a disaster. We've forgotten how to design for the desktop.

There's a more technical version of the original Low Tech article available[1]. I've been tracking this project for a while and found it really inspiring.

I've been starting to build my own systems based upon the Raspberry Pi 2/3 and the ODROID-C2, and currently use the ODROID as an ultra-low-power server. Most of my documentation and configuration so far has been in a private repo, but with this much interest I made it public[2] in hopes that others would be willing to issue PRs and work on it with me. If you're interested in this type of optimization definitely contact me, I'd love to work on this stuff more with environmentally-interested people.

[1] https://homebrewserver.club/low-tech-website-howto.html

[2] https://github.com/rarecoil/ecoserve

>We were told that the Internet would “dematerialise” society and decrease energy use. Contrary to this projection, it has become a large and rapidly growing consumer of energy itself. According to the latest estimates, the entire network already consumes 10% of global electricity production, with data traffic doubling roughly every two years.

That's not a complete statement unless you also include the energy use associated with the paper processing that no longer happens, along with controlling for the massive increase in the number of people with access to computation. If all the communication that presently happens over the internet were redirected to other forms, energy use would probably not change that much; after all, the telecoms put it all on the same backbone anyway.

"paper processing that no longer happens"

I wish. I've yet to see an actual paperless office. And paperless bills have just been replaced by the marketing they send you instead.

Unless you're talking about newspapers, you may have a point there.

I haven't touched office paper since I signed my contract. The fully paperless office may be a myth, but there's still plenty of paper processing that no longer happens.

As for the marketing, you can refuse that use of your data, at least if you live under the aegis of the GDPR.

Battery, weather and panel status:


What I find interesting about this is that I need to read either entirely in the yellow or entirely in the blue part. For some reason I can't read where the divide is.

I think that you're kind of right - I feel that some part of my brain treats the 2 colors as 2 different chapters.

It would also be interesting to add wind (a wind-speed sensor?) and see how/when solar and wind are active over some timespan :)

I dunno, feels outside the scope of the point of the site (and the point of the solar meter on the site). There are already millions of weather stations all over the world.

I’m not the person you replied to, but I don’t think they want to monitor wind for weather-related reasons. I think they are referring to monitoring wind power generation as related to solar power generation, to see how the wind compensates for the lack of sun, at night or on cloudy/stormy days

Right, that's what I meant - thx for expanding/explaining my post :)

> Why does it go off-line?

> How often will it be off-line?

> When is the best time to visit?

I'd use a Service Worker [1] to go offline-first: content gets cached on the client and keeps being served automatically when the server is down, as long as you don't edit it too often.

Anyway, interesting extreme project.

[1] https://developer.mozilla.org/en-US/docs/Web/API/Service_Wor...

> DESIGN SOFTWARE: The website is built with Pelican, a static site generator. We have released the source code for ‘solar’, the Pelican theme we developed here.

Did they share the article sources too? I didn't see the article markdown/reST files in their lowtechmag GitHub account.

Also, it'd be cool to share a torrent of the sources for when people's access to github goes down. There's so much you can do when you have software like pelican installed, if you have a site's theme, content, and config files.

Low power and very pleasing to read. The Web could be a nicer place if more websites followed this design.

I personally found the "email to comment" charming.

This is a low-tech website:


It's also more readable than whatever this is.

It may have a low footprint, but there’s a lot of high-tech involved to make a solar powered website work.

I think the whole website is about appreciating what works with fewer resources (and might ultimately be much simpler). And to achieve this, you don't necessarily have to do away with tech (as most people scared by the idea of less growth typically assume, before starting the usual BS: do you want to go back to hunting..). I profoundly dislike the idea of using brute-force subsampling to save on data, though (especially for diagrams): if you have information to convey, do it clearly and so that it lasts; otherwise skip it.

I've always wanted to make very low power micros to do simple sensor/comms things in fully autonomous remote locations, but I don't know how to make the solar requirement economical, compared with the cost of the micro.

Is there a _cheap_ way to add solar to very low power boards?

AliExpress has lots of cheap panels and 5V solar controllers. I suppose the trick is finding one with a credible 3rd party review/how-to.

Search for "solar panel usb", you can find 2W models for <$5, or 10W for ~$12. Add a powerbank and you're set.
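
To gauge whether a 2W panel is enough, a rough daily energy budget helps. Every number below is an assumption to replace with your own measurements:

```python
# Rough daily solar budget for a small sensor node. All figures here
# are assumptions for illustration, not measurements.
panel_watts = 2.0          # cheap "solar panel usb" module
sun_hours_per_day = 3.0    # pessimistic full-sun-equivalent hours
harvest_efficiency = 0.7   # charge controller + battery losses

daily_in_wh = panel_watts * sun_hours_per_day * harvest_efficiency

load_avg_watts = 0.1       # microcontroller sleeping most of the time
daily_out_wh = load_avg_watts * 24

surplus_wh = daily_in_wh - daily_out_wh
# ~4.2 Wh in vs 2.4 Wh out leaves headroom for a run of cloudy days,
# provided the powerbank can buffer a few days of load.
```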

You even get cheap powerbanks with solar integrated.

I would like to do something like this with ESP8266 chips in a distributed solar-powered website. If you snag free WiFi you lose the power-hungry router, but you're just offloading it onto the local Starbucks or other shop with free WiFi.

Seeing the battery status is pretty cool. I wouldn't do this, but it tempts me to come up with ways to run it down faster just to test it out. Anyone else think the same thing?

There's a CPU load average and some other stats here: https://solar.lowtechmagazine.com/power.html

Running the Qualys SSL server test against the site does appear to raise the 15-minute load average by several points, though it's hard to say whether just being on HN is doing that by itself.

I assume it would have to be pretty cloudy to affect the charge level.

This is a cool, geeky project and I'm not trying to bust on it or be snarky, but I have to wonder how many continuous watts of power draw it would add to the world to just host the site on an AWS EC2 nano instance.

I think it's kind of pointless.

Humanity uses more kWh-equivalent of energy in 1 second of an airline flight than everyone who ever has viewed, or ever will view, this website. Shit, just takeoff is something like the equivalent of 90 MW or so worth of electricity: https://aviation.stackexchange.com/questions/19569/how-many-...

I think society would be better served by engineers learning and working on new technologies to extract and use energy, instead of pushing bits around on a website. Why not get a degree in Nuclear Engineering, or Oil and Gas Engineering? It would certainly benefit society more than this.

It’s a static site, so perhaps S3 (or any static cloud storage origin) and any CDN are a better comparison. What is the total power consumption of 1/nth of a large pool of cloud resources compared to the power draw of the router they are using in the office that is powered by the grid?

Regardless, I must say I love the “offline is ok”, “slow computing” (à la slow food), “get off my lawn”, “sustainable beats schmancy” attitude. Nice one!

At some level of popularity, it might use less power on a CDN, since you would avoid a lot of routers and links. The math on power savings for Netflix local ISP cached content might be interesting, even though that's not really the point.

This is super awesome, and this is the kind of thing that makes me long for an early retirement where I can tweak how I live.

Cool. I read a few articles. Definitely bookmarking and returning for more.

Could it benefit from using something like IPFS?

That sure is using a lot of recent modern, low power tech to “question the belief in technological progress”.
