How to Build a Low-Tech Website (lowtechmagazine.com)
600 points by RomanZolotarev 80 days ago | 183 comments



The dithering on images is actually kinda nice. Less visual over-stimulation.

But whatever this site is doing with the fixed-scroll yellow block is super weird. There's a "95%" thing next to a sun - is it somehow related to the solar battery powering the server?

Rather than focusing on the content, the design focuses on the infrastructure at the expense of the content. Maybe that's the point, but this particular element is certainly off-putting.


It's obviously subjective, but I really like the yellow-ish block background for the battery level. Lends the site a distinctive paper-like feel that reminds me of the Financial Times.


I also liked the yellow color. I've been finding myself using the evening feature ("Night Shift" on iOS) a lot lately.

The site was 95% solar, but the blue bar at the top was 100% distracting. I could barely read the content.


Yes, halfway into the article they explain that it’s precisely that, and what motivates them to make it visible to users.


  <div id="battery" style="height: 95%;">
    <div id="bat_data">
      <a href="https://solar.lowtechmagazine.com/weather/"><span id="sky"></span><span id="level">95%</span></a></div>
  </div>
Looks pretty much like the battery level and current weather conditions ;-)


Before starting to read the article, I tried clicking and dragging on that element and searched the page for "brightness" and "dim" before peeking at the source to realize it isn't a brightness control after all.


Well OK, this makes me wonder: if it truly is a static site, how are they injecting all this dynamic data? Or are they just faking it by re-generating the site periodically?

edit: nvm, just found the javascript that rejiggers the DOM based on data from https://solar.lowtechmagazine.com/api/stats.json
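On the "how", for anyone curious: a static site can still show live data if something on the server periodically rewrites a small JSON file next to the static files, which the page's JavaScript then fetches. A minimal sketch in Python; the field names are guesses for illustration, not the real stats.json schema:

```python
import json
import os
import tempfile

def write_stats(path, battery_percent, sky):
    """Dump current battery/weather readings for the static site to serve.

    "bat_level" and "sky" are hypothetical field names; the real
    stats.json served by solar.lowtechmagazine.com may differ.
    """
    stats = {"bat_level": battery_percent, "sky": sky}
    # Write to a temp file and rename, so a request arriving mid-write
    # never sees a truncated JSON file.
    fd, tmp = tempfile.mkstemp(dir=os.path.dirname(path) or ".")
    with os.fdopen(fd, "w") as f:
        json.dump(stats, f)
    os.replace(tmp, path)

# e.g. called from cron every few minutes with fresh sensor readings
write_stats("stats.json", 95, "clear")
```

The page's script then only needs a periodic fetch of that file to resize the battery element.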


I feel that the focus on infrastructure _is_ the content. It may be over the top (the dithering is especially BS), but without the battery indicator, how many people would discover that the server runs on sustainable solar power?


I agree with you on the yellow background battery meter, but not the dithered images. At desktop sizes, it reminds me of the low-res dithered hell of the 90s, except it's worse because it's effectively yellowscale. I hate image dithering with a burning passion. I bet the same images would look better as low-res JPEGs (they are currently PNGs, which the dithering noise makes less compressible).


It might be annoying if you're really trying to get a good look at something, but I think it is kind of a neat aesthetic. Some of the old PC-Engine/whatever dithered art looks great.


Yeah I thought that yellow block was an ad overlay or something at first. They should move it to the side so it doesn't affect reading the content.


> The dithering on images is actually kinda nice. Less visual over-stimulation.

Yes, but does it also consume less power, formatting the images this way?


Sure, as it reduces overall file size. They could have achieved similar efficiency by compressing full-color images, though.


Or they can do away with some of the useless images and save a lot more bytes.


> The main challenge was to reduce page size without making the website less attractive. Because images take up most of the bandwidth, it would be easy to obtain very small page sizes and lower energy use by eliminating images, reducing their number, or making them much smaller. However, visuals are an important part of Low-tech Magazine’s appeal, and the website would not be the same without them.


Yeah, as a random viewer, I really only need to know that the power is running out when the last bit of charge is nearly used up, at like 5% maybe.

I have no concept of whether even 10% means another hour or another year, and since it's load dependent, neither do they. If the power is at 75%, what do I care?

It really belongs on a status page, and as a banner, pinned to the top of the page. If anything, it motivates random people to slam the page with more hits, just so they can watch the meter move.

That was my second impression, once I realized what it was: I wanted to watch for a change. My first impression was that the styles were defective, or that it was a scripting error (until I read that extra fluff was eschewed), and I just wanted to find a way to make it get out of the way so I could read without the distraction, so I started looking for "dismiss" icons to close a broken pop-up.

Then I noticed the Sun icon and the percentage, and realized it was a meter.


I love the aesthetic! Great work.

Here's my unsolicited design riff on the "battery indicator." I'm a fan of the concept, but the 100vw element width of the yellow fill was distracting me from the rest of the content. I used DevTools to mock up a narrow version which fits within the right content gutter, right next to the scroll bar. I had to tweak the percentage metric display a bit by stacking the sun above the value and removing the "%" indicator, but feel the end result has increased readability and is still understandable.

https://imgur.com/a/IIPZILz


Lower tech access to the image:

https://i.imgur.com/o0Dtkai.png


I strongly agree that this design element makes it look broken. I checked my uMatrix to make sure it wasn't blocking any styles before realizing that it was part of the design. I had a similar thought about a vertical bar that doesn't take up 100vw. Nice mockup!


This looks great! That was my one complaint about the site, the meter looked like a broken element. I figured out what it was pretty quickly and I love the concept, but it just didn't look quite right.


Small sub-cultures, like this one, do experiments all the time in their little corner of the web. Some of the cultures die after a while, and some grow into something else, something bigger, something that starts to percolate on the mainstream surface, albeit possibly only tangential to how it started. I'm not sure I'd predict this movement will grow into something that reaches mainstream tech, but let's say I'm wrong, just for fun. Can anyone here think of a potential narrative that would describe how the current state of this experiment could possibly be perturbed and squeezed into a form that the mainstream soaks up as the next big (or medium-sized) thing?

I'm just curious to know how my prediction that this won't grow any larger than it is today could be wrong.


I think the article hits on the key problem:

"Less than 100% reliability is essential for the sustainability of an off-the-grid solar system."

IMO, this can only become mainstream once there is no longer an expectation of "100%" uptime for websites in general.

It's far-fetched, but I can imagine it becoming mainstream if mesh networking becomes popular, or if the Internet were to become a widely self-hosted platform (e.g. people had their own "clouds" running in their homes). In such situations, 100% uptime would no longer be an expectation, so the perceived cost of using an off-the-grid system would be reduced.


Having been in the “mesh” space for years, there’s a ton of this techno-primitivism fantasy floating around. The reality is that people want to access the internet and they want it to work well.

This article is a perfect example. This solar powered server which does not have 100% uptime for artistic reasons is probably sitting in someone’s kitchen next to a refrigerator which uses many times as much power and definitely has 100% uptime.


They seem to be working on that as well: https://solar.lowtechmagazine.com/2016/05/how-to-go-off-grid...

The solar webserver is another step towards that goal.


> refrigerator which uses many times as much power and definitely has 100% uptime.

Except that the making-cold mechanism of fridges has nowhere near a 100% duty cycle. You probably wouldn't notice if it didn't even meet two "nines" of uptime.

In fact, that's how frost-free freezers work: they take advantage of that fact to run a heater(!) for some number of minutes several times a day. A defrost timer that runs 10 minutes every 8 hours caps compressor uptime at about 98%.
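The arithmetic, as a quick sketch: 10 forced-off minutes in every 8-hour window already caps the compressor below two nines, before any normal thermostat cycling.

```python
defrost_min = 10          # heater runs, compressor forced off
window_min = 8 * 60       # once every 8 hours

max_uptime = 1 - defrost_min / window_min
print(f"{max_uptime:.1%}")  # 97.9%
```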


A refrigerator is not 99.999% available (power outages). Maybe 99.9% if you live somewhere without trees and snow.


Maybe the webserver sits in a hut like Joey Hess' with a fridge that is also solar powered https://joeyh.name/blog/entry/fridge_0.1/


You'd be able to get to normal levels of uptime by putting backup solar servers in e.g. the Sahara and northern Australia, then taking them out of your DNS rotation when they dropped below 5% battery, surely.

Though transmitting data that far might be against the point of the art project.


I think it will more likely come when the true cost of high energy consumption is factored into our decisions, and people are forced to trade off high availability for low power consumption.


>IMO, this can only become mainstream once there is no longer an expectation of "100%" uptime for websites in general.

I welcome that day.


Why can't you do some non-linear optimization with your resource routing? Power grids have been doing this for ages.


Could well not happen, but the most likely driver of slimmed down websites, anyway, would be a desire to make them load better in India, Africa, parts of Latin America and eastern Asia that have low connection speeds. Not saying that's how it will go, but it's feasible that to get to the second and third billion internet users, you need to be slimmer.


Smaller Web pages would be nice on my phone in Massachusetts when I've got spotty service, honestly.


But how are they going to sell you the product AND sell you as a product at the same time? WE NEED DATA!


It's still kind of a niche, though. People who don't live where decent internet access is readily available also tend to not have enough disposable wealth to make great business targets, especially for social media and other sites that sell eyeballs rather than widgets.

On the other hand, one of the first things people with disposable income want today is good internet access, so as income grows for the middle five billion (and it's growing rapidly), there's tremendous market pressure to supply them with faster internet - maybe not up to the standards of the top billion, but not "low tech" either.


While unlikely, I could see more widespread adoption of this concept if a large online company began using the approach. If Twitter, for example, decided to cut out all their bloat, simplify served graphics and assets, and invest in sustainable energy, I could see other high-profile internet companies following their lead.

Just my 2¢


They would be getting rid of their product, which is analyzing and selling their users.

I can see more widespread adoption when people pay $2/mo for Twitter and $3/mo for Facebook.


It seems to me that it would eventually be possible to build a relatively cheap system that can both soak up power and transmit/receive data without any external wiring.

You could hook up a raspberry pi to a solar system, weatherproof it, then deploy it in the wild. Add a wireless "router" of some sort, and then you have a mobile broadcasting station.

Imagine a world where such systems are very cheap to build. Also -- imagine that they could be "networked" if they are in range. Finally, imagine hobbyists building out that network for fun. Maybe in neighborhoods, public parks, etc.

Imagine the technology greatly improves and gets even cheaper. These are plastered everywhere, by everyone. They can transmit and receive data very quickly, over reasonably far distances.

Maybe it becomes an off-grid anarchist internet. :)


If you plaster random devices in public places I think you'll get something like this happening:

https://en.wikipedia.org/wiki/2007_Boston_Mooninite_panic


See "brutalist" websites. At the start it was very similar in spirit with this website: raw, functional, rejecting looks for something you appreciate in a different way. Now: ugly, full of js, heavy.


It's what I did with beatsaver: https://beatsaver.com

1.4MB (most are thumbnails) and kept to stock bootstrap 3


You made Beatsaver? Love your site, makes finding songs so much easier!


Maybe there's a niche there for environmentalists:

1. If there was a way to show others:"i'm saving energy while browsing the web"

2. People using those sites could use this as a tool for telling themselves a story about themselves

3. Site builders would choose this for similar reasons

Maybe the user community that will form around this will be interesting, so it would be worthwhile to join, for the sake of people and content.


Option 1 has been done before:

http://www.blackle.com/

(In this case, the tangible benefits are highly debatable, especially since we moved on from CRT monitors - although I'd be interested to see data from OLED displays)


well, if the sub-culture takes off among young people then it could supplant selfies as the means by which people broadcast what they are doing 'right now' ... interestingly that would drive down interest in going viral as a sudden spike in visitors could kill someone's site until the next day


If it ends up being very easy to run a solar-powered website - i.e. if it turns out the process can be refined and simplified nicely - then lots of places would do it, since it's cheaper and cool. But I'm sure lots of typical business constraints (uptime, for instance) would dictate that lots of companies will never do it.


Seems like plenty of organizations optimize for 'cheaper', and this is one means of doing so.


  Can anyone here think of [...] 
  how [...] this experiment 
  could [be] squeezed into 
  a form [...] as the next 
  big [...] thing?
This is firmly in the "Internet of Things" genre of niche interests. Remote power monitoring and other system diagnostics, for systems designed to account for expected faults and manage their own capacity to operate, are certainly useful for internet-managed consumer appliances.

Right now we only think in terms of wi-fi routers and phones, but when 4 out of 5 useless gadgets (oh boy! internet-enabled shampoo bottles that also can advertise hair gel! yaayy!) have a battery and 5G internet, an RSS/JSON/XML feed of their status will be abundantly relevant.

To... someone. Probably not me, but I'm sure someone will want this.


Neat article, but I think we should distinguish basically two things here that the article blurs together:

1) Reducing website bandwidth and computational usage (static sites, small images, reducing extra resource requests). Super common topic that HN talks about all the time.

2) Reducing the electricity bill for the server and powering it with renewable resources. This part is quite neat, I think, but for a single bespoke website, the energy savings of using a single low-power server seem pretty trivial in the grand scheme of things — I could get similar savings by changing a living room lamp to a more energy-efficient bulb.

This inspires me to propose a third category of energy efficiency that the article doesn't touch on: human energy costs.

Humans are quite energy intensive devices, and I suspect that the labor and human-maintenance costs of making this website dwarf the direct energy costs of running a webserver. I'd be curious how one would go about optimizing these other energy costs, and whether that can be done in a way that still preserves quality of work/life for the humans involved.


On topic 1 I was thinking to myself the other day, "I wonder if I can create a static webpage that isn't some sort of SPA or web app."

It took me time to work out where to start... Once you think it through you realise it is pretty easy and obvious. But (and this gives away my age a bit), I have not built a static webpage since school (over 10 years ago), and have become so accustomed to all the crazy shit that goes into web development today like Webpack, NPM, React, Angular etc etc that I had forgotten how to actually just serve a piece of HTML.

My dependence on all these frameworks and CLIs sent me on a quest to find out more about pure javascript, and I came across an incredible talk by a developer who created a pure javascript web app for music notation. I really wanted to link it here but cannot find it.


Yea here is the talk (https://www.youtube.com/watch?v=k7n2xnOiWI8)

I have found that static generators, specifically Hugo, have made plain HTML fun again. They give you just enough power to avoid repeating yourself endlessly (e.g. maintaining each page and copy-pasting all changes between them) without turning into the systems we're all trying to step back from.


Yes, thank you! What a great talk.


> It took me time to work out where to start...

Neocities? It's where I occasionally dump my static-can't-be-bothered sketches.

http://neocities.org/


I was expecting an ancient 90s-era Neocities site, but it's actually very beautifully designed. Well done, Neocities.


There is a lot of user submitted 90s style content though.


The odd thing about this to me is that I feel just throwing those few MB of static files on a CDN would actually be more energy efficient. There is a great deal of energy needed to create and distribute a raspberry pi, and there is basically no energy needed to add a few files to servers that are already up and running constantly.


Is that a comment on the effort to produce a site like this, or the cheapness of energy and bandwidth?

The effort required to produce a site like this would be negligible, given that it's almost entirely plain HTML. Dithering the images would take seconds in something like GIMP. The HTML looks to have been produced by a tool, based on the formatting, so they wouldn't need to spend time working with markup.


What about the energy used to design and build the infrastructure? There's a lot more that went into this project than just the static site, which is something the article touches on.

To illustrate, I'm going to invent some numbers:

Let's say that this site on "standard" hardware would use 1 kWh per day. This means the existing site saves ~0.95 kWh per day, looking at just the hardware energy use.

However, the hardware it's running on had to be produced; let's say that costs 10 kWh. The solar panels and batteries need energy to make; let's say they cost another 30 kWh.

Additionally, the system needs to be designed and built by a human being. Humans use a lot of energy. Let's say that this took 1 week to design and build, and that a human "uses" 50 kWh of energy per day (e.g. from food production, etc.). This means it costs an additional 350 kWh to design and build the infrastructure.

Adding up all these made-up numbers gives us a total infrastructure cost of 390 kWh. This means that, in order to "pay back" the difference, the new system has to operate for (390 kWh / 0.95 kWh/day ≈) 410 days.

These numbers are wildly inaccurate and cut a lot of corners, but this should give an idea as to what the grandparent was getting at.

Sidenote: I'm not, personally, trying to make a point about the project itself. Just, in order to fully understand the energy impact you have, you need to take a lot more into account than just "the amount of work performed by the computer".
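The parent's arithmetic, as a quick script. The figures are the invented ones above, treated as kWh, since watts measure power (a rate) rather than energy:

```python
# All figures are the parent comment's invented numbers, in kWh.
standard_per_day = 1.0   # kWh/day for the site on "standard" hardware
solar_per_day = 0.05     # kWh/day for the low-power setup
saved_per_day = standard_per_day - solar_per_day  # 0.95 kWh/day

embodied_kwh = (
    10        # manufacturing the server hardware
    + 30      # manufacturing the solar panels and batteries
    + 7 * 50  # one week of human design/build at 50 kWh/day
)

payback_days = embodied_kwh / saved_per_day
print(int(payback_days))  # 410
```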


I am going to guess they are getting a massive surge of traffic about now and yet the site still renders in a fraction of the time it takes most "modern" web apps or even just static content sites - this is how it should be.


Honestly, I'm not even sure what modern websites are doing when they generate pages. I've had highly dynamic PHP/MySQL websites running on crap hardware, and it was trivial to attain 0.03-second page generation times. Didn't even need fancy caching, just some common-sense SQL optimizations.


At my last job we had an ad bidding API that would reach out to every advertiser we worked with, take the highest bid, and then present that advertisement to the user.

The page was also framework after framework piled on top of each other, so they’d be using React and jQuery at the same time, stuff like that. And so marketing could change whatever they wanted whenever they wanted without breaking the entire site, everything was served from a CMS.

So it’s probably a lot of that.


LOL - oh man... I can relate!

At a previous gig we had a similar situation. It was so bad my coworkers and I would joke that we needed one of those "break in case of emergency" boxes with the glass front and a hammer hanging adjacent. Inside the box: a can of lighter fluid and a match. The purpose: when the crazy got to be too much, you just light your own hair on fire and then go run around in traffic.

What you just described would definitely be a break the glass kinda situation :-)


It's crappy wordpress themes.

Before going static, I too made a WP theme with Twig, removing most <?php wp_magic_function(); ?> calls. It was way faster than the usual WP themes. Obviously still nowhere close to static HTML.


I find it's the bigger sites that have been introducing the slowest/most clunky front-ends - thinking new Reddit, Tumblr, etc.


It's not the HTML generation that's slow, it's everything that comes after. Usually lots of JavaScript to run media and ads. Unfortunate, but it's driven by business decisions, not tech.


> just some common-sense SQL optimizations.

That's probably it. Some beginner has written an N+1 query and now the page is slow as shit.
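For readers who haven't hit it: the N+1 pattern issues one query for a list and then one more query per row, instead of a single JOIN. A toy illustration with sqlite3 (schema invented for the example):

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE posts (id INTEGER PRIMARY KEY, title TEXT);
    CREATE TABLE comments (id INTEGER PRIMARY KEY, post_id INTEGER, body TEXT);
    INSERT INTO posts VALUES (1, 'a'), (2, 'b');
    INSERT INTO comments VALUES (1, 1, 'x'), (2, 1, 'y'), (3, 2, 'z');
""")

# N+1: one query for the posts, then one query per post.
# With N posts that's N+1 round trips -- slow as the table grows.
slow = []
for post_id, title in db.execute("SELECT id, title FROM posts"):
    n = db.execute("SELECT COUNT(*) FROM comments WHERE post_id = ?",
                   (post_id,)).fetchone()[0]
    slow.append((title, n))

# The fix: a single JOIN with GROUP BY replaces all the per-row queries.
fast = db.execute("""
    SELECT p.title, COUNT(c.id) FROM posts p
    LEFT JOIN comments c ON c.post_id = p.id
    GROUP BY p.id ORDER BY p.id
""").fetchall()

print(slow)  # [('a', 2), ('b', 1)]
print(fast)  # [('a', 2), ('b', 1)]
```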


A trillion npm packages will for sure add a couple seconds of delay to your website loading speed.


Hey, when you need to pad a string, what else are you supposed to do?


Yeah, pad it ourselves? What are we, savages?
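(The joke refers to npm's infamous left-pad package. For the record, most standard libraries do this in one call; e.g. in Python:)

```python
assert "5".rjust(3, "0") == "005"      # left-pad with zeros
assert "5".zfill(3) == "005"           # numeric zero-fill, sign-aware
assert "ok".ljust(5, ".") == "ok..."   # right-pad
```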


That depends on whether or not they are required for first render. Whatever scripts you attach will only be requested after the HTML is delivered, unless they're embedded (most aren't). Isomorphic rendering and rehydration can deliver and render just as fast as a static page, because it is a static page... at first.

The problem with modern web page load times isn't the number of npm packages bundled for the project; it's that developers/companies, through incompetence, laziness, or a desire to make sure that would-be NoScript users have to run their tracking code before being allowed to view content, don't always render basic content and functionality before requesting that bundle.

The only exceptions to this I can think of are scripts that require you to run them before first render, like PayPal buttons that want to inspect everything, make sure they aren't in an iframe, etc.


`npm install the-entire-universe`


The dithering is nice for some images, but it takes away some detail on graphs etc.

The example image given can be re-compressed using techniques like structural similarity (SSIM)[1] or Multi-scale structural similarity (2003 Paper)[2]. Using jpeg-archive[3] I resized the example image from the blog[4] with decent results:

  /tmp  ls -1sh *.jpg *.png | sort -nr
  740K 6a00e0099229e88833022ad3b23825200b-750wi.png
  100K 6a00e0099229e88833022ad3b23825200b-750wi.jpg
   64K small-fry.jpg
   56K mpe.jpg
   24K ms-ssim.jpg
[1] https://en.wikipedia.org/wiki/Structural_similarity

[2] https://ece.uwaterloo.ca/~z70wang/publications/msssim.pdf

[3] https://github.com/danielgtaylor/jpeg-archive

[4] http://krisdedecker.typepad.com/.a/6a00e0099229e88833022ad3b...


If I understand you correctly, you propose an image format which should provide better quality for less file size. However, to implement an in-browser client-side reader, one probably needs a JavaScript library, and this will blow up the size again. [Side note: I am actually surprised the site uses JavaScript at all, being that static.]

Another proposal for image compression is to fall back on SVG, c.f. https://jmperezperez.com/svg-placeholders/ -- at least this does not need JS.


Those are compression techniques; they do not require JS, nor a different client, and they work with the JPEG standard.


To implement the improvements you suggest on this site (which appears to be a static site), the best way would be to have your proposed image compression as part of the static site preprocessor, correct? Just trying to understand how it would be implemented, as this site is an interesting idea but the dithered images would likely not have mass-appeal.


Yes, you'd just run all of the images through jpeg-recompress (from the GitHub repo linked above). They're already doing something like this to dither, so it's just a question of swapping out which command to run.

Incidentally, even where PNG is appropriate (images with crisp edges, such as text and icons) you can frequently get large size savings by applying zopflipng, which uses an alternative DEFLATE encoder to achieve higher-efficiency compression, at the cost of greater CPU use for encoding.[1]

[1]: https://blog.codinghorror.com/zopfli-optimization-literally-...


What kind of compression do you get if you make a (colorized) copy of the dithered version and then use lossy compression on it?


From clicking around this site, it looks like their image dithering technique has taken away the color quality required to properly view the graphs on this article: https://solar.lowtechmagazine.com/2015/10/the-4g-mobile-inte...


I'm not convinced it saves much in the way of bandwidth, either.

If you take this[0] image, it's 43 KB for a 800x533 image. I'd expect a JPEG to be able to do that without too much in the way of artifacts.

It does give the page a kind of neat newspapery vibe, though.

[0]: https://solar.lowtechmagazine.com/dithers/sps_wide.png


Even if it did save bandwidth, that does not equal significant energy savings. And if it did, there are better ways to save.

For example, looking at their response headers I see the image cache control expires after a day, plus has validation caching. This could be significantly improved. First, get rid of the `etag`, `expires`, and `last-modified` headers and go with expiration caching only. Increase the `max-age` from one day (86400) to one year (31536000). I get the HTML expiring after 24 hours, but images should be more aggressively cached. Validation caching always requires a round trip to the server, plus running validation rules on every single request, so it can't save nearly as much bandwidth and CPU as simple expiration caching.
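A sketch of that policy using Python's stdlib http.server, for illustration only (the magazine's server is nginx, where the `expires` directive and `etag off` do the same job):

```python
from http.server import SimpleHTTPRequestHandler, ThreadingHTTPServer

ONE_DAY = 86400        # seconds; fine for HTML that may still change
ONE_YEAR = 31536000    # the longest max-age the HTTP spec recommends

class CachingHandler(SimpleHTTPRequestHandler):
    """Static file handler using expiration caching only.

    A complete version would also suppress the Last-Modified/ETag
    validators; that part is omitted here for brevity.
    """

    def end_headers(self):
        # Long-lived assets get a one-year lifetime; everything else a day.
        if self.path.endswith((".png", ".jpg", ".css", ".js")):
            self.send_header("Cache-Control", f"public, max-age={ONE_YEAR}")
        else:
            self.send_header("Cache-Control", f"public, max-age={ONE_DAY}")
        super().end_headers()

# Usage: ThreadingHTTPServer(("", 8000), CachingHandler).serve_forever()
```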

Also, if they got rid of those headers, plus the pointless `server: nginx/1.10.3`, they could save over 100 bytes per HTTP response. I counted 19 requests on that article, so it may seem small, but it can add up to a significant amount.


Hi, I'm one of the people that worked on the web design, back-end and hardware. You are right about the caching; one of our strategies is actually to cache very aggressively, not only images but even all HTML aside from the front page and about pages. This works for us since Low-tech Magazine publishes at most one article a month and the site spans a decade of content which won't change anymore.

However, since we were still working on the site at the time it got massively popular, we changed the caching settings in order for people to see the changes and fixes that we were pushing.

I'll look into those server headers as well!


Cool, that makes sense. You should be able to easily turn off etags in your nginx config. This won't just save bytes in your response headers, but will also cut out a lot of the conditional revalidation requests to your server that validation caching requires.


> First, get rid of the `etag`, `expires`, and `last-modified` headers and go with expiration caching only. Increase the `max-age` from one day (86400) to one year (31536000).

Since it's a statically generated website, I'd favor never expiring the cache, and using cache-busting hashes instead. That way the cache is only emptied when the file actually changes.


Unfortunately there is no "never" setting. The HTTP spec recommends 1 year.


Isn't the etag just a hash of the content?


It can be, but doesn't need to be as long as it is some sort of indication of the uniqueness of the content. A last modified date should work in the same way.
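Right: a content hash makes the validator exact, while a modification date is a cheaper proxy. A sketch of both (the hash algorithm and truncation length here are arbitrary choices, not what any particular server uses):

```python
import email.utils
import hashlib

content = b"<html>...</html>"

# Content-hash ETag: changes if and only if the bytes change.
etag = '"' + hashlib.sha1(content).hexdigest()[:16] + '"'

# Last-Modified: no hashing needed, but only second-granularity, and
# re-uploading identical bytes with a new mtime still busts the cache.
last_modified = email.utils.formatdate(1536883200, usegmt=True)

print(etag)
print(last_modified)  # Fri, 14 Sep 2018 00:00:00 GMT
```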


I agree. I feel they are doing it more for aesthetics than actual 'energy' consumption, but then they must justify why they are dithering, and "because it looks pretty" doesn't fit their shtick.


Some of their images look like they are best suited to PNG format.

I do agree that the consistency is nice for aesthetic reasons, but poorly rendered images do detract from the overall message.


If you look at the non-solar-powered website, it turns out that TypeKit is serving an 830.8 KiB full-colour PNG file. AAARGH. No wonder that they think they need to use dithering and such if that is their baseline.

I tried testing some variants and made a static HTML gallery on Neocities[1]. TL;DR:

- cleaning up the photo before filtering it makes a huge difference in image quality and compression.

- quality 80 JPG is 86.6 KiB in color, 70.9 KiB in grayscale.

- quality 40 (which is ugly but still has readable text) uses 44.8 KiB and 37.8 KiB respectively. The latter looks a lot better than the dithered image IMO.

- similarly, the grayscale PNG with pushed shadows and positioned dithering has readable text and contains significantly more detail at 4 colors and 38.7 KiB than the 16-color 42.4 KiB image on their server.

[0] http://www.lowtechmagazine.com/2018/09/how-to-build-a-lowtec...

[1] https://blindedcyclops.neocities.org/low-tech-image-tests/ga...


Hi, I responded to your great comment over at lowtechmagazine[0]. I'm not sure if you check the thread there regularly so I'm also pinging you here

[0]: http://www.lowtechmagazine.com/2018/09/how-to-build-a-lowtec...


This[0] is also a 43kB 800x533 image, saved at 30% JPEG quality from the GIMP. The whole dithering thing seems to be the wrong approach.

[0]: https://bitsquirrel.co.uk/800x533.jpg


Dithered images do not compress well. I'd be interested to see side-by-side comparisons of optimised pngs, optimised jpeg, reduced-pallete optimised pngs, and some more cutting-edge formats like BPG[0] and FLIF[1]

[0] https://bellard.org/bpg/ [1] https://flif.info/


> Dithered images do not compress well.

That does somewhat depend on whether we are talking about positioned or Floyd-Steinberg dithering. Positioned dithering is a lot more regular and predictable, and hence easier to compress.



I hope they fix the RSS feed and include the full articles in it. That should be the norm, especially on a website with "low" uptime.


It would be nice if there was a site that indexed RSS feeds that contain whole articles, not just ledes and snippets meant to get you to click into ad-riddled articles.


I agree, but I would also expect to only receive text data and not images (or very highly compressed images). The other part of keeping the bandwidth cost of an RSS feed down is keeping a smaller number of articles in "rotation".

EDIT: I just pulled their RSS feed - 50 articles currently in rotation! 32.42 kB according to Firefox, of just snippets of text and links!



Seems to have links to, and 1-sentence descriptions of, all of their articles.

This isn't very useful if you want your RSS reader to pull down articles for you to read while you're offline. For that to work, you want the feed itself to have the full text of every article embedded inside it.


It has the full content. See view-source. Firefox hides it.


I mean article content, not just the first line or the title. This way articles are cached in case of downtime.


For some reason Firefox is only showing the <summary> contents when you click that link, but it's all in there and it shows the full article in ttrss for me.


Cool experiment!

My suggestion is to get a larger solar panel. Solar is cheap now, less than US$1 per watt. Why not get a 100W or a 200W panel? It will put out enough power to supply the server and router on a rainy day. The battery will only be necessary overnight.

The price of 18650s is low enough that building your own pack to handle 30W continuous overnight is not costly. To help with the environmental impact and cost, you can get used cells that are being recycled or thrown away.
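A back-of-envelope check on that pack. All the numbers here (load, night length, cell capacity) are assumptions for illustration, not figures from the article:

```python
import math

load_w = 30            # assumed continuous draw of server + router
night_h = 14           # longest winter night to bridge
usable_fraction = 0.8  # avoid deep-discharging Li-ion cells

# a typical salvaged 18650: ~3.6 V nominal x ~2.5 Ah, so about 9 Wh per cell
cell_wh = 3.6 * 2.5

needed_wh = load_w * night_h / usable_fraction
cells = math.ceil(needed_wh / cell_wh)
print(f"{needed_wh:.0f} Wh of storage, roughly {cells} cells")
```

On these assumptions, around sixty salvaged cells would bridge the night.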


Some cool concepts there.

>Static Site

This should be an obvious choice for sites not requiring dynamic elements. Their choice of comments via email is also a good solution to a problem static blogs have.

>Dithered Images

That won't work for every case, but is interesting.

>Default typeface

That's nice. I'm already disallowing sites from choosing their own typeface. Unless a site is about design, the typeface doesn't matter, and even then I probably won't prefer their choice over mine.

>No Third-Party Tracking

This makes sense and I don't understand why self-hosted sites use third-party trackers at all. The logs are there; analyzing them is all that is needed.

>No Advertising Services

I'm fine with that, but some sites rely on ads. I'm not aware of a method that would keep with their goals and still provide ads.


> This makes sense and I don't understand why self-hosted sites are doing it. The logs are there. Analyzing them is all that is needed.

1) Laziness. A third-party JS tracker usually comes with a complete dashboard, full of pretty (and sometimes even useful) graphs.

2) Data. Client-side trackers can spy on users more, giving you more information you can e.g. misapply in an A/B test trying to drive "engagement".

RE 1, there exist tools aimed at analyzing server logs. I played with GoAccess a bit; it's quite OK. https://goaccess.io/
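For a taste of what log-based analytics involves, here's a minimal sketch that tallies hits per path from Common Log Format lines (GoAccess and friends do far more, of course):

```python
import re
from collections import Counter

# Server-side "analytics" without any client-side JS: count requests
# per path straight from the access log.
REQUEST = re.compile(r'"(?:GET|POST|HEAD) (\S+) HTTP/[\d.]+"')

def top_paths(log_lines, n=10):
    hits = Counter()
    for line in log_lines:
        m = REQUEST.search(line)
        if m:
            hits[m.group(1)] += 1
    return hits.most_common(n)

sample = [
    '1.2.3.4 - - [01/Oct/2018:10:00:00 +0000] "GET /index.html HTTP/1.1" 200 512',
    '5.6.7.8 - - [01/Oct/2018:10:00:01 +0000] "GET /feed.xml HTTP/1.1" 200 33193',
    '1.2.3.4 - - [01/Oct/2018:10:00:02 +0000] "GET /index.html HTTP/1.1" 200 512',
]
print(top_paths(sample))
```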


You don't need a third-party tracker to get fancy graphs. Matomo is free, open source, self-hosted, works on logs, and has fancy graphs. An ELK stack would also work fine.

Lazy-loaded 1px images can be used for tracking how far users have read and a/b testing can be done by compiling multiple versions and redirecting users to the version you want them to try.


If I could toot my own horn for a moment, Gravwell (gravwell.io) has a 2GB/day free license which should be plenty to ingest web logs. We've got a GeoIP module to resolve IPs to locations, we can display geographic heatmaps (see https://dev.gravwell.io/docs/#!search/map/map.md), a variety of charts, tables, etc.

Matomo looks really polished and if it provides the features you need, it seems like an obvious choice. If OTOH you're looking at rolling your own with ELK, Gravwell might make sense.


> This should be an obvious choice for sites not requiring dynamic elements. Also their choice of comments through email is a good solution to a problem static-made blogs have.

This is one I find interesting. I doubt they want to go through every email manually, so some semi-automation sounds sensible. What would be an elegant solution to that?
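One sketch of such semi-automation, assuming comments arrive as plain-text emails (the field names and workflow here are hypothetical, not how lowtechmagazine actually does it): parse each message into a data record the static-site generator can render, with a human approving before publication.

```python
from email import message_from_string
from email.utils import parseaddr

def email_to_comment(raw):
    """Turn a raw comment email into a dict a static-site generator
    could render after human review (hypothetical workflow)."""
    msg = message_from_string(raw)
    name, addr = parseaddr(msg["From"])
    return {
        "author": name or addr,          # prefer display name, fall back to address
        "post": msg["Subject"],          # which article the comment replies to
        "text": msg.get_payload().strip(),
    }

raw = (
    "From: Jane Doe <jane@example.com>\n"
    "Subject: Re: How to Build a Low-Tech Website\n"
    "\n"
    "Great article!\n"
)
print(email_to_comment(raw))
```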


The idea is great, the software side especially is something I'd like to see more of, and the not-always-available scheme is very commendable as well.

But the end of article stuck out at me.

> we'll be offering print on demand versions of our blog.

How is that consistent with low energy footprint? Will the printers and the cellulose factory run on solar as well?


Print, on its own, is not that energy-intensive. Paper has been made for thousands of years; printing has been around for 400.

But the question is indeed an intriguing one.


I once basically did just that, besides the image dithering. I ran a small blog off a Raspberry Pi, only using Markdown files and something to render them to HTML. I used my own IP address, which I know you're not technically supposed to do but in 15 years I've never had an ISP crack down on me for it, and had a script that would run periodically to update the A record for my domain in Route 53. (since my IP was dynamic)

Whether you're doing it for energy purposes or not, it's a fun project because it feels a bit like sticking it to the man, since we're used to always hosting our sites on someone else's machine.
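For reference, a periodic A-record updater along those lines can be sketched with boto3. The zone ID and domain are placeholders, and checkip.amazonaws.com stands in for whatever IP-echo service you prefer:

```python
def a_record_upsert(domain, ip, ttl=300):
    # Route 53 ChangeBatch that points `domain` at `ip`
    return {"Changes": [{
        "Action": "UPSERT",
        "ResourceRecordSet": {
            "Name": domain,
            "Type": "A",
            "TTL": ttl,
            "ResourceRecords": [{"Value": ip}],
        },
    }]}

def update_dns(zone_id, domain):
    # run from cron; imports kept local so the builder above
    # stays usable without AWS credentials installed
    import boto3
    import urllib.request
    ip = urllib.request.urlopen("https://checkip.amazonaws.com").read().decode().strip()
    boto3.client("route53").change_resource_record_sets(
        HostedZoneId=zone_id, ChangeBatch=a_record_upsert(domain, ip))

print(a_record_upsert("blog.example.com.", "203.0.113.7"))
```

Run update_dns from cron every few minutes; UPSERT creates the record if it doesn't exist and overwrites it otherwise.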


What is the reason for not using your own ip?


I believe it goes against most terms of service agreements with non-commercial isp plans.


My guess is that this is mostly a way for them to object if you start using too much bandwidth. Most ISPs "guarantee" way more bandwidth than they can actually provide on a daily basis. If you're hosting a website that gets low traffic, they probably won't know, let alone care. Just my guess.


For several years starting in 2010 I ran my personal website off a Pentium III Thinkpad running Plan 9 sitting on the floor behind my desk. I wrote the web server in Go; it was a "hybrid" in that it served mostly static files, but in the blog subdirectory it would render Markdown to HTML.

It worked great! I eventually migrated to a VPS when the hardware started to get too flaky with age (and years of accumulated dust sucked through the fan inlet).


And then, at the end, there's one line of useless info - a weather forecast for their server location.

Forecast: today Partly cloudy in the morning and breezy until afternoon. / tomorrow Partly cloudy until afternoon. / day after tomorrow Clear throughout the day.

For that, they pulled in jQuery and made an XMLHttpRequest.


The weather content is relevant in that their server would go offline with too many cloudy days.

If their intent is to lower energy consumption of their server alone, offloading this weather check is the correct choice. But surely requesting+caching that info on their end (by perhaps rebuilding the static site every hour with an updated forecast) would be overall more efficient than having X number of clients request the data.


All it amounts to is one additional request, and the json file itself is probably generated by a cron job, not dynamically on demand.

Surely rebuilding the website every hour (and invalidating the cache for X number of clients) would result in more requests.


Not useless, as it's relevant to the solar power for the server.


I think this website is a beautiful piece of engineering: simple, efficient, clear and humane. It solves a pressing human problem with mature, straightforward technologies.

The dithering is obviously not to everyone's tastes, but I like the paper feel and think it's appropriate for a 'low tech' brand. Bravo to the lowtech team.


I think this is a really cool idea actually. I help run a fair few websites for authors/creators that don't need 5 digits of up-time and super heavy websites. I'm sure most blogs/sites would be fine being much much lighter and running off something like this.


I like the gauge, but wish it was only in the corner or something. That it is the background of the whole page is pretty distracting.


Solar power, low bandwidth, clean markup, and lots of numbers to optimize. Practical or not, this makes my inner hacker pretty happy.


Could save an easy 80kB by cutting out jQuery. All it does is power the little weather widget, which could easily be written in standard JS.


In a similar vein, the weather icons are 800x800px images displayed at no larger than 20x20px


Those large icons are rather compression friendly compared to the dithered images. It would still be a relatively huge save at 20x20 (after resize to 20x20 with imagemagick and pngcrush -brute, the 5.5k clear-day.png goes down to 559 bytes).


800x800 indeed is excessive. We'll resize them to something smaller. Jquery is already on its way out. Thanks for the feedback!


Wonder if they've considered a gravity battery? As an additional benefit it'd let someone local "recharge" the battery manually rather than being 100% tied to the Sun.


I imagine huge weights hanging off the balcony of a Barcelona apartment :)


Could be pretty stylish I'd think. Chandelier with moving crystals? #LowTech :P


I think this is called a "Grandfather clock" ;-)


We can hire a Brit from Salou for that.


Is there an off the shelf gravity battery that one could purchase for charging over USB?


It's Low Tech magazine: you can bet he's considered and analysed every form of energy storage you can think of, and several you haven't.


I'm curious how the energy usage for this site compares to something like hosting a site on S3.


Using a datacentre-scale service such as S3 gains the benefits of economies of scale. Per-website S3 energy costs are going to be substantially lower than the energy cost of a solar server: each solar server that is installed has to be built in factories which cost energy to run, shipped in delivery vehicles which cost energy to drive, and the humans behind the website require energy while building the server... all of which would easily be more than the net energy consumption of a single website on S3.

That being said, the point of the project is to produce a website that is 100% solar powered. They never said they were aiming for something that is feasibly sustainable.


Probably not good I would think, especially if you take into account the energy used to produce and ship the computer, router, etc.


I would be interested to see how the additional traffic from HN is impacting their power usage.


As a way to make use of resources you'd otherwise not have a use for this is interesting, but my feeling is that as a way to reduce "your" energy consumption it's misguided. I'm pretty sure that the energy usage of running even on very efficient dedicated hardware is going to be a lot higher than the incremental increase in consumption for an incredibly tiny VM instance in a data center focused on energy efficiency.

The hardware choice is arguably overkill as well, at least if wifi is available and in use. Run it on a repurposed old phone, eliminate its share of the power draw of a network switch, etc.


The best part is the dithering, this website reminds me of old Chemistry lab manuals.


> Default typeface

I adore this, whatever the low or high tech. To me, all this web-font stuff seems like a huge piece of BS (except in rare cases: when a website is meant to be a piece of art rather than to serve a practical function, or when you need an exotic language whose letters are rarely found in the standard fonts on clients' computers), and I just block external fonts all the time.


For developers it always seems like a good idea to use the system defaults, but then you push it out to users and find out they all have insane defaults that work like crap.

I had someone complain to me that my mailto: link didn't work, but actually it's just that they had the wrong mail client set as default on their system.


Not using custom web fonts that have to be loaded from external resources doesn't imply using pure defaults of the client's system. You still can specify font size and style with pure CSS.


This is one of the coolest web projects I've seen lately. Great job!

> Increasingly, activities that could perfectly happen off-line – such as writing a document, filling in a spreadsheet, or storing data – are now requiring continuous network access. This does not combine well with renewable energy sources such as wind and solar power, which are not always available.

I'm surprised no one has mentioned progressive web apps. This seems like a perfect use-case for them.

On a related note, I just found this research showing that Service Workers (on which PWAs are built) "do not have a significant impact over the energy consumption of [mobile] devices, regardless of the network conditions." [0]

[0]: http://www.ivanomalavolta.com/files/papers/Mobilesoft_2017.p...


> The size of the average web page (defined as the average page size of the 500,000 most popular domains) increased from 0.45 megabytes (MB) in 2010 to 1.7 megabytes in June 2018.

It's the second place where I've seen such low average size values (somewhat reasonable ones, I'd say), but I'm having a hard time believing them. On a typical site, when I hit F12 and go to the Network tab, I usually see values ten times greater. I fear those statistics are either undercounting something, or 500k is just so many domains that the long tail of lightweight sites drags the average down, making the values unrepresentative of the real situation. I wonder how those stats would look if the average were weighted by the number of visits.
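A toy illustration of that last point, with entirely made-up numbers: a few heavy, popular sites can pull a visit-weighted average well above the plain per-domain average.

```python
# Hypothetical figures: five sites, three light and rarely visited,
# two heavy and popular.
sizes_mb = [0.3, 0.5, 0.8, 4.0, 6.0]   # page weight of each site
visits   = [1,   1,   1,   50,  100]   # relative traffic to each

plain    = sum(sizes_mb) / len(sizes_mb)
weighted = sum(s * v for s, v in zip(sizes_mb, visits)) / sum(visits)
print(f"plain {plain:.2f} MB, visit-weighted {weighted:.2f} MB")
```

Here the per-domain average stays near 2 MB while the visit-weighted one lands above 5 MB, which matches the intuition that the long tail of lightweight sites drags the headline figure down.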


I remember from over a decade ago there was an annual contest to make the best web page that would fit into 5KB - yes, 5000 bytes. There were many impressive entries.

I'd really hoped it would catch on.

Sadly, that lightweight concept is now so forgotten that I can't even find it in a quick search. (perhaps someone else can find a pointer?)



Thanks, nice!

Seems I'd misremembered it as being more recent than it was; the last one was in 2002. They apparently 'grew' to 10K, and that site is now a 404 as well.

At this point, 50K would be a welcome respite from the mountains of cruft that download with every click...


Preach!


There is also the issue that caching (e.g. cdn of jQuery or React libraries) can mean that, if you visit a given website every day in a month, you may only download that library once (the first time). Their estimate is definitely lowball for your first visit, but it might be a reasonable estimate for how much is actually downloaded on average.


I thought that, in practice, the caching you describe rarely offers much of a benefit (for various reasons), but maybe things have changed?


I'm proud of my low tech personal website: http://mrtno.com - I promise: it will never work on smartphones, just good old computers...

I will build your static html page for money. Hit me up. ;)


You're proud that you make webpages that can't be consumed by the primary method most people use to get online?


Even more confusing, the linked website seems to work just fine on my Moto.


Well, it's just not optimized for mobile. It's only HTML + CSS, so it should work pretty much everywhere.

I don't have a cellphone myself, so i never tested it.


YES!


Why?


The CSS @media attribute has been widely available since IE6. It's not exactly high tech.


Are we discussing high tech or low tech?


You're in a thread about low-tech websites, bragging that your website is so low tech that it doesn't work with mobile devices.

I'm just saying that low tech and mobile compatibility aren't mutually exclusive.


Speed is your biggest feature in every application. I don't understand why the current state of affairs is _utterly obsessed_ with lazy loading static content and SPAs for things that seldom change.


It's a good start, but the CSS is obnoxious and definitely not low-tech, and the image dithering is a bad compromise between having real images and simply not having images at all. In short, it's an attempt at a low-tech website made by people who've never seen a real low-tech website, and probably wouldn't know Lynx if they saw it in use.


Doesn't this seem a bit of an uncharitable characterization? What's an example of a "real low-tech website" that does better than the work presented here? Are you browsing Hacker News using Lynx or surfraw?


https://yarchive.net/

That's a low-tech website. Absolutely nothing but content.


According to the article, the goal here was to demonstrate the possibility of making an attractive, modern-looking Web page while still having it be lightweight. Your example does not achieve the first goal at all.


This is typical motherfuckingwebsite.com vs bettermotherfuckingwebsite.com.

The design team of solar.lowtechmagazine.com is squarely on the side of bettermotherfuckingwebsite.com.


What if you moved this to Saudi Arabia or somewhere else where it's always sunny? I feel like this is the grassroots movement where actual decentralization occurs: imagine a mesh network localized within cities and then somehow branching out to others.


If you want to be online only part of the time, look into IPFS. Even if you go offline, your popular pages will likely be cached for a while, and can be cached forever if someone takes an interest in them.


I wish the content of the photographs wasn't so obscured


Unrelated, but I really like the background/font color contrast on the site. Makes for very easy reading to my eye.


So, how many requests per second can one serve with such setup, without a noticeable impact on performance?


Can someone explain how to reproduce the image dithering look in Photoshop, please?


Easiest method I know of is Image > Mode > Indexed Color...

From there you can heavily reduce the color count and select the dithering type.
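Outside Photoshop, the effect is small enough to implement by hand. Here's a sketch of classic Floyd-Steinberg error diffusion operating on plain 2D lists of grayscale values (no libraries); Photoshop's "Diffusion" dither option in Indexed Color mode does essentially this:

```python
def floyd_steinberg(pixels):
    """Dither a grid of grayscale values (0-255) to pure black/white,
    pushing each pixel's quantization error onto its neighbours."""
    h, w = len(pixels), len(pixels[0])
    buf = [[float(v) for v in row] for row in pixels]  # error accumulator
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            old = buf[y][x]
            new = 255.0 if old >= 128 else 0.0
            out[y][x] = int(new)
            err = old - new
            # classic 7/16, 3/16, 5/16, 1/16 error distribution
            if x + 1 < w:
                buf[y][x + 1] += err * 7 / 16
            if y + 1 < h:
                if x > 0:
                    buf[y + 1][x - 1] += err * 3 / 16
                buf[y + 1][x] += err * 5 / 16
                if x + 1 < w:
                    buf[y + 1][x + 1] += err * 1 / 16
    return out
```

Feeding it a flat mid-gray image produces the familiar speckled half-black, half-white texture seen on the lowtechmagazine photos.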


Didn't Bruce Sterling or William Gibson write a story where there was a street tribe called the "LoTeks?" I think it was William Gibson in the Johnny Mnemonic short story. Molly Millions fought a monomolecular weapon wielding assassin, using the LoTek's space to defeat him with "culture shock."


For static sites, what's the strategy for user authentication?


Anything that can send an HTTP GET.


What would you need user authentication for in a static site?


Just want to see how far it can be pushed beyond simple sites.



There are no end-user accounts.

Authors probably deploy by pushing to a git repo or ssh-ing into the server.


"Progress"


Low-Tech seems a bit of a misnomer when referring to deceptive simplicity. Wonder if they considered building After Dark.

https://after-dark.habd.as


Has anyone tried to code on an AlphaSmart? I suppose it's a little like using ed.

I would be happy to have a distraction-free device to focus on my code. But I am not sure the screen is large enough to really work on it.



