
How to Build a Low-Tech Website - RomanZolotarev
https://solar.lowtechmagazine.com/2018/09/how-to-build-a-lowtech-website/
======
ryanianian
The dithering on images is actually kinda nice. Less visual over-stimulation.

But whatever this site is doing with the fixed-scroll yellow block is super
weird. There's a "95%" thing next to a sun - is it somehow related to the
solar battery powering the server?

Rather than focusing on the content, the design focuses on the infrastructure
at the expense of the content. Maybe that's the point, but this particular
element is certainly off-putting.

~~~
JepZ

      <div id="battery" style="height: 95%;">
        <div id="bat_data">
          <a href="https://solar.lowtechmagazine.com/weather/"><span id="sky"></span><span id="level">95%</span></a></div>
      </div>
    

Looks pretty much like the battery level and current weather conditions ;-)
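For the curious, a guess at the CSS behind it (not taken from the site's actual stylesheet): a fixed-position, full-width element whose inline `height` is written by the server as the battery percentage, filling the page from the bottom up behind the content.

```css
/* Guessed CSS, not the site's actual stylesheet. The server writes
   style="height: 95%;" on #battery, so the yellow fill tracks charge. */
#battery {
  position: fixed;
  bottom: 0;               /* anchor at the bottom so height fills upward */
  left: 0;
  width: 100vw;            /* the full-width fill commenters found distracting */
  background-color: yellow; /* hypothetical colour */
  z-index: -1;             /* keep it behind the text */
}
```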

~~~
EForEndeavour
Before starting to read the article, I tried clicking and dragging on that
element and searched the page for "brightness" and "dim" before peeking at the
source to realize it isn't a brightness control after all.

------
Mtinie
I love the aesthetic! Great work.

Here's my unsolicited design riff on the "battery indicator." I'm a fan of the
concept, but the 100vw element width of the yellow fill was distracting me
from the rest of the content. I used DevTools to mock up a narrow version
which fits within the right content gutter, right next to the scroll bar. I
had to tweak the percentage display a bit by stacking the sun above the value
and removing the "%" indicator, but I feel the end result has improved
readability and is still understandable.

[https://imgur.com/a/IIPZILz](https://imgur.com/a/IIPZILz)

~~~
profalseidol
Lower tech access to the image:

[https://i.imgur.com/o0Dtkai.png](https://i.imgur.com/o0Dtkai.png)

------
tw1010
Small sub-cultures, like this one, do experiments all the time in their little
corner of the web. Some of the cultures die after a while, and some grow into
something else, something bigger, something that starts to percolate on the
mainstream surface, albeit possibly only tangential to how it started. I'm not
sure I'd predict this movement will grow into something that reaches
mainstream tech, but let's say I'm wrong, just for fun. Can anyone here think
of a potential narrative that would describe how the current state of this
experiment could possibly be perturbed and squeezed into a form that the
mainstream soaks up as the next big (or medium-sized) thing?

I'm just curious to know how my prediction that this won't grow any larger
than it is today could be wrong.

~~~
l_t
I think the article hits on the key problem:

"Less than 100% reliability is essential for the sustainability of an off-the-
grid solar system."

IMO, this can only become mainstream once there is no longer an expectation of
"100%" uptime for websites in general.

It's far-fetched, but I can imagine it becoming mainstream if mesh networking
becomes popular, or if the Internet were to become a widely self-hosted
platform (e.g. people had their own "clouds" running in their homes). In such
situations, 100% uptime would no longer be an expectation, so the perceived
cost of using an off-the-grid system would be reduced.

~~~
woah
Having been in the “mesh” space for years, there’s a ton of this techno-
primitivism fantasy floating around. The reality is that people want to access
the internet and they want it to work well.

This article is a perfect example. This solar powered server which does not
have 100% uptime for artistic reasons is probably sitting in someone’s kitchen
next to a refrigerator which uses many times as much power and definitely has
100% uptime.

~~~
marci
They seem to be working on that as well:
[https://solar.lowtechmagazine.com/2016/05/how-to-go-off-
grid...](https://solar.lowtechmagazine.com/2016/05/how-to-go-off-grid-in-your-
apartment/)

The solar webserver is another step towards that goal.

------
decasia
Neat article, but I think we should distinguish two things here that the
article blurs together:

1) Reducing website bandwidth and computational usage (static sites, small
images, reducing extra resource requests). Super common topic that HN talks
about all the time.

2) Reducing the electricity bill for the server and powering it with renewable
resources. This part is quite neat, I think, but for a single bespoke website,
the energy savings of using a single low-power server seems pretty trivial in
the grand scheme of things — I could get similar savings by changing a living
room lamp to a more energy efficient bulb.

This inspires me to propose a third category of energy efficiency that the
article doesn't touch on: human energy costs.

Humans are quite energy intensive devices, and I suspect that the labor and
human-maintenance costs of making this website dwarf the direct energy costs
of running a webserver. I'd be curious how one would go about optimizing these
other energy costs, and whether that can be done in a way that still preserves
quality of work/life for the humans involved.

~~~
markatkinson
On topic 1 I was thinking to myself the other day, "I wonder if I can create a
static webpage that isn't some sort of SPA or web app."

It took me time to work out where to start... Once you think it through you
realise it is pretty easy and obvious. But (and this gives away my age a bit),
I have not built a static webpage since school (over 10 years ago), and have
become so accustomed to all the crazy shit that goes into web development
today like Webpack, NPM, React, Angular etc etc that I had forgotten how to
actually just serve a piece of HTML.

My dependence on all these frameworks and CLIs sent me on a quest to find out
more about pure javascript, and I came across an incredible talk by a
developer who created a pure javascript web app for music notation. I really
wanted to link it here but cannot find it.

~~~
0xCMP
Yea here is the talk
([https://www.youtube.com/watch?v=k7n2xnOiWI8](https://www.youtube.com/watch?v=k7n2xnOiWI8))

I have found that static generators, specifically Hugo, have made just plain
HTML fun again. They give you just enough power to not repeat yourself
endlessly (e.g. maintaining each page by hand and copy-pasting all changes
between them) without turning into the systems we're all trying to step back
from.

~~~
markatkinson
Yes, thank you! What a great talk.

------
jtms
I am going to guess they are getting a massive surge of traffic about now and
yet the site still renders in a fraction of the time it takes most "modern"
web apps or even just static content sites - this is how it should be.

~~~
romaniv
Honestly, I'm not even sure what modern websites are doing when they generate
pages. I've had highly dynamic PHP/MySQL websites running on crap hardware,
and it was trivial to attain 0.03-second page generation times. Didn't even
need fancy caching, just some common-sense SQL optimizations.
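For anyone wondering what "common-sense SQL optimizations" tend to mean in practice, here is a small illustrative sketch (hypothetical schema, SQLite used for portability): adding one index turns a page's main lookup from a full-table scan into an index seek.

```python
# Illustrative only: the kind of common-sense optimization alluded to
# above. Schema and data are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE posts (id INTEGER PRIMARY KEY, slug TEXT, body TEXT)")
conn.executemany(
    "INSERT INTO posts (slug, body) VALUES (?, ?)",
    [(f"post-{i}", "lorem ipsum") for i in range(1000)],
)

def query_plan(sql):
    """Return SQLite's query plan for a statement as one string."""
    rows = conn.execute("EXPLAIN QUERY PLAN " + sql).fetchall()
    return " ".join(r[-1] for r in rows)

lookup = "SELECT body FROM posts WHERE slug = 'post-500'"
print(query_plan(lookup))  # without an index: a SCAN of every row

conn.execute("CREATE INDEX idx_posts_slug ON posts (slug)")
print(query_plan(lookup))  # with the index: a SEARCH using idx_posts_slug
```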

~~~
wincy
At my last job we had an ad-bidding API that would reach out to every
advertiser we worked with, take the highest bid, and then present that
advertisement to the user.

The page was also framework after framework piled on top of each other, so
they’d be using React and jQuery at the same time, stuff like that. And so
marketing could change whatever they wanted whenever they wanted without
breaking the entire site, everything was served from a CMS.

So it’s probably a lot of that.

~~~
jtms
LOL - oh man... I can relate!

At a previous gig we had a similar situation. It was so bad my coworkers and I
would joke that we needed one of those "break in case of emergency" boxes with
the glass front and a hammer hanging adjacent. Inside the box: a can of
lighter fluid and a match. The purpose: when the crazy got to be too much you
just light your own hair on fire and then go run around in traffic.

What you just described would definitely be a break the glass kinda situation
:-)

------
b2ccb2
The dithering is nice for some images, but it takes away detail on graphs and
the like.

The example image can be re-compressed by targeting perceptual quality metrics
like structural similarity (SSIM)[1] or multi-scale structural similarity
(2003 paper)[2]. Using jpeg-archive[3], I recompressed the example image from
the blog[4] with decent results:

    
    
      $ ls -1sh *.jpg *.png | sort -nr
      740K 6a00e0099229e88833022ad3b23825200b-750wi.png
      100K 6a00e0099229e88833022ad3b23825200b-750wi.jpg
       64K small-fry.jpg
       56K mpe.jpg
       24K ms-ssim.jpg
    

[1]
[https://en.wikipedia.org/wiki/Structural_similarity](https://en.wikipedia.org/wiki/Structural_similarity)

[2]
[https://ece.uwaterloo.ca/~z70wang/publications/msssim.pdf](https://ece.uwaterloo.ca/~z70wang/publications/msssim.pdf)

[3] [https://github.com/danielgtaylor/jpeg-
archive](https://github.com/danielgtaylor/jpeg-archive)

[4]
[http://krisdedecker.typepad.com/.a/6a00e0099229e88833022ad3b...](http://krisdedecker.typepad.com/.a/6a00e0099229e88833022ad3b23825200b-750wi)

~~~
ktpsns
If I understand you correctly, you propose an image format which should
provide better quality at a smaller file size. However, to implement an
in-browser, client-side reader, one probably needs a JavaScript library, and
that will blow up the size again. [Side note: I am actually surprised the site
uses JavaScript at all, being that static.]

Another proposal for image compression is to fall back on SVG, cf.
[https://jmperezperez.com/svg-placeholders/](https://jmperezperez.com/svg-
placeholders/) -- at least this does not need JS.

~~~
b2ccb2
Those are compression techniques; they do not require JS or a different
client, and they work within the JPEG standard.

~~~
claudiulodro
To implement the improvements you suggest on this site (which appears to be a
static site), the best way would be to have your proposed image compression as
part of the static site preprocessor, correct? Just trying to understand how
it would be implemented, as this site is an interesting idea but the dithered
images would likely not have mass-appeal.

~~~
wolfgang42
Yes, you'd just run all of the images through jpeg-recompress (from the GitHub
repo linked above). They're already doing something like this to dither, so
it's just a question of swapping out which command to run.

Incidentally, even where PNG is appropriate (images with crisp edges, such as
text and icons) you can frequently get a large size savings by applying
zopflipng, which uses an alternative GZIP compression algorithm to get a
higher-efficiency compression, at the cost of greater CPU use for encoding.[1]

[1]: [https://blog.codinghorror.com/zopfli-optimization-
literally-...](https://blog.codinghorror.com/zopfli-optimization-literally-
free-bandwidth/)

------
powellzer
From clicking around this site, it looks like their image dithering technique
has taken away the color quality required to properly view the graphs on this
article: [https://solar.lowtechmagazine.com/2015/10/the-4g-mobile-
inte...](https://solar.lowtechmagazine.com/2015/10/the-4g-mobile-internet-
thats-already-there/)

~~~
juliangoldsmith
I'm not convinced it saves much in the way of bandwidth, either.

If you take this[0] image, it's 43 KB for a 800x533 image. I'd expect a JPEG
to be able to do that without too much in the way of artifacts.

It does give the page a kind of neat newspapery vibe, though.

[0]:
[https://solar.lowtechmagazine.com/dithers/sps_wide.png](https://solar.lowtechmagazine.com/dithers/sps_wide.png)

~~~
protonfish
Even if it did save bandwidth, that does not equal significant energy savings.
And if it did, there are better ways to save.

For example, looking at their response headers I see the image cache control
expires after a day, plus has validation caching. This could be significantly
improved. First, get rid of the `etag`, `expires`, and `last-modified` headers
and go with expiration caching only. Increase the `max-age` from one day
(86400) to one year (31536000). I get the HTML expiring after 24 hours, but
images should be more aggressively cached. Validation caching always requires
a round trip to the server, plus running validation rules on every single
request, so it can't save nearly as much bandwidth and CPU as simple
expiration caching.

Also, if they got rid of those headers, plus the pointless `server:
nginx/1.10.3`, they could save over 100 bytes per HTTP response. I counted 19
requests on that article; each saving may seem small, but it adds up to a
significant amount.

~~~
vanderZwan
> _First, get rid of the `etag`, `expires`, and `last-modified` headers and go
> with expiration caching only. Increase the `max-age` from one day (86400) to
> one year (31536000)._

Since it's a statically generated website, I'd favor _never_ expiring the
cache, and using cache-busting hashes instead. That way the cache is only
emptied when the file actually changes.
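A minimal sketch of the cache-busting idea (illustrative, not this site's actual build step): name each asset after a hash of its content, so the URL changes exactly when the file does and caches can be told to keep entries forever.

```python
# Hypothetical helper for a static-site build: content-hashed filenames.
import hashlib

def hashed_name(filename: str, content: bytes, length: int = 8) -> str:
    """Insert a short content hash before the file extension."""
    digest = hashlib.sha256(content).hexdigest()[:length]
    stem, dot, ext = filename.rpartition(".")
    return f"{stem}.{digest}.{ext}" if dot else f"{filename}.{digest}"

# Same name + same bytes -> same URL; changed bytes -> new URL.
print(hashed_name("style.css", b"body { color: #222; }"))
```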

~~~
user111233
Isn't the etag just a hash of the content?

~~~
protonfish
It can be, but it doesn't need to be, as long as it is some indication of the
uniqueness of the content. A last-modified date would work the same way.

------
Retr0spectrum
Dithered images do not compress well. I'd be interested to see side-by-side
comparisons of optimised PNGs, optimised JPEGs, reduced-palette optimised
PNGs, and some more cutting-edge formats like BPG[0] and FLIF[1].

[0] [https://bellard.org/bpg/](https://bellard.org/bpg/) [1]
[https://flif.info/](https://flif.info/)

~~~
vanderZwan
> _Dithered images do not compress well._

That does somewhat depend on whether we are talking about ordered (positional)
or Floyd-Steinberg dithering. Ordered dithering is a lot more regular and
predictable, and hence easier to compress.
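To make the distinction concrete, here is a toy, pure-Python Floyd-Steinberg sketch (grayscale, 1-bit output; not the site's actual pipeline). The error diffusion in the inner loop is what produces the irregular patterns that compress worse than an ordered dither's fixed threshold matrix.

```python
# Toy Floyd-Steinberg error-diffusion dither for a grayscale image
# given as a list of lists of 0-255 values.
def floyd_steinberg(pixels):
    h, w = len(pixels), len(pixels[0])
    img = [row[:] for row in pixels]  # work on a copy
    for y in range(h):
        for x in range(w):
            old = img[y][x]
            new = 255 if old >= 128 else 0  # quantize to black/white
            img[y][x] = new
            err = old - new
            # Standard Floyd-Steinberg weights, pushed onto neighbours:
            for dx, dy, wgt in ((1, 0, 7/16), (-1, 1, 3/16),
                                (0, 1, 5/16), (1, 1, 1/16)):
                if 0 <= x + dx < w and 0 <= y + dy < h:
                    img[y + dy][x + dx] += err * wgt
    return img

# A flat mid-gray patch dithers to roughly half black, half white:
out = floyd_steinberg([[127] * 8 for _ in range(8)])
```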

------
trqx
I hope they fix the RSS feed and put whole articles in there. That should be
the norm, especially on a website with "low" uptime.

~~~
azdle
[https://solar.lowtechmagazine.com/feeds/all.atom.xml](https://solar.lowtechmagazine.com/feeds/all.atom.xml)

^ seems to have full articles.

~~~
avhon1
Seems to have links to, and 1-sentence descriptions of, all of their articles.

This isn't very useful if you want your RSS reader to pull down articles for
you to read while you're offline. For that to work, you want the feed itself
to have the full text of every article embedded inside it.
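Schematically, the difference is whether each Atom entry carries only a `<summary>` or also a full `<content>` element (made-up entries below, not this site's actual feed markup):

```xml
<!-- Summary-only: the reader must fetch the site to read the article. -->
<entry>
  <title>How to Build a Low-tech Website</title>
  <link href="https://example.com/low-tech-website/"/>
  <summary>A one-sentence description.</summary>
</entry>

<!-- Full-content: the article travels inside the feed itself, so it
     can be read offline. -->
<entry>
  <title>How to Build a Low-tech Website</title>
  <link href="https://example.com/low-tech-website/"/>
  <content type="html">
    &lt;p&gt;The complete article body, HTML-escaped.&lt;/p&gt;
  </content>
</entry>
```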

~~~
pmlnr
It has the full content. See view-source. Firefox hides it.

------
driverdan
Cool experiment!

My suggestion is to get a larger solar panel. Solar is cheap now, less than
US$1 per watt. Why not get a 100W or a 200W panel? It will put out enough
power to supply the server and router on a rainy day. The battery will only be
necessary overnight.

The price of 18650s is low enough that building your own pack to handle 30W
continuous overnight is not costly. To help with the environmental impact and
cost, you can get used cells that would otherwise be recycled or thrown away.

------
forgotpwd16
Some cool concepts there.

>Static Site

This should be an obvious choice for sites not requiring dynamic elements.
Also their choice of comments through email is a good solution to a problem
static-made blogs have.

>Dithered Images

That won't work for every case, but is interesting.

>Default typeface

That's nice; I already disallow sites from choosing their own typeface. The
only exception might be a site that is about design, but even then I probably
wouldn't prefer their choice over mine.

>No Third-Party Tracking

This makes sense, and I don't understand why self-hosted sites use third-party
tracking. The logs are there; analyzing them is all that is needed.

>No Advertising Services

I'm fine with that, but some sites rely on ads. I ain't aware of a method that
would keep with their goals and still provide ads.

~~~
TeMPOraL
> _This makes sense and I don't understand why self-hosted sites are doing
> it. The logs are there. Analyzing them is all that is needed._

1) Laziness. A third-party JS tracker usually comes with a complete dashboard,
full of pretty (and sometimes even useful) graphs.

2) Data. Client-side trackers can spy on users more, giving you more
information you can e.g. misapply in an A/B test trying to drive
"engagement".

RE 1, there exist tools aimed at analyzing server logs. I played with GoAccess
a bit, it's quite OK. [https://goaccess.io/](https://goaccess.io/)

~~~
mywittyname
You don't need a third-party tracker to get fancy graphs. Matomo is free,
open source, self-hosted, works on logs, and has fancy graphs. An ELK stack
would also work fine.

Lazy-loaded 1px images can be used for tracking how far users have read and
a/b testing can be done by compiling multiple versions and redirecting users
to the version you want them to try.

~~~
floren
If I could toot my own horn for a moment, Gravwell (gravwell.io) has a 2GB/day
free license which should be plenty to ingest web logs. We've got a GeoIP
module to resolve IPs to locations, we can display geographic heatmaps (see
[https://dev.gravwell.io/docs/#!search/map/map.md](https://dev.gravwell.io/docs/#!search/map/map.md)),
a variety of charts, tables, etc.

Matomo looks really polished and if it provides the features you need, it
seems like an obvious choice. If OTOH you're looking at rolling your own with
ELK, Gravwell might make sense.

------
dmos62
The idea is great, the software side especially is something I'd like to see
more of, and the not-always-available scheme is very commendable as well.

But the end of article stuck out at me.

> we'll be offering print on demand versions of our blog.

How is that consistent with low energy footprint? Will the printers and the
cellulose factory run on solar as well?

~~~
pmlnr
Print, on it's own, is not that high energy. Paper has been made for thousands
of years, printing has been here for 400.

But the question is indeed an intriguing one.

------
ravenstine
I once basically did just that, minus the image dithering. I ran a small blog
off a Raspberry Pi, using only Markdown files and something to render them to
HTML. I used my own IP address (which I know you're not technically supposed
to do, but in 15 years I've never had an ISP crack down on me for it) and had
a script that ran periodically to update the A record for my domain in Route
53, since my IP was dynamic.
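Such an updater might look roughly like the sketch below. The `ChangeBatch` shape follows the Route 53 API; the zone ID and domain are placeholders, and the actual `boto3` call is left commented out since it needs AWS credentials.

```python
# Hypothetical dynamic-DNS updater payload for Route 53.
def build_change_batch(domain: str, ip: str, ttl: int = 300) -> dict:
    """Build a Route 53 ChangeBatch that upserts an A record."""
    return {
        "Comment": "dynamic DNS update",
        "Changes": [{
            "Action": "UPSERT",
            "ResourceRecordSet": {
                "Name": domain,
                "Type": "A",
                "TTL": ttl,
                "ResourceRecords": [{"Value": ip}],
            },
        }],
    }

batch = build_change_batch("blog.example.com.", "203.0.113.7")
# import boto3
# boto3.client("route53").change_resource_record_sets(
#     HostedZoneId="ZEXAMPLE123", ChangeBatch=batch)
```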

Whether you're doing it for energy purposes or not, it's a fun project because
it feels a bit like sticking it to the man, since we're used to always hosting
our sites on someone else's machine.

~~~
kmgr
What is the reason for not using your own IP?

~~~
H1Supreme
I believe it goes against most terms of service agreements with non-commercial
isp plans.

~~~
rossdavidh
My guess is that this is mostly a way for them to object if you start using
too much bandwidth. Most ISPs "guarantee" way more bandwidth than they can
actually provide on a daily basis. If you're hosting a website that normally
gets low traffic, they probably won't even know, let alone care. Just my
guess.

------
Animats
And then, at the end, there's one line of useless info - a weather forecast
for their server location.

Forecast: today Partly cloudy in the morning and breezy until afternoon. /
tomorrow Partly cloudy until afternoon. / day after tomorrow Clear throughout
the day.

For that, they pulled in jQuery and made an XMLHttpRequest.

~~~
licyeus
The weather content is relevant in that their server would go offline with too
many cloudy days.

If their intent is to lower energy consumption of their server alone,
offloading this weather check is the correct choice. But surely
requesting+caching that info on their end (by perhaps rebuilding the static
site every hour with an updated forecast) would be overall more efficient than
having X number of clients request the data.

~~~
boomlinde
All it amounts to is one additional request, and the json file itself is
probably generated by a cron job, not dynamically on demand.

Surely rebuilding the website every hour (and invalidating the cache for X
number of clients) would result in more requests.

------
jbreckmckye
I think this website is a beautiful piece of engineering: simple, efficient,
clear and humane. It solves a pressing human problem with mature,
straightforward technologies.

The dithering is obviously not to everyone's tastes, but I like the paper feel
and think it's appropriate for a 'low tech' brand. Bravo to the lowtech team.

------
ObsoleteNerd
I think this is a really cool idea, actually. I help run a fair few websites
for authors/creators that don't need five nines of uptime or super-heavy
pages. I'm sure most blogs/sites would be fine being much, much lighter and
running off something like this.

------
uxp100
I like the gauge, but I wish it were confined to a corner or something; having
it as the background of the whole page is pretty distracting.

------
tapvt
Solar power, low bandwidth, clean markup, and lots of numbers to optimize.
Practical or not, this makes my inner hacker pretty happy.

------
Obi_Juan_Kenobi
Could save an easy 80 kB by cutting out jQuery. All it does is drive the
little weather widget, which could easily be written in standard JS.

~~~
baliex
In a similar vein, the weather icons are 800x800px images displayed at no
larger than 20x20px

~~~
boomlinde
Those large icons are rather compression-friendly compared to the dithered
images. It would still be a relatively huge saving at 20x20, though: after a
resize to 20x20 with ImageMagick and pngcrush -brute, the 5.5k clear-day.png
goes down to 559 bytes.

~~~
arra
800x800 is indeed excessive; we'll resize them to something smaller. jQuery is
already on its way out. Thanks for the feedback!

------
mLuby
Wonder if they've considered a gravity battery? As an additional benefit it'd
let someone local "recharge" the battery manually rather than being 100% tied
to the Sun.

~~~
uneekname
I imagine huge weights hanging off the balcony of a Barcelona apartment :)

~~~
mLuby
Could be pretty stylish I'd think. Chandelier with moving crystals? #LowTech
:P

------
jkingsbery
I'm curious how the energy usage for this site compares to something like
hosting a site on S3.

~~~
g105b
Using a datacentre such as S3 gains the benefits of economies of scale.
Per-website, S3's energy costs are going to be substantially lower than the
energy cost of a solar server: each solar server that is installed had to be
built in a factory which costs energy to run, shipped in delivery vehicles
which cost energy to drive, and the humans behind the website require energy
while building the server. All of that would easily exceed the net energy
consumption of a single website on S3.

That being said, the point of the project is to produce a website that is 100%
solar powered. They never said they were aiming for something that is feasibly
sustainable.

------
sbradford26
I would be interested to see how the additional traffic from HN is impacting
their power usage.

------
fencepost
As a way to make use of resources you'd otherwise have no use for, this is
interesting, but my feeling is that as a way to reduce "your" energy
consumption it's misguided. I'm pretty sure that the energy usage of running
even very efficient dedicated hardware is going to be a lot higher than the
incremental increase in consumption from an incredibly tiny VM instance in a
data center focused on energy efficiency.

The hardware choice is arguably overkill as well, at least if wifi is
available and in use. Run it on a repurposed old phone, eliminate its share of
the power draw of a network switch, etc.

------
gonzo41
The best part is the dithering, this website reminds me of old Chemistry lab
manuals.

------
qwerty456127
> Default typeface

I adore this, for whatever low or high tech. To me, all this web-font stuff
seems like a huge piece of BS (except for the rare cases when a website is
meant to be more a piece of art than to serve a practical function, or when
you need an exotic language whose letters are rarely found in the standard
fonts on clients' computers), and I just block external fonts all the time.

~~~
user111233
For developers it always seems like a good idea to use the system defaults,
but then you push it out to users and find out they all have insane defaults
that work like crap.

I had someone complaining to me that my mailto: link didn't work, but actually
it's just that they had the wrong mail client set as the default on their
system.

~~~
qwerty456127
Not using custom web fonts that have to be loaded from external resources
doesn't imply using the pure defaults of the client's system. You can still
specify font size and style with pure CSS.
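For example, a plain CSS system font stack gives styled text with no web-font download at all (one common pattern, an assumption rather than this site's actual stylesheet):

```css
/* Style size and leading explicitly, but let the typeface come from
   the reader's own system. */
body {
  font-family: system-ui, -apple-system, "Segoe UI", Roboto,
               "Helvetica Neue", Arial, sans-serif;
  font-size: 1rem;
  line-height: 1.5;
}
```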

------
acobster
This is one of the coolest web projects I've seen lately. Great job!

> _Increasingly, activities that could perfectly happen off-line – such as
> writing a document, filling in a spreadsheet, or storing data – are now
> requiring continuous network access. This does not combine well with
> renewable energy sources such as wind and solar power, which are not always
> available._

I'm surprised no one has mentioned progressive web apps. This seems like a
perfect use-case for them.

On a related note, I just found this research showing that Service Workers (on
which PWAs are built) "do not have a significant impact over the energy
consumption of [mobile] devices, regardless of the network conditions." [0]

[0]:
[http://www.ivanomalavolta.com/files/papers/Mobilesoft_2017.p...](http://www.ivanomalavolta.com/files/papers/Mobilesoft_2017.pdf)

------
TeMPOraL
> _The size of the average web page (defined as the average page size of the
> 500,000 most popular domains) increased from 0.45 megabytes (MB) in 2010 to
> 1.7 megabytes in June 2018._

This is the second place I've seen such low average-size values (I'd say,
somewhat reasonable ones), but I'm having a hard time believing them. On a
typical site, when I hit F12 and go to the Network tab, I usually see values
ten times greater. I suspect those statistics are either undercounting
something, or 500k is so many domains that the long tail of lightweight sites
drags the average down, making the values unrepresentative of the real
situation. I wonder how those stats would look if the average were weighted by
the number of visits.

~~~
toss1
I remember from over a decade ago there was an annual contest to make the best
web page that would fit into 5KB - yes, 5000 bytes. There were many impressive
entries.

I'd really hoped it would catch on.

Sadly, that lightweight concept is now so forgotten that I can't even find it
in a quick search. (perhaps someone else can find a pointer?)

~~~
slater
[https://the5k.org/about.php](https://the5k.org/about.php) :)

~~~
toss1
Thanks, nice!

Seems I'd misremembered it as being more recent than it was; the last one was
in 2002. They apparently 'grew' to 10K, and that site is now 404 as well.

At this point, 50K would be a welcome respite from the mountains of cruft that
download with every click...

~~~
slater
Preach!

------
kome
I'm proud of my low tech personal website:
[http://mrtno.com](http://mrtno.com) - I promise: it will never work on
smartphones, just good old computers...

I will build your static html page for money. Hit me up. ;)

~~~
lexicality
You're proud that you make webpages that can't be consumed by the primary
method most people use to get online?

~~~
wtracy
Even more confusing, the linked website seems to work just fine on my Moto.

~~~
kome
Well, it's just not optimized for them; it's plain HTML + CSS, so it should
work pretty much everywhere.

I don't have a cellphone myself, so I never tested it.

------
exabrial
Speed is your biggest feature in every application. I don't understand why the
current state of affairs _is utterly obsessed_ with lazy-loading static
content and building SPAs for things that seldom change.

------
msla
It's a good start, but the CSS is obnoxious and definitely not low-tech, and
the image dithering is a bad compromise between having real images and simply
not having images at all. In short, it's an attempt at a low-tech website made
by people who've never seen a real low-tech website, and probably wouldn't
know Lynx if they saw it in use.

~~~
eutropia
Doesn't this seem like a rather uncharitable characterization? What's an
example of a "real low-tech website" that does better than the work presented
here? Are you browsing Hacker News using Lynx or surfraw?

~~~
msla
[https://yarchive.net/](https://yarchive.net/)

That's a low-tech website. Absolutely nothing but content.

~~~
emodendroket
According to the article, the goal here was to demonstrate the possibility of
making an attractive, modern-looking Web page while still having it be
lightweight. Your example does not achieve the first goal at all.

------
BraveNewCurency
If you want to be online only part of the time, look into IPFS. Even if you go
offline, your popular pages will likely be cached for a while, and can be
cached forever if someone takes an interest in them.

------
SippinLean
I wish the content of the photographs wasn't so obscured

------
shanecleveland
Unrelated, but I really like the background/font color contrast on the site.
Makes for very easy reading to my eye.

------
bikamonki
So, how many requests per second can one serve with such a setup, without a
noticeable impact on performance?

------
lepouet
Can someone explain to me how to reproduce the image dithering look in
Photoshop, please?

~~~
glassesman
Easiest method I know of is Image > Mode > Indexed Color...

From there you can heavily reduce the color count and select the dithering
type.

------
stcredzero
Didn't Bruce Sterling or William Gibson write a story with a street tribe
called the "LoTeks"? I think it was William Gibson, in the Johnny Mnemonic
short story: Molly Millions fought an assassin wielding a monomolecular
weapon, using the LoTeks' space to defeat him with "culture shock."

------
ww520
For static sites, what's the strategy for user authentication?

~~~
TeMPOraL
What would you need user authentication for in a static site?

~~~
ww520
Just want to see how far it can be pushed beyond simple sites.

------
browsercoin
What if you moved this to Saudi Arabia, or somewhere else where it's always
sunny? I feel like this is the grassroots movement where actual
decentralization occurs: imagine a mesh network localized within cities and
then somehow branching out to others.

------
oliv__
"Progress"

------
jhabdas
Low-Tech seems a bit of a misnomer when referring to deceptive simplicity.
Wonder if they considered building After Dark.

[https://after-dark.habd.as](https://after-dark.habd.as)

------
bondant
Has anyone already tried coding on an AlphaSmart? I suppose it's a little like
using ed.

I would be happy to have a distraction-free device to focus on my code, but I
am not sure the screen is large enough to really work on.

