1. Write your article in Microsoft Word.
2. Save it as 'Web Page, Filtered'.
3. Upload/place it in the html folder on your web server.
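For step 3, if your host gives you SSH/SFTP access, a single command is enough; this is just a sketch, and the user, host and paths are placeholders for whatever your hosting provider gives you (otherwise any FTP client or the host's web file manager works):

    # copy the exported page plus the assets folder Word creates alongside it
    scp -r article.htm article_files/ user@example.com:/var/www/html/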
Almost everyone has access to a copy of Word; if not, I think LibreOffice has an HTML save function as well.
This solution is 100% wrong for us technical people, but for non-techies who just want to get the information out there it's a really good option imho.
Item #3 would be the most difficult task of the lot for "non techies" in my experience: FTP client, web hosting setup, ...
I taught a class (10 people, master's degree, journalists) about web and publishing, with no prior experience. We used GitLab Pages and Hugo, with the intent of getting some hands-on experience with web tech and owning an actual project and website.
Markdown writing and HTML/CSS hacking went smoothly but most of all, students were impressed by the experience of getting a website online, with their content, in a matter of minutes, all in the browser.
In comparison, registering for some HTTP/FTP plan somewhere and using an FTP client can be quite daunting.
There are some "dropbox based" website generator services that could make the publishing workflow even easier.
How about https://www.netlify.com? Recently had to teach a non-tech person how to host a website from scratch. With all the non-trivial requirements (dns, vps, web server), I thought it would be easiest to just use netlify. That way, a git push on the repo will deploy the site immediately.
"A push on the git repo" is not something I would think trivial for a non-tech person. However, netlify also supports just drag-and-dropping a zip onto the dashboard. That's probably as easy as it gets.
Yep, I had to explain the concept of a version control system, using the old my_doc_file.docx, my_doc_file_final.docx, my_doc_file_final_final.docx, ...
I just told him when to use the commands git init, remote, add, commit, and push and what the workflow looks like.
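Concretely, the whole workflow came down to something like this (the remote URL is a placeholder):

    git init                                   # start tracking the site folder
    git remote add origin https://gitlab.com/youruser/yoursite.git
    git add .                                  # stage every changed file
    git commit -m "Update article"             # snapshot the changes
    git push -u origin master                  # publish; the host rebuilds/deploys from here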
As a couple of people have mentioned, there is a drag and drop feature. How did I miss this!? Anyway it would have saved a couple of minutes of explanation.
But then you need to create that zip file with the appropriate folder and file structure. I fear that would be too much to ask of a non-technical person who struggles with doing git commit.
You actually don't even have to zip it up. It just needs to be a folder with an index.html file in it. If Microsoft Word exports that, your non–technical person should be fine.
I don't know how I missed this. To be fair the "New site from git" button looks bright and green, while the "drag and drop your site folder here" looks dull and grey.
I think a lot of cheap shared web hosts let you upload via browser in a drag-and-drop widget. For example, neocities has a basic drag-and-drop uploader.
Thanks for mentioning Filtered in step 2. I always thought Word generates a ridiculously messy tag soup, but that's in default HTML export. I tested it with Filtered now, and the final markup is surprisingly clean.
It actually is a half-decent solution for static HTML generation. Wish I had known that in the past.
This really really isn't anywhere near true anymore.
Also, as mentioned, the complexity of bullet point 3 is enormously understated. Many, many SaaS options, like Squarespace or even Netlify or Siteleaf, are in themselves much simpler than your step 3.
Google Docs (and many note-taking apps) let you export to HTML as well. Word Online lets you "transform" a document into a website and gives you a link to send to others, but it's not really the same, since there are only preset themes and none of them are as simple as a Word/Doc/Note exported as pure HTML.
Also available: BlueGriffon, an open-source WYSIWYG editor in the spirit of Dreamweaver. It offers templates and CSS, and what's uploaded is a totally static page unless you stick in something dynamic like a form.
I am sort of surprised that talk about static websites is only now becoming top of mind. When I think of static websites used with success, I always think of Obama's campaign site, which was built with Jekyll. There was quite a bit of talk about static websites around that time, and it seemed like quite a few people were on board with the idea.
When people talk about static website generators, though, I don't think many of them are talking about tools like Dreamweaver.
A lot more people would be talking about tools like Dreamweaver (and FrontPage) if they weren't long gone. It's been nearly two decades since the last "proper" release. Some of the spirit of Dreamweaver lives on in other tools in Adobe's Creative Cloud, but those are harder to use and more expensive.
I think of Dreamweaver more as a WYSIWYG tool than a static site generator. Although you can obviously generate static sites with it, I don't think it really fits the modern workflows I would associate with static site generators.
Dreamweaver and FrontPage both had very nascent/early ideas about content management; they weren't "just" WYSIWYG tools towards the end. Given how much work goes into making static site generators WYSIWYG when building content (with tools like Netlify CMS), the "ideal" is probably somewhere in the middle.
Had PHP hosting not gotten so cheap so quickly, with early PHP CMSes subsequently becoming so common, there might have been an alternate universe where Dreamweaver or FrontPage continued to evolve into really interesting static-site-generating content management systems.
As pointed out in the article, not only do you avoid 502 Bad Gateway errors, but rolling out static sites is also trivially downtime-free, meaning no 503s either.[1]
Two questions to ask yourself when deciding whether a site can be static:
1. If two users access the same page, do they see the same content?
2. How often do you update the content of the website?
The reason to ask yourself 1) is to know if the content has to be generated on the fly. If it's a blog post or a hospital "about" page, then the content is the same for everyone. There is definitely no need to fetch it from the database on every page load. Like Eric says, "get it down to static HTML and CSS".
Cases where the content is dynamic are social networks (FB, Twitter), search engines (Google, any ecommerce search feature), and any logged in content (cart, likes, friends…). For the latter, I've seen cases where the logged in content is only fetched via JS after the initial page load. This hybrid solution probably saves a lot of bandwidth.
The reason to ask yourself 2) is to know how long, as a content creator, you are willing to wait between updates. I can imagine that updates to a blog post or a hospital "about" page are allowed to take several minutes. In that case, static websites are favourable.
I guess the acceptable waiting time is correlated with the frequency of updates: the more frequent the updates, the shorter the waiting time you would want.
The reason CMSs like WordPress are still popular is their ease of use. I've had several clients ask me specifically for a WP solution. I guess the issue is that it's hard to detach the dynamically created content from a dynamically served website. There are cache plugins, but I'm still looking for a custom solution where WP is only used as a non-public CMS that then, at the push of a button, gets generated into a static website (or even a Gatsby one).
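One rough way to get that "push of a button" today, assuming the WordPress instance is only reachable internally (the hostname, user and paths below are placeholders), is to crawl it with wget and publish the mirror:

    # crawl the private WordPress instance and rewrite links for static hosting
    wget --mirror --convert-links --adjust-extension --page-requisites --no-parent http://wp.internal/
    # push the mirrored copy (wget names the folder after the host) to the public server
    rsync -avz --delete wp.internal/ deploy@public-host:/var/www/site/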
There is a lot of room between “generate everything from the database on page load” and “fully static site”. We can expect that many of the people editing crucial sites at this time are not especially technically literate, are under time and other pressures, and do not have the spare time to learn something completely new, such as authoring a static site.
Given those criteria, I think a better solution to these problems would be some sort of caching layer, either built in to the CMS or perhaps a different server acting as a reverse proxy.
I've been working on a restaurant order system for a client lately.
Instead of doing the obvious thing and building a database-connected web application hosted in the cloud, I opted for a local application that generates static files for customer ordering, uploads them to a web server, and polls for orders. On the server there's a simple PHP script that writes a chunk of JSON to a file when an order is created.
The main motivation was to allow them to continue taking orders in the restaurant and perform maintenance even if the internet connection isn't working. Generating files also simplifies the implementation of customer specific customizations.
And since the database isn't customer/web facing any more and therefore doesn't need to scale beyond a couple of simultaneous users, I opted for a simple file based solution.
The fact that the customer order part runs blazing fast is a nice bonus.
I've done this before too and it works well for a good while. The issues start coming when the number of files increases to a volume where the hosting machine becomes slow. Having millions of tiny files takes its toll on even the best file system. However, this can be mitigated by periodically archiving files. Just something to watch out for.
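For example, a periodic job along these lines keeps the live directory small (the paths and the 30-day cutoff are placeholders):

    # bundle order files older than 30 days into a dated archive, then delete the originals
    find /var/orders -name '*.json' -mtime +30 -print0 \
      | tar --null --files-from=- --remove-files -czf "orders-$(date +%F).tar.gz"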
I once made an HTML version that takes a query. JS finds the line matching the text, then hides everything before the previous separator and after the next one. If a key is pressed it seeks the next match; if there isn't a (next) match, foo-07.html forwards the query to foo-08.html, which might not exist (haha). It shouldn't be so hard to update the JS with a max value, or to check whether the next file exists first, but even without such luxury it worked amazingly well for the amount of code involved.
Sorry for the dumb questions, I was just trying to understand the precise architecture and solution here.
So the "local application that generates static files" is the one used by employees of the restaurant when a customer makes an order? Or is that used just for maintenance purposes when changing formats of the orders customers can do. But what I was confused most about was what "polls for orders" meant in this context.
The other thing I was confused about was the generation of "static files for customer ordering". So the application generates these files, which are presumably an HTML form (and/or a corresponding PHP script), and uploads them to the server. And the server's static HTML pages that are used to order are accessed either locally on the box itself or through another computer on the local network (since you mentioned that it can run even when the internet is down). Is my understanding correct?
The application is used for everything that happens locally, including taking orders in the restaurant and organizing deliveries.
"Polls for orders" means it reads the JSON file that the PHP script appends to when a customer places an order from the web interface, and imports new orders into the local database.
The application generates JSON files, which are then read by the static HTML files. This makes it easier to test with stub data and allows shipping the HTML/CSS to a designer if/when they want something nicer looking.
The HTML files and generated JSON data are never used locally.
Put a caching reverse proxy, like Varnish, in front of the dynamic content. That way the technically less inclined can keep using traditional UIs. Keep in mind that most people don't even grok the static/dynamic distinction; they just want to publish content.
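As a minimal sketch, assuming the CMS already listens on port 8080 on the same box (addresses and ports are placeholders), Varnish can sit in front of it with a one-liner and serve cached copies of the public pages:

    # Varnish answers on port 80 and caches responses from the CMS backend
    varnishd -a :80 -b 127.0.0.1:8080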
This appears to be well-meaning (which is to say it's not the usual tautological whinging tech hate for dynamic sites, it appears to come from a place of concern for delivering important information in a crisis). But it's also quite a lot to ask in a crisis. Stuff like this should set off alarms:
> I can’t tell you how best to get static—only you can figure that out.
Okay. Is this helping? I guarantee anyone responsible for uptime of critical services is pretty overwhelmed right now, and not going to be immediately receptive to calls to make huge, abrupt architectural changes without any guidance.
Edit: this is how it should be done[1] (HN thread[2])
This is good advice, and is meant for service based sites that are now more important than ever. But I’m not sure how many service providers can even afford to spend time and money on such things and risk more failures in this process than the current timeouts or other issues. The largest companies would’ve already optimized these (to a great extent) long before this pandemic.
One can only hope that in the coming months, websites focus on reducing their bloat and catering to audiences with restricted bandwidth (if streaming services are already doing their bit to reduce traffic, so should others).
For truly static websites, the ones without a lot of dynamic or frequently changing content, what good CMSes exist that can easily adapt to or be used with SSGs (for self-hosting, not using Netlify or some other site)?
In case you're in the situation where you're considering moving your site from a CMS to something static: I did this with PhotoStructure's public site in December, switching from Ghost to Hugo. It's fantastic to be able to prop up a new site in seconds with an rsync command, and revert to prior versions with filesystem snapshots or standard backup software.
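For reference, the whole deploy amounts to something like this (host and path are placeholders; Hugo writes the generated site to public/ by default):

    hugo && rsync -avz --delete public/ deploy@example.com:/var/www/site/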
Something I found during the weekend, if you have PowerShell available (so any Windows machine, or any macOS with brew installed, or linux if you're willing to add a pkg source from microsoft):
1. Take a Markdown document
2. In PowerShell: ConvertTo-Html -Body (ConvertFrom-Markdown -Path .\README.md).Html -CssUri "stylesheet.css"
That results in a complete and valid HTML document that can be published anywhere along with its assets.
Updates in the field should be a solved problem with a CI/CD pipeline that is actually doing the work. I use Hugo to generate a few small blogs, but don't do any of the generation locally, nor uploading. Quick updates don't feel like much of a pain in these cases (update the file, commit, push), but agreed that on a big site it will be a bigger update.
I do this too for some sites. But it's what I mean by adding extra tooling. You can do that, but it creeps closer toward something more Wordpress + CDN like.
If Hugo is only changing the file with the text edit, awscli sync should only transfer the changed file to S3. If it's transferring all of them, it's because Hugo is touching all of them.
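Roughly, the deploy step is the following (the bucket name is a placeholder); note that aws s3 sync compares timestamps as well as sizes, so if Hugo rewrites every output file on each build you may need --size-only to avoid re-uploading unchanged pages:

    hugo && aws s3 sync public/ s3://example-bucket/ --delete --size-only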
I have written sites using CMSes that load lickety-split. The trick is to use simple, “bald,” basic themes, very few plugins, no JS libraries, and hand-write your own CSS.
Depends on what the site does, and, most importantly, where the content originates.
That said, Eric Meyer rocks, and it’s always a good idea to listen to him.
Totally agree - the bottleneck is bloated themes. Just started to use Framework7 for a new site... I like its compact design and great features for designing mobile sites.
I like React's paradigm of declarative programming, and I like that it has static site generators like Gatsby, react-static and Next.js. React is interesting because it combines a programming paradigm with a method of delivery, so it's nice that one can decouple the two.
I can't count the number of times when download + rendering time has been the difference between a useful and a useless response to a "do I still have time?" query. Imagine being in that situation in an emergency.
I was really wondering if any of the FAANGs reached out to help websites like https://www.worldometers.info/coronavirus/, which, while not that fancy, cannot really be 100% static, as they update the virus numbers almost by the minute. They had some DB connection problems a couple of days ago and I think I also saw some 503 responses, but thankfully things have stabilised for them since then.
I'm practicing for system design interviews, I'd be very curious to have some design ideas for a site like https://www.worldometers.info/coronavirus/. Seems to be a good case study. Can anyone provide some hints or pointers?
A static site generator gives you many of the advantages of a CMS's on-the-fly rendering, while also giving you all the advantages of static hosting.
If you use PHP the way it was initially meant to be used (i.e. includes, variables, and helper functions where they make sense), a site doesn't need to be 100% static in the way the author is suggesting to be orders of magnitude more efficient than a full-fledged WordPress site.
https://jamstack.org/
"Fast and secure sites and apps delivered by pre-rendering files and serving them directly from a CDN, removing the requirement to manage or run web servers."
Any piece of functionality that can't be done completely statically now becomes a managed piece of SaaS that'll probably go out of business by next tuesday.
CDNs are not by definition "more reliable" than a single webserver, especially one with little traffic. If DDOS is your problem, you need proxies, not plain CDNs.
Plus you're probably going to use lots of different CDNs, because all of that static content comes from different sources and it's "best practice" to get it from there, or maybe your SaaS forces you to use theirs.
Bankruptcy as a SPoF is going to be an important thing to consider, but it is really only a bad corner case of vendor lock-in, so not a novel problem. I do think CDNs are becoming attractive relative to other hosting models for pretty much any public static content, though. The more complicated services handling dynamic content and functions are going to require more thought as to how one chooses to serve them.
Nothing wrong with React. React has built in server rendering. With frameworks like Gatsby that render to static assets and netlify for deployment, this becomes radically easy.
A lot of the time, you use a CMS so that non-technical people can manage content. I'm also not aware of any static site generator that doesn't completely fail in this regard.
You can turn some CMS content into static content, but it's pretty limited. Even something as simple as a contact form is going to fail.
Static is not the way to go. There are technologies today which can scale fast without crashing. You don’t need static sites to handle the load.
Check out a serverless CMS like Webiny - https://www.webiny.com
Really? People are hunkering down in their homes and have lots of time on their hands. If a web site is overloaded, they can try again later. Annoying, sure. Life or death? I am skeptical.
Moreover, “getting static” is nice and a fine way to improve performance, but it seems to me that really the goal should be, “be available.” What’s wrong with using a CMS or anything else as long as you have the capacity to meet demand?
I feel like few people even know about static site generators, and instead use general-purpose solutions for anything that includes the word "site" in it, whether it's a dynamic social portal or a single page of static content.
Like, from a practical perspective, if you have a blog with content that is requested 1000 times per minute but updated once per month (about 23 micro-updates per minute), why on earth would you set up a database (with a geo-redundant replica, of course) and run overly complex code to render the same HTML over and over every single time it's requested? The only possible answer here is being oblivious to other available options, and I believe the author of the post is trying to raise awareness that there is another option.
Most websites are more dynamic than they look. Anything with frequent updates (news for example), user accounts, any kind of personalization / geo-based content, or running A/B testing will demand dynamic rendering.
That's not an excuse for terrible performance either way. At this point it's trivial to handle hundreds (or thousands) of req/s on common hardware if you just choose the right software to run.
The JAMstack pitch is to sprinkle a bit of JavaScript on a mainly static site to achieve personalization, geo-based content, etc. A recent intro video shows exactly how to do that with news displayed based on browser language settings, etc.[0] It is a bit more complicated than a plain static site, but with recent tooling, not much more complicated.
Even then, it's only a question of how long "stale" data can be served, and whether the dynamic-ness is worth it.
You can use Lambda/cloud functions as webhooks for your CMS to ping when a rebuild of the static site is necessary. Netlify helps you set this up easily.
Even if you updated a dozen times a day, it would take only a few minutes or less to rebuild and deploy.
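To make that concrete: with Netlify you create a build hook and have the CMS POST to it whenever content is published, something like this (the hook ID is a placeholder):

    # CMS "publish" webhook target: triggers a fresh build and deploy of the static site
    curl -X POST -d '{}' https://api.netlify.com/build_hooks/your-hook-id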
Edit: yes, per-user data is a primary use case for dynamically rendered sites.
But public pages that all users load make sense as static content if possible.
A government website, or anything of importance during an emergency, really has no excuse when it comes to being excessively heavy, both on the front end and the back end.
At times of panic you want to convey information as fast as possible and as efficiently as possible. Five nines availability.
I agree completely with the goal of availability. I don’t even disagree that static pages could be useful toward this.
What I take exception to is the idea that static pages are the one true way to achieve this goal, and TFA’s pronouncing that not using static pages is irresponsible.
Yes, because that next alternative could be a scam site. When people are desperate for information, they will get it wherever they could find it, even if it is garbage. It is much better to have the good sites stay up.
> get it down to static HTML and CSS and maybe a tiny bit of enhancing JS, and pare away every byte you can.
HTML is a language used for building your DOM. CSS is a language used for styling. JavaScript is also a language that can do those things, and focusing on the language used isn't going to help.
If the goal is to save space and power you can do it much better with JS. A lot of server side rendering, html caches, and static content is focused on good SEO and low latency, not small download size and power consumption.
You can get a much smaller download out of a stripped-down Gatsby site if you have a nontrivial amount of content. It also makes things like serving much smaller versions of any images way easier. You can go even smaller with vanilla JS if you want.
Specifying content, styling, and layout are what HTML and CSS were designed to do, and they do it quite well. Throwing the bigger hammer of Javascript at this problem means anybody who likes to browse without Javascript (which is quite a few of us in this era of abusive ads and malware) is not going to see your site properly. It also means you've opened the door for some future maintainer to introduce a multi-megabyte Javascript library 98% of which isn't actually needed, and which will very much consume a lot more power on the client's platform than HTML and CSS.
Never use a Turing-complete language to solve a problem that's already well-solved with a non-Turing-complete language. Especially when that non-Turing-complete language is already present on the client's machine. Most especially when you're talking about static sites for essential public information.
Actually, JavaScript often is not good for that either. Static sites compress really well, and static HTML typically renders orders of magnitude faster than running JavaScript to create commands to eventually do the same thing. In addition, running JavaScript is typically far more power draining on a client than rendering HTML. Web browsers are extremely optimized to render HTML. There is also the advantage of working correctly when people do not have JavaScript enabled.
On the other hand, if you need a site to work even with spotty internet access, JavaScript is probably your answer.
But these are generalities, you can probably find a thousand exceptions. It is hard to give truly general advice because there are so many exceptions.
That said, there are many good reasons to have static sites when you have to scale up big. They typically can handle very large scale, and caching them is super easy.
Whether or not that actually meets the requirements of a site is a completely different story.
HTML is the serialized representation of a DOM. It's the closest to the metal that you can get, and it's far more efficient at what it does (thus faster, thus consuming less power) than JS is. Byte for byte, JS is the most expensive part of the web stack, whereas HTML is the cheapest.