
Why WeWork.com uses a static generator - yale
https://engineering.wework.com/engineering/2015/12/08/why-wework-com-uses-a-static-generator-and-why-you-should-too/
======
orestes910
I'm still struggling to get on board with this. It seems to just favor speed
above all else. I found the linked article at
[http://www.smashingmagazine.com/2015/11/modern-static-website-generators-next-big-thing/](http://www.smashingmagazine.com/2015/11/modern-static-website-generators-next-big-thing/)
to be a bit more helpful in pitching the case, but
even so it seems like just pulling the complexity of dynamic sites into the
build stage of the static ones all because writing efficient DB queries is
"hard". For a company's front page, I can see the benefit of generating
content once instead of for every client, but what happens when you need to
present information beyond the generic? What happens when you need to show a
user's order history? Let them change their password? Allow them to see the
distance of that run?

I feel old and crotchety, but I just find it a bit strange that as we get more
and more obsessed with data, this emerges: something that seems less than
optimal for dealing with it. They make the comparison themselves, but it seems
like reverting from "Web Applications" back to "Web Sites" and just filling it
with API calls to actual Web Apps.

~~~
exelius
The "right" way to do this (edit: this is essentially assumed by the article,
so I'm not contradicting it) is to build static sites and use JavaScript to
pull in any dynamic information. You run data services that output JSON, then
the JS on the page calls those data services and renders the views.

This still allows you to reap the benefits of static content (namely CDN
distribution / caching) while still maintaining some dynamic content.
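To make that concrete, here's a minimal sketch of the pattern (the endpoint and field names are made up): the static page ships a script that fetches the user-specific JSON and renders it client-side.

```javascript
// Pure render step: turn an array of order objects into HTML.
// (Field names `id` and `total` are hypothetical.)
function renderOrders(orders) {
  return orders
    .map(o => `<div class="order">#${o.id}: ${o.total}</div>`)
    .join("\n");
}

// In the browser you'd wire it up roughly like this:
// fetch("/api/orders")                     // hypothetical data service
//   .then(res => res.json())
//   .then(orders => {
//     document.getElementById("orders").innerHTML = renderOrders(orders);
//   });
```

Everything outside that one `fetch` is static and CDN-cacheable.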

But basically, you do it this way to avoid pulling data that isn't user-
specific from the database. It's far more efficient to build a header + page +
footer once at build time than to have every client rebuild it on each access
-- even if you have 100+ variations of your site that need to be compiled and
updated on any change in the shared content. Storage and compute resources are
trivially cheap as long as they don't scale with the number of users.

Yeah, it's more work for your developers, but it saves your DBAs a LOT of work
by reducing the volume of data served from the database. And at the end of the
day, it's full-stack effort that counts -- I'd rather have my developers spend
2x the time to build a system than have my DBAs fighting fires because the
site went down. With a static site, you can set it to read-only mode if the
traffic gets to be too much, and the static content will be handled by the CDN
(which can almost certainly handle any load).

~~~
Albright
Using JS to present data moves the work of generating the display of that data
from the build process (as with "pure" static site generation) or from server-
side code (as with a standard CMS) to the client, where it is most likely to
fail in unpredictable ways due to variances in browser JS or CSS engines,
network performance, browser extensions, and so on.

But given that the concept of progressive enhancement seems to have been
completely lost on the latest generation of web developers, who cares, right?

~~~
exelius
Well, the key here is that any non-JSON data you pull from a URL should be
static. That means it can be cached, and the client should only be pulling a
small amount of data that varies based on their account.

I wouldn't write an interactive web application this way, but for sites that
are mostly content the approach works fine. You still have to test on multiple
browsers, and you still have to write code that handles the differences in
browsers/engines/etc. But your servers are doing less work, and the content
reaches the customer faster. It's still up to you to optimize your JS (though
I guarantee most tracking cookies are taxing JS far more than rendering a few
divs will).

~~~
Albright
> the content reaches the customer faster.

Does it? Is it really slower for your customers to download a server-generated
page than for them to download a static page, (probably) download a JavaScript
file embedded in the page, execute the JavaScript, then download and process
server-generated JSON?

~~~
exelius
Considering they're likely also downloading JavaScript and executing it when
using the server-generated page, yes. Let's not pretend it's possible to do
everything on the server side. Just now instead of compiling the page in real
time when it's accessed, we compile large parts of it long before the user
requests it.

------
thecodemonkey
Static websites are great of course, and the article outlines that nicely.

The problem that I've been facing personally is that most of the time, the
people who are updating and maintaining content on marketing websites are not
developers.

So having to use the command line for static site generators such as Jekyll or
Middleman is just not a good experience. Don't get me wrong, I LOVE static
site generators in general and I'm using both Middleman and Sculpin myself.

Other people have seen this problem too, and some user-friendly-ish static site
generators have started to surface, but I want to chime in with a solution to
that too since I believe that the best way to adopt static site generators is
to use the tools that content creators and maintainers already love and are
familiar with, e.g. WordPress which now has a whopping 25% marketshare[1].

I built SpudPress[2] with a friend. It is a hosted static site generator for
WordPress. We automatically generate a static version of your WordPress site
and host it on a super fast CDN. You don't have to worry about any of the edge
cases of generating a static copy, we take care of all that automatically for
you.

[1] [http://ma.tt/2015/11/seventy-five-to-go/](http://ma.tt/2015/11/seventy-five-to-go/)
[2] [https://spudpress.com](https://spudpress.com)

~~~
icebraining
Movable Type, one of the original blogging platforms, worked (and works)
exactly like that, though it also supports dynamic publishing. For example,
Jeff Atwood's blog has always been statically generated:
[http://blog.codinghorror.com/coding-horror-movable-type-since-2004/](http://blog.codinghorror.com/coding-horror-movable-type-since-2004/)

~~~
thecodemonkey
That's really cool! I like that. It's nice to see that a platform like Movable
Type has had this feature right off the bat for such a long time.

The main reason that we decided to build SpudPress this way is that you can
more or less take your existing WordPress site and instantly make it static.

In contrast to Jeff's Movable Type blog, we're taking full advantage of the
static pages to host the entire site on a CDN + handle asset cache validation
automatically.

------
oconnor663
> One key requirement for us when evaluating different static site generators
> was the ability to hit an API endpoint and dynamically generate static pages
> based on the data returned.

...dynamically generate static pages?

~~~
jnbiche
Basically, this means you hit an API endpoint that returns an array of JSON
objects, each of which is used to generate an HTML page.

Contentful offers this as a service, along with web frontends for updating the
data for those APIs.

------
captn3m0
I run my personal sites and a couple of others on static site generators (mostly
Jekyll), and it really works out well. The most complicated and interesting
case is that of hackercouch.com, where we are running custom Jekyll plugins to
even serve an API.

Another website, recently setup for the chennai floods by a friend uses a
Google Spreadsheet as a database, but is served as a static site:
[http://chennairains.org/](http://chennairains.org/).

It's far easier to deploy a static site; they're portable, and shifting hosts
is much simpler. Your database considerations also become less relevant, since
the database only affects deploy speed, not site performance.

------
Dr_tldr
Some serious questions:

1\. Why not just use fs.writeFile to create all the site info as JSON files,
either as a one-off or a cron job, then run a gulp task to either use something
like jade or else roll your own rendering to put the JSON in the right place
on the page, then output the result as html to a public folder, set up an
express static server and have a catch-all splat after it for 404s? What am I
missing here?

2\. If your site is fairly static like WeWork's, why not just have it as a
SPWA, set all the links to retrieve a JSON file from the static server and
process them on the front end? You could even use history.pushState and check
window.location to make sure that history works and links would load the
right thing.

3\. When someone logs in, are you doing a DB lookup then serving their
dashboard page statically (somehow, for some reason...), or does this static
build only apply to part of your page and not others?

4\. Could everything Roots does be replaced by one 6 line gulp task? Not
trying to be mean, just wondering if its target userbase knows what gulp is.

~~~
bobfunk
For point 1 - that will give you way worse performance than WeWork's current
setup, since all requests will have to go back to your origin where your
express server is running in order to determine if it's a static request or a
dynamic request.

With Netlify this happens at the CDN PoP, which makes a huge difference for a
site with a global audience like WeWork's.

~~~
Dr_tldr
Sure, but that's not an advantage unique to Netlify, is it? I mean, one could
put Amazon CloudFront in front of the express server and have the same
situation and a more competitive pricing structure. Or is there something else
they do that I'm missing?

------
tarr11
Looks like the engineers made this decision:

"If you consider the amount of information that changes on a daily, or even
weekly basis on a site like wework.com, it is actually quite wasteful to have
a server process each and every request that comes through."

A better solution for a marketing site is to use a CMS (doesn't really matter
which - Wordpress, Rails, etc) and then use a CDN like Cloudflare to proxy
static pages and speed things up. It's fast, efficient, and flexible.

The solution wework used optimizes for performance but reduces flexibility.
I'd bet the marketing team at wework wishes that they didn't have to redeploy
the entire site every time they wanted to change some text.

~~~
kavok
With some static site generators you don't have to redeploy the entire site.
Just the changed files. You can rapidly update the site with no issues. I'm
not sure if a CMS is really a good solution for a site that changes
infrequently or when updates typically require developer interaction anyway.

~~~
chriswarbo
> With some static site generators you don't have to redeploy the entire site.
> Just the changed files.

My static site does this. I use make.
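The whole thing is a few lines; a minimal sketch (the paths are made up and the pandoc call is just an example converter):

```make
SOURCES := $(wildcard posts/*.md)
PAGES   := $(patsubst posts/%.md,public/%.html,$(SOURCES))

all: $(PAGES)

# only pages whose source changed get rebuilt (and redeployed)
public/%.html: posts/%.md
	mkdir -p public
	pandoc $< -o $@
```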

------
orthecreedence
"Why I immediately click off of blog posts with titles containing 'and you
should too' and you should too"

~~~
aji
Judging posts based on titling patterns considered harmful

~~~
reinhardt
Premature generalization is the root of all evil

