Wait, are we really talking about a static website?
I'm a big fan of React and the whole JS ecosystem, but when I need a static website, I just make my bunch of HTML files (copy/pasting the same header/footer into each), one CSS file, and maybe 2 lines of JS directly in a <script> tag to toggle a menu on mobile devices. I upload it to S3/CloudFront and that's it.
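For the record, that "2 lines of JS" menu toggle really can be this small (a sketch, assuming a nav with id="menu" and a CSS class .open that shows it on mobile — the names are illustrative):

```html
<button onclick="document.getElementById('menu').classList.toggle('open')">Menu</button>
<!-- .open is the CSS class that makes the nav visible on small screens -->
```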
In this context, it means static HTTP responses, but dynamic HTML/DOM.
I want a word to mean really static.
I'm in the core Phenomic team.
The idea in Phenomic is that we generate for each page:
- the HTML entry point, rendered as it would be by ReactDOMServer
- the data-requirements of the page
That lets us offer the following:
- Any page is accessible directly, without runtime (which has advantages regarding performance and SEO)
- Navigation works even if JS is disabled
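Concretely, the generated output looks something like this (the paths are illustrative, not Phenomic's exact layout):

```
build/
├── index.html    ← prerendered with ReactDOMServer, works without JS
├── index.json    ← the page's data requirements, fetched on client-side navigation
└── about/
    ├── index.html
    └── index.json
```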
Phenomic really is static, it just uses a few techniques that we learnt from the modern front-end development workflows and capabilities :)
But the React app can also start consuming all kinds of webservice.
This way (1) a large part of content can be served as prerendered static HTTP responses (HTML, CSS, JS). (2) Static content not initially loaded can be served as JSON (as seen in the presentation). And (3) dynamic content (e.g. real time comments or a chat) can be non-static (one or more webservices, possibly on WebSocket thereby going beyond HTTP).
I think this may be very interesting when "big mostly static content", "need for modern web tech" and "big traffic" meet.
I can see good benefits of having a way to do static site generation with the React ecosystem if you're already familiar with it.
If the website is simple, you could use iframes for headers and footers.
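That is, something like this on every page (header.html being one shared file):

```html
<!-- One shared header file, embedded on each page -->
<iframe src="header.html" title="Site header"></iframe>
```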
<!--#include virtual="../snippet.html" -->
First, you have a need. Then you get a crippled native "solution", that emerges from some committee and therefore doesn't really solve anything. In lieu of alternatives, a lot of people use it anyway. Then the problem is ignored long enough so other people come up with hacks to solve the same problem slightly less bad. Then the problem is ignored semi-permanently because 'hey, you can just use one of those hacks!'. (Look up iframe seamless attribute.)
The workflow and output remain largely unchanged, but something like Jekyll would probably make the resulting codebase a little cleaner.
Otherwise, I completely agree. There are better tools for building static sites.
You could at least use server side includes (SSI) to include those snippets.
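For reference, SSI usually just needs to be switched on server-side. A minimal sketch, assuming Apache with mod_include enabled:

```apache
# Process .shtml files for SSI directives like <!--#include virtual="..." -->
Options +Includes
AddType text/html .shtml
AddOutputFilter INCLUDES .shtml
```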
Static website generators like Jekyll are pretty cool, though. You should definitely check them out.
I added a Game of Life mostly to prove that the extensibility covered the screen modes.
You copy/paste code? This is crazy.
Throwing together a small static website with a few pages, in my experience, is one of those situations.
I totally agree that for many sites, using Phenomic or Gatsby would be overkill.
But for something small, it doesn't really matter what tool you use as long as it's familiar. For small projects, familiarity trumps any other concern.
React came out of Facebook, which needed a frontend technology that could scale to thousands of developers.
Gatsby and Phenomic are both an attempt to port the best of the React ecosystem to the world of building web sites. They're designed so that as your website gets larger and you add more people, the code still feels simple and it's still easy to make changes and add new features.
So yes there's a learning curve and some overhead but it really pays off for larger projects.
And the nice thing is that once you understand them, they're now familiar so just as easy to use on small projects as any other solution.
What will it take to dispel the myth that JS rendering isn't SEO friendly? There seems to be so much confusion on this point, but the fact is that as long as your UI is rendered synchronously (i.e., any data needed for the UI doesn't have to be fetched from a server), the page will be crawlable by Google.
Further, the goal of isomorphic rendering/rendering static sites in general is not to serve individuals who have JS disabled (that’s a bonus, sure), but rather to remove JS loading and execution from the equation with the end goal of faster paint times for your application.
To be clear, this is a really cool project but I fear that many of us throw out keyword spam like SEO and UX to make something more digestible, when the reality is that the performance gain is the real hero!
The other engineers and I were flabbergasted ("it's not supposed to make a difference!"), but our SEO expert was not surprised that there were inconsistencies in the Googlebot documentation. It goes to show that we still don't have much transparency into Google's algorithms.
I suspect that, like you say, it was a speed-to-paint thing. But when you're in a crowded keyword space it makes a difference. (As opposed to when you're just trying to rank for your own unique name; looking at you, Preact ;) )
Yes but the performance gain can have an impact on SEO. I'll have to find the article but I believe the StackOverflow guys figured this out early on and blogged about it.
I have been fighting this battle for a while at different places, but I can't definitively prove it's true. Most SEO experts I work with seem afraid to upset the apple cart, explaining that Google might be good at it, but what about Yandex or Baidu? I know there are small tests, but has a large corporation where SEO really matters made the switch without a meaningful SEO hit?
I take back everything I've ever said about web development. Those horrible days are behind us. Thank you to the React, TypeScript, and Phenomic teams. I can easily base my next 5-10 years of web development work off the frameworks and directions you've set out for us.
Search can already be done using Jekyll https://blog.algolia.com/instant-search-blog-documentation-j... (same idea of indexing at build time).
Jekyll can hot reload with jekyll serve.
Anyone who is familiar with React is familiar with HTML and CSS, whereas the opposite is not necessarily true. This means if you are able to use Phenomic, you are able to use Jekyll, but not vice versa.
The NPM ecosystem is useful for authoring templates, not for the end user. If I were writing a Jekyll template I'd probably use webpack and npm, but I don't need that baked into the static site generator itself.
Finally, page load is a really dubious claim; here are some numbers:
On phenomic.io, the HTML for the index page weighs 3 KB gzipped (measured by copying the HTML, removing what can be externalized and cached like styles + scripts, and gzipping index.html).
When using Phenomic to do client-side loading, the JSON for the different pages ranges from 600 B to 2.9 KB (as seen in the network tools when clicking the links in the top nav).
A page load with Phenomic therefore saves you:
* 2 KB of bandwidth for some pages
* a few 304 Not Modified requests for static resources, which are negligible if you use HTTP/2
On the flip side of the coin, phenomic.js, the script bundle that makes all this magic possible, weighs 132 KB, i.e., 44 times the size of the content the user wants to view.
I can't see the value here.
But in any case, client-side routing to me is a nice-to-have not the killer feature for Gatsby. Building web sites with the React component model is the killer feature for me.
FWIW, while I'm in the US so don't really understand developing for very poor networks, the methods I'm describing are exactly what companies in India and other places with poor networks are adopting: https://developers.google.com/web/showcase/2016/flipkart
This is also a good read https://developers.google.com/web/fundamentals/performance/p...
Which fits in with this discussion.
Been working well for us on https://DNSFilter.com
Is it just me, or does it look like Next.js uses pure functions?
Or rather, Next.js uses React components. You can write those as pure functions, or you can write them as classes to get fancier features (e.g. setState, lifecycle hooks).
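A minimal sketch of the two styles, without pulling in React itself — the element objects here are simplified stand-ins for what JSX actually produces, and all the names are illustrative:

```javascript
// Function component: a pure function from props to an element description.
function Greeting(props) {
  return { type: "h1", children: "Hello, " + props.name };
}

// Class component: same render output, but it can also hold state
// (in real React you'd call this.setState instead of mutating directly).
class Counter {
  constructor(props) {
    this.state = { count: props.start };
  }
  increment() {
    this.state = { count: this.state.count + 1 };
  }
  render() {
    return { type: "button", children: String(this.state.count) };
  }
}
```

The point is that the function form has no instance and no lifecycle: same props in, same element out, which is what makes it "pure".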
We've now reached the point where HTML and CSS served via a normal webserver is regarded as insufficient and anyone who wants to build websites of even the simplest kind is expected to wheel in an incredibly large client-side and development stack.
Something has gone wrong. All this to avoid a page reload? Should the terrible effects of a page reload not be solved elsewhere?
> without the hacky "pjax" solution.
Before you start using the word 'hacky' you might want a moment of self-reflection.
Angular did have its issues, but I dropped a reference to CDN's version into a web page and was off to the races very quickly.
I'm literally using Makefiles to build, test and deploy a set of about 7 microservices (including their web admin tooling). It's incredibly straightforward, flexible and clearly documented... Gulp, Grunt, WebPack... What are we doing?
Once you've accepted that JS is here to stay in browser-land (for now, at least), what alternatives are there for build tools?
As a younger dev, when I look at makefiles, I see the same level of complexity and "confusing-ness" that I encountered when I first learned about gulp or webpack configs. I keep reading posts like this on HN where people yearn for simpler days, but weren't the demands of websites and user interfaces simpler back then as well?
I've only been doing professional web development for 4 years or so, so I don't have the same experience of building in older technologies, but I simply can't imagine trying to build a large, immersive SPA without a modern framework. How do you manage global state or services? Just throw everything in the global namespace? How do you minify your JS/CSS/HTML? How do you tree-shake your unused code? How do you create a local proxied server to avoid CORS issues while developing? How do you autoprefix your CSS automatically? How do you develop with live-reload functionality?

I know all of these can be done individually without these tools, but you can also do them all individually without gulp/webpack/etc. just by using node/npm scripts. These tools just package them into a more convenient format for common use cases and tie complementary tools together. Otherwise everything would need to be custom built, and you would need to recreate complex scripts from scratch for every new project.
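For what it's worth, the npm-scripts route can be about this small — a sketch of a package.json "scripts" section, where the specific tool invocations are just examples:

```json
{
  "scripts": {
    "build": "webpack --mode production",
    "serve": "webpack serve --open",
    "css": "postcss src/*.css --use autoprefixer -d dist/"
  }
}
```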
These tools obviously aren't perfect, but I'm not sure they're as bad as you're making them out to be either. Like I said, I don't have the same long-term experience as you or others might, so forgive me if I'm just being naive here.
> Angular did have its issues, but I dropped a reference to CDN's version into a web page and was off to the races very quickly.
Then you're not really comparing apples to apples here. There are <script> loadable versions of react on CDNs that you could have used.
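For example, something like this (version numbers are just placeholders; pin whatever you actually need):

```html
<script src="https://unpkg.com/react@16/umd/react.production.min.js"></script>
<script src="https://unpkg.com/react-dom@16/umd/react-dom.production.min.js"></script>
```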
Combining various libraries can be a pain and sorting out all of the dependencies and node modules like he stated can be cumbersome.
What? Says who? This sounds to me like a case of "I don't like it when people use things that I don't like". Nobody expects anyone to do anything for "websites of even the simplest kind". People just use technology they like or think is fun. Nobody is forcing you to like it or use it.
The impression given by looking around the web for discussions and tutorials is that you have to learn full client-side MVC. This is a huge burden to impose.
The other danger is that people are building sites using these stacks because they never learnt the simpler way to do things. All the complexity gets pulled in without question.
At least us old-timers have the experience to know when there's a positive cost/benefit ratio for all this tech. I'm not sure everyone will in the future.
Besides, as you alluded to, having the ability to weigh tradeoffs and make a cost/benefit analysis is a honed skill that takes time and experience to develop and making a few incorrect choices is a critical part of the learning process.
The bells & whistles are mostly added bonuses - not an argument that all static sites should be built this way. At least that would not be an effective argument in my book.
We reached this point in 1995; that's why Rasmus Lerdorf created PHP!
Edit: Oh, I just realised that this is even older than the CSS spec itself (1996). So technically, there has been no point in time where HTML and CSS served via a normal webserver were regarded as sufficient :)
And what is a "normal webserver" anyway?
You should be excited because if you're building a CRUD application that doesn't need to be a SPA, and you choose the traditional server-rendered route... your app is going to be better than the equivalent SPA hog. Your app is going to be better, users will feel that it's better, and they'll give you their money.
Oh, and you don't have to re-invent the wheel with things like routing, so you are saving a lot of development time that the SPA competitor wastes on. Another win for you and your customers, more time to work on features.
But if you could anticipate even some of these requests, and you request ahead of time and deliver the results via DOM mutation, then you might be decreasing net user interface latency.
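A rough sketch of that prefetch-then-mutate idea — fetchFn and applyFn are stand-ins for window.fetch and a DOM update, and none of these names are a real library API:

```javascript
// Cache responses for pages we expect the user to visit next.
const cache = new Map();

async function prefetch(url, fetchFn) {
  if (!cache.has(url)) cache.set(url, await fetchFn(url));
  return cache.get(url);
}

async function navigate(url, fetchFn, applyFn) {
  // If we prefetched on hover/idle, this resolves from cache with no
  // network round trip, so the DOM mutation is near-instant.
  applyFn(await prefetch(url, fetchFn));
}
```

In a browser, applyFn would be something like container.innerHTML = html, and you'd trigger prefetch on link hover or when the link scrolls into view.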
Not really; you just seem to be overreacting to someone experimenting with a different way of doing it. I doubt most users would be able to tell the difference between browsing a traditional static site and the one in this article. I agree with you that there's a lot of overkill in our industry, but you can't expect engineers to sit on their hands rather than think up new ideas :)
The advantage, though, is that you do not need a server. Even with dynamic state/data, the app can be hosted on a CDN or static web server.
This is a sensible assumption for something like a blog, but falls apart quickly on most other sites I've had to build for clients (e.g. more than 1 column, homepages with lots of "modules", etc).
Kind of ironic that React's "big idea" is that it's components all the way down, but when it comes time to structure the content, it's like "nope, just one big monolith of styled text".
(YAML front-matter helps a little bit, but usually is only intended for "metadata", not the primary page content itself.)
- to share styling & code between an app and a related static site
- because it's really easy, iff you already have your full front-end dev stack set up.
That should be done by the browser, actually. The Epiphany browser does it, but no other does.
For search there is Google.
Well then it wouldn't be static would it?
What is the point of over engineering a static website that can be solved with far far less?