
Show HN: The static, static site generator - Xeoncross
https://github.com/xeoncross/Jr
======
chavesn
What a great idea, I'm really impressed with the cleverness and execution.

A few thoughts and questions for you (you may have already thought of/know
about):

\- Have you thought about "well-formedness" for the HTML? I realize that
adding anything besides the script tag would sort of ruin your point, and it
would be nice if browsers accepted that, but I also feel there are sometimes
hidden benefits to serving a well-formed document under the content type the
server actually declares (or that the extension implies).

\- This makes your site awesome to read with a text browser or curl.

\- About 1/2 or 1/3 of the times I click "back" or "next", I can't scroll the
page after it renders. (Chrome/MacOS)

\- The footer support is cool, have you thought about how you would do more
extensive template support? (Maybe there's no reason anything extra couldn't
be placed in the footer -- analytics, even a header -- although I think a
favicon might need to be in a `<head>` tag). Integrating this directly with
Markdown, too, could be really cool.

\- P.S. Your footer link says "Chanpter" :)

Great work, I love when I find something like this -- really clean, simple,
and challenges the norms in a clever way. The result even appears quite
polished.

~~~
Xeoncross
Thanks for the feedback. That is really what this project was about - thinking
outside the box to solve problems.

Like most projects, I would expect elegant solutions to some of these problems
to appear as more people think about the concept. I must admit, it works
pretty nicely for such an abuse of technology.

------
randomdrake
Another neat show of letting the client deal with the rendering.

From the example[1]:

"You see, there is really no need for the server to generate anything for
simple article-based sites like this. If the user wants to read your blog they
can spend a few processor cyles[sic] to render the page themselves."

This paradigm is beyond me, and it seems to be growing for whatever reason:
client-side processors rendering bytes through a separate engine, other than
the HTML one, only to have the result eventually run through the HTML engine
so the client can actually read it.

The thought that everyone has a decent machine to do the processing is, in my
experience, _still_ a false one. It was false 5-10 years ago, and I haven't
seen evidence that that has changed today.

Your server probably has many, many cores, and probably SSD storage. It is
barely blinking to answer a request and deliver text. Why on earth would you
leave the simple job of wrapping markup around text to a machine that you
know nothing about?

The average Internet user isn't terribly savvy. Their browser is possibly
cluttered with add-ons. Their computer is more than likely running hundreds
of background processes they know nothing about.

Depending on the stats you look at, in 2014, we're looking at a lot of folks
with dual-core machines and a couple GB of memory, if they are lucky.

I can't understand why you would want to delay or hinder the experience of
getting to your content.

While it may load really fast on your MacBook Pro Retina or ThinkPad X1,
with a few text editors open and an up-to-date browser, the experience won't
be the same for everyone.

When did it become trendy for developers to put burdens on their clients with
all of this front-end first thought? Just because it makes it easier for you
to write and deploy, doesn't give you the excuse to put the burden on the user
for rendering your stuff.

How many more times do we need to read about companies being forced into
dropping this ill-conceived paradigm because they realized it made the
experience for the client _worse_, and in many cases made the development
worse as well?

In this example, we end up with an HTML file that is all of 6,084 bytes.

To get there in this example, we used jr.js, a 5,616-byte file, to load
showdown.js, a 14,859-byte file, and render 651 bytes of text. Sometimes the
loading and rendering is _so slow_ that the code itself has handling for it:

    // Empty the content in case it takes a while to parse the markdown (leaves a blank screen)
    jr.body.innerHTML = '<div class="spinner"></div>';

21,126 bytes to generate 6,084 bytes of text which has to now be rendered (one
more time) by the browser.

Wouldn't it be great if there was some standard about the bytes that were
delivered over the wire that everyone could use and build upon? Folks should
get together and build a really good processor of bytes coming over the wire
that's delivered in a particular format to be rendered on the screen. It would
be great for the Internet! You could browse the entire thing!

[1] - [http://xeoncross.github.io/jr/](http://xeoncross.github.io/jr/)

~~~
Xeoncross
A server should serve. I too think too much front-end magic will slow down a
site.

However, I actually created this on that dual-core laptop you are talking
about. So not to take away from your point, but let me also take this from the
other perspective - bandwidth.

The 6kB + 15kB JavaScript files only seem like a waste at first. Once you
consider the bandwidth you save by transferring all the additional pages as
plain markdown and rendering them with the (now) cached JS, each page
actually loads much faster, even if the browser has to spend a few hundred
milliseconds rendering.

~~~
randomdrake
> A server should serve. I too think too much front-end magic will slow down
> a site.

> However, I actually created this on that dual-core laptop you are talking
> about.

Right, which is why you have stuff in your code about dealing with the fact
that sometimes rendering the bytes coming down the wire is painfully slow.
You commented out the spinner GIF, which has replaced the Java applet
loading or the Shockwave loading we knew so well in the 90s and 00s.

> So not to take away from your point, but let me also take this from the
> other perspective - bandwidth.

It would take visiting 3.4 posts of the size in the example before the plain
HTML would even add up to the bandwidth cost of that very first post.
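Using the byte counts quoted earlier in the thread (jr.js at 5,616 bytes,
showdown.js at 14,859 bytes, and the 6,084-byte rendered page), that 3.4
figure is easy to sanity-check:

```javascript
// Back-of-the-envelope sketch of the break-even point, using the byte
// counts quoted in this thread. The variable names are illustrative.
var jrBytes = 5616;        // jr.js
var showdownBytes = 14859; // showdown.js
var htmlPageBytes = 6084;  // the fully rendered HTML page

var jsOverhead = jrBytes + showdownBytes; // 20,475 bytes of script

// How many plain HTML pages could be served for the cost of the JS alone?
var breakEvenPages = jsOverhead / htmlPageBytes;

console.log(breakEvenPages.toFixed(1)); // "3.4"
```

(Compression would shift all of these numbers, as other commenters note, but
the order of magnitude stands.)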

Blog posts, by and large, get traffic for the single post someone is
visiting; very, very rarely does a visitor go on to a 2nd, and even more
rarely a 3rd, story on the same site.

Taking that into consideration, who is wasting more bandwidth?

> even if the browser might have to spend a few hundred milliseconds rendering

Gather up 3 or 4 times waiting a few hundred milliseconds and now you're
waiting multiple seconds. Which is more than enough time for folks to hit the
back button.

I'm not dogging your efforts or your library. I hope you don't see it that
way. I am simply saying that the amount of work going into client-heavy
development these days, and the number of folks hopping on the "Wow, that's
such a great idea!" bandwagon, should be limited (educated).

A server should serve clients. There are human beings attached to the
requests. We, as developers, should hold ourselves to a higher standard of
doing whatever we can to remove burden from the clients.

I don't know where the idea that developers should be unburdened and servers
shouldn't work hard came from, but it stinks.

Pay $0.00000001 to ask the 16 CPUs to wrap text in markup. Sheesh.

~~~
notahacker
I'm sure there exist edge cases where the average user's browsing experience
is slowed more by the additional bandwidth of downloading gzipped HTML tags
on the fourth and fifth pages they visit than by the rendering-script
download on the first page and their browser running a script to generate
the static layout on every page.

But even then I'd wonder if the answer to save all that wasted time and
bandwidth wasn't "maybe we could do even better if we compressed those
images?"

~~~
GrinningFool
I think the "edge case" here is "many if not most mobile connections" - not
everyone has LTE, and even among those who do, it's a highly variable
experience.

In addition you are hitting those clients with a double-whammy: slow load over
a slow connection, and slow rendering on a [relatively] slow CPU.

~~~
notahacker
The edge case is one where the extra bytes in a set of plain old HTML files
are actually more of a significant overhead than the JS/markdown
alternative, which has a higher page weight for the first visit anyway, as
well as making more demands on client-side renderers. (In retrospect I could
have worded the first post more clearly.) Mobile is hardly likely to be this
edge case since, as you point out yourself, mobile browsers will have a more
perceptible delay when it comes to generating a page on the client side in
JavaScript (and also can't display anything until the script is downloaded,
which is possibly a _big_ first-page performance hit, and aren't necessarily
effective at caching the script for repeat visits).

------
davej
Interesting. I wonder: do search engines crawl pages like this?

You'd need to add at least a `doctype` and `title` for this to be valid HTML5
(not that it necessarily matters for a search engine crawler).

Edit: Also if you added the script to the top of the doc then you could
`display: none` the doc and wait for the css to load before making it
`display: block`. This would overcome the FOUC effect.

~~~
nkozyra
As I understand it, wouldn't Google render this page while crawling? Perhaps
they'd punish it for doing so, but I think Google would have no issue with the
content itself.

I also wonder how much FOUC you could incite by increasing the size of the
markdown document.

~~~
Xeoncross
True, but don't forget that the JS and CSS is cached so after the first page
load - every other page is instantly ready to be rendered.

~~~
nkozyra
That's not true in this case - the JS and CSS are cached, but the output of
the JS (the rendered HTML) depends on the _execution_ of the JS. If that
meets any delay (e.g. parsing 1,000 nodes in a document), the page will look
unformatted until the parsing is complete.

~~~
Touche
FOUC is easily enough fixed with some css rules. Can even give it a snazzy
transition effect after it's rendered.

------
lowmagnet
I did something similar years ago with xslt rendering xml in browser. It used
an xsl stylesheet loaded in the xml itself, similar to this approach. It was a
pain to debug, and I'd imagine this approach is easier because tooling has
caught up with this sort of thing.

I use httpsb so this comes through as a pile of text until I allow the js to
do its thing. I'm ok with this, since a browser plugin that does markdown
would work here too.

Sometimes I miss things like Archie that had very small network footprints due
to technical requirements of the past. They really were able to focus on the
content, like this solution.

~~~
X-Istence
I loved the idea of using XSLT to take a well-formed XML document and
process it client side in the browser, but it came with more problems than
it solved.

The tooling to accomplish it was terrible, and different browsers reacted
slightly differently to the XSLT: some had a flash of unstyled XML before
rendering the page with the stylesheet, JavaScript didn't work right since
the page had to be served as XML rather than HTML, and AdSense, my ad
network at the time, didn't work with it either.

XSLT had potential, but it never really caught on, and now we just have
JavaScript frameworks that do all the rendering client side using JavaScript
instead.

~~~
th0ma5
I noticed a similar flash with this project, although the end result is very
cool - and to think that, except for links, this is somewhat all
Lynx-compatible.

------
jscheel
Now we just need a static site generator generator.

~~~
fournm
Have the server inject the javascript into the page?

------
j_s
Dang it, I was hoping for a 'pick your features & download your customized
version of jekyll'! I guess that would be a static site generator generator...

------
scorpion032
Static Site Generators seem like the Twitter client of today (which itself has
been the "Hello, World!" of Web 2.0)

Here is a site that compares 270 of them:
[http://staticsitegenerators.net/](http://staticsitegenerators.net/)

~~~
dangoor
Most (all?) of those 270 static site generators generate HTML files that sit
on disk on the server. This tool takes markdown files with a single script tag
and serves that up to the client. While I'm pretty sure I've seen this idea
before, it is at least different from the typical static site generator.

~~~
p4bl0
Many static site generators use (or allow you to use) markdown to write your
pages, and then generate static html from it. And I can only see benefits to
the no-javascript approach.

------
partomniscient
If there's javascript in the output, it's not really static is it?

~~~
k__
Well, you can deploy it with a simple web server. No server side processing.

Theoretically this is the best scaling solution.

Practically it makes the site slower for every client.

~~~
HeyImAlex
>Theoretically this is the best scaling solution.

Html markup on your pages is probably minuscule after compression (you're
using zopfli with 5000 cycles and minifying your html, right?), and amortizing
the upfront cost of that extra js over the average number of page views is
definitely worse than plain ol static html for your blog 99.9% of the time.

But let's get real; theoretically the best? I doubt markdown is even near the
optimum in terms of bits on the wire. Don't even talk to me unless you're
writing your own binary markdown serialization format.

~~~
k__
Well, with 1 client, you have one machine which renders the page and with 1000
clients, you have 1000 machines which render the page.

The processing capacity depends on the amount of clients.

With a static site generator, the processing capacity depends on your own
machines and is independent of the clients.

But yes, for a static site this just doesn't help much, since every client
gets the same data, so why should each one process it on its own?

------
philbarr
So am I right in thinking that this is like a template, but the template gets
added dynamically by javascript?

~~~
nkozyra
You can confirm this by inspecting the document - the markdown is parsed into
nodes through regexp and then a full HTML document is constructed and injected
into the DOM (or rather, creating the DOM and then injecting it).
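That regexp pass can be sketched as a toy (this illustrates the technique
only - it is not jr.js's or showdown.js's actual rule set):

```javascript
// Toy regexp-based Markdown-to-HTML pass: split into blank-line-separated
// blocks, turn heading lines into <h1>..<h6>, and rewrite a few inline
// constructs inside paragraphs. Bold must run before italics so that "**"
// is consumed before the single-asterisk rule sees it.
function renderMarkdown(md) {
  return md
    .split(/\n{2,}/) // paragraphs and headings are blank-line separated
    .map(function (block) {
      var h = block.match(/^(#{1,6})\s+(.*)$/); // "# Heading" style lines
      if (h) {
        return "<h" + h[1].length + ">" + h[2] + "</h" + h[1].length + ">";
      }
      return "<p>" + block
        .replace(/\*\*(.+?)\*\*/g, "<strong>$1</strong>")       // **bold**
        .replace(/\*(.+?)\*/g, "<em>$1</em>")                   // *italics*
        .replace(/\[(.+?)\]\((.+?)\)/g, '<a href="$2">$1</a>')  // [text](url)
        + "</p>";
    })
    .join("\n");
}
```

Real converters handle far more (nesting, lists, escaping), which is part of
why showdown.js weighs in at ~15kB.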

Cool, fun, but probably not something for which I can see a practical use.

------
joshvm
I guess the only downside is that if your client has NoScript, they just see
the raw Markdown. With HTML if the client has a text based browser things like
links, images, etc will still work. I still use Lynx over SSH if I need to
grab paywalled content from my work machine or check something on the local
intranet when I'm out of the office.

The text based browser is splitting hairs, but NoScript isn't. Unless there
are browsers that will natively render Markdown if served/detected (i.e. not a
plugin)?

~~~
Blahah
You could see that as an upside - Markdown is designed to be human-readable
and is pretty successful in that design goal. So a NoScript user will see
rather nice plain text.

~~~
billyhoffman
Uhhh, Markdown is "nicely" formatted for geeks, and is great for easing the
burden of content creation.

However, my mom (and I imagine any non-geek) would have trouble reading the
hyperlink format, and would be completely confused by the strong vs. italics,
code, or block-quote sections of Markdown.

~~~
Kiro
Remember that we're talking about markdown being shown to people with
noscript, something I highly doubt non-geeks are using.

~~~
yohanatan
Wouldn't it also be shown to people who merely have Javascript disabled?

------
JasonFruit
Using Google Chrome Version 34.0.1847.132 on Linux, if I open pages (e.g.
[http://xeoncross.github.io/jr/](http://xeoncross.github.io/jr/),
[http://xeoncross.github.io/jr/john1.html](http://xeoncross.github.io/jr/john1.html))
in a new tab that is not immediately focused, they never visibly render. I see
that frequently with client-side-rendered pages.

------
michaelbuckbee
While not what I'd do for every site, it's a pretty neat tool for some use
cases.

In particular, Heroku has moved to a similar setup with Boomerang [1] - a JS
include that puts the nice Heroku branding at the top of your add-on
configuration pages.

It neatly sidesteps the need to make a component/template for every single
framework and backend in use by their different partners.

I could also see it being useful as an easy "drop in" way of tying the
branding+nav together on a number of different sites within an organization
(so your auto generated docs, your tutorials, etc all live on different
systems but easily look the same).

1 - [https://github.com/heroku/boomerang](https://github.com/heroku/boomerang)

------
BHSPitMonkey
What about accessibility?

~~~
JetSpiegel
This comes down to reimplementing an HTML rendering engine in JavaScript.
Accessibility is the least of their problems.

------
untitaker_
And now i want to generate a TOC ;)

~~~
hrjet
Great point. The client side can't do any meta-analysis of the data unless
it fetches all of it, which is a big waste of bandwidth and CPU.

~~~
untitaker_
Furthermore, unless you do some hackery in your .htaccess or something like
that, there is not even a way to discover all existing pages.

------
anon4
At first I balked because I positively hate frivolous use of javascript and in
fact browse with noscript and only a few sites allowed, but then I realised
something.

This is actually really good for people like me. If I visit with javascript
disabled, I get a nice, readable markdown. If I visit with lynx, I get
markdown. I can actually read your blog with curl, if I want to. This is
pretty much the holy grail of graceful degradation right here.

------
gramsey
This is an awesome idea, and looks like it is very well executed. I have two
suggestions:

\- Add some sparse html tags (i.e. a basic doctype/body), which can help with
search engine parsing.

\- You'll notice that for a few milliseconds on page load, the text is shown
before the JS rendering takes over. This can probably be solved via
JavaScript - just find a way to cache or pre-load the pages.

------
notJim
I was hoping this was going to be a generator that generates static site
generators, since they seem to be the hot new project.

------
nir
Neat idea. Is the "Download Jr" part required, or could it just be included
from GH pages of the original repo?

Could make for a very quick & simple way to put up some content online while
keeping it looking decent, and users could contribute new themes etc.

------
dwg
Neat idea for very small, quick and dirty sites.

Pros:

* No build process (yet), source == build & no need for dev server
* Easy to integrate client specific code (e.g. browser compatibility)

Cons:

* How to transpile to CSS/JS?
* Apples-to-apples, slower than static sites with "build" process
* SEO?

TBD:

* Client processing speed

------
SimeVidas
That demo looks amazing w/o JavaScript:
[http://i.imgur.com/sgC4sdN.png](http://i.imgur.com/sgC4sdN.png) (</sarcasm>).
Adding JavaScript as a SPOF cannot be a good approach -.-

------
ClashTheBunny
I would wrap the markdown in a gigantic `<pre>` so that when noscript is
enabled, you end up seeing at least markdown, and not a wall of letters.
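A minimal sketch of that idea - emitting the Markdown inside a `<pre>`
before the script tag. The `escapeHtml` helper and the file layout are
assumptions for illustration, not part of Jr:

```javascript
// Escape the characters that would otherwise be parsed as HTML, so the
// raw Markdown survives intact inside the <pre>.
function escapeHtml(text) {
  return text
    .replace(/&/g, "&amp;")
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;");
}

// Wrap the Markdown in <pre> and append the renderer script. Without
// JavaScript the browser preserves line breaks and shows readable
// Markdown instead of a reflowed wall of letters.
function wrapForNoScript(markdown) {
  return "<pre>" + escapeHtml(markdown) + "</pre>\n" +
         '<script src="jr.js"></script>';
}
```

With JavaScript enabled, jr.js would still replace the body contents as it
does now, so the `<pre>` only matters for the no-JS case.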

As for the advantages of this, it seems like it would be better for a more
open web. If you put this on the web, it doesn't matter where you serve it
from - I can send you really good, well-formed patches. One thing the web
currently lacks is the ability to participate at web scale. If I see
something that I can improve and can get to the source, I'll send a patch or
pull request. These days 'view source' means 'view generated code that
nobody has seen'. This gets back to the roots of the web.

------
chenster
Do we really need this level of extreme optimizing? Today's modern Web
browsers already do most of the layout and rendering with CSS, plus CDNs and
client-side caching.

------
juanuys
Broken? "curl -i
[http://xeoncross.github.io/jr/](http://xeoncross.github.io/jr/)" returns
Content-Type: text/html.

------
zhte415
I have had a, static, static site generator for many years. It is called
gedit. I've heard Notepad++ is pretty good too.

------
dyadic
I think the idea is pretty neat, but the flash of unstyled content before the
js kicks in really ruins it for me.

------
exizt88
> $then = "email" + "@" + "davidpennington.me"

Is this supposed to be PHP or Javascript?

~~~
Xeoncross
So here is my dilemma. I wanted to write it in Javascript, but the "then"
looked lonely without some kind of starting context. I was going to write it
in Go, but "var then..." kind of messed up the sentence. So I wrote it in PHP
since everyone knows what that horrible $ is all about.

~~~
exizt88
'.' is the proper string concatenation operator in PHP.

------
haldean
I made a less-tricky, less-cool thing like this[0] and I still use it on my
site today. It's really great to not have to recompile markdown or do anything
other than a git-push on text files. The fact that view-source works on yours
is super cool, though; nicely done!

[0]:
[https://github.com/haldean/docstore](https://github.com/haldean/docstore)

------
s_m
This is cool. I value pageload speed though, so I wouldn't use this myself.

------
motyar
All we need is browsers that support and render Markdown.

Good work!!

------
mplewis
This is a fantastic little toy project! Thanks for showing me. I'm thinking
about building this into something for hosting on servers with extremely
limited CPU, such as an RPi.

------
atmosx
I'm fine with octopress but if I had to change to another static site
generator I'd probably go with 'Go' due to speed improvements.

~~~
spf13
Hugo is a fully featured SSG written in Go. It's considerably faster than
other SSGs and has a very easy installation.

[http://hugo.spf13.com](http://hugo.spf13.com)

~~~
atmosx
I know spf13 :-), that's what I had in mind.

------
Istof
This is a great idea that might be useful on free hosts that only allow static
pages but I don't think that I would use it otherwise.

~~~
Istof
I would be curious to see what other uses it has.

