Meteor 0.3.9 adds search engine optimization - debergalis
http://www.meteor.com/blog/2012/08/08/search-engine-optimization

======
mcgwiz
Their technique for generating an HTML representation of a deep link into a
Meteor app is to run the entire client app in a headless browser and serialize
the generated DOM?!

This is an area of vital importance to public, JS-based RIAs, and needs some
real innovation. Why even bother delivering this half-baked solution? The
processing cost makes it untenable for all but the tiniest of URL-spaces.

~~~
tomku
Seems like this would be a very easy way to DoS a Meteor app, if it's really
spawning a PhantomJS process for each request.

~~~
bct
It is. And the server doesn't even feed PhantomJS the HTML it can generate
itself; PhantomJS has to make another request back to the server. I wonder if
it can be made to recurse?
[https://github.com/meteor/meteor/blob/master/packages/spider...](https://github.com/meteor/meteor/blob/master/packages/spiderable/spiderable.js)

------
jarcoal
Who really needs this for their web app? The vast majority of heavy web apps
require a login, so Google is out of the picture anyway.

Anyone who is building a content site where DOM-manipulating JavaScript does
all the work has completely lost their way. Seriously, just render your
templates on the server and deliver them to the client. Why does the world
want to app-ify everything?

~~~
arohner
Fat client apps are _fantastic_ for making some kinds of UI interactions
trivial.

The simplest example is a checkout form, or any kind of wizard linking
multiple pages together. In a fat client app, the state for all N pages of
your checkout is in the same page, with five lines of code to switch between
them. Doing that in the standard MVC "fat server" model is annoying, and
about 10x more code.
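To illustrate the point (a hypothetical sketch, not Meteor code): with all wizard state living client-side, "page" switching really is just a few lines, and nothing round-trips to the server between steps.

```javascript
// All checkout state lives in one client-side object, shared by every step.
var checkout = {
  steps: ['cart', 'shipping', 'payment', 'confirm'],
  current: 0,
  state: {}  // form data accumulated across steps; never leaves the page
};

// Switching "pages" is bounds-checking plus an index bump.
function goTo(index) {
  if (index < 0 || index >= checkout.steps.length) {
    return checkout.steps[checkout.current];  // ignore out-of-range moves
  }
  checkout.current = index;
  return checkout.steps[index];  // in a real app: re-render this step's template
}

function next() { return goTo(checkout.current + 1); }
function back() { return goTo(checkout.current - 1); }
```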

> Why does the world want to app-ify everything?

Think about GUI applications pre-web. They weren't written as servers that
generate PDFs, with an embedded scripting language. The current webapp
technology stack is a complete accident, and if we had actually sat down to
design the "optimal" stack, it would look nothing like what we have.

~~~
jarcoal
I completely agree with you that in some (many) cases it's useful.

But your checkout example proves my point; you wouldn't want a bot crawling
through there.

Bots should be crawling content-rich pages (blogs, articles, marketing pages),
and IMO those should rarely be handled by fat clients.

------
audreyr
Why is this interesting? Because 1) search engine crawlability matters and 2)
the more AJAXy web apps get, the harder it is to make them crawlable.

The more we move away from traditional web "pages" to rich web apps that do
everything through DOM manipulations on a single page, the harder it is for
the search engine robots to crawl what we build.

~~~
ceejayoz
Google's making moves towards having their crawler essentially be a headless
Chrome instance. Crawling AJAXy apps is rapidly going to get easier for Google
et al.

------
mortice
Easily the funniest link on this site this year.

~~~
scottkrager
Yeah I mean, sure, SEO isn't exactly rocket science... but it's a little more
than making a URL crawlable....

~~~
jfarmer
Being crawlable might not be sufficient, but it sure is necessary! :D

------
Xyzodiac
Wait, they added SEO before database authentication? So very logical.

~~~
mikebannister
Other core devs are working furiously on auth (getting pretty close). This
feature was just releasable first.

~~~
Xyzodiac
Sounds good. Despite my first post, Meteor is the most interesting framework
out there atm, IMO.

------
dotborg2
So after a year of developing this product, they finally realized that each
page should have a unique URL.

It's called lack of vision.

ps. AJAX content doesn't rank in Google SERPs at all; this is a typical
band-aid solution, so websites made with Meteor will have some serious issues
with monetization and such.

------
spullara
If they can tell which pages are public vs. private (which I think they can),
they could just tell the client to send back a copy of the page when it's done
rendering: have it POST the serialized DOM to the server, then cache and serve
that until the next redeploy.
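A sketch of that idea (hypothetical API, not something Meteor ships): the server keeps a per-deploy snapshot cache keyed by URL, filled by client POSTs and wiped whenever the deploy id changes.

```javascript
// Snapshot cache for crawler-facing HTML, invalidated on redeploy.
function makeSnapshotCache(deployId) {
  var cache = {};  // url -> serialized HTML posted back by a client
  return {
    // Called by an endpoint the client POSTs to once rendering settles.
    store: function (pageUrl, html) { cache[pageUrl] = html; },
    // Served to crawlers instead of booting a headless browser.
    get: function (pageUrl) { return cache[pageUrl] || null; },
    // A new deploy id means every cached snapshot is stale: start fresh.
    invalidateIfStale: function (currentDeployId) {
      if (currentDeployId !== deployId) {
        cache = {};
        deployId = currentDeployId;
      }
    }
  };
}
```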

------
AznHisoka
I know this isn't 100% on-topic, but does anyone know if you can set your own
user agent in PhantomJS, e.g. to masquerade as Firefox or IE?
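You can: PhantomJS exposes `page.settings.userAgent`, set per-page before calling `page.open`. A minimal script (run with `phantomjs ua.js`, not node; the Firefox UA string here is just an example):

```javascript
// ua.js -- PhantomJS script; 'webpage' is a PhantomJS built-in module.
var page = require('webpage').create();

// Must be set before page.open; applies to all requests this page makes.
page.settings.userAgent =
  'Mozilla/5.0 (Windows NT 6.1; rv:14.0) Gecko/20100101 Firefox/14.0';

page.open('http://example.com/', function (status) {
  console.log(status);  // 'success' or 'fail'
  phantom.exit();
});
```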

