

Ask HN: Progressive Enhancement is gone?... What about crawlers and SEO? - brenfrow

That's great that we've moved to JavaScript client-side frameworks and tossed away rendering everything server-side, but what has always held me back is the fact that search engines can't index my pages. Lack of SEO is a major downside. Have I missed something here?
======
ulisesrmzroche
[https://developers.google.com/webmasters/ajax-crawling/docs/getting-started](https://developers.google.com/webmasters/ajax-crawling/docs/getting-started)

I don't know what y'all are talking about. It's more than possible to make
your JS app crawlable by Google. Just use HTML snapshots if you want to do it
by hand; I use this because I'm lazy: [http://seo4ajax.com](http://seo4ajax.com).
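The linked scheme works by having the crawler rewrite a "#!" URL into a
"?_escaped_fragment_=" request, which the server answers with a pre-rendered
HTML snapshot. A minimal sketch of that URL mapping (Python, purely for
illustration; the full escaping rules are in Google's specification):

```python
from urllib.parse import quote, urlsplit

def escaped_fragment_url(hashbang_url: str) -> str:
    """Map a '#!' URL to the '?_escaped_fragment_=' URL a crawler
    following Google's AJAX-crawling scheme requests instead."""
    parts = urlsplit(hashbang_url)
    if not parts.fragment.startswith("!"):
        # No hashbang: the page is not opting into the scheme.
        return hashbang_url
    url = f"{parts.scheme}://{parts.netloc}{parts.path}"
    url += "?" + parts.query + "&" if parts.query else "?"
    # Percent-escape the fragment; '=' and '&' may stay literal here.
    return url + "_escaped_fragment_=" + quote(parts.fragment[1:], safe="=&")

print(escaped_fragment_url("http://example.com/ajax.html#!key=value"))
# http://example.com/ajax.html?_escaped_fragment_=key=value
```

Your server then detects the `_escaped_fragment_` parameter and returns the
snapshot instead of the JS shell; a tool like seo4ajax does that part for you.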

Honestly, it's just bad technical advice to dismiss JS apps out of hand like
this.

~~~
brenfrow
Thanks, this is what I've been looking for.

------
dougbarrett
I stay away from JavaScript client-side frameworks unless I'm working on a
browser app that sits behind a login, where there's no need for SEO.

If you're creating a service that doesn't sit behind a login, use JavaScript
sparingly. Why do you need a full-blown search/filter function on a product
or informational landing page?

Make it basic, and have the results pop up via AJAX. Sure, you're losing SEO
value from those search results, but you can keep a table that records that a
user went from page ID 123 to page ID 234, and use it to populate a list of 5
'articles' people may be interested in, based on those searches. That list
should be generated server-side and cached for a day; there's your lost SEO
value from search results, recovered in a form that will actually help more
in the long run.
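The transition-table idea above can be sketched roughly like this (in-memory
Python stand-ins for the real database table and cache; all names are
illustrative assumptions):

```python
import time
from collections import Counter, defaultdict

transitions = defaultdict(Counter)   # from_page_id -> Counter of next page IDs
_cache = {}                          # page_id -> (expires_at, html)
CACHE_TTL = 24 * 60 * 60             # cache the rendered list for a day

def record_transition(from_page: int, to_page: int) -> None:
    """Log that a visitor went from one page ID to another."""
    transitions[from_page][to_page] += 1

def related_articles_html(page_id: int, now=None) -> str:
    """Render the 'articles you may be interested in' list server-side,
    so crawlers see plain HTML, and cache the result for a day."""
    now = time.time() if now is None else now
    cached = _cache.get(page_id)
    if cached and cached[0] > now:
        return cached[1]
    top5 = [pid for pid, _ in transitions[page_id].most_common(5)]
    html = "<ul>" + "".join(
        f'<li><a href="/page/{pid}">Article {pid}</a></li>' for pid in top5
    ) + "</ul>"
    _cache[page_id] = (now + CACHE_TTL, html)
    return html
```

Because the list is rendered and cached on the server, crawlers index it like
any other static HTML, with no JS execution required.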

I think it's great that we can easily re-create any type of website we want
with a JavaScript MVC framework, but that doesn't mean we should.

------
ianstallings
The real problem seems to be that the crawlers/spiders aren't keeping up with
the technology. I would bet that Google is taking the lead on this because of
their love for JS. We will just have to keep pushing them for updates that can
index routed JS content. I think the main hurdle is that it requires more
machine resources, because now they have to run JS in order to see the
results.

That being said, if SEO were a big deal to me, such as when I build an
e-commerce site, I would use progressive enhancement and some semantic-web
markup to describe my content, because sales drive development. I've done
that successfully in the past. But I've also just finished an entirely
JS-based single-page application because we didn't need SEO; everything was
hidden behind a login, like Facebook.
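For the semantic-web part, one common approach is embedding schema.org
structured data in the server-rendered page so crawlers get it without
executing any JS. A sketch (JSON-LD generated in Python; the product fields
are made-up examples, the vocabulary follows schema.org's Product/Offer
types):

```python
import json

def product_jsonld(name: str, price: str, currency: str, url: str) -> str:
    """Build a JSON-LD <script> block describing a product, to embed
    in the server-rendered HTML of an e-commerce page."""
    data = {
        "@context": "http://schema.org",
        "@type": "Product",
        "name": name,
        "url": url,
        "offers": {
            "@type": "Offer",
            "price": price,
            "priceCurrency": currency,
        },
    }
    # Crawlers read this block directly from the page source.
    return '<script type="application/ld+json">' + json.dumps(data) + "</script>"
```

Microdata or RDFa attributes on the existing HTML work just as well; the
point is that the description lives in the markup the crawler already sees.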

My point? I don't think one size fits all, yet.

