
Ask HN: Why is crawling/indexing SPAs more difficult than server rendered sites - isuckatcoding
This is a slightly less touchy / controversial question than my last one. Also, I am a mobile developer, but sometimes I like to explore the web development scene, so this might seem like a silly noob question.

Why can't web crawlers index SPAs as efficiently as they index traditional server-side rendered websites?

Is it just a matter of inefficiency for the crawler to wait for JS to load the UI? Aside from doing some kind of hybrid approach of client-server rendering, what can SPA-style web apps do to increase visibility?
======
dragonbonheur
Single-page applications are mostly logic: a front end wired to back-end
programming. Regular websites are content presentation, which is what the
WWW was originally conceived for.

Search engines index content first and usually discard what JavaScript
renders. That is why people are told never to use document.write() to
render text, however tempting it may be.
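
A minimal, hypothetical sketch of the problem: the heading below exists
only after the script runs, so a crawler that does not execute JavaScript
sees an essentially empty page.

  <!DOCTYPE html>
  <html>
    <body>
      <script>
        // This text is generated at runtime; a crawler that skips
        // JavaScript never sees it in the downloaded HTML.
        document.write("<h1>Product catalogue</h1>");
      </script>
    </body>
  </html>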

Now, there is a difference between using document.write() to output text
to an HTML document and using something like jQuery, which in most cases
only influences how the text already inside elements is presented. If
JavaScript were disabled on a website that uses jQuery properly, a search
engine would still see the content within the elements, whereas on a page
that relies on DOM manipulation such as document.write() the content would
never appear, because the function is never executed.
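
To make the contrast concrete (a hypothetical example, not from the
comment above): here the content is already present in the markup and
jQuery only decorates it, so the text survives even if the script never
runs.

  <!DOCTYPE html>
  <html>
    <body>
      <!-- The content lives in the static HTML, so a crawler can read
           it without running any JavaScript. -->
      <h1 id="title">Product catalogue</h1>

      <script src="https://code.jquery.com/jquery-3.7.1.min.js"></script>
      <script>
        // jQuery only changes how the existing content is presented.
        $("#title").css("color", "darkblue");
      </script>
    </body>
  </html>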

Since search engines often take page-loading speed into account when
ranking websites in their results pages, it is advisable to put the JS at
the end, so that the content loads first and its presentation is
transformed afterwards.
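
A small sketch of that ordering (the file name enhancements.js is a
placeholder): the text comes first as plain HTML, and the script that
enhances its presentation loads at the end of the body, so neither the
reader nor the crawler has to wait for JavaScript to see the content.

  <!DOCTYPE html>
  <html>
    <body>
      <h1>Article title</h1>
      <p>All of the real content is delivered as plain HTML up front.</p>

      <!-- Loaded last (an alternative is a script tag with the defer
           attribute in the head), so parsing the content is never
           blocked by JavaScript. -->
      <script src="enhancements.js"></script>
    </body>
  </html>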

