
A lot of people seem to think that Single Page App frameworks like Angular/Ember are suitable for use on the public facing client side. I've always believed that SPAs are meant to be behind a login, where you don't have to also deal with spiders and other sub-optimal browsing devices, and you have a little bit more wriggle room when it comes to routing and web history.

Just look at Blogger...their client-side rendering is annoying as all get out. It's just a blog post, render it server side and give me the content, then sprinkle on some gracefully degrading JS on top to spice it up.

I say this as a huge proponent of Angular who uses it for all his web app projects, but who would never use it on a public-facing application.

I agree that things like blogger are a great example of what not to do, but I'd go further and say that treating something as a single page web app running on the client side throws away most of the advantages of the web:

URLs which can be stored and shared and are idempotent

Mostly stateless operation for anonymous use (fast to serve/load/cache)

Document formats that anything (including dumb crawlers and future gadgets) can parse and reuse in unexpected ways

What you call suboptimal browsing devices are what makes the web special and distinct from native apps. These are not trivial advantages, and most websites would benefit from these strengths of the web, even if they are an app.

As an example of where something like a single page app can shine on a public site, I've seen chat software that used it and worked really well (using socket.io, I think), but only because people didn't care about sharing individual messages and the chat was ephemeral.

> URLs which can be stored and shared and are idempotent

If you use a decent router, you get shareable idempotent URLs: https://solvers.io/projects/7GTeCKo7rGx5FsGkB

> Document formats that anything (including dumb crawlers and future gadgets) can parse and reuse in unexpected ways

As in the article, you can use phantomjs to serve up static HTML to crawlers. They are correct in that it does slow you down and add complexity.
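As a rough sketch of that setup (the upstream name and bot list below are placeholders, not anything the article prescribes): the web server detects crawler user agents and proxies those requests to whatever service renders the page headlessly, while normal browsers get the SPA.

```nginx
# Sketch: send known crawlers to a prerendering service, everyone else
# to the SPA. "prerender_backend" and the UA list are illustrative.
location / {
    set $prerender 0;
    if ($http_user_agent ~* "googlebot|bingbot|yandex|baiduspider") {
        set $prerender 1;
    }
    if ($prerender = 1) {
        # the backend runs a headless browser and returns static HTML
        proxy_pass http://prerender_backend;
        break;
    }
    try_files $uri /index.html;
}
```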

The main problem I think is that SPA tech is still immature and getting all the moving parts to build a public facing SPA working together is a time sink.

It's not really a single page application if you are serving separate pages, is it? BTW, the page you linked to says 'Uh-oh! Couldn't find that page.' before loading and displaying the content... ouch.

One of the things I love about the web is that it uses incredibly simple building blocks like simple html pages at defined stateless URLs, dumb servers and dumber clients, and builds complex systems through their interaction. I'd be very wary of solutions that drop those advantages.

There are certainly technical solutions possible to almost any problem with Angular or client-side apps in general, but I'm not sure that rendering everything client-side really gives you enough advantages to warrant it for many websites. What do you see as the main advantages of this approach, and do you see it spreading everywhere eventually?

Every website is different and what suits (say) a chat application will not suit a document-oriented website at all. There's certainly room to explore both approaches or even mix them at times.

You make some really good points, but I think simple json documents are much simpler and easier to re-use by other clients in interesting ways than simple html pages. I think the API + client (note, not just traditional web browser) rendering is actually a more "pure" interpretation of what the web can be - data sources and data consumers that interpret and present that data on behalf of their users.

I'm also not sure that rendering everything client-side is advantageous enough to warrant its current popularity (hype...), but I do see some advantages. Firstly, I think it is a better separation of concerns - the server is in charge of data discovery, persistence, consistency, and aggregation, while the client is in charge of determining how that data can be most useful in the current context. In practice, this means it is possible to have different front-ends for the same back-end. Admittedly, that is certainly not always a necessary or desired feature. The separation also makes it easier to build the front end and back end of an application separately from one another, and possibly even in whichever order you prefer. That can be a good thing, though I don't think it's really taken advantage of very often. I also think that true web applications can be made to feel much snappier and closer to native. The line between what should and shouldn't be considered an "application" is unfortunately blurry (the Blogger example is a good one).

> I think simple json documents are much simpler and easier to re-use by other clients in interesting ways than simple html pages. I think the API + client (note, not just traditional web browser) rendering is actually a more "pure" interpretation of what the web can be - data sources and data consumers that interpret and present that data on behalf of their users.

This is an interesting point - if you are representing numeric data like chart datapoints, a representation like json might make it cleaner and more reusable by other services or clients. Of course much data is actually formatted documents or snippets of text, in which case json is not such a good fit and html is perhaps better. In many ways html is a worse is better solution, but that is probably part of its strength - it is very easy to get started with and munge to extract or insert data.

I'm not sure a separation of concerns between server and client is necessary and helpful for all apps, though I'm sure in some cases it is useful (for example, serving the same JSON or XML to a mobile app, a client-side app and some desktop client, or separate teams working on both). But a server-based solution can easily output both formatted HTML for traditional clients (web browsers, which are ubiquitous now and for the foreseeable future) and a JSON or XML representation for other clients - this sort of separation of concerns between data and presentation is not really exclusive to client-side solutions.

I'm not sure it makes much sense to refer to a JSON packet as a "document". HTML is truly meant to represent documents, with embedded semantics. JSON is really meant to represent data or objects in the most abstract sense. It has no notion of embedded semantics.

" I'd be very wary of solutions that drop those advantages." They are called native applications. I can think of some useful ones over the years, particularly for people who produce rather than consume. I notice that my Bosch drill isn't available for seo and mashing :) Seriously though, it depends on your perspective. What's wrong with saying I'd like to make a native app but use the web as a delivery/installation mechanism and that's all?

Nothing really, there's room for all approaches to be explored.

I suspect the concept of native APIs (desktop or mobile) will eventually disappear, but it'll be an interesting journey if we ever do reach that point, and it would take decades.

There's a difference.

Photoshop wouldn't work as a website, and HN wouldn't work as a program.

Different forms for different use cases.

Photoshop wouldn't work as a website? http://pixlr.com/editor/

Hah, yeah, I finally got that one fixed this morning: https://github.com/solvers/solvers/pull/122

Turns out I wasn't using Iron Router properly. My bad.

It is a single page application if you don't make your browser reload the page from the server each time you navigate within your app. URLs here are implemented using HTML5 pushState -- the browser isn't loading or refreshing the page when the URL changes, except for the first page load.

My point is you get the best of both worlds there: static, representative URLs that live forever (as they should); and the responsiveness you get when you only need to load data and move some divs around instead of reloading everything from scratch each time.
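A minimal sketch of that pattern (a hypothetical route table, not Iron Router's actual API): the path matching is ordinary string work and keeps URLs stable and shareable; only the pushState call touches browser history, and no page load happens.

```javascript
// Minimal client-side router sketch (illustrative, not Iron Router's API).
// matchRoute is pure, so the URL scheme stays static and shareable;
// navigate() only swaps content and records the URL with pushState.
const routes = [
  { pattern: /^\/projects\/(\w+)$/, name: "project" },
  { pattern: /^\/$/, name: "home" },
];

function matchRoute(path) {
  for (const r of routes) {
    const m = path.match(r.pattern);
    if (m) return { name: r.name, params: m.slice(1) };
  }
  return null;
}

function navigate(path, render) {
  const match = matchRoute(path);
  if (!match) return false;
  // record the URL without a full page load (browser only)
  if (typeof history !== "undefined" && history.pushState) {
    history.pushState({}, "", path);
  }
  render(match); // re-render just the affected part of the page
  return true;
}
```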

In fact Meteor takes things even further with latency compensation: it predicts how things will change as you interact with the app and does eventual consistency with the server state. This makes updates/deletes feel even faster.

But yeah, it's a trade off. And right now it's a big trade off -- my productivity has dropped in some places, compared to writing a simple app in express or Rails.

I think the term "Single Page App" is a bit deceiving in usage sometimes. My idea of a SPA is one where the client downloads the bulk of the application code on first page load, and then only talks to the server with data-based API calls (JSON usually). The predownloaded client then just renders that data, rather than downloading an entire new template to render on the whole screen.

This interface style does not require any visual page refreshes to load new content, but it also still can support routing and deep-linking.
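As a sketch of that "download once, then talk JSON" style (the `/api/posts/:id` endpoint and field names here are hypothetical): the client fetches data only, and a pure data-to-HTML step renders it, with no full-page template download.

```javascript
// Sketch of an SPA data flow. renderPost is a pure data -> HTML step,
// kept separate from the fetch so it is easy to reason about and test.
function renderPost(post) {
  return `<article><h1>${post.title}</h1><p>${post.body}</p></article>`;
}

// Hypothetical endpoint; the server returns JSON, never a new page.
async function showPost(id) {
  const res = await fetch(`/api/posts/${id}`);
  const post = await res.json();
  document.querySelector("#main").innerHTML = renderPost(post);
}
```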

Discourse is essentially a SPA (see http://try.discourse.org/) and designed to be public-facing. It does a good job at providing a very bare, lightweight interface for people with JavaScript disabled and, I'm assuming, for web crawlers.

FWIW, it looks like overkill to me.

I'd much rather use Forem.


Do they use open source libraries? And otherwise how do they differ from something like Angular?

Discourse is open source:


It uses ember.js with a Rails backend.

I agree 100%.

Most websites shouldn't be SPAs. One can still use Angular to code widgets in a regular page-based site, without using a JS router.

It's just that devs are getting lazy: they throw together a RESTful server app quickly, then don't want to deal with any view layer on the server and do everything in client-side JS. For some projects it makes little sense.

I think the key word is "App." There is a difference in nature between a web app and a web page. Both can be built using the same underlying technologies, but the goals are very different.

Knowing which one you are building can greatly inform the choice of framework.

Agreed - there is a wealth of applications that are not in the massively scaled consumer market, and making those clean, easy to maintain and deliver is an enormous win. That said, there is a wealth of consumer apps that don't have massive market share either, so the market for learning these lessons is pretty rarefied.

What would you use for a public facing application?

Any sort of standard server side rendering, or possibly a front end framework that can be rendered server side as well (I think React can do that, same as Backbone).

Basically, SPA frameworks are useful when you are working with lots and lots of data moving back and forth. A good example is something like Intercom.io's interface. They have tons of tables and modals and data flying around. This isn't conducive to the standard browser request -> server render -> load whole new page on the client. It's just too slow. When you're interacting purely with data in a master interface, SPA frameworks are the way to go. And it isn't even a matter of literal page loading and rendering speed, it's the fact that refreshing the view with a whole new page on each link click is a context change that adds up when you're managing a lot of data or performing a lot of small tasks in an application.

But something like Blogger, where you're reading just some content, maybe some comments...there's no real benefit from loading it in a SPA environment. Render it server side, cache it, and fire it to my browser as fast as possible.

> Render it server side, cache it, and fire it to my browser as fast as possible.

Shameless plug, but we've been trying to do something similar with Forge (getforge.com). We built a Javascript library called TurboJS that precompiles a static HTML build into a JS manifest and loads it all. It's SEO-friendly and super-fast. Our other site, http://hammerformac.com/ uses it, for example.

Precompiling static HTML is a strict sub-problem of the problem most people in this comment thread are referring to, which is development incentives and SEO characteristics for web pages that have non-trivial amounts of dynamic content.

I recommend PJAX. It degrades gracefully for search engine indexing.

Yep. PJAX and its ilk are wonderfully simple, degrade well, and fit in with many existing approaches.

We added PJAX rendering to a Django site in under an hour. All the benefits of SPAs and few of the downsides.

I've started using PJAX where the rendered page does not have to change when its source data does, and where you don't have large tables/calendars that would have to be re-rendered when one data value changes.

Development is significantly faster, less error-prone, easier to maintain. Development can also be given to people with lower skill levels.
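The server side of PJAX is simple enough to sketch (in JavaScript here rather than Django; the `X-PJAX` request header is the real PJAX convention, while the page-wrapping helper is a placeholder): same URL, but a PJAX request gets only the content fragment instead of the full page.

```javascript
// PJAX in a nutshell: when the request carries the X-PJAX header,
// return only the content fragment; otherwise return the full page.
// The full-page wrapper below is a placeholder layout.
function wantsFragment(headers) {
  return Boolean(headers["x-pjax"]);
}

function respond(headers, content) {
  return wantsFragment(headers)
    ? content // partial, swapped into the page by the PJAX client
    : `<html><body><div id="main">${content}</div></body></html>`;
}
```

Because non-PJAX requests still get complete pages at the same URLs, search engines and no-JS clients see normal HTML, which is where the graceful degradation comes from.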

"node.js + express + jade" is a fairly common stack; you can even try to move existing complex Angular.js code there, because it's written in the same language.

Is Node really a good choice for static, non-real-time sites? I'm thinking something like Django or Ruby on Rails is better.

Why do you think they are better?

In most cases Node.js is just faster than Rails or Django.

The Blogger example doesn't seem fair: Any app can overload the user with too many animations or other distractions.

At Thinkful (http://www.thinkful.com/) we're building our student / education app in Angular, and are moving all browser-side code to Angular as well – both public and private.

In a lot of our splash or SEO-enabled content we're not making use of all of Angular's features, but the upside of using it is that we have a single, unified codebase in which we can share libraries, team knowledge and design patterns. Simply put: using Angular everywhere allows us to stay DRY. Testing the front-end purely with Angular is yet another core asset at Thinkful.

One framework for writing code and testing is much better than a hybrid of server-side rendering and Angular.

Our biggest challenge was SEO, but this was reasonably easily solved with using BromBone (http://www.brombone.com/).

There are reasons to stick with non-Angular (or non-JS) frameworks, so it's not always a slam-dunk. For example, if Thinkful had millions of SEO pages that we needed to load as fast as humanly possible, Angular would be a bit much... But that's not what we're optimizing for: We're building a phenomenal user experience that we can support long-term, is well tested, can have a lot of developers use, and can have non-developers do their job inside our codebase (everyone at Thinkful codes).

For all this and more Angular has proven a great choice for both logged-in AND public sites.

Totally agree, but you still have the analytics/tracking problems. Unless you're not tracking clicks/activities/views on your various features. But if that's the case, you have bigger problems :)

Can anyone elaborate on the problems? Are there problems with analytics while using Angular?

They don't belong there either.

Some people use screen readers, text-mode browsers, IE due to stupid work/school policies, etc. Some people like automating their workflow, which can involve scripted browser interactions. Some people actually care about security and privacy, and so run NoScript, etc.

And some people still use IE6, but that doesn't mean you should continue to support IE6.

So based on that logic, you can abandon all standards of usability and interoperability?

Screen readers and browser automation can run JavaScript. Sure, it may be more pure and perfect for everyone to write websites that don't require JavaScript, but the economics of building websites doesn't support it.

FYI, services like PhantomJsCloud (mine) exist that let you avoid rendering SPA/Ajax yourself.

Here's a link with all the nginx config you need to make it work: https://phantomjscloud.com/site/docs.html#advanced_seo

Exactly; I tried to hit that in my post.
