

Forgotten Rails features: HTTP Streaming - robotmay
http://robotmay.com/post/24054884390/forgotten-rails-features-http-streaming

======
judofyr
My thoughts on this 630 days ago —
<http://news.ycombinator.com/item?id=1671437>:

I did some research on this earlier, and my conclusion was: this is
very hard, if not impossible, to implement automatically. The main problem is
that it’s impossible to handle exceptions correctly without making the whole
stack aware of them.

Currently, when an exception occurs, the system can simply change the response
(since the response hasn’t been sent to the client yet, but is only buffered
inside the system). With this approach, a response can be in x different
states: before flushing, after the 1st flush, … and after the xth flush.
And after the 1st flush, the status, headers and some content have already
been sent to the client.

Imagine that something raises an exception after the 1st flush. Then a 200
status has already been sent, together with some headers and some content.
First of all, the system has to make sure the HTML is valid and at least give
the user some feedback. That’s not impossible, but it’s still quite a hard
problem (because ERB doesn’t give us any hint of where tags are opened/closed).
The system also needs to take care of all x different states and return correct
HTML in each of them. Another issue is that we’re actually sending an error
page with a 200 status. This means that the response is cacheable with
whatever caching rules you decided earlier in the controller (before you knew
that an error would occur). Suddenly you have your 500.html cached all over
the place: at the client side, in your reverse proxy, and everywhere else.
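
The committed-status problem is easy to reproduce in plain Ruby, with no Rails involved (all names here are illustrative). Once the first chunk of a streamed body has been written, the 200 status is on the wire, and an exception can only be patched into the already-sent body:

```ruby
# A Rack-style streamed body: status and headers are decided before any
# chunk is generated, so they cannot be revised once streaming starts.
chunks = ["<html><body><h1>Report</h1>", :boom, "</body></html>"]

streamed_body = Enumerator.new do |yielder|
  chunks.each do |c|
    raise "template blew up mid-render" if c == :boom  # simulated view error
    yielder << c
  end
end

status  = 200
headers = { "Content-Type" => "text/html" }
sent    = []

begin
  streamed_body.each { |chunk| sent << chunk }  # chunks go to the socket
rescue RuntimeError
  # Too late to send a 500: the status line and the first chunk are
  # already on the wire. The best we can do is close the document
  # gracefully inside a response the client still sees as a 200.
  sent << "<p>Sorry, something went wrong.</p></body></html>"
end
```

Multiply this by every flush point and you get the x states described above, each needing its own recovery markup.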

Let’s not forget that exceptions don’t always render the error page; they do
other things as well. For instance, sometimes an exception is raised to tell
the system that the user needs to be authenticated or doesn’t have permission
to do something. These are often implemented as Rack middlewares, but with
automatic flushing they also need to handle each of the x states. If a
middleware needs to redirect the user, it can’t change the status/headers to a
302/Location once it’s past the 1st state, and therefore has to inject a
<script>window.location='foo'</script> into a cacheable 200 response.
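
That dilemma can be sketched with a hypothetical helper (not a real Rack or Rails API): before the first flush a real 302 is still possible; after it, only a client-side fallback inside the committed 200 body remains.

```ruby
# Hypothetical helper: pick a redirect strategy based on whether the
# response has already started streaming.
def redirect_response(location, flushed:)
  if flushed
    # Status and headers are already sent; degrade to a JS redirect
    # injected into the (cacheable!) 200 body.
    [200, {}, ["<script>window.location='#{location}'</script>"]]
  else
    # Nothing flushed yet: a proper HTTP redirect is still possible.
    [302, { "Location" => location }, [""]]
  end
end
```

The second branch is what every auth middleware assumes today; the first is the awkward fallback that automatic flushing would force on each of them.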

Of course, the views shouldn’t really raise any exceptions, because they
should be dumb. In Rails, however, it’s very common to defer the expensive
method calls to the view. The controller sets everything up, but nothing is
actually called until it needs to be rendered. This increases the
possibility that an exception is raised in the rendering phase.
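
That deferral is easy to reproduce in plain Ruby, with a lazy enumerator standing in for an ActiveRecord relation (no Rails here; the failure is simulated):

```ruby
query_ran = false

# "Controller" phase: build the collection but execute nothing, the way
# Post.all builds a relation without running any SQL.
posts = [1, 2, 3].lazy.map do |id|
  query_ran = true                 # the "query" runs on first iteration
  raise "db went away" if id == 2  # simulated mid-render failure
  "Post #{id}"
end

after_controller = query_ran       # still false: nothing has executed

# "View" phase: iterating triggers the work, and any exception,
# possibly after the first chunk has already been streamed.
rendered = []
error    = nil
begin
  posts.each { |p| rendered << p }
rescue RuntimeError => e
  error = e
end
```

The controller finishes cleanly; the exception only surfaces while "rendering", exactly when it is hardest to recover from under streaming.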

Maybe I’m just not smart enough, but I just can’t come up with a way to tackle
all of these problems (completely automatically) without requiring any changes
in the app.

~~~
robotmay
Aye, I think enabling this automatically would do good things for the user
experience, but it would require a pretty significant change to the Rails
stack. It's a shame that there isn't a way to tentatively pass the status code
(i.e. '250: This is probably OK') and then alter it in later chunks should
exceptions occur. Or at least not one that I know of.

------
timriley
Aside from the obvious benefits, even just streaming certain responses can be
useful when your app lives on the Heroku cedar stack and you want to keep
connections alive for longer than the default 30 seconds. I put this to use
(albeit with an older Rails technique) for processing credit cards. Long
description of it here:
<http://icelab.com.au/articles/money-stress-and-the-cloud/>

~~~
robotmay
I read that article after reading yours about the NewRelic trouble. It's a
nice technique and it's great to see someone else actually writing about this
with regard to Rails (your NewRelic post saved me a load of agony) :)

------
raverbashing
Really? No.

Let me repeat: don't do streaming in RoR (or Django, or PHP).

This belongs in the web server. There are several extensions that do exactly
that in nginx, Apache, etc., including streaming, seeking, and so on.

Unless you want your server to have ridiculous CPU usage and maybe only
serve 1 or 2 concurrent requests, do it in the web server.
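
For example, nginx's X-Accel-Redirect mechanism lets the app authorize a request and then hand the actual file streaming back to the web server, so no app process stays tied up (an illustrative snippet; the paths are assumed):

```nginx
# The app responds with "X-Accel-Redirect: /protected/big.mp4";
# nginx intercepts that header and streams the file itself.
location /protected/ {
    internal;                # not reachable directly by clients
    alias /var/www/files/;   # assumed location of the files on disk
}
```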

~~~
Loic
You can easily do it in PHP, Python or your language of choice using
Mongrel2[0]. Because of the asynchronous nature of the messaging between your
processes and the Mongrel2 web server, a single process can efficiently serve
thousands of clients streaming an mp3 or whatever file you want. This is
because you are not "locking" a process or a thread for a single client.

[0]: <http://www.mongrel2.org>

~~~
raverbashing
That's what I was trying to remember!

------
boundlessdreamz
Btw Dalli no longer requires a reset -

[https://github.com/mperham/dalli/commit/ee5193e206c3934cb175...](https://github.com/mperham/dalli/commit/ee5193e206c3934cb17596ae88099ea36da8d576)

~~~
robotmay
Ah ha, I see that's in the latest gem release - I'll update my version later.
I have a feeling I was the cause of some slowness at Memcachier when I first
switched to Unicorn without closing the connections, which must have been on a
previous version :D

