For the record, this may reflect my particular Ubuntu installation, or my room's ethernet port, as no other students have had this problem. Good to know I'm an edge case here.
For example, my ISP here in Austria drops my IP connection every 8 hours. After reconnecting, a new dynamic IP gets assigned to my modem. In addition, the internal network is behind NAT and the modem might have some default firewall rules that filter some types of packets.
After a new IP gets assigned to my modem, Google applications like GMail typically hang for something like 10-15 minutes. I suspect that Chromium does not recognize that the TCP connection is essentially dead, and that the control information that might tell Chromium this is the case (TCP RST replies?) is either not sent by Google or is filtered by my modem.
I have not experienced this issue with other websites yet, although a similar issue should occur whenever persistent TCP connections are used. Maybe the timeout for persistent TCP connections is lower elsewhere, or it was just a coincidence because I didn't happen to be surfing a given website while my connection was reset.
If you're interested, I can try to provide some more debugging information or a login to a box within the internal network. See the link in my profile for contact details.
> The transparency with which Chrome did this
> was actually a problem for me
Just sayin'. The choice of words was a little ambiguous in that it could be taken either way, and that wildly changes the meaning of the word 'transparent.'
> the word "transparent", in this context, means "easily perceived or detected"
Not knowing was a bad thing in this case.
Indeed, that is how tiles used the word:
> The transparency with which Chrome did this
(but meant the opposite)
If you had instead said "the transparent implementation was a problem for me", people would have been less confused.
(Whatever you do, don't look at the WebSocket handshake...)
(Not surprising given that agl is also part of the Go team at Google.)
Switching to SPDY needs to be done at a lower level than HTML, and HTTP/1.1 even happens to have a dedicated feature for exactly this case: the `Upgrade` header:
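A rough sketch of what such a handshake could look like on the wire (the `spdy/2` token here is illustrative; in practice SPDY was negotiated via TLS NPN rather than a cleartext upgrade):

```
GET / HTTP/1.1
Host: example.com
Connection: Upgrade
Upgrade: spdy/2

HTTP/1.1 101 Switching Protocols
Connection: Upgrade
Upgrade: spdy/2
```

After the `101`, both sides would speak the new protocol on the same TCP connection, which is the same mechanism WebSockets uses.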
Sure, http-equiv could be parsed out by the webserver or a proxy and transformed into real headers, but since none of them bothered to do that, the browsers parse them. The end result to the user is largely the same (just that you're unlikely to successfully get a 304). Expecting proxies to do anything intelligent is what's "fundamentally incompatible", not http-equiv.
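For reference, the http-equiv mechanism in question looks like this (the `Expires` header is just one example; browsers treat the meta tag roughly as if the server had sent the equivalent header):

```html
<!-- In the document body; parsed by the browser, usually ignored by proxies -->
<meta http-equiv="Expires" content="Sat, 01 Jan 2011 00:00:00 GMT">
```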
It looks like SPDY designers knew what they were doing when they picked custom `Alternate-Protocol` header.
Still, an HTTP header for advertisement seems to me a much better solution than tying protocol negotiation to a markup language (what if I have an image-serving CDN or a JSON-spitting API?).
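For the curious, the advertisement looked roughly like this (port and protocol token as in the early SPDY drafts; treat the exact values as illustrative):

```
HTTP/1.1 200 OK
Alternate-Protocol: 443:npn-spdy/2
Content-Type: application/json
```

This works for any content type, which is exactly why it doesn't have the CDN/API problem that an HTML-level mechanism would.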
I see in spdy-dev archives that SRV DNS records were also considered.
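An SRV-based approach would presumably have looked something like this (the `_spdy` service name and all values are hypothetical, since this was never standardized):

```
_spdy._tcp.example.com. 300 IN SRV 0 5 443 spdy.example.com.
```

That is: priority 0, weight 5, port 443, target host `spdy.example.com` — the standard SRV layout from RFC 2782.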
Nope, that time would be when there is an RFC.
EDIT: The main concern I have over this is that it basically forces state into an already stateless protocol. Streams need to be established, which means they need a state (up/down/in-between), and both the server and the client (the browser, in this case) need to maintain it.
1. Should the protocol allow browsers to combine streams? I.e. if a user is accessing 3 different YouTube videos in HTML5 mode, can the browser open 1 stream for the various static elements on the page, and 3 or more streams for the video and other content?
This can make the browser feel faster even when loading non-cached pages, so it's definitely desirable. However, it's a lot more work on both the browser and the server to maintain it.
2. What rules are there for streams? The draft states that 1 HTTP request and response should be used per stream, but would it make sense to allow both the browser and the server to open streams? How are browser-instantiated streams different from server-instantiated ones? Do those need any additional rules?
3. Should SPDY be content-agnostic (where a stream is a stream), or would it make sense to have several QoS levels assigned to streams (i.e. high for video or other streaming data, medium for control, low for static elements)?
4. (related to 3) Do HTML, CGI, etc. need to be extended to handle this, or should it be left up to the servers and browsers to implement?
As developers, we need to be able to see things exactly as the people we're developing for see them, since they do not always have the luxury of using the latest and greatest browsers.