

Protocol-Relative URLs to Fix Mixed-Content Warnings - autoref
http://autoref.com/blog/2012/09/13/the-tech-behind-autoref-protocol-relative-urls/

======
rachelbythebay
This isn't as simple as it sounds, particularly if you have something like an
Atom feed. Putting an IMG SRC pointing at //example.com/foo.jpg in the HTML
served as a regular web page will work fine for nearly everyone. However,
putting that same construct in your feed will cause a nontrivial number of
people to "GET //example.com/foo.jpg" from your web server. It's irritating.

If you then change your feed to hard-code <http://example.com/...> in IMGs
and such, you've just created a mixed-content hole when someone reads your
feed over https. So then you really have to have a second instance of the
feed with nearly-identical URLs, but with https protocols.

Why not serve https to everyone? Some places block it. They tend to be
oppressive regimes, but that's the way it is. They can get to you on port 80
but not port 443.

I had to go to a hybrid scheme. Web pages get //host/path, http fetches of my
Atom feed get <http://host/path>, and https fetches of the Atom feed get
<https://host/path>.
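
That hybrid scheme can be sketched roughly as follows (a hypothetical helper,
not the actual implementation; host and path names are illustrative):

```python
# Sketch of the hybrid URL scheme described above: protocol-relative
# URLs in HTML pages, hard-coded schemes in the Atom feed, matching
# how the feed itself was fetched.

def asset_url(host, path, context, scheme=None):
    """Pick the URL form for an asset reference.

    context -- "page" for regular HTML pages, "feed" for the Atom feed
    scheme  -- "http" or "https": how the feed itself was requested
    """
    if context == "page":
        # Browsers handle protocol-relative URLs fine in HTML pages.
        return "//%s%s" % (host, path)
    # Some feed readers mishandle "//host/path" (they GET it from the
    # feed's own server), so hard-code the scheme the reader used.
    return "%s://%s%s" % (scheme, host, path)
```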

Even then, some browsers _still_ don't quite work with the web site, but I'm
okay with ignoring them, since they didn't send User-Agent strings and are
obviously broken. Besides, there have only been two of them so far this entire
week.

( Mostly recycled from a post about this not too long ago:
<http://rachelbythebay.com/w/2012/08/28/feed/> )

~~~
justincormack
And I thought people who used RSS were technically aware and wouldn't use
broken clients. How about collecting user agent strings and filing bugs, or
producing a test suite?

------
Jare
I thought the Chrome browser had recently started blocking insecure content
loaded from https pages; it does for me, and I don't remember changing any
related settings recently. Anyway, yeah, it's very sane advice.

~~~
throwaway64
It did, temporarily, in the beta. However, so much of Google's own stuff had
this issue (especially AdWords) that they scrapped the idea.

~~~
spullara
I'm now intrigued why this was voted down. I certainly saw Chrome not loading
mixed content in the past. Did they really revert this behavior?

------
ryetoasthumor
Third post from the series: <http://autoref.com/blog/2012/09/08/the-tech-behind-autoref-part-2static-asset-compilation/>

Full disclosure (bizdev at autoref)

------
ryankirkman
At cdnjs, the homepage shows protocol relative URLs by default. We got the
idea from Paul Irish: <http://paulirish.com/2010/the-protocol-relative-url/>
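
The transformation itself is just dropping the scheme. A minimal sketch
(hypothetical helper; the cdnjs URL below is only an example):

```python
def protocol_relative(url):
    """Strip the scheme so the browser reuses the page's protocol."""
    for scheme in ("http:", "https:"):
        if url.startswith(scheme + "//"):
            return url[len(scheme):]
    return url  # already relative, or a non-http scheme: leave alone
```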

------
crisnoble
The website <http://www.htmlshell.com/> uses this technique when linking to a
CDN jQuery. One problem: it doesn't seem to work for me when I'm developing
locally. I get the error that '$' is not defined, so jQuery didn't load. Does
it matter for local vs. server development?

~~~
sciolistse
If you're using local files directly, // will resolve to file://, and not work
as you'd expect. It'll work if you use a local web server instead, of course.
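
That resolution behavior is easy to see with Python's `urllib.parse.urljoin`,
which follows the same RFC 3986 rules browsers use (the CDN hostname here is
made up):

```python
from urllib.parse import urljoin

# A protocol-relative ("network-path") reference inherits the scheme
# of the base URL, per RFC 3986.
ref = "//cdn.example.com/jquery.min.js"

# Served from a local web server: resolves to http, loads fine.
print(urljoin("http://localhost:8000/index.html", ref))
# http://cdn.example.com/jquery.min.js

# Opened straight from disk: resolves to file://, which fails.
print(urljoin("file:///home/me/index.html", ref))
# file://cdn.example.com/jquery.min.js
```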

~~~
crisnoble
gotcha.

------
lukeasrodgers
For what it's worth, the research I'm aware of on whether users actually
notice or care about mixed content warnings suggests that they generally
neither notice nor care: <http://www.usablesecurity.org/emperor/>.

~~~
eli
Sure, most people will ignore security warnings, but I still really, _really_
don't want my site to ever pop up security warnings.

------
bgmd
The problem this DOESN'T solve is that if the resource is not available over
SSL, then this won't work at all and will show broken links.

I've got a project that addresses this by converting any URL to SSL:

    
      http://www.fixweb.co/
    

Just take whatever URL you want to access, like

    
      http://example.com/test.gif
    

add the FixWeb.co address in front of it like this:

    
      https://fixweb.co/example.com/test.gif
      ^^^^^^^^^^^^^^^^^^
    

and it will return the file over SSL.

It's not designed for high security file delivery, obviously, but it will get
you around a normal Mixed-Content warning.
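
The rewrite itself is trivial. A sketch, assuming the path convention shown
above (scheme stripped, remainder appended to the proxy host):

```python
# Client-side rewrite for a FixWeb-style SSL proxy: the original
# URL's scheme is dropped and the rest is appended to the proxy base.

PROXY = "https://fixweb.co/"

def proxy_url(url):
    """Rewrite an absolute http(s) URL to fetch it through the proxy."""
    for scheme in ("http://", "https://"):
        if url.startswith(scheme):
            return PROXY + url[len(scheme):]
    raise ValueError("expected an absolute http(s) URL: %r" % url)
```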

~~~
pronoiac
Oh look, you just implemented an open proxy. Those are rather prone to abuse,
aren't they?

------
zenazn
...or you could request all your resources over SSL.

~~~
RKearney
No no no! Unless you absolutely need your content encrypted, all this does is
add unnecessary latency to your site for the SSL handshakes. If your site
hosts zero external content, and all of your content is hosted from the same
domain, this wouldn't be as big of an issue. However, most sites host content
from 4-5 different domains, which means you're going to have 4-5 different SSL
handshakes, thus resulting in a much slower page load.

~~~
fotbr
If you're constantly pulling content from 4-5 different domains, perhaps you
should re-think your site's architecture to minimize that. Sometimes it's
unavoidable, but a lot of the time it's simply due to laziness or because the
web developers are practicing the cargo-cult method of development.

Yes, this means you probably need to re-evaluate whether your
"like/share/connect/plusone this" really needs to be on every page, or even
any page.

You also need to re-evaluate whether you really need statistics via third
parties, or if you can track them yourselves. The answer is almost always that
you can do it yourself, though it might not be as convenient as, say,
copy/pasting google analytics code into your template.

If you're using a content engine, take a close look at it. Turn off and remove
features you don't use, don't merely hide them. You might also be surprised
which plugins may be phoning home in some form, or pulling content from places
you didn't expect.

~~~
rimantas

      > If you're constantly pulling content from 4-5 different
      > domains, perhaps you should re-think your site's
      > architecture to minimize that.
    

And if you are not, you may be interested in doing that: browsers allow only a
limited number of connections per domain, so splitting your assets across 2-4
domains lets more resources download in parallel.

<http://developer.yahoo.com/performance/rules.html#split>
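
A common way to do that split is to hash each asset path to a fixed shard
hostname, so every page references a given asset from the same domain and
browser caching still works. A sketch (the shard hostnames are made up):

```python
import zlib

# Hypothetical shard hostnames; in practice these would be CNAMEs
# pointing at the same asset servers.
SHARDS = ["static1.example.com", "static2.example.com",
          "static3.example.com", "static4.example.com"]

def shard_url(path):
    """Map an asset path to a stable shard, keeping URLs cacheable."""
    # crc32 is stable across runs (unlike Python's hash()), so the
    # same path always lands on the same shard.
    host = SHARDS[zlib.crc32(path.encode("utf-8")) % len(SHARDS)]
    return "//%s%s" % (host, path)
```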

~~~
fotbr
Or, simplify your pages. The current direction of web development and design,
with hundreds or even thousands (yes, thousands) of assets on a single page,
is often counterproductive, and borderline stupid.

------
jc4p
Unrelated to the article, but a small bug on your home page: if you choose a
maker/model/ZIP and hit next, then hit the back button in your browser, you
come back to the home page with the maker preselected but the model
unselectable. You have to choose another maker and then the original one again
to be able to pick a model.

------
paulsutter
We tried this at Quantcast in our tag. It worked great, but we had constant
complaints from people who expected to see an "http:" in there.

Finally, we went back to two separate tags (one for http, one for https).

I hope this becomes more widely understood; it makes a lot of sense.

------
benologist
This looks like invaluable information for people looking to buy used cars
that have secure websites but couldn't figure out how to consistently write
https.

Thank you for finally acknowledging and serving this important niche.

