

URL Shortener Analysis - Tweets from 2009 vs. 2012 - jterrace
http://ewencp.org/blog/url-reshorteners-over-time/

======
NameNickHN
I'm not sure what the point of this analysis is. That Twitter is wrapping each
and every URL, however short, is a known fact. And since people are using
different URL shorteners, URLs are bound to be re-wrapped when a tweet is
retweeted.

~~~
Terretta
It's a known fact, but nevertheless annoying that Twitter _URL LENGTHENS_,
and annoying that Twitter thinks it's ok to rebrand publisher shorteners like
nyti.ms or tcrn.ch, which help users recognize where the redirect will take
them.

~~~
justincormack
But surely these publisher-specific shorteners no longer serve any function.
They existed because you had to use a shortener, so publishers created their
own. Now that everything goes through t.co, you effectively don't need a
shortener yourself. For your own site you get no more analytics data from a
shortener than from a real URL.

~~~
Terretta
I don't generally use sharing services; I email links while reading the page.
I like emailing a link that I know doesn't line break. That just needs to be
less than 70 chars, but many news sites are longer. So as long as they're
shortening, might as well shorten for all venues. Twitter's shortener is
useless in this direct sharing case.

Another publisher-specific function is, as I mentioned, letting users know,
through your branded short URL, where the shortened link is likely to take
them.

------
thenilly
I thought the point was that Twitter can add a few lines of code that check
whether a URL is shortened before re-wrapping. Fewer redirects, less time till
I get to the real content.
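A minimal sketch of that check, assuming a hand-maintained list of known shortener hosts (the list and function names here are illustrative, not anything Twitter actually uses):

```javascript
// Illustrative set of shortener domains; a real deployment would need a
// much longer, maintained list.
const KNOWN_SHORTENERS = new Set([
  "bit.ly", "t.co", "goo.gl", "j.mp", "nyti.ms", "tcrn.ch",
]);

// Returns true when the URL's host is a known shortener, meaning a second
// wrap would only add another redirect hop.
function alreadyShortened(url) {
  try {
    return KNOWN_SHORTENERS.has(new URL(url).hostname);
  } catch (e) {
    return false; // not a parseable absolute URL
  }
}
```

Twitter would presumably still wrap for its own stats, but a check like this is all it would take to at least know when it is adding a redundant hop.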

~~~
justincormack
Twitter want the data, so they decided you have to wrap in t.co. They don't
care about indirection. Though they could try to undo other shorteners.

~~~
ewencp
At least on the Twitter web site, they already have the unwrapped URLs -- see
the data-expanded-url and data-ultimate-url attributes on links (and, of
course, at least one unwrapping is used for displaying the link).
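In markup terms, that looks roughly like the sketch below; the sample anchor is illustrative rather than copied from Twitter's actual page source, but the `data-expanded-url` attribute name is as described above:

```javascript
// Pull the unwrapped target out of a tweet link's HTML, if present.
function expandedUrl(anchorHtml) {
  const m = anchorHtml.match(/data-expanded-url="([^"]+)"/);
  return m ? m[1] : null; // null when the attribute is absent
}
```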

They probably don't want to completely undo other shorteners, however, because
these services aren't only used for shortening. I absolutely believe it's
beneficial for them to have multiple levels of indirection so others can
gather stats, I just don't like it as a user who is consuming the links. If
the amount of indirection, delay in loading pages, and brittleness of links
gets bad enough, maybe it eventually will make sense for them to try harder to
avoid it.

But they could, in some cases, avoid increasing the amount of indirection and
still gather the same stats. For example, on their site, since they open links
in a new tab anyway, they could make that link direct and capture the click
event with a separate AJAX request in the original Twitter tab.
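A sketch of that idea: the anchor keeps its direct href, and the click is logged out-of-band. The `/log-click` endpoint and payload shape are assumptions for illustration, not a real Twitter API:

```javascript
// Build the stats payload for a click (kept separate so it's easy to test).
function buildClickLog(href, ts) {
  return JSON.stringify({ href: href, ts: ts });
}

// Attach out-of-band click logging to a direct link. sendBeacon fires the
// stats request without delaying navigation, and there is no preventDefault(),
// so the browser follows the direct link as usual.
function trackDirectLink(anchor) {
  anchor.addEventListener("click", function () {
    navigator.sendBeacon("/log-click", buildClickLog(anchor.href, Date.now()));
  });
}
```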

If you focus only on page load latency, other sites/services besides URL
shorteners are making things worse. I regularly notice some small delay
waiting for my click on a Google result to go through Google's servers since
they (annoyingly, in my opinion) swap out the target URL for a wrapped version
at the last minute. Of course, Google has other good reasons for doing this as
well -- stats and removing referral information immediately come to mind.

I also didn't show that this negatively impacts user experience. It could be
that t.co, on average, adds so little overhead that the extra level of
indirection doesn't matter. But I think it's worth looking into, and if it
does have a negative effect, figuring out ways to mitigate it.

~~~
justincormack
Another issue is that people who use Twitter want stats, and t.co has no
public stats interface, so people still use bit.ly etc. because they want the
data.

On mobile I notice the slowness of t.co a lot. On desktop I have been noticing
the wait for Google's redirects recently too. So I think they are a problem;
throwing more resources at it could help, but using other means, as you
suggest, would be better.

