
SingleHop Are Cheats - charliesome
http://charlie.bz/articles/singlehop_are_cheats.html
======
SingleHop-Andy
Hey All,

Andy from SingleHop here. Thought I'd "hop" in to clear some things up.

Everyone was correct in saying that this was an honest mistake, and that the
speed test download files were in fact generated with dd and an input file of
/dev/zero. We've since corrected this by generating new files using
/dev/urandom, and the results are more accurate now.
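
For the curious, the fix described here amounts to regenerating the file from /dev/urandom. A minimal sketch (filename and size are illustrative — scaled down to 10 MB here; the real file was ~500 MB):

```shell
# Regenerate the speed-test file from /dev/urandom instead of /dev/zero
# (10 MB here for illustration; SingleHop's file was ~500 MB).
dd if=/dev/urandom of=testfile.bin bs=1M count=10 2>/dev/null

# Sanity check: random data should not compress, so an on-the-fly gzip
# (as the web server would do) comes out slightly LARGER than the input
# rather than ~1000x smaller.
gzip -c testfile.bin | wc -c
```

With a /dev/zero input, the same check would report only a few kilobytes.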

We were by no means trying to "trick" or "deceive" any customers, or anyone at
all. I am personally happy to see the community bring this to light so we
could tackle it.

~~~
tlrobinson
I'm curious: why are you serving the file over HTTPS in the first place?

~~~
simcop2387
My guess would be that it will prevent an ISP from trying to do naughty things
like caching or otherwise cheating the speed test and giving you a result
that's better than what you are really getting.

------
ableal
I downloaded the first couple dozen megabytes. It's all nulls.

    
    
        $ od 500megabytefile.tar.gz 
        0000000 000000 000000 000000 000000 000000 000000 000000 000000
        *
        27200000
    

(besides not being a .tar.gz, it's not 500 MB either, it's 512 000 000 bytes =
512 MB, or 488 MiB)

~~~
tomjen3
That isn't strange though -- if I were creating a file that I knew would only
be used for speed testing, I would make it all nulls too.

It is just much easier to get a file of some exact size if you create it
yourself.

~~~
hyperbovine
$ dd if=/dev/random of=nonbullshittestfile bs=1M count=500

~~~
treo
That would take forever, at least on my system, as I don't have a lot of
entropy available. Just for kicks I started this when I began writing this
reply, and so far I have the huge amount of 200 bytes in the test file --
that's about 10 bytes per second.

But if you use /dev/urandom you can get quite a bit more. On this system it's
about 7 MB/s.
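
(Numbers like these are machine-dependent, and dd itself reports a
transfer-rate summary on stderr, so measuring /dev/urandom throughput on any
box is a one-liner:)

```shell
# Measure /dev/urandom throughput; dd prints the rate on stderr when done.
# Modern kernels are typically far faster than the ~7 MB/s quoted above.
dd if=/dev/urandom of=/dev/null bs=1M count=100
```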

~~~
jcromartie
What's the difference between /dev/random and /dev/urandom? I've only ever
used urandom and it spits out data as fast as I can consume it.

~~~
stillinbeta
/dev/random will block while waiting to collect entropy from the system.
/dev/urandom will be satisfied with pseudorandom numbers. That's fine for many
applications (e.g. a file filled with garbage) but not acceptable for things
like cryptography.

~~~
ibisum
That entropy, by the way, is derived from the keyboard and mouse devices. If
you want /dev/random to go faster, move your mouse and type more...

~~~
pavel_lishin
Can it be configured to use other sources?

Does this mean that if I've got a server with no mouse/keyboard attached,
/dev/random will block forever?

Logging into my slicehost server, and running cat /dev/random | hexdump -C
seems to support this, more or less - only a few lines get output unless I
start typing into the terminal - then it goes marginally faster.
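
As for other sources: yes -- on Linux, daemons such as rngd (from rng-tools)
or haveged can feed the kernel pool from hardware RNGs or timing jitter. A
quick way to watch the kernel's entropy estimate (the /proc path below is the
standard Linux one, so this sketch is Linux-specific):

```shell
# Print the kernel's current entropy-pool estimate, if exposed.
if [ -r /proc/sys/kernel/random/entropy_avail ]; then
    cat /proc/sys/kernel/random/entropy_avail
else
    echo "entropy_avail not available on this system"
fi
```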

~~~
tlrobinson
You can buy hardware random number generators.

<http://www.entropykey.co.uk/>

<http://www.idquantique.com/true-random-number-generator/products-overview.html>

<http://www.gamesbyemail.com/dicegenerator>

...among many others.

------
trotsky
If you go to the testing page
(<http://www.singlehop.com/why_singlehop/data_center_details.php>) you have
two choices of testing location -- Downtown leads to HTTPS and goes fast, but
Elk Grove leads to HTTP and is uncompressed.

Given how easy it would be to get caught doing this (the HTTPS download makes
Chrome claim 40 Mbps over a T1), I think it's more likely this was accidental.

~~~
SoftwareMaven
I can certainly see a path that led to this through incompetence rather than
malice: /dev/null is a really fast source for creating a big file, and the
SSL server was already configured to do compression.

You'd think they would notice the CPU bottleneck on their system, though, as
they recompress that file for everybody. That's the only thing that makes me
wonder.
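
The ratio behind that bottleneck is easy to check locally -- zeroes gzip at
roughly 1000:1, so the server ships almost no bytes but still burns CPU
deflating the full file for every download. A quick sketch:

```shell
# A 10 MB file of zeroes...
dd if=/dev/zero of=zeroes.bin bs=1M count=10 2>/dev/null

# ...gzips down to around 10 KB, a ratio near 1000:1.
gzip -c zeroes.bin | wc -c
```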

~~~
lubutu
Nitpick: you mean /dev/zero. /dev/null is an empty void.

------
shabble
I'm reminded of a story about trolls back in the BBS/early internet days
creating zip files of several gigs' worth of zeroes or other trivially
compressible data.

They could then upload a very small payload and DoS the server by filling its
disk when it tried to decompress it all (back when disks were measured in tens
of MB, if you were rich).

Pretty much the opposite of this trick.
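
A miniature version of that trick is easy to reproduce: the archive below
takes only about 100 KB on disk but expands to 100 MB when decompressed.

```shell
# Build a tiny archive that inflates enormously when decompressed.
dd if=/dev/zero bs=1M count=100 2>/dev/null | gzip > bomb.gz

wc -c < bomb.gz              # roughly 100 KB on disk
gzip -dc bomb.gz | wc -c     # 104857600 bytes (100 MB) once expanded
```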

~~~
slug
<https://secure.wikimedia.org/wikipedia/en/wiki/Zip_bomb>

------
Cymen
One reason to use SSL is to avoid any intermediary caching servers. I hope
those days are gone but some providers might still use them. Data retrieved
via SSL would avoid the local cache and be more accurate. So perhaps this was
one of the factors in the decision to use SSL. The decision to turn on
compression could have been completely separate.

------
john-n
Off topic, but this story has rekindled my dislike for re-targeting in online
ads. Having visited SingleHop this morning, I've spent half the day with blue
SingleHop boxes glaring at me.

------
mschonfeld
Funny thing is, as a VPS hosting solution they're actually pretty decent...
If this is intentional, and I do believe it is, it's borderline genius. Way to
go for catching this.

~~~
0x12
> it's borderline genius.

Genius is cheating your customers on a benchmark?

It's borderline criminal.

~~~
robtoo
You've never heard of an evil genius?

~~~
jerguismi
I don't see it as genius either way. It's not a technological breakthrough or
anything, just a simple trick.

------
rmc
They are cheating too well. By showing a massive speed increase (45 MB/s)
they immediately raise warning flags and ask to be caught out.

------
tacoe
<http://en.wikipedia.org/wiki/Hanlon%27s_razor>

------
DrJokepu
I would be hesitant to attribute this to malice; it could easily be an honest
mistake by an inexperienced employee.

~~~
ominous_prime
They are presenting a fake compressed tarball containing long strings of
zeroes so that it can be easily compressed in transit. That's not something
that comes about accidentally; pair that with the seldom-used option of SSL
compression, and there seems to be intent behind these actions.

~~~
DrJokepu
Maybe the SSL compression was enabled on the web server by a sysadmin who
likes to tweak settings. Maybe he doesn't work at the company anymore. Maybe
it was a default setting in a previous version of some exotic distro. Who
knows?

Maybe the 500MB file was created by an intern who has no clue how compression
works or that the webserver compresses data. Maybe the manager told him to get
it done ASAP. Maybe he doesn't care about his job anyway and just solved it
the quickest way possible. Maybe initially he uploaded a real file with real
data but then his manager told him to replace it with something that's not
real. Maybe he tried to create a file with random bytes but there was a bug in
his script that set all the bytes to zero, and he never bothered to check.
Maybe they tried to download the file in their office but didn't realize it
was too fast, because they attributed the speed to the fact that it was
downloaded through their local network.

Shall I go on? I mean, this kind of stuff happens everywhere every day. There
are a lot more stupid people than evil people. Just assume stupidity.

~~~
ominous_prime
Point made. I am often astounded by what stupid can do when acting in concert.

BTW, it appears they have replaced said file with real data

    
    
        curl -H "Range: bytes=0-100,100000-100100" https://leap.singlehop.com/speedtest/500megabytefile.tar.gz | hexdump
    

looks like real data now

------
cvander
Good discovery. Let's see if they change the practice, or if you catch more
webhosting companies at it.

~~~
bwb
I've been using SingleHop for 20+ servers for a year now and I love them, I
also know the owners well and they wouldn't ever do something like this on
purpose. Honest mistake, not something so sinister :)

Thanks, Ben

~~~
SingleHopDan
Thank you, Ben. It was a mistake, and we have owned up to it on our blog and
in Andy's post above -- <http://www.singlehop.com/blog/our-honest-mistake/> --
and I hope that people don't judge us as bad guys for it. We really see no
reason to mislead people into signing up with us; it sounds like it's just bad
alignment. Thanks, Dan Ushman.

------
flashmob
I've been with SingleHop since they started, and I have had no complaints
whatsoever! I've always seen them as a great little startup run by friendly
and innovative people who have honestly worked very hard to get their business
to its current level. I find it hard to believe that they did this
intentionally -- it would be too easy to get caught. It is really
disappointing to see such harsh criticism. Can I ask whether the author of the
article checked with SingleHop first before making such an outlandish
accusation?

Edit: Fine, down-vote this all you like, but my comment still stands.

------
arantius
<http://en.wikipedia.org/wiki/Hanlon%27s_razor>

    
    
        Never attribute to malice that which is adequately explained by stupidity.
    

I don't have a lot more to say. It seems to fit this situation perfectly
well, given (in hindsight) that they've fessed up in public.

------
rshm
I think @charliesome is not even a customer of SingleHop. Instead of showing
what the file isn't, he could have easily created the test files himself and
checked against a correct one. It's very unfair to brand people as cheats
forever on the first encounter.

I have been using three of SingleHop's dedicated servers for more than two
years. They are a small group of people who pick up the phone in the middle
of the night to assist you with your pet projects.

