SingleHop Are Cheats (charlie.bz)
208 points by haileys on Sept 16, 2011 | 50 comments



Hey All,

Andy from SingleHop here. Thought I'd "hop" in to clear some things up.

Everyone was correct in saying that this was an honest mistake, and that the speed test download files were in fact generated with dd and an input file of /dev/zero. We've since corrected this by generating new files using /dev/urandom, and the results are more accurate now.
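
To give a sense of the difference, the two approaches look roughly like this (block size and count here are illustrative, not necessarily the exact values used):

    # old: all zeros, compresses to almost nothing
    dd if=/dev/zero of=500megabytefile.tar.gz bs=1M count=500
    # new: incompressible random data
    dd if=/dev/urandom of=500megabytefile.tar.gz bs=1M count=500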

We were by no means trying to "trick" or "deceive" any customers, or anyone at all. I am personally happy to see the community bring this to light so we could tackle it.


I'm curious why you're serving the file over HTTPS in the first place?


My guess would be that it will prevent an ISP from trying to do naughty things like caching or otherwise cheating the speed test and giving you a result that's better than what you are really getting.


I downloaded the first couple dozen megabytes. It's all nulls.

    $ od 500megabytefile.tar.gz 
    0000000 000000 000000 000000 000000 000000 000000 000000 000000
    *
    27200000
(besides not being a .tar.gz, it's not 500 MB either, it's 512 000 000 bytes = 512 MB, or 488 MiB)
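
For the unit arithmetic, a quick bc check (assuming the usual 1 MiB = 1,048,576 bytes):

    $ echo "512000000 / 1000000" | bc
    512
    $ echo "scale=1; 512000000 / 1048576" | bc
    488.2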


This also means that opening the tar.gz does not work. Not knowing about the cheat, I'd want to try. My 7-Zip choking on the file would make me download it again. After the second time, my trust in SingleHop's transfer abilities would be very low - after all, files they send appear to get corrupted.


So it can basically compress to something extreme, too.


Exactly. I watched the total traffic tick up in Activity Monitor while downloading it, and it was only 2.2 MB in total.
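
It's easy to reproduce that ratio locally; 500 MB of zeros should gzip down to only a few hundred kilobytes (the somewhat larger figure over SSL is presumably TLS record overhead):

    $ dd if=/dev/zero bs=1M count=500 2>/dev/null | gzip -c | wc -c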


It is sort of a funny cheat, though. A good prank to pull on friends to claim you have a very fast internet connection, but for a company to do this in order to promote their product is very dishonest.


That isn't strange though -- if I were creating a file that I knew would only be used for speed testing, I would make it all nulls too.

It is just much easier to get a file of some exact size if you create it yourself.


I'd use random data to make it hard to compress. You don't know whether there will be some link-layer compression along the way.


That doesn't really fix the problem: what if that random data happened to compress well? The solution is not to compress anything while downloading.


"Random data' [1] will not compress well [2].

[1] high-entropy in any of the usually-available ways

[2] for any compression system actually deployed, as opposed to one contrived to match specifically the 'random' source


A large quantity of random data is incredibly unlikely to compress well.
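
Easy enough to check, too:

    $ dd if=/dev/urandom bs=1M count=64 2>/dev/null | gzip -c | wc -c
The compressed output should actually come out slightly larger than the 64 MB that went in.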


$ dd if=/dev/random of=nonbullshittestfile bs=1M count=500


That would take forever, at least on my system, as I don't really have a lot of entropy available. Just for kicks I started this when I began writing this reply, and so far I have a whopping 200 bytes in the test file. So that's about 10 bytes per second.

But if you use /dev/urandom you can get quite a bit more. On this system it is about 7 MB/s.
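
That rate is just what dd reports in its summary line, by the way; a smaller run shows the same thing without leaving a big file around:

    $ dd if=/dev/urandom of=/dev/null bs=1M count=64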


What's the difference between /dev/random and /dev/urandom? I've only ever used urandom and it spits out data as fast as I can consume it.


/dev/random uses environmental noise for entropy, and can be depleted rather quickly.

/dev/urandom supplements its entropy with a PRNG so that it never blocks.


/dev/random will block while waiting to collect entropy from the system. /dev/urandom will be satisfied with pseudorandom numbers. That's fine for many applications (e.g. a file filled with garbage) but not acceptable for things like cryptography.
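
On Linux you can also watch the kernel's entropy estimate directly (this path is Linux-specific):

    $ cat /proc/sys/kernel/random/entropy_avail
Reading from /dev/random drains that number; /dev/urandom keeps going no matter how low it gets.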


That entropy, by the way, is derived from the keyboard and mouse devices. If you want /dev/random to go faster, move your mouse and type more.


Can it be configured to use other sources?

Does this mean that if I've got a server with no mouse/keyboard attached, /dev/random will block forever?

Logging into my Slicehost server and running cat /dev/random | hexdump -C seems to support this, more or less - only a few lines get output unless I start typing into the terminal, and then it goes marginally faster.



It will also use other interrupt timings to generate entropy, most notably hard drives, since they're rather unpredictable in when they reply back due to the platter having to rotate to the right place. Not sure how this works on an SSD. I also believe that if you've got some kind of hardware RNG it will use that too.
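
To the question above about other sources: on Linux, if the kernel exposes a hardware RNG (the device name varies by driver), rng-tools can feed it into the pool, something like:

    $ rngd -r /dev/hwrng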


Interesting. I don't see /dev/random blocking on Mac OS X (13" MacBook Pro). I wonder what the source is?


On OS X, /dev/random and /dev/urandom are the same thing (both acting like the traditional /dev/urandom).


I'm guessing it may be even smaller than a file filled with zeroes. If I were doing it, I'd use a sparse file, so it only takes a few blocks, but (sans compression) would take a long time to download. Like this:

  tsuraan@macbeth ~ $ dd if=/dev/zero of=abigfile bs=1M count=1 seek=1000000
  1+0 records in
  1+0 records out
  1048576 bytes (1.0 MB) copied, 0.00320282 s, 327 MB/s
  tsuraan@macbeth ~ $ ls -lh abigfile 
  -rw-r--r-- 1 tsuraan tsuraan 977G Sep 16 11:18 abigfile
I probably wouldn't actually make it ~1TB, but that's the idea.
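
The giveaway is du, which reports actual allocated blocks rather than the apparent size, so it would show something like:

  tsuraan@macbeth ~ $ du -h abigfile
  1.0M  abigfile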


Surely you wouldn't get as greedy as they did though? Nobody would've suspected them if they had been a little less "cheaty".


If you go to the testing page (http://www.singlehop.com/why_singlehop/data_center_details.p...) you have two choices of testing location - Downtown leads to HTTPS and goes fast, but Elk Grove leads to HTTP and is uncompressed.

Given how easy it would be to get caught doing this (the HTTPS download makes Chrome claim 40 Mbps over a T1), I think it's more likely this was accidental.


I can certainly see a path that led to this through incompetence rather than malice: /dev/null is really fast to create a big file from, plus an SSL server that was already configured to do compression.

You'd think they would notice the CPU bottleneck on their system, though, as they recompress that file for everybody. That's the only thing that makes me wonder.


Nitpick: you mean /dev/zero. /dev/null is an empty void.


I'm reminded of a story about trolls back in the BBS/early internet days creating zip files of several gigs' worth of zeroes or other trivially compressible data.

They could then upload a very small payload and DoS the server by filling its disk when it tried to decompress it all (back when disks were measured in tens of MB, if you were rich).

Pretty much the opposite of this trick.
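
The classic version is trivial to build (gzip here rather than zip, but it's the same idea; sizes are just for illustration):

    # ~4 GB of zeros shrinks to a ~4 MB upload
    $ dd if=/dev/zero bs=1M count=4096 2>/dev/null | gzip > bomb.gz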



One reason to use SSL is to avoid any intermediary caching servers. I hope those days are gone but some providers might still use them. Data retrieved via SSL would avoid the local cache and be more accurate. So perhaps this was one of the factors in the decision to use SSL. The decision to turn on compression could have been completely separate.


Off topic, but... this story has rekindled my dislike for retargeting in online ads. Having visited SingleHop this morning, I've spent half the day with blue SingleHop boxes glaring at me.


Funny thing is - as a VPS hosting solution, they're actually pretty decent... If this is intentional, and I do believe it is, it's borderline genius. Way to go for catching this.


> it's borderline genius.

Genius is cheating your customers on a benchmark?

It's borderline criminal.


You've never heard of an evil genius?


I don't see it as genius either way. It's not a technological breakthrough or anything, just a simple trick.


They are cheating too well. By showing a massive speed increase (45 MB/sec) they are immediately raising warning flags and asking to be caught out.



I would be hesitant to attribute this to malice; it could easily be an honest mistake by an inexperienced employee.


They are presenting a fake compressed tarball containing long strings of zeros so that it can be compressed easily in transit. That's not something that comes about accidentally. Pair that with the rarely used option of SSL compression and there seems to be intent behind these actions.
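
For what it's worth, you can check whether a server has TLS compression enabled from openssl s_client's handshake summary (the exact wording depends on the OpenSSL build):

    $ openssl s_client -connect leap.singlehop.com:443 < /dev/null 2>/dev/null | grep -i compression
    Compression: zlib compression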


Maybe the SSL compression was enabled on the web server by a sysadmin who likes to tweak settings. Maybe he doesn't work at the company anymore. Maybe it was a default setting in a previous version of some exotic distro. Who knows?

Maybe the 500MB file was created by an intern who has no clue how compression works or that the web server compresses data. Maybe the manager told him to get it done ASAP. Maybe he doesn't care about his job anyway and just solved it the quickest way possible. Maybe initially he uploaded a real file with real data but then his manager told him to replace it with something that wasn't real. Maybe he tried to create a file with random bytes, but there was a bug in his script, all the bytes were set to zeros, and he never bothered to check. Maybe they tried to download the file from their office but didn't realize it was too fast, because they attributed the speed to the fact that it was downloaded over their local network.

Shall I go on? I mean, this kind of stuff happens everywhere every day. There are a lot more stupid people than evil people. Just assume stupidity.


Point made. I am often astounded by what stupidity can do when acting in concert.

BTW, it appears they have replaced said file with real data:

    curl -H "Range: bytes=0-100,100000-100100" https://leap.singlehop.com/speedtest/500megabytefile.tar.gz | hexdump
Looks like real data now.


>Maybe the SSL compression was enabled on the web server by a sysadmin who likes to tweak settings. Maybe he doesn't work at the company anymore. Maybe it was a default setting in a previous version of some exotic distro. Who knows?

I think it's pretty doubtful that the main marketing demo of your site is going to use SSL for a lark.


Good discovery. Let's see if they change the practice, or if you catch more web hosting companies doing the same.


I've been using SingleHop for 20+ servers for a year now and I love them. I also know the owners well, and they wouldn't ever do something like this on purpose. Honest mistake, not something so sinister :)

Thanks, Ben


Thank you, Ben. It was a mistake, and we have owned up to it on our blog and in Andy's post above -- http://www.singlehop.com/blog/our-honest-mistake/ -- and I hope that people don't judge us as bad guys for it. We really see no reason to mislead people into signing up with us; it sounds like it's just bad alignment. Thanks, Dan Ushman.


I've been with SingleHop since they started, and I have had no complaints whatsoever! I have always seen them as a great little startup run by friendly and innovative people who have honestly worked very hard to get their business to its current level. I find it hard to believe that they did this intentionally - it would be too easy to get caught. It is really disappointing to see such harsh criticism - can I ask whether the author of the article checked with SingleHop first before making such an outlandish accusation?

Edit: Fine, down-vote this all you like, but my comment still stands.


http://en.wikipedia.org/wiki/Hanlon%27s_razor

    Never attribute to malice that which is adequately explained by stupidity.
I don't have a lot more to say. Seems to fit this situation perfectly well, given (in hindsight) that they've fessed up in public.


I think @charliesome is not even a SingleHop customer. If he were, instead of just showing what's wrong, he could have easily created the test file himself and checked against a correct one. It's very unfair to mark people as cheats forever at the first encounter.

I have been using 3 of SingleHop's dedicated servers for more than 2 years. They are a small group of people who pick up the phone in the middle of the night to assist you with your pet projects.



