Andy from SingleHop here. Thought I'd "hop" in to clear some things up.
Everyone was correct in saying that this was an honest mistake, and that the speed test download files were in fact generated with dd and an input file of /dev/zero. We've since corrected this by generating new files using /dev/urandom, and the results are more accurate now.
We were by no means trying to "trick" or "deceive" any customers, or anyone at all. I am personally happy to see the community bring this to light so we could tackle it.
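For anyone curious, a minimal sketch of how such a test file can be regenerated from /dev/urandom instead of /dev/zero (the filename and size here are illustrative, not SingleHop's actual commands):
# 500 MiB of pseudorandom data; it won't compress, so the download measures the wire, not gzip
$ dd if=/dev/urandom of=500megabytefile.tar.gz bs=1M count=500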
$ od 500megabytefile.tar.gz
0000000 000000 000000 000000 000000 000000 000000 000000 000000
It is just much easier to find a file that is some exact size if you create it yourself first.
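To put numbers on it (a rough sketch; exact sizes vary with the gzip version): a stream of zeros compresses to roughly a thousandth of its size, while /dev/urandom output doesn't shrink at all.
$ dd if=/dev/zero bs=1M count=500 2>/dev/null | gzip -9 | wc -c      # on the order of 500 KB
$ dd if=/dev/urandom bs=1M count=500 2>/dev/null | gzip -9 | wc -c   # on the order of 500 MB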
Output from /dev/urandom is high-entropy in any of the usually-available senses, and effectively incompressible for any compression system actually deployed, as opposed to one contrived to match specifically the 'random' source.
But if you read from /dev/urandom you can get quite a bit more throughput; on this system it is about 7 MB/s. /dev/urandom supplements its entropy with a PRNG so that it never blocks.
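If you want to check the rate on your own box, a quick sketch (block size and count are arbitrary; dd prints the throughput when it finishes):
$ dd if=/dev/urandom of=/dev/null bs=1M count=64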
Does this mean that if I've got a server with no mouse/keyboard attached, /dev/random will block forever?
Logging into my Slicehost server and running cat /dev/random | hexdump -C seems to support this, more or less: only a few lines get output unless I start typing into the terminal, and then it goes marginally faster.
...among many others.
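On Linux you can also watch the kernel's entropy estimate directly, which makes the blocking behaviour easy to see (a sketch; this is the standard procfs path):
$ cat /proc/sys/kernel/random/entropy_avail
$ watch -n1 cat /proc/sys/kernel/random/entropy_avail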
tsuraan@macbeth ~ $ dd if=/dev/zero of=abigfile bs=1M count=1 seek=1000000
1+0 records in
1+0 records out
1048576 bytes (1.0 MB) copied, 0.00320282 s, 327 MB/s
tsuraan@macbeth ~ $ ls -lh abigfile
-rw-r--r-- 1 tsuraan tsuraan 977G Sep 16 11:18 abigfile
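To confirm that's a sparse file rather than 977G of real data, compare the apparent size with the blocks actually allocated (assuming GNU coreutils):
$ du -h --apparent-size abigfile   # reports the 977G apparent size
$ du -h abigfile                   # reports only the ~1M actually written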
Given how easy it would be to get caught doing this (the HTTPS download makes Chrome claim 40 Mbps over a T1), I think it's more likely this was accidental.
You'd think they would notice the CPU bottleneck on their system, though, as they recompress that file for everybody. That's the only thing that makes me wonder.
They could then upload a very small payload and DoS the server by filling its disk when it tried to decompress it all (back when disks were measured in tens of MB, if you were rich).
Pretty much the opposite of this trick.
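The same idea in miniature (just a sketch, not an attack recipe): a roughly 1 MB gzip payload that inflates to a gigabyte of zeros on the receiving end.
$ dd if=/dev/zero bs=1M count=1024 | gzip -9 > bomb.gz
$ ls -lh bomb.gz    # about 1 MB on disk
$ gzip -l bomb.gz   # uncompressed size is ~1 GiB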
Genius is cheating your customers on a benchmark?
It's borderline criminal.
Maybe the 500MB file was created by an intern who has no clue how compression works, or that the webserver compresses data. Maybe the manager told him to get it done ASAP. Maybe he doesn't care about his job anyway and just solved it the quickest way possible. Maybe he initially uploaded a real file with real data, but then his manager told him to replace it with something that isn't real. Maybe he tried to create a file with random bytes, but there was a bug in his script, all the bytes were set to zero, and he never bothered to check. Maybe they tried downloading the file in their office and didn't realize it was too fast, because they attributed the speed to the fact that it was downloaded over their local network.
Shall I go on? I mean, this kind of stuff happens everywhere every day. There are a lot more stupid people than evil people. Just assume stupidity.
BTW, it appears they have replaced said file with real data:
curl -H "Range: bytes=0-100,100000-100100" https://leap.singlehop.com/speedtest/500megabytefile.tar.gz | hexdump
I think it's pretty doubtful that the main marketing demo of your site is going to use SSL for a lark.
Edit: Fine, down-vote this all you like, but my comment still stands.
Never attribute to malice that which is adequately explained by stupidity.
I have been using 3 of SingleHop's dedicated servers for more than 2 years. They are a small group of people who will pick up the phone in the middle of the night to assist you with your pet projects.