

Developing When Your Computer is too Fast - gsteph22
http://dustin.github.com/2010/11/12/labrea.html

======
dlsspy
I don't believe I communicated clearly enough in the creation of this project.

This is not ``Yay, I don't ever have to worry about any other computer because
I can simulate anything.''

This is me sitting next to my QA guy (who so far understands what I did
better than I do) and having him say, ``Why don't you just make it fast for
the first ten minutes, then slow down socket operations briefly, then switch
to a pattern of speedup and slowdown on disk reads?''

I want to do more fault injection stuff. I'm being asked to do stuff like
manipulate the data (sometimes), or lie about the results of certain
operations.
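The kind of scripted, time-phased slowdown described above can be sketched in ordinary code. This is a hedged Python illustration of the idea only, not labrea's actual API (labrea drives its hooks from Lua via LD_PRELOAD); the names and schedule format are invented:

```python
import functools
import time

def inject_latency(schedule):
    """Toy time-phased fault injector. `schedule` is a list of
    (seconds_since_start, delay_seconds) phases; the last phase whose start
    time has passed determines the delay added before each call."""
    start = time.monotonic()

    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            elapsed = time.monotonic() - start
            delay = 0.0
            for phase_start, phase_delay in schedule:
                if elapsed >= phase_start:
                    delay = phase_delay
            if delay:
                time.sleep(delay)
            return fn(*args, **kwargs)

        return wrapper

    return decorator

# Fast for the first ten minutes, then reads get 250 ms slower for a minute,
# then fast again:
#   @inject_latency([(0, 0.0), (600, 0.25), (660, 0.0)])
#   def socket_read(...): ...
```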

~~~
chunkbot
I understand your motivations better now. Being able to run these ad hoc tests
in a repeatable manner isn't the same as firing up IE6 on a Pentium 4 box with
256MB of RAM.

------
chunkbot
Fun, but the good thing about slow computers is that they're also the
cheapest; it's far more economical to just buy an old box for a couple hundred
dollars (or even cheaper).

~~~
dlsspy
Sure, but then I have to carry around another computer or two when I want to
work within a particular assumption.

Now, I can say, ``What if this box were really slow to respond to network
requests 10% of the time?'' and just do it.
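That ``slow 10% of the time'' behavior is probabilistic fault injection; a hedged Python sketch of the same idea (names are made up here, and labrea itself intercepts the syscalls via LD_PRELOAD rather than wrapping Python functions):

```python
import random
import time

def flaky(fn, slow_prob=0.10, delay=2.0, rng=random.random):
    """Wrap fn so that roughly slow_prob of calls take delay extra seconds.
    The rng hook makes the behavior forceable/repeatable in tests."""
    def wrapper(*args, **kwargs):
        if rng() < slow_prob:
            time.sleep(delay)  # simulate the box responding slowly
        return fn(*args, **kwargs)
    return wrapper
```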

~~~
InclinedPlane
Carry around, repair, keep patched, etc. It's only cheap if your time is
cheap.

------
51Cards
I have been programming since the late '80s, and one thing I learned early
was to always develop (or at least extensively test) on older hardware. My
primary work machine is usually a generation or two old. At present I code on
a T42 Thinkpad, and there is a new T60 waiting to take over in a couple of
months. If you're forced to develop in a "slow" environment, you learn to
optimize your code from the get-go, all the time. You automatically rely on
faster techniques as the norm instead of going back and fixing things later.
As a result, I now frequently see my applications running at clients' offices
and think to myself, "holy crap, that's fast." Another side benefit is that
it keeps all my "fun" applications over on another, more current machine,
separate from work.

~~~
vlisivka
I also started programming in the late '80s, and I use that technique too.
Currently, my netbook is more powerful than my primary laptop. :-)

My laptop can also vary its CPU clock speed from 2 GHz down to 200 MHz, which
makes it easy to run CPU-bound benchmarks of my programs.
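On Linux, that clock scaling is exposed through the cpufreq subsystem. A hedged sketch, assuming `cpupower` is installed (it usually ships in the linux-tools package) and that the sysfs paths and governor names, which vary by kernel and distro, look like the common defaults:

```shell
# Inspect the supported frequency range (Linux cpufreq sysfs interface):
cat /sys/devices/system/cpu/cpu0/cpufreq/cpuinfo_min_freq
cat /sys/devices/system/cpu/cpu0/cpufreq/cpuinfo_max_freq

# Pin the clock low for a worst-case CPU-bound benchmark run (needs root
# and the "userspace" governor):
sudo cpupower frequency-set --governor userspace
sudo cpupower frequency-set --freq 200MHz

# ... run the benchmark ...

# Restore dynamic scaling afterwards:
sudo cpupower frequency-set --governor ondemand
```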

------
w1ntermute
The point he's missing is that you don't have to develop software on the same
computer you test it on. The best solution is to have a separate test computer
that's appropriately slow. In fact, most people probably already have an old
one lying around that they could use.

And regardless of speed, it's good to test your software on a variety of
systems.

~~~
dlsspy
I certainly do test on many different systems, but not all of them.

This doesn't take the place of real-world testing. It takes the place of
setting up a network and putting a modem between my client and server then
attaching a tape drive to my server to see what happens.

So far, it's been quite useful, but I've got a ways to go.

------
jrockway
Another site that rips off the HN comments verbatim?

Hi everybody, I'm doctor copyright-infringement!

~~~
cheald
That's Disqus (YC S07). Its "reactions" piece pulls in comments from sites
like HN, Reddit, and Digg into the comments widget it puts on a given blog
post.

~~~
jrockway
I see. Does HN say somewhere that "anything you submit can be used for
commercial purposes by YC startups"?

~~~
baha_man
Interesting point. I googled for 'who owns copyright blog comments' and found
this (Google cache, the site itself is down):

[http://webcache.googleusercontent.com/search?q=cache:6GfoIFr...](http://webcache.googleusercontent.com/search?q=cache:6GfoIFrkhikJ:www.reasonableman.com/archive/2005/02/who_owns_blog_c.html+who+owns+copyright+blog+comments&cd=1&hl=en&ct=clnk&gl=uk)

I'd like to hear if anyone else agrees or disagrees with this article (it
seems to be a bit of a grey area).

~~~
kmfrk
I think it was when talking to Jonathan from Plagiarism Today
(<http://www.plagiarismtoday.com>) that I was told that forum posts should be
regarded as the authors' own, legal copyright property. I may have some old
notes on this around that I can dig up.

Some forums (and sites like YouTube) write their Terms of Use/EULAs so that
users sign content rights over to the site owners (not for any nefarious
reason), although I doubt that these "agreements" would hold up.

If I had any comments or forum posts that were copied verbatim in a manner
that upset me, I would definitely pursue it legally. But I'm a fanatic like
that.

------
dugmartin
I recommend doing the same thing for web development.

There is a nice Firefox plugin (Firefox Throttle) that lets you throttle
uploads and downloads with a single button click. If, like most web
developers, you do your development on localhost, the plugin lets you see how
your users experience uploading large files or downloading image-rich pages
under network latency.

Unfortunately, it's been pulled from the add-ons library search, but you can
download it directly (for now) using the FTP URL in the comments here:

[http://support.mozilla.com/en-US/questions/755876#answer-107...](http://support.mozilla.com/en-US/questions/755876#answer-107035)
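If the plugin disappears for good, the core idea (capping bytes per second) is easy to approximate yourself. Here's a hedged token-bucket sketch in Python; the names are invented and this is not the plugin's actual mechanism, just the standard rate-limiting technique:

```python
import time

class TokenBucket:
    """Toy bandwidth throttle: permits `rate` bytes/sec with bursts up to
    `burst` bytes. Route each send/recv chunk through consume()."""

    def __init__(self, rate, burst):
        self.rate = float(rate)       # refill rate, bytes per second
        self.capacity = float(burst)  # maximum stored tokens
        self.tokens = float(burst)
        self.last = time.monotonic()

    def consume(self, nbytes):
        """Block until nbytes worth of tokens are available, then spend them."""
        while True:
            now = time.monotonic()
            self.tokens = min(self.capacity,
                              self.tokens + (now - self.last) * self.rate)
            self.last = now
            if self.tokens >= nbytes:
                self.tokens -= nbytes
                return
            time.sleep((nbytes - self.tokens) / self.rate)
```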

------
slewis
Nice work. But be careful about relying entirely on something like this. There
are more than just seek time differences between hard disks and SSDs: they
have different internal cache behavior, different latency variances, queuing
differences (depending on how the devices are configured and how you're using
them) etc.

This seems like it'll get you 80% of what you want. But it'd be useful to have
an actual disk to test on as well.

~~~
jamn
To further this.

I think this is a great idea, but having worked on benchmarking database
structures in the past, I'd be wary of using it for any kind of real
benchmark.

For one, trying to model a real disk gets very complicated very fast. A
disk's access time is a function of position, so injecting random delays
while scanning a large chunk of contiguous data would be unrealistic, and
injecting only a few delays into a random-access-heavy load would be just as
unfair.

In short, trying to model complicated disk latencies is pretty hard, and
usually if you are programming with some model of disk in mind, building a
disk latency simulator under that same model may end up giving you a false
sense of security.

For what it's worth, I'd favor getting a cheap hard disk and trying the load
there instead.
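The position-dependence described above shows up even in a crude model. This Python sketch uses invented constants and is only meant to show why a flat random-delay injector misprices sequential scans relative to random access:

```python
import random

class ToyDiskModel:
    """Crude spinning-disk latency model: the cost of a read depends on how
    far the head travels, so sequential scans are cheap while random access
    pays a seek plus rotational delay. All constants are made up."""

    def __init__(self, seek_ms_per_block=0.00001, rotational_ms=4.0):
        self.pos = 0
        self.seek_ms_per_block = seek_ms_per_block
        self.rotational_ms = rotational_ms

    def read_latency_ms(self, block):
        distance = abs(block - self.pos)
        self.pos = block
        if distance <= 1:  # streaming read: no seek
            return 0.01
        # seek proportional to distance, plus average rotational delay
        return self.seek_ms_per_block * distance + self.rotational_ms / 2

disk = ToyDiskModel()
sequential = sum(disk.read_latency_ms(b) for b in range(1000))
random_access = sum(disk.read_latency_ms(random.randrange(1_000_000))
                    for _ in range(1000))
```

Under this model the random workload costs orders of magnitude more than the sequential one; a simulator that charged both the same uniform delay would give exactly the false sense of security mentioned above.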

------
cheald
I actually have a dev box that's an older Athlon XP with 1 GB of RAM that is
surprisingly useful for finding performance issues. Things that don't show up
on my monster desktop show up -really- fast on the clunker.

I'm a firm believer that developers should have old hardware to test on. :)

------
radioactive21
Wouldn't all this depend on your target customers? If your target customers
are big businesses with fast computers, you're okay.

If your customers are average joes, then yes, I would say you should have a
test environment in place that closely represents those customers.

Even after all that, the question I have is: which set of customers brings in
the most money for you? Let's say you have a mix of big businesses that run
fast machines and average joes that run slow machines. If 90% of your money
comes from big business, why bother trying to make software for the slow
machines? Just slap on a system requirement and it should cover them, so to
speak.

~~~
qqqq2010
Incidentally, as a webdev who targets big businesses, they in fact usually
have -slow- computers. JavaScript in IE 6 is pokey, and typical use cases of
Outlook and Excel don't need much firepower.

An average joe at home, on the other hand, has hardware at least at the
"corporate" level, or well above it for gaming or media-centering.

------
StavrosK
Great work, but I think a library to do the opposite would be more useful.

~~~
whimsy
... to make your computer go faster than it really is?

That WOULD be pretty useful.

Or are you merely talking about libraries that do optimization for you?

~~~
StavrosK
The former!

------
emmelaich
I appreciate this sort of software, since our development and production
hardware has vastly different performance profiles.

Development is on fast Intel single- or dual-core machines with slow disks,
and production is on slow many-core SPARC machines with fast disks.

------
tomjen3
I can see this working, but honestly, wouldn't it make more sense to just use
an older computer? They're typically cheap, and you don't have to suffer when
you're not developing speed-critical code.

~~~
qq66
You wouldn't want to be compiling on an old machine.

------
meemo
What about developing or testing in a virtual machine?

~~~
dlsspy
VMs running on my box might not get a lot of CPU time, but will still get good
IO performance.

It's going a bit beyond just simple performance, too. For example, I can add
logging of reads on client network file descriptors like this:

[https://github.com/dustin/labrea/blob/master/examples/lognet...](https://github.com/dustin/labrea/blob/master/examples/lognetworkreads.lua)
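For readers who don't speak Lua, the gist of that example can be approximated in Python with a delegating wrapper. This is a toy analogue only; labrea actually intercepts the libc calls via LD_PRELOAD rather than wrapping socket objects:

```python
import socket

class LoggingSocket:
    """Wrap a socket so every recv() is logged with the fd and byte count,
    passing all other attributes through to the real socket."""

    def __init__(self, sock):
        self._sock = sock

    def recv(self, bufsize, flags=0):
        data = self._sock.recv(bufsize, flags)
        print(f"fd {self._sock.fileno()}: read {len(data)} bytes")
        return data

    def __getattr__(self, name):
        # Delegate everything except recv() to the wrapped socket.
        return getattr(self._sock, name)
```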

