

On testers and testing - sriramk
http://sriramk.com/blog/2012/01/testing.html

======
wilhelm
“Most product teams don’t need a separate testing role”, quoth the author. And
he may be right. Most products don't need to be of high quality. Most products
have few moving parts and few things that can go wrong. If your product is a
web application, you can fix stuff quickly once your users start complaining,
too.

But then there is the kind of software that must not break, ever. The kind of
software that is so complex that every time a developer touches it, he is
bound to break something, somewhere, because no human can keep all that
complexity in his head.

Your operating system falls into that category. Your web browser. The machines
that bring humans to the Moon, keep you alive in the hospital, or keep your car
on the road. You know, the difficult stuff.

I've spent a few years as test manager for a web browser engine you may have
heard of. The team consisted of some of the best developers I've ever met.
Razor sharp guys. But despite their brilliance: for every bug fix they did,
there was a 30% chance of them breaking something. In the most complex parts
of the layout code, that number was closer to 50%.

50%!

Having fewer than one tester per two developers on that particular project
would be madness. And I'm not talking about outsourced monkeys pushing random
buttons ten time zones away. I'm talking about people more evil than the devil
himself – able to conjure up the kind of tests that will rip your software to
pieces in ways you could never imagine. I'm talking about proper test
engineers that can automate away all that boring shit humans don't want to do.

A sparring match between a great developer and a great test engineer is truly
a thing of beauty. And in the end, both of them win.
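The adversarial testing described above can be sketched as a tiny randomized fuzzer. Everything here is hypothetical for illustration: `truncate` is a deliberately buggy stand-in function, and the invariant check is the kind of property a test engineer would automate.

```python
import random

def truncate(s, limit):
    # Hypothetical buggy helper: meant to cap a string at `limit`
    # characters, but it mishandles the limit == 0 edge case.
    if not limit:
        return s  # bug: should return ""
    return s[:limit]

def fuzz(rounds=1000):
    """Throw random inputs at truncate() and check its one invariant:
    the output never exceeds `limit` characters."""
    random.seed(0)  # fixed seed so failures are reproducible
    failures = []
    for _ in range(rounds):
        s = "".join(random.choice("ab ") for _ in range(random.randrange(5)))
        limit = random.randrange(3)
        out = truncate(s, limit)
        if len(out) > limit:
            failures.append((s, limit, out))
    return failures
```

A run of `fuzz()` returns the failing inputs it found, which a happy-path unit test (one call with a "normal" string) would never hit; real test teams scale the same idea up with coverage-guided fuzzers and property-based frameworks.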

~~~
patrickyeon
I interpreted points one and two as being against the culture of "oh well, let
testing catch the bugs". He's not saying that there's no need for testing,
just that the developers shouldn't count on some other department to make sure
their code works. I've only had a very short career so far, but I've already
seen the people who resist any review, make it as hard as they can for
testers, get upset with testing when they find bugs, or just throw stuff at
the wall to see what sticks. Even worse: a lot of this was in hardware design!

When you need to go back and forth on a bug four or five times, with the
programmer claiming each time that it was fixed only for it to fail an old test
case, a test group won't fix the problem; programmers taking responsibility for
ensuring their own code is correct will.
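One way to break that fix-it-again cycle is to pin every reported bug with a regression test that stays in the suite, so a "fixed" claim is checked automatically on the next round-trip. A minimal sketch, where `parse_price` and its thousands-separator bug are hypothetical:

```python
import unittest

def parse_price(text):
    """Parse a price string like '$1,299.95' into integer cents.

    Hypothetical function: suppose an earlier version crashed on
    inputs with thousands separators -- the bug pinned below.
    """
    cleaned = text.strip().lstrip("$").replace(",", "")
    dollars, _, cents = cleaned.partition(".")
    return int(dollars) * 100 + int(cents or 0)

class TestParsePriceRegressions(unittest.TestCase):
    # Each reported bug becomes a permanent test case, so the fix
    # can't silently regress the next time someone touches the code.
    def test_thousands_separator(self):
        self.assertEqual(parse_price("$1,299.95"), 129995)

    def test_plain_dollars(self):
        self.assertEqual(parse_price("$5"), 500)

if __name__ == "__main__":
    unittest.main()
```

The point is the workflow, not the parser: a bug report isn't closed until its reproduction is encoded as a test, so "it's fixed now" is verified by the suite rather than asserted by the programmer.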

> A sparring match between a great developer and a great test engineer is
> truly a thing of beauty. And in the end, both of them win.

Absolutely. Unfortunately, if the tester is not up to snuff, they're no help
against a great developer, and a great tester against a sub-par (or worse,
insecure) developer is nothing but trouble, no matter who's right.

------
cobralibre
Note that Evan Priestley's post on Facebook and testing, linked in the
article, ends with the statement "This process works for Facebook partly
because Facebook does not, by and large, need to produce particularly high-
quality software."

~~~
sriramk
Though Evan's post made me write this, I'm actually talking about different
aspects to testing than he is.

But to respond to you - neither do a lot of others (need high-quality software,
that is). Agility vs. quality is a continuum. At one end, you can instantly
deploy any piece of code to prod as soon as it is written. At the other end, you test
code for several years to make sure it is rock solid (stuff that goes into
nuclear reactors). Large organizations move more to the right than they need
to. Sometimes it makes more sense to optimize for bandwidth of code over
quality (I believe Zuckerberg said this first).

~~~
jrockway
This is not what I've seen. Large organizations are resistant to change, but
it doesn't mean they test anything. I used to work for Bank of America.
bankofamerica.com was down for over a week, despite a change control policy
that basically prevents code from ever going into production. Big
organizations like policies, but the people that make the policies do not have
the knowledge required to implement or enforce them. And when they do, they
don't work anyway; you can wish for perfect software, but when you hire dumb
people to write it for you, it's going to break.

Big companies don't want downtime, but they hire non-programmers to make this
happen. The result is downtime.

~~~
absconditus
I work on medical software for a large bureaucratic company. We have a large
QA department and more processes and sign-offs than I have ever seen before.
The software is the lowest quality software that I have ever worked on.

~~~
bmajz
There are a bunch of different issues to be parsed out here: First and
foremost, I think what this comes down to is understanding the priorities for
your particular organization. There are some places where a high test:dev
ratio may be critical e.g. the core pieces of a new OS or firmware for factory
robots. For things like that, there is a really high cost to rolling out
updates and that cost can be mitigated by paying for a second look at quality
up front. For a consumer-facing web app, the cost of making a fix is often
pretty low and can be mitigated by other means e.g. limited rollouts or beta
previews. I think the real problem is treating various pieces of software in
the same way, when in fact the fundamentals of the model are quite different.
This also applies to using the same metrics (be that coverage, test pass time
etc.) on each project. You have to find what works for you and not assume that
the problem can be solved the same way.

Second, it's not fair to go easy on the quality of test/QA folks. This is
where a lot of organizations that actually invest in dedicated testers fail.
Testers must be held to a similar bar as developers, just with a different
functional role. As others have pointed out, the imbalance caused by widely
varying talent levels is quite disastrous for organizational health. It's
the same as any other piece of hiring. Make sure you have the right person for
the job. That almost always means someone who knows
computers/code/workflow/users at the same level as developers or product
managers.

Finally, as a former Windows engineer, I don't agree with the narrative that
the Vista fiasco was a result of over-automation and not enough end-user
scenario testing. It was more a problem of poor processes that couldn't handle
the scale of the team and a lack of clear planning/leadership to account for
that. Windows 7 had, if anything, more automation and more testers than Vista
and was obviously a much better product. A lot of things have to go wrong to
get something as ugly as the Longhorn Reset
(<http://en.wikipedia.org/wiki/Development_of_Windows_Vista#Mid-2004_to_Mid-2005:_Development_.22reset.22>)
and the one that OP pointed out was likely one of many.

------
forrestthewoods
This issue with testers in my experience is that anyone who is a really,
really good tester has the skills and determination to very quickly move up
and out of testing.

~~~
johnwatson11218
I really hope this changes in the future. I don't think testing gets nearly
the respect that it deserves. I have thought about making the move from dev to
test but I know that there are very few organizations that give the same level
of respect to the test team as they do to the development team.

~~~
absconditus
Google's test engineers are pretty impressive. I am convinced that they are
better developers than the average developer at most companies.

~~~
bmajz
It's all about the organization valuing one's work. If testers feel their job
is important and the organization rewards them commensurate to their impact, I
don't see why a good tester would want to leave. There's already so few of
them.

------
zalew
> _Inspired by this post on Facebook’s testing_ <http://www.quora.com/Is-it-true-that-Facebook-has-no-testers>

and in this post _"Ex-Facebook employees have some privileged channels they
can use to report issues; I personally report around 13,000 bugs per month"_

huh? That's about 18 bugs per hour on average. Is it saying that the guy on
Quora works on their bugs even though he's not an employee, or am I missing
something??

~~~
anothermachine
It's a humorous exaggeration.

~~~
zalew
oh...

------
ramkalari
This approach may work in a best-case scenario when you have great
programmers. However, there are very few companies in the world that have the
kind of developers that worked for NT. What would you recommend for the
average case?

------
gph
The second anyone mentions the 'bozo bit' I immediately stop listening to them
on principle.

~~~
sriramk
Why? Seems like a fair-enough term.

~~~
gph
It was meant as an ironic joke, I wasn't serious. Re-reading my comment I
guess it doesn't come across as a joke, should have put a smiley :)

~~~
jmitcheson
This place is tough sometimes: if you are obviously joking you'll get
downvoted; if you are subtly joking you are often misunderstood!

