

Ask HN: How do you ensure your web app goes live without a major bug? - benhurds

I have not worked at many companies, so I am a bit in the dark about how companies typically test their software (I am specifically interested in web applications).

From talking to friends at mid-size or larger companies, they seem to rely heavily on manual testing by dedicated QA teams. I was surprised to hear from many of them that automated testing is virtually nonexistent, or exists only as a supplement to the manual verification steps.

How do smaller companies ensure there are no show-stopper bugs when they push code live? Do they rely solely on automated testing, or do even the small guys typically have people doing manual QA?
======
arctangent
1. Ensure the requirements are documented in sufficient detail that the
actual problem to be solved (or feature to be implemented) is crystal clear to
both the customer and the developer.

2. Write automated tests. You can do this up front (TDD) or afterwards
(different people will tell you different things). The idea is that you can
run the tests in an automated way to quickly detect deviations from the
desired behaviour. It also helps you spot if changing code in one place breaks
code in another place.
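
As a tiny illustration of what an automated test buys you (the function and the
values here are invented, not from any particular app):

```ruby
require "minitest/autorun"

# Hypothetical function under test: a price formatter from the app.
def format_price(cents)
  "$%.2f" % (cents / 100.0)
end

# These tests run in seconds. If a later change to format_price breaks
# rounding or formatting, the suite catches the deviation immediately,
# including breakage caused by changes made somewhere else entirely.
class FormatPriceTest < Minitest::Test
  def test_whole_dollars
    assert_equal "$12.00", format_price(1200)
  end

  def test_sub_dollar_amounts
    assert_equal "$0.99", format_price(99)
  end
end
```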

3. Test the application functionality manually to ensure that you (the
developer) think the functionality is behaving as described in the
requirements document.

4. Deploy the application to a testing environment, ask the customer to test
it and (when they are happy) sign off on the functionality.

5. Deploy the code live.

6. Wait for the inevitable phone calls telling you that something the
customer wants (but which was most definitely not in the requirements
document) is apparently not there ;-)

------
jwedgwood
Well-written automated tests accrue value over time, saving you manual
testing time (and increasing your confidence) with each release.

Much depends on your situation at launch, but I think many teams
overestimate how much testing they need, when in fact their bigger problem
is not having enough customers, or not engaging the customers they have
with a decent product.

If you have time, my suggestion would be to focus on a few items:

1) Notification tools to help you know when something goes wrong (e.g. email
to the dev team). If you are using ruby, then <http://hoptoadapp.com> is a
great tool, but if you can't find a tool, write your own. It's worth it.
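
A roll-your-own notifier can be as small as this sketch (the addresses and
SMTP host are placeholders; adapt them to your mail setup):

```ruby
require "net/smtp"

# Hypothetical settings: where reports go and which relay sends them.
DEV_TEAM  = "dev-team@example.com"
SMTP_HOST = "localhost"

# Turn an exception into a plain-text email report.
def build_report(error, env = {})
  <<~REPORT
    Subject: [ERROR] #{error.class}: #{error.message}

    #{error.backtrace ? error.backtrace.first(10).join("\n") : "(no backtrace)"}

    Request: #{env["REQUEST_URI"]}
  REPORT
end

def notify(error, env = {})
  Net::SMTP.start(SMTP_HOST) do |smtp|
    smtp.send_message(build_report(error, env), "app@example.com", DEV_TEAM)
  end
rescue StandardError
  # Never let the notifier itself take the app down.
end

# Wrap risky work so failures are reported but still re-raised.
def with_notification
  yield
rescue StandardError => e
  notify(e)
  raise
end
```

Hoptoad does the same job with much nicer grouping and a web UI, but even a
crude version like this beats finding out about exceptions from your users.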

2) Monitoring tools to know if the site is down (e.g. pingdom)
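
If you want something self-hosted alongside pingdom, a check can be one small
script run from cron on a machine outside your hosting provider (the URL and
timeout are illustrative):

```ruby
require "net/http"
require "uri"

# Fetch the site's health URL and report whether it answered 2xx in time.
def site_up?(url, timeout: 5)
  uri = URI(url)
  res = Net::HTTP.start(uri.host, uri.port,
                        use_ssl: uri.scheme == "https",
                        open_timeout: timeout, read_timeout: timeout) do |http|
    http.get(uri.request_uri)
  end
  res.is_a?(Net::HTTPSuccess)
rescue StandardError
  false  # timeouts, DNS failures, refused connections all count as "down"
end
```

Run it every minute and pipe failures into whatever notification channel you
set up in point 1.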

3) Figure out how to cleanly do site updates with zero downtime. This will let
you fix the inevitable bugs without inconveniencing users who haven't hit the
bug yet.
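
One common way to get there is the symlink-swap pattern Capistrano
popularised: build each release in its own directory, then atomically repoint
a `current` symlink, so rolling back is just pointing the link at the previous
release. A sketch (paths are illustrative):

```ruby
require "fileutils"

# Deploy by building into releases/<timestamp> and flipping the "current"
# symlink. rename(2) over an existing symlink is atomic on POSIX, so
# requests always see either the old release or the new one, never a
# half-copied tree.
def deploy(app_root, build_release)
  release = File.join(app_root, "releases", Time.now.strftime("%Y%m%d%H%M%S"))
  FileUtils.mkdir_p(release)
  build_release.call(release)  # copy code, install deps, etc.

  tmp_link = File.join(app_root, "current.tmp")
  FileUtils.ln_s(release, tmp_link, force: true)
  File.rename(tmp_link, File.join(app_root, "current"))  # atomic flip
  release
end
```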

4) Write automated tests for the things that you think really can screw your
users the most.

5) Develop a culture in which, when you find a bug in production, you write a
test that reproduces it, then fix the bug, then use the test to validate the
fix.
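
For example (the bug and the helper here are invented), the
reproduce-then-fix cycle looks like:

```ruby
require "minitest/autorun"

# Hypothetical production bug report: the slug helper crashed on nil titles.
# The fix below (the `|| ""` guard) was written after the reproducing test.
def slugify(title)
  (title || "").downcase.strip.gsub(/\s+/, "-")
end

class SlugifyRegressionTest < Minitest::Test
  # Step 1: reproduce the report. This test failed against the old code;
  # now it guards against the bug ever coming back.
  def test_nil_title_no_longer_crashes
    assert_equal "", slugify(nil)
  end

  def test_normal_title
    assert_equal "hello-world", slugify("Hello  World")
  end
end
```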

My two cents.

------
spooneybarger
I've worked at places where we've done both.

Manual testing is quicker to get started with, but pretty quickly the time
you put into it greatly surpasses what you would have spent on automated
tests.

