* Time to build/compile
* Resource cost at execution time
* Clock time to deliver from requirements gathering to feature completion
* Developer time to complete an effort in money
* Defect quantity
* Defect severity
* Training/learning time to ramp up a new technology or feature
* Cost to produce documentation. Industry experts are amazed that the $84 million cost to produce the Air Force One documentation is actually remarkably low
The bottom line is that measures are a form of evidence, which is a defense against arguments from bullshit. Developers typically measure absolutely nothing, so when it comes to software performance they tend to invent bullshit assumptions on the spot. They are wrong about 80% of the time, and when they are wrong, it is usually by several orders of magnitude.
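As a minimal illustration of measuring instead of assuming, here is a sketch in Python using the standard `timeit` module. The two functions compared are hypothetical stand-ins for any pair of competing implementations; the point is that a few lines of measurement replace a guessed performance claim with evidence:

```python
import timeit

# Two hypothetical implementations of the same task: build a long string.
def concat_loop(n=10_000):
    s = ""
    for _ in range(n):
        s += "x"
    return s

def join_once(n=10_000):
    return "".join("x" for _ in range(n))

# Measure, don't guess: timeit calls each function repeatedly and
# returns the total elapsed wall-clock time in seconds.
loop_time = timeit.timeit(concat_loop, number=100)
join_time = timeit.timeit(join_once, number=100)

print(f"concat_loop: {loop_time:.4f}s  join_once: {join_time:.4f}s")
```

Running a measurement like this before a performance debate turns "I'm pretty sure X is faster" into a number that anyone can reproduce or refute.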