
Ask HN: KPIs for a Development Team? - klenwell
I was tasked this week by executive management to come up with annual KPIs (Key Performance Indicators) for our development team by the end of the month. They also want individual developers to establish their own personal KPIs.

Do you have any suggestions for sound, useful KPIs for a team of web application developers?

I'm thinking about things like the code quality metrics you see on GitHub repos. At the same time, I have some reservations. Our team has worked successfully to really sharpen our development process and improve the delivery and quality of our products. I worry that this is going to interfere with that and (to cite a new term I recently came across) run into Goodhart's law:

https://en.wikipedia.org/wiki/Goodhart%27s_law

I also have this recent discussion fresh in my mind:

https://news.ycombinator.com/item?id=19035438

I'm prepared to push back against the initiative. But I thought I should give the idea a fair hearing. Links, suggestions, and arguments against the whole idea are welcome. Thanks!
======
ljf
Cycle time is a hugely useful metric - how long does it take the team to go
from idea to the user being able to actively use your change.

This can be bracketed if needed - if you have issues with product, you can
take it from the time the team picks the work up to when it is delivered. If
you are bound by a release cycle, you can take it until the code is packaged
up ready to release - but both of those brackets hide inefficient parts of
the flow.
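As a rough sketch of what tracking this might look like (the field names,
dates, and bracket points here are invented for illustration, not from any
real tracker):

```python
from datetime import datetime

# Hypothetical tickets with a timestamp for each stage of the flow.
tickets = [
    {"idea": "2024-01-02", "picked_up": "2024-01-10",
     "code_ready": "2024-01-17", "live": "2024-02-01"},
    {"idea": "2024-01-05", "picked_up": "2024-01-08",
     "code_ready": "2024-01-12", "live": "2024-01-30"},
]

def days_between(start, end):
    fmt = "%Y-%m-%d"
    return (datetime.strptime(end, fmt) - datetime.strptime(start, fmt)).days

def avg_cycle_time(tickets, start_field, end_field):
    # Average elapsed days between any two stages across all tickets.
    spans = [days_between(t[start_field], t[end_field]) for t in tickets]
    return sum(spans) / len(spans)

# Full cycle: idea to live - hides nothing.
print(avg_cycle_time(tickets, "idea", "live"))          # 27.5 days
# Bracketed: pick-up to code-ready - hides upstream and release delays.
print(avg_cycle_time(tickets, "picked_up", "code_ready"))  # 5.5 days
```

The gap between the two numbers is exactly the inefficiency the brackets
would otherwise hide.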

I'd steer away from developer commits etc. and look more at other measures
of quality - how many bugs you have (live bugs) in your various priority
types and how long their average age is.
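A minimal sketch of that bug metric - the priority labels and dates are
made up for the example:

```python
from datetime import date

# Illustrative live-bug records.
bugs = [
    {"priority": "P1", "opened": date(2024, 3, 1)},
    {"priority": "P2", "opened": date(2024, 1, 15)},
    {"priority": "P2", "opened": date(2024, 2, 15)},
]

def bug_stats(bugs, today):
    # Group open-bug ages (in days) by priority bucket.
    by_priority = {}
    for bug in bugs:
        age = (today - bug["opened"]).days
        by_priority.setdefault(bug["priority"], []).append(age)
    # Report the count and mean age for each bucket.
    return {p: {"open": len(ages), "avg_age_days": sum(ages) / len(ages)}
            for p, ages in by_priority.items()}

print(bug_stats(bugs, date(2024, 4, 1)))
```

Watching count and average age per priority over time tells you whether
quality debt is growing or shrinking.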

Some starters for ten.

~~~
williawmgant
I've struggled with this as well. While open bugs (and length of time open)
seems like a useful metric, that would be a problem if management regularly
tells development not to work on bugs (prioritizing new features instead). I
suppose it could work if you had some way of marking a bug as "blocked by
management" or something.

Cycle time measurement would be wonderful. But that would seem to require 1) a
reliable and fast build/QA/Release pipeline and 2) either consistent-sized
"chunks" of work or a weighting factor. We're working on getting the former
right, and the latter seems like it could be gamed in either direction.
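One form the weighting factor could take - and this is just a sketch, with
invented sizes and times, that inherits the gaming problem of whatever
estimate you divide by - is normalizing each item's cycle time by its
estimated size:

```python
# Hypothetical work items: elapsed cycle time and an estimated size
# (story points here, but any size estimate would do).
items = [
    {"cycle_days": 14, "points": 8},
    {"cycle_days": 2,  "points": 1},
    {"cycle_days": 6,  "points": 3},
]

def weighted_cycle_time(items):
    # Elapsed days per point of estimated work, so a two-week 8-point
    # epic and a two-day 1-point fix become roughly comparable.
    return sum(i["cycle_days"] for i in items) / sum(i["points"] for i in items)

print(weighted_cycle_time(items))  # 22 days over 12 points
```

The caveat stands: inflate the estimates and the metric improves without
anything actually getting faster.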

Another issue that comes up is how do you deal with KPIs impacting salary when
your environment trends towards chaos? For instance, if you have a big client
who must be pleased - should they find something that displeases them, it
could easily pull people off of other work until they are happy again. If KPIs
are based on cycle time, now developers are being penalized simply because the
client exists and has more leverage than the development team.

I'm wondering whether KPIs are the answer at all, rather than simply having
rewards at milestones.

One concern I have with a KPI-based approach is how to deal with efficiency
improvements. Say the team figures out how to double the amount of work they
can reasonably do in a sprint (for example). Now that's the new baseline, so
now it becomes harder to improve one's personal KPIs to stand out. Done badly,
the team ends up caught under a productivity asymptote and feels less
motivated rather than more.

Another concern with individual KPIs is how they can sometimes cause the team
not to work together. A senior dev under a bad KPI system might be
incentivized not to mentor and support the junior devs, because doing so
damages their KPIs in favor of the juniors.

Team-based KPIs can lead to resentment too, as some team members won't carry
their weight and the rest of the team will resent them for it.

I've really been thinking about this a lot lately and the harder I look, the
fewer answers I have that might be correct. Thank you for responding and
thanks to OP for asking the original question - it comes at a perfect time for
me.

~~~
ljf
As I allude to above - if your release cycle etc. isn't great today, then
today is a great time to start tracking cycle time. Once you can show people
how bad things are today ('look, idea to dev-complete is 2 weeks, but idea
to live is 2 months'), you can also show how the small changes you make to
the release process actually improve things.

Also don't forget - try not to measure the 'amount of work'. Instead, focus
on the amount of value released. 10x developers don't do 10 times as much
work/code/releases - they figure out and engage in what they can do to make
the company the most money and the most benefit *with the least* work.

Focus your team on the problem at the heart of the user story, and not on the
solution too early.

~~~
klenwell
The distinction you mention here (idea-to-dev vs idea-to-live) could be really
useful for us as we've been debating two paths for an MVP that we're currently
working on.

It's actually the management team that has been putting off release in favor
of a more fully-formed MVP, whereas my team is eager to get a more minimal
working version (that we're pretty sure our users would appreciate and is
ready for production) into our users' hands.

One of my big concerns is that if I devote my time to focusing on figuring out
these metrics (which still feel somewhat academic for our team at this point)
then I'm going to be pulled away from more pressing concerns like reviewing
PRs, providing technical guidance for the team, and simply working to get shit
(or, rather, software of acceptable quality) out the door.

~~~
ljf
I'd argue that you are in a 'strong' place if the mgmt is putting off releases
- not the team.

If you can be in a 'can we release yet, can we release yet' frame, rather
than a 'when will this be ready to release?' frame, then you have a strong
feature/value-focused team. Again, measuring the delays that are caused by
mgmt, rather than by you and your team, is really valuable.

And as for the time to sort out these metrics - lean on Jira etc. to create
these reports automatically. Metrics and reporting should be as easy, as
simple, and as valuable as possible, or people simply won't track them - or
care about reviewing them.

