Personally, I actually like OKRs at the organization or large sub-org level. I think they are great at getting everyone moving in the same direction and letting everyone know what matters. My problem is that as you drill down into an organization you need to switch from the "Why are we doing this?" and "What are we measuring?" questions and instead ask "Exactly how are we going to do this?", which doesn't really fit into the OKR structure. Personally, I find that once you get down to a group of about five people or fewer, creating a list of tasks is far more effective than OKRs.
Just filling this out makes it clear that most of the company is about bullshitting each other.
The hilarious thing where I work is that at the end of each week I have to manually enter the hours I worked, and I have N charge numbers to split those hours across. If you read the line item for each charge number, it's basically a synopsis of one or more OKRs.
All this to say, from an engineer's point of view it seems with just a few tweaks in JIRA and Gitlab, review input could be created by running a report on commit history over the date range the review covers.
Given the choice between giving a developer or designer a visionary KR of improving x or y by z% and giving them a project, milestone, or task to finish, in many cases the latter keeps them focused rather than leaving them mulling over ideas for the % KR.
> Key Result #1 – Lifetime Customer value increase from $N to $N+5
> Key Result #2 – Decrease Customer Churn Rate by 10%
I'm a developer on a team. How am I supposed to know why customers are churning? I'm three levels removed from talking to customers when they cancel. I don't have a deep relationship to know how to add value to them.
Someone needs to do the research, analysis, and leg work of finding potential areas to exploit. And then someone needs to have at least some amount of vision or inspiration about how to solve that problem. Or do whatever trendy new "design sprint feedback loop" brainstorming session someone is tweeting about. This is the kind of stuff I hear designers and "product people" talking about wanting to do.
If you want to do that work and then loop in the engineering team to talk about feasibility and planning that seems fine, but the last time I worked in an environment with OKR no one seemed to have any brilliant ideas about how to, you know, actually achieve the Key Results.
(Until these questions can be satisfactorily answered to developers, OKRs are going to be perceived as a buzzword, flavor of the month process air-dropped in because someone saw a blog post about it)
- Board to CTO : Increase Retention (decrease churn)
- CTO to DevLeads:
* Build a daily report emailed to the board showing 30 day moving average of churn
* Build a business event log - every time a user does something on the system, log it to an easily queryable system (customer signs in, customer raises an invoice, or customer deletes a widget). This can get very deep very fast. Start with Graphite / Carbon and get more sophisticated later.
* Hire a data scientist / convert a DBA into one and find correlations between customers who churn and events in the log. "Not logged in for 30 days" looks like a good start.
* Write split testing into the (SaaS) app such that we can randomly segment customers at risk of churning (indicated by event activity) and see if we can keep them
* Also add in "have you tried this feature?" emails, or "holy crap, what do you mean the invoice page does not align right anymore?"
All of these things (for a SaaS app) are doable projects for any development team.
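As a sketch of what the event-log bullet above might look like in practice (the schema, customer names, and dates here are entirely made up; the thread only calls for "an easily queryable system"), here is a minimal version using SQLite that also answers the "not logged in for 30 days" question:

```python
import sqlite3
from datetime import date, timedelta

# Hypothetical minimal business-event log: one row per customer action.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE events (
        customer_id TEXT NOT NULL,
        event_type  TEXT NOT NULL,  -- e.g. 'sign_in', 'raise_invoice', 'delete_widget'
        occurred_on TEXT NOT NULL   -- ISO date, so string comparison sorts correctly
    )
""")

today = date(2024, 6, 1)
rows = [
    ("alice", "sign_in",       (today - timedelta(days=2)).isoformat()),
    ("bob",   "sign_in",       (today - timedelta(days=45)).isoformat()),
    ("carol", "raise_invoice", (today - timedelta(days=10)).isoformat()),
]
conn.executemany("INSERT INTO events VALUES (?, ?, ?)", rows)

# "Not logged in for 30 days" as an at-risk-of-churn signal: last sign-in
# is older than the cutoff, or the customer has never signed in at all.
cutoff = (today - timedelta(days=30)).isoformat()
at_risk = [
    cid for (cid,) in conn.execute("""
        SELECT customer_id
        FROM events
        GROUP BY customer_id
        HAVING MAX(CASE WHEN event_type = 'sign_in' THEN occurred_on END) < ?
            OR MAX(CASE WHEN event_type = 'sign_in' THEN occurred_on END) IS NULL
        ORDER BY customer_id
    """, (cutoff,))
]
print(at_risk)  # ['bob', 'carol']
```

From the same table you can roll up daily churn-signal counts for the "report emailed to the board" bullet; swapping the in-memory database for a real one is the only structural change.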
This is of course based on the idea that the Board has told the CTO "fix this thing as top priority". If they have not that's their problem. The CTO should then go to the board and say "I am going to fix this thing as top priority"
Then we start the fun job of actually monitoring what developers do work on compared to what we planned to work on. Most times priorities change, legacy weighs us down, and friction burns us.
I've been in the situation where the metrics were used at a detailed level. Net Promoter Score (NPS) was used by the company, and I tried, somewhat successfully, somewhat unsuccessfully, to use OKRs at the team level. I agree with everything that has been said about using these to set the overall direction and strategy at a high/medium level, but down on the front lines it's very hard to do anything that will move the needle in the right direction in a clear cause-and-effect way.
If you try to map tasks to strategic goals then you just end up window-dressing everything to keep management happy, and since the components of goal X are many and varied, your chances of success are limited.
If you don't understand how your daily work is related to management's priorities then probability is high that you are going to do a lot of work that isn't valuable to the organization.
So, either it should be a trivial exercise to rationalize how a task is related to the objective, in which case this is nothing more than a small overhead of working in an organization. Or if you find you have to make leaps of faith to make the connection to the objective then that's a signal that you need to consider why the task needs to be done at all.
Well yes. I am just posting that it is possible to take a top level metric and build a backlog that represents sensible solutions to that metric.
But you first need
- a top level metric (preferably that measures what will make or break your business)
- a way to determine what things under your control drive that metric.
But yes, the things you do in the trenches will probably not move the needle far. That is for two reasons
- at some point the code base is so big that doing "one thing" won't make an impact (I think this is around the 100k SLOC level, which is still fairly small)
- and even if you can affect the whole code base, the code is at the bottom of an inverted pyramid of "leverage to affect the business": the CEO can change the business far, far more by deciding to triple the price tomorrow than any bugs you fix.
But yes - in the end, if you have a working product right now, the best thing to do is to go find real customers, work out why they are upset (either with clever telemetry analysis or just fricking ask), and go fix that bug / missing feature.
If you don't have a working product there is no telemetry so ... fricking ask.
But "find what's not working and fix it" is a good plan. If what's not working, however, is "the business model", we are in interesting territory.
I think a non-working business model is exactly the purview of software. I think that we shall replace all non-coding business people with coders who can business within a generation. But this generation will see real opportunities.
I'm always amused by where one job ends and another begins. It seems strange to me that every web software company in the world is looking for "full stack engineers" -- you need to be an expert at everything from CPU instructions up to the CSS3 color module and its implementation in the browsers that our users are using -- but doing research and analysis of user behavior is off-limits. That's where we're drawing the line?
When you're a salesperson, you don't have to agonize over how to illustrate that you've "Decreased Customer Churn Rate by 10%". When you're an assembly line worker, you don't have to find a way to figure out how you contributed to "produce 15% more widgets". You either accomplish these things or you don't - they are literally descriptions of your performance in your job. Either way, nobody cares how you did (or didn't) do it.
When you're a developer, how are you supposed to show that you've helped "increase customer value from $N to $N+5"? Because "shipped the new version of the message queuing system" is not in anyone's list of key results.
OKRs feel like "organized SMART goals", and so I have the same criticism of them as I do of SMART goals: they're just another way of conceptualizing goals around things that are already easily and directly measured. No one has made any progress in quantifying the contributions of roles that don't have direct percentage impacts on dollars earned or dollars saved.
Which is not to say the OKR system doesn't still have issues, they just look more like the ones discussed in the OP.
Edit: just to note that I have some other criticisms of OKRs, but having them be way too high level, broad, and not actionable should not be the problem.
The problem I encountered was having (or the notion of wanting) a product roadmap produced from stakeholder, C-level, product, and IT team input in parallel with OKRs. I feel this is an anti-pattern. Either you have OKRs and they make your quarterly roadmap, or you don't apply OKRs to the producing team at all.
It doesn’t have to be perfect, but most teams would have a lot more impact if they spent some of their time getting at least very rough estimates of the impact of their current and future projects.
Maybe the new system requires 75% of the servers the previous one did, leading to increased revenue. Maybe it’s quicker, resulting in customers experiencing better service, and so churn is reduced from the pool of people who said “I like it, but it’s too slow”.
A queueing system in itself isn’t of any value to the business as a whole.
If you're an average developer at a non-startup you've likely been asked to put together a queuing system. You didn't decide to do that work, and you probably shouldn't be spending weeks trying to gather the information you might need to justify that work.
Your job wasn't to figure out that 13.2% of your customers opt out of your paid reporting services because they experience slow responses and unrecorded data.
Your job is usually much closer to -
PM - Can we make this service faster? We're losing customers on this feature because it's slow.
TeamLead - Probably, we can re-implement our queuing to be quicker and more reliable.
Dev - Ok, I'll investigate [x] queuing library or service
Then you should be spending your limited time and energy on actually producing that result. The technical task is usually quite complicated (ex: here's just the table of contents for RabbitMQ https://www.rabbitmq.com/documentation.html).
The justification for the work wasn't your job to put together (although asking sane questions is usually a good call). Your justification was simply "My PM/TeamLead asked me for it".
I sure as hell don't want every junior dev on my team going out and trying to tease out the intrinsic business value of every task I give them.
That's a waste of my resources. That doubles up the effort that I already expect my PM to be doing. That leads to disagreements about priorities when those junior devs don't have the context about why a business decision was made and either infer it incorrectly, or spend lots of time asking when it really just doesn't impact them all that much.
Determining the business value of your team’s various goals should be your PM’s responsibility, with input and help from your team.
In your example interaction, the only missing piece is a more specific impact estimate. Rather than “We’re losing customers on this feature because it’s slow.”, you’d want your PM to say “If this feature was X% faster, we estimate that it would reduce churn by Y% per quarter, which is worth approximately $Z/quarter to the business.” Your team can then estimate eng cost to make that improvement, and see where the benefit/cost ratio falls relative to the other things you can be working on.
I hope no one is expecting every individual developer to know the numbers on churn, but I do think it's important that someone on the eng team knows that number. In my experience, some combination of the product manager and the tech lead for the team should have some insight into how the changes the engineers are making affect the customer.
A lot fewer engineers talk about how they like to listen to business people.
But I'm someone who will learn enough Node/React/Python to understand what can, and can't, be done.
OKRs are designed to exist in levels, “trickling down” in a way that narrows them down more and more as they reach specific teams and individuals.
A goal such as “Increase customer retention” could be a legitimate higher-level objective. From key results on that objective, we get objectives for specific teams working on various aspects of the product.
(For example, “Reduce churn rate by X” likely isn’t going to be just about engineering; copywriters and others may be involved. Down the line at some point there may be an engineer’s personal key result such as “launch customer feedback collection system by X date”, or something else specific to circumstances.)
I believe that wholeheartedly adopting OKRs in this multi-level fashion is helpful even to companies with just a few employees, and how objectives and results are translated across levels is a good measure of management health overall.
Rick Klau talks about it in "How Google sets goals" https://youtu.be/mJB83EZtAjc?t=1951 (2013)
Part of it is how the goals are translated. A business outcome trickles down to a specific technical one -- and a specific team and person -- which on the face of it makes sense, but it's surprising how often pursuing that derived outcome totally loses sight of the big picture.
It's similarly hard to map backwards, which can lead to a lot of the company feeling "mission accomplished", when the objective was still a total miss. That's a painful disconnect to have happen.
One approach is to keep in touch with customers and find out what their pain points are. Everyone should be interacting with customers to some extent. This could entail helping out with some support requests (escalations), or joining customer calls, going to conferences, etc. Listen silently to sales calls, or support calls with important customers - or be a named and introduced participant. If you have a mailing list or forums, keep tabs on that and help people. You can usually get an intuitive sense of what’s important to customers by interacting with them.
Sure, maybe it’s “not your job” to do these things, but a bit of time spent more than pays off in insight most of the time. If this is difficult to arrange, then the next best thing is to talk to the people who themselves speak to customers, and listen to what they have to say.
This isn’t a replacement for surveys and analysis, but I find it useful to have my own intuition as a human based on interacting with other humans.
It’s also good to be a customer or user of your own product. I strive to be in the position to use the things that I’m working on first-hand myself. If your product has an involved onboarding or setup process, then ideally everyone on the team should have gone through that setup themselves personally.
Most of the product design vision that I’ve developed for products I’ve worked on has come from a studied understanding of the customer problem and interaction with real customers. I think the best way to stay customer-obsessed is to ensure you’re interacting with customers, or are one yourself.
You should obviously increase lifetime customer value to the optimal value (after accounting for the cost of increasing it), and likewise for churn rate.
It was very valuable. It turns out that listening to people is a good way to figure out what they are frustrated with and what they want to see. Who'da guessed?
You are the first line of defense against things that seemed like a good idea until they are implemented. You should pay attention to what you are building; once in a while (not every developer will have this happen) you will see something and go to your boss with a "stop, look at this, we should cut our losses now because it is bad for the metrics".
You as an engineer know what is possible. I've seen several projects go from ideas that management/marketing thought were too difficult to even bring up, to the top must-have feature, once an engineer who understood the customer sat down and wrote it in a couple of days, making them realize their idea was actually easy if only they had asked.
Say, for the above: KR A being "test the hypothesis for churning using XYZ analysis method" and KR B being "peer review that with specialists named Klopvital, Tartirius and Mochalatet". Then, once you have an answer, I would think you have achieved a partial goal; and if that answer is data for someone else, then comes the choreography of things.
. . .
The whole reason for a system of goals is to have one that works, and what OKR helps with relates to accountability: what is a measure that validates the goal? Of course, the complexity comes when one goal system is chained with another: the orchestration.
Your case, "Someone needs to do the research, analysis, and leg work of finding potential areas to exploit.", partially opens the door to the idea of the Committed OKR (the kind of OKR that one can actually do; solvable).
On your point "And then someone needs to have at least some amount of vision or inspiration about how to solve that problem.", this seems to be about the idea of the Aspirational OKR. In this regard, I agree that many things at the entrepreneurial outset start with a north star and not a precise thing. Certainly OKR is not a system for working on requirements ("do it and that is that"). The OKR approach is influenced by short feedback reviews for goals, derived from or inspired by Andy Grove's insights about MBO vs feedback vs planning, which goes like: a) point to a roadmap, b) work on it with a temporary plan in an accountable, transparent way, c) review the plan and review the roadmap.
But I hear your strong statement that "the last time I worked in an environment with OKR no one seemed to have any brilliant ideas about how to, you know, actually achieve the Key Results", and would love to talk with you to understand more about your experience. From the book I read, I came upon cases where some of the successful adopters took years to fix their OKR system.
* Some of the ideas here were inspired by the Measure What Matters book by John Doerr. But of course it's my limited interpretation.
Most design decisions you make as a developer affect the value users get from your product, and therefore the churn rate. Sure you can get more useful data by talking to customers, and you should seek it out. But even lacking that, it's up to you to put yourself in the place of a user and make the most practical decisions available for their benefit that your powers of intuition allow. You don't get to say, I don't have the best possible data for that choice therefore it's not my problem. It's still your problem.
First, a developer putting themselves in the place of a user still may not be able to see all of the issues with their product because they are (not sure how to put this better) "too close to the work". It relates to something we talk about here on HN every now and then, namely that "technical people" often miss things that are hard for others because they just can't fathom that x thing that's easy for me could be hard for someone.
Second, I think that, in a lot of places, management makes it super hard for developers to get any data on problems other than general, abstract numbers with zero specifics included. It seems quite unreasonable to expect a developer who is being stonewalled on data or access to users by another business area to work with "our overall churn rate is x%, fix it" and actually figure out what's wrong and what needs improving.
I know that the answer is: spend lots of time talking to customers, doing research, empathizing, etc. But then who is going to write the code while I'm doing all of that? It needs to get done, but I'm not sure why that is the job of the "development team".
That means your OKR will have to do with responsiveness to your customer or addressing the things your customer believes are associated with the poor no-show rate. Is your "patient reminder" system broken and failing to contact patients? Do you lack such a system? That's something you can address and control, so the OKR will have to do with that (server uptime, quality of service, features of the service, timeliness of changes to the service).
- Send an email to the person 2 days before their appointment
- Text them the morning of their appointment
- Build something into the CRM of the front desk that reminds them to call people the day before an appointment
- send a pre-generated google directions result to the patient N minutes before the appointment
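The first idea in the list above, "email the patient two days before the appointment", is small enough to sketch directly (the appointment records and field names below are invented for illustration; actually sending mail is left to whatever service the clinic already uses):

```python
from datetime import date, timedelta

# Hypothetical appointment records; in practice these would come from the
# clinic's scheduling database.
appointments = [
    {"patient": "p1", "email": "p1@example.com", "date": date(2024, 6, 3)},
    {"patient": "p2", "email": "p2@example.com", "date": date(2024, 6, 10)},
]

def reminders_due(appointments, today, days_ahead=2):
    """Return (email, message) pairs for appointments exactly `days_ahead` out.

    Intended to run once a day (e.g. from cron); the mail-sending step
    itself is deliberately left out of this sketch.
    """
    target = today + timedelta(days=days_ahead)
    return [
        (a["email"], f"Reminder: you have an appointment on {a['date']}.")
        for a in appointments
        if a["date"] == target
    ]

due = reminders_due(appointments, today=date(2024, 6, 1))
print(due)  # [('p1@example.com', 'Reminder: you have an appointment on 2024-06-03.')]
```

The "text them the morning of" variant is the same query with `days_ahead=0` and an SMS gateway instead of email, which is part of why this whole list is cheap to iterate on.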
I could go on for hours here. The job of a developer, at least in the startup world, is to understand the objective of their team and contribute to figuring out ways to reach their objective. It is not just to code up tickets that are put in front of them.
But I, as an engineer, don't know accounting, or how to setup a healthcare plan, or the intricacies of VC funding documents, etc. Thinking about how people use what you are building and how to make it better seems like table stakes for engineers in the startup world.
Thinking about the product and users is the job of engineers. So is coding a solution. And so is not coding a solution when there is a better option.
I guess if you get to a later stage startup, where you are working on very specific technical problems, you might be forgiven if you don't know what it means to the larger organization, but I can't imagine working in that environment and being happy. A fancy algorithm is cool I guess, but if I don't know how it's moving the business forward it's basically meaningless to me.
I would imagine the Objective is something like "Maximize the number of people we can help at our clinic". And a key result would be "The number of no-shows for appointments is below 5%".
It then follows that the product development teams (product and engineers) would get together and say something like "Great, we can think up 14 projects that might help us reduce the number of no shows. Here they are in order of easiest and/or highest likelihood to succeed to hardest and/or least likely to succeed".
How else would you use OKRs?
Yes, but probably not in a way that matters.
Churn will be decided by a few small, hopefully well-targeted issues (including price and switching cost), and so it's really Product Marketing/Management's job. Engineering's job should be to meet the expectations of Product.
The problem with OKRs is goals can be hard to quantify, so it's attractive to simply make OKRs out of things you CAN measure. This is why we get companies that optimize for clicks at all costs, and twitter accounts that buy followers.
If you're thinking about backlog bankruptcy, it's probably because your team is considered to be struggling. If so, it's attractive to think the process of delivering on the OKR is the problem. However, in my experience the real problem is that leadership hasn't done a great job of defining OKRs that properly align teams on the mission at hand.
But good luck telling that to leadership.
How many OKR rollouts are plagued by lack of adoption in the C suites? Who here is shocked to find a system is malfunctioning because employees are trying their best to engage with it, but leadership can't or won't do their part? OKRs in particular are hugely reliant on leadership defining goals for others to align with.
Measurements and progress are cultural. It either starts and continues from the top, or it's defective.
Been working under this system for 1.5 years and I think it's nothing but a detriment. I did research into doing OKRs "right", watched videos, really tried to give it a fair shake, too.
It's totally fine (and encouraged, in fact) to not pass every OKR, but I don't see any point to them. All of the failures of waterfall that everyone has always been aware of, but with the additional work of stating how you'll measure it, so that finance knows who to pay more.
So, clearly, you have some goal. It's likely also measurable. If you pick it so specific that you can't stick to it, yeah, OKRs are nonsensical. But "make money" or "increase conversion rate" or "find market fit" are clearly meaningful goals, no?
The difference from waterfall is that OKRs don't prescribe a "how". They describe a desired outcome. Guardrails within which you can be as agile as you want, just get some results.
But it's okay for that to change, and then you just write new OKRs.
Start of quarter -
Management: We need to increase X.
Dev: Okay, here's how I'll break that down... (key results)
One month later -
Management: Everything's changed... Y is the top priority now!
Dev: Okay, that means we should focus on fozbuzzles.
One month later -
Management: Oh, man, what we really need to focus on is Z!
Dev: Alright, we can do that if we de-emphasize X and Y. Let's get Z done!
End of quarter -
Management: You didn't meet any of your OKRs other than Z!
Dev: ...but you told me to drop X and Y...
Honestly, I've never seen an OKR that was relevant by the end of a quarter... We finish them or not, but the objectives are always wildly different every few months and priorities change.
Frankly, the start-of-quarter OKR system is incredibly disheartening... None of my major accomplishments or the primary focus of my work ever winds up being recorded, as only the items set at the first of the quarter are evaluated.
I've worked at other places (outside of tech) where the emphasis was on recording what you worked on and why at the _end_ of the quarter. In my experience, it's smoother for everyone. I'm a bit surprised at the focus on quarterly, pre-set OKRs in tech... It seems like a more adaptive process would be better for everyone.
I think there are two types of priorities, "fires" and "nice to haves".
Fires are high priority, unpredictable, and often need to be solved quickly. These are things like "our databases crash every morning" or "our AWS bill is too high and is going to put us out of business".
Nice-to-haves can still be needed, but they tend to take a back seat when a fire happens. Additionally, their priority tends to be much more subjective. These are things like "our build process takes too long" or "our deploy process has too many manual steps, which leads to human error".
OKRs tend not to last a full quarter because any tasks/priorities that you can schedule and fit nicely into a 3-month period are nice-to-haves. I'm not saying they don't matter, but the timeline to get them done doesn't really matter either, so things get delayed or moved around.
On the flip side, you can't schedule fires and you usually can't delay them either. The things that are actually important enough that they can't be delayed tend not to fit the OKR framework.
- I don't decide what projects come my way. Product or management decides that.
- My concerns are: "Is the company going to be around for > 6 months?", because if the answer is yes, then my concern becomes "In 6 months, make sure we don't hit a wall that will stop all development". I've seen it.
- OKRs are always going to be goals of the company. My question will always be: What can eng do to actually affect these OKRs? Are customers leaving due to bugs? Eng will prioritize bugs. Are customers leaving due to lack of features? Eng will build features. Is the company about to run out of money? Eng needs to build as much and as fast as possible and ignore any long-term problems because we need to get customers.
Basically either department heads need to take the current OKRs and transform them into department specific OKRs or they become meaningless. There needs to be engineering OKRs around things that engineering can affect. Not product OKRs which product will affect. Not sales OKRs which sales affects.
Example of a product OKR:
- increase customer conversion rates by 0.5% every 2 weeks for the next 2 months.
Example of an engineering OKR:
- enable product to A/B test
- enable product to measure more data points to make decisions to reach their OKRs
The CEO would define the top-level OKRs. Their direct reports would ask themselves, "How can I contribute?", and build a more detailed set of their organization's OKRs that rolled up to support the CEO's. Every layer of management down the chain did the same. And the individual contributors set personal goals that supported their direct manager's OKRs.
No system is perfect, and this one had its annoyances. But it did tie everything from personal goals all the way up to the top-level corporate goals, which resolves many of the concerns from the article.
Either leadership doesn't engage with OKRs, tasks are deliberately misinterpreted as goals, or unmeasurable goals are set. The rest of the company's OKRs then follow this example, and everyone suffers.
The author's suggestions aren't terribly complete or practical. I don't know many teams who could just dump their backlog every quarter. You're going to have work that can't be related to OKRs. We do use the OKRs to drive our weekly meetings and sprint planning.
One suggestion I have is just to accept all work isn't OKR related and reserve capacity for that work. If some PM or manager complains, push back.
I'll be reading and re-reading this article for a while.
I think one observation that I have is that OKRs are really a framework to help teams understand what direction/goal to head to and understand progress toward that goal. And really, this mindset needs to be adopted not just by the development teams-- it needs rigor and consistency from leadership as well.
For example, the leaders are accountable for setting direction and describing what they think is important. If they then start asking you for things that aren't tied with an OKR, it's a perfectly fair question to ask leadership "Why are you asking me to work on this when you indicated it's not important to our company strategy?" If the team feels like they're going to be working on something unrelated to their OKRs, that's symptomatic of leadership not sending a consistent message on strategy or prioritization.
I've also seen it from the other side, as described in the article. I've seen development teams really struggle to initially understand OKRs and their value. As mentioned earlier, it's really designed to help clarify direction and progress-- if a team is ignoring the OKRs and just force-fitting their existing backlog, that's symptomatic of not really trying to understand the direction leadership wants to head.
What I like about the author's suggestion is that it really forces the team to understand the problem the organization is trying to solve and think up solutions how to achieve it. Backlog bankruptcy is one way to do it, though there are likely some items that can still solve the problems outlined in the OKR. Those items just shouldn't automatically be transferred over without verifying they solve a problem that needs to be solved.
Teams shouldn't be fitting problems to work-- teams should be fitting work to problems.
- tracking the amount of time spent on planned work vs ad-hoc work
- measuring sources of distraction (cross-team requests, ad-hoc meetings)
I don't understand how OKRs are appropriate for employees without the power to make business decisions. My only goals are to come into work, do what is expected of me, get paid, not get fired, and go home. I do not have power to decide anything with quantifiable results. If I did, writing OKRs and working towards them would be extremely easy.
If I did have any power to make business decisions one of the first things I would do is make any employee without any power exempt from having to even think about stupid OKRs.
You can measure milestones completed, and you can also apply some type of scoring to the feedback you collect from your beta users. If your beta user feedback is "This is terrible, it doesn't do the key thing we need to do for our jobs!", that's actually a great outcome, because you can course-correct early, rather than having to revisit 1+ years of eng work to address the feedback.
> My only goals are to come into work, do what is expected of me, get paid, not get fired, and go home.
OKRs should just be the process of you writing down "what is expected of me" in a somewhat structured and measurable way. Team OKRs of "Hit milestones #1, #2, and #3 in development roadmap of system X" are fine, and are pretty common for greenfield projects. The main key is to have a "definition of done" for each milestone - does the milestone include documentation? Monitoring? Who decides that work is complete?
Software is usually written to help reduce effort. Measure effort with and without the software. Reduced effort usually means a better bottom line for the company, which gives you a great positive for your next performance review.
Measure the usual software engineering metrics, e.g. defect count, defects closed, velocity, delivery timing, etc.
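A toy sketch of how those usual metrics might be derived from issue-tracker data. The `Issue` fields and the sample records are invented for illustration, not any particular tracker's schema:

```python
from dataclasses import dataclass

@dataclass
class Issue:
    kind: str     # "bug" or "feature"
    points: int   # estimated story points
    closed: bool  # resolved during this sprint?

# Hypothetical sprint data
issues = [
    Issue("bug", 2, True),
    Issue("bug", 3, False),
    Issue("feature", 5, True),
    Issue("feature", 8, True),
]

defect_count = sum(1 for i in issues if i.kind == "bug")
defects_closed = sum(1 for i in issues if i.kind == "bug" and i.closed)
velocity = sum(i.points for i in issues if i.closed)  # points delivered this sprint

print(defect_count, defects_closed, velocity)  # 2 1 15
```

The point is less the arithmetic than that these numbers fall out of data the team already records, so no extra OKR bookkeeping is required to report them.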
I think that everything that we do in a company has a cost for the company. At the most basic level, you and I cost money as the company is paying us money for the work we do. You need to think from that perspective.
"Most of the companies I’ve worked for in the last 5 years have used the objectives and keys results (OKR) system."
I'm not sure why this common practice in text is seemingly disappearing on the web.
My personal opinions of OKRs are that it's possibly the most cumbersome form of waterfall without many of the few benefits of waterfall. It's really a tool for making leadership feel good about being bad at their job.
Any measurement that becomes a target (the key results in OKRs) will cease to be a good measure. It will be gamed.
Teams then work with different tools, e.g. Marketing uses Trello, Eng uses Jira, you name it, and now you have to trace tasks back from different tools to some OKRs defined somewhere in the cloud ... good luck with that.
It's nearly impossible to measure and reconcile progress made on a team/sprint basis with OKRs, so you just shovel random numbers into the spreadsheet and management is happy.
When an OKR is updated, good luck cascading the changes up and down the hierarchy.
I have yet to find a company that has implemented OKRs in an effective way.
Most of the time managers themselves tell you: "just write something ... it doesn't really matter," hahahah, and there you go. At the end of the day people have to do OKRs, but they keep asking themselves ... why?
My problem is less that no tools exist to manage things outside of spreadsheets, but more that older management tends to take the position of "Jira (or whatever) is complicated and I already know Excel, and I'm in charge so we're using whatever doesn't require me to learn something new".
> a team is encouraged to set as goals about 50% more tasks than they are likely to actually accomplish.
> If a team scores significantly higher than that, they are encouraged to set more ambitious OKRs for the next quarter.
I lived through this in Scrum sprints. There is a baked-in incentive to sacrifice some iterations every once in a while to adjust the average expectation, while keeping an overall positive look.
Otherwise, without having used them, it seems to me that OKRs are another rather plain tool that works for clever organisations but fails when applied dumbly. Is there anything specific to them that makes worst-case scenarios better?
More and more, I think any methodology will work if followed honestly and adapted to real-world problems. In the end it's always about the disconnect between what an organization claims to do and what it really does. If that disconnect is small, things are good, but in a lot of cases the disconnect is big.
Company-level OKRs are only as good as your leadership team's clarity on strategy and long-term direction, and their conviction to largely stay the course for at least 3-month chunks.
If you feel your company OKRs are bad or you are unable to connect with it, it is because your leadership team has not done the necessary homework to define and socialize it well.
If you agree your company OKRs make sense but you are unable to connect it to your work, think of it this way:
For a functional feature engineer, the team OKRs should clearly and directly connect the features they are working on to the company OKRs. If not, then you need to engage with your product managers to achieve that clarity.
For a senior platform engineer responsible for evolution of tech platforms, it is critical to have a good sense of the direction in which business will evolve and expand. (Annual OKRs and strategy articulations are crucial for this).
From this, you should be able to draw out a mind map of the kinds of features and capabilities needed by your platform.
Then you can articulate this to your team to build those required enhancements to the platform while also ensuring immediate feature building activity is moving as productively as possible.
If you are unable to connect the dots between the business OKRs and tech platform OKRs, it is usually a sign that there isn't a good functional model for your business domain and your tech platform isn't really a platform that supports that functional model. For architects, this should be the most important deep work – to keep the functional model of the business and that of tech platforms in sync. Without this common model, teams cannot collaborate effectively.
The more important thing is whether it's actually true, not whether dev teams (or leadership) feels it's true, no?
But the OP indeed seems to describe an environment where it is not actually true either.
Clearly, thought and planning are needed on how people work, in a way that is directed at actually addressing OKRs. The OP offers suggestions about abandoning backlogs and focusing weekly meetings on OKRs, which seem possibly beneficial, but probably not sufficient. I think it probably requires more fundamental shifts in how the whole organization works, which are a lot harder than just publicizing OKRs.
It's such an obvious key objective, it should be in every top-level OKR, yet it almost never is, which means that there's nowhere the engineering department can slot in that work. And how do you measure it?
Understand OKRs for what they are, and you'll be a much happier developer. Don't treat them too seriously, use them as a productivity mechanism _for yourself_. Now, even if nobody knows what needs to be done, you can refer to your (approved) OKRs and do _that_, whether it makes sense a month after the quarter started or not.
I don't know how it is now, but at e.g. Microsoft in the 00's you'd set your goals once a year. A month later those goals were completely irrelevant to what actually needed to be done. And Microsoft is objectively one of the most successful software companies in the world. Waterfall, agile, OKRs, or yearly goals, it doesn't matter. The reality is always dictated by circumstances. What matters is that every single thing I worked on while there makes money now, and a couple make billions a year.
And I think it was because we were doing what the author says and shoehorning in our backlog into the OKRs!
I was really soured on OKRs because of it. Now I want to try again doing it “the right way”.
The way we thought about it was that OKRs should not dictate 100% of effort. OKRs are about prioritizing the changes that leadership wants to make at the productivity frontier of the organization. OKRs alone are not enough to direct company efforts. Leadership also needs to specify what percentage of energy should be spent on OKRs vs. other responsibilities. The OKR effort goal could be anywhere from 0 to 100% for different teams, or for the company as a whole.
For instance, a company that needs to pivot or will die is probably close to 100% OKR effort on all teams.
This approach gets around the need to assign OKR to every activity and allows OKR to be more salient and less diluted as a tool for alignment and goal setting.
You MUST have a deep understanding of exactly what that metric is measuring and all of the things it doesn't take into account.
For a concrete, simple example: an A/B test may show that forcing users to create an account before seeing shipping charges increases order completion by 10%. But then it turns out you've reduced return visits to the site by 30% and decreased overall conversion rates by damaging that funnel.
Reality is very complex. Trying to boil everything down to a few numbers to base decisions on seems like it simplifies things, but often you are just ignoring the complexities and flying blind by chasing metrics.
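The arithmetic of that trade-off is easy to sketch. All funnel figures below are invented for illustration; the point is only that a local +10% can coexist with a global decline:

```python
def total_orders(first_visits, return_rate, completion_rate):
    """Orders from first-time visits plus orders from return visits."""
    visits = first_visits * (1 + return_rate)
    return visits * completion_rate

# Before forcing account creation (hypothetical baseline numbers):
before = total_orders(first_visits=10_000, return_rate=0.50,
                      completion_rate=0.020)

# After: order completion up 10%, but return visits down 30%.
after = total_orders(first_visits=10_000, return_rate=0.50 * 0.70,
                     completion_rate=0.020 * 1.10)

print(round(before), round(after))  # 300 297 -- fewer total orders
```

The A/B test only saw `completion_rate` move, so it reported a win; the damage to `return_rate` showed up in a metric nobody was watching.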
I don't recall the product development group I was in using them. Maybe they were implemented at a much higher level in the company.
Why the fuck should I help the owner get rich by identifying how to grow their business by increasing conversion, retention, etc.? I don’t benefit at all.
At my current company we have weekly meetings where the owners tell us all the business metrics we should be driving and ask for ideas on how to increase them. I’ve learned to say “I don’t have any specific ideas right now.” because it’s insulting they expect me to do their work and get nothing for it.
For example: reduce reported bug count by 10%. Or: increase LOC by 10% without reducing quality (defined as bugs, DRY, PR comments), increase documentation written by 10%, increase Slack karma by 10%, etc.
Obviously other metrics other than the OKR need to be maintained.
The product owner is responsible for OKRs that span across the stories implemented, not ICs. Or at least not junior ICs. An architect-level IC, for example, should be able to take on broader scope.
Interested in feedback.
My single biggest issue with KPIs is that, for the most part, in a large company, one (or even several) metric(s) cannot cover the whole story.
They are useful for very specific targets.
I’ve ended up working towards KPIs which I’ve known have zero relevance to improving service because they are (a) poorly defined, (b) irrelevant to the actual problem, or (c) force a metric to exist where no sensible metric can.
Unfortunately this tended to happen more often than not.
OKRs have their place. But they don’t sound like something new (KPI rehash) and are wide open to misuse, just like KPIs.
Some organizations do indeed move from KPIs to OKRs. I personally think they are different tools for different purposes, and they could work well together.
I believe KPIs are a great tool to define and monitor your business as usual, whereas OKRs are more about realizing your ambitions and pushing the company further ahead.
If it helps, more info here: https://www.perdoo.com/blog/kpis-okrs-the-goals-that-drive-b...
> Joe Cannatti
Anyone else put off by the author quoting himself?