A few years ago I was technical lead for a team that worked on one of the largest IT systems in the UK that you've probably never heard of. The customer was truly enlightened and would fund short technical de-risking studies to understand and scope challenging tasks before asking the contracting organisations to produce a firm price proposal for delivering the capability.
This worked well and was driven by his understanding that most of his effort needed to be focused on identifying and mitigating delivery risk. Most often the key risks lay in estimating the level of effort required to develop key new functionality when all involved had limited prior experience developing anything similar. In those cases, funding a short, time-bounded study to explore the technologies involved and figure out the art of the possible reaped huge rewards downstream.
Yeah, I love it when customers can work like that. The way I often explain it is that a product manager can buy software, but they can also buy information. Information like, "Is X possible?" or "What would it cost to build Y?" or "What would the performance be if we switched to Z?".
Technical discovery is baked into traditional project management (e.g., civil engineering) in a way that continues to surprise me as a former IT project manager. Other technical fields have a culture of investigation and research (materials research, proof-of-concept development, needs-driven public-private academic research) that IT project managers just don't get exposed to.
For instance, engineers planning a drainage project have a very good idea of how long a particular type of pipe will last before it needs to be replaced. They also have the tools/software/training needed to estimate how local stresses affect functionality and lifespan, and they're expected to use this information - even if the lifespan of the project exceeds their expected tenure.
IT PMs can't say the same thing about their infrastructure - who knows how long AWS or Azure will be around, for instance, or how long a local server blade or piece of network equipment from (x) company will actually last. When it comes to software development, the prevailing attitude is 'I want (x) by (y) for (z)', without having any justifiable basis for (x), (y), or (z).
I'm now head of product development at a different company, and one of the key activities I instituted was using internal R&D funding to conduct technical discovery and remove, where possible, software development of new functionality from the critical path of project delivery.
I'm convinced IT project management could learn a lot from civil engineering project management.
I think this is why it is very helpful to use tried and tested tools and tech stacks in an IT project, those with large ecosystems of libraries and communities of people who have solved similar problems in the real world in the past.
The "problem solving" risk is much lower as a result.
At the same time, the board will only finance the fancy new buzzwords. Combining both requirements, I'd say the art is in using tried and tested technology to fulfill the buzzwords.
For stuff that's new I always tell management to give us a certain amount of time (weeks or even months) to figure out the major unknowns. After that has been done they can get a reasonable estimate or cancel, whatever they want to do. This seems to be the only realistic way to go about things.
I think engineers need to start talking about this concept in a much more official way, and approach it with the business side as another full task type called "Technical Discovery," with clear goals and a clear time-box.
One thing I've been thinking lately is that this should be baked into the process for every unit of work, not just those people expect to be difficult.
Instead of estimating by guessing based on reading the requirements, work on an issue for say an hour or so and then estimate it.
It still wouldn't be perfect, but I think this would catch a lot of cases where, for example, a seemingly simple feature ends up requiring an extensive model change.
It would also do a much better job of shaking out cases where the requirements aren't as clear as you thought when you start trying to code it.
Oh, you'd think so. Then you might be unlucky enough to work with some 'senior programmers' who have been 'doing Agile for years' and are able to convince the manager that a spike that doesn't result in code that will go to production is a waste of time. I never worked out whether that was malice or stupidity.
On projects that are breaking new ground it is even more important to follow XP practices: getting frequent feedback from customers, taking the simplest solution that could possibly work to avoid overengineering before you actually understand the problem, and embracing change because your original understanding of the problem probably isn't right.
There is also plenty of risk involved in cookie-cutter commodity work. Those kinds of projects fail all the time.
In my company everybody goes over spec sheets and endless meetings to determine the right approach instead of just sitting down and using the time to try things out. A big problem is that there is also no culture of being allowed to say that an experiment has failed and we'll either cancel or try something else.
At my company projects always start with P1 - research, and P2 - plumbing which are exactly what they sound like. The idea being to uncover and confront risk/unknowns as early as possible before some massive misguided investment has been made.
Plumbing is basic project setup, CI, testing, etc., but also all core functionality and any technically questionable bits like integrations with client APIs, which are frequently not what is advertised. Research is always conducted by one of our "CTO"s (partners who have > 10yrs experience).
Projects vary at the low end from 2 people, 2 months up to 5-7 people for a year or longer. We do smaller stuff like prototypes also, usually as a first step in a longer series of releases. We are local in SF and frequently work in client offices. Budgets are between $100K - $1M.
The author touches on this indirectly a couple of times, but I think the main realization this gives me is that we tend to conflate two things when doing software project management:
* Getting the project done well and on time.
* Evaluating the performance of individual programmers.
This is probably a natural thing to do because one measure of "good programmer" is "gets projects done well and on time", but I think directly mixing those two goals together interferes with both.
If you have any experience, you've certainly been put in a situation where the project went poorly through no fault of your own. I shipped "Superman Returns: The Videogame". Every developer knew the game was going to suck. Conversely, you've probably known (or been) someone who coasted through a successful project without contributing much of value.
So that argues that project outcome is a poor individual performance measure.
Worse, when people know project outcome is tied to their performance review, it puts perverse incentives into play. It encourages even well-meaning people to over-estimate tasks, bury technical debt where it won't be seen, point fingers when projects go poorly, choose conservative projects to work on, etc.
Perhaps a way out is for the organization to very clearly and explicitly separate these two concepts. Repeatedly acknowledge that you can do good work on projects that go poorly and vice versa. Do not take project outcome into account in performance reviews.
That still leaves the question of how do you evaluate performance then? Peer review is one approach — everyone on the team probably knows who does and doesn't carry their weight. Unfortunately, this can lead to nasty biases. People whose gender/culture/whatever lines up with the majority of the team are likely to get rated higher because we like people similar to ourselves. People that are more outgoing, outspoken, or charismatic will get unfairly better reviews — a particular problem in programming where some of the best people lack those attributes. It might trigger nasty competitive behavior with cabals forming and other stuff like that.
I don't know if those problems are worse than the status quo of using project outcome for performance evaluation too. It's difficult.
It's pretty refreshing to read posts that admit 'management is difficult' on HN. We can get a bit lost in our bubble of tech work, and forget how important management is to actually accomplishing things.
Unfortunately unlike software development there's no simple way to check if the 'output' of management is working correctly or not. This combination of 'hard to do' and 'hard to verify' means that there are many projects humming away with incredibly poor management.
The really terrible part is the de-correlation of management quality with rewards for management. Supposedly management is better compensated than individuals because their impact on the organization is greater than an individual contributor's, but this flies in the face of the modern trend of the "subservient manager" philosophy, as well as management sometimes just carrying out orders from above that require no skill (the most common one we all recognize as needing little insight or talent is "reduce costs"). Managers are sometimes handed truly terrible situations that nobody could manage their way out of: no budget, bad morale, a bad organizational reputation for the group, and no authority to create the change needed to reverse the situation.
If you have a poor individual contributor, they are almost always fired. If you have a poor manager, oftentimes they have worked a relationship with someone that has power that makes it very difficult to fire them, so they're oftentimes "promoted." This kind of nepotism and cronyism exists to some extent among individual contributors but ICs usually disperse more across business verticals throughout their careers while managers and others in primarily soft skill roles tend to be extremely connected (whether through intrinsic personality or due to the nature of their jobs is another question) and seem to act more as cohorts rather than as individuals.
I think that really is the rub. Manager performance is hard to measure, and most organizations don't really want to measure management performance because it's costly and easier to cycle through ICs, especially where there is an abundance of IC supply. I worked for a bad manager that caused over 60% of the team to leave (including me), and finally a year later they were "demoted" by being put in charge of a smaller, less-significant team. An IC that dropped 60% of their deliverables would have been on the chopping block in weeks, if not months.
I would argue bad management is pretty easy to spot if you have any reference point for good management; you don’t even have to be a manager. Sit down with a few employees and ask them a few questions. The effects of bad management are everywhere in an organization: employees are frustrated, resentful, unheard, and happy to talk.
Programming is hard, too, and good code is a much subtler thing that requires an expert to give you any sense of.
there's no simple way to check if the 'output' of management is working correctly or not.
What an odd thing to say - ask any accountant or banker and they’ll say they’ve had methods for this for literally hundreds of years. Ever since “share price” was a thing, and even before that.
The problem is doing something with the information. Look at IBM for example (one of many). You can see simultaneously that managers are doing a terrible job - zero revenue growth for years - but also see that they are lavishly rewarding themselves for it.
I think a great way to treat that is to look at the individual contribution and measure against how that person delivered on the micro of the project rather than the macro of the business success.
I've had a successful career(delivering what I consider good work) on some real stinker projects. Usually though it's only been when things are framed in the above way.
One of the things that drives me crazy about agile is the strawman that is "waterfall". There's this notion that if you're not doing kanban or scrum you're a dinosaur. I've worked at places that didn't have some sort of Methodology, and you know what? It was fine. Things still got estimated, work got done, we just didn't arbitrarily shove things into 2 week windows or have painfully elongated planning sessions because of "planning poker"
As far as I can tell "agile" mostly is about predictability, not about efficiency. Most of the places I've worked, when we switched to agile, the efficiency of the team got worse, but management was generally happy because they felt like they had much more control.
Sometimes agile is the right way to exert control back against management, if management is prone to chaos and disorder.
I took an organization that couldn't put out two releases a year to one that put out once a month, and the owner/manager complained bitterly and pushed new features into the discussion ON THE DAY OF RELEASE.
2nd level management had to constantly, repeatedly attest to the good that the tempo and structure brought to the organization and customers.
It sounds like you had a bad manager, but no methodology can fix that.
My entire experience has been: good management + talented developers = success, and bad management or untalented developers and it fails, but it really has nothing to do with the methodology in play. I've seen (and been part of) hard core agile teams burning to the ground, and teams with no methodology be wildly successful. (And vice versa)
Agile bothers me because it generally aims to commoditize developers, and yet the best places and teams I've worked on were always the opposite: it was about finding the best way for each individual to work, and designing processes around the rhythms of whatever industry they were working in, rather than assuming everything should fit into 2-week sprints.
I would argue that many people try to commoditize developers through scrum (edit: and call it Agile). Agile is just a set of principles, which pretty much state the opposite: "Individuals and interactions over processes and tools." As the article states, scrum can work in some cases but doesn't in others. If you keep trying to force it where it doesn't work, you aren't following the agile principles anymore. (edit: I'm probably making a "no true scotsman" argument and should've said something like "If someone keeps trying to force scrum where it isn't working, they should reflect on the principles and ask themselves if they are putting processes first".)
I cringe when I hear people criticize agile for trying to "fit" everything into a two week sprint. Competent developers always chunk their work into small, comprehensible pieces, and if anything, two weeks is actually bigger than most of those chunks, not smaller. Agile never requires that everything be done in two weeks, it only requires that there is demonstrable functionality every sprint (which doesn't even have to be two weeks, it could just as easily be three, or four, or six, if that works better for the team).
The best places to work adapt the process to the team.
The worst enforce a process on the team because it makes managing them easier.
The number of workplaces I've been in that equate the number of hours people are in the office with their productivity, because managers could measure attendance easily and had no idea how to measure actual productivity...
Agile methodologies are weird. They seem to go directly against the manifesto, particularly "Individuals and interactions over processes and tools"[1]. Yet Scrum was created by two of the manifesto signatories.
Still, a lack of a strict methodology doesn't necessarily make it waterfall.
SCRUM predates agile. Though SCRUM has had many changes over the years, the manifesto was responding to problems of the day; that is, a giant statement of work up front, hundreds of pages of design documents, a giant build process where everyone figures out what the product really should do in engineering while product forgets about it except for occasional questions.
Then, that whole steaming pile was thrown over the fence to QA which tried to figure out heads from tails and developers started fixing the issues they already knew about but had to leave in so that they would make their deadline.
THEN came user acceptance testing, where people actually looked at the product for the first time. This is when everyone discovered the miscommunications from 6-12 months earlier, and the change request process started.
Finally, over budget and after everyone was finally exhausted and tired of fighting, it was released to users with promises of changes after the release.
From this world, SCRUM and the manifesto make sense.
If you're happy to accept the Manifesto in 2001 as the start of agile, then https://www.scrum.org/about under 'Creating Scrum' states "Jeff Sutherland and I had been using Scrum for ten years prior to the meeting at Snowbird where we and others signed the Agile Manifesto"
I’ve seen waterfall work at a VERY high level where entire projects were the tasks.
Within those projects, waterfall was completely removed from the numerous tasks needed to get the project done so the developers never saw it.
Getting waterfall involved in the day-to-day project-level tasks has a lot of issues though. It can run a lot smoother if your team has a low expectation of pivoting due to new projects or production support needs.
It’s not without possibilities, but a lot depends on how far removed it is from day to day.
I agree about the hostility to waterfall. Waterfall put men on the moon, so it can't be such a terrible thing. I don't think there is a conflict, personally---there's no reason that you can't plan sprints in the context of an overall waterfall schedule. In the real world, there are deadlines, and any process, no matter how agile, has to recognize that fact and work within schedule constraints. I think that what agile buys you is the opportunity for near-continuous integration and feedback, so that you know fairly early when you are either going to have to apply more resources, trim functionality, or extend deadlines. Properly applied, agile is better than traditional waterfall at flushing out unknowns and potential architecture issues, because of the focus on working, tested, demonstrable code from the beginning.
There's a lot here. People have been writing books about this for decades.
Two things that need to be succinctly said:
1. Project Management is just another skill, like database management, security, or any one of a hundred other skills a tech team might have. PMs keep trying to take themselves out of the trenches and claim a special place. Every time they do that it is a mistake.
2. He touches on "waterfall" a few times. Physical things decompose into smaller physical things. That's why bridge-building is the way it is. Creative technical work is not a physical thing. It is infinitely decomposable. You take a job creating new tech and split it into 10 pieces, you've got the same job with just a lot more overhead. You can go on forever. Very quickly this natural tendency for decomposition leads to doomed projects.
Intuition will lead you astray in tech project management. Every time.
I prefer the U.S. Marines fire team version of a tech team: everybody codes, everybody looks out after everybody, you should be able to do your "boss's" job, it's critical to over-communicate, and ego has no place in a good team. Having said that, sometimes somebody has to make decisions. And somebody always has to go to meetings and report upstairs. Poor schmuck.
That's unavoidable. He's called a manager for a reason. I would replace powerpoint with excel in my experience. Everyone wants a leader they can look up to and admire, not a PM.
Estimates are usually wrong. You can do T-shirt sizes, that's fine. But days and hours are very hard to estimate even for one person; for a team, impossible. There are too many factors to consider: the team's understanding of the problem, team experience, technology, etc.
Whatever methodology you use, delivering usable increments at regular intervals is the way to go. You can see progress, determine if the direction is wrong and change course rapidly.
Software usually reproduces human processes. Well, guess what: processes change when they don't make sense in a given situation. So it's logical that the software changes over time as we discover more efficient ways of doing things. As such, you cannot plan for the unforeseeable, and even less estimate how long it will take.
I therefore agree that optimizing throughput of a team is the one thing to do. Never commit to a delivery date if you can. Instead, commit to delivering usable increments regularly. With that, project management becomes expectations and change management. Lots of communication with the client on showcasing, and hopefully using the increment and provide feedback for the next iteration.
This is really hard to do. Devs have a tendency to stay cooped up in their ivory tower, and product (read: project manager, as described in the article) has the reflex to protect them from client distractions. The client may not be available or interested either. Nurturing dedicated time slots for client and devs to meet and interact is crucial. If you get that right, and never miss delivering an increment, good things are sure to come out of the project.
Fortunately, with CI and the Internet, we have all the tools to deliver usable increments at regular (even blazing fast) intervals. Power to the people who can leverage those to deliver successful software projects to their customers.
I agree with pretty much everything and I also advocate for that. However, how do you invoice your clients? It's hard to say: hey, I'll charge X for each two-week increment. Client: Great, how many increments will there be? Me: dunno, we'll figure it out somewhere down the increments.
The agile approach is the way to manage projects, but how do you quote them?
In theory, it ought to be "we can't tell you how many increments there will be because you'll have working software in your hands every two weeks, and we don't know enough now to say when you'll ask us to stop. It should be clear whether we are on track after the first 2 or 3, but you can pull out at any time after the first with a usable product." The whole premise is to get the client to buy into a subscription, rather than a single deliverable.
Usually just by that, in increments. It helps to time-box something early into a new client relationship and once you're delivering proven value, clients tend to be fine with just incremental delivery and billing.
You still need to know how much it'll cost you. If you can't estimate how long it'll take, you may spend more paying employees than the project pays.
Any client worth working for is going to have an annual budget and the focus won't be so much on how much things cost but rather how soon you can start delivering business value.
> knowledge of existing libraries, algorithms, systems, permissions
I can't help but notice, also, that no project management "methodology" emphasizes (or even allows for) time spent researching/learning existing libraries, algorithms, systems, permissions, even though I think everybody would agree that this is where you're going to get the biggest payoff.
I'd say the direction you are going is correct. And I would add that the big payoff is in understanding of fundamental problems, not libraries or algos. For instance you should know how TCP/IP works and HTTP works (not just theoretical but hands on, i.e. know when to use curl, when the browser and when tcpdump for debugging). But you don't need indepth knowledge of the currently hip web framework because in 5 years it won't be hip anymore.
every pm methodology talks about getting to know the problem domain, risk minimization (start with the riskiest assumption, do a minmax check on it, then pivot to the next riskiest, and so on).
selecting the right tool for the job is an inherent part of the process.
I really like this article, and agree with a great deal of it. But while what he's doing looks a lot like Kanban, it's not. Kanban is a method for process improvement, not a method for managing software projects. We write about this in both Learning Agile and Head First Agile -- here's an excerpt: https://twitter.com/AndrewStellman/status/100022571573903360...
When Jenny Greene and I were working on our book, "Learning Agile: Understanding Scrum, XP, Lean, and Kanban," we were lucky enough to have David Anderson (the guy who adapted kanban for software development) review our chapter on Kanban, and he really helped us nail down this particular issue.
Other than that (and a few things he says about the PMP certification), what he says in the piece is spot on. Nice work -- I'd love to see it expanded into a book!
That was five years ago and I have no plans to pursue my PMP at any point in the future. Because what you learn to get your PMP and what you need to manage software projects are not the same set of skills...and nobody seems to realize it.
That could be said for any kind of project in any industry. I speak as a PMP. The only reason PMP is useful to me is to convince employers that I know how to manage projects. It has nothing to do with reality, and so it is unfortunate that employers view it as a positive signal. It's not an effective filter because you still need to fully assess potential employee candidates in order to avoid false positives.
I'm an introvert, I have imposter syndrome, self-doubt, perfectionist tendencies and ADHD-PI.
Fulltime pairing has dramatically helped me with all of these.
Many folks feel that it sounds bad. The thing is that it's not really a skill that's easily learned from a book or a blog post. It's much better to learn from an experienced pair.
Disclosure: I work for Pivotal, so that's a large part of our whole thang.
I think what a lot of people fail to take into account is the amount of effort it can take to "estimate" tasks. From my experience, the more pressure there is to give an estimate the less time they want you to spend researching the work. To increase the accuracy of an estimate, you need to do more research and if you take that to its logical conclusion then the best estimate is given after the work is completed. So there is some area in between of extremely inaccurate guess with little effort to actually doing the work and knowing the time it took. It's best to accept that estimates are near useless (unless you want to spend a lot of time basically doing waterfall to come up with requirements/design/etc...) and instead just try to find out how to chop up the work into smaller pieces that each deliver value and plan how the feature will be delivered. This way the team can be focused on actually getting value out the door instead of spending time doing a futile exercise (estimating).
I did a lot of contract software work back in the day and was the one working first-draft requirements with clients before the actual contract.
It was common to spend a week on planning for every six man-months of project.
Using that as a ballpark, we never had an estimate with more than 10% error.
Anecdotal, I know, but without that kind of effort one is not making an estimate, one is making a guess.
That’s also why I don’t like the agile approach of piling up efforts and then using actual time to tune velocity, hoping the inaccuracies will cancel one another out. It just feels like a sequence of dice rolls, hoping you don’t end up at the far tail of the gaussian.
> It just feels like a sequence of dice rolls, hoping you don’t end up at the far tail of the gaussian.
That's exactly, precisely what it is, by design AFAICT. The thing about gaussians is, what with the central limit theorem and all, you provably don't end up on their tail very often.
That said, gathering a bunch more data can make your gaussian even narrower. I think you and (platonically ideal) Agile are just at different spots on the same spectrum of risk tolerance.
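The central-limit argument can be made concrete with a quick simulation. This is a toy sketch with invented numbers (task counts, means, and spreads are not from any real project): sum many independent noisy task estimates and watch the *relative* spread of the total shrink roughly as 1/sqrt(N), which is why aggregate velocity tends to be more predictable than any single task.

```python
import random
import statistics

def simulate_project(n_tasks, mean=5.0, sd=3.0, runs=5000):
    """Simulate total duration of n_tasks noisy tasks, `runs` times.

    Each task's duration is drawn from a gaussian (clipped at half a
    day, since tasks can't take negative time). Returns the mean total
    and its coefficient of variation (relative spread).
    """
    totals = []
    for _ in range(runs):
        total = sum(max(0.5, random.gauss(mean, sd)) for _ in range(n_tasks))
        totals.append(total)
    m = statistics.mean(totals)
    return m, statistics.stdev(totals) / m

random.seed(1)
_, cv_small = simulate_project(4)    # a handful of tasks: wide relative spread
_, cv_large = simulate_project(64)   # many tasks: much narrower relative spread
print(cv_small, cv_large)
```

Under these made-up numbers the 64-task total has roughly a quarter of the relative uncertainty of the 4-task total, even though each individual task is just as unpredictable. That's the "dice rolls" bet the parent describes: it pays off as long as the tasks really are independent-ish and there are enough of them.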
High level estimates aren’t optional for most development. Customers waiting for an important feature won’t take “it’ll come when it comes, estimates aren’t real” as an answer.
You can skip low level estimates, but the consequence of that is that high level timelines are uninformed by them. That’s how you end up in the archetypal situation where all the engineers know a project will slip while their managers report it’s right on track.
In my experience, knowing when work on some functionality will _start_ plus a rough estimate on effort (talking hours/days/weeks level) is quite acceptable for most stakeholders. It is also far easier to provide, and less likely to be off by an order of magnitude, which some low-level estimates may suffer from.
My point is you need to balance the effort you're putting into your estimates vs getting work done. Customers also won't be happy when you give them a date and don't meet it. Honesty is important too.
With paired programming I have always felt comfortable sharing a Googling of the problem with either the co-driver or the driver, depending which role I am in.
Can you explain what you perceive to be the problem with accessing something like StackOverflow to help solve a persistent problem or even to simply jog one's memory?
I find that this fear of being shamed for using the obvious resources at our fingertips to be one of the things that makes people less likely to want to pair. IMO there is nothing wrong with consulting books, websites, other people's ideas/opinions, or other knowledge repositories to help make good decisions or solve problems elegantly.
Sorry, I probably need to re-word that. I didn't mean to suggest it that way.
It was more of a reflection of being unwilling to ask team members for help and just depending on Google/SO for answers if you're stuck. I'll re-evaluate the wording there and get it corrected.
I would add that if engineers/developers are unwilling to ask team members for help you have a big problem with the team, likely a recruitment or culture/leadership problem, not just a project management problem.
Teams work best together and to work together egos need to be checked at the door and people need to trust each other. Any problem on a team is the whole team's problem, and while one person may be accountable for introducing bugs, a good team will view that team mate as valued and capable and perhaps pitch in to mentor or help fix the problem quickly.
The Five Dysfunctions of a Team obviously points out a lot of what I am talking about much more elegantly than I can. What I am saying is that project management can only go so far, team leadership, culture, and recruitment is also hugely important to fix many of the problems you identified in the article.
Great article though, I shared it with my work-mates :)
I can imagine that this is really tied to the confidence you already have. If you feel that you are a capable programmer then looking up something on Stack Overflow is just looking up something you kinda know but forgot. No big deal.
For someone in the grip of impostor syndrome, it is admitting that you are indeed just faking it and don't know enough to 'deserve' your position.
Now it is probably not true, but that does not matter for how people react to it...
This is one of the best software books I've read that's not strictly about software. The concept of small batch sizes in particular maps really well to prototyping and agile approaches. But it gives a basis for why those approaches make sense and work. It's like quantum mechanics compared to classical mechanics...
I came out of the article with the same question. I was almost jumping up from my desk with excitement through the first 3/4 of the article, but then the conclusion felt a bit lacking when it never directly addressed the estimation problem...
You don’t. You focus on efficiency and throughput.
Complexity and high variability of estimates are the real problem. You can address that somewhat by training developers on how to estimate, which I've heard called calibration, and then spending a lot more time digging in up front on those estimates.
Long term, a ballpark can give a good idea, but most estimates aren't going to get more accurate until we dig in. If you are choosing between projects based on expected ROI, then diving in ahead of schedule to get a clearer picture makes sense.
If you are delivering a specific product no matter what, then you optimize for throughput.
It’s more valuable to tune the engine so you can drive 70mph than to slow down to 20mph with the goal of accurately predicting your speed.
While I agree it is more valuable to optimize for speed, for each person speeding there is a traffic cop worried more about what the sign says than the final result. Sure, a company might want to go as fast as possible because it makes them lots of money, but the accounting people now can't write budgets very easily, so they push for more accurate estimates.
This is a very good read. The only issue I really take with it is the notion of the estimates constantly being correct, while later adding all of the other organizational factors that can create a change of course.
IMO that means that the estimates he's talking about being accurate are fairly short term.
Totally agree with him on a lot, especially that only the developer working the task can truly estimate it. When you're talking about time measures, that's the only hope of them being accurate.
When I talked about story points reflecting complexity and not time, it was specifically because time is variable by person. If the team isn't creating a time estimate on a task, that's much less of an issue.
" Those principles actually do work extremely well in many, many industries. PMP focuses almost entirely on Waterfall methodology and gaining the certification is largely driven by terminology and math.
But software is a special snowflake in the project management world and this is why countless books around managing software continue to be written."
Thanks.
I highly recommend you read the book I asked about in a sister comment. I wish the business people would read it. The issue, I think, is that it draws on and requires an understanding of other "techie" and STEM subjects like queues, the math behind queues, network protocols, and traffic flow. Until the business people understand the math and tech topics that book appeals to, friction will continue to exist.
Hey! Loved the article. I am currently employed as a scrum master in a scrum environment. About your proposed "fix" to management: even with the light pairing you recommend, wouldn't kanban eventually cause siloing? Even using scrum at my job, we have individuals becoming the "Feature X" guy or the "Feature Y" guy. This person then essentially disappears into a hole perfecting that one feature for weeks/months, even if there are more pressing/interesting matters. I don't see how kanban wouldn't increase that even more.
Also, how do you estimate? Individual or group? Does everyone on the team possess the skills to estimate every story?
When using kanban, people take the first card from the top of the pile and work on that. There's no siloing in that, because you're taking whatever happens to be at the top.
You don't estimate in kanban. You'll generally try to break things down into small stories, but the goal is to always be working on the most important thing and focus on improving how quickly work goes from idea to done.
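The pull model described above can be sketched in a few lines of Python. This is a toy illustration of the mechanics, not anything from the article; the card names are made up:

```python
from collections import deque

# The backlog is kept in priority order; whoever frees up
# takes whatever happens to be on top. No "Feature X guy".
backlog = deque(["fix login bug", "add export", "refactor billing"])

def pull_next():
    """Take the top card off the backlog, or None if it's empty."""
    return backlog.popleft() if backlog else None

assert pull_next() == "fix login bug"
assert pull_next() == "add export"
```

The anti-siloing property comes from the fact that assignment happens at pull time, driven by priority, rather than being decided per-person up front.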
This was a great write up! I've been thinking about this problem A LOT while building a project/team planning software startup. What do you think about having every engineer estimate one task for the upcoming sprint? Also, I've found that the estimation process (i.e., voting) in my experience is better if blinded; otherwise there will be extreme peer pressure to over- or under-estimate.
That pressure can be very real. Personally, I don't see a tremendous amount of value in overthinking estimates as much as re-evaluating estimates once the task starts.
The methodology I mentioned calls for exactly that, estimating best case, realistic and worst case...and then re-evaluating once the work starts.
You never really know until somebody dives into the work and it's rare to actually get time before an estimating meeting to individually look into each story. Treat the estimate as a best guess and then re-evaluate when you have better information.
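The best/realistic/worst-case approach mentioned above is usually combined with a weighted mean, as in the classic PERT formula. A minimal sketch, with the task numbers invented for illustration:

```python
def pert_estimate(best, realistic, worst):
    """PERT three-point estimate: the realistic case counts 4x."""
    return (best + 4 * realistic + worst) / 6

def pert_spread(best, worst):
    """Rough standard deviation of the estimate."""
    return (worst - best) / 6

# Hypothetical task: best case 2 days, realistic 5, worst 14.
expected = pert_estimate(2, 5, 14)  # 6.0 days
spread = pert_spread(2, 14)         # 2.0 days
print(f"{expected:.1f} +/- {spread:.1f} days")
```

The spread matters as much as the mean: a task with a wide best-to-worst range is exactly the kind you want to re-evaluate once the work starts.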
>Treat the estimate as a best guess and then re-evaluate when you have better information.
Any suggestions for how to effectively communicate a "best guess" and a "re-evaluation" to a business side that basically demands "hard deadlines" for projects?
The Software Estimation book mentioned in the article does a really good job of breaking it down.
Hard deadlines come with a lot of questions, like why? Is there a trade show? An investor meeting? A marketing campaign?
Is it really a hard deadline? Specifically, if the product isn't ready yet are you planning to launch no matter what because we have to?
If not, then it's not a hard deadline and deadline needs to be reframed around goals.
You'll never hear Apple announce a product until it's actually ready to ship.
Usually hard deadlines come from announcing something that doesn't exist yet. The outcomes are usually buggy, rushed systems or missed deadlines that make companies look bad.
If it's for a trade show, the question is closer to "what exactly do we need for this trade show?" because management will always say "everything!"
I always tell the story of a particularly bad client that I had years ago who insisted that he had to have a project completed 3 weeks ahead of when we told him it would be ready. My partner and I killed ourselves, gave up weekends, worked crazy hours to hit his deadline for him.
And we also monitored the parts of the application that he had us create. He didn't even look at them until 2 months later.
After that we learned to create "Rush Rates" for when requests like that come in, because if demanding something be done faster has no cost to the person asking for it, they are going to continue to ask for it.
At a company with salaried employees, management has to understand that rushes and scope creep come with tech debt and quality issues in order to hit "the deadline". Understanding things like "tech debt is not just a task you do later, it actually makes everything you do worse" is critical in that regard.
When you can get a manager to say "I don't care if quality suffers as long as we can show it on the floor at the show," then you know it's a real deadline. Ideally, point them to that book to help them understand the difference between estimates, commitments, and targets.
I like the idea of digging more into real business requirements, but I'd question one thing.
>You'll never hear Apple announce a product until it's actually ready to ship.
Apple is a consumer tech, flagship-product-focused company. B2B timelines, contracts, and deliverables work somewhat differently. Apple can always "miss" a deadline and just choose not to debut the new product yet (worst case). Many B2B contracts hinge on explicit future product deliverables, because there is a TON more power on the buy side when dealing with business clients as opposed to individual consumers. Do you have experience with these estimation issues in a B2B context, any flavor you can add there?
Please don't put out assumptions like "PM = Psychology + Economics" without an in-depth argument for it. I completely disagree with such statements, and without arguments I can't even attempt to see it your way.
I apologize for not responding to this sooner. I just missed it.
Project Management is about resource optimization, where for the case we're talking about those resources are people.
Optimizing those resources involves balancing the supply and demand for people's hours, knowledge, and skills. More experienced and highly capable people on your team will be in more demand, but the supply of their hours is always limited, which increases the cost of those hours to your timeline.
Psychology is the study of human behavior. Because the resources are people, a huge portion of your job in optimizing that timeline is optimizing the people. That means understanding motivations, weaknesses, burnout, focus, etc., as well as a number of other factors.
Nice. It's not the greatest thing in the world, tho. (Not the worst thing either, tho.) Here's a related conference paper (there does exist a staggering amount of project management science, most of it little applied, however):
Oh and, on a matter unrelated to this but related to Project Management, you might find some use for this very Russian tool & programming language, originally developed for the Buran space project (development continued after the project failed for unrelated reasons), and still to this day used in space flight software development:
This and the Joel on Software piece both have their heads in the right place, which is putting a lot more time into estimating by actually diving in and planning your tasks in more detail.
It's the trade-off of investing a lot more time in your estimating process in exchange for more accurate estimates. If you can afford to pull that time from somewhere in your organization, I'm sure it does produce more accurate estimates.
The value proposition boils down to how often you're going to do that. Are you going to spend 8 hours every sprint diving into estimates to try to make them more accurate? If you're making a decision between multiple projects, the time investment to decide which path to travel down makes more sense.
Allocating that time every 2 weeks on the path you're already travelling, though? Is the value still there in that situation, any more than re-evaluating the weight of a story when it's being worked?
This is where I steer clear of hours. Formulas like that also disregard things like turnover on teams, the learning curve between different team members, onboarding time, experience differences, etc. The time trade-off is rarely there.
Estimation is a skill. It's not something you can just do; you have to be taught it and then practice it.
The OA touched on this: for many companies and types of projects, the majority of development is doing things you do not understand and/or don't know how to do. Yet.
You don't figure those things out until long after someone wanted an estimate and a planned release date. It's not infrequent to have shipped v1 before you figure out how things work. Or needed to work.
The quote "no plan survives contact with the enemy" applies to softdev as "no design survives contact with production".
I've seen a lot of people suggest variations of this, but after 25 years, I still suck at it. Everybody I've ever worked with has sucked at it. Everybody I've ever even heard of has sucked at it. Maybe I've had two and a half decades of bad luck, but in 25 years I've encountered exactly two types of people: people who insist that "estimating is a skill and you have to develop your skill" but don't actually practice estimation, and people who suck at estimation.
If you only gauge yourself on point estimates on something, then constantly being wrong doesn't help you build the skill. (Where point estimate is boiling it down to a single number.)
Instead, try estimating several factors of it. Some as seemingly worthless as lines of code, new dependencies, old dependencies touched, use cases, etc. Odds are high that you can actually start estimating one of those with a bit better accuracy. Odds are also good that the one you can estimate well, happens to correlate well with time/cost.
So, find what you can estimate, and then calculate a rough translation of that to what you are being asked.
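One minimal way to do that translation is an ordinary least-squares fit from the factor you can estimate well to actual time. This is a sketch of the idea only; the historical factor/day pairs below are invented for illustration:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x, computed by hand."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

# Historical record: "use cases touched" estimate -> actual days taken.
factors = [1, 2, 3, 5, 8]
days    = [2, 3, 5, 8, 12]

a, b = fit_line(factors, days)
predicted = a + b * 4  # translate a new factor estimate of 4 into days
```

The point isn't the model (anything rough works); it's that you only need one factor you can estimate reliably, plus a little history, to get a defensible time number.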
I really like this, and I'm saving it for next time I need to help somebody with a PMP orientation understand my perspective.
But I think it misses the mark a bit on pairing. Having somebody shoulder-surf once in a while is not a bad idea, but it's definitely not pairing. Even if you're only doing it a little, I think it's worth setting up for true pairing, so that both people can easily jump in and work on the code as they're working together. Even as little as adding a second keyboard and mouse can make a big difference. People are just more engaged if they're participating.
I'm not against pairing more but there are some people who are incredibly uncomfortable with it.
For a short period of time, you don't need much more than a person having a suggestion and dictating if they want to jump in.
I'd have serious reservations about mandating too much of it.
There's a gap between, "I need to type in this environment for a few minutes" and "I need to become proficient with this environment so I can work in it just as well as my preferred."
In small doses, the proficiency and work environment preferences don't need to bend much (even to share the keyboard a little).
I get the thing with environments, but it got a lot easier for me when I'd just ask people to do the magic I didn't know in their tools. Especially if it's two keyboards and two mice, it's easy enough to say, "Ok, now jump back to the test" and have them seize control for a few seconds.
Having worked at a company where we did all pairing, all the time, I think it's absolutely "the right way" to do software development. After doing it enough, I think that most people will get over their dislike of it.
But you're right that it isn't for everyone. You kinda need to build your team around it and have your interview process select the kind of people that will enjoy it.
I really like the idea of pairing with someone for two hours after you finish up a ticket. I think that would be a great way to introduce the team I'm currently on to pairing.
I highly recommend this video of a Dave Thomas talk about what agile was supposed to be vs what it has become. Most of us are probably stuck working with some sort of Agile methodology regardless of whether or not we'd like to, but it can still be nudged in the direction that Dave talks about.
I think the big part of every movement is that initially it's done by people who like to do things, and who want the doing part to be more fun for people like them.
But the huge public success comes through people who lead huge businesses or countries, and who are faced with a set of mostly unmotivated, unskilled people who should do something for mostly unmotivated, unskilled customers/citizens. So for them it's not at all about making it more fun for people who love to do it and who have skills, but about making it simple, controllable, and, most importantly, look successful, even though they already know BEFORE they start that it won't be successful at all, due to the lack of motivation and skill in their general userbase.
So that "The values have been totally lost behind the implementation" is neither a coincidence, i.e. it doesn't just happen in software dev, nor a surprise, i.e. it was known before it became popular that it couldn't become popular and keep its values.
Sometimes when I watch/listen/read people explaining this as a surprising and undesired result, I wonder whether they really are so naive, or if they are looking for a way to profit from a younger audience who hasn't experienced such a complete popularity loop yet.
Every step in a software project is solving problems.
You don't know how long it will take to solve the problem.
You don't know how many new problems you might run into along the way.
I continue to be unshockingly shocked by accounts which describe project management as something rather low-skilled. How much skill does it require to take feature requests from customers and talk to devs once a week to ask them how long they think it'll take, to try to get product out the door quickly, akin to a restaurant shift manager? I really don't see it as taking very much skill at all. Take somebody with a bog-standard bachelor's degree (in any subject), ask them to complete a basic PM course at a community college, something that explains stuff like Agile/Scrum/Kanban etc., and let them loose. Of course, if you actually do that with a software project, you'll have a project trainwreck on your hands.
Good PMs are a lot more like the coach for a sports team. Ultimately, the team (and its coach) is judged by its ability to coalesce and prepare for execution by a deadline (i.e. a game date), with a KPI measuring the team's collective ability to deliver (i.e. whether the team is winning or losing).
Does the coach, in the weeks leading up to a game, collectively ask the team, "so how many points do you think you can deliver in the game?" Of course not. It's a nonsensical question. How about, in the context of a new kid in a junior league, "how fast can you run from here to there on the field?" Of course the new kid doesn't know, he's never done it before. Good coaches, rather, focus on the players. One player is a really fast runner, but needs help on scoring. Another player is great with scoring, but doesn't work with his teammates on the field. How can the coach a) take advantage of his players' strengths, b) work on their weak points, c) build a more balanced team that d) coalesces by game day and can execute different plays, reliably and on-the-fly, so that the team can go on to win?
So one developer really knows the database layer, but has a lot of difficulty with UI work. Another developer is more balanced, but they've only worked on one section of the code. Is the PM/coach cognizant of this? Has the coach assigned different tasks to different developers based on their strengths, or is every task seen as inherently "unknowable"? Can the coach take a task that would take developer A reliably two days to get done, and give it instead to developer B so that B can become familiar with a part of the system he hasn't worked with before, pairing occasionally with developer C who does know that area of the system, so that B can learn that area of the system and become a more well-rounded member of the team? Is doing so OK with the business (i.e. is the task part of the critical path or not)?
This kind of work is very high-skilled. A coach has to a) be intimately familiar with each developer and their work, essentially needing to review all the work being done (even if the coach's approval isn't required for delivery itself); b) take higher-level requests and split them up into tasks which correspond to the skillsets of individual team members and/or team balancing/development goals, understanding therefore that the subprocess of task formation is team-dependent and cannot occur in isolation, and do so independently if a dedicated architect is not available; c) internally use a Kanban board while still adopting sufficient slack such that unplanned work does not threaten time commitments made to a business that is unaware of how Kanban boards work, while keeping track of the commitment dates in the backlog so that tasks whose deadlines are imminent can be prioritized and put onto the board far enough in advance to meet the commitment, i.e. cycle time + slack < due date; and d) ensure that there exist enough cycles so that all items in the backlog can be fulfilled by their due date, or else work with the business to reschedule the due dates for existing tasks so that other, more important tasks can be placed into the backlog with earlier due dates.
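The "cycle time + slack < due date" rule above is simple enough to sketch directly. A toy illustration (the day numbers are invented, and days are counted as integers from an arbitrary start):

```python
def must_start_by(due_day, cycle_time, slack):
    """Latest day a task can enter the board and still be safe."""
    return due_day - (cycle_time + slack)

def feasible(today, due_day, cycle_time, slack):
    """Can this task still meet its commitment if pulled today?"""
    return today <= must_start_by(due_day, cycle_time, slack)

# Team's observed cycle time is 5 days, with 2 days of slack budgeted.
assert must_start_by(due_day=20, cycle_time=5, slack=2) == 13
assert feasible(today=10, due_day=20, cycle_time=5, slack=2)
assert not feasible(today=15, due_day=20, cycle_time=5, slack=2)
```

The coaching work is in keeping the cycle-time and slack numbers honest, which is exactly the observation-of-the-team skill the comment describes.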
Such a coach then has two KPIs, a) the stability and possible gradual improvement of his team's cycle time, thus measuring how well the coach is familiar with the capabilities of the team, how well the coach is developing and balancing the team, and how well the coach is splitting requests into tasks that are well-matched to the skillsets of the teammates they are assigned to, b) on-time delivery ratios, measured against the due dates of tasks when they enter a cycle (and not the original due date, so as not to penalize the coach for helping the business reprioritize older work against new requests).
If you want to split the duties of a tech lead and a PM, say because multiple tech leads are required for a large project, managed by a single PM, then the PM still needs to juggle various customer requests and split them into team-sized requirements for the tech lead to split into developer-sized tasks. That you've introduced a higher level of project abstraction over multiple teams does not change the fundamentals - PMs being intimately familiar with the output of individual teams to identify strengths and weaknesses, ensuring that teams are adequately staffed and qualified, monitoring the backlogs of teams so that the PM does not commit to work which the teams do not have the capacity to achieve.
Sadly most people who care about PM and team leading are of the first category, with zero skill in anything and only a basic set of buzz words from the PM world, which sadly doesn't even give them the opportunity to see how management skill can be a much bigger factor than process.
The biggest flaw in Agile stand-ups is that they were developed before Slack became a standard.
For our small team, we created a Team Scrum channel solely for the purpose of allowing devs to check in at the start and end of day with work progress. This reduced a ton of overhead.
Not to take focus away from that good idea (and it is a good idea), but the biggest flaw in "Agile" is that people think stand-ups are for reporting work progress/status. A stand-up is for getting clarification on what you are doing, arranging to get help, clearing up misconceptions between people, ensuring that the interfaces for different stories are meshing (and people aren't going off in wildly different directions), etc. This is why you have a scrum master running the stand-up and not a manager. This is why you especially do not have a project manager running the stand-up! It is a technical meeting.
Status should be communicated through artefacts. For example, if the software has shipped, then the story is done. If you are doing a spike, when the new stories have been written as a result of the spike, then the spike is done. The only time you should need more status than that is when there has been a problem and your story got set back. In that case the scrum master should just tell it to the (project) manager -- it's a 30 second conversation and does not require the presence of your entire development team.
In the (very likely) case that your (project) management is not happy with the frequency of status reports (because it takes too long to finish something), you need smaller stories. Getting your story size down to the point where you can complete about one per day will go a long way towards making your management happy. Of course, the downside is that you have to be organised, which, very ironically, most (project) managers really hate. They'd rather have stories that say "Deliver the product" with no other details and go back to their meeting (where they will undoubtedly discover new extremes of productivity, finding endless stacks of 3-word stories).
Especially for remote-first teams, standups are as much a social connection as an accountability, information-sharing, and blocker-highlighting opportunity. I'm a huge fan of synchronous standups and really like Zoom for the added connection of video and audio, with "async standup check-ins" being the exception case if someone is at the doctor's or whatever. All of this assumes a team with at least some timezone overlap.
Slack is great, but it doesn't completely replace the ability to see and talk to people once in a while!
Why conflate standup with a social checkin? Why not have a pre-lunch time to hang out and shoot the shit, instead of cramming it in as an ad hoc part of "the process" of standup?
Put your details, questions, concerns, etc in text, where anyone can read it any time they like, and make "social hour" something related to being a better teammate on an interpersonal level, instead of a mandatory part of the process.
> The moment that story estimating becomes time estimating...stop. That's not what it is there for and you're not trying to assign out every minute for a person.
I don't see the distinction here. The whole reason you're estimating things is to guess at how long they'll take. If the story points do not translate to time, what use are they?
What a ride. I really disagree with almost everything that's written there. So many things are even factually wrong, or with such a limited perspective that they are very likely missing the point. For instance, what he describes is not project management (limited timeframe, specific end result) but constant, ongoing product development. He describes a development management job (might also be called team lead), not a PM job.
His conclusion seems right to me, though. Team management doesn't need fancy processes. The overview of activity and backlog needs to be there and the manager has the responsibility to enable syncing between activities and to the outside world.
If one calms down after that conclusion, one realizes that managing a single team of developers is trivial. Maybe that's why people are not talking about it so much (in contrast to "nobody gets it")?
The art is in reprioritizing existing tasks when new ones come in (because usually something needs to get dropped), and additionally there's art in handling the meta level when the individual processors of a team are not just people but subteams.
I only briefly touched on the re-prioritization (I usually describe it as triage) with the brief scenario of a production issue or cross-team communication once a commitment has been made in scrum. Maybe in another post.
I fucked a consultancy once by accident with this. They hired two extra devs, and a manager griped at me that I wasn't being productive enough. I sketched the critical path of information flows required, and every morning grabbed a blocker card. Got a four-month project done in one; the consultancy was screwed because they had no backlog and were billing hourly. Oops.
Sigh. I wish PMP would already die. Or that MAYBE, JUST MAYBE, they'd fuse with PRINCE2. Both PMP and PRINCE2, on their own, suck so hard. Fused, it becomes a vastly different matter. It still has various drawbacks, but the issues become so much smaller.
> This worked well and was driven by his understanding that most of his effort needed to be focused on identifying and mitigating delivery risk. Most often the key risks lay in trying to estimate the level of effort required to develop key new functionality when all involved had limited prior experience in developing similar functionality. In those cases funding a short time bounded study to explore the technologies involved and figure out the art of the possible reaped huge rewards downstream.
If only all customers were that enlightened!