This worked well and was driven by his understanding that most of his effort needed to be focused on identifying and mitigating delivery risk. Most often the key risks lay in trying to estimate the level of effort required to develop key new functionality when all involved had limited prior experience developing similar functionality. In those cases, funding a short, time-boxed study to explore the technologies involved and figure out the art of the possible reaped huge rewards downstream.
If only all customers were that enlightened!
Once they catch on to that, it can be great.
One of the companies mentioned, Gerson Lehrman Group (GLG), literally sells the time of vetted experts.
Technical discovery is baked into traditional project management (e.g., civil engineering) in a way that continues to surprise me as a former IT project manager. There is a culture of investigation and research, such as materials research, proof-of-concept development, and needs-driven public-private academic research in other technical fields that IT project managers just don't get exposed to.
For instance, engineers planning a drainage project have a very good idea of how long a particular type of pipe will last before it needs to be replaced. They also have the tools/software/training needed to estimate how local stresses affect functionality and lifespan, and they're expected to use this information - even if the lifespan of the project exceeds their expected tenure.
IT PMs can't say the same thing about their infrastructure - who knows how long AWS or Azure will be around, for instance, or how long a local server blade or piece of network equipment from (x) company will actually last. When it comes to software development, the prevailing attitude is 'I want (x) by (y) for (z)', without having any justifiable basis for (x), (y), or (z).
I'm convinced IT project management could learn a lot from civil engineering project management.
The "problem solving" risk is much lower as a result.
Example: "architectural spikes" in eXtreme Programming: http://www.extremeprogramming.org/rules/spike.html
Instead of estimating by guessing based on reading the requirements, work on an issue for say an hour or so and then estimate it.
It still wouldn't be perfect, but I think this would catch a lot of cases where, for example, a seemingly simple feature ends up requiring an extensive model change.
It would also do a much better job of shaking out cases where the requirements aren't as clear as you thought when you start trying to code it.
On projects that are breaking new ground it is even more important to follow XP practices: getting frequent feedback from customers, taking the simplest solution that could possibly work to avoid overengineering before you actually understand the problem, and embracing change, because your original understanding of the problem probably isn't right.
There also is plenty of risk involved in cookie-cutter commodity work. Those kinds of projects fail all the time.
... because they use snake oil methodology like XP...
Projects vary at the low end from 2 people, 2 months up to 5-7 people for a year or longer. We do smaller stuff like prototypes also, usually as a first step in a longer series of releases. We are local in SF and frequently work in client offices. Budgets are between $100K - $1M.
* Getting the project done well and on time.
* Evaluating the performance of individual programmers.
This is probably a natural thing to do because one measure of "good programmer" is "gets projects done well and on time", but I think directly mixing those two goals together interferes with both.
If you have any experience, you've certainly been put in a situation where the project went poorly through no fault of your own. I shipped "Superman Returns: The Videogame". Every developer knew the game was going to suck. Conversely, you've probably known (or been) someone who coasted through a successful project without contributing much of value.
So that argues that project outcome is a poor individual performance measure.
Worse, when people know project outcome is tied to their performance review, it puts perverse incentives into play. It encourages even well-meaning people to over-estimate tasks, bury technical debt where it won't be seen, point fingers when projects go poorly, choose conservative projects to work on, etc.
Perhaps a way out is for the organization to very clearly and explicitly separate these two concepts. Repeatedly acknowledge that you can do good work on projects that go poorly and vice versa. Do not take project outcome into account in performance reviews.
That still leaves the question of how do you evaluate performance then? Peer review is one approach — everyone on the team probably knows who does and doesn't carry their weight. Unfortunately, this can lead to nasty biases. People whose gender/culture/whatever lines up with the majority of the team are likely to get rated higher because we like people similar to ourselves. People that are more outgoing, outspoken, or charismatic will get unfairly better reviews — a particular problem in programming where some of the best people lack those attributes. It might trigger nasty competitive behavior with cabals forming and other stuff like that.
I don't know if those problems are worse than the status quo of using project outcome for performance evaluation too. It's difficult.
If you have a poor individual contributor, they are almost always fired. If you have a poor manager, oftentimes they have worked a relationship with someone that has power that makes it very difficult to fire them, so they're oftentimes "promoted." This kind of nepotism and cronyism exists to some extent among individual contributors but ICs usually disperse more across business verticals throughout their careers while managers and others in primarily soft skill roles tend to be extremely connected (whether through intrinsic personality or due to the nature of their jobs is another question) and seem to act more as cohorts rather than as individuals.
Programming is hard, too, and good code is a much subtler thing that requires an expert to give you any sense of.
What an odd thing to say - ask any accountant or banker and they’ll say they’ve had methods for this for literally hundreds of years. Ever since “share price” was a thing, and even before that.
The problem is doing something with the information. Look at IBM for example (one of many). You can see simultaneously that managers are doing a terrible job - zero revenue growth for years - but also see that they are lavishly rewarding themselves for it.
I've had a successful career (delivering what I consider good work) on some real stinker projects. Usually, though, it's only been when things are framed in the above way.
As far as I can tell "agile" mostly is about predictability, not about efficiency. Most of the places I've worked, when we switched to agile, the efficiency of the team got worse, but management was generally happy because they felt like they had much more control.
I took an organization that couldn't put out two releases a year to one that put out once a month, and the owner/manager complained bitterly and pushed new features into the discussion ON THE DAY OF RELEASE.
2nd level management had to constantly, repeatedly attest to the good that the tempo and structure brought to the organization and customers.
My entire experience has been: good management + talented developers = success, and bad management or untalented developers and it fails, but it really has nothing to do with the methodology in play. I've seen (and been part of) hard core agile teams burning to the ground, and teams with no methodology be wildly successful. (And vice versa)
Agile bothers me because it generally aims to commoditize developers, and yet the best places and teams I've worked on were always the opposite: it was about finding the best way for each individual to work, and designing processes around the rhythms of whatever industry they were working in, rather than assuming everything should fit into 2-week sprints.
The best places to work adapt the process to the team.
The worst enforce a process on the team because it makes managing them easier.
The number of workplaces I've been in who equate the number of hours that people are in the office with their productivity, because managers could measure attendance easily, and had no idea how to measure actual productivity...
Still, a lack of a strict methodology doesn't necessarily make it waterfall.
Then, that whole steaming pile was thrown over the fence to QA which tried to figure out heads from tails and developers started fixing the issues they already knew about but had to leave in so that they would make their deadline.
THEN came user acceptance testing, where people actually looked at the product for the first time. This is when everyone realized what miscommunications had happened 6-12 months earlier, and the change request process started.
Finally, over budget and after everyone was finally exhausted and tired of fighting, it was released to users with promises of changes after the release.
From this world, SCRUM and the manifesto make sense.
Within those projects, waterfall was completely removed from the numerous tasks needed to get the project done so the developers never saw it.
Getting waterfall involved in the day-to-day, project-level tasks has a lot of issues, though. It can run a lot smoother if your team has a low expectation of pivoting due to new projects or production support needs.
It’s not without possibilities, but a lot depends on how far removed it is from day to day.
Modern waterfall where you have documentation in git with references into tests is 10x better than the waterfall of old.
Two things that need to be succinctly said:
1. Project Management is just another skill, like database management, security, or any one of a hundred other skills a tech team might have. PMs keep trying to take themselves out of the trenches and claim a special place. Every time they do that it is a mistake.
2. He touches on "waterfall" a few times. Physical things decompose into smaller physical things. That's why bridge-building is the way it is. Creative technical work is not a physical thing. It is infinitely decomposable. You take a job creating new tech and split it into 10 pieces, you've got the same job with just a lot more overhead. You can go on forever. Very quickly this natural tendency for decomposition leads to doomed projects.
Intuition will lead you astray in tech project management. Every time.
Way too many times I have been at places where the Project Manager is seen as the "Boss" or the gatekeeper, both by management and by himself.
Engineers usually disrespect the project manager, feeling that he doesn't really understand anything and just does stuff based on an Excel sheet.
Whatever methodology you use, delivering usable increments at regular intervals is the way to go. You can see progress, determine if the direction is wrong and change course rapidly.
Software is usually reproducing human processes. Well, guess what: processes change when they don't make sense in a given situation. So it's logical that the software changes over time as we discover more efficient ways of doing things. As such, you cannot plan for the unforeseeable, and even less estimate how long it will take.
I therefore agree that optimizing the throughput of a team is the one thing to do. Never commit to a delivery date if you can avoid it. Instead, commit to delivering usable increments regularly. With that, project management becomes expectation and change management: lots of communication with the client on showcasing (and hopefully using) the increment, and gathering feedback for the next iteration.
This is really hard to do. Devs have a tendency to stay cooped up in their ivory tower, and product (read: the project manager as described in the article) has the reflex to protect them from client distractions. The client may not be available or interested either. Nurturing dedicated time slots for the client and devs to meet and interact is crucial. If you get that right, and never miss delivering an increment, good things are sure to come out of the project.
Fortunately, with CI and the Internet, we have all the tools to deliver usable increments at regular (even blazing fast) intervals. Power to the people who can leverage those to deliver successful software projects to their customers.
The agile approach is the way to manage projects, but how do you quote them?
I can't help but notice, also, that no project management "methodology" emphasizes (or even allows for) time spent researching/learning existing libraries, algorithms, systems, permissions, even though I think everybody would agree that this is where you're going to get the biggest payoff.
Every PM methodology talks about getting to know the problem domain and about risk minimization (start with the riskiest assumption, do a minmax check on it, then pivot to the next riskiest, and so on).
Selecting the right tool for the job is an inherent part of the process.
When Jenny Greene and I were working on our book, "Learning Agile: Understanding Scrum, XP, Lean, and Kanban," we were lucky enough to have David Anderson (the guy who adapted kanban for software development) review our chapter on Kanban, and he really helped us nail down this particular issue.
Other than that (and a few things he says about the PMP certification), what he says in the piece is spot on. Nice work -- I'd love to see it expanded into a book!
That could be said for any kind of project in any industry. I speak as a PMP. The only reason PMP is useful to me is to convince employers that I know how to manage projects. It has nothing to do with reality, and so it is unfortunate that employers view it as a positive signal. It's not an effective filter because you still need to fully assess potential employee candidates in order to avoid false positives.
Fulltime pairing has dramatically helped me with all of these.
Many folks feel that it sounds bad. The thing is that it's not really a skill that's easily learned from a book or a blog post. It's much better to learn from an experienced pair.
Disclosure: I work for Pivotal, so that's a large part of our whole thang.
It was common to spend a week on planning for every six man-months of project.
Using that as a ballpark, we never got an estimate with more than 10% error.
Anecdotal, I know, but without that kind of effort one is not making an estimate, one is making a guess.
That's also why I don't like the agile approach of piling up efforts and then using actual time to tune speed, hoping the inaccuracies will level off against one another. It just feels like a sequence of dice rolls, hoping you don't end up at the far tail of the gaussian.
That's exactly, precisely what it is, by design AFAICT. The thing about gaussians is, what with the central limit theorem and all, you provably don't end up on their tail very often.
That said, gathering a bunch more data can make your gaussian even narrower. I think you and (platonically ideal) Agile are just at different spots on the same spectrum of risk tolerance.
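The central-limit argument above is easy to check with a throwaway simulation. The task-duration distribution here (uniform, 1 to 5 days) is an arbitrary assumption picked for illustration, but any set of independent durations with finite variance behaves the same way:

```python
import random
import statistics

def relative_spread(n_tasks, trials=10_000):
    """Stddev/mean of total project duration when each task's actual
    duration is an independent uniform draw between 1 and 5 days."""
    totals = [sum(random.uniform(1, 5) for _ in range(n_tasks))
              for _ in range(trials)]
    return statistics.stdev(totals) / statistics.mean(totals)

# Per the central limit theorem, the total's relative uncertainty
# shrinks roughly as 1/sqrt(n_tasks):
one_task = relative_spread(1)        # ~0.39: a single task is a dice roll
hundred_tasks = relative_spread(100)  # ~0.04: the sum is far narrower
```

In other words, individual estimates can each be wildly off while the project total stays fairly predictable, provided the errors are independent (correlated errors, e.g. systematic optimism, defeat this).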
You can skip low level estimates, but the consequence of that is that high level timelines are uninformed by them. That’s how you end up in the archetypal situation where all the engineers know a project will slip while their managers report it’s right on track.
Can you explain what you perceive to be the problem with accessing something like StackOverflow to help solve a persistent problem or even to simply jog one's memory?
I find that this fear of being shamed for using the obvious resources at our fingertips to be one of the things that makes people less likely to want to pair. IMO there is nothing wrong with consulting books, websites, other people's ideas/opinions, or other knowledge repositories to help make good decisions or solve problems elegantly.
It was more of a reflection of being unwilling to ask team members for help and just depending on Google/SO for answers if you're stuck. I'll re-evaluate the wording there and get it corrected.
Teams work best together and to work together egos need to be checked at the door and people need to trust each other. Any problem on a team is the whole team's problem, and while one person may be accountable for introducing bugs, a good team will view that team mate as valued and capable and perhaps pitch in to mentor or help fix the problem quickly.
The Five Dysfunctions of a Team obviously points out a lot of what I am talking about much more elegantly than I can. What I am saying is that project management can only go so far, team leadership, culture, and recruitment is also hugely important to fix many of the problems you identified in the article.
Great article though, I shared it with my work-mates :)
For someone in the grip of impostor syndrome, it is admitting that you are indeed just faking it and don't know enough to 'deserve' your position.
Now it is probably not true, but that does not matter for how people react to it...
It doesn't provide the high-level estimates that management asks for, doesn't do budget planning for you, etc.
How do you approach these topics with Kanban?
Complexity and high variability of estimates is the real problem. You can address that some by training developers on how to estimate, which I’ve heard called calibration, and then spend a lot more time digging in up front on those estimates.
Long term, a ball park can give a good idea but most aren’t going to be more accurate until we dig in. If you are choosing between projects based on expected ROI then diving in ahead of schedule to get a clearer picture makes sense.
If you are delivering a specific product no matter what, then you optimize for throughput.
It’s more valuable to tune the engine so you can drive 70mph than to slow down to 20mph with the goal of accurately predicting your speed.
>If you are delivering a specific product no matter what, then you optimize for throughput.
I really like this, thanks for the response!
IMO that means that the estimates he's talking about being accurate are fairly short term.
Totally agree with him on a lot, especially that only the developer working the task can truly estimate it. When you're talking about time measures, that's the only hope of them being accurate.
When I talked about story points reflecting complexity and not time, it was specifically because time is variable by person. If the team isn't creating a time estimate on a task, that's much less of an issue.
" Those principles actually do work extremely well in many, many industries. PMP focuses almost entirely on Waterfall methodology and gaining the certification is largely driven by terminology and math.
But software is a special snowflake in the project management world and this is why countless books around managing software continue to be written."
I highly recommend you read the book I asked about in a sister comment. I wish the business people would read it. The issue I think is it correlates and requires an understanding of other "techie" and STEM subjects like queues, math (related to queues), network protocols, and traffic flow. Until the business people understand the maths and tech topics that book appeals to, friction will continue to exist.
Also, how do you estimate? Individual or group? Does everyone on the team possess the skills to estimate every story?
You don't estimate in kanban. You'll generally try to break things down into small stories, but the goal is to always be working on the most important thing and focus on improving how quickly work goes from idea to done.
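Since kanban drops estimates in favor of flow, the numbers you do track are tied together by Little's Law (cycle time = WIP / throughput). A minimal sketch, with all figures invented for illustration:

```python
# Little's Law: average cycle time = average WIP / average throughput.
# All numbers here are hypothetical.
wip = 12                   # stories currently in progress on the board
throughput_per_week = 4.0  # stories finished per week, on average

cycle_time_weeks = wip / throughput_per_week  # idea -> done, per story

# The core kanban lever is a WIP limit: halving WIP at the same
# throughput halves the expected cycle time.
limited_cycle_time = (wip / 2) / throughput_per_week
```

This is why a kanban team can answer "when will this be done?" with a measured cycle-time distribution instead of a per-story estimate.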
The methodology I mentioned calls for exactly that, estimating best case, realistic and worst case...and then re-evaluating once the work starts.
You never really know until somebody dives into the work and it's rare to actually get time before an estimating meeting to individually look into each story. Treat the estimate as a best guess and then re-evaluate when you have better information.
Any suggestions for how to effectively communicate a "best guess" and a "re-evaluation" to a business side that basically demands "hard deadlines" for projects?
Check out his "4) Pile of wood blocks"
My original answer though ->
Forward along this article?
The Software Estimation book mentioned in the article does a really good job of breaking it down.
Hard deadlines come with a lot of questions, like why? Is there a trade show? An investor meeting? A marketing campaign?
Is it really a hard deadline? Specifically, if the product isn't ready yet are you planning to launch no matter what because we have to?
If not, then it's not a hard deadline and deadline needs to be reframed around goals.
You'll never hear Apple announce a product until it's actually ready to ship.
Usually hard deadlines come from announcing something that doesn't exist yet. The outcomes are usually buggy, rushed systems or missed deadlines that make companies look bad.
If it's for a trade show, the question is closer to "what exactly do we need for this trade show?" because management will always say "everything!"
I always tell the story of a particularly bad client that I had years ago who insisted that he had to have a project completed 3 weeks ahead of when we told him it would be ready. My partner and I killed ourselves, gave up weekends, worked crazy hours to hit his deadline for him.
And we also monitored the parts of the application that he had us create. He didn't even look at them until 2 months later.
After that we learned to charge "Rush Rates" when requests like that come in, because if demanding something be done faster has no cost to the person asking for it, they are going to continue to ask for it.
At a company with salaried employees, management has to understand that rushes and scope creep come with tech debt and quality issues in order to hit "the deadline". Understanding things like "tech debt is not just a task you do later, it actually makes everything you do worse" are critical in that regard.
When you can get a manager to say "I don't care if quality suffers as long as we can show it on the floor at the show," then you know it's a deadline. Ideally, point them to that book to help them understand the difference between estimates, commitments and targets.
>You'll never hear Apple announce a product until it's actually ready to ship.
Apple is a consumer tech, flagship-product-focused company. B2B timelines, contracts, and deliverables work somewhat differently. Apple can always "miss" a deadline and just choose not to debut the new product yet (worst case). Many B2B contracts hinge on explicit future product deliverables, because there is a TON more power on the buy side when dealing with business clients as opposed to individual consumers. Do you have experience with these estimation issues in a B2B context, any flavor you can add there?
Project Management is about resource optimization, where for the case we're talking about those resources are people.
Optimizing those resources involves balancing the supply and demand for people's hours, knowledge and skills. More experienced and highly capable people on your team will be in more demand, despite the always limited supply of their hours which increases the cost of those hours to your timeline.
Psychology is the study of human behavior. Because the resources are people, a huge portion of your job in optimizing that timeline is optimizing the people. That means understanding motivations, weaknesses, burn out, focus, etc as well as a number of other factors.
Oh and, on an unrelated to this but related to Project Management matter, you might find some use for this very Russian tool & programming language, originally developed for the Buran space project (development continued after the project failed for unrelated reasons), and still to this day used in space flight software development:
This and the Joel on Software piece both have their heads in the right place, which is putting a lot more time into estimating by actually diving in and planning your tasks in more detail.
It's the trade off of investing a lot more time in your estimating process to say that you had more accurate estimates. If you can afford to pull that time from somewhere in your organization, I'm sure it does produce more accurate estimates.
The value proposition boils down to how often you're going to do that. Are you going to spend 8 hours every sprint diving into estimates to try to make them more accurate? If you're making a decision between multiple projects, that time investment, to decide which path to travel down, makes more sense.
Allocating the time every 2 weeks on the path you're already travelling, though? Is the value still there in that situation, any more than re-evaluating the weight of a story while it's being worked on?
This is where I steer clear of hours. Formulas like that also disregard things like turnover on teams, the learning curve between different team members, onboarding time, experience differences, etc. The time trade-off is rarely there.
But I think it misses the mark a bit on pairing. Having somebody shoulder-surf once in a while is not a bad idea, but it's definitely not pairing. Even if you're only doing it a little, I think it's worth setting up for true pairing, so that both people can easily jump in and work on the code as they're working together. Even as little as adding a second keyboard and mouse can make a big difference. People are just more engaged if they're participating.
For a short period of time, you don't need much more than a person having a suggestion and dictating if they want to jump in.
I'd have serious reservations about mandating too much of it.
There's a gap between, "I need to type in this environment for a few minutes" and "I need to become proficient with this environment so I can work in it just as well as my preferred."
In small doses, the proficiency and work environment preferences don't need to bend much (even to share the keyboard a little).
I get the thing with environments, but it got a lot easier for me when I'd just ask people to do the magic I didn't know in their tools. Especially if it's two keyboards and two mice, it's easy enough to say, "Ok, now jump back to the test" and have them seize control for a few seconds.
But you're right that it isn't for everyone. You kinda need to build your team around it and have your interview process select the kind of people that will enjoy it.
I really like the idea of pairing with someone for two hours after you finish up a ticket. I think that would be a great way to introduce the team I'm currently on to pairing.
OA touched on this: for many companies and types of projects, the majority of development is doing things you do not understand and/or don't know how to do. Yet.
You don't figure those things out until long after someone wanted an estimate and to plan a release date. It's not infrequent to have shipped v1 before you figure out how things work. Or needed to work.
The quote "no plan survives contact with the enemy" applies to software development as "no design survives contact with production".
I've seen a lot of people suggest variations of this, but after 25 years, I still suck at it. Everybody I've ever worked with has sucked at it. Everybody I've ever even heard of has sucked at it. Maybe I've had a two-and-a-half decade of bad luck, but in 25 years I've encountered exactly two types of people: people who insist that "estimating is a skill and you have to develop your skill" but don't actually practice estimation, and people who suck at estimation.
Instead, try estimating several factors of it. Some as seemingly worthless as lines of code, new dependencies, old dependencies touched, use cases, etc. Odds are high that you can actually start estimating one of those with a bit better accuracy. Odds are also good that the one you can estimate well, happens to correlate well with time/cost.
So, find what you can estimate, and then calculate a rough translation of that to what you are being asked.
That make sense?
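The proxy-translation idea above can be sketched as a least-squares fit against finished tasks. The "files touched" proxy and the history data below are invented for illustration; the point is only the mechanism of fitting the factor you can estimate against the one you're asked for:

```python
# Hypothetical history: (files touched, actual hours) for finished tasks.
history = [(2, 5.0), (4, 11.0), (6, 14.0), (8, 21.0), (10, 24.0)]

# Least-squares fit through the origin: hours ~= slope * files_touched.
slope = (sum(f * h for f, h in history)
         / sum(f * f for f, _ in history))

def rough_hours(files_touched):
    """Translate the factor you *can* estimate into the one you're asked for."""
    return slope * files_touched

estimate = rough_hours(5)  # roughly 12 hours for this data
```

The same shape works for any countable proxy (dependencies touched, use cases, etc.); the fit also tells you, via its residuals, whether that proxy actually correlates with time for your team.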
But the huge public successes come through people who lead huge businesses or countries, and who are faced with a set of mostly unmotivated, unskilled people who should do something for mostly unmotivated, unskilled customers/citizens. So for them it's not at all about making it more fun for people who love the work and have the skills, but about making it simple, controllable, and most importantly look successful, even though they already know BEFORE they start that it won't be successful at all, due to the lack of motivation and skill in their general userbase.
So that "The values have been totally lost behind the implementation" is neither a coincidence, i.e. it doesn't just happen in software dev, nor a surprise, i.e. it was known before it became popular that it couldn't become popular and keep its values.
Sometimes when I watch/listen/read people explaining this as a surprising and undesired result, I wonder whether they really are so naive, or if they are looking for a way to profit from a younger audience who hasn't experienced such a complete popularity loop yet.
Good PMs are a lot more like the coach for a sports team. Ultimately, the team (and its coach) is judged by its ability to coalesce and prepare for execution by a deadline (i.e. a game date), with a KPI measuring the team's collective ability to deliver (i.e. whether the team is winning or losing).
Does the coach, in the weeks leading up to a game, collectively ask the team, "so how many points do you think you can deliver in the game?" Of course not. It's a nonsensical question. How about, in the context of a new kid in a junior league, "how fast can you run from here to there on the field?" Of course the new kid doesn't know, he's never done it before. Good coaches, rather, focus on the players. One player is a really fast runner, but needs help on scoring. Another player is great with scoring, but doesn't work with his teammates on the field. How can the coach a) take advantage of his players' strengths, b) work on their weak points, c) build a more balanced team that d) coalesces by game day and can execute different plays, reliably and on-the-fly, so that the team can go on to win?
So one developer really knows the database layer, but has a lot of difficulty with UI work. Another developer is more balanced, but they've only worked on one section of the code. Is the PM/coach cognizant of this? Has the coach assigned different tasks to different developers based on their strengths, or is every task seen as inherently "unknowable"? Can the coach take a task that would take developer A reliably two days to get done, and give it instead to developer B so that B can become familiar with a part of the system he hasn't worked with before, pairing occasionally with developer C who does know that area of the system, so that B can learn that area of the system and become a more well-rounded member of the team? Is doing so OK with the business (i.e. is the task part of the critical path or not)?
This kind of work is very high-skilled. A coach has to a) be intimately familiar with each developer and their work, essentially needing to review all the work being done (even if the coach's approval isn't required for delivery itself); b) take higher-level requests and split them up into tasks which correspond to the skillsets of individual team members and/or team balancing/development goals, understanding therefore that the subprocess of task formation is team-dependent and cannot occur in isolation, and do so independently if a dedicated architect is not available; c) internally use a Kanban board (the business being unaware of how Kanban boards work) while keeping enough slack that unplanned work does not threaten time commitments made to the business, tracking the commitment dates in the backlog so that tasks whose deadlines are imminent are prioritized and put onto the board far enough in advance to meet the commitment, i.e. cycle time + slack < due date; and d) ensure that there exist enough cycles so that all items in the backlog can be fulfilled by their due dates, or else work with the business to reschedule the due dates for existing tasks so that other, more important tasks can be placed into the backlog with earlier due dates.
Such a coach then has two KPIs: a) the stability (and possible gradual improvement) of the team's cycle time, which measures how well the coach knows the team's capabilities, how well the coach is developing and balancing the team, and how well the coach splits requests into tasks well-matched to the skillsets of the teammates they are assigned to; and b) on-time delivery ratios, measured against the due dates of tasks when they enter a cycle (not the original due dates, so as not to penalize the coach for helping the business reprioritize older work against new requests).
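The scheduling rule described above (a task must enter the board early enough that cycle time plus slack fits before its due date) can be sketched roughly as follows. All names, dates, and durations here are made up for illustration:

```python
from datetime import date, timedelta

# Assumed averages for this hypothetical team.
CYCLE_TIME = timedelta(days=5)   # typical time from board entry to done
SLACK = timedelta(days=2)        # buffer for unplanned work

def must_start_by(due: date) -> date:
    """Latest date a task can enter the board and still meet its due date."""
    return due - CYCLE_TIME - SLACK

def ready_for_board(due: date, today: date) -> bool:
    """True once a backlog task's deadline is imminent enough to pull it in."""
    return today >= must_start_by(due)

today = date(2024, 3, 1)
backlog = {"invoice-export": date(2024, 3, 6), "sso-login": date(2024, 4, 1)}
pull_now = [task for task, due in backlog.items() if ready_for_board(due, today)]
# Only "invoice-export" needs to enter the board today.
```

This is just the mechanical part; the judgment calls (splitting requests, knowing the team) are what the comment argues make the work skilled.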
Sadly, most people who care about PM and team leading are of the first category, with zero skill in anything beyond a basic set of buzzwords from the PM world, which never even gives them the opportunity to see how management skill can be a much bigger factor than process.
For our small team, we created a Team Scrum channel solely for the purpose of allowing devs to check in at the start and end of day with work progress. This reduced a ton of overhead.
Status should be communicated through artefacts. For example, if the software has shipped, then the story is done. If you are doing a spike, when the new stories have been written as a result of the spike, then the spike is done. The only time you should need more status than that is when there has been a problem and your story got set back. In that case the scrum master should just tell it to the (project) manager -- it's a 30 second conversation and does not require the presence of your entire development team.
In the (very likely) case that your (project) management is not happy with the frequency of status reports (because it takes too long to finish something), then you need smaller stories. Getting your story size down to the point where you can complete about 1 per day will go a long way towards making your management happy. Of course the downside is that you have to be organised -- which, very ironically, most (project) managers really hate. They'd rather have stories that say "Deliver the product" with no other details and go back to their meeting (where they will undoubtedly discover new extremes of productivity, finding endless stacks of 3-word stories).
Businesses instantly turn the methodology into a device for exerting power.
What you are describing only exists in books, in manifestos, and perhaps in 5% of businesses.
The rest bastardise and devour the methodology to their liking.
Slack is great, but it doesn't completely replace the ability to see and talk to people once in a while!
Put your details, questions, concerns, etc in text, where anyone can read it any time they like, and make "social hour" something related to being a better teammate on an interpersonal level, instead of a mandatory part of the process.
Every step in a software project is solving problems.
You don't know how long it will take to solve the problem.
You don't know how many new problems you might run into along the way.
This, a thousand times over! Why is this happening?
I don't see the distinction here. The whole reason you're estimating things is to guess at how long they'll take. If the story points do not translate to time, what use are they?
Is the task description ambiguous? Then estimates will be all over the place, so you'll know you need to spend more time specifying the task.
They're also for risk minimization: if the estimated complexity is above some threshold, you need to break the task down.
To get a time estimate, you use the historical burn-down rate (average story points completed/delivered per sprint).
But it's not something you can do up front.
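The points-to-time conversion described above is simple arithmetic; a minimal sketch, with the sprint history, backlog size, and sprint length all invented for illustration:

```python
# Historical velocity: average story points delivered per sprint.
completed_per_sprint = [21, 18, 24, 19]
velocity = sum(completed_per_sprint) / len(completed_per_sprint)  # 20.5 pts/sprint

# Remaining estimated story points in the backlog.
backlog_points = 82
sprints_needed = backlog_points / velocity  # 4.0 sprints

# Convert sprints to calendar time (assuming two-week sprints).
SPRINT_LENGTH_DAYS = 10
estimate_days = sprints_needed * SPRINT_LENGTH_DAYS  # 40.0 working days
```

The point of the comment stands: the time estimate only falls out after you have a few sprints of history, not up front.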
His conclusion seems right to me, though. Team management doesn't need fancy processes. The overview of activity and backlog needs to be there and the manager has the responsibility to enable syncing between activities and to the outside world.
If one calms down after that conclusion, one realizes that managing a single team of developers is trivial. Maybe that's why people are not talking about it so much (in contrast to "nobody gets it")?
The art is in reprioritizing existing tasks when new ones come in (because usually something needs to get dropped), and additionally there's art in handling the meta level when single processors of a team are not just people but subteams.
I only briefly touched on the re-prioritization (I usually describe it as triage) with the brief scenario of a production issue or cross-team communication once a commitment has been made in scrum. Maybe in another post.