Your developers aren’t slow (sprint.ly)
176 points by davidkellis on Nov 20, 2014 | 75 comments

Nice start, OP (especially with unclear specs, my personal pet peeve), but you've only hit upon the tip of the iceberg.

Just a few other things I call "work prevention" that make good developers appear to be slow:

  - unclear specs
  - context switching
  - peer review (both giving & receiving)
  - code review
  - status meetings
  - waiting for others to show up at meetings
  - others not prepared for meetings (no agenda, etc.)
  - unnecessary but enforced standards (stupid, pointless)
  - programs locked by other devs (get in line)
  - pre-existing technical debt (must be fixed before amended)
  - framework can't handle it
  - works but won't scale
  - can't recreate bug
  - platform issue (server, DBMS, etc.)
  - environmental issues (test server out of date)
  - waiting for server resources (testing)
  - waiting for regression tester
  - waiting for user acceptance testing
  - specs changing during user acceptance testing
  - uncontrolled deployment "freezes" (month-end, release schedule)
  - noisy office
  - meetings when emails would have sufficed
  - phone calls when emails would have sufficed
  - text messages when emails would have sufficed
  - continually changing priorities
  - delays in offshore communications
  - lack of training (technology)
  - lack of training (business subject matter)
  - too much training (HR bullshit)  
  - not enough donuts

- reinventing the wheel because a perfectly serviceable existing solution cannot be licensed

I can't stand hearing that there's no room in the budget to pay a flat $500 to license a 3rd-party grid control (for example), only to have a $90k/yr salaried developer spend 20 or more hours recreating barely a tenth of its features. It's even worse when the license is free (as in speech) but the code still can't be used, because it hasn't been blessed by the security department, or because management simply has a prejudice against the terms of the LGPL.

On the flip side, as one of those guys in the security department, it aggravates me to no end when developers can't understand why we won't allow them to open up direct production database access to some unheard-of SaaS startup that's been in existence for a total of a month.

While you present a reasonable example of why certain things have to be cleared before implementing them, most of the time you are used as an excuse to issue a knee-jerk denial of some suggestion, without even bothering to consult with you first. And people at other companies are not necessarily as competent or diligent as you are about doing your job.

Usually, the argument goes like this:

- We cannot use X until it has been cleared by legal and/or security.
- Legal and/or security do not have the resources available to clear X for our use within a time span that would allow X to be useful.
- As a result, we won't even ask them to try.
- Therefore, screw you, nerd.

Alternately:

- We don't have room in the budget to license X.
- The budget won't change until after the end of the project.
- We can't justify altering the budget to accommodate licensing X unless it would be useful for an ongoing project.
- You won't be allowed to comment on upcoming projects until after the budget is fixed.
- Therefore, screw you, nerd.

Or in the worst case:

- Here is a detailed proposal to use invented-elsewhere thing X in our product Y, with gobs of research and references, and oodles of possible pros and cons.
- We don't understand what you are talking about.
- So we're going to fixate on some tiny, mostly-irrelevant detail, and then blather about it until you go away.
- Therefore, screw you, nerd.

I'll give them a pass if they end up using an older version of a product that got relicensed from LGPL to AGPL so that the developer could change it from a free (speech + beer) version into a free (speech) one.

Or management that continues to blindly believe that everything free on the internet is absolutely worthless in terms of quality and support, and forces you to clone already-existing freeware tools. -_-

- underpowered hardware

At my previous job I requested an upgrade from a 3 year-old laptop. Initially they approved it, but before ordering one they reversed and denied my request. The justification was they'd spent too much outfitting new hires and it was no longer in the budget, despite my request coming long before the interview process started for the new hires. To make it worse, the new hires were entry level and I had the second longest tenure at the company (~90 people).

Now I work somewhere that happily provides every technical employee with the latest maxed out hardware. The difference in amount of work accomplished is staggering.

I'm doing development work on a 5 year old laptop with 4GB of RAM. I constantly have performance issues, so I put in a request to bring me up to 8GB of RAM at a cost of $73. My manager says "is it really worth it if you're getting a new machine in four months?"

Well, I guess that just depends on how productive you expect me to be over the next quarter.

Same for me: my dev PC has a 6-year-old Athlon X64 with 4GB RAM, which forces me to keep a second PC just to run Lotus on it! Six months ago, my manager asked to upgrade my workstation to the standard build provided by the IT dept, with some modifications in order to lessen the cost (the standard config costs around $2k, while his was $700). The reply was: "Your config costs too much; here is a $400 config with a Core i3." He then said: "Fuck it, we'll take the $2,000 standard dev build then."

In reply to the top post, I would add the following task delayer: supplier unreliability.

We ordered a server from a major manufacturer, and it arrived with a faulty PCI bus (well, hardware defects happen). They took 6 months and 5 different motherboards to ship us the exact configuration we initially paid for!

Can someone with more business knowledge than me explain this?

Like... you'll pay a developer $100 an hour, but then you ask them to spend an hour in a meeting to justify the purchase of a $50 widget that will save them an hour a day for the next year.

There must be some twisted logic at work, but I don't see it.

Makes management feel useful. They can tick off some boxes, and feel like they contributed to the process somehow.

I am sure I would be way more effective without my managers. Prioritization would be somewhat different, but many of the day to day problems that hinder me being properly productive would have gone by now.

I've been in situations where it basically boils down to hierarchy and reinforcement of the status quo. I once had a request for a new laptop refused because it would've meant mine was newer than the president/owner's machine and that wasn't tolerable to the corporate ego.

I think a big factor to this is the mindset of the culture - if it's a "We're in this together" culture, I'm sure the CEO wouldn't mind.

The question then becomes: is it a secretary/manager/CEO complaining about things 'not being fair', or is it simply that no one was ever asked and IT ended up making a decision on their own? Sometimes decisions left unspoken end up causing a kind of solidification by accident.

I find the biggest cause is that IT ends up being siloed off (or being an MSP/corporate branch), so they aren't in touch with the workplace's culture and assume it's one of petty ego.

Another factor can be that many of these decisions are viewed through the lens of the company's stock price, which causes a sort of myopia regarding short-term costs vs. long-term gains.

Both of these end up being nasty feedback cycles and can be impossible to break once it becomes status quo.

The department has no chance to avoid paying that $100, but they do have a chance to avoid paying the $50.

Yet that $100 could have been spent on actually getting things done instead of spending $100 to save $50. I'm not sure that I understand your logic here.

Companies often create blanket policies on spending approval that might even make sense for parts of the organization but don't leave room for common sense.

They might need to categorize all expenses as capex vs opex, etc.

I'm guessing these people aren't in a developed country. I could imagine that happening in India.

> The difference in amount of work accomplished is staggering.

It's a shame this isn't more obvious. I was recently given an old laptop at work as well. So old, that it took 7+ seconds from the time I clicked "Inspect Element" in Chrome for the window to open. It was weeks before a purchase for RAM was made.

  - waiting for regression tester
Or worse, a lack of regression testing requiring substantial software archaeology in order to be confident enough to make a change to legacy code.


  - substandard documentation

- nonexistent or substandard tools

...including old compilers, and a general lack of support for the compilers, debuggers, editors, analyzers, profilers, etc. that would enable a developer to focus on business problems instead of technology-stack problems.

After a long enough period of time, this can solidify as the more frustrated developers leave and all you have left are developers who are happy slogging through inefficient processes.

Anthropology of corporate cultures is fascinating: http://www.daedtech.com/how-developers-stop-learning-rise-of...


Can you elaborate on peer review and code review? I have worked with and without these steps and can quantify the hours required and the quality shift in having these steps.

Do you think that these two can be disconnected from the development workflow with acceptable consequences? Or is it that there might be other standards or development measures which are not being met, such that there IS a need for review?


I agree; there are a handful of these issues I'd planned on talking about in future posts, but your list adds a bunch more I hadn't thought of.

Lack of training is a tricky one, isn't it? If you give the wrong kind of training, or too much, or too little, it's easy to miss the mark.

If a feature isn't tested, or proven useful to a user, is it really progress?

"Writing good specs is important" - it's also really really hard.

It is all too easy for a developer to blame difficulties on poor specs.

But if you're doing agile properly, you shouldn't consider a user story a 'spec' - a user story should be a placeholder for a conversation.

It is unrealistic to expect a non-technical stakeholder to deeply and accurately describe anything but the most trivial feature. It takes a developer mindset to poke the idea, see what holds water, see where it falls apart, and foresee the consequences.

You can't do that in a spec. It needs to be a conversation.

Agile favours 'Individuals and interactions over processes and tools', and this is too often forgotten when someone is pimping their tool, or blaming their process.

Have a conversation, understand what's needed, build good software.

> "Writing good specs is important" - it's also really really hard.

Yes, this is a very good point. Unfortunately it's also a point that many non-software developers miss. Writing software is hard. Very hard. We, as an industry, should be pointing that out more.

> But if you're doing agile properly, you shouldn't consider a user story a 'spec' - a user story should be a placeholder for a conversation.

I'll submit that there's no way to know whether you are doing agile properly or improperly, unless you move to another company.

> It is unrealistic to expect a non-technical stake holder to deeply and accurately describe anything but the most trivial feature. It needs a developer mindset to poke the idea, see what holds water, see where it falls apart, foresee the consequences.

This smacks of elitism, and it is what makes non-developers scoff at us. Developers are not gods; we're not even the smartest people.

We are simply the people who have to crystallize the requirements into a computer that only understands logic. Non-technical stakeholders can completely describe business requirements and can also be made to understand how their requirements fit into the logic of a computer. Non-technical people are not stupid; they just have different interests.

> You can't do that in a spec. It needs to be a conversation.

The spec is the output of the conversation.

> Have a conversation, understand what's needed, build good software.

And this is the most important thing said.

The trick with adhering to your statement is to define the parties involved in each step. Minimize those parties and then implement each step in order as fast as possible without sacrificing quality.

Remove the word "Agile" and the concept of software development, and what you've just described is how work has been completed for thousands of years. I fail to see how Agile comes into play.

  > This smacks of elitism and it is what makes non-developer scoff at us. Developers are not gods, we're not even the smartest people.
I don't mean to imply that developers can do this because they are cleverer, merely that developers have a mindset for understanding software systems that business people often lack.

  > The spec is the output of the conversation
As I've said in another comment, I don't think there should be a spec (in the traditional sense) at all. There should be a story, a conversation, and acceptance criteria (ideally defined in code).

Yes! I think this section of the post could definitely be expanded on:

It’s important to have developer buy-in. Their passion for a feature can be a huge driver for velocity.

This is why I also appreciate PM tools that incorporate discussion threads: that back-and-forth communication really helps clarify what you're building. I also like using the 5 whys after the initial spec has been written: "Why are we building this feature?"

  > It is unrealistic to expect a non-technical stake holder to deeply and accurately describe anything but the most trivial feature.
That's why you have a Project Manager / Business Analyst / Whatever Title You Prefer.

Few would consider going out and hiring their own construction crew to build their new house, managing materials deliveries, blockers, scheduling, head-count, etc. That's what a General Contractor is for.

But in software, both (some) business people and (some) developers think they can get away with attempting to go down a well-travelled path (by others) and do it all themselves.

Without that experience and expertise it's more likely than not IME that you'll spend more than you have to, and since you aren't a scheduling wizard and are trying to keep all this stuff in your head, you'll have numerous "stop work"s, conflicts, change-orders for half-baked designs, etc.

If you're building anything beyond the trivial, and you'd like to have a reasonable ball-park for what it's going to cost and how long it's going to take, make sure you're talking to the GC and not the plumber. And run far away from any business that tells you they cut out overhead by removing the GC.

I saw a slide not too long ago depicting a development process. It went something like: Skateboard -> Bicycle -> Scooter -> Car -> Truck.

If I'm the one writing the cheques, I'd be pretty pissed that I paid for four vehicles I didn't need before I got the one I did. Especially when it could have all been easily avoided by asking the right questions.

Agile can favor items on the left over items on the right. That doesn't mean you don't do the items on the right. It doesn't mean you start without a contract. It doesn't mean you don't plan. It doesn't mean you don't document requirements.

There's nothing Agile about having a chat between A and B, sitting at a keyboard, and spending money. That's just cowboy coding. And when it inevitably goes south, best of luck getting the stakeholder to fall on their sword for you.

  > There's nothing Agile about having a chat between A and B, sitting at a keyboard, and spending money. That's just cowboy coding
Nowhere did I suggest having the conversation was the entire process. I am merely rejecting the notion that a story is a spec. I believe there should be no concept of a spec at all. There should be a story, a conversation, and then acceptance criteria (which should ideally be defined in code, not a document).
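
A hypothetical sketch of what "acceptance criteria defined in code" might look like, as executable tests derived from a story's conversation. The story, the names (`Cart`, `save_cart`, `restore_cart`), and the behaviour are all invented for illustration, not taken from any commenter's project:

```python
# Story: "As a returning customer, I want my saved cart restored on login,
# so that I can pick up where I left off."
# Hypothetical example: all names and behaviour here are invented.

class Cart:
    def __init__(self, items=None):
        self.items = list(items or [])

def save_cart(store, user_id, cart):
    # Persist the cart contents keyed by user (a dict stands in for a DB).
    store[user_id] = list(cart.items)

def restore_cart(store, user_id):
    # On login, return the user's saved cart, or an empty one for new users.
    return Cart(store.get(user_id, []))

# Acceptance criteria, written as tests instead of a spec document:

def test_saved_cart_is_restored_on_login():
    store = {}
    save_cart(store, "alice", Cart(["book", "pen"]))
    assert restore_cart(store, "alice").items == ["book", "pen"]

def test_new_user_gets_an_empty_cart():
    assert restore_cart({}, "bob").items == []
```

The point of the sketch is that the tests, not a document, record what "done" means for the story; a test runner such as pytest would pick these up automatically.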

So you would write functional acceptance tests, and the only thing written down is a Story.

If all goes well, maybe you have a happy client at the end of that. If they're OK with costs and time being potentially an order-of-magnitude off.

Most people outside of the software world aren't.

Most people outside of the software world use specs. I feel like if there were a better, more reliable way to build something, anything, it would be common.

But whether it's a car, a house, a boat, a bridge, a road, a piece of furniture, or a motion picture, that's not the way it works generally. Not if you want repeat business, anyway. I feel like resounding success without specs is the exception.

At least it's always been that way in my career. The less planning effort, the less success. The correlation is so strong IME that it's basically like saying fire is hot.

As someone who's dealt with independent contractors for house work, it really sucks for a client not to know the what, for how much, and when. I've never met a client who objected to nailing those down (within reason) before I started sending over invoices for developer hours.

In my experience people buying software value working software that meets their business requirements within the agreed budget over anything else.

In twenty years I have never known a written specification help achieve that, and it has often hindered it.

I guess your experience differs, but I focus on delivering software my clients love, and so far that has never failed to work.

> It is unrealistic to expect a non-technical stake holder to deeply and accurately describe anything but the most trivial feature. It needs a developer mindset to poke the idea, see what holds water, see where it falls apart, foresee the consequences.

Lots of developers are pretty bad at this, too; it's more of a systems-analysis skill than a coding skill.

> You can't do that in a spec. It needs to be a conversation.

You need a conversation directed by someone with a clear thought process that can encompass what is important in both the business and technical domains, ideally involving both business and technical experts, to get to a spec. But you absolutely can specify what needs to be done in a spec, and if you don't document the outcome of the conversation in one, it's almost always going to be trouble down the road in any significant system. In the best case, it's going to result in the technical staff answering lots of questions from business users (often when neither the original technical nor business folks are in the same role) about what the system is expected to do, rather than actually working on system improvements.

If you replace specs with conversations, rather than using conversations to get to specs, then every time someone would otherwise consult a spec to answer a question later, they end up consulting a programmer instead.

> But if you're doing agile properly, you shouldn't consider a user story a 'spec' - a user story should be a placeholder for a conversation.

Yep. It almost sounded like this article was saying "you're not waterfalling hard enough". Processes are not a substitute for human interaction, they are supposed to enable human interaction.

I've found the hardest part is "build good software". Granted, I'm new, only going off of experiences in four internships. For example, the company I'm at now is building a tablet application, and the director is serious about UX and hired actual UX people. But things are constantly changing on a weekly basis; we'll build out some complex interaction feature, then have it scrapped a few weeks later. And the codebase has suffered greatly.

If things are changing because requirements are changing, rather than because the complexity of the requirements is being uncovered, then that is a hard issue to solve.

But it also reflects the real world.

Building good software is hard.

Yeah, as a dev I would have to get some huge book of specs. It would make things very tedious. And, of course, take a huge amount of time to create and follow.

Wholeheartedly true. Vague requirements are definitely the bane of my existence. Just this last week I had a massive story I was working on, which I implemented only to be told that I had misunderstood the story at hand. After I pointed out the confusion with the task, product admitted they had gotten the requirements wrong.

Don't get me wrong, I like the story format for detailing requirements; in most cases it definitely helps (although I know many developers who dislike the story format), but it is not without its issues. The weak link is always the product owner writing the wrong stories within a task.

In my opinion the biggest contributor to tech debt is NOT lack of stories in tasks or poorly written ones leading to vague requirements (although it doesn't help), it is unrealistic timelines and failing to factor in testing. When people are pushed to achieve a long list of things in a short amount of time, they take shortcuts to get them done which appeases the product owners but leaves some landmines in the codebase waiting to be stepped on.

The biggest issue with Agile, in my opinion, is the lack of time devoted to testing. It works on a cycle of "you estimated we can get this much done within this much time". I have never been in a planning meeting that even mentioned testing as part of the normal development cycle; it is always an afterthought once the other big-ticket items are done and time is running out. Time for writing tests and testing should be baked into the Agile process from the start (maybe it is, and my experience has been unfortunate); this would significantly reduce tech debt.

I think the issue with testing and technical debt, in my experience, has been that the places I have worked at like to call themselves Agile: they have the daily standups (which always descend into debates and meetings), they have the planning-poker events, and we do sprints. However, they don't fully adhere to the rules of Agile. When you start slipping extra tasks that were never estimated into the current sprint, that is not Agile. When you start changing deadlines and priorities midway through a sprint, that is not Agile. I think this problem is more prevalent than people realise. There are misconceptions as to what proper Agile is.

I used to work at a company that addressed testing with a three-pronged strategy:

1) when estimating, dev tests are part of the strategy (and, ideally, stories are written with enough detail to make testing strategies clear). Sometimes we review the ticket with QA to ensure that we both understand what's being asked for and what needs to be taken into account. Most tests at this point would include unit tests and functional tests.

2) Once a task is done, the story is reviewed with someone from QA to ensure it works. They suggest a couple of things to try that may require us to make improvements. The goal is to catch 80% of the issues at this point with 20% of the effort, and the pair-testing does a great job of flushing out issues. Here the focus is on functional tests and exploratory tests.

3) The QA team runs their own sprints testing the dev team's previous sprint's work. This is mostly performance and integration tests, but sometimes includes functional testing.

I thought the process was good because we were able to measure our software quality and address it quickly. That said, it feels somewhat cumbersome. This isn't an easy problem to tackle.

All of the projects I've done over the last few years have been "agile" - the company I work for actually offers agile training.

Officially, all projects are written using TDD, and all estimates for all tasks should include time to write acceptance and unit tests, following the suggestions in "Growing Object-Oriented Software, Guided by Tests", along with any integration tests with external software that may be required.

Even unofficially, I've never seen anyone consider a story "complete" until there are sufficient automated tests for it and there's been at least some manual testing.

Another thing we strongly encourage (and do) as part of Agile is regular retrospective meetings on our software development process. If our testing strategy (or lack thereof) were causing Fear during refactoring, prod bugs, or difficulty during maintenance, it would be brought up during such a meeting and addressed.

I'm not saying you're not "doing Agile", but your experience very strongly does not match mine.

Your experience is definitely unfortunate - stories/features being fully tested before being considered done, and therefore potentially shippable, is a key part of a successful Agile process. Without it, you are missing out on a lot of the possible benefits of the process, and I have seen first-hand the difference having QA engineers, even not particularly strong ones, integrated into the sprint can make to product quality.

A good first step to take would be to get your team together and work to define a Definition of Done, which should include both unit and integration/E2E testing (manual and/or automated) being complete - any stories not meeting these criteria cannot be considered as "done" and you can't count the points for them in the sprint.

Of course, initially this will probably cause lots of failed sprints and will decrease your velocity, but you have to see this as a positive, in that this raises visibility of the problems, and once you can identify the specific issues you're having and take the steps you need to resolve them (whether that's a lack of QA resource, or poor testing culture among developers, or whatever), you'll know when you say something is "done", it actually is done, rather than hiding a load of extra testing work that isn't complete.

On the requirements front, again defining a Definition of Ready can help - these are the criteria stories need to meet before you will estimate or start working on them. This should include things like requirements being clear, designs/UX being complete (if appropriate) and the story being testable and of a suitably small size for you to estimate with a reasonable degree of confidence.

Once you start pushing back on estimating stories because they don't meet these requirements, and educating your product owners on what they can do to improve the situation (for example, breaking stories down into smaller chunks, or defining requirements more clearly), you'll hopefully find the situation improves.

Of course, none of this is a substitute for conversations and working with the product team to help them understand what does and doesn't work for you, but I've personally helped make big differences to a team's quality and productivity by taking small steps such as these, in addition to educating the team and business on what "agile" is (without all the BS that some people will try and sell you!) and why this is important to us as developers and therefore to the wider organisation.

Same here with vague requirements. Interestingly enough, whether the requirements are a couple of sentences I pull out of a conversation or a 30-page requirements document, vagueness is often a problem in both. In the former case, there just isn't enough information, and it's clear the feature hasn't been thought out. In the latter case, the information is too detailed and there isn't enough room for the developer to interpret small, mostly insignificant (from a product perspective) details. Either way, it ends up with a lot of wasted effort and a lot more time being spent on a project than necessary. Almost always, this is avoidable if the spec writers put in more effort in tackling the problem, understanding the system, and understanding what is and isn't possible.

I've found that, when building applications for clients, they never know what they really want. I try to explain that writing an application is just like formally documenting your business process.

If you don't have the process, then you don't have an application to write.

I have seen huge improvements in quality and productivity by using stories plus wireframes. The first thing a UI dev does is create a quick mockup for feedback from the end user (specified in the story in the blank after "As a ____"). Unfortunately, implementing this can be a political struggle, because it takes a lot of the decisions away from the business and the project manager and hands them to the developers and users.

My boss at a previous company forced analysts to write bug reports and feature requests in the form:

"how it works now, how it should work".

This took at most a month to adopt and made life a lot better for everybody.

We learned this the hard way. After months of development we really didn't use (or know) any tools for proper development, so we just wrote code in the most disorganized manner possible, which resulted in shipping infrequently with poorly written code. After hiring a consultant on the matter to introduce dailies, user stories, agile and whatnot... well, the world changed for us. Significantly. For the better, of course.

I think yours is a different scenario?

The article suggests that developers trying to match business concerns with technical implementation takes up a disproportionate amount of time, as each feature was not specced out and described properly in a technical manner.

Accumulating technical debt would be a delay purely on the engineering side. The engineering team is not always to blame, though.

Nope, not much different. There were two developers on the team (me plus another dev) and my co-founder, who's a business (non-technical) co-founder. So it was between me and him where the specs and activities were not properly defined (it was my responsibility, and I didn't know any better), so we'd always have these long debates about what to do. What's worse, these debates usually occurred in the middle of other activities, so everyone was working on a lot of things at the same time, which resulted in work not being done on time, or even at all.

Context switching can be the worst. Particularly when it's caused by "hurry up and wait" brought on by clients.

PM: Client wants feature X and they need it now!

Developer: Great, I can get started today, but first I'll need clarification on X, Y and Z.

PM: I'll e-mail the client...

Then hours turn into days, days into weeks. By the time they get back, I've completely forgotten what it was we were going to do on the project or why I needed to know X, Y and Z.

remove the PM.

You've never had clients that take that long to get back to you about something that is actually important to them?

That's pretty lucky.

I learned to send follow-up emails to remind them I was waiting on their input. If I don't hear back in a reasonable amount of time, it's another email.

Oooh boy do I hate that style of user story. In my experience it just seemed to degenerate into "as a business owner, I want developers to make this feature so that our product can have this feature".

That might follow the pattern but it breaks a bunch of typical user story rules. It's easy to blame the format, but it sounds like the product owner needs to be educated on the process.

I don't think specs are ever going to be perfect. Simply because the people typically writing specs (project managers, user experience designers, business analysts etc.) view the requirements through a different lens than a developer.

In my experience the best solution is for them simply to be available to the developers to answer questions that arise in a timely manner.

Accurate reporting is great, but it comes after "accurate" guesstimates. After 15 years as a developer, I still get estimates wrong. Careful planning before you start is crucial. And my boss knows to take whatever I say, double it, and then add a day. We're finally all on the same page :-)

I love the Mad Libs-style form in sprintly, btw.
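That "double it and add a day" rule of thumb is trivially mechanical, which is part of its charm. A minimal sketch (the function name and units are my own, not anything from sprintly):

```python
def padded_estimate(days: float) -> float:
    """Apply the boss's rule of thumb: take the developer's
    estimate, double it, then add a day of slack."""
    return days * 2 + 1

# A two-day guess becomes a five-day commitment.
print(padded_estimate(2))  # 5.0 days
```

Crude, but it encodes the same insight as the "everything takes 60 days" story below: a fixed, pessimistic transform beats an optimistic point estimate most of the time.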

Joe Stump told a story on his podcast about this: the VP of Engineering would always tell the business side, “Everything always takes 60 days.” He had a high confidence that most tasks take 60 days.

Here's the link to the audio: https://soundcloud.com/business-of-coding/business-of-coding...

To me the root cause is generally a lack of resources in terms of time and money. Contributing to that is the general inability for people to grasp the complexity of software development.

After implementing the requirements, it is found that they did not reflect the actual desired system. Then there are technical issues that were unforeseen. Rather than accepting these technical issues as additional costs, they are blamed on the developer: "if they knew what they were doing, they would have just done it already." Incorrectly or poorly communicated requirements are also blamed on the developer.

Personally, I have never had a client or budget that I could say was really reasonable throughout. I believe this unicorn may exist somewhere. Certainly I haven't been finding my clients in the most auspicious places.

> it also appears that teams have a hard time transitioning from “done” to “tested and ready to be deployed” (look at Completed to Accepted above).

How can it be "done" if you don't know whether it works? It's troubling that it takes a long time to go from "done" to "ready to be deployed" when they sound like the same thing.

I mean, really it's just terminology, but it suggests that there's a lot of tasks that are not being considered as 'real work' that are actually quite time-consuming in practice. It can be useful to look over what work was actually done historically so that perception of a task matches reality.

Here are a few of the resources I have used in the past to help explain to non-technical folks some of what a developer needs to deal with when involved in a large project.

1) Ward Cunningham explains "technical debt": https://www.youtube.com/watch?v=pqeJFYwnkjE

2) comic strip - "How Projects Really Work": http://www.projectcartoon.com/cartoon/2

"The second biggest complaint we see from developers? Constantly changing the specs once work has begun." Of course this is called being agile. If everything has to be known before you start work how is this different than waterfall? People don't know what they want or need until they see it. So you have to find some way to let me see it which means writing something based on incomplete information.

Waterfall is all up front. Agile is up front at the task level. Normally, once work is committed to a sprint, it doesn't change.

> Waterfall is all up front. Agile is up front at the task level.

Agile isn't a methodology, it's a meta-methodology (a set of principles for selecting and dynamically adapting the methodology in use). If waterfall is the process that produces good results for the set of people you have working on the problems you are addressing, then waterfall is Agile.

What you say about sprints is accurate for Scrum, but while an Agile shop might use Scrum (especially as a starting baseline), Scrum is not the same thing as Agile.

There's a good case that failing to commit to requirements at some point, so that work can proceed toward a fixed target, is suboptimal in general. But arguing about whether it's Agile or not, and even more so describing Agile as definitively being "up front" at some specific level, confuses meta-methodology with methodology and reduces Agile to a fixed methodology, of which it is very much the antithesis.

That might be more agile, but it's not Agile.

> - Unclear requirements

> - Changing requirements

> - Context switching

That's Agile Programming!

What incentive do your developers have to ship faster?

Feel more accomplished? Smarter? Have more unhassled free time to spend with their families? Get it over with so they don't have the "oh my god we're going to die" feeling hanging over them 24/7?

If "faster" equates to lower quality, bugs, more post-release thrash - simply to meet a contrived / marketing deadline - then none ;)

Exit earlier?

Why do they call them "Stories" in this article? Is it because of Agile?

They are called "Change Requests". Put on your big boy pants and replace "Stories" with "Changes" and it makes much more sense.

In my experience of building new things, rather than modifying existing ones, "Changes" would make less sense, as there is little or nothing to change.

Fail. All of this shows the same problem that "Agile" in general has: it's an attempt to patch closed-allocation, business-driven software engineering. It does not work. Maybe it turns the 0.5x developers into 0.75x players, but it alienates the shit out of the top talent. Fuck user stories. Fuck letting the passengers fly the plane. Fuck "sprints" or "iterations" or whatever nonsense you're calling these artificial time periods. Just let the people who can build great things, and who want to do so, go at it.

I agree with some of what's being said here, but not the direction it's heading. Business-driven engineering is a dead end and it can't be patched by hiring consultants to tell people how to play planning poker. If you want to build great technology in-house, you can only do it as an engineer-driven firm.

The real lesson from all this is that you should never let people work on more than one thing at once. Make sure they know what it is.

Neat idea. Fails horribly in practice. Why? The problem with "one thing only" management is that the matter of who chooses each person's "one thing" gets horribly political, and fast. "One thing only" works only if the workers get to choose their one thing, which brings us to open allocation, which leaves me questioning whether we need all of this process that the engineers wouldn't impose on themselves...

Also, "one thing only" tends to create dead time and throughput issues: (a) due to communication issues and blockages, and (b) because people who'd rather have a different "one thing" will often do other stuff anyway. It also tends to lead to a high burnout rate when people can't voluntarily switch between at least two tasks.

Mike, you're one of my favourite commenters on HN. But this is one of your concepts that I've never been able to wrap my head around. Specifically:

>....business-driven software engineering. It does not work.

My interpretation of "business-driven software engineering" (and please correct me if I'm wrong) is that the business determines the requirements of the software. But if the business doesn't drive those requirements, who will? You've repeatedly shown how terrible the typical dev is at business, and most devs would seemingly cozy up to their IDE rather than navigate the mess that is determining requirements. So who's gonna do it? Besides, which dev doesn't hate a fuzzy spec?

Additionally, or perhaps more fundamentally, a business hires devs to the ends of furthering their, well, business objectives. What, exactly, is so wrong with business setting the agenda of what gets built?

I ask because of a lesson I learned early (and painfully) in my career as a sysadmin. Technology exists for the business, not the other way around. Unless, of course, you're a tech firm, where tech itself is your business. And even then, a business has to make economic sense, and as a programmer, I think I'm allowed to say we typically make poor economic choices. MacLeod "Clueless", if you will.

> What, exactly, is so wrong with business setting the agenda of what gets built?

I don't think that it's good when the business (usually, this means the executives) do it unilaterally. Obviously, it's not good for engineers to build with no concern whatsoever as to whether they're building something useful. It needs to be a collaboration focused around letting each side do what it's good at.

I'd like to believe that enough of us have sufficient business sense that we don't need the Agile-style waterfall (no, that's not a contradiction) in which requirements flow from business into "product" into technology. Not all of us make sound economic choices, but I think that a large part of that comes from the fact that most companies promote people with any business sense "out of IT".

As one who's trying to look forward for us as technologists, I guess I'd say that we need to take some responsibility for learning business and politics. The head-in-sand strategy is bad for us individually, but also bad for us as a group.

There is a middle ground. Good managers will usually have a list of 2-3 things that you could be working on, and then the employee can pick one thing to work on amongst that list.
