Simple solution: Small teams that own what they're working on, soup to nuts. This probably means 1-2 devs on a team, a designer spread across 1-2 teams (depending on the scope of the team), and the PM helping all of them where necessary, but not giving them exact specs of what to build. "Product manager" in an agile-ish organization is an extremely poorly chosen title, since their primary function is not to manage but to communicate and facilitate things that would otherwise be difficult or inefficient for designers and engineers to handle themselves.
If the PM is giving the designers and devs exact specs on the product to build, they are failing in their role. The PM should focus on communicating the business problems succinctly, and let the software solutions evolve naturally from the small teams. The product manager, as a member of a team that builds software, should be expected to have some insight into the general design of the software (maybe in the form of wireframes or whatever). But providing exact specs for what to build is a battle-tested Recipe for Failure™.
This may work with the top x% of programmers/designers, but the fact remains that the majority of programmers and designers are not equipped to take a business problem and come up with a solution. Their core job was never to think deeply about a problem in a niche domain. I understand that at startups it is common for multiple of these roles to fall under one person, but I have yet to see that system scale, especially at the enterprise level, where the customer may have very peculiar requirements that the product manager can help think through before they even reach the devs/designers.
I'm not convinced it's more likely that the PM can think through the issues better than anyone else on the team.
In any decision, in any industry, in any profession, the only way to make a good decision is to understand the cost and benefit at the same time.
IMO, almost all corporate software dysfunction comes from that failure. The decider understands the cost but not the benefit, or the benefit but not the cost. Sales teams that don't understand the product, developers who never talk to customers, and managers who don't understand how to code are all trivial examples of this.
If the PM designs the feature incorrectly without understanding how to code or understanding the current codebase, there could easily be a 10x cost difference, in terms of implementation and QA. If the developer writes a feature that no customer will accept, all implementation time is wasted.
How to fix that failure depends on the relative costs for the PM to learn the codebase, vs. the developer learning the business domain. Though both are nearly essential to a successful company.
The same terminology would not make sense to the client, and a client's terminology ("I just want feature x") leaves out a bunch of implementation details that a good product person should be able to clarify and fill in, instead of leaving it up to the developer to guess.
The startup world is certainly leading the charge in this right now, though I think it's often mischaracterized as a resource constraint problem, one that will presumably be solved down the road by hiring a proper product manager when the company needs to "scale". Training all employees to think strategically about business and product is a universally good thing, but especially for those building the products, especially when the software is the product.
As far as how well the system "scales", I suppose it depends on your definition of scale ;) Facebook seems to be doing pretty well with some version of this approach, and I think they're at about 3-4k employees, at least half of whom are technical, afaik.
I'm a product designer, and I see thinking deeply about a business problem and finding a solution as a huge part of my job. If not to find solutions to problems, what is the job of a product designer?
Forcing the PM to create requirements based on things she doesn't understand well is a surefire recipe for failure, as you said.
This is probably the best Product Management approach I've heard of for early startups: http://www.quora.com/Stripe-company/Does-Stripe-have-product...
For most startups the PM role is fulfilled by the founder's vision. However, as the company begins to grow and the initial vision is achieved, it becomes harder to go without someone focusing on the product aspect.
Someone like Stripe can do just fine without PMs for two main reasons:
1) Incredible market pull - they just need to be methodical about collecting requests and prioritizing them according to revenue/implementation sequencing/risks.
2) Stripe's engineers are the target market - Stripe is built by developers for developers to make payments easy. Much like Jobs embodied his own target market, whenever you are building something for yourself, market research becomes a bit less important... more implicit.
If the above conditions are true for your startup, then absolutely, you can follow the example. However, if you are missing either, you should really consider dedicating someone to this research and visioning.
It also seems like Ries was trying to direct this post at all kinds of companies, not just early startups. Patrick himself said that Stripe's approach to staffing is an example of "doing unscalable things while you can". Not everyone is a small startup and the approach of letting everyone manage themselves doesn't work at 100+, 500+, 1000+, 10,000+ person companies.
It all depends on this:
>The engineers who build our products enjoy, and are good at, figuring out what the product should do—not just how the implementation should work.
you mention "early", this is key. PMless agile is super simple until you have live, paying customers (as in corporations, not people). breaking existing stuff is simply not possible anymore, SLAs are in play, validation efforts, etc.
throwing shit at the wall and seeing what sticks works in hardcore startup mode, before you have a customer base. then it becomes suicidal. even in the consumer space, see the various failed relaunches and redesigns.
If you're going to make a spec that defines 'how' as a PM, then you need to make sure your designers, coders and QA do exactly what you tell them. This can actually work if you're an incredible genius and know the details of what you're trying to do incredibly well.
Most PMs don't fall anywhere near there, and so the solution is simple: spec the 'what' very broadly. Delegate protocol specs to the programmer(s) who will be dealing with that protocol the most. Make sure the programmers create a product that the designer can 'skin' afterwards - such as moving elements around the screen without programmer assistance. Have the QA work with the actual product and not a spec - the users will be using the product after all, not the spec.
I didn't get that from this piece at all. At what point did the PM in question appear to be micromanaging?
> spec the 'what' very broadly.
That's laughable. No client or internal stakeholder is ever going to spec the "what" broadly for any definition of the term, and if you let them you're setting yourself up for failure (because they're not going to like what you make, and they sign your check).
> Make sure the programmers create a product that the designer can 'skin' afterwards
And what if the programmers leave something critical out of the design? What if the designer was expecting behavior that is different from what was created?
> Have the QA work with the actual product and not a spec - the users will be using the product after all, not the spec.
QA resources use the spec to create a test plan from which they test the final product. If they don't test against the spec, what do they test against? If they're not testing against the spec, they're not exactly doing quality assurance, are they?
Your comment shows a strong lack of understanding of almost everything that was mentioned in the blog post linked to by the OP.
Sometimes the developers don't have product skills but know they don't have product skills. This is also fine.
Sometimes the developers think they have product skills when they don't, and this is always a complete disaster.
I can't tell what situation the company described in the article is in. Perhaps the product manager is incompetent (there are lots of incompetent product managers out there), and the developers are correcting for this. Perhaps the developers have good product sense that the product manager is underutilizing.
But if the developers don't have the product skills they need, skills you get by spending lots and lots of time talking with customers instead of (or in addition to) writing code, no amount of process tweaking is going to fix the real problem - individuals assuming that their technical skill translates into product skills, and insisting on also doing something they're not good at.
I've often wondered why the subset of developers who lack product skills and/or the relevant domain knowledge frequently insist on getting involved with product anyway. After all, the head of human resources rarely insists on committing code.
The "what" would also often include essential features, such as "visual graphs or charts over time". These features would grow (or shrink) over time based on market feedback and testing.
You rarely want your programmers/designers/QA defining the "why" behind your product. If it's a startup, you're generally creating your business because of the "why" - you don't start a company and then use the company to find a "why". If it's an established company creating a new product, you'd have had market research and similar groups set up before the product even started to identify the "why".
Why do these have to be mutually exclusive? There are plenty of cases, especially in enterprise, where if you relied solely on spec-less QA, plenty of important use cases would be missed.
The PM needs to take a step back and let the designers/developers handle the tricky details of the spec, and let the testers test the resulting product and spec from them - not the high level initial spec. You don't need to bring in your testers until you have something concrete for them to actually test against.
As something concrete, so that we don't get lost in abstracts: if you are building an accounting package that needs to calculate a user's tax rate from information pulled from Twitter, then having your testers create a framework for testing exact tax rates before the product is complete is folly. You may only be able to generate a 'rough band' of tax rates, and now your whole perfect spec and testing plan is blown out of the water as all the values test incorrectly.
Rather, get your team to build in a way to add tax information to the system; then, once they have the method to get the best data possible, you add details to your sparse spec and bring in the testers to test the real implementation against the real spec.
Obviously there are tons of other nice patterns to follow here that are heavily written about - but the "design in stone from on-high" is a very fallible approach. Not that it hasn't worked in the past - it's had some great success - but it requires someone with a perfect image of exactly what they want and a team willing to follow that person exactly.
There is a specific reason why a QA plan is formalized before development: so that QA tests against the client's requirements and not against what was developed. Ideally what is developed will be what was needed, but it would be foolish to assume that. If QA simply tested what was developed, it would be assuming that what was developed is what was needed. That is not a safe assumption to make, especially at scale and in enterprise.
> You may only be able to generate a 'rough band' of tax rates, and now your whole perfect spec and testing is out of the water as all the values test incorrectly.
I don't quite follow, but a QA plan does not need to hard-code values. It should have test values where possible - in this case, maybe QA could find a couple of example Twitter accounts and work out what the tax rate info should be if done right. If you are saying the formula for the tax rate may change midway, that is okay as long as QA is notified. This way QA and the devs independently do their own calculations, and if they do it right, QA's manual calculation should lead to the same answer as when QA uses the functional app to do the calculation.
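To make the independent-calculation idea concrete, here is a minimal sketch (all names and bracket values are hypothetical, not from the article): QA derives expected values straight from the spec's bracket table, rather than trusting the app's output, and allows a tolerance where the spec only guarantees a rough band.

```python
# Hypothetical spec-driven QA check: the tester computes expected values
# independently from the spec instead of asserting "whatever the app returns".

# Stand-in for the implementation under test.
def app_tax_rate(income):
    if income <= 10_000:
        return 0.10
    elif income <= 40_000:
        return 0.20
    else:
        return 0.35

# QA's independent model, written directly from the spec's bracket table.
SPEC_BRACKETS = [(10_000, 0.10), (40_000, 0.20), (float("inf"), 0.35)]

def spec_tax_rate(income):
    for ceiling, rate in SPEC_BRACKETS:
        if income <= ceiling:
            return rate

# Test values agreed on up front, deliberately including bracket boundaries.
for income in [0, 10_000, 10_001, 40_000, 99_999]:
    expected = spec_tax_rate(income)
    actual = app_tax_rate(income)
    # A small tolerance covers the "rough band" case; tighten it once the
    # spec firms up.
    assert abs(actual - expected) <= 0.001, (income, expected, actual)
```

If the formula changes midway, only `SPEC_BRACKETS` needs updating, and the two calculations stay independent of each other.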
Your point about building up a fairly generic testing system right from the start seems like a very good idea. Most QA I've worked with hasn't been anywhere near that farsighted, which is probably where I got that opinion from - seeing QA throw out their hacked-together models and hack up new ones (and string together hacked-up models...). From experience, I think most companies take the hacked-together route for QA and put their junior recruits in QA. This is probably different in a Fortune 50 company? I've yet to work at one of those...
maybe this is startup-land naivete speaking, but I pity the company whose product manager doesn't routinely use their own product.
What if your product is creating giant networked ovens for Nabisco assembly lines?
What if the product is an Enterprise Firewall Spam Detector IDS Bad Guy Defeater Gateway Proxy with more feature check boxes than atoms in the human body and the PM (who is a PMP, obviously) has a BA in English with a minor in interpretive dance?
What if the product is a series of experimental HIV vaccines?
Not every product in the world is a consumer webapp.
The number one skill a good product manager should have is being able to understand what the users need (not always the same as what the user wants, of course). Using the product, when possible, is a great shortcut. When it isn't possible, customer understanding is going to require phenomenal communication and imagination. That said, I would still be very wary of putting someone without some domain knowledge in charge of speccing out a super complicated product.
I'm a lot more hands on, but only because I've never experienced a QA process that actually did its job.
I think when you're still small, most typical PM functions should be distributed instead of centralized, with a manager getting in the mindset of managing people who are 10-20% PM / 80-90% engineer or designer.
While the overall product/business strategy and business requirements should be owned by one or a few founders/officers, I think as many people as possible in an early stage web/mobile product company should have most of the general skillset of a PM. Highly specialized talent might be an exception if you're creating a sophisticated product or have unique interface goals. But the general PM skills -- mockups, basic design principles, writing stories, writing requirements, managing a story/requirement from creation to deployment -- should be universal while you're small.
The inflection point goal at web/mobile startups is getting MVP after MVP out the door until you've pivoted or iterated your way to a good-enough product for a good-enough market. (Or good-enough product for a specific customer.) It's an all-hands-on-product affair.
With this in mind, I think the early CTO should be a hybrid of engineering talent and director of product management. Or perhaps depending on skillset a separate person (another co-founder or the person you thought should be the first PM) is that director of PM and the CTO is more like a lead architect. The point is that rather than doing all the product management or all the VP Engineering stuff, this person (or two) manages a team of hybrid engineers/PMs, retaining some important overall product and architecture decision-making responsibilities, but empowering and helping designers and engineers employ basic PM skills individually or split among a small group such that the person or small group owns more process for a feature.
What about one where the product is actually a process or physical commodity, and the technology portion is merely an extension of it?
Wireframes should be signed off on by engineers before design starts, and design should be signed off on by engineers before it is approved.
"When the programmers get it (the specs), they often start negotiating with him about what's going to be built. They exchange countless emails, and he's being constantly interrupted and being asked to clarify exactly what the spec means. "
A simple solution: speak about the project/product feature with your developers first. Understand what will take long, what won't, and what can be stripped out of the first iteration. Once this is done, work with designers to push something out.
Another super simple solution is videos.
Writing takes time. Wherever possible I try to make videos for the designers or even better have live calls.
In terms of idle time, there should be always a backlog of specced out work or bug fixes that the developers can focus on. Failing that, they can work on optimising past projects, refactoring or reading about new things.
I can imagine videos being a difficult medium on some very complex projects. But for me it's great: I just record the things I want to get done, bring up examples from other websites, and use the mouse as a pointer to direct the viewer's focus. I summarise what I want done at the end.
Usually I send over the video and checklist of things to do.
It's easy to point to the Stripe article, where there are no PMs. But that environment is the exception rather than the rule. After working in a few medium-sized places, the ratio of engineers who place equal weight on usability and functionality vs. those who place a higher value on functionality/architecture is easily 1:6. And you can guess which ones are usually the senior members. To their line of thinking, usability isn't something they have ever had to worry about.
I do agree with a previous poster that this is something that the startup culture is starting to change, but I would bet that this change won't be visible at most organizations until at least 5 - 8 years out as the engineer talent changes.
Now, I would say this problem is something that can be fixed with some UX standards. The PM creates flowcharts and/or wireframes of the proposed solution > hands it over to the agile team > dev + UX discuss any flows that fall outside established standards. As a PM myself, I would love this. Being required to provide only end user/market requirements as needed would be ideal.
One issue is that the organizations that have these problems usually aren't making the investment in UX in the first place (I sit here reflecting on my current situation).
I will bet that the product team dysfunction mentioned in the article is directly reflected in said team's products. :)
It might not be obvious, but the main problem with having a dedicated PM on the team is that unless the PM has very strong engineering or business experience (and in that case they should either code or bring in money), the PM will feel very insecure, surrounded by highly experienced people who can actually do things. The cost of somebody on the team feeling very insecure and trying to bullshit/manoeuvre/bully their way around outweighs any potential benefit to the team.
From my personal experience, I've seen projects fail solely because of PMs protecting their turf.