These are all things we care about, but they are probably not the most common questions we ask. E.g. we already know about the equity split because we ask about it on the application form, so we only bring it up during the interview if we noticed something odd about it.
The thing we care most about in interviews (at least of things one can change) is how engaged the founders are with users. How do they know people actually want what they're building? Have they talked to real, live users? What have they learned from them?
We don't care super much how big the initial market is, so long as the startup is making something that (a) some subset of people want a lot, and (b) if that market is not itself huge, there is an easy path into bigger neighboring ones. Basically, we're looking for startups building Altair Basic.
A good corollary question to "How do you know that people actually want this?" is "How are people solving this problem now?"
If founders respond that there aren't really any current solutions, it usually means that either (a) they aren't making something people really want, or (b) they haven't talked to enough users.
If it's a problem people actually have, then they must be coming up with crazy hacks or solutions that are much more tedious/inaccurate/expensive/generally more painful than the one you're coming up with. Very rarely is there simply not some kind of existing solution.
Christensen talks about this too, as "non-consumers" who have a job they need to get done but can't do it themselves because they lack the skills or the money. Instead, they pay a professional to do it for them, or "cobble together a solution". There are also "non-consuming contexts" where you just can't use a product (e.g. a landline phone in a car).
What I found really interesting was that the reason successful disruptions are "more convenient, simpler and/or cheaper" is not because that's an improvement, but because it enables the disruption to be used by non-consumers... (who lack the skills for a complex product; or the money for an expensive one; or access to an inconvenient one.) They are delighted to have a solution better than what they have now, so it doesn't need to be as good as the incumbents'. Secondly, if it's not good enough to appeal to incumbents' customers, it won't provoke a competitive response.
Aren't some of the most world-changing businesses actually ones where a simple solution already existed, yet entire industries were created or destroyed in spite of those simple solutions?
For example, before the automobile, the horse and carriage was a 'solution' to the problem of getting around. In the early days, was a car really a much better solution than a horse? Horses were readily available, didn't need a gasoline infrastructure, and even self-replicated, so you could get a new one every few years.
I understand I'm playing a bit of devil's advocate here, but I often get stuck in the mode of thinking existing solutions are good enough, and then somebody comes along and revolutionizes an industry with what at the early stages might seem to be minor improvements.
Another example would be the early days of sunglasses. If the question was asked 'how are people solving this problem now', the overly simple answer would have been 'they squint a bit'. With that as the answer, would you go and develop sunglasses?
"Very rarely is there simply not some kind of existing solution."
Yep, when you talk about known problems. But often both the problem and the solution (which is just another view of the problem) lie just outside the current scope of people's imagination, and are only found by forward-looking minds once the way is open (i.e. the other conditions are met).
There was simply no existing solution for the Internet 50 years ago.
If you solve an unknown problem (i.e. an issue nobody is concerned about), you will have a very hard time marketing it and showing people why they need it.
There were solutions for this Internet thing 50 years ago, even if that's hard to imagine today. If the problem you're talking about is communication, we had that 2000 years ago in the form of smoke signals. It sucked and was dog slow, but it was not an unknown problem.
Thanks PG. Could not agree more on getting users as early as possible, and getting as much feedback as you possibly can from them.
In our example, we produce and sell incomplete ebooks, knowing full well we may get a few refund requests, precisely because it's invaluable for us to know how to convert a paying customer, and to ask those customers for feedback.
My partner and I got a YC interview last round (no luck this time, unfortunately), and one thing I'd like to caution is that if you have a "chicken and egg" situation ("Why would X users use your site until you have Y users, and vice versa?") or an "established competitor" situation ("What's to stop Google from just adding a link to X product?"), you should have as close to bullet-proof responses to those questions as possible.
Our interview was pretty much spent trying to respond to those questions. This is why having an actual product with any sort of customers is so valuable: it's live evidence that what you're doing is working. Unfortunately my partner and I had little more than an idea and half of a web application, which we didn't even get to demo because we were so caught up trying to defend our position. I literally never even turned my laptop around to PG and company, which is probably my biggest regret about the whole thing. I had spent every free moment of the past several months working on it, and maybe that could have said more about our ability to execute than a debate over whether X market for Y users even existed.
nhashem, great point. I think it's important to get users as fast as possible, and what investors really care about is what you LEARNED from those users (what they like, what they don't like, data that supports your hypotheses). Better a poor product with hundreds of early users than a polished product that hasn't been launched, imo.
Those are great questions. Here are similar questions I've asked during funding and M&A interviews:
What do you not yet know, how are you developing information about that?
This is about knowing what is risky and what isn't. Some problems are just 'engineering': you do the work and get them done. Others need 'new physics', which is code for an imagined but not yet designed feature or capability. Too many of the latter can be a real problem.
How many people do you think you'll need to realize this vision? How many to keep it current?
One of the sadder failure modes of startups is over-hiring. More people can be good; too many people is really bad. Misjudging what the people requirement actually is can bite you hard.
How will customers find you? What does it cost you to be visible to them?
Customer acquisition, especially in a demographic that doesn't congregate (small/medium business fits this category), can be unduly expensive. If you have a product that would sell like gangbusters at $X but it costs $X + $Y to acquire each customer, you need enough funding to reach n customers, the point where word of mouth or other coverage starts driving $Y down.
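To make that funding requirement concrete, here is a toy sketch of the arithmetic. All numbers and the decay assumption are invented for illustration; the only point is that the cumulative losses before acquisition cost falls below price are a knowable, fundable quantity.

```python
# Toy model (all numbers invented): a product priced at price_x costs
# price_x + premium to acquire a customer at first, and we assume the
# premium ($Y) shrinks a little with each customer as word of mouth builds.
price_x = 50.0      # revenue per customer ($X)
premium = 30.0      # initial acquisition-cost overage ($Y)
decay = 0.97        # assumed: $Y drops 3% per customer acquired

funding_needed = 0.0
n = 0
while premium > 0.5:             # stop once $Y is negligible
    funding_needed += premium    # we eat the loss on this customer
    premium *= decay
    n += 1

print(f"customers until break-even: {n}")
print(f"funding to cover losses: ${funding_needed:,.0f}")
```

Under these made-up parameters the model says you'd need to subsidize on the order of a hundred-plus customers before $Y effectively disappears, which is exactly the "fund to n customers" figure the comment describes.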
Stack rank your feature set in 'lifeboat' order, explain how you got that order.
You should have more features imagined or lined up than you can deliver; that gives you follow-on. But you also need to know the minimum set for a viable product. How you arrive at that minimum set says a lot about your priorities, your sense of the customer, and your reasoning about the business.
All great questions. I think ours are more targeted at YC and similar extremely-early-stage investor conversations, where the answer to some of those questions can be "I simply don't know right now" (and that's 100% OK), especially how many people you'll need to hire (just a core team to build the first product) and a detailed view of the feature set (just a few features you think people will really LOVE).
In general, answer the question directly and be honest. Trying to use bullshit marketing-speak of the kind taught in an MBA program is a turn-off and limits their ability to assess whether or not you know what you're talking about.
At least, that's what I get from this article and every single essay I've read that was authored by PG.
That's good advice. We're interviewing 21 groups a day, and we have to converge on a decision in 10 minutes, so there is nothing more important than speaking plainly. Trying to decode marketing-speak exhausts us.
I think people use marketing-speak to seem more impressive, but on us it has the opposite effect. The phenomenon is very much like the artificial diction that inexperienced writers so often adopt, and which does nothing but get in their way.
I know it's bad form to quote oneself, but I can't put it better than this:
"When you're forced to be simple, you're forced to face the real problem. When you can't deliver ornament, you have to deliver substance."
Agreed on all above. Think it's important for them to know you've thought deeply about your space and the problem you're solving/product you're building. All of this comes naturally if you love, and are committed to, what you're doing.
Absolutely. As paul buccheit likes to say, limited life experience + overgeneralization = advice. Take what I'm saying with several grains of salt, and recognize that every interview will be VERY different in tone, style, and content
yes, absolutely important. something we really struggled with early on - both the one-sentence pitch and the 30-second pitch that people instantly GET. this is not just for investors, but for anyone - potential partners, hires, your mom, etc. and keep in mind this is definitely a work in progress - you'll improve on it and often change it substantially over time!
I realize this will sound like a flippant answer, but I mean it seriously: the chance is either very high or very low, depending on how good your application is.
It took me a long time to realize that when the odds of getting into something were described as e.g. 1 in 10, that didn't mean the odds for any given applicant were 10%, but rather (to the extent the people deciding were good judges) that for 10% of applicants the chance was nearly 100%, and for the other 90% nearly zero.
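That bimodal reading of the odds is easy to check numerically. The sketch below uses invented proportions (10% strong applicants with near-certain acceptance, 90% with near-zero odds) purely to show that such a split still produces the headline "1 in 10" rate over the whole pool.

```python
# Hypothetical illustration: an overall 1-in-10 acceptance rate can hide a
# bimodal per-applicant distribution. All proportions here are made up.
import random

random.seed(42)

N = 100_000
accepted = 0
for _ in range(N):
    if random.random() < 0.10:   # assumed: 10% are strong applicants...
        p_accept = 0.98          # ...whose individual odds are ~100%
    else:
        p_accept = 0.002         # the rest have odds near zero
    if random.random() < p_accept:
        accepted += 1

overall_rate = accepted / N
print(f"overall acceptance rate: {overall_rate:.3f}")
```

The population-level statistic (~0.10) is the same whether every applicant has a uniform 10% chance or the pool is split as above, which is why quoting the aggregate rate to an individual applicant is so misleading.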
The famous example is W.S. Robinson's 1950 study of literacy among immigrants. He found that states with larger immigrant populations had higher literacy rates, even though individual immigrants were, on average, less literate than the rest of the population. He concluded that immigrants must have settled in areas where the literacy rate was already higher.
It's not too much of a stretch to make a similar argument for Silicon Valley as a whole, or to push back on quoting any individual startup's odds of success as 1 in 20.
The generalization of this could be called the statistical singleton fallacy: people almost universally misapply population statistics to individuals. Statistics are only valid over populations (in the general sense). As the number of "things" in a population shrinks to 1, the confidence interval widens until it covers everything, i.e. you cannot apply statistical conclusions to a singleton.
A simple graphic illustration of this is smoking. A smoker's probability of dying of a smoking-related disease before age 65 is 15.6%. However, my probability of dying of a smoking-related disease (assuming I smoked) is either 0% (I don't die of a smoking-related disease) or 100% (I die).
People don't understand why some people smoke when there is such clear evidence of the increased probability of death due to smoking-related disease. Well, for each smoker, the probability is either 0% or 100%. If the smoker believes his probability is 0%, he will continue to smoke. If he believes his probability is 100%, he is a hypochondriac.
Thus, smokers either believe they are untouchable or they are crazy.
This is why it is so hard to sell "it's good for you" things... they are almost invariably statistically good for a large population, but can make no guarantees of "goodness" when applied to a singleton.
This applies in spades to health-anything:
* Individual's health: Lose weight and exercise - it's good for you.
* Program's health: testing is not guaranteed to find any bugs - if it were, running the tests a second and third time would always find more bugs.