
Bill Gates and Mark Zuckerberg dropped out of Harvard while undergrads. David Filo, Jerry Yang, Sergey Brin, and Larry Page didn't bother to finish grad school at Stanford and dropped out too. On the other hand, we can think of Craig Venter, Andrew Grove, Gordon Moore, Carver Mead... all of whom obtained PhDs.

The usefulness of a degree depends on what field one is working in. In the software arena, smart kids out of high school can do a lot if college is not holding them back. But try to start a laser / semiconductor / biotech company with high school kids if you dare ;-)

One might not need a degree to be a good software entrepreneur, but many other entrepreneurs don't live in the software world. Sure, HN is focused on software, but it seems to me that it's irresponsible to promote the idea that all entrepreneurs need is passion and hard work. Necessary but not sufficient.




"In the software arena, smart kids out of high school can do a lot if college is not holding them back. But try to start a laser / semiconductor / biotech company with high school kids if you dare."

Exactly. We've inadvertently created a culture that rewards extremely shallow achievements. Thirty years ago, brilliant 20-somethings wanted to send men to Mars and build super-colliders. Today, the smartest college kids are trying to build social networks for dogs.

Frankly, that's not a trend worth celebrating. A society where an advanced education isn't an economic advantage is a society going the wrong way.

-----


That's just not true. I meet a lot of startup founders who are working on very hard technical problems and have hopes of changing the world by solving them. The Etherpads, for example, had to literally prove theorems to get their real-time collaboration software to work, and they hope eventually to use it in a whole range of applications.
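(For the curious: Etherpad-style real-time sync is generally understood to build on operational transformation. Below is a minimal sketch of the core idea for two concurrent inserts; all names are illustrative, not Etherpad's actual code.)

```python
# Sketch of operational transformation (OT), the class of algorithm
# behind real-time collaborative editing. Illustrative only.

def transform_insert(pos, other_pos):
    # Shift our insert right if a concurrent insert lands at or before it.
    # (Real systems break exact-position ties with a site id so that
    # every replica converges.)
    return pos + 1 if other_pos <= pos else pos

def apply_insert(doc, pos, ch):
    return doc[:pos] + ch + doc[pos:]

# Two sites start from "abc" and insert concurrently.
doc = "abc"
site_a = apply_insert(doc, 1, "X")                          # A inserts "X" at 1
site_a = apply_insert(site_a, transform_insert(2, 1), "Y")  # then B's op, transformed

site_b = apply_insert(doc, 2, "Y")                          # B inserts "Y" at 2
site_b = apply_insert(site_b, transform_insert(1, 2), "X")  # then A's op, transformed

assert site_a == site_b == "aXbYc"  # both replicas converge
```

The "theorems" are convergence proofs: showing that transforms like this keep every pair of concurrently edited replicas identical, for all operation types, not just inserts.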

You could just as easily argue that thirty years ago (I was actually around then and old enough to be paying attention), technical people just wanted safe jobs solving circumscribed problems for large organizations.

The truth is that, all other things being equal, each generation of people is roughly equally ambitious. Past generations weren't golden ages compared to the present, or vice versa.

-----


I don't see how you can argue that each generation is equally ambitious, while also arguing that the previous generation was seeking out "safe jobs".

I agree that people are probably as ambitious as they were decades ago on an individual basis, but that doesn't mean that their goals have stayed the same over time. Even in the last decade, there's been a massive shift of technical talent toward entertainment technology, whereas it's getting harder and harder to find R&D jobs in physics, chemistry, biology and computer science.

There are always a few companies doing interesting technical work, but if the number of under-employed PhDs I know is any indication, the fraction of companies doing R&D is pretty darned small, and getting smaller every year. It's much easier for a PhD in Physics to get a job as a sysadmin at a web startup than it is for her to get a job in her field. I think that's sad.

-----


I find it frightening that in the U.S. at least, basic research seems to have seriously declined at the government and large company level. Bell Labs is gone, and most companies now are focusing on applied rather than fundamental advances.

And before people start piling on large companies and the government with the usual rant, try to figure out how you could invent the laser, fiber optics, TCP/IP, the lunar landing, orthogonal frequency-division multiplexing, etc., with two college dropouts well-versed in Rails and JavaScript.

-----


I think there are cycles where technology gets to a state where basic research has huge value, until you have enough basic research to build a wave of new technologies. Once you have fully exploited the last wave of technology, you are ready to build the next level of new technology. But you rarely change technologies while there is still room to grow the old tech. Example: modems from 300 bps up to 56k used the same basic technology, then DSL, then fiber.

IMO, there is value in understanding the limitations of existing technology before you start doing basic research.

-----


Yes. New technology is invented in response to problems with the older technology, and if you haven't capitalized on the old technology's potential, you'll likely just be reinventing the old solutions instead of coming up with new, better ones.

-----


Interesting. A counter-anecdote would be the advances in physics in the early 20th century, when we were still far from finished innovating with Newtonian mechanics. I've always doubted the whole "necessity is the mother of invention" thing; I don't think relativity was discovered because it became necessary to discover it. On the other hand, when it comes to technological innovations (semiconductors, transistors, lasers, etc.), I just don't know enough scientific history to assess your hypothesis.

-----


I think the limitations of Newtonian physics were only discovered as people built devices that broke them. There is a great timeline for that period, but when you consider the instruments required to make the discoveries, you see they are based on specific assumptions about how the world operates. It's only when their designs reach their limits that meaningful discoveries can be made.

http://timeline.aps.org/APS/Timeline/

"The Dutch physicist Heike Kamerlingh Onnes finds that mercury loses its electrical resistance at temperatures near absolute zero. This low temperature effect is observed in other materials as well."

That takes advanced equipment to cool things to that temperature, an expectation that resistance is affected by temperature, and a formula that works at higher temperatures to notice the discontinuity.

PS: Consider all the benefits high-speed computing has provided when designing aircraft. You need a lot of wind tunnel / real-world tests to build a model, but with that model and lots of computing power you can design aircraft out to the limits of your simulation. At that point you need to collect more data.

-----


My point was that if you can make as good a case for x > y as for y > x, then x = y.

Do you have any evidence for this massive shift toward entertainment?

Surely there are more research jobs in biology than there were 30 years ago, considering biotech was mostly invented in the intervening period.

-----


Are there more biotech startups than 30 years ago? Sure. But probably not more than 20 years ago, and certainly not more than a decade ago.

There's plenty of evidence of a shift toward entertainment. Look at the big startup successes of the last decade: Google, Myspace, Facebook, Friendster, YouTube, Blogger, Flickr, Digg, Bebo, etc. Save Google, nearly every major success story has been an entertainment/media play. In fact, I don't see this argument as particularly surprising or controversial, given that the web has been the dominant technology of the last two decades, and that the web has always been about media.

I think we both agree that personal ambition is pretty much constant over time. My argument is that if that's the case, then it's not reasonable to argue that people preferentially sought out "safe" jobs three decades ago. Smart, ambitious people have always sought out the opportunities that make them most successful; today those opportunities are clustered in industries that reward ephemera.

-----


A quick side note: the number of biotech startups is mostly a function of the legal environment. The industry is more regulated than almost any other sector of the economy.

-----


Your evidence is that most of the startups you've heard a lot about are mass market consumer startups? Would you expect to have heard as much about the ones working on infrastructure?

-----


If I didn't know anything else about the industry, then probably not. But that's kind of the point -- that's the list of successes that everyone knows about; the list of companies that every CS undergrad wants to found.

-----


Their privacy policy mentions: "patent-pending edit synchronization algorithm". I'd much appreciate hearing about their experience with a software patent (I'm in a similar position).

Unless they want to not publicize it, to avoid anti-patent sentiment? e.g. two TechCrunch comments: http://www.techcrunch.com/2008/11/19/etherpad-shows-google-d...

-----


>The Etherpads, for example, had to literally prove theorems to get their real-time collaboration software to work, and they hope eventually to use it in a whole range of applications.

Could you tell them to publish the papers or the notes please?

-----


The Etherpads finished college, though.

-----


This is true. Right now, young people look to people like Mark Zuckerberg for inspiration. It's all well and good that he's created something big, but has it really done anything for us? If I want inspiration, I'm more interested in looking to the engineers who put a man on the moon 40 years ago.

As useful as sites like Facebook, Twitter, and YouTube are (not to pick on any of them), their achievements are nothing compared to putting people in space, designing planes like the Dreamliner and the A380, or even the engineering that went into the Chunnel.

-----


Aren't you people generalizing a little too much? Obviously there are loads of people working on big, important problems like energy, medicine, etc. While many people use Facebook, I don't really see many who hold Mark Zuckerberg up as an idol.

-----


Sorry, but sending men to the Moon to drive around in a little car is a better example of wasteful entertainment than any of the three web companies you named.

-----


You have to take scale into consideration.

The Facebook team's accomplishment may be much smaller than putting people on the moon, but the size of their team (and partners) and their budget isn't comparable either.

I'd say an accomplishment of X value achieved by spending Y resources is equivalent to one of 100X value achieved by spending 100Y resources (the numbers and proportions are unrelated to the Facebook vs. moon example).

-----


Innovation and utility are defined by use. If people want to use social networks and not some microformats to enable the semantic web, that's innovation. All this complaining about working on worthwhile products has some truth to it, but look at the utility of lasers: in the end, lasers enabled CDs and DVDs, letting millions of people watch movies and listen to music, and that's the ultimate utility.

-----


I can think of quite a few people who dropped out and just became losers. Just saying, while dropping out of an education might work for some people, I'm willing to bet that on average, it's a bad decision.

-----


Exactly. It's a manifestation of selection bias. We pay attention to the Gateses, Zuckerbergs and such, and disregard the thousands and thousands who dropped out of college and ended up achieving nothing in their lives. It seems to me that Microsoft's or Google's successes are anomalies that one should not try to emulate...

-----


The key is having a marketable skill. With a good education, you have both a skill, and a certificate saying you do.

Without an education, you might still have a marketable skill, but people have to take your word for it.

Personally, I don't have a university degree. I do have a marketable skill set, but it's up to me to prove it, and through a bit of luck I'm in a pretty good position. But it's certainly not a path I would recommend others travel; you leave too much to chance.

-----


Larry Page holds a Masters degree in Computer Science from Stanford, according to Wikipedia.

-----


But didn't he intend to obtain a PhD when he went to Stanford? I once read a long interview with him, and I believe a PhD was his goal. I'm not sure, though.

-----


His Stanford web page (http://infolab.stanford.edu/~page/) identifies him as a "Ph.D. student".

-----


I assume he took a leave of absence, then.

-----


That's what I've heard. My point was that at least at some point he was calling himself a "Ph.D. student" -- his goal was a Ph.D., not a Masters degree.

And while I'm on the subject, Sergey Brin also still has a Stanford web page (http://infolab.stanford.edu/~sergey/), which helpfully informs us that "Currently I am at Google."

-----


Eric Schmidt has a PhD in EE/CS.

-----


But most news stories and comparisons are about these successful people who dropped out of school and went on to run startups successfully.

Could it be that these are really people who dropped out after studying for a few years, having learned what they could or realized what they couldn't, rather than people who never went to college?

This is a big difference.

-----



