
Whoa, hold on... I think medicine and biotech are misunderstood here.

First of all, Big Pharma != Academia or biotech startups. The former concentrates on profitability and marketable drugs, while the latter two do the heavy lifting across a broad spectrum of diseases.

Secondly (and more importantly), we must realize that there are two separate issues at hand: how easy it is to develop an effective drug for a particular disease state, and how prevalent that disease is in the general population.

Big Pharma already claimed the low-hanging fruit decades ago. Many of today's medical problems require a systems-engineering approach because the pathways and metabolomics are simply too complicated to yield direct solutions. The search spaces are vast, multidimensional, and tedious to explore.

To make matters worse, it's really difficult to commit to studying an orphan disease. A solution to a particular type of cancer, Alzheimer's, or HIV would have wide applicability and save many lives. A solution to a rare genetic disorder, however, may help only a few thousand individuals. It's incredibly sad that there is such suffering from rare diseases, but we have so little mental capital to invest in fixing these problems. At present, biotech and medicine are nothing like programming, where iteration and debugging yield fast results. Years of personal labor can become pointless if mistakes are made.

I don't know if it's justified for me to feel this way, but I wish more of the brilliant minds in programming would switch to a biotech/research profession. Computing is such a well-traversed and developed field and solution space. While it isn't glamorous, medicine and biotech really need you...



My take is that it's just too early in biotech. Back in 1973 it was certainly possible to envision Facebook and Twitter. For example, if you read Steven Levy's Hackers you'll learn about the Community Memory project:

http://en.wikipedia.org/wiki/Community_Memory

But computers and their associated tech like telecom were so expensive in 1973 that Community Memory was impractical at scale. There was no Facebook around to offer Lee Felsenstein an enormous salary and a potential IPO. The infrastructure wasn't yet built out, the audience wasn't yet there, and investor consciousness had not yet been raised. It took another twenty years of development to get from that point to the Web, and then another decade or so to get to Facebook.

Biotech is still stuck in the equivalent of 1973. We've identified many of the processes that need to become cheaper and more ubiquitous in order to conduct biotech research in your basement, but the tools are still large, expensive, and tedious to use. So at this point if you go into biotech you'll either spend 12 hours a day pipetting liquids by hand, waiting for the PCR machine, and pasting your data into Excel for analysis, or you'll spend 12 hours a day raising money to pay for the pipettes, the PCR machine, and the other scientists who are running them.

I've spent years as a biotech postdoc, and I've spent years as a programmer, and it's no mystery to me why people would rather be programmers, and would rather hire programmers. The day-to-day work is more pleasant, and the results have such evident and immediate value that you can get good pay for them with relatively little effort or risk. Plus, you're part of a really big and freewheeling global culture, not only of web users but of web programmers.


And I know many, many people who view programming as boring and love setting up experiments and assays. I was the only person in my group who saw computers as anything more than a convenience.





