Full disclosure: I have an undergraduate degree in bioengineering from 2008. I did a lot of lab work but never worked in industry; I went straight into web apps after graduating.
I'll give another example: a telescope is big, but good luck grinding the mirror. You can do this by hand, by the way, to within a tenth of a micron. The really good guys actually do the final "grinding" with their bare thumb.
At one point, being an amateur astronomer required building your own telescope, entirely by hand. I have books that go through everything from grinding your own 6" mirror to making an equatorial mount. Then, in the mid-1950s, companies started making decent small astronomy scopes. Oddly enough, a 6" scope today sells for the same nominal price as a 6" scope did in 1955 ($450), but that 1955 price inflates to roughly $3,600 in today's dollars.
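As a rough sanity check on that figure, here's a minimal sketch; the ~8x cumulative CPI multiplier from 1955 to today is my assumption, not an official number:

    # Rough inflation adjustment for the 1955 telescope price.
    # The 8x multiplier is an assumed cumulative CPI ratio, 1955 -> today;
    # substitute a real CPI series if you want a precise figure.
    PRICE_1955 = 450
    CPI_MULTIPLIER = 8.0

    price_today = PRICE_1955 * CPI_MULTIPLIER
    print(f"${PRICE_1955} in 1955 is roughly ${price_today:,.0f} today")
    # -> $450 in 1955 is roughly $3,600 today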
Not too long ago I picked up a shoddy 4" scope for $10 at a yard sale. If you went back in time to 1800, that would have been the world's largest telescope. How is that for progress?
The paraphernalia required to do professional work is always getting cheaper and easier to operate. Bio seems to be in a state similar to astronomy's in 1930: the pros are using their huge 100" scopes on mountaintops, while the amateurs are using homebrew 6" scopes. You've used the real deal, but that doesn't mean there isn't a lot of virgin territory within the reach of an amateur.
I'd be more concerned with safety and that sort of thing.
I suppose they deserve to know why they're wrong, though. For starters, the article isn't saying anything beyond the fact that there are people who buy used PCR machines and run reactions in their basements. Now, I think that's pretty cool, but nothing meaningful is going to come of it. There are a few reasons for this:
1) Biological Research Is Expensive - Setting up a real lab so you can do the basic stuff, like running gels and Western blots, is going to cost somewhere around $100,000. Note that this isn't like buying a computer, where once you pay for it you're set: actually performing an experiment can cost hundreds of dollars in reagents alone.
2) Biological Research Requires Oversight - If you're going to work with radioisotopes, toxic chemicals, or animals, you need permits that are not available outside an academic or industrial setting. Two of the basic techniques of molecular biology, Northern blots and Southern blots, use radioactive reagents. I'll grant that there are fluorescence-based alternatives, but they're expensive and not widely used.
3) Biological Research Is Tedious and Slow - I think part of the reason computers took off so quickly is that you can run a program and get quick feedback on whether or not it worked. Most biological research is not like that. In graduate school I worked on genetic modifications that were hypothesized to extend the life spans of fruit flies. To see whether a modification worked, I had to assay the flies' life spans over a hundred days. I'd be incredibly excited about the result on day one, but by day forty my interest was mostly exhausted. Biology doesn't give you the instant thrill of knowing a piece of code worked, which is one of the reasons it's not going to take off like personal computers did.
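For a feel of what that assay looks like on the bookkeeping side, here's a minimal sketch (the death-day numbers are invented for illustration):

    # Minimal lifespan-assay tally: given the day each fly died,
    # report the median lifespan and the surviving fraction over time.
    # The death_days list is made-up example data, not real results.
    from statistics import median

    death_days = [62, 71, 74, 80, 80, 85, 88, 91, 97, 103]  # one entry per fly
    cohort = len(death_days)

    print(f"median lifespan: {median(death_days)} days")
    for day in range(0, 110, 10):
        alive = sum(1 for d in death_days if d > day)
        print(f"day {day:3d}: {alive}/{cohort} still alive")

The analysis itself is trivial; the hundred days of waiting are the expensive part.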
I'm not going to single anyone out, but if you've been making comments in this thread there's a high probability you have no idea what you're talking about.
For instance, is cloud-based synthesis/analytics really that far-fetched or impractical? Have you seen the automation we're seeing with microfluidics?
Are you saying that hackers can't order biological systems the way we order custom circuit boards today? The capital cost of the means of production is comparatively high, yet it's very cheap to have individual units made.
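To make the PCB analogy concrete, the ordering step might look something like this. Purely hypothetical: the endpoint, fields, and sequence are all invented for illustration, not any real vendor's API:

    # Hypothetical mail-order gene-synthesis request, PCB-fab style.
    # The URL and request fields are invented; real vendors each have
    # their own ordering interfaces.
    import json
    from urllib import request

    order = {
        "sequence": "ATGGCTAGCAAAGGAGAAGAACTTTTCACT",  # toy insert, not a real design
        "vector": "pUC19",
        "quantity_ug": 5,
    }
    req = request.Request(
        "https://synth.example.com/v1/orders",  # placeholder host
        data=json.dumps(order).encode(),
        headers={"Content-Type": "application/json"},
    )
    # request.urlopen(req) would submit the order if the host existed.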
Techno-skeptics love to say "why would". Futurists like to say "what if". The correct response is probably "how can".
Back then, the real allure was that computers could compute things amazingly quickly and, later, that through the internet we could organize information on computer networks. And that everything could scale really quickly.
We may be entering an era of basement labs, but that doesn't mean that these breakthroughs will necessarily be life changing.
IT is working on the problems of organizing people and information. This will likely produce many more people working on solving human maladies. But all these things being done in basements without any regulation or standards may lead to some health risks for first movers.
The modern computing era started out with hobbyists, Bill Gates and Steve Jobs being two of the most famous examples. That was less than four decades ago; now look where we are. Imagine if we had to wait for large corporations or governments to repeatedly try out new ideas and see what succeeded or failed. Democratizing technology poses risks, as you mention, but also offers great potential for innovation. I hope the upsides of this new era outweigh the downsides.
It's like when I speak with my co-workers and say, 'What would you like this tool to do? Anything you want it to do, I can make it do,' and they reply, 'Well, what sort of things can it do?'
If I recall correctly, the microprocessor came about simply because Intel got tired of re-designing calculator chips from scratch and decided to make a one-size-fits-all chip instead. I doubt anyone saw microprocessors as the future back then; it was just a cheaper, easier way to make a desk calculator.
The real question is: what will biology be able to do that computers aren't able to?
Also maybe cure diseases.
That said, current synthetic biology does not strive to become an alternative to computers; instead it competes with, or works alongside, nanorobots. Perhaps something computer-like will happen once we have synthetic cells figured out and can build large artificial systems with them, but that's a distant future, if it arrives at all, and its ethics will be heavily debated.
Biology "apps" could give you the ability to jump higher, run faster, grow stronger and live longer. They could even make us smarter.
Personally, I find all of that very compelling.
Heh, "changing life" is exactly what biology can do that computers can't.
So let's imagine a malicious person with funding and manpower turns up. Sure, totally synthesised genomes will arrive, and you will be able to email some guy who generates your operon to stick into whatever, so the barriers to entry for a bio-terrorist will be lowered somewhat; but even then, the knowledge of how to create a killer virus will be scarce.
And even after 20+ years of gained knowledge, once a billionaire Bond villain with a fortress of henchmen* has made a killer virus, it is tricky to imagine it out-competing any virus created by nature's hands: she had a several-million-year head start and a hell of a lot more funding.
So, yes, it is very unlikely, but still possible. However, if you were going to go to all that work to spread some virus, I would strongly recommend that you just learn to make nukes instead. They are much more reliable, predictable, and will have less fallout for you to worry about.
*Although a government is far more likely to fill this role.
Heh... I wonder what will play the same role as SCSI this time around...