
Biology Will Be the Next Great Computing Platform - joubert
https://www.wired.com/story/biology-will-be-the-next-great-computing-platform/
======
dnautics
As someone who has a PhD in biochemistry and has been coding since age 5, I
find the headline ridiculous. You won't be computing in biology. Consider: a
transistor is about 50-100 _atoms_ wide. A protein, which cannot itself even
be a minimal unit of compute (maybe a molecular transistor like NiFe
hydrogenase has a shot), is already bigger than that.

The things described in the article are not biology used for computation, but
principles from CS applied to biology, which has some validity. In one case I
achieved strides in protein design by reducing the engineering cycle from
months to days. Imagine hitting "compile" and having to wait 72 hours to know
your result. That's biology, even with a fast organism like yeast or E. coli.

~~~
jerf
"As someone who has a PhD in biochemistry and has been coding since age 5, I
find the headline rediculous. You won't be computing in biology.... principles
from CS applied to biology which has some validity."

I think it may be worth reminding people that modern-day computing is only a
subset of computation, and not even necessarily a very large one. Computing
with biology has _extremely_ different characteristics from computing with
modern computers, and the field of Software Engineering may not have much to
say about it except in the vaguest of terms, but _Computer Science_ may still
have quite a lot to say about the limits of computation on this new substrate,
how to do error correction, and all sorts of other things.

A computational formalism can easily be applied to something like an
antibody, and of course, anything larger than that. The primary thing that stops
us is the fact that we have such a poor picture of what is going on that we
don't have the data to create the formalisms correctly, nor machines
(biological or otherwise) to manipulate such formalisms. But that's a human
limitation and in some sense an _engineering_ limitation, not a limitation of
Computer Science.

Whatever we are coding in biology will be radically, radically different.
Entire classes of squabbles in our industry become irrelevant; you won't get a
concurrency choice (it's just "yes, all the time and everywhere"), there will
be no meaningful debate over mutability (again, "yes all the time and
everywhere"), "structured programming" is just meaningless since the entire
concept of "scope" will be out the window. I don't know what it _will_ look
like. But I do know that whatever it is, quite a bit of Computer Science will
still apply to it. Biological computing will not be immune to information
theory. It won't be immune to complexity theory, although a whole new suite of
classes will probably have to arise, and there will be a lot more probability
in it (but recall that complexity theory already extensively uses that, so
it's not like that's a new thing). And even software engineering won't be
_entirely_ tossed out... for instance, "leaky abstractions" will be an even
bigger problem than they are for us now.
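
To make the error-correction point concrete, here is a minimal sketch (plain
Python, nothing biological about it, all numbers made up) of the simplest
trick information theory offers an unreliable substrate: a repetition code
with majority-vote decoding.

    import random

    def noisy_copy(bit, p_flip):
        # a channel that flips the bit with probability p_flip
        return bit ^ 1 if random.random() < p_flip else bit

    def send_with_repetition(bit, n_copies, p_flip):
        # encode as n_copies redundant copies, decode by majority vote
        copies = [noisy_copy(bit, p_flip) for _ in range(n_copies)]
        return 1 if sum(copies) > n_copies // 2 else 0

    random.seed(0)
    P_FLIP = 0.1      # 10% of copies get corrupted in transit
    TRIALS = 100_000
    for n in (1, 3, 9):
        errors = sum(send_with_repetition(1, n, P_FLIP) != 1
                     for _ in range(TRIALS))
        print(f"{n} copies -> decoded error rate {errors / TRIALS:.4f}")

With a 10% per-copy corruption rate, three copies already cut the decoded
error rate to about 3%, and nine copies to under 0.1%. Redundancy of roughly
this flavor (many copies of every molecule) is arguably how biology gets
reliable behavior out of unreliable parts.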

~~~
glenstein
>Whatever we are coding in biology will be radically, radically different. ...
But I do know that whatever it is, quite a bit of Computer Science will still
apply to it.

Whenever there's an article about computation and the brain, or computation
and biology, there's always someone showing up saying that the comparison is
ridiculous because, hey, biology/computers are different.

To me, that's always felt like it was missing a huge point and I think your
comment is one of the better ways I've seen it expressed. I hope we're getting
closer to a point where we can eventually turn the page on this, and the
Hubert Dreyfus-style arguments/declarations about what "computers can't do."

~~~
dnautics
it's not about what "computers can't do", it's about what "biology can't do".
Or, more appropriately, "what you could maybe push biology's round peg into a
square hole and force it to do, but where we should probably stop and think
about whether it's worth the effort".

------
tim333
I wonder if the headline was even written by the author rather than some copy
editor. It doesn't seem to match the article, which is quite interesting on
how computation is being used in biological research. "Biology Will Be the
Next Great Conputing Platform" is kind of silly.

~~~
klmr
> I wonder if the headline was even written by the author rather than some
> copy editor

Almost certainly not, and they rarely are.

------
tristanj
This is analog computers all over again.

It suffers the same problems, namely lower reliability and a lack of
precision.

I don't see how this will stack up to the millions of man-years humanity has
invested into silicon based computers.

~~~
ItsMe000001
> _This is analog computers all over again._

No it is not. Analog computers were used before good digital computers could
be produced, for the same roles and straightforward problems.

Life solves problems at scales that you cannot even touch with "precision",
and in completely different ways. The problem is thinking that those systems
are supposed to be used _instead_ of computers. In reality, they are meant to
tackle much different problems. Also see my longer comment here. Whenever you
have a problem that you can nicely solve with a silicon-based computer, you
don't need a biological system.

Also don't forget that biological systems still set the targets for things
like AI or robotics, targets which our silicon-based systems are still very
far from reaching. And they do that with the added features of self-
replication and self-healing.

~~~
blattimwind
> No it is not. Analog computers were used before good digital computers could
> be produced, for the same roles and straightforward problems.

Actually, analogue computing was still widely used for simulation and the
like in many industries well into the 70s and 80s, though often augmented
with digital computers for various purposes (simulation series, automated
recording/reporting, etc.).

~~~
CamTin
I think "70s and 80s" is well within the definition of "before good digital
computers could be produced" in the context of many industries. You guys
aren't disagreeing.

~~~
blattimwind
16 bit minis were, well, not common, but still used in numbers in labs and
the like. Consider that digital computers are mostly used for information
systems nowadays, whereas analogue computers were not used at all for that
purpose. So apart from some inventive examples I won't rule out, no company
migrated from paper to analogue computing to digital computing, because
analogue computers were not suited for information systems.

That's likely why analogue computers persisted for fairly long after the
introduction of digital computers: they solve very different problems, and an
analogue computer could solve simulation problems at a speed that the 16 bit
mini right next to it would never achieve. Also, they are highly modular (in
fact they are _nothing but_ a bunch of fundamental modules like integrators,
diffs and other filters that the user interconnected as required) and very
easy to extend with specialized circuitry and the like. Thus, from a
purchase/investment point of view, analogue computers could easily be scaled,
while this was more difficult with digital computers.
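
To make the "nothing but fundamental modules" point concrete, here is a
minimal sketch (Python standing in for the patch cables; purely illustrative)
of the canonical analogue setup: two integrators and an inverter wired in a
feedback loop, which solves x'' = -x, a harmonic oscillator, continuously
rather than instruction by instruction.

    import math

    dt = 0.001       # integration time step
    x, v = 1.0, 0.0  # initial conditions: position 1, velocity 0
    t = 0.0
    while t < 2 * math.pi:   # run for one full oscillation period
        a = -x               # inverter: acceleration = -position
        v += a * dt          # first integrator: acceleration -> velocity
        x += v * dt          # second integrator: velocity -> position
        t += dt

    print(f"x after one period = {x:.4f} (exact cosine returns to 1.0)")

On a real analogue machine those same three modules run in true parallel and
in continuous time, which is exactly why they beat the 16 bit mini next door
on simulation workloads.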

------
sedachv
This article reads like a PR press release. You can't talk about biocomputers
in 2018 without mentioning Tom Knight[1] and iGEM[2]. Knight not only
pioneered this "next great computing platform," he also pioneered the current
one: Knight was involved in early ARPANET work, Lisp Machines (MIT was the
second organization after PARC to build networked personal workstations), and
massively parallel SIMD (i.e. GPU-style) work on the Connection Machine.

[1] [http://people.csail.mit.edu/tk/](http://people.csail.mit.edu/tk/) [2]
[http://igem.org/Main_Page](http://igem.org/Main_Page)

------
biridir
Synthego's custom CRISPR kit seems like a very useless service. Guide RNA
design for knockout is extremely trivial. You just need a couple of 20bp+NGG
sequences which are unique to the gene. Who would pay to do that?

~~~
toufka
That doesn't interfere with any other part of the genome... That's a hard
compute problem.

~~~
biridir
Do you mean the effect of the actual gene knockout, CRISPR specificity, or
off-target mutations?

~~~
klmr
CRISPR specificity. Designing _good_ gRNAs isn’t quite trivial, although it’s
not as hard as the article makes it seem. I’m not sure about the claim of
reducing the time of experiments by “months”.

~~~
biridir
[http://crispr.mit.edu/about](http://crispr.mit.edu/about)

Designing a good gRNA is quite straightforward if you compare it to most
other bioinformatics tasks. Even designing a degenerate primer is more
complex than this.
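
For what it's worth, here is a minimal sketch of the "straightforward" part
(toy sequences, naive exact-match uniqueness check; real tools like the MIT
scorer linked above also score partial off-target matches, which is where the
hard compute actually lives):

    import re

    def candidate_guides(gene_seq):
        # naively enumerate SpCas9 guides: any 20 bp followed by an NGG PAM
        return [m.group(1)
                for m in re.finditer(r"(?=([ACGT]{20})[ACGT]GG)", gene_seq)]

    def revcomp(seq):
        return seq.translate(str.maketrans("ACGT", "TGCA"))[::-1]

    def is_unique(guide, genome_seq):
        # naive specificity check: exactly one exact hit across both strands
        return genome_seq.count(guide) + genome_seq.count(revcomp(guide)) == 1

    # toy data, purely illustrative
    gene = "ATGGCTAGCTAGGATCCGGATTACAGGTTGGCCAATTGGCGG"
    genome = gene + "TTTT" + "ACGT" * 50
    for g in candidate_guides(gene):
        print(g, "unique" if is_unique(g, genome) else "repeated")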

------
cup-of-tea
Conputing? A Freudian slip maybe?

------
ItsMe000001
Occasionally I see posts, for example on reddit but also discussions here,
about which other programming languages a programmer could learn as the next
(and higher) step in one's ongoing education.

My suggestion for what to do after the CS degree, a road I took myself during
the last few years, is to go to edX (or Khan Academy for any missing basics)
and at the very least take MIT's "Introduction to Biology", which actually is
an introduction to genetics (when you learn biology you first have to
understand cells). Also try _neuroscience_, for which there are many courses
(the longest one is on Coursera, "Medical Neuroscience", with an excellent
teacher), from the basics ([https://www.mcb80x.org/](https://www.mcb80x.org/))
to computational neuroscience ([https://www.edx.org/course/computational-
neuroscience-neuron...](https://www.edx.org/course/computational-neuroscience-
neuronal-dynamics-of-cognition)).

I found understanding the basics of biology a lot more informative than
learning new twists on some functional programming concept. It introduces you
to a _massively_ (!) parallel world where statistics rules, and where
errors/outliers are actually essential to functioning biological systems. You
may find that an "error", for example a protein being made even though it
isn't supposed to be needed, is actually essential, because only by making it
does the cell notice that a different (better) fuel option has become
available (that is a concrete example, see the linked course :-)). So
definitely add plenty of statistics courses, until you learn to think "big".
Each time you think about a problem, think about that thing happening a
trillion times instead of in individual cases, until this becomes a habit.
This helped me a lot, because most people focus their minds on single
examples, for example when making suggestions for how to improve society
(hint: suggestions that sure work for _anyone_, but that would quickly
unravel if _everyone_ tried them). Okay, in the last two sentences I'm going
out on a limb (claiming that learning biology and statistical thinking
helped), but definitely try some biology, statistics and neuroscience, guys.
It's all free and most of it high quality. And the statistics knowledge is
just as good for machine learning, so you need that anyway.
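
To make the "trillion times" habit concrete, here is a toy simulation of my
own (loosely inspired by the leaky-expression example above; all numbers made
up): every single cell behaves randomly, yet the population behaves almost
deterministically.

    import random

    random.seed(1)
    N_CELLS = 10_000
    P_LEAKY = 0.02  # per-tick chance a cell makes the "useless" protein

    switched = 0
    for tick in range(1, 201):
        # each not-yet-switched cell may stumble on the new fuel this tick
        newly = sum(random.random() < P_LEAKY
                    for _ in range(N_CELLS - switched))
        switched += newly
        if tick % 50 == 0:
            print(f"tick {tick}: {switched / N_CELLS:.1%} switched")

Any individual cell's switch time is wildly variable (geometric, mean 1/0.02
= 50 ticks), but the population fraction tracks 1 - (1 - 0.02)^t almost
exactly. That jump from the single noisy case to the predictable aggregate is
exactly the statistical thinking I mean.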

Also recommended: "Principles of Biochemistry"
([https://www.edx.org/course/principles-biochemistry-
harvardx-...](https://www.edx.org/course/principles-biochemistry-harvardx-
mcb63x-1)), although I would say you should not try to do all the exercises,
because it will be way too much work to learn all that stuff by heart (and
highly demotivating: the discussion forum shows how the number of students
very quickly thins out from course section to section). Also: "Cell Biology:
Mitochondria" ([https://www.edx.org/course/cell-biology-
mitochondria-harvard...](https://www.edx.org/course/cell-biology-mitochondria-
harvardx-mcb64-1x-1)) to understand where the power comes from.

Also check out individual universities; some have a lot of free courses
(sometimes using the edX platform software, which is open source, e.g.
Stanford) that they don't put on 3rd-party platforms like edX.

~~~
devgutt
IMHO these courses only scratch the surface, good for learning a thing or
two, but without much applicability. I am right now searching for an online
degree path in Biochemistry or Molecular Biology. I want to spend time on it,
but also be able to actually apply my knowledge. There are so few of them
(maybe because of the lab classes, idk). I've found online degrees at ASU
([https://asuonline.asu.edu/online-degree-
programs/undergradua...](https://asuonline.asu.edu/online-degree-
programs/undergraduate/biological-sciences-bs)) and UF
([https://ufonline.ufl.edu/degrees/undergraduate/biology](https://ufonline.ufl.edu/degrees/undergraduate/biology)),
but they are really expensive. Biology is the future, but unlike the article,
I think it will make software obsolete altogether.

~~~
kharak
Did you by any chance come across an online course for biochemistry or
something similar? I've wanted for quite some time now to study biology and
change my career in that direction, but no relevant degree seems to be
designed for remote study.

~~~
ItsMe000001
Look at the "Principles of Biochemistry" course I linked to. Even though OP
says it's "scratching the surface", that was in comparison to a complete
multi-year course of study; that course by itself _is_ extremely involved and
pure biochem. It's "only" a single course, but let's see what you say after
unit 3 - because judging by the forum participation, about 99% of the people
who join that course won't even make it past the 2/3rd mark. So if you can
stomach that one course, it would be a good sign. I've read that about
chemistry in general, and it should be the same for biochem: if you study in
those fields, the load is quite extreme.

------
alfonsodev
Is there any project or discipline aiming to model our biology in software so
that we can simulate things on a computer? (Edit) I found this:
[https://en.m.wikipedia.org/wiki/Modelling_biological_systems](https://en.m.wikipedia.org/wiki/Modelling_biological_systems)
I guess that's it.

~~~
klmr
This is one of the main subfields of bioinformatics.

------
agumonkey
At that point I flee from anything with "computing" in it. I feel it's a
reflexive fad that invades every field.

~~~
goatlover
Surprised there isn't a major branch of philosophy called computational
philosophy, given how some computer scientists treat computation as central
to existence.

~~~
tim333
There's this
[https://en.wikipedia.org/wiki/Computational_theory_of_mind](https://en.wikipedia.org/wiki/Computational_theory_of_mind)

------
mherrmann
* CoMputing (typo in the title)

------
throwaway84742
Huh? I thought it’s been the computing platform for roughly the past billion
years.

------
TOSsedAway09876
"Co _n_ puting"?

------
gm-conspiracy
Please fix _Conputing_.

------
simonebrunozzi
Typo in the title: "conputing" should be "computing".

~~~
arketyp
A serendipitous typo though, putting emphasis on the parallelism. It's the
same prefix anyway.

