Biology Will Be the Next Great Computing Platform (wired.com)
128 points by _zhqs on May 4, 2018 | 70 comments



As someone who has a PhD in biochemistry and has been coding since age 5, I find the headline ridiculous. You won't be computing in biology. Consider: a transistor is about 50-100 atoms wide. A protein, which cannot itself even be a minimal unit of compute (maybe a molecular transistor like NiFe hydrogenase has a shot), is already bigger than that.

The things described in the article are not biology used for computation, but principles from CS applied to biology, which has some validity. In one case I achieved strides in protein design by reducing the engineering cycle from months to days. Imagine hitting "compile" and having to wait 72 hours to know your result. That's biology, even with a fast organism like yeast or E. coli.


"As someone who has a PhD in biochemistry and has been coding since age 5, I find the headline rediculous. You won't be computing in biology.... principles from CS applied to biology which has some validity."

I think it may be worth reminding people that modern-day computing is only a subset of computation, and not even necessarily a very large one. Computing with biology has extremely different characteristics from computing with modern computers, and the field of Software Engineering may not have much to say about it except in the vaguest of terms, but Computer Science may still have quite a lot to say about the limits of computation on this new substrate, how to do error correction, and all sorts of other things.

A computational formalism can easily be applied to something like an antibody, and of course, anything larger than that. The primary thing that stops us is the fact that we have such a poor picture of what is going on that we don't have the data to create the formalisms correctly, nor machines (biological or otherwise) to manipulate such formalisms. But that's a human limitation and in some sense an engineering limitation, not a limitation of Computer Science.

Whatever we are coding in biology will be radically, radically different. Entire classes of squabbles in our industry become irrelevant; you won't get a concurrency choice (it's just "yes, all the time and everywhere"), there will be no meaningful debate over mutability (again, "yes all the time and everywhere"), "structured programming" is just meaningless since the entire concept of "scope" will be out the window. I don't know what it will look like. But I do know that whatever it is, quite a bit of Computer Science will still apply to it. Biological computing will not be immune to information theory. It won't be immune to complexity theory, although a whole new suite of classes will probably have to arise, and there will be a lot more probability in it (but recall that complexity theory already extensively uses that, so it's not like that's a new thing). And even software engineering won't be entirely tossed out... for instance, "leaky abstractions" will be an even bigger problem than they are for us now.


Computing with biology would be more like the "Movable Feast Machine" computing architecture videos on YouTube. Prepare to be out-neckbearded if you watch them. Not sure how useful it would be. You don't want your bank account running on such a machine.

Part of what makes computing extremely useful is that it's a crutch that lets us do things our feeble human minds can't: logical purity, heavy numerical computation, immutable logging and provenance. At the edges we start to tolerate imprecision in domains like HCI and decision making, where the process statistically yields an improvement (or, if we're being cynical, projects the appearance of one), such as ingesting inhumanly high volumes of data, or where human correction of erroneous processing is close at hand and takes minimal effort.


>Whatever we are coding in biology will be radically, radically different. ... But I do know that whatever it is, quite a bit of Computer Science will still apply to it.

Whenever there's an article about computation and the brain, or computation and biology, there's always someone showing up saying that the comparison is ridiculous because, hey, biology/computers are different.

To me, that's always felt like it was missing a huge point and I think your comment is one of the better ways I've seen it expressed. I hope we're getting closer to a point where we can eventually turn the page on this, and the Hubert Dreyfus-style arguments/declarations about what "computers can't do."


it's not about what "computers can't do", it's about what "biology can't do". Or, more appropriately, "what you could maybe push biology's round peg into a square hole and force it to do, but we should probably stop and think about whether it's worth the effort".


It will be different, but concurrency, mutability, structured programming, and scope are very relevant.

Concurrency: Not all genes are run at all times. Gene regulation can look a lot like code. Ex: if (curr_temp > DANGER_TEMP) { temperature_defense_gene.activate() } (see the toy sketch after this list).

Mutability & Scope: Cells that have the same genome can act very differently (all of your cells have ~the same DNA, but act very differently) and some cells in your body can change roles.

Structured Programming: Biological pathways use a lot of control flows. [1]

[1] https://en.wikipedia.org/wiki/Biological_pathway
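
To make that concrete, here's a toy Python sketch of the temperature-defense rule above. The gene name, threshold, and noise model are all made up for illustration; real regulation is leaky and stochastic, as the reply below elaborates:

    import random

    DANGER_TEMP = 42.0  # hypothetical threshold, degrees C

    class Gene:
        def __init__(self, name):
            self.name = name
            self.active = False

        def activate(self):
            self.active = True

    temperature_defense_gene = Gene("heat_shock_response")

    def regulatory_step(curr_temp):
        # Idealized rule: if (curr_temp > DANGER_TEMP) { ...activate() }
        # Real regulation is noisy: transcription factors bind and unbind
        # stochastically, so model the effective signal with a random term.
        noise = random.gauss(0.0, 1.0)
        if curr_temp + noise > DANGER_TEMP:
            temperature_defense_gene.activate()

    for _ in range(100):  # 100 regulatory "ticks" at a sub-threshold 41.5 C
        regulatory_step(41.5)
    print(temperature_defense_gene.active)  # almost always True anyway

Even below the nominal threshold, the gene almost always ends up activated over enough ticks, which is exactly the kind of leakiness the reply below describes.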


While there's truth in the paradigm you advance, it's not anywhere near as binary or direct as you might be suggesting.

There's a ton of leakage and noise at most stages in biological pathways. Biology actually spends enormous amounts of effort to reduce the noise and leakage, but it's really hard to eliminate entirely.

Let's take the danger-temp gene you use as an example. In bacteria that's mostly true, but even there, what if we're low on nutrients? Or low on a particular nutrient? Or what if we start producing too much protein, which is aggregating because the concentration is too high? Each cell is balancing all these concerns simultaneously, so even very simple biological circuits work/don't work in unexpected ways. What the OP is suggesting is that you're always concurrent with a huge number of different things going on that are far less isolated from your "process" than is traditionally the case in CS.

I agree that concurrency, mutability, scope, all that is relevant, but I think the OP's point that we don't have the theory on how those will apply in a biological context is correct.

Edit: for those interested, some papers on biological noise and the efforts cells use to reduce it:

https://arxiv.org/pdf/1610.00820.pdf
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4006352/
https://www.cell.com/molecular-cell/abstract/S1097-2765(12)0...


> Imagine hitting "compile" and having to wait a 72 hours to know your result.

You mean, something like giving a deck of cards to a computing center operator and eventually getting back a printout of the outcome a few days later?


Yes, but E. coli really is close to the limit of how fast you can go. They're rather like tribbles: before they are done splitting, the next generation's DNA has already started duplicating. The DNA copy time is ~40 minutes and the dividing time is pretty close to 30.
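
As a rough back-of-the-envelope sketch of why that's a hard floor (assuming idealized exponential growth, which no real culture sustains):

    # Idealized E. coli growth: one doubling every ~30 minutes.
    # Real cultures hit nutrient and density limits long before this.
    doubling_time_min = 30
    hours = 72
    generations = hours * 60 // doubling_time_min   # 144 generations
    cells = 2 ** generations                        # from a single cell
    print(f"{generations} generations, ~{cells:.2e} cells (theoretical)")

The point being: the per-generation clock, not the population size, is what bounds the edit-test cycle.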


Clickbait aside, I think there are vast possibilities for DNA to be used as a fabrication tool. Imagine a seed you could plant and it would rapidly grow in the shape of a house, using nothing but sunlight, atmospheric CO2 and a bit of fertilizer.

Also, by controlling the expression of biological features you could create bioelectronic hybrids that leverage the best of both worlds. Imagine you ship to Mars a few pounds of SoC silicon dies, then grow them into an army of elephant-sized drones. They faithfully execute commands with the precision of robots, yet require no maintenance or consumables, living off the land with self-healing capacity.

I believe the idea is not using DNA for general purpose programming as much as controlling biological systems using the paradigms of the software world.


> I believe the idea is not using DNA for general purpose programming as much as controlling biological systems using the paradigms of the software world.

That's not a new idea. I was doing this in grad school. In my postdoc, my boss wanted to rewrite the DNA control structures of our system. I overrode him and told him that the biochemistry needed to be fixed (manipulating the amino acid sequence and not the promoters). On my branch of the track, I got a 16x improvement in yield (we started from trace). On his branch there was no effect.


But what if there was a way to write algorithms that can't be expressed with a von Neumann architecture?

Do you know where I can read about what categories of biocomputation there are? Is there a biological Turing machine that's ever been built, or no, I guess it'd be grown?

As another commenter said, computation is a more vast category than the simplicity of one instruction and then another instruction. Fantastic things can be computed, but most of them are difficult with modern computers.

Parallelization in a biocomputer would also be an important advancement. Like how early computers were room-sized, I can imagine a room full of intricate, strange plants all arriving at their results. That's a lot cheaper too, presumably, than a bunch of GPUs on a rack. Even if it's more expensive, the category of computing it could do might make it still worth it anyway.

Do you think someday that biological computing could perform useful computations?

I'm sure you're busy with important work, you don't have to answer these. I'm sure your knowledge is immense, so your time is precious! Thanks for your comment!


> But what if there was a way to write algorithms that can't be expressed with a von Neumann architecture?

There almost certainly isn’t, due to the universality of computation (see the Church–Turing thesis). At any rate, even if the thesis turns out to be false (almost unthinkable), DNA won’t offer anything beyond what the classical von Neumann architecture offers.

> Do you think someday that biological computing could perform useful computations?

Potentially: proteins are a good potential starting material to build nanomachines. However, this has got nothing to do with this article. As the initial comment said, the article is completely unrelated to biological computing platforms, the title is pure clickbait.


I know it was off the track of the article, I was gravitating to the more interesting idea.

Thanks for your time!


I can't answer you directly but perhaps this helps too.

I recently finished Summa Technologiae by Stanisław Lem, which investigates a lot of these questions in many of its chapters.

I highly recommend it: it's a detailed, vivid exploration written by a very wise person, so much so that it's easy to forget while reading that the book was published more than 50 years ago.

But there's also a practical reason why you might be interested in it even if you wouldn't think anything of a personal(?) recommendation from someone on the internet.

It is based on a lot of academic papers Lem read from all sorts of fields, which are all referenced (at least they were in the version I've read), so it should be easy to then start to look for more recent papers based on those.


I found it online, looks fascinating! Thanks so much!

That's the best kind of book to read, the kind someone wrote after reading a bunch of other books.



Fellow biochemistry Ph.D. here, and even if you could make this work, good luck finding a usable programming model. This makes quantum computers look easy IMO. And that's too bad because if I worked at Synthego, I could walk to work.

That said, the article seems to be about accelerating genetic engineering rather than a new programming substrate. There's probably 9 to 10 figures in that, no?


The whole article is a fine demonstration of what happens when a Silicon Valley journalist writes a piece on biology without an inkling of what it's all about...


The article is fine. The editor (who provided the title) is the one without a clue.


I remember seeing my first article about biology as a computing substrate when I was a kid and thinking OH MY GOD...and then nothing happened for 100 years. (I am very old.)


Forgive me, but did your message not spring from a biological computing device which is arguably the most marvellous material object to exist in the known universe?


You'll have to convince me that what makes us special is hardware and not software.


Not exactly human made.

Also I suspect the article is wrong and we'll be outclassed by AI.


Biological AI?


I wonder if the headline was even written by the author rather than some copy editor. It doesn't seem to match the article, which is quite interesting on how computation is being used in biological research. "Biology Will Be the Next Great Conputing Platform" is kind of silly.


> I wonder if the headline was even written by the author rather than some copy editor

Almost certainly not, and they rarely are.


This is analog computers all over again.

It suffers the same problems, namely lower reliability and a lack of precision.

I don't see how this will stack up to the millions of man-years humanity has invested into silicon based computers.


> This is analog computers all over again.

No it is not. Analog computers were used before good digital computers could be produced, for the same roles and straightforward problems.

Life solves problems at scales that you cannot even touch with "precision", and in completely different ways. The problem is thinking that those systems are supposed to be used instead of computers. In reality, they are there to tackle much different problems. Also see my longer comment here. Whenever you have a problem that you can nicely solve with a silicon-based computer, you don't need a biological system.

Also don't forget that biological systems are still setting the targets for things like AI or robotics, which our silicon-based systems are still very far from reaching. They do that with the added features of self-replication and self-healing.


> No it is not. Analog computers were used before good digital computers could be produced, for the same roles and straightforward problems.

Actually, analogue computing was still widely used for simulation and the like in many industries well into the 70s and 80s, though often augmented with digital computers for various purposes (simulation series, automated recording/reporting, etc.).


I think "70s and 80s" is well within the definition of "before good digital computers could be produced" in the context of many industries. You guys aren't disagreeing.


16-bit minis were, well, not common, but still used in numbers in labs and the like. Consider that digital computers are mostly used for information systems nowadays, whereas analogue computers were not used at all for that purpose. So apart from some inventive examples I won't rule out, no company migrated from paper to analogue computing to digital computing, because analogue computers were not suited for information systems.

That's likely why analogue computers persisted for fairly long after the introduction of digital computers: they solve very different problems, and an analogue computer could solve simulation problems at a speed that the 16-bit mini right next to it would never achieve. Also, they are highly modular (in fact they are nothing but a bunch of fundamental modules like integrators, differentiators and other filters that the user interconnects as required) and very easy to extend with specialized circuitry and the like. Thus, from a purchase/investment point of view, analogue computers could easily be scaled, while this was more difficult with digital computers.


Life doesn't solve "problems" though - it just creates random order from chaos.

What kind of problems do you even have in mind? What would you even solve if we could somehow understand or control these bigger forces of organization?

I agree with the OP, we are not replacing silicon with DNA or anything - this article is just another "tech" article that shouldn't exist.


> Life doesn't solve "problems" though

Wait, what? Biological systems are not solving problems? They do so all the time! Seriously?

> it just creates random order from chaos.

You get that with just physics, on a large scale for example the formation of galaxies and stars, or when things self-sort by weight in a gravity field (heavy stuff goes to bottom of a solution, lighter stuff goes to the top).


I think he meant life in a broader perspective (i.e. evolution), not conscious and aware biological systems such as humans.

I would argue that very few biological systems solve problems, what they do is serve a function which has been dictated by evolution.

But at no point is there any purposeful act of solving problems, just as your genes aren't selfish; they just behave as if they are.


No, they don't solve problems, it's just natural selection in an environment with lots of randomness.

Life didn't even start out by solving problems; as I said, random order from chaos. Life doesn't care about solving problems, it's just a collection of random matter.

Given enough random matter, you are destined to get some with odd properties in the right conditions, which can also spiral out of control to create something like a human.


> No, they don't solve problems,

Of course they do. Being alive presents all kinds of problems. I mean, seriously man... I don't understand the things you write, really strange stuff, I'm at a loss.


Life didn't start out because it had the problem of not being a thing.

Can it still be considered "problem-solving" when the so-called "problems" are only so in the context of animal survival and breeding from the perspective of animal consciousness? Life just randomly grows and takes on random shapes, it's not a force of problem-solving.


Yet, here we are, solving problems.


You are not life, you are a human, a lifeform - product of life.


Life would extinguish itself if it was the force of problem-solving, because that would ultimately solve all the problems.

I don't know how you can create a problem-solving force from an environment where none exists. You can only create forces of chaos from such an environment.


> analog computers all over again

Biological coding is quite digital/discrete in its nature, from genes up; that's why it scales and has things like error correction. You can get all the precision you need if you get EC and statistics right. Now that will be the biggest challenge. (And getting statistics right is essential even for something like a modern CPU, which always has at least a few broken transistors in it but still works.)

What we see as "individual organisms" at the large scale are analog in their functions, but this is an illusion. The actual units of information or transmissible programs are nothing but digital.

Life likes discrete/digital building blocks for complex machines for the same reasons we humans like them in our technology, nothing special here...
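
To make the error-correction point concrete, here's a minimal Python sketch of the digital trick both silicon and biology rely on, as a triple-repetition code with majority voting (an illustration of the principle only, not of any mechanism a cell literally uses):

    import random

    def encode(bits):
        # Store each bit three times (redundancy, like multiple gene copies).
        return [b for bit in bits for b in (bit, bit, bit)]

    def corrupt(bits, flip_prob=0.05):
        # Flip each bit independently with probability flip_prob.
        return [b ^ (random.random() < flip_prob) for b in bits]

    def decode(bits):
        # Majority vote over each group of three copies.
        return [int(sum(bits[i:i + 3]) >= 2) for i in range(0, len(bits), 3)]

    msg = [random.randint(0, 1) for _ in range(1000)]
    received = decode(corrupt(encode(msg)))
    errors = sum(a != b for a, b in zip(msg, received))
    print(f"residual errors: {errors}/1000")  # ~7 instead of the raw ~50

The raw channel loses ~5% of bits; majority voting cuts that to well under 1%. Discrete states are what make this kind of correction possible, which is exactly the point about digital/discrete building blocks.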


Analog computers were just digital computers with a different computational paradigm. That is, they had the same IO mechanisms; just how the Input became Output was slightly tweaked.

Biological computing will be interesting primarily because it will be able to build things, which neither analog nor digital computers are particularly good at. It's an entirely different paradigm. The act of computation can also be the output.


> Analog computers were just digital computers with a different computational paradigm.

No, there are fundamental differences. Digital computation is inherently error correcting because there are only two states: on or off. With analog computing, errors compound over long times and there’s fundamentally no way to fix this (it’s not just “hard”, it’s theoretically impossible). That’s why analog computation is fundamentally limited in scale: it can’t handle complexity through abstraction. Modern computer science is fundamentally tied to digital computation. This is also why evolution relies on a digital (albeit not binary) storage medium (DNA), and if we ever have useful biological computation, it will likewise be digital.


This is another case of "cutting off my quote too early".

Compared to the difference between what biological computation and digital computation is going to look like, what we call analog and digital computers are brothers, because they share the same basic IO. ("No, they have differences..." Yes, they do... but compare to biology. A valid input to a biological computation process will be your physical body. Not some numeric representation of it, but the actual thing. This is inconceivably foreign to either the modern digital computer paradigm or anything we've ever called an analog computer. By comparison, whether someone is typing "17.213" onto a screen or setting a dial somewhere vaguely in the region of 17 is a minor detail!)

Biological computation won't be digital. It'll have digital components, but it will neither be intrinsically digital the way current computing is as it will inevitably be operating in an analog domain, nor will it resemble our current computing paradigms.


This article reads like a PR press release. You can't talk about biocomputers in 2018 without mentioning Tom Knight[1] and iGEM[2]. Knight not only pioneered this "next great computing platform", he also pioneered the current one: Knight was involved in early ARPANET work, Lisp Machines (MIT was the second organization after PARC to build networked personal workstations), and massively parallel SIMD (i.e. GPU-style) work on the Connection Machine.

[1] http://people.csail.mit.edu/tk/
[2] http://igem.org/Main_Page


Synthego's custom CRISPR kit seems like a very useless service. Guide RNA design for knockout is extremely trivial: you just need a couple of 20bp+NGG sequences which are unique to the gene. Who would pay to do that?


That doesn't interfere with any other part of the genome... That's a hard compute problem.


Do you mean the effect of the actual gene knockout, CRISPR specificity, or off-target mutations?


CRISPR specificity. Designing good gRNAs isn’t quite trivial, although it’s not as hard as the article makes it seem. I’m not sure about the claim of reducing the time of experiments by “months”.


http://crispr.mit.edu/about

Designing a good gRNA is quite straightforward if you compare it to almost any other bioinformatics task. Even designing a degenerate primer is more complex than this.
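
For a sense of how trivial the basic candidate search is, a toy Python sketch (the sequence is made up; real tools then score every candidate against the whole genome for off-target hits, which is where the actual compute goes):

    import re

    def candidate_guides(seq):
        """Find 20bp guide candidates immediately 5' of an NGG PAM."""
        seq = seq.upper()
        # Lookahead so overlapping candidates are all reported.
        return [(m.start(1), m.group(1))
                for m in re.finditer(r"(?=([ACGT]{20})[ACGT]GG)", seq)]

    # Toy target sequence; a real pipeline would also check each guide
    # for uniqueness against the rest of the genome.
    target = "ATGGCTAGCTAGGACTTCAGGCTAGCTAAGGCTTACGGATCGATCGTAGG"
    for pos, guide in candidate_guides(target):
        print(pos, guide)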


Conputing? A Freudian slip maybe?


Occasionally I see posts, for example on reddit but also discussions here, about which other programming languages a programmer could learn as the next (and higher) step in one's ongoing education.

My suggestion for what to do after the CS degree, a road I took myself during the last few years, is to go to edX (or Khan Academy for any missing basics) and at the very least take MIT's "Introduction to Biology", which is actually an introduction to genetics (when you learn biology you first have to understand cells). Also neuroscience, for which there are many courses (the longest one being Coursera's "Medical Neuroscience", with an excellent teacher), from the basics (https://www.mcb80x.org/) to computational neuroscience (https://www.edx.org/course/computational-neuroscience-neuron...).

I found understanding the basics of biology a lot more informative than learning new twists on some functional programming concept. It introduces you to a massively(!) parallel world where statistics rules, and where errors/outliers are actually essential to functioning biological systems. You may find that an "error", for example a protein being made when it's not supposed to be needed, is actually essential, because only by making it does the cell notice that a different (better) fuel option has become available (that is a concrete example, see the linked course :-)).

So definitely add plenty of statistics courses, until you learn to think "big". Each time you think about a problem, think about that thing happening a trillion times instead of in individual cases, until this becomes a habit. This helped me a lot, because most people focus their minds on single examples, for example when making suggestions for how to improve society (hint: suggestions that surely work for any one person, but that would quickly unravel if everyone tried them). Okay, in the last two sentences I'm going out on a limb (claiming that learning biology and statistical thinking helped), but definitely try some biology, statistics and neuroscience, guys. It's all free and most of it high quality. And the statistics knowledge is just as useful for machine learning, so you need it anyway.

Also recommended: "Principles of Biochemistry" (https://www.edx.org/course/principles-biochemistry-harvardx-...), although I would say you should not try to do all the exercises, because trying to learn all that stuff by heart is way too much work (and highly demotivating, given how the discussion forum shows the number of students thinning out quickly from course section to section). Also: "Cell Biology: Mitochondria" (https://www.edx.org/course/cell-biology-mitochondria-harvard...) to understand where the power comes from.

Also check out individual universities; some have a lot of free courses (sometimes using the edX platform software, which is open source, e.g. Stanford) that they don't put on 3rd-party platforms like edX.


IMHO these courses only scratch the surface: good for learning a thing or two, but without much applicability. I am right now searching for an online degree path in Biochemistry or Molecular Biology. I want to spend time on it, but also be able to actually apply my knowledge. There are so few of them (maybe because of the lab classes, idk). I've found online degrees at ASU (https://asuonline.asu.edu/online-degree-programs/undergradua...) and UF (https://ufonline.ufl.edu/degrees/undergraduate/biology), but they are really expensive. Biology is the future, but unlike the article, I think it will make software obsolete altogether.


> IMHO these courses only scratch the surface

Of course - that is what I recommend to programmers and CS majors working in those jobs: something to do on the side, not as a career path.

But in any case, those are "real" courses, so they "scratch the surface" not because they are dumbed down but because they are the freshman courses. Of course year-2+ students get more advanced courses not usually found on edX (although edX does have quite advanced topics in physics, for example https://www.edx.org/course/mastering-quantum-mechanics-part-...).

As I said, it's an alternative to learning yet another only mildly different programming language (one that runs on the exact same pieces of silicon as the others they already know, so it cannot be fundamentally different by definition).


Did you by any chance come across an online course for biochemistry or something similar? I've wanted for quite some time now to study biology and change my career in that direction, but no relevant degree seems to be designed for remote study.


Look at the "Principles of Biochemistry" course I linked to. Even though OP says it's "scratching the surface", that was in comparison to a complete multi-year degree; that course by itself is extremely involved and pure biochem. It's "only" a single course, but let's see what you say after unit 3, because judging by the forum participation, about 99% of people who join that course won't even make it past the 2/3rd mark. So if you can stomach that one course, it would be a good sign. I've read that about chemistry in general (and it should be the same for biochem): if you study in those fields, the workload is quite extreme.


Would you mind talking about what field or area you work in?

I came from a bioengineering background and ended up doing a lot of computational work (signals and communication systems, ML, FEA), so I feel like we are arriving at the same conclusion from opposite sides.


I studied CS, worked as a consultant and at a major Linux company, later as a freelancer. Learning stuff feels good :-) Even better when it's something I never ever expected to learn, since I thought that after choosing my field of study for university my path was all set. Thanks Internet - possibilities truly have increased by orders of magnitude compared to my youth!


Is there any project or discipline aiming to model our biology in software so that we can simulate things on a computer? (Edit: I found https://en.m.wikipedia.org/wiki/Modelling_biological_systems, which I guess is it.)


This is one of the main subfields of bioinformatics.
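
For a flavor of what such modelling looks like at its simplest, here's a sketch of a standard two-equation mRNA/protein gene-expression model, integrated with plain Euler steps (all rate constants are invented for illustration):

    # Minimal gene-expression model:
    #   d(mRNA)/dt    = k_transcribe - d_m * mRNA
    #   d(protein)/dt = k_translate * mRNA - d_p * protein
    k_transcribe, d_m = 2.0, 0.2    # transcription rate, mRNA decay
    k_translate, d_p = 1.0, 0.05    # translation rate, protein decay

    mrna, protein, dt = 0.0, 0.0, 0.01
    for _ in range(int(200 / dt)):  # simulate 200 time units
        mrna += (k_transcribe - d_m * mrna) * dt
        protein += (k_translate * mrna - d_p * protein) * dt

    print(f"steady state: mRNA ~ {mrna:.1f}, protein ~ {protein:.1f}")
    # Analytic steady state: mRNA = 10, protein = 200; Euler converges there.

Real systems-biology models add stochasticity (e.g. via the Gillespie algorithm), feedback loops, and many coupled species, but this is the basic shape of the field.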


At this point I flee from anything with "computing" attached. I feel it's a reflexive fad that invades every field.


Surprised there isn't a major branch of philosophy called computational philosophy, given how some computer scientists treat computation as central to existence.



* CoMputing (typo in the title)


Huh? I thought it’s been the computing platform for roughly the past billion years.


"Conputing"?


Please fix Conputing.


Typo in the title: "conputing" should be "computing".


A serendipitous typo though, putting emphasis on the parallelism. It's the same prefix anyway.



