
A lot of things in chemistry and biology: finding catalysts for nitrate fertiliser production (which uses around 5% of the world's power), perhaps catalysts in cement production (my idea), and the invention of new medicines, either through better cell simulations or (my idea) faster protein folding.



Is the bottleneck preventing better cell simulations computational, or is it a lack of data / insufficient understanding of biology to inform a predictive model?


My understanding as a physics undergrad is that we can write down the 'equations of motion' of molecular systems with Hamiltonians describing the energies and correlations of the electrons in a system, but these very quickly become intractable to solve: the dimension of the state space grows exponentially with the number of particles, so you're trying to solve a very, very big eigenvalue problem.
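To make the scaling concrete, here's a toy classical sketch in numpy. It builds a small nearest-neighbour Heisenberg spin chain (a stand-in for a molecular Hamiltonian, chosen because it's short to write down, not because it's what you'd use for real chemistry) and diagonalizes it exactly. The matrix dimension doubles with every spin, which is exactly why this approach dies somewhere around 30-40 particles.

```python
import numpy as np

# Pauli matrices and the 2x2 identity
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def kron_chain(ops):
    """Tensor product of a list of 2x2 operators."""
    out = np.array([[1.0 + 0j]])
    for op in ops:
        out = np.kron(out, op)
    return out

def heisenberg_hamiltonian(n):
    """H = sum_i (X_i X_{i+1} + Y_i Y_{i+1} + Z_i Z_{i+1}) on an open chain of n spins."""
    dim = 2 ** n          # Hilbert-space dimension: exponential in n
    H = np.zeros((dim, dim), dtype=complex)
    for i in range(n - 1):
        for P in (X, Y, Z):
            ops = [I2] * n
            ops[i] = P
            ops[i + 1] = P
            H += kron_chain(ops)
    return H

n = 8
H = heisenberg_hamiltonian(n)
print(H.shape)  # (256, 256) -- and it doubles with every extra spin

# The "very big eigenvalue problem": exact diagonalization
ground_energy = np.linalg.eigvalsh(H)[0]
print(ground_energy)
```

At n = 8 this is instant; at n = 40 the matrix would have ~10^12 rows and exact diagonalization is hopeless, which is the bottleneck the quantum-simulator approach below is meant to sidestep.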

However, with the right kind of quantum computer or quantum simulator, you could construct a system of qubits that is described by the exact same Hamiltonian. That way, the quantum state of your qubits and your original system would behave in exactly the same way. Then, you just let the qubits evolve in time and read out the system's state at the end. Do that a bunch of times and you'll see an average picture of what the original system you're modelling (protein or something) would do.

So to recap: we can get around the compute bottleneck of high-fidelity physics simulation by building a physical system that follows the same rules as the one we care about, and which we can probe much more easily.
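The "let the qubits evolve, read out, repeat" procedure above can be sketched classically for a tiny system (which of course only works because the system is tiny - the whole point of the quantum device is that this emulation becomes impossible at scale). The Hamiltonian below is a hypothetical 2-qubit example, just for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-qubit Hamiltonian (hypothetical couplings, purely illustrative)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)
H = np.kron(Z, Z) + 0.5 * (np.kron(X, I2) + np.kron(I2, X))

# Start in |00> and evolve under U = exp(-iHt),
# computed via the eigendecomposition of H
psi0 = np.zeros(4, dtype=complex)
psi0[0] = 1.0
t = 1.3
evals, evecs = np.linalg.eigh(H)
psi_t = evecs @ (np.exp(-1j * evals * t) * (evecs.conj().T @ psi0))

# "Read out the state" many times: each shot collapses to one basis
# state with probability |amplitude|^2; averaging shots recovers the picture
probs = np.abs(psi_t) ** 2
shots = rng.choice(4, size=10_000, p=probs)
counts = np.bincount(shots, minlength=4)
print(probs.round(3), counts)
```

On a real quantum simulator you never see `psi_t` directly; you only ever get the measurement counts, which is why you "do that a bunch of times" to build up the average picture.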


I'm a cofounder of a start-up working on near-term applications of quantum computing in biology, specifically on the protein structure side of things (https://www.proteinqure.com). There are many self-contained subproblems in this space which are not limited by data because models are accurate enough to inform experiments, but there's probably not a scientist in the world who would say we have a sufficient understanding of biology to make predictive models with generality.


That's interesting, what are some examples of those subproblems?


Not an expert, but my understanding is that while better knowledge/models of the biology might be useful, ultimately it's just a massive computational project to test out and optimize protein folding and similarly complex chemical behaviour. The physics/chemistry and biology are understood well enough; there are just nearly countless paths to explore, which is what makes it such a huge problem.


Nature manages to fold proteins quickly enough. What's nature doing that we can't?


Parallelization at the single-particle level, at every point in space.


Nature manages to produce weather patterns easily enough; why can't we?

Your question doesn't make sense. Many (most) things in nature are incredibly hard to model correctly. Just because they happen doesn't mean they're easy to quantify usefully.


Nature knows the recipe; we're still searching for one.


I wouldn't expect a clear bottleneck, because increased computational capacity leads to increased understanding.



