Hacker News
Why Recursion Pharmaceuticals abandoned cell painting for brightfield imaging (owlposting.com)
66 points by abhishaike 81 days ago | 11 comments



I find this fascinating. I used to do high-content screening of cells with machines like the PerkinElmer Operetta and lots of dyes. I would never have thought machine learning would take us full circle back to brightfield.

The meat of this article:

> The trend of ML in general over the past ~15 years has been to strip away more and more of the biases you’ve encoded about your dataset as you feed it into a model. Computer vision went from hand-crafted interpretable features (e.g. number of circles, number of black pixels when thresholded, etc), to hand-crafted uninterpretable features (e.g. scale invariant feature transform), to automatically extracted uninterpretable features (e.g. hidden dimensions of a convolutional neural network). In other words, the bitter lesson; pre-imposing structure on your data is useful for a human, but detrimental to a machine.

The gist, as I understood it, is that ML could already identify and label the organelles and cell structures in brightfield images without the dyes, and that adding dyes on top just muddied and hindered it.
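The progression the quoted paragraph describes can be sketched in a few lines of numpy. Everything here is made up for illustration: the "image" is random noise standing in for a brightfield frame, the threshold is arbitrary, and the convolution filters are randomly initialized rather than trained. The point is only the contrast between a feature a human can explain and features only the model "understands":

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a brightfield image (random values, not real data).
image = rng.random((64, 64))

def handcrafted_feature(img, threshold=0.2):
    """Hand-crafted, interpretable feature: count of dark pixels
    below a fixed threshold (e.g. candidate nuclei)."""
    return int((img < threshold).sum())

def conv_features(img, n_filters=4, k=3):
    """Automatically extracted, uninterpretable features: mean response
    of each convolution filter (randomly initialized here, standing in
    for filters a network would learn during training)."""
    filters = rng.standard_normal((n_filters, k, k))
    h, w = img.shape
    feats = []
    for f in filters:
        out = np.zeros((h - k + 1, w - k + 1))
        for i in range(h - k + 1):
            for j in range(w - k + 1):
                out[i, j] = (img[i:i + k, j:j + k] * f).sum()
        feats.append(float(out.mean()))  # global average pooling per filter
    return feats

print(handcrafted_feature(image))  # one number with a human-readable meaning
print(conv_features(image))        # numbers with no built-in interpretation
```

The dyes in cell painting play the same role as the threshold above: they pre-impose a human-chosen structure on the input, which a modern network trained end-to-end no longer needs.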


It makes a lot of sense in retrospect, because all the structures are visible in the brightfield image - it's just more tedious for a human to pick out the features across thousands of slides.

I wonder where the dye budgets and spectral bands will go now that there's no need to mark the boundary of the nucleus. I bet there are plenty of things that aren't visible in the refractive index that you could still dye.


My team did part of the work described in TFA, AMA.


I should point out that we are hiring! Software engineering, data science/ML, and IT positions are available in Salt Lake City, Toronto, New York, and London:

https://job-boards.greenhouse.io/recursionpharmaceuticals


Do you guys have a cooldown period for applications?


Not that I know of! Note that the positions we list are typically on different teams, so it's worth reading the descriptions to pick whichever one is most appropriate to your experience and interests (and, as a corollary, if you weren't a fit for a previous position, you may be a fit for one that comes up in the future).


They don't respond to cold applications in my experience, except for a canned rejection after a few months, for some reason.


We definitely do (as in, I myself have) hire folks from "cold" inbound applications.


I got a response to a cold application within the last year.


Do you guys have high hopes for video data? Were there early results that showed promise for kinetic data?


Yes, we (and plenty of others) have prior data showing that perturbations (things you can do to cells, like knocking out genes or applying drug candidates to them) have different effects at different times, so we're excited about the potential of time-course imaging.

Note that "video" is a little different here than the way we all usually think of it - think "minutes or hours between frames" not "frames per second".





