Hacker News

AR/VR visualization of 3D and 4D microscopy data (multicolor 3D video, as well as 3D point clouds over time) for biological research.

Look for "lattice light sheet microscopy" or "superresolution microscopy" such as (3D)-STORM or STED.

These techniques are being adopted rapidly. Groups spend $500,000 and often more on the hardware. They can produce terabytes of data within days, but we hardly have any tools to view and interact with it. (And the people doing the analysis are overwhelmed.)

Imagine a holographic video of a living cell (potentially in near real time) where you can zoom by grabbing the hologram.




I'm intrigued! How do you interact with these images right now? Do you display them on screen and rotate them with the mouse? Can you elaborate a bit more on how a VR/AR solution would add value here, other than being fancier?


Manipulation with the mouse if you're lucky. Often this would involve Matlab, or some specialized software. But in many cases we don't even have this and people use Fiji to scroll through the z-dimension with a slider.

For presentations people often render a movie with Matlab, investing hours to get it right. With AR, you could take a movie by filming with a virtual camera in your hand.

Augmented Reality would add the most value. Some examples:

+ intuitive exploration of the data (imagine learning about a plant by scrolling through cross-sections)

+ intuitive manipulation of the perspective (the mouse 3D rotation thing is really tricky)

+ collaborative viewing

+ annotation of objects (e.g. tracing a filament through 3D by following it with a finger)

+ avoid occlusion by just zooming and moving in

+ be able to point at things in a 3D image

It would bring much more natural ways to interact with the data. Essentially scaling up your molecular structure 10^6 to 10^7-fold so you can explore it as you'd explore a sculpture.


A friend just did his diploma thesis on this. He built a system where you hold a tablet in your hand and push it through 3D space to visualize the layers. I couldn't possibly describe it well enough, but I'll forward him a link to this thread and see if he answers.


When I interned at Autodesk, they had something similar for movie making.

They call it the virtual camera. It's essentially a tablet that you use as if it were a camera into a virtual world.

Avatar was made with that device.



Do you know where I could source some sample data? I'm developing a VR data visualization system but not in this particular context, could certainly look into it.


Here, follow the instructions to download and install the HDF5 plugin for Fiji:

http://lmb.informatik.uni-freiburg.de/resources/opensource/i...

From the same site you can download "pollen.h5". Then follow the steps to load the dataset in "hyperstack (multichannel)" mode. This will open a window where each frame in the movie is a z-slice of the 3D image.

Then go to Plugins/Volume Viewer (scroll down) and switch the mode (top left) to "Volume". (this is what you should get: http://i.imgur.com/0QoGa1t.png)

The same people also published lots of 3D data of the zebrafish embryo. Have a look at:

http://vibez.informatik.uni-freiburg.de/

Look for "e098.h5" to download it.

If you have never worked with the HDF5 format before, you can use HDFView (https://support.hdfgroup.org/products/java/hdfview/) to look inside the files. As you can see here (http://i.imgur.com/nnTiyQh.png), the data in pollen.h5 is basically a uint8 array of size 193x199x419.

There are HDF5 libraries for many languages, e.g. "h5py" for Python.
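To sketch what reading such a file with h5py looks like: the dataset path inside the real pollen.h5 is an assumption here ("volume" is a placeholder; use HDFView to find the actual name). This example writes a synthetic uint8 volume with the same 193x199x419 shape and reads it back, just to show the API.

```python
import os
import tempfile

import h5py
import numpy as np

# Synthetic stand-in for pollen.h5: a uint8 volume of 193x199x419.
# "volume" is a placeholder dataset name, not the real path in the file.
path = os.path.join(tempfile.mkdtemp(), "pollen_demo.h5")
vol = np.random.randint(0, 256, size=(193, 199, 419), dtype=np.uint8)

with h5py.File(path, "w") as f:
    f.create_dataset("volume", data=vol, compression="gzip")

# Reading works the same way for a downloaded file:
with h5py.File(path, "r") as f:
    print(list(f.keys()))    # dataset names at the file's root
    data = f["volume"][:]    # load the full array into memory

print(data.shape, data.dtype)  # (193, 199, 419) uint8
```

With a real file you would just open it read-only and slice the dataset; h5py also lets you read sub-volumes (e.g. `f["volume"][50:60]`) without loading the whole array.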


Thanks for the references, looking.

By the way, check out https://www.youtube.com/watch?v=-PVAcLlYUpg


This is cool!


[follow up]

For light-sheet 3D video of a developing fly embryo, take a look here: http://www.digital-embryo.org/ They have movies as well as downloads of the raw data and a Matlab pipeline for analysis.

Note that this was published in 2012. Fresh data doesn't get published before the analysis is done and the paper is written.


Is this volumetric data?


Yes.

These are images taken by confocal microscopy: you focus a laser scanning microscope on a single plane and image only that plane. Then you move the focal plane up a bit and take the next picture.
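Stacking those per-plane images along z gives you the volume. A minimal NumPy sketch (with a synthetic stack, since it assumes no particular dataset) of the kind of thing viewers then do with it, here a maximum-intensity projection along z:

```python
import numpy as np

# A confocal stack: one 2D image per focal plane, stacked along z.
# Synthetic example: 50 z-slices of 64x64, with one bright voxel
# placed at a known position so the projection is easy to check.
stack = np.zeros((50, 64, 64), dtype=np.uint8)
stack[20, 30, 40] = 255  # bright structure somewhere in the volume

# Maximum-intensity projection: for each (y, x) pixel, keep the
# brightest value across all focal planes.
mip = stack.max(axis=0)

print(mip.shape)     # (64, 64)
print(mip[30, 40])   # 255 -- the bright voxel survives the projection
```

This is essentially what Fiji's Volume Viewer computes in "Volume" mode, just done per pixel on the CPU instead of via ray casting.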

Super-resolution microscopy such as STORM, STED or PALM, on the other hand, gives you point coordinates rather than voxel intensities.




