Hacker News

While there has been a huge growth in brain research over the last decade or so, the methods and analysis are still immature, and results are often quite fuzzy (even if researchers would like their reviewers to believe otherwise). There also doesn't seem to be a clear goal here, other than to throw a bunch of money at some interesting questions. The Genome Project had the very clear goal of mapping the human genome. The European project aims to simulate a human brain. This just seems to be directing a lot of money at research which is already being done.

That being said, I think there might be some very interesting opportunities here for talented developers. When I worked in the field a few years ago, the software used for this stuff was generally a lot of MATLAB scripts with C subroutines held together by some scripting-language duct tape (Python/Bash). It was slow, it was buggy, we were basically writing documentation as we figured it out ourselves, different labs had different methods and scripts, and in general it was kind of a mess. It's actually bad enough that there are research grants out there to develop better analysis software. Most of the researchers are trained in statistics, neuroscience, and psychology but have very little programming experience. If you are interested in some of the software that is out there right now, below is a partial list.

Also, if you are interested in this kind of research but can't contribute code-wise, contact your local research university. They are always looking for test subjects for this kind of stuff; you usually get paid pretty well ($50-100/hour for imaging studies), and you get some cool images of your own brain out of it if you ask.

http://www.fil.ion.ucl.ac.uk/spm/

http://fsl.fmrib.ox.ac.uk/fsl/fslwiki/FSL

http://afni.nimh.nih.gov/afni

http://surfer.nmr.mgh.harvard.edu/

http://nipy.sourceforge.net/

http://www.trackvis.org/




I've thought a lot about why fMRI data analysis tools are so terrible, and I've come up with a few reasons:

- It's impossible to implement fMRI analysis algorithms efficiently in most "dynamic" programming languages because of the performance hit they impose. (It might actually be possible in Julia, NumPyPy, or Python with Numba, but these languages are not yet well established.) On the other hand, dynamic languages are much better suited to exploratory data analysis than C is, so essentially all fMRI data analysis ends up being a mixture of C code and glue code in some other language. In this regard, I don't think SPM (MATLAB with C MEX files) is really that bad. It's fast, and it avoids having to read the data from disk multiple times.
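The C-plus-glue split described above can be sketched in miniature: the same per-voxel reduction written as pure-Python loops versus delegated to NumPy's compiled kernels. The array shape here is invented purely for illustration; real fMRI volumes are far larger.

```python
import numpy as np

# Hypothetical 4-D fMRI-like array: (x, y, z, time), tiny for illustration.
data = np.random.rand(8, 8, 8, 50)

def mean_ts_loop(vol):
    """Per-voxel temporal mean in pure Python -- the slow, dynamic-language path."""
    nx, ny, nz, nt = vol.shape
    out = np.empty((nx, ny, nz))
    for i in range(nx):
        for j in range(ny):
            for k in range(nz):
                s = 0.0
                for t in range(nt):
                    s += vol[i, j, k, t]
                out[i, j, k] = s / nt
    return out

def mean_ts_vec(vol):
    """Same computation pushed down into NumPy's compiled C loops."""
    return vol.mean(axis=-1)

# Identical results; the vectorized version is orders of magnitude faster
# on realistically sized volumes.
assert np.allclose(mean_ts_loop(data), mean_ts_vec(data))
```

This is why the field ends up with two-language codebases: the inner loops live in C (or MEX, or NumPy's kernels) and everything else is glue.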

- People use the tools they know, not the tools that are best for the job. FreeSurfer is a mess of C, C shell, and Tcl/Tk, but there's nothing else that can visualize fMRI data with comparable ease and accuracy. Most people in neuroimaging only know MATLAB, which is pretty terrible for analyzing large data sets because it can't mmap files (and it doesn't have the language features necessary to make this possible, and it's closed source).
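Since mmap keeps coming up: a minimal sketch of what memory-mapping buys you for large uncompressed volumes, using NumPy's memmap. The file name, shape, and dtype here are made up for illustration.

```python
import os
import tempfile
import numpy as np

# Write a hypothetical uncompressed volume to disk.
path = os.path.join(tempfile.mkdtemp(), "vol.dat")
vol = np.arange(4 * 4 * 4 * 10, dtype=np.float32).reshape(4, 4, 4, 10)
vol.tofile(path)

# memmap gives array semantics over the on-disk bytes without reading
# the whole file into RAM; the OS pages in only what you touch.
mapped = np.memmap(path, dtype=np.float32, mode="r", shape=(4, 4, 4, 10))
ts = np.asarray(mapped[2, 3, 1, :])   # one voxel's time series

assert np.allclose(ts, vol[2, 3, 1, :])
```

The catch, as noted further down the thread, is that this only works on uncompressed files.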

- Related to the above, it's easier to get funding to develop a novel algorithm than to implement an existing algorithm in a way that makes it more useful/accessible to researchers. I believe this is slowly changing.

- There are a lot of different algorithms used for analyzing fMRI data, and no single package implements all of them. Which algorithms a given lab or researcher needs differs according to scientific requirements, personal preference, or the conventions of their subfield. People end up writing their own code to glue together methods from different analysis packages, which is, again, often written using the wrong tools.

- We graduate students who know how to code still need to publish papers. There is comparatively little incentive to publish code.


Check out http://nipy.sourceforge.net/nipype/. It's basically a pipeline for plugging different packages' outputs and inputs into each other; wrappers already exist for SPM, FSL, and other well-known programs, and it's easy to write your own if you know some Python. It speaks to exactly this approach: it is algorithm/implementation agnostic, and it just defines the process and tries to formalize getting data from one place to the next.
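To make the idea concrete, here's a toy sketch of the pattern nipype formalizes. This is emphatically not nipype's actual API; it only illustrates the concept of implementation-agnostic steps whose outputs feed the next step's inputs, and all the names are invented.

```python
class Node:
    """A named processing step wrapping some implementation (imagine an
    SPM or FSL call behind this function)."""
    def __init__(self, name, func):
        self.name, self.func = name, func

class Pipeline:
    """Formalizes moving data from one step to the next, regardless of
    which package implements each step."""
    def __init__(self):
        self.steps = []
    def connect(self, node):
        self.steps.append(node)
        return self
    def run(self, data):
        for node in self.steps:
            data = node.func(data)   # one step's output is the next's input
        return data

# Hypothetical stand-ins for wrapped tools.
demean = Node("demean", lambda xs: [x - sum(xs) / len(xs) for x in xs])
square = Node("square", lambda xs: [x * x for x in xs])

result = Pipeline().connect(demean).connect(square).run([1.0, 2.0, 3.0])
assert result == [1.0, 0.0, 1.0]
```

Swapping one package's implementation of a step for another's only changes what lives inside a node, not the pipeline definition, which is the point.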


Matlab can mmap. SPM may not use it because they have limited themselves to Matlab 7 (2004) compatibility (also no toolboxes).

http://www.mathworks.com/help/matlab/memory-mapping.html

It's somewhat a moot point, though, because everyone likes to gzip their NIfTI files, and unfortunately the file formats don't have a uniformly accepted way to leverage the huge disk savings you can get from masked data without applying compression (which is counterproductive for the kinds of analysis we do). Even filesystem-level compression doesn't help. If you can fit all your data into available RAM, you're fine. If not...
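The masked-data point can be illustrated with a toy round trip: store only the in-mask voxels plus the boolean mask, and the full volume reconstructs exactly, with no general-purpose compression involved. The shapes and mask fraction here are invented for illustration.

```python
import numpy as np

# Toy volume where most voxels are outside the "brain".
rng = np.random.default_rng(0)
vol = np.zeros((16, 16, 16), dtype=np.float32)
mask = np.zeros(vol.shape, dtype=bool)
mask[4:12, 4:12, 4:12] = True              # 512 of 4096 voxels in-mask
vol[mask] = rng.random(int(mask.sum()), dtype=np.float32)

packed = vol[mask]                          # 1-D array of in-mask values only

# Reconstruction is exact -- lossless, unlike quantizing or smoothing tricks.
restored = np.zeros_like(vol)
restored[mask] = packed
assert np.array_equal(restored, vol)

# Storing values + mask is far smaller than the dense volume
# (the bool mask costs one byte per voxel on top of the packed values).
assert packed.nbytes < vol.nbytes // 4
```

Unlike gzip, this layout would still be mmap-friendly, which is exactly the trade-off the formats don't standardize.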


I wasn't aware of this, and it's surprisingly hard to Google. Thanks for informing me.


Add to that the fact that fMRI data is next to useless for actually studying the brain. It can map large-scale organisation, but it's like trying to measure the economy based on which power stations are in use at any given time.


You could learn quite a bit about the organization and function of a country based on the spatiotemporal patterns of its energy consumption, particularly when you're given the power to control external stimuli. But yeah, it could be pretty useless for studying something on the scale of a subdivision. It depends entirely on which spatial scales you are interested in studying.


Following your analogy, trying to understand the brain by putting a few dozen electrodes in it is like trying to understand the economy based on watching people interact in a few dozen rooms.

Like you, I am skeptical of the explosion of human neuroimaging, but I think that, as a technique for determining where to drop your electrodes, fMRI can be a very powerful tool.


Agreed about the horrible tools. Unfortunately, this is an extremely difficult market to break into. Most academics want free and open source tools and don't care how many research assistants or post-docs they have to torture into using them. Compounding the problem, the market is so small that any commercial product has to be very expensive (>$1,000) to be worthwhile re: development and support costs. For instance, BrainVoyager (http://www.brainvoyager.com/), a commercial competitor to the tools you listed, starts at $4,000, which is a pretty good chunk of any equipment budget on a grant.


Closed-source tools are also problematic for research purposes, since at some point in your career you will almost certainly want to run an analysis that they can't perform.


BrainVoyager's approaches are from the ice age and not flexible enough to handle modern protocols. Only a moron could defend using it for research. It's translational at best (a toy you give those clinicians that can't even figure out SPM). But it does make pretty pictures.


I'm a clinician, and it admittedly took me ages to figure out SPM. I still hate it and find it terrible to work with, but it is useful. The state of the software is actively hampering contributions from clinicians. The problem is, as you have pointed out, that if BrainVoyager is the alternative, that's no alternative at all.


I'm not defending BrainVoyager as a product. It was just an example of the type of business model the academic market requires.


Well said! I agree, and you are correct that the software used for this stuff pretty much sucks. The thing is that a lot of scientists don't know the software could be better, and they are reluctant to try new software given how expensive it is (we just spent $5k on a license for software that is subpar; I've seen this 91203091283 times, and it's disappointing). New software will change the game.

To add to that list, http://www.brain-map.org/ is a reallllllllly nice tool. I love what this company is doing in terms of automating gene expression within the brain and look forward to their software going through some revisions.


Add to those NEURON (http://www.neuron.yale.edu/neuron/), which is the most popular tool for compartmental neuron simulations. It's a terribly old program (1990) with horrible limitations, yet people still use it.


Terribly old, but it is being updated constantly: they are adding the ability to specify chemical reaction schemes explicitly (in hoc or Python). Some interesting research is being worked on in that regard at the moment.


If there are horrible limitations, email Dr. Mike Hines or write some code that will fix it. It is open source.


From what I see in the EU project, while its stated goal is to simulate an "e-brain", a big portion of its money and manpower is still going to (a) the mapping of biological brains, which seems to overlap with this proposal, and (b) basic research in biochemistry and neurotransmitters.

I mean, a mandatory prerequisite in any serious 'brain project' is some progress in almost all of the related fields, and in a sufficiently big project you try to get some leading scientists from all these areas under your umbrella.



