
Since fMRI is likely the future of this type of thing, this is probably the most disturbing job posting I've ever seen:

http://www.acfei.com/forensic_services/jobsearch/job-236170....

-

Title: PhD COGNITIVE NEUROPSYCHOLOGISTS

Date Posted: 05/03/2012 [...]

[...] seeking contract physiologists to provide technical expertise in central nervous system (CNS) studies related to credibility assessment. [...]

Candidates will provide expertise in the use of CNS technologies such as functional magnetic resonance imaging (fMRI), [...]

A Top Secret security clearance is highly desirable. Candidates without a security clearance who possess superior qualifications may be processed for the required USG security clearance before commencing work.



There are already commercial operations using (or claiming to use) fMRI for lie detection [0]. I can understand the appeal of positions like this for applicants: academic jobs are scarce even for highly skilled cognitive neuroscience PhDs.

As I've posted elsewhere in these comments, the best available evidence suggests that fMRI is no more useful than polygraph for lie detection. Unfortunately, the power of brain images to induce credulity in otherwise intelligent people is well-known [e.g., 1].

[0] http://noliemri.com/

[1] http://link.springer.com/article/10.1007/s12152-007-9003-3


I wouldn't be so quick to discount the technology's potential. The commercial operation you cited is probably dubious, and the current state of the art may well be as unreliable as you say for credibility assessment, but I still believe the future of fMRI research holds incredible potential.

Keep in mind that as far back as 2011, researchers succeeded in reconstructing images from the visual cortex [0].
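
For a sense of what that means mechanically: the decoding step in this line of work is essentially regression from voxel responses back to stimulus features. Below is a toy sketch with simulated data using ridge decoding. To be clear, this is not the Berkeley group's actual approach (they fit motion-energy encoding models and reconstruct through a Bayesian prior over natural movies); it's only meant to show the concept:

    # Toy decoding sketch: learn a linear map from voxel responses back to
    # stimulus features, then reconstruct held-out stimuli. Simulated data
    # only; illustrative of the concept, not of any published pipeline.
    import numpy as np
    from sklearn.linear_model import Ridge

    rng = np.random.default_rng(1)
    n_trials, n_voxels, n_pixels = 500, 200, 64   # made-up sizes

    stimuli = rng.standard_normal((n_trials, n_pixels))        # stimulus features
    tuning = rng.standard_normal((n_pixels, n_voxels)) * 0.5   # fake voxel tuning
    responses = stimuli @ tuning + rng.standard_normal((n_trials, n_voxels))

    decoder = Ridge(alpha=10.0).fit(responses[:400], stimuli[:400])
    reconstruction = decoder.predict(responses[400:])

    # How well do held-out reconstructions track the true stimuli?
    r = np.corrcoef(stimuli[400:].ravel(), reconstruction.ravel())[0, 1]
    print(f"held-out reconstruction correlation: {r:.2f}")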

Moreover, the intelligence community has far more resources and motivation to perfect such a technology. They may already have advanced well beyond what is publicly known; if not, they certainly will, provided the technology continues to hold its promise.

[0] http://news.berkeley.edu/2011/09/22/brain-movies/


I wouldn't be in cognitive neuroscience if I thought our methods held no promise, but there's a vast gap between reconstructing images from visual cortex responses and accurately determining whether a person is lying. Farah et al. [0] recently summarized the best available evidence and concluded as follows:

"... different policies should be considered for different applications of fMRI-based lie detection. We do not join calls to ban fMRI-based lie detection across the board. Despite the enormous shortcomings of the current evidence ... we suggest that restrictions should be proportional to the outcomes and principles at stake. Risk reduction in dating calls for different standards of certainty and different protections of individual rights than the interrogation of terrorist suspects."

[0] http://www.nature.com/nrn/journal/v15/n2/abs/nrn3665.html


My apologies; I didn't check your profile before replying, so I was unaware this is actually your field. Good to know. :)

On that note, your assessment is almost certainly far more accurate than my own, and it also describes the more desirable outcome from a societal point of view. Even if the technology advances at a glacial pace, society may still not be properly prepared for the implications when it arrives. The slower the better, perhaps.


That study is likely closer to BS than you might think. http://www.wired.com/2009/09/fmrisalmon/
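
For anyone who hasn't read the salmon paper: its point is statistical. A whole-brain scan tests on the order of 100,000 voxels at once, so an uncorrected threshold of p < 0.001 will by itself flag roughly 100 pure-noise voxels as "active", which is how a dead salmon can appear to show task-related activation. Here's a minimal simulation of that effect (toy numbers, nothing from the actual paper's pipeline):

    # Voxel-wise t-tests on pure noise: count the voxels that look "active"
    # under a common uncorrected threshold, then after Bonferroni correction.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n_voxels, n_scans = 100_000, 20   # toy whole-brain-ish numbers

    data = rng.standard_normal((n_voxels, n_scans))  # no real signal anywhere
    t, p = stats.ttest_1samp(data, 0.0, axis=1)      # one t-test per voxel

    alpha = 0.001
    print("uncorrected 'active' voxels:", np.sum(p < alpha))            # ~100
    print("Bonferroni 'active' voxels:", np.sum(p < alpha / n_voxels))  # ~0

The paper's own conclusion was the same in spirit: with proper multiple-comparisons correction, the salmon's "activation" disappears.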


True, although I cannot help but feel that your supporting evidence for this claim is somewhat fishy.



