Webcam Pulse Detector (github.com)
259 points by theschwa on Apr 16, 2013 | 79 comments

The paper that this is based on is also pretty fascinating: Eulerian Video Magnification for Revealing Subtle Changes in the World http://people.csail.mit.edu/mrub/vidmag/

Patented :'(

What's exactly patented? I remember earlier papers on pulse detection than the one on the page.

Well, in my opinion, a patent like this is well deserved. I could be wrong, but the researchers have spent significant time on it and have discovered/invented something really unique.

That's the tragedy of it. Something cool is invented and now it'll be fenced off and buried for 20 years in all likelihood, and/or become part of a big patent portfolio.

Doesn't a patent imply that you must let other people use your idea in exchange for a fair price?

No, a patent implies that you publicly disclose the knowledge behind your discovery immediately, in exchange for the property of this knowledge being guaranteed† to be yours for a limited time, and its use guaranteed to become free when this time runs out.

The alternative is not to disclose it, with the risk for you that nobody will reverse engineer or discover by themselves, or the risk for everyone else that this knowledge could be lost forever.

† pending approval, and unless successfully challenged in court (where the patent applies).

ICBW, but no, I don't think it does, except in perhaps a narrow set of circumstances.

Compulsory Licensing[1] is maybe what you're thinking of, but from what I can tell seems to apply mostly to drug patents, or things {the,a} government wants from you. It has a wider applicability in copyrighted forms of IP, I believe.

Patents that are part of some organised standard are often required to be placed under 'FRAND[2] (Fair, reasonable, and non-discriminatory)' compulsory licences to allow for interoperability whilst still allowing the patent holder to receive (reasonable) royalties if they wish.

[1] https://en.wikipedia.org/wiki/Compulsory_licensing

[2] https://en.wikipedia.org/wiki/Reasonable_and_non-discriminat...

It would be more correct to say that "patents that their owners contribute to some organised standard ..."

If I add a patented process to an important standard, but I don't own the patents, you still don't get those patents just because they are part of some important standard.

No. In fact, patents are exactly the opposite.

They only grant the right to exclude others from doing something. So if I own a patent on X, it gives me the right to prevent others from doing X.

Even if I license my patent on X to you, it does not necessarily mean you can do X, because doing X may also require other patents that my patent is an extension of. In other words:

Imagine we have patented processes, each building on the last:

Patent A covers doing thing one.
Patent B covers doing thing one, then thing two.
Patent C covers doing thing one, then thing two, then thing three.

Granting you only the right to perform patent C does not enable you to perform the process it describes; you'd still need licenses for patent A and patent B to do that.

You're thinking of a compulsory license. However, that's not the general case; for example, the vast majority of drug patents have no such restrictions.


PS: The only real exception in the USA is that the US Government gets to ignore patents at its discretion.


There is already an Android app[1] that does something similar. You put your finger up to the rear-camera and it detects your heart rate.

[1] https://play.google.com/store/apps/details?id=si.modula.andr...

It works differently. This one works by magnifying tiny changes (colour, movement, etc.).

See also: http://people.csail.mit.edu/mrub/vidmag/

Has anyone been able to get it to work? Whenever I try to open the stats display, I get:

OpenCV Error: Assertion failed (p.checkVector(2, CV_32S) >= 0) in polylines, file /tmp/buildd/opencv-2.3.1/modules/core/src/drawing.cpp, line 2064

I have the same problem and have opened an issue: https://github.com/thearn/webcam-pulse-detector/issues/1

Maybe somebody that knows more about OpenCV than I do can figure out what's wrong.
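For what it's worth, that `checkVector(2, CV_32S)` assertion usually fires when cv2.polylines is handed a points array that isn't 32-bit integer (x, y) pairs. A minimal sketch of the usual cast (my guess at the cause here, not a confirmed fix for this repo):

```python
import numpy as np

# A float array (e.g. smoothed signal coordinates) trips the assertion:
pts = np.array([[10.0, 20.0], [30.0, 40.0], [50.0, 25.0]])

# Casting to the N x 1 x 2 int32 shape OpenCV expects avoids it:
pts_int = pts.reshape(-1, 1, 2).astype(np.int32)

# cv2.polylines(frame, [pts_int], isClosed=False, color=(0, 255, 0))
print(pts_int.dtype, pts_int.shape)
```

The cv2 call is left commented out since it needs a real frame; the point is the dtype/shape of the points argument.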

I did a git pull this morning, and it's working now.

Works for me on OSX 10.7.5, with opencv @2.4.4_3+python27 installed via MacPorts.

I did have to downgrade to the ffmpeg (ffmpeg @1.2_1+gpl2+nonfree) port (from ffmpeg-devel) due to build errors in opencv, but it worked OK after that.

A couple of the openMDAO packages didn't install either, but that didn't seem to harm anything.

Give it a video feed from a political debate and live-tweet the politicians' pulses on different topics. I foresee a future of politicians sporting bangs and bob cuts.

They're already so heavily made-up for the cameras I would be surprised if this worked.

Necks might not be, or hands.

This would probably be a bad metric though; you might accidentally be selecting for sociopathic traits or something, which might be undesirable in a politician.

Cool fact: a Slovenian company that is now SF-based turned this tech into an iPhone app a few months ago. -> http://www.azumio.com/apps/cardio-buddy-2/

I really like their apps and I'm quite proud that they originate from my neck of the woods.

It's a nicely designed app, but it doesn't work that well. Told me I had 60, 62, 61, 59... BPM when in fact it was over 75.

Add some software to do expression analysis a la Paul Ekman [1] and one has a remote non-invasive "lie detector".

[1] https://www.paulekman.com/

Just like Lie to Me.

I haven't downloaded this yet, but I certainly will. If it is accurate and reasonably precise (yeah, what does that mean?), I'd like to have it running in the background, gathering data as I work, while also gathering statistics about my work. I, like most of the people here, am a productivity junkie. I wonder if a heightened pulse could indicate mounting stress, which necessitates a break. Sure, I have other machinery that could indicate mounting stress -- like my mind -- but the quantitative, off-loaded metric shifts responsibility in a way that could be helpful. "I'm not really stressed," says me to my brain. "No, your heart rate is quite high," says my computer to me.

You could get better results with a Bluetooth pulse monitor.

That probably requires wearing something, potentially on your finger. Webcam is much less intrusive.

Turns out that the fingertip pulse oximeter[1] I have works surprisingly well on a toe.

Probably only applicable if you work from home though.

[1] http://www.nonin.com/OEMSolutions/WristOx2-Model-3150-OEM

Typically a chest strap but yeah no disagreements here, it would be less convenient.

Interesting. There was a Rock Health company that created an app using this technology. The app is called Cardiio. I know the founders were from MIT/Harvard but can't recall if they were the same people who conducted this research.

I've gotten this far trying to install this on Lion:

brew install gfortran

sudo easy_install scipy

sudo port install opencv +python27

export PYTHONPATH=/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/:$PYTHONPATH

wget http://openmdao.org/releases/0.5.0/go-openmdao.py

sudo python ./go-openmdao.py

cd ~/Downloads/webcam-pulse-detector-master/openmdao-0.5.0

. bin/activate

cd ~/Downloads/webcam-pulse-detector-master && python get_pulse.py

Hope it's helpful to anyone else doing the same. OpenCV is segfaulting now, but I may update this later if I get it.

I got it working in ML with:

brew upgrade

brew tap homebrew/science

brew install opencv

brew install gfortran

sudo easy_install scipy

mkdir OpenMDAO

cd OpenMDAO/

brew install wget

wget http://openmdao.org/releases/0.5.0/go-openmdao.py

export PYTHONPATH=/usr/local/Cellar/opencv/2.4.4a/lib/python2.7/site-packages/:/System/Library/Frameworks/Python.framework/Versions/2.7/Extras/lib/python/:/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/:$PYTHONPATH

sudo python ./go-openmdao.py

cd openmdao-0.5.0/

. bin/activate

# get the webcam-pulse-detector project (I used the GitHub app)

cd ../../webcam-pulse-detector/

python get_pulse.py

> brew install gfortran

gulp. fortran?

This would be an awesome application for google glass! Does anyone know if it's possible?

Imagine being able to look at someone and know their pulse. You'd have a wearable lie detector.

I haven't been amazed for a while. Thank you.

Great app for poker players.

Imagine it integrated in Google Glass...

I want to point it at crowds and get an aggregate crowd-pulse rate! Measure the excitement!

I expect to see this used by DJs, where the momentum of the music is controlled by the crowd.

Need way more light than that for a standard camera... now, maybe with an IR camera...

Perhaps stage lighting could be used to illuminate the crowd...green lighting perhaps? :)


Reminds me of this question on Stack Overflow:


[Not a Python user] What is OpenMDAO and how do I install it on OS X?

This looks great for a music performance stunt -- the audience doesn't know but the DJ rigged up a webcam as a trigger!

What if someone created passwords based off of your brainwaves (assuming every brainwave is as unique as a fingerprint)?

The concept of "passthoughts" has been discussed recently.


It'll be interesting to see how robust that is against different levels of alertness. I clicked through but couldn't see any info on using it on tired people - and your EEG changes significantly with alertness level. Depends on what they're measuring though.

How precise is this, do we have data? I'm guessing it depends on the refresh rate of your webcam.

It's precise enough for the application.

Webcams typically have a 15-30 fps framerate. Heart rates are typically 30-200 bpm (~= 0.5 to 3.3 bps).

The duration of the pulse is much shorter than the beat, though. Should still be visible at that framerate, just noting that pulses aren't sinewaves :)

It seems that the Nyquist-Shannon sampling rate would be too low for the results to be accurate.

No? The sampling frequency needs to be at least double the signal frequency, which seems to be the case here. However, webcam noise and the algorithm might increase what's needed.

The webcam framerate should be at least double the heart rate to get an accurate measurement, not the other way around.

And isn't this the case?

15 fps (worst case framerate) vs 3 bps (worst case heart rate)
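Spelled out as a quick sanity check, using those worst-case numbers:

```python
# Worst-case numbers from the thread: a 15 fps webcam vs. a 200 bpm heart.
fps = 15.0
max_heart_rate_bpm = 200.0

signal_hz = max_heart_rate_bpm / 60.0   # pulse fundamental, ~3.3 Hz
nyquist_hz = fps / 2.0                  # highest frequency 15 fps resolves, 7.5 Hz

# The fundamental clears the Nyquist limit, though sharper features of the
# pulse waveform (harmonics above 7.5 Hz) would still alias.
print(signal_hz, nyquist_hz, nyquist_hz > signal_hz)
```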

I don't believe so. Since the heart rate could be much higher, it would alias, and the camera wouldn't be able to tell the difference between a normal heart rate and a low heart rate.

This is what I suspect could happen, with two possible heart rates matching the same set of samples:


If your heart rate is higher than 3bps, you won't be sitting patiently in front of a webcam waiting for lock-on.

Sedentary (calm, sitting) heart rates go from ~40bpm for extremely fit people to ~90-100 for particularly unfit people. Higher than 100 is unusual for sedentary people, and works out to ~1.5bps. The waveforms in the pulse itself (note pulse, not ECG, which has much faster elements) speed up with the pulse, but I would estimate (could be wrong) that your highest frequency significant elements would be around 5Hz for a 1.5bps heart rate.

Your example from wikimedia has a signal that is ten times faster than the sample rate - and the Nyquist limit says this would be aliased. My back-of-the-envelope calcs above suggest a 5Hz signal at the high end against a 15Hz sampling at the low end, which is a ratio of 3:1, which is enough to satisfy Nyquist.

Disclaimer: I am an ex-neuro tech and ex-sleep tech. I am used to sticking electrodes onto people and studying them. The exact form of the pulse wasn't big in my area, so maybe a cardio tech can chime in and correct me.

> Since the heart rate could be much higher

Err - why do you say that? If your heart rate is much higher than that, then either you aren't human or you are probably dying.

Did you read my original comment?

Human heart rates range roughly between 30 to 200 beats per minute. That maxes out at a bit over 3 beats per second.

When I have some time and can dig out my pulse oximeter, I'll see if I can hack it in for comparison. Based on trying to take my pulse manually it certainly seemed in the ballpark, and I got even better/stronger locked results from using my thumb ~5-10cm from the camera.

It seems quite sensitive to image saturation for losing lock though.

It should still be very accurate though, right? And so decently precise over a period of time?

What in the world? Is this magic?

On each heartbeat, blood is pushed to the skin and makes it almost imperceptibly redder. By watching for and tracking this subtle change, you can extract a heartbeat.
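A minimal sketch of that idea (not the repo's actual code): average a colour channel over a patch of skin each frame, then pick the dominant frequency with an FFT. A synthetic 72 bpm trace stands in for real webcam frames here:

```python
import numpy as np

np.random.seed(0)
fps = 30.0                                     # assumed webcam framerate
t = np.arange(0, 10, 1 / fps)                  # 10 seconds of "frames"
true_bpm = 72.0

# Synthetic per-frame mean skin colour: a faint pulse plus camera noise.
channel_means = 0.5 + 0.01 * np.sin(2 * np.pi * (true_bpm / 60.0) * t)
channel_means += 0.002 * np.random.randn(t.size)

signal = channel_means - channel_means.mean()  # drop the DC offset
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(signal.size, d=1 / fps)

# Search only plausible heart rates (30-200 bpm, i.e. 0.5-3.3 Hz).
band = (freqs >= 0.5) & (freqs <= 3.3)
bpm = 60.0 * freqs[band][np.argmax(spectrum[band])]
print(bpm)  # recovers ~72 bpm from the noisy trace
```

With 10 seconds of samples the frequency bins are 0.1 Hz (6 bpm) apart, which is roughly the precision floor for this window length.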

> ... imperceptibly redder

But why is he analyzing the green channel then? :)

Look at the absorption spectra at http://omlc.ogi.edu/spectra/hemoglobin/index.html. Green is ~510nm, red is ~650nm, blue is ~475nm. The difference between the absorption of Hb and HbO2 in the red range is the reason why pulse oximeters use red. In this case, we don't care about differentiating oxy- vs. deoxy-hemoglobin, we just care about the total absorption. Since more light is absorbed at green than at red, it will show more of an effect with pulse.

I am not confident in this explanation, and still don't know why green over blue.

> I am not confident in this explanation, and still don't know why green over blue.

I'm not sure either, but I can tell you that if you examine digitized (RGB) 35mm film -- across all 35mm film stocks -- the blue channel is the most noisy.

So my guess is: He might have chosen green over blue due to the blue channel being noisier.

Is that noise inherent to film? Do webcams have the same noise?

The green channel has the least noise for Bayer-sensor digital cameras (which is almost all of them).

50% of the sensor sites are green (25% each for red and blue), so there is more signal given the same amount of noise.
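For illustration, those fractions fall out of tiling the common RGGB arrangement (assuming RGGB; other Bayer orderings have the same colour ratios):

```python
import numpy as np

# One 2x2 period of the RGGB Bayer mosaic, tiled into an 8x8 sensor patch.
bayer_tile = np.array([["R", "G"],
                       ["G", "B"]])
mosaic = np.tile(bayer_tile, (4, 4))

total = mosaic.size
green_fraction = (mosaic == "G").sum() / total  # 0.5
red_fraction = (mosaic == "R").sum() / total    # 0.25
blue_fraction = (mosaic == "B").sum() / total   # 0.25
print(green_fraction, red_fraction, blue_fraction)
```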

Why do they make them like that? Why not make the sensors 33/33/33?

From Wikipedia on the Bayer Filter: "Bryce Bayer's patent (U.S. Patent No. 3,971,065) in 1976 called the green photosensors luminance-sensitive elements and the red and blue ones chrominance-sensitive elements. He used twice as many green elements as red or blue to mimic the physiology of the human eye. The luminance perception of the human retina uses M and L cone cells combined, during daylight vision, which are most sensitive to green light."

The Foveon sensor does sample RGB at every pixel by using 3 layers, but it is only used in a few cameras.

Why isn't there a live demo?

Awesome. Could open up a whole field of applications.

Like allow robots to detect if something is alive or dead. :D

Is there a demo link for this application?

So strange that they don't put up a live demo!

I haven't tried it out, but give this a shot; it was linked on the original research paper's site[1]:



It's a native app: hard to have a live demo for a native app.

