An MIT project that started as a way to observe the shallow breathing of newborn infants is now being used in all kinds of applications. Last year, MIT's Computer Science and Artificial Intelligence Laboratory created a video amplification process called Eulerian Video Magnification. As The New York Times reports, the amplification analyzes pixels in a video frame, determining their exact color, and then identifies how that color changes from frame to frame.
The algorithm then amplifies the color changes by up to 100 times, so that a baby who seems perfectly still suddenly wiggles and flushes red with each breath and heartbeat. MIT's algorithm isn't necessarily meant to show things as they actually are (the 100-times amplification obviously exaggerates motion), but it gives us a visual representation of minute changes the human eye couldn't otherwise detect.
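The core idea above can be sketched in a few lines of NumPy. This is a deliberately simplified illustration, not MIT's published method: the real Eulerian Video Magnification pipeline first applies a spatial decomposition (a Laplacian pyramid) and a temporal bandpass filter tuned to frequencies of interest, such as heart rate; this sketch skips both and simply amplifies each pixel's deviation from its temporal mean. The function name and synthetic data are illustrative assumptions.

```python
import numpy as np

def amplify_color_changes(frames, alpha=100.0):
    """Crudely amplify per-pixel temporal color variation.

    frames: float array of shape (T, H, W, C), values in [0, 1].
    alpha:  amplification factor (the MIT work amplifies by up to ~100x).

    Simplified sketch: the real method adds spatial pyramid decomposition
    and temporal bandpass filtering before amplification.
    """
    mean = frames.mean(axis=0, keepdims=True)   # per-pixel temporal average
    amplified = mean + alpha * (frames - mean)  # boost frame-to-frame change
    return np.clip(amplified, 0.0, 1.0)

# Synthetic "video": a static gray clip with a tiny periodic red flush,
# standing in for the subtle color change of a heartbeat.
t = np.arange(60)
frames = np.full((60, 4, 4, 3), 0.5)
frames[..., 0] += 0.001 * np.sin(2 * np.pi * t / 30)[:, None, None]

out = amplify_color_changes(frames, alpha=100.0)
# The imperceptible 0.001 swing in the red channel becomes a visible
# swing of roughly 0.1 after amplification.
```

Amplifying everything indiscriminately like this also amplifies noise, which is one reason the actual algorithm restricts amplification to a narrow temporal frequency band.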
It's easy to see how the amplification could be built into a video baby monitor to put a new parent at ease, but the MIT scientists have bigger things in mind. According to the Times, the public response to a presentation last year opened their eyes:
Michael Rubinstein, a doctoral student and co-author on the project, said that after the presentation and subsequent media coverage, the team was inundated with e-mails inquiring about the availability of the program for uses ranging from health care to lie detection in law enforcement. Some people, says Mr. Rubinstein, inquired about how the program might be used in conjunction with Google’s glasses to see changes in a person’s face while gambling.
The code is available online, but MIT has also partnered with Quanta Research Cambridge to turn it into a web app. Go to Videoscope and you can upload MP4 and WebM files to have them run through the algorithm.
Check out the MIT video below for a more detailed look at how the algorithm works.