A 30-second video of a newborn baby shows the infant silently snoozing in his crib, his breathing barely perceptible. But when the video is run through an algorithm that can amplify both movement and color, the baby’s face blinks crimson with each tiny heartbeat.
The amplification process is called Eulerian Video Magnification, and is the brainchild of a team of scientists at the Massachusetts Institute of Technology’s Computer Science and Artificial Intelligence Laboratory.
The team originally developed the program to monitor newborns without making physical contact. But they quickly learned that the algorithm can be applied to other videos to reveal changes imperceptible to the naked eye. Prof. William T. Freeman, a leader on the team, imagines its use in search and rescue, so that rescuers could tell from a distance whether someone trapped on a ledge, say, is still breathing.
“Once we amplify these small motions, there’s like a whole new world you can look at,” he said.
The system works by homing in on specific pixels in a video over the course of time. Frame by frame, the program identifies minute changes in color and then amplifies them up to 100 times, turning, say, a subtle shift toward pink into a bright crimson. The scientists who developed it believe it could also have applications in industries like manufacturing and oil exploration. For example, a factory technician could film a machine to check for small movements in bolts that might indicate an impending breakdown. In one video presented by the scientists, a stationary crane sits on a construction site, so still it could be a photograph. But once run through the program, the crane appears to sway precariously in the wind, perhaps tipping workers off to a potential hazard.
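The core idea described above can be sketched in a few lines of code: track each pixel's value over time, isolate the slow, rhythmic variations (such as a heartbeat at roughly 1–3 Hz) with a temporal band-pass filter, and add that filtered signal back at a much larger amplitude. This is a minimal, simplified illustration assuming grayscale frames and an ideal frequency-domain filter; the MIT system is considerably more sophisticated (it operates on spatially decomposed color video), and all names and parameters here are hypothetical.

```python
import numpy as np

def amplify_temporal_variation(frames, alpha=50.0, lo=0.8, hi=3.0, fps=30.0):
    """Amplify subtle per-pixel temporal variations in a video.

    frames: array of shape (T, H, W), intensities in [0, 1].
    alpha:  amplification factor (the article mentions up to ~100x).
    lo, hi: temporal passband in Hz (~0.8-3 Hz spans resting heart rates).
    """
    num_frames = frames.shape[0]
    # Fourier transform along time, independently for every pixel.
    spectrum = np.fft.fft(frames, axis=0)
    freqs = np.fft.fftfreq(num_frames, d=1.0 / fps)
    # Ideal band-pass: keep only temporal frequencies inside [lo, hi].
    keep = (np.abs(freqs) >= lo) & (np.abs(freqs) <= hi)
    filtered = np.fft.ifft(spectrum * keep[:, None, None], axis=0).real
    # Add the amplified band-passed signal back onto the original video.
    return np.clip(frames + alpha * filtered, 0.0, 1.0)

# Synthetic demo: a static gray frame with an imperceptible 1 Hz "pulse".
t = np.arange(90) / 30.0                     # 3 seconds at 30 fps
pulse = 0.002 * np.sin(2 * np.pi * 1.0 * t)  # tiny periodic color shift
frames = 0.5 + pulse[:, None, None] * np.ones((90, 4, 4))
out = amplify_temporal_variation(frames, alpha=50.0)
print(frames.std(), out.std())  # the variation becomes far more visible
```

In the real method the band-passed signal is computed on a spatial pyramid of the video rather than raw pixels, which is what lets it magnify tiny motions as well as color changes; the sketch above captures only the color-amplification half of the story.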
Written by Erik Olsen. Source article at bits.blogs.nytimes.com.