Scientists Can Record Audio From Silent Video Now

By Marc Hogan

Lead News Writer
on 08.05.14 in News

The phrase “visual album,” used to describe Beyoncé’s video-per-song self-titled album last year, could one day take on a whole new meaning. As WFMU points out, researchers at MIT, Microsoft and Adobe have come up with a way of building an audio signal from objects’ tiny vibrations in a silent video. As part of their experiments, the researchers analyzed the vibrations of a potato-chip bag, captured through soundproof glass, and recovered intelligible speech.

Alexei Efros, an associate professor of electrical engineering and computer science at the University of California at Berkeley, likened the findings to a real-world equivalent of James Bond’s technological gadgetry. “This is totally out of some Hollywood thriller,” Efros said in MIT’s news release. “You know that the killer has admitted his guilt because there’s surveillance footage of his potato chip bag vibrating.”

The researchers also reconstructed audio from videos of aluminum foil, water in a glass and the leaves of a plant, according to MIT. Objects vibrate subtly when sound passes through them, the researchers say; their technique passes frames of video through an array of image filters, then uses an algorithm to combine the filters’ outputs into an estimate of the whole object’s tiny movements over time.
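For readers curious what “combining filter outputs to gauge movement” might look like in practice, here is a minimal sketch, assuming OpenCV and NumPy. It estimates one global sub-pixel displacement per frame from simple gradient filters; the filename and the gradient-based motion estimate are illustrative assumptions, not the researchers’ actual pipeline, but they show the general idea of turning tiny per-frame motions into an audio-rate signal.

```python
# Minimal sketch (not the researchers' actual method): filter each frame with
# spatial-gradient filters, then combine the filter responses across all pixels
# into a single per-frame displacement estimate that serves as an audio sample.
# Assumes OpenCV and NumPy; "silent_video.mp4" is a hypothetical clip of a
# vibrating object (potato-chip bag, foil, plant leaves, etc.).
import cv2
import numpy as np

cap = cv2.VideoCapture("silent_video.mp4")
ok, frame = cap.read()
if not ok:
    raise RuntimeError("Could not read video")
prev = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)

signal = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)

    # "Array of image filters": horizontal and vertical derivative filters.
    ix = cv2.Sobel(prev, cv2.CV_32F, 1, 0, ksize=3)
    iy = cv2.Sobel(prev, cv2.CV_32F, 0, 1, ksize=3)
    it = gray - prev  # temporal change between consecutive frames

    # Combine every pixel's filter responses into one global (dx, dy) motion
    # estimate via least squares on the brightness-constancy relation
    # ix*dx + iy*dy = -it.
    a = np.stack([ix.ravel(), iy.ravel()], axis=1)
    b = -it.ravel()
    (dx, dy), *_ = np.linalg.lstsq(a, b, rcond=None)

    signal.append(dx)  # one tiny sub-pixel displacement per frame
    prev = gray

cap.release()
# "signal" is now a crude audio-like waveform sampled at the video frame rate.
audio = np.asarray(signal, dtype=np.float32)
audio /= np.max(np.abs(audio)) + 1e-9  # normalize before playback or saving
```

A signal recovered this way is sampled at the video’s frame rate, which is why the cameras used matter: higher frame rates can capture higher-pitched sound.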

Forensics and law enforcement have been floated as applications for the technique, and that only makes sense, but it’s tempting to dream up ways it could be used on music’s furthest fringes. While some of the cameras the researchers used were high-speed, others ran at the standard frame rate of a smartphone camera. What would a concert sound like if its audio were reconstructed based on vibrations in blades of grass? “I’m sure there will be applications that nobody will expect,” said Efros.

Then again, maybe the sound from the video wouldn’t even be different enough to make sense in an avant-garde musical context. In the video below, which demonstrates the MIT researchers’ experiments, they capture a recording of Queen and David Bowie’s “Under Pressure” based on the vibrations of earbuds plugged into a laptop; the recovered track is clear enough that the Shazam music-ID app recognizes it.

See also: Twitter Can Be Used to Press Virtual Vinyl.