"In 2009, Casey Pugh asked thousands of Internet users to remake "Star Wars: A New Hope" into a fan film, 15 seconds at a time. Contributors were allowed to recreate scenes from Star Wars however they wanted. Within just a few months SWU grew into a wild success. The creativity that poured into the project was unimaginable...Finally, the crowd-sourced project has been stitched together and put online for your streaming pleasure. The "Director's Cut" is a feature-length film that contains hand-picked scenes from the entire StarWarsUncut.com collection."
I was never a big Star Wars fan, but four minutes in, I'm already hooked. What a fantastic homage, idea, and collaboration. Like the YouTube choir performing Eric Whitacre's "Sleep" and The Johnny Cash Project, Star Wars Uncut is a massive-scale art project created by folks who are strangers to each other, made possible by the web:
Read more about the project at Star Wars Uncut.
1.31.2012
1.06.2012
When the world is your instrument: Mogees
Mogees is, according to the website, "an interactive gestural-based surface for realtime audio mosaicing". The video demonstration is thrilling:
Mogees - Gesture recognition with contact-microphones from bruno zamborlin on Vimeo.
More text from the website:
In the video we show how it is possible to perform gesture recognition with just contact microphones. Through gesture recognition techniques we detect different kinds of finger touches and associate them with different sounds. In the video we used two different audio synthesis techniques:
- physical modelling, which consists in generating the sound by simulating physical laws;
- concatenative synthesis (audio mosaicing), in which the sound of the contact microphone is associated with its closest frame present in a sound database.
The system can recognise both finger touches and objects that emit a sound, such as the coin shown in the video.
I don't fully understand this, or what's happening in the video: is the microphone simply massively amplifying the almost inaudible sounds your fingers make while tapping and riffing on objects? But I think it's amazing, I'd go nuts to get my hands on the technology, and I wish one of the "objects" in the video had been a living thing.
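For what it's worth, my rough mental model of the "closest frame present in a sound database" bit is something like the sketch below. This is only a guess at the concatenative-synthesis idea in plain Python; the frame size, the spectral-distance measure, and all the names are mine, not anything from the Mogees project.

```python
# A rough sketch of concatenative synthesis ("audio mosaicing") as I understand
# the description above: chop the incoming signal into short frames, and for
# each frame pick the closest-sounding frame from a pre-recorded corpus.
# Frame size and the distance measure are my own assumptions, not Mogees'.
import numpy as np

FRAME = 1024  # samples per frame (assumed)

def frames(signal, size=FRAME):
    """Split a 1-D signal into non-overlapping frames."""
    n = len(signal) // size
    return signal[: n * size].reshape(n, size)

def spectrum(frame):
    """Magnitude spectrum, used here as a crude 'timbre fingerprint'."""
    return np.abs(np.fft.rfft(frame))

def mosaic(live, corpus):
    """Replace each live frame with the nearest corpus frame by spectral distance."""
    corpus_frames = frames(corpus)
    corpus_specs = np.array([spectrum(f) for f in corpus_frames])
    out = []
    for f in frames(live):
        dists = np.linalg.norm(corpus_specs - spectrum(f), axis=1)
        out.append(corpus_frames[np.argmin(dists)])
    return np.concatenate(out) if out else np.array([])

if __name__ == "__main__":
    sr = 44100
    t = np.arange(sr) / sr
    live = np.sin(2 * np.pi * 220 * t)        # stand-in for the contact-mic signal
    corpus = np.sin(2 * np.pi * 440 * t * t)  # stand-in for the sound database
    print(mosaic(live, corpus).shape)
```

If that's roughly right, the magic in the demo is less about amplification and more about matching: every tap gets swapped for whatever sound in the database it most resembles.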
[via Kottke]