Thursday, August 13, 2009
This is the first minute or so of the Twitter-based video I'm working on. As noted earlier, the pictures were harvested from PingWire, which presents images hosted by three sites serving tweeters. I pulled a couple thousand pictures of cooking, eating, and sometimes growing food from the constant image stream over the course of a few weeks in July.
The soundtrack was generated by an iPhone app called Bloom, which was designed by ambient sound guru Brian Eno and programmer and musician Peter Chilvers. The app costs four bucks, but that's considerably less than I'd have to pay Eno for his services.
I jacked the iPhone into a laptop running Audacity and recorded about 22 minutes of digitally generated ambient drones, gongs, and chimes, which I then imported into video editing software along with the pictures. Aside from editing out some noise and reorienting a few sideways pictures, the images and sound are as they were when I got them. The video presents what the Internet and my iPhone churned out while I was paying attention.
I'm starting to really get off on the whole mashup of found image/found sound and future-cheesy technology. Maybe I should run it on a stack of Commodore monitors.