Adam Ferriss has been doing some great explorations with video lately, using digital techniques that recall, or build on the same principles as, analog ones such as video scanning and video feedback; the results are simply mesmerising. I asked Adam about the process behind these techniques, which were developed in openFrameworks using a webcam stream, or a single frame from the webcam, as the source.
He tells us he has been really inspired by Andrew Benson's feedback experiments, especially Melting Rainbow Heart (which Andrew also used for MGMT's Optimizer) and his HSFlow shaders. Adam has also been talking with Johnny Woods about LZX analogue video feedback, and out of curiosity he started trying to emulate some of these effects in openFrameworks to experiment on his own.
"Each video is a little different, although the core principle is pretty much the same. I use a series of offscreen framebuffer's to pile on different filters, and then feed the output of the final on screen framebuffer back to one of the earlier ones to complete the feedback loop (just like pointing a camera at a screen that is displaying the camera feed). The filters themselves are shaders written in GLSL.
Since these are made with shaders running on the GPU, it's super fast and can run in real-time at 60fps or faster. It's pretty satisfying to shake your head around and throw off gobs of color, so eventually I'd like to get everything working in webGL as a little web toy so that other people can play with the parameters and experience it for themselves." - Adam Ferriss.
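To make that framebuffer round trip a little more concrete, here is a minimal openFrameworks sketch of the loop Adam describes. It is a guess at the structure rather than his actual source, assuming openFrameworks 0.9+ and a hypothetical filter.vert / filter.frag pair: the previous frame and the live webcam go through a filter shader into an offscreen FBO, the result is shown on screen, and then copied back so the next frame can feed on it.

```cpp
#include "ofMain.h"

class ofApp : public ofBaseApp {
    ofVideoGrabber cam;
    ofShader filter;          // any GLSL filter pass: sharpen, emboss, etc.
    ofFbo current, previous;  // the pair of buffers that closes the feedback loop

public:
    void setup() {
        ofDisableArbTex();                // normalized texture coords in the shaders
        cam.setup(1280, 720);
        filter.load("filter");            // hypothetical filter.vert / filter.frag
        current.allocate(1280, 720, GL_RGBA);
        previous.allocate(1280, 720, GL_RGBA);
        current.begin();  ofClear(0, 0, 0, 255); current.end();
        previous.begin(); ofClear(0, 0, 0, 255); previous.end();
    }

    void update() {
        cam.update();

        current.begin();
        filter.begin();
        // the live camera rides along on texture unit 1; previous.draw() below
        // supplies the fullscreen quad and binds last frame's output on unit 0
        filter.setUniformTexture("cam", cam.getTexture(), 1);
        previous.draw(0, 0);
        filter.end();
        current.end();

        // feed the filtered result back for the next frame
        previous.begin();
        current.draw(0, 0);
        previous.end();
    }

    void draw() {
        current.draw(0, 0, ofGetWidth(), ofGetHeight());
    }
};

int main() {
    ofSetupOpenGL(1280, 720, OF_WINDOW);
    ofRunApp(new ofApp());
}
```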
In the post below you'll find a brief explanation of each video, along with the source Adam used for these experiments. And don't miss his pixel-based works too. See more:
Download source:
Andrew's Shaders working in openFrameworks
A pretty straightforward feedback effect using a sharpen filter
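The downloadable source above is the place to look for the real thing, but a sharpen pass of the kind this first video describes can be as small as the classic 3x3 kernel below, loaded from a string. Run inside the loop sketched earlier, it keeps exaggerating its own edges frame after frame, which is where the trails come from. This is only a guess at the filter, and it assumes the legacy GL2 renderer with ofDisableArbTex(), as in that sketch.

```cpp
#include "ofMain.h"

void setupSharpen(ofShader& sharpen) {
    std::string frag = R"(
        #version 120
        uniform sampler2D tex0;   // previous frame, bound by fbo.draw()
        uniform vec2 texel;       // 1.0 / buffer resolution

        void main() {
            vec2 uv = gl_TexCoord[0].st;
            // sharpen kernel:   0 -1  0
            //                  -1  5 -1
            //                   0 -1  0
            vec4 c = texture2D(tex0, uv) * 5.0;
            c -= texture2D(tex0, uv + vec2( texel.x, 0.0));
            c -= texture2D(tex0, uv + vec2(-texel.x, 0.0));
            c -= texture2D(tex0, uv + vec2(0.0,  texel.y));
            c -= texture2D(tex0, uv + vec2(0.0, -texel.y));
            gl_FragColor = vec4(c.rgb, 1.0);
        }
    )";
    sharpen.setupShaderFromSource(GL_FRAGMENT_SHADER, frag);
    sharpen.linkProgram();
    // while drawing: sharpen.begin(); sharpen.setUniform2f("texel", 1.0f/1280.0f, 1.0f/720.0f); ...
}
```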
Enyomel
"Enyomel was made by using a wonky kernel convolution emboss filter that is continually rotated towards/away from the screen along the X axis. The initial image was a frame of my webcam, but it disappears into the feedback almost instantaneously."
Blink
"Blink was made with Andrew's Horn Schunck shader that tracks the motion between the previous and current frame to find edges in motion, and coloring the outline based on the direction of travel. Then without clearing the framebuffer I continually translate the image along the Z axis, which creates that "infinite zoom" effect."
A wash
"A wash is kind of a combination of the two effects, taking the output that I got in the blink video and feeding it into another kernel convolution shader."