I am really enjoying working with Max/MSP to develop interactive, motion-based experiences, so I wanted to build on my previous “Video Trail” prototype. This time I was interested in the relationship between the past and the present. Instead of simply projecting the movement happening in front of the camera, I wanted to use that visual input as a mask for a delayed “live” video feed. In a sense, you see the movement of the past through the movement of the present.
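The core idea can be sketched outside of Max as well. Below is a minimal Python/NumPy sketch of the same technique, assuming a simple frame-differencing motion detector and a ring buffer for the delay; the names (`motion_mask`, `masked_delay`), the threshold, and the delay length are all hypothetical choices, not part of my actual patch:

```python
import numpy as np
from collections import deque

DELAY_FRAMES = 90      # hypothetical delay length (~3 s at 30 fps)
MOTION_THRESHOLD = 25  # hypothetical per-pixel difference threshold

def motion_mask(prev, curr, threshold=MOTION_THRESHOLD):
    """Binary mask marking pixels where the current frame differs from the last."""
    diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16))
    return diff.max(axis=-1) > threshold  # collapse colour channels

def masked_delay(frames, delay=DELAY_FRAMES):
    """Yield composites: delayed footage revealed only where there is motion now."""
    buffer = deque(maxlen=delay)  # ring buffer holding the "past"
    prev = None
    for frame in frames:
        out = np.zeros_like(frame)
        if prev is not None and len(buffer) == delay:
            mask = motion_mask(prev, frame)
            delayed = buffer[0]        # oldest frame in the buffer
            out[mask] = delayed[mask]  # present movement unveils the past
        buffer.append(frame)
        prev = frame
        yield out
```

In a live setting the `frames` iterable would come from a camera; the composite starts black and only fills in where someone is moving right now, revealing what the camera saw `delay` frames ago.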
I have lots of ideas about how something like this could work in a public space. It would be great if the video feed were delayed by several hours, so that what you saw was less an immediate past and more a record of who was in the space long before you. I am also interested in working with pre-recorded video.