Just finished prepping my installation for our end-of-year exhibition. I finished the coding earlier this afternoon, but I had to write a new soundtrack for it seeing as the old one was wank. Basically I'm using the footage from the archive copy in the previous post combined with the power of MaxMSP/Jitter. Max captures the movement of the audience in the space as they walk in to look at the installation; that movement is translated into data which determines the direction of video playback, so to the viewer it appears as if they are walking around the object. It's not perfect, but it works, and hopefully it's stable enough to last the opening night without me having to tweak it. I was really pissed off with the whole thing this morning, but I'm so happy now it's done.
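For anyone curious, the core idea is easy to sketch outside of Max: a tracked x-position in the camera frame gets scaled onto the footage's frame count, so moving across the space scrubs the video back and forth. This is just an illustrative Python sketch of that mapping, not the actual patch; the camera width and clip length are made-up numbers.

```python
CAM_WIDTH = 320   # assumed capture width of the tracking camera
N_FRAMES = 750    # assumed length of the archive clip in frames

def x_to_frame(x, cam_width=CAM_WIDTH, n_frames=N_FRAMES):
    """Map a tracked x coordinate to a playback frame index,
    so walking across the space scrubs through the footage."""
    x = max(0, min(cam_width, x))           # clamp to the camera frame
    return round(x / cam_width * (n_frames - 1))

def direction(prev_x, x):
    """Playback direction from two successive tracker readings:
    1 = forwards, -1 = backwards, 0 = paused."""
    return 1 if x > prev_x else (-1 if x < prev_x else 0)
```

In the patch itself this scaling is done with Max objects feeding the movie player's frame position, with the tracked coordinates coming from cv.jit.track.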
My patch, using large portions of cv.jit.track: