Peter Gabriel: Back to Front

It’s strange how knowledge changes the way we see things. I can’t watch a live video feed without wondering how it was put together: how the effects were done, how it was mixed into a (more or less unified) visual experience… and the gig I went to on Friday (Peter Gabriel, Birmingham) really made me think. Cameras, live video manipulation, and cool computer vision effects have really changed the live music experience.

The first time I noticed live video effects used in earnest was at an Arctic Monkeys gig in Grenoble, in early 2010. They had small screens at the front of the stage, linked to cameras around the musicians, projecting retro-style black-and-white images of the band in real time (actually, if I remember correctly, they were sepia-tinted for quite a lot of the show). These videos had been processed to give an old-school TV-not-quite-tuned-in effect – it was quite impressive to see live video editing applied, on the fly, to about eight camera feeds at the same time.

Fast forward five years or so, and Peter Gabriel’s live show takes this to the next level. Small cameras are now ubiquitous, and they certainly were on stage on Friday; it seemed as though some instruments, most of the lights, and a couple of the performers carried little video cameras feeding live into the stadium display (I’m not sure how many cameras there were, but my guess is “over 20”). Having only seen the show once I can’t tell how precisely choreographed it was, but my intuition is that there’s quite a bit of variation from night to night, so it can’t all be programmed in advance (“oh, we’re halfway through Sledgehammer, let’s cut to the GoPro on the drums for 12 seconds”).

A photo of the show, stolen from PG’s Facebook page.

Perhaps unsurprisingly for an artist who’s always been at the forefront of visual effects and music, this gig was quite the feast for the eyes. Pretty much every song was accompanied by a different set of computer-vision-driven VFX on the big screens. I spotted (I think) some Hough circles, some edge enhancement, lots of interesting noise effects, colour-channel splitting with different delays on R, G and B (a cheap effect, but quite trippy, actually), some more generic motion delay, and some fairly serious-looking 3D/depth imaging, probably driven by something like a Microsoft Kinect. Some of this might well have been precomputed, but a lot was done on the fly1.
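(As an aside for the curious: the channel-delay trick is easy to play with at home. Here’s a minimal sketch in Python/OpenCV – my guess at the idea, not anything the actual show ran – that recombines the R, G and B channels of a webcam feed from different points in the recent past. The delay lengths are made-up numbers.)

```python
import collections
import cv2

# Colour-channel delay: rebuild each frame from three different
# moments in the recent past, one per channel. Delay values are
# illustrative guesses, not anything from the show.
DELAY_R, DELAY_G, DELAY_B = 0, 5, 10           # frames of lag per channel
frames = collections.deque(maxlen=DELAY_B + 1)

cap = cv2.VideoCapture(0)                      # any webcam stands in for a stage camera
while True:
    ok, frame = cap.read()
    if not ok:
        break
    frames.append(frame)
    if len(frames) <= DELAY_B:
        continue                               # wait until the buffer is full
    # OpenCV frames are BGR: channel 0 = blue, 2 = red.
    b = frames[-1 - DELAY_B][:, :, 0]          # most-delayed blue
    g = frames[-1 - DELAY_G][:, :, 1]
    r = frames[-1 - DELAY_R][:, :, 2]          # current red
    cv2.imshow("channel delay", cv2.merge((b, g, r)))
    if cv2.waitKey(1) == 27:                   # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```

Lagging one channel more than the others gives the trippy ghost-trail look; swap the constants around to taste.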

Actually, it’s kind of strange: for all the heavy real-time video processing from a bank of cameras, it’s simultaneously a high-tech and a low-tech extravaganza. The lights were controlled by actual people, not robots; there were five great big triffid-like lighting rigs being pushed around the stage by three people each. More lighting people hid in a gantry with spotlights, and I spotted eight people lurking in the rigging, accessed by little rope ladders. This is a labour-intensive show.

And the music? Well, that was just fucking awesome. From the opening acoustic numbers, through the fantastic way the band dropped into electric halfway through Family Snapshot, The Family And The Fishing Net (I’m really not sure what it’s about, but it’s always been my favourite: a dirty, spooky, creepy epic of a song, only improved by being played LOUD and live), a skip around Solsbury Hill, then the So album played the whole way through (in order, with the original lineup), to the closer, Biko, dedicated to young people who put their lives on the line to protest injustice (particularly the recent Mexican students) – from start to finish it was awesome. I think it pleased both the passing fans and the diehards like me. Also, that man really knows how to work a stage. I don’t think I’ve enjoyed a gig as much in a long time.

So, as a vision researcher, I had to think: wouldn’t that be a fun project? You could do some really remarkable live stuff with 3D/2.5D video mixing, feature tracking, pose estimation… get some real cutting-edge computer vision into the performance. (Doing more lightweight processing, like they do now, would also make a great dissertation – open-source live VFX for performance, anyone? A minimal starting point is sketched below.)
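(For anyone tempted by that dissertation, here’s about the smallest live-VFX loop I can think of, again in Python/OpenCV: Hough-circle detection drawn back over the camera feed. The detector parameters are illustrative guesses and would need tuning for real footage.)

```python
import cv2
import numpy as np

# A toy live-VFX loop: find circles in each frame with a Hough
# transform and overlay them on the feed. Parameters are guesses.
cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.medianBlur(gray, 5)             # Hough is noise-sensitive
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1.2,
                               minDist=40, param1=100, param2=60,
                               minRadius=10, maxRadius=120)
    if circles is not None:
        for x, y, r in np.round(circles[0]).astype(int):
            cv2.circle(frame, (int(x), int(y)), int(r), (0, 255, 0), 2)
    cv2.imshow("hough circles", frame)
    if cv2.waitKey(1) == 27:                   # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```

Not stadium-grade, obviously, but it’s the shape of the thing: camera in, per-frame vision processing, effect out, fast enough to keep up with the music.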

1 Someone’s bound to comment now saying that this is all trivial with After Effects plugins or something…
