Friday, 21 February, 2020 UTC


Summary

Working in production is a daunting task, and even more so when it is live. Every part of the filming must be absolutely perfect, from the stream to the video and audio quality.
Peter Collis is a veteran producer in the immersive industry. After serving as Head of Camera Systems at Inition, Mr Collis now freelances for a few organisations, including Philharmonia. I caught wind of a great story from one of his recent projects, which I felt should be retold. The story gives a great insight into the technical side of immersive recordings, as well as what can be done when something goes wrong (and how it can be promptly saved by the skills of the team).

A picture of Peter Collis. Credit: Peter Collis.

Following the success of The Virtual Orchestra and Beethoven 5 VR pieces, Philharmonia wanted to capture the final part of Mahler’s 9th Symphony in 360, performed at the South Bank Centre. It was also going out as a live transmission on YouTube, one of the first live broadcasts of its kind. Here is the story in Mr Collis’s own words.
The sound was to be captured and recorded by a combination of a few groups. On previous occasions, we’d recorded the audio with binaural mics just from the camera positions. For Mahler, the whole orchestra was to be locally mic’d up so that a full, clean Ambisonic mix could be created for a location-based, multi-speaker experience.
My job was to do the 360 filming bit. On every 360 filming job I’ve done, I’ve pushed to shoot stereoscopic; it gives a richness that I don’t think is there in mono.
So we got hold of a Jump rig, and I added a top camera to capture the ceiling. The Jump is an array of 16 GoPros set in a rig around 30cm in diameter. It normally leaves a bit of a hole at the top and bottom, but we were keen to have a complete 360 space for the viewer to explore.
It also comes with a recommended minimum near-object distance of 1.5m… which was a problem. The Mahler piece requires a full orchestra of over 100 players crammed onto the relatively tight Royal Festival Stage, and where we wanted to place the camera was only a few centimetres from the closest players.
As DOP I had to make the call… Either it wasn’t going to work because the musicians were too close to the camera, or we squeezed them all back far enough for it to work, or we even ‘dropped’ some of the players (which no one was going to love me for).
We re-jigged the staging and pushed the players back, but they all need their elbow room to wield their violin and cello bows… and it wasn’t enough. So we asked the unthinkable: could we drop four or so players to make it work?
I wasn’t invited to the meeting between the lead conductor, the festival director, the promotions department and the like, but the outcome was that they gave me the reduced numbers. So we set up a test day, tweaked the auditorium lighting, and mapped the space for the audio.
The day came and we were ready, bar one worry. The system did not have a wired start trigger; it was WiFi-based. The part of the performance we wanted was right at the end, and the rig was prone to overheating when running continuously, so we couldn’t just capture the whole performance. The WiFi trigger worked perfectly in the empty auditorium, but from previous experience I knew that once 2,000 people and their WiFi-enabled phones were in the venue, that wasn’t necessarily going to be the case.
So came my next tricky demand: I needed a seat in the front row so that, if I couldn’t trigger the system remotely, I could jump up on stage and fire it manually. Again they agreed (while also saying let’s hope that isn’t necessary).
The piece is incredibly demanding on the players, and especially the conductor, so there are short breaks in the 90-minute performance. We identified the opportune moment for me to jump on stage should I need to. And of course, I did!
So the rig was recording; I sat back and hoped it wouldn’t overheat. It didn’t. We got the shot. It was a pig to stitch, all those bows and stitch lines (in 3D too, remember), but it got done and is out there.

What struck me about the story is that, even after years of work in the industry, something new always comes up to be sorted. Mr Collis knew how to sort it, and jumped in to do so.
There are many stories like that in production; you make use of what you have. When they do happen, it’s good to retell them, so that others in the same industry can learn from them and continue forward.
The piece has been edited for clarity.

Tom Ffiske
Editor, Virtual Perceptions
Tom Ffiske specialises in writing about VR, AR, and MR across the immersive reality industry. Tom is based in London. 
