How Ai Weiwei 360 was made formed the topic of a panel session held during Digital Cities Week in Birmingham. Anthony Lilley, CEO of Magic Lantern Productions, and Rupert Harris, Producer at AVM, shared insights into why and how Ai Weiwei 360 came to life online.
Ai Weiwei was a landmark retrospective exhibition that kept the turnstiles moving and was a runaway success. The Space worked with the Royal Academy and Ai Weiwei to extend the exhibition digitally and break down the barriers to entry for a wider audience online.
To find out how an audience was reached for this project, refer to our ‘Finding an Audience’ guide or the case study ‘Ai Weiwei 360: bringing a landmark exhibition to online audiences’ in our resources section.
Offline to Online
How was the offline tangible art exhibition transformed into a seamless online experience? What technology was used? And which platforms was it made for?
Anthony explained that “the principle that drove the transformation was to stay true to the artistic work” while moving it to a digital platform where the audience could truly immerse themselves.
The objective was to give the audience a real sense of the physical exhibition space, make it interesting, interactive and add depth to the digital experience.
Tech, what Tech?
The production used a mix of first-of-its-kind and existing technology: the exhibition was captured using traditional linear video and panoramic 3D video, which was later converted into virtual reality art and 360 video.
Photorealistic stereoscopic 3D imagery was also used to provide real depth and free-flowing movement. Anthony wanted the experience to “feel less like Google Street View and more like a guided experience.” Where sound was concerned, to bring the virtual reality experience to life, the team commissioned 12 original tracks, one for each room, to bring out the most in the artworks. The sounds, from iron being hammered to clay vases being broken, truly make it an immersive playground.
Ai Weiwei 360 was made to work across desktops, mobiles, tablets, and virtual reality headsets. Rupert mentioned, “What we didn’t realise, but learnt from, was that the content had to be tweaked depending on which platform it was being viewed on.”
Rupert reflected on the key learnings taken from producing a project of this scale and nature.
What we learnt
Galleries like the Royal Academy are very busy, and producing without proper preparation makes the process very difficult. Rupert said, “the sooner you can book, schedule and pre-plan production, the smoother the execution.”
“Test your ideas against the execution. If you have an approach that you think will work, try it and if it doesn’t, then try again,” he said. Initially, the team had planned to use 360 video, and shot with it for a short time, but soon dropped it: the quality was low and the bigger data footprint led to huge loading times.
Surprisingly, you don’t need a big budget to produce a project of this nature. Rupert said, “Try to be inventive with the technology you have.” The team found that a less technological approach, such as using photography, achieved better results, as they didn’t have to hire expensive lights to shoot.
Know your online platform
Know the technical capabilities of each platform you wish to distribute on, and learn to adapt your content to maximise the user experience.
“We had to change the content for the Samsung Gear VR and mobile devices,” said Rupert. The former supported stereoscopic 3D video whereas the latter did not. To achieve a similar result on mobile devices, Rupert and the team used audio instead, via Google Cardboard.
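In practice, adapting content like this usually comes down to checking what the viewing platform can play and serving a different cut of the experience accordingly. The sketch below is purely illustrative and not the production team's code: the capability flag and asset names are assumptions, showing only the general idea of branching on stereoscopic support.

```javascript
// Illustrative sketch only: choose which cut of a 360 experience to serve
// based on the platform's capabilities. The flag and asset names are
// hypothetical, not taken from the Ai Weiwei 360 production.
function chooseExperience(platform) {
  if (platform.supportsStereoscopicVideo) {
    // Headsets such as the Gear VR can play stereoscopic 3D video.
    return { video: "360-stereo", audioLed: false };
  }
  // Mobile browsers (e.g. viewed through Google Cardboard) fall back to
  // monoscopic 360 video, leaning on the soundtrack to carry the depth.
  return { video: "360-mono", audioLed: true };
}

console.log(chooseExperience({ supportsStereoscopicVideo: true }).video);  // "360-stereo"
console.log(chooseExperience({ supportsStereoscopicVideo: false }).video); // "360-mono"
```

The point is less the code than the principle Rupert describes: decide per platform, at delivery time, rather than assuming one master file will work everywhere.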
Give us your feedback