“Find Your Way to Oz” is a new Google Chrome Experiment brought to the web by Disney. It allows you to take an interactive journey through a Kansas circus. You are swept up by a massive storm that carries you to the land of Oz.
In this project our goal was to combine the richness of cinema with the technical capabilities of the browser to create a fun, immersive experience that users can form a strong connection with. But it was important to complement Sam Raimi’s vision for Oz. He has spent five years carefully constructing a world for this story to live in. Our interactive journey would have to live in the same space, and yet could not cover the same ground. In other words, we needed a simple interactive journey that flows towards the world Mr. Raimi was creating on film. That is where the key phrase ‘Find Your Way To Oz’ came in: this was going to be your journey to Oz.

From there, we focused on two sections of the film in particular. First, the Baum Bros circus scene stood out to us as a very natural space to have fun in, with plenty of obvious opportunities to create interactive touch points along the journey users would take. Second, the storm was very intriguing for us. In the film the storm is intense, dangerous, and beautiful at the same time, which made it an obvious vehicle to transport the user to Oz. But it was also an exciting technical challenge for us to make a storm that feels truly dangerous and beautiful within the browser. How could we make an interactive storm feel intense, and yet get that softness – that magic – of real clouds?
Find Your Way to Oz on desktop is a rich, immersive world. It is a space you navigate through at your own pace, using your cursor. It’s easy and you can’t get lost, but there is a sense of exploring and finding little details of Oz at every step of the way. We used dynamic 3D and several layers of traditional filmmaking effects that combine to create a near-realistic scene. The most prominent technologies are WebGL, Three.js, custom-built shaders, and DOM elements animated with CSS3 features. Beyond this, we used the getUserMedia API (WebRTC) for interactive experiences that let the user add their image directly from the webcam, and the Web Audio API for 3D sound.

Early 3D

Our first full WebGL demo was made about two weeks into our production. It was a proof of concept for us, to establish the sense of realism in the look we were going for, and to test whether what we wanted to build would run in a browser. We built a simple, bulletproof test: the geometry of the tent could be duplicated infinitely, using just a parameter, in order to stress test all of our computers’ performance.
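To give a flavour of that kind of stress test, here is a minimal sketch, assuming a grid layout driven by a single count parameter (all names here are our own, not the production code). Pushing the count up until a machine drops frames gives a quick performance ceiling:

```javascript
// Lay out N copies of the tent on a grid from a single parameter.
// Purely illustrative: the real test duplicated the tent geometry directly.
function layoutTents(count, spacing) {
  const positions = [];
  const perRow = Math.ceil(Math.sqrt(count)); // roughly square grid
  for (let i = 0; i < count; i++) {
    positions.push({
      x: (i % perRow) * spacing,
      z: Math.floor(i / perRow) * spacing,
    });
  }
  return positions;
}

// In Three.js, each position would become a clone of the tent mesh:
//   const tent = baseTentMesh.clone();
//   tent.position.set(p.x, 0, p.z);
//   scene.add(tent);
```

Raising `count` by a parameter and watching the frame rate is a crude but effective way to compare machines.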
The first interaction that you encounter as you move through the circus is a classic circus cut-out attraction, where you can put yourself within the scene. You can share the pictures you take with friends, allowing them to join in the experience. To make this happen, we’re using WebRTC to capture pictures from your webcam within your browser. The user’s permission is required before the webcam can be accessed.
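A minimal sketch of that capture flow might look like the following, using the modern `navigator.mediaDevices` form of the API (browsers of the era used prefixed variants); this is an illustration of the idea, not the production code:

```javascript
// Ask for webcam access (the browser shows the permission prompt) and
// pipe the stream into a <video> element.
async function startWebcam(video) {
  const stream = await navigator.mediaDevices.getUserMedia({ video: true });
  video.srcObject = stream;
  await video.play();
}

// Grab a still frame from the video onto a canvas, returning a data URL
// that can be placed into the cut-out scene or shared.
function snapshot(video, canvas) {
  canvas.width = video.videoWidth;
  canvas.height = video.videoHeight;
  canvas.getContext('2d').drawImage(video, 0, 0);
  return canvas.toDataURL('image/png');
}
```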
The second touch point allows you to compose your own song on a vintage music organ. When you customise the music, it is brought back into the circus, acting as a soundtrack for the 3D world you are in. So the music that you create at the music box is not only created directly in the browser; it also follows you and is placed in 3D space using the Web Audio API, creating a stereoscopic soundtrack that responds to your movements.
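Placing a sound in 3D space with the Web Audio API comes down to routing the source through a panner node and moving the listener with the camera. The sketch below is a hedged illustration of that wiring (positions and names are hypothetical), with a pure helper showing the inverse-distance attenuation the panner applies internally:

```javascript
// Route a playing source through a PannerNode so it sits at (x, y, z)
// in the world; the listener follows the camera elsewhere in the code:
//   audioCtx.listener.setPosition(camera.x, camera.y, camera.z);
function placeMelody(audioCtx, sourceNode, x, y, z) {
  const panner = audioCtx.createPanner();
  panner.setPosition(x, y, z);
  sourceNode.connect(panner);
  panner.connect(audioCtx.destination);
  return panner;
}

// Simplified inverse-distance model (rolloffFactor 1): gain falls off as
// refDistance / distance once the listener is past refDistance.
function distanceGain(distance, refDistance) {
  return refDistance / Math.max(distance, refDistance);
}
```

Because the panner sits between the source and the destination, the same melody the user composed keeps playing while its apparent position and loudness track their movement.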
The third touch point recreates a classic film mechanic from the past: a zoetrope-style experience where you take a series of images of yourself and play them back as a short film. Again, we are using WebRTC to capture pictures from your webcam within your browser, in this case using a timer mechanic.
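The timer mechanic reduces to scheduling a fixed number of snapshots at a fixed interval, then cycling the captured frames for playback. A small sketch of that logic (illustrative names, not the production code):

```javascript
// Timestamps (ms) at which each of the zoetrope frames should be captured.
function captureSchedule(frameCount, intervalMs, startMs) {
  const times = [];
  for (let i = 0; i < frameCount; i++) {
    times.push(startMs + i * intervalMs);
  }
  return times;
}

// During playback, which frame to show at elapsed time t (ms) at a given fps.
function frameAt(t, frameCount, fps) {
  return Math.floor((t / 1000) * fps) % frameCount;
}
```

Each scheduled time would trigger the same webcam snapshot used in the cut-out attraction; looping `frameAt` over the stored frames produces the short-film effect.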
A bit like a matte painting
In many classic Disney films and animations, different layers were combined to create a scene: live-action layers, hand-animated layers, actual physical sets, and others painted on glass, a technique called matte painting. In many ways the structure of the experience we created follows this idea, even though some “layers” are much more than static visuals; rather, they affect the way things look according to more complex computations. Nevertheless, at least at the big-picture level, we are dealing with views composited one on top of the other. At the top is a UI layer, with the 3D scene below it, itself made of different scene components. This approach was especially important in making the storm come to life.
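Compositing views back to front is essentially the classic “over” operator. In the experience the browser does this itself (the DOM UI layer sits over the WebGL canvas), but the principle can be sketched on a single RGBA pixel, channels in the 0..1 range:

```javascript
// Porter-Duff "over": composite a top pixel over a bottom pixel.
function over(top, bottom) {
  const a = top.a + bottom.a * (1 - top.a);
  if (a === 0) return { r: 0, g: 0, b: 0, a: 0 };
  const blend = (ct, cb) => (ct * top.a + cb * bottom.a * (1 - top.a)) / a;
  return {
    r: blend(top.r, bottom.r),
    g: blend(top.g, bottom.g),
    b: blend(top.b, bottom.b),
    a,
  };
}
```

Applying `over` repeatedly from the bottom layer upwards yields the final frame, which is exactly the mental model of a stack of matte paintings.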
As you move towards the end of the experience, the clouds become darker and the wind picks up. The circus is overtaken by a giant storm and enveloped in darkness. From very early on in the project we were attracted to the idea of creating a storm. We wanted it to be powerful, intense, and above all beautiful: something that draws you in even though you know it is dangerous. To create the final storm sequence we combined many different techniques, but the centerpiece of this work was a custom GLSL shader that looks like a tornado. We had tried many different approaches, from vertex shaders creating interesting geometric whirlpools, to particle-based animations, and even 3D animations of twisted geometric shapes. None of these recreated the feeling of a tornado, or they required too much processing. A completely different project eventually provided us with the answer. A parallel project called brainflight, involving games for science to map out the brain of the mouse, from the Max Planck Institute, had generated interesting visual effects. The inside of a brain cell in one of these games looked a bit like the funnel of a tornado. With a bit more styling, some rotation, and some noise, it was clear the team was onto something. It had the softness we were looking for, and this approach is attractive because it isn’t based on a frame-limited loop but is always different, always changing. The storm swirl was about 200 lines of code, and so fairly light on the processor. But that was just the storm itself. We needed a world around that storm for you to travel through, and somewhere for you to exit. We tested whether the storm would still work when sandwiched under layers of clouds and on top of a dramatic background, and then set about building that world. We added traditional matte paintings behind it, and a balloon with traditional keyframe animation.
We added particle-based clouds, placed on flat planes, to cover the ground and give you a greater sense of speed, and a number of simple objects flying around the storm as if swept up by it; these were actually items from the circus that we had broken and mangled. Next we added illustrated water drops onto the lens, and finally the sun, which creates lens flares.
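The tornado effect itself lives in the GLSL fragment shader, which we can’t reproduce here, but the core idea can be sketched in plain JavaScript: rotate each point around the funnel centre by an angle that grows toward the middle and drifts with time, then add a little noise so the motion never repeats exactly. All constants below are illustrative, not the shader’s actual values:

```javascript
// Map a texture coordinate (u, v) at a given time to a swirled coordinate.
function swirl(u, v, time) {
  const x = u - 0.5;
  const y = v - 0.5;
  const r = Math.sqrt(x * x + y * y);
  // Rotation increases toward the funnel centre and drifts over time.
  const angle = Math.atan2(y, x) + 3.0 * (0.5 - r) + 0.4 * time;
  // Cheap pseudo-noise keeps the swirl from ever looping exactly.
  const n = 0.05 * Math.sin(12.0 * r + 5.0 * time) * Math.sin(7.0 * angle);
  return {
    u: 0.5 + (r + n) * Math.cos(angle),
    v: 0.5 + (r + n) * Math.sin(angle),
  };
}
```

Sampling a soft cloud texture through a distortion like this, per pixel per frame, is what gives the funnel its continuous, never-repeating motion while staying cheap on the GPU.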
The music and sound in the storm sequence are crafted in a cinematic style to make the scene come to life. The sound engine creates a stereoscopic feeling that is responsive and immersive, making the experience more powerful, especially when you move through the storm. The sound team at Plan8 worked with a proprietary sound engine that handles everything from music synchronisation to the 3D soundscape of the whole experience. The sound engine uses the Web Audio API in Chrome for playback and 3D positioning. When you enter the carnival field, several sounds start in relation to where you are. Wind blowing, birds flying by, the organ music, tent cloth flapping in the wind, chimes, creaks, and the distant sound of thunder all have a subtle presence in the 3D world of the carnival field. As you move through the scene the sounds change accordingly, and if you enter the tent these ‘outside’ noises get realistically muffled by the tent fabric. The organ sound is a mix of three custom-made sounds. One is the breath/wind noise the mouth emits when blowing into a pan flute; this forms the attack of the sound. It was sampled multiple times to get rid of repetition, and laid out in a round-robin patch with the Kontakt 5 sampler. The other two sounds are different custom-built organ patches from ProjectSam. Together they bring the organ to life. The organ’s interactive sequencer lets the user shape their own melody, and as you leave the organ and move on to the carnival area, the melody you wrote follows you and continues to play throughout the experience. As you move closer to the distant thunderstorm you can feel the rumble increase. The storm’s sounds are made up of ten or so layered rumbles, wind noises, whistles, and squeaks that come in randomly, so no looping can be heard.
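A round-robin patch simply cycles through the recorded variations so that no two consecutive triggers reuse the same sample. Here is a minimal sketch of that picker (the actual layering was done in Kontakt, not in code; this just illustrates the mechanism):

```javascript
// Return a function that, on each call, yields the next sample variation,
// wrapping around so every recording gets equal use and repetition is hidden.
function makeRoundRobin(samples) {
  let i = 0;
  return () => samples[i++ % samples.length];
}

// The tent muffling described above maps naturally onto a Web Audio
// BiquadFilterNode of type 'lowpass' inserted into the outside sounds' chain.
```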
It is a very dense sound picture once the cinematic music starts, and it is a real challenge to master the balance between sounds, both across the frequency spectrum and in volume, since our main output source is laptop speakers. Every sound was tested on various speakers and setups to ensure everyone gets as good an experience as possible. When you complete the storm, you end up in Oz. That is when we hand over to Sam Raimi and his team, who complete the journey for you. By following the wizard we lead you to the trailer of the film, and by extension, to the cinema.
The key theme in this experience, on both a small and large scale, is the duality between Earth and Oz. In the film, every character in Oz has an Earth counterpart. They are connected. We wanted that duality to be a part of everything you do. From the preloader onwards, all things you can interact with have that duality ingrained in them.
The complexity of the project necessitated an internal feedback loop that was as short and efficient as possible. Rather than have our visual artists ask a developer to change the code for them (to tweak colour values, depth-of-field settings, and so on), we worked with a visual interface to control all assets, live in the browser. We could share the settings that worked best with each other, and anyone could try tweaking the values of significant parameters in the experience and help decide what we liked most. It was an essential way for us to bring all these different elements together, and to avoid a bottleneck when polishing them once they were in the build.
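The sharing part of such a workflow can be as simple as serialising the current parameter object into a string that a teammate pastes back into their own browser. A hypothetical sketch of that round trip (a GUI library such as dat.GUI would typically drive the parameter object itself):

```javascript
// Serialise a flat settings object into a query-string-style token.
function encodeSettings(settings) {
  return Object.entries(settings)
    .map(([k, v]) => `${encodeURIComponent(k)}=${encodeURIComponent(v)}`)
    .join('&');
}

// Recover the settings object from a shared token (values come back
// as strings; callers coerce numbers as needed).
function decodeSettings(str) {
  const out = {};
  for (const pair of str.split('&')) {
    const [k, v] = pair.split('=');
    out[decodeURIComponent(k)] = decodeURIComponent(v);
  }
  return out;
}
```

Pasting a shared token into the URL hash, or a text field in the tweaking UI, reproduces a colleague’s exact look without anyone touching the code.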
Our challenge for the mobile experience was very different, because from very early on in the process we knew that we wanted the mobile journey to cover essentially the same ‘Journey to Oz’ as the desktop experience. On the other hand, the experience needed to run well, in the browser, on a wide range of smartphone devices. We focused on creating a simplified approach to the interactive touch points that lead you to Oz, and we simplified the journey overall to create a more focused series of steps for users to follow. We felt that the photo booth experience from the desktop website might translate very nicely to a smaller touchscreen device, and with access to the user’s built-in camera and photo albums through the browser, this experience felt tailor-made. Around that we created a more linear journey to Oz that uses imagery from the desktop WebGL environment, giving the user the ability to pan around the same environments, or scroll to take steps forward or back.
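Driving pre-rendered imagery from a scroll gesture comes down to mapping scroll progress onto a frame index. A small sketch of that mapping, assuming frames rendered out of the desktop WebGL environment (the function name and frame counts are illustrative):

```javascript
// Map scroll progress in [0, 1] onto one of frameCount pre-rendered frames,
// clamping out-of-range input so overscroll never indexes past the sequence.
function frameForScroll(progress, frameCount) {
  const clamped = Math.min(Math.max(progress, 0), 1);
  return Math.min(Math.floor(clamped * frameCount), frameCount - 1);
}
```

Swapping the displayed image to `frames[frameForScroll(p, frames.length)]` on each scroll event gives the step-forward, step-back feel without any realtime 3D on the phone.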
For more technical detail, check out our article on HTML5Rocks.
Many people made the production of this experience possible, too many to list here. Please visit the credits page for the full story.