Virtual Reality was all the rage this past year, and the buzzword was used and abused, often to describe nothing more than a 360° video played in a cardboard viewer. For us, a VR project needs real-time stereoscopic 3D rendered in a headset. The user has to be able to interact with their environment to create immersion and a close-to-reality experience. That premise guided our work and was challenged at each step of the process. VR is demanding, and running a real-time 3D scene at 60fps on a mobile device is not as easy as it is on a desktop computer.
Here, we document our creative process and the challenges we faced crafting our first VR experience.
For our first steps into the VR world we wanted to target the largest audience possible and decided to experiment with the Samsung Gear VR and the Galaxy S7.
This relatively affordable VR setup is the kind of device a digital production studio like ours would likely use for commercial projects.
It has a QHD screen (1440x2560), one of the best mobile CPU/GPU combinations available at the moment, and it runs VR apps untethered from a computer.
We quickly decided to go with Unity, as we did our first tests on the Oculus Rift and Unity was the first engine to offer compatibility with that device.
The first concept was to build an island divided in 4, with a different season on each part.
We really liked the assets from Mikael Gustafsson available in the Unity Asset Store and tried to create the scene with them.
The island’s size, dynamic lighting and animations were too much for the Galaxy S7 to handle, and the framerate dropped down to around 20fps.
To bring more storytelling and richer user interactions, we decided to restart from scratch and wrote a different scenario.
The idea was to build a dream-like experience, including the most common dream scenarios:
- an out of body experience
- roaming in a weird place
- a nightmarish scene
- an endless fall
We turned the complexity of the graphic style down a notch and opted for a low-poly, plain-color-shaded universe with as little lighting as possible.
We eventually came up with this scenario:
I wake up in a dark bedroom in the middle of the night. In front of me, a floating light orb: the Nightlight.
I take a look at it and suddenly the bedroom walls open up like a cardboard box and I’m standing in a forest, in the darkness of a misty starry night. I walk a few steps, taking the time to play with the fireflies surrounding me.
The light orb is still here and seems to be waiting for me. I try to catch it, but it flies through the forest’s trees and is too quick for me. I follow it into the woods for a while and realise that there are no fireflies anymore; the trees around me look dead and gloomy. I stumble across a cave carved into the mountain and take a few steps in.
The orb is lighting the end wall of the cave, on which I can make out a wooden door.
Suddenly, the door opens and I enter a corridor. The flickering ceiling lights barely light up this scary place, and I could swear I just saw a shadow moving in the background.
I tiptoe over the squeaky floorboards toward the end of the corridor and open one of the two doors in front of me.
The door opens onto the exact same place. I walk a little faster; I’m completely lost and things start to get weirder: paintings move on the walls and I can hear someone whispering in my ear. I think this place is haunted.
As I am about to open yet another door, the floor collapses under my feet and I fall into an endless well surrounded by lights. The orb is floating here, brighter than ever. I fall faster and faster into the overwhelming white light.
I finally wake up in my bedroom, the sun shining through the window… It was all just a dream.
Unlike many phone-powered headsets like the Google Cardboard, the Samsung Gear VR integrates a swipeable D-pad and a couple of buttons (OK, back, volume up/down).
Though it offers some interesting control over your VR apps, it’s quite awkward to use, and it’s a feature unique to the Gear VR.
We decided to limit the interaction to the user’s gaze and use only the back button to restart the experience.
The scenario is linear: the camera moves slowly along a path, and the user’s attention is focused on a light orb to prevent motion sickness.
We used Unity’s coroutines to chain the interactions, waiting for one to complete before moving on to the next part of the experience.
To keep the journey through the different sections fluid, all the assets are positioned in the scene from the start, and a scene manager hides/shows them in sequence.
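A minimal sketch of what this coroutine chaining and section toggling could look like in Unity (the section names and timing here are hypothetical placeholders, not our actual code):

```csharp
using System.Collections;
using UnityEngine;

public class ExperienceManager : MonoBehaviour
{
    // Hypothetical section roots, all present in the scene from the start.
    public GameObject bedroom;
    public GameObject forest;
    public GameObject corridor;

    void Start()
    {
        StartCoroutine(RunExperience());
    }

    IEnumerator RunExperience()
    {
        // Each yield waits for the previous section to finish
        // before the next one begins.
        yield return StartCoroutine(PlaySection(bedroom));
        yield return StartCoroutine(PlaySection(forest));
        yield return StartCoroutine(PlaySection(corridor));
    }

    IEnumerator PlaySection(GameObject section)
    {
        section.SetActive(true);              // show this section's assets
        yield return new WaitForSeconds(30f); // placeholder for gaze-driven interactions
        section.SetActive(false);             // hide them before moving on
    }
}
```

Because everything is pre-positioned, switching sections is just a matter of toggling GameObjects rather than loading scenes, which avoids hitches mid-experience.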
We used both ambient lighting and sound to give each section its own mood.
The scene is lit with a color gradient that is tweened between each section.
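Such an ambient tween can be done by lerping the scene's ambient color over a few seconds; a sketch under assumed per-section colors:

```csharp
using System.Collections;
using UnityEngine;

public class AmbientTween : MonoBehaviour
{
    // Hypothetical mood colors for two consecutive sections.
    public Color fromColor = Color.black;
    public Color toColor = new Color(0.1f, 0.2f, 0.4f);

    public IEnumerator TweenAmbient(float duration)
    {
        for (float t = 0f; t < 1f; t += Time.deltaTime / duration)
        {
            // Blend the global ambient light between the two moods.
            RenderSettings.ambientLight = Color.Lerp(fromColor, toColor, t);
            yield return null; // advance one frame
        }
        RenderSettings.ambientLight = toColor;
    }
}
```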
Special attention was paid to the sound design. Each section has its own soundscape, composed by Hearbeats, and we used Unity’s sound spatialization for all the objects’ sounds to make the whole experience even more immersive. Sound design is crucial when building this kind of experience. It adds so much depth to the mood you want the user to feel.
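In Unity, spatializing an object's sound mostly comes down to configuring its AudioSource; a hedged sketch (the exact distances are illustrative, not our actual settings):

```csharp
using UnityEngine;

[RequireComponent(typeof(AudioSource))]
public class SpatialSoundSetup : MonoBehaviour
{
    void Start()
    {
        AudioSource src = GetComponent<AudioSource>();
        src.spatialBlend = 1f;                          // fully 3D (0 = 2D, 1 = 3D)
        src.rolloffMode = AudioRolloffMode.Logarithmic; // natural distance attenuation
        src.maxDistance = 20f;                          // illustrative: fade out by ~20 m
        src.loop = true;
        src.Play();
    }
}
```

With a spatializer plugin enabled in the project's audio settings (Oculus ships one for the Gear VR), these 3D sources are then rendered binaurally in the headset.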
Reviews and Performance issues
Oculus’ policy regarding application performance is very strict, and getting published on the Oculus Store was more complicated than we initially thought.
It is imperative that the app always runs at 60fps, and you have to support devices ranging from the Galaxy S7 down to the Note 4 (the latter is not mandatory, though).
Guidelines from Oculus for app performance are :
- 50–100 draw calls per frame
- 50k–100k polygons per frame
- As few textures as possible (but they can be large)
- 1–3 ms spent in script execution (Unity’s Update())
We also kept the materials as simple as possible (plain diffuse color) and reused them as much as we could.
Avoid transparency, as it tends to increase overdraw dramatically, and profile on the mobile device while developing (we can’t stress this last point enough) to keep an eye on the number of draw calls.
Keep in mind that the mobile device has to render the scene twice (once for each eye) and that the advised limit of 100 draw calls is reached crazy fast.
This was how the profiler looked before optimizing :
Unity optimizes a lot of stuff for you out of the box by batching draw calls that share a single material into one. It is also possible to mark objects as Static when they are not animated, when the lighting is baked, etc., for even more efficient batching.
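Besides the Static checkbox in the inspector, Unity can also combine non-moving meshes at runtime; a minimal sketch (the assumption being that every child of this root never moves):

```csharp
using UnityEngine;

public class BatchSectionRoot : MonoBehaviour
{
    void Start()
    {
        // Combine all non-moving child meshes under this root into
        // static batches, letting Unity draw them with far fewer calls.
        StaticBatchingUtility.Combine(gameObject);
    }
}
```

The usual caveat applies: batching only merges draw calls for objects sharing the same material, which is another reason to reuse materials aggressively.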
There were 300+ draw calls before batching and 133 after (SetPass calls); that was still slightly too much for the device to handle, hence V-sync throttling the app to 30 fps.
Here is the profiler after the optimizations :
Rendering is much smoother, batched draw calls are around 80 and the framerate is a steady 60fps.
With all these improvements our app could hit the Oculus Store and is available here to try it yourself. We truly believe that VR on mobile is only at its beginning and is going to blow up in the near future. It is not the easiest platform to start developing on, but we managed to push the device’s boundaries to achieve our creative vision, and we are eager to build more beautiful experiences for the HTC Vive or the Microsoft HoloLens.