17 March 2016

Cardboard Carding Mill

By: Sean Vieira

The DadenU day on the 29th of January arrived a month into my employment, and with it came uncertainty over what I was expected to produce on such a free-form working day. What should I do? What could I do? I had spent the week prior thinking up ideas to pursue, only to throw them away because their scope extended beyond the one-day deadline we had. After some discussion, one suggestion stuck: “Why don’t you see if you can create a Google Cardboard version of our Carding Mill Valley scene?”.

This suggestion was relevant on both a personal and a professional level, so I decided to follow it up. I have an interest in virtual reality (VR) technology but had never developed for Cardboard before, so this seemed a good opportunity to get hands-on with it.

SimpleCardboard.PNG

Getting my phone and our Cardboard headset out, I jumped onto the internet to read up on this particular brand of VR. Google Cardboard is an ingenious, low-end solution that allows anyone with an Android mobile phone to experience virtual reality. Google has adopted an open source stance on the viewer, meaning anyone can develop their own Cardboard-compatible headset (leading to a nice variety of available viewers). There are official viewers that are literally a cardboard box with some velcro (not forgetting the all-important lenses) and cost as little as £10, meaning this is definitely a product for everyone.

The idea behind Cardboard is that you strap your phone into the viewer, rotated to landscape, and start up a VR app. The display is split into two halves, a left and a right, each of which lines up with one of the lenses in the viewer. This is what provides the ‘virtual reality’ illusion to the user when they look into the headset.
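To give a rough idea of that split (the real SDK also applies lens-distortion correction, which the sketch below ignores), two Unity cameras with viewport rects covering each half of the screen are enough to produce the side-by-side view. The camera names here are purely illustrative:

    using UnityEngine;

    // Illustrative only: render the scene side by side with two cameras.
    // The actual Cardboard SDK also applies barrel distortion to counter the lenses.
    public class SimpleSideBySide : MonoBehaviour
    {
        public Camera leftEye;   // assumed child cameras of the player's head
        public Camera rightEye;

        void Start()
        {
            // Each camera draws into one half of the landscape display.
            leftEye.rect  = new Rect(0.0f, 0.0f, 0.5f, 1.0f);
            rightEye.rect = new Rect(0.5f, 0.0f, 0.5f, 1.0f);
        }
    }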

My first step was to get an understanding of how the Google Cardboard SDK works. Fortunately, Google provides a Unity-integrated SDK, which gave me an easier way of moving our Carding Mill Valley scene, itself a scene within a Unity project, to the format required. So I downloaded the SDK and fired up the demo project provided.

Consisting of a cube and a menu on the floor, the demo doesn’t look particularly impressive, but it was interesting to experience it first-hand and get a feel for the design. On inspection of the project, all of the important parts were bundled together into Unity prefabs, meaning that it would be very easy to get a project up and running in a VR-enabled form.

A prefab named ‘CardboardMain’ does most of the work. Within it are two GameObjects, one named ‘Head’ and another named ‘Stereo Render’. Unsurprisingly, the ‘Head’ object acts as the player character’s head, and contains the script that applies the device’s rotation and position to the transform of the Unity GameObject, allowing the user to influence where the camera looks just by moving their head. The object contains the Main Camera, which is a standard Unity Camera with a couple of scripts attached - one which controls whether we render in stereo or mono, and one that provides spatial enhancements to the audio listener. This camera is used to render the scene normally (i.e. in mono) if we have disabled VR mode. It is the parent of two child cameras, a left and a right, which are used to render the scene in stereo when VR mode is enabled.

These cameras represent the left and right eyes of the user, and are offset ever so slightly from the main camera, providing the separation that makes the virtual reality trick work. Each eye contains a script that alters the projection of that eye’s camera and applies this to the stereo render via the controller script on the Main Camera.
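The SDK’s scripts handle all of this, but the core of the ‘Head’ behaviour can be sketched roughly as follows - drive the head transform from the device’s gyroscope and offset the two eye cameras either side of it. This is only a minimal illustration, not the SDK’s actual code, and the eye separation value is a placeholder:

    using UnityEngine;

    // Minimal sketch of head tracking plus eye offsets. The Cardboard SDK's own
    // scripts do this (and the per-eye projection maths) far more robustly.
    public class SimpleHeadTracker : MonoBehaviour
    {
        public Transform leftEye;               // child eye camera transforms
        public Transform rightEye;
        public float eyeSeparation = 0.064f;    // placeholder interpupillary distance in metres

        void Start()
        {
            Input.gyro.enabled = true;
            // Place each eye half the separation to either side of the head.
            leftEye.localPosition  = new Vector3(-eyeSeparation * 0.5f, 0f, 0f);
            rightEye.localPosition = new Vector3( eyeSeparation * 0.5f, 0f, 0f);
        }

        void Update()
        {
            // Convert the gyroscope attitude into Unity's left-handed coordinate space.
            Quaternion att = Input.gyro.attitude;
            transform.localRotation = Quaternion.Euler(90f, 0f, 0f) *
                                      new Quaternion(att.x, att.y, -att.z, -att.w);
        }
    }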

BoxTest.PNG

CardboardMain also contains the ‘Stereo Render’ object, which is where the output from the script attached to the Main Camera goes. This object contains two cameras: a pre-render camera, which provides a solid black background, and a post-render camera, which renders in front of it. The post-render camera is orthographic and displays the stereo output from our two eye cameras side by side, which is then used in conjunction with the lenses in the physical viewer to create the VR effect.

Migrating this to the Carding Mill wasn’t too difficult. After importing the asset package, the MainCamera in the scene was replaced with the CardboardMain prefab, and it was as easy as that to get it to render in stereo. Testing on our development device confirmed that the switch was mostly successful, but the device was really struggling with the size of the landscape and the foliage, and the frame rate was suffering. The foliage was duly removed from the terrain, the landscape detail reduced in distant areas and collision barriers erected so that the user couldn’t wander into the hideous, barren lands. The frame rate was much more palatable when run on the device after that. Some tinkering in the future would perhaps find a nicer balance between efficiency and quality, but for now this would do.
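Those changes were made through the terrain settings in the Unity editor, but the same knobs are exposed in code; something along these lines would have a similar effect (the values here are illustrative rather than the ones we actually used):

    using UnityEngine;

    // Illustrative pass over the active terrain to cut the cost of foliage and
    // distant detail on a mobile device. The real tweaks were made in the editor.
    public class TerrainDetuner : MonoBehaviour
    {
        void Start()
        {
            Terrain terrain = Terrain.activeTerrain;
            if (terrain == null) return;

            terrain.detailObjectDensity  = 0f;    // strip grass and other detail foliage
            terrain.detailObjectDistance = 20f;   // only draw remaining details very close to the camera
            terrain.treeDistance         = 200f;  // cull distant trees
            terrain.heightmapPixelError  = 20f;   // coarser terrain geometry in the distance
            terrain.basemapDistance      = 200f;  // low-resolution textures beyond this range
        }
    }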

In the demo project provided by Google the user is stationary, whereas in our scene the user would be moving. How would I initiate movement for the player, considering that there is only one form of input when using our Google Cardboard viewer? Initially the idea was to let the user hold down the button (which presses on the phone’s screen) to move forwards. Unfortunately this didn’t work out: after the initial contact, the held touch was never continuously registered by the device.

Until a better method was designed, I figured I’d make the button presses toggle movement as a stop-gap solution, so that I had a slightly more impressive demo to show off once the day was over. The direction of movement was determined by where the user was looking. This worked fairly well and wasn’t too disorienting in use, though it could get annoying if you wanted to turn around to look at a view of the valley but forgot to press the button to stop.
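The gist of that stop-gap is sketched below. It uses Unity’s generic tap input as a stand-in for the Cardboard trigger (which the SDK also exposes through its own API), and the field names and walk speed are illustrative rather than taken from our actual script:

    using UnityEngine;

    // Sketch of the toggle navigation: a trigger press (seen by Unity as a screen
    // tap) turns walking on or off, and the player moves in the gaze direction.
    [RequireComponent(typeof(CharacterController))]
    public class GazeToggleWalker : MonoBehaviour
    {
        public Transform head;          // the Cardboard 'Head' transform
        public float walkSpeed = 1.5f;  // metres per second (placeholder value)

        private CharacterController controller;
        private bool walking;

        void Start()
        {
            controller = GetComponent<CharacterController>();
        }

        void Update()
        {
            // Each tap toggles walking on and off.
            if (Input.GetMouseButtonDown(0))
                walking = !walking;

            if (!walking) return;

            // Walk where the head is facing, flattened so that looking down
            // doesn't drive the player into the terrain.
            Vector3 direction = head.forward;
            direction.y = 0f;
            controller.SimpleMove(direction.normalized * walkSpeed);
        }
    }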

BasicValley.PNG

By the end of the DadenU day, I had successfully converted our Carding Mill Valley scene into a walkable, virtual reality landscape for Google Cardboard. Much of this is down to the ease of use of the Unity SDK for the technology, which bodes well for Google Cardboard both now and in the future. This venture was so successful that we have decided to pursue it further. Let’s hope this road will continue to be both interesting and fruitful!

Note: After Sean's work on this we decided to push ahead with it as an early release of FieldscapesVR. Sean has been doing a lot more work on the navigation aspects and we've further enhanced the terrain. The application should be on the Play Store by the end of March, and the iOS version on the App Store by the end of April.
