This very long blog post is about developing for mobile, untethered VR and some design considerations and pitfalls to look out for when doing so, before moving on to performance optimisation, testing and Oculus Home store submission. But before getting into all that, let’s look at why you should consider mobile VR in the first place.
There are currently two main strands of VR development — tethered and untethered, or PC-based and mobile-based. PC-based, tethered VR utilises the powerful graphics cards within desktops to run high-end VR headsets such as the Oculus Rift and HTC Vive. Mobile, untethered VR uses a mobile phone inside a plastic, cardboard or foam VR headset, relying solely upon the processing power offered by that particular handset, which varies wildly from model to model.
Mobile VR Considerations
Mobile VR is currently the cheapest way for consumers and enthusiasts to get into Virtual Reality, with a range of mobile handsets supported and low-cost headsets available. Whilst this comes with some issues (more on that later), current and future mobile handsets are powerful enough to render convincing VR experiences that are comfortable for most users (at a solid 60Hz on Samsung Gear VR). Google Cardboard has shipped over 5 million units, whilst Samsung claims over 1 million concurrent users are active within the Gear VR ecosystem at any given time.
Headsets range from the cheap Google Cardboard v2 (often free at events) to more durable plastic variations, foam versions like Merge VR, the Samsung Gear VR and a whole range of other mobile-device-connected options releasing soon. Whilst having range and options is good, to develop VR you ideally need a clear set of known hardware limitations and capabilities to ensure a solid, smooth, comfortable experience for end users. Therefore I will concentrate on the current best-in-class options: the Samsung Gear VR and Google Daydream VR (releasing later in 2016, but you can start developing now; see my previous blog post about it here).
Because of the low cost of entry for end users and consumers, the mobile VR market is growing rapidly, with hundreds of new apps becoming available each month. 2016 has seen a real increase in general awareness of VR amongst consumers, buoyed by the launches of the Oculus Rift and HTC Vive, the imminent release of Sony PlayStation VR in October, and VR appearing at many public-facing events and shows.
However, because mobile VR is largely based on Android (although Google Cardboard now supports iOS too), it is easy to publish titles on the Play Store, and already we are seeing lots of cheap, poorly designed VR applications that don’t provide great experiences. The Gear VR and Daydream marketplaces are therefore curated (or will be, when released) for quality control.
Whilst this means there are additional barriers to publishing your apps, for end users it helps guarantee a level of quality of experience and, overall, will boost VR adoption, since comfortable experiences see users return or want to continue trying others. So in the short term it’s slightly more of a headache for developers, but in the long term it’s better for everyone, as VR is helped towards mass-market adoption.
Limitations
There are limitations to mobile VR though, which need to be considered before deciding which platform you’re going to support. Firstly, as the name suggests, mobile VR content is driven by mobile devices, not high-end powerful PCs. Whilst newer devices certainly provide more processing power than ever before, there are still limits to how large and complex a scene can be rendered in 3D (if you choose to go down that route). However, that doesn’t make mobile VR ineffective, since a 3D-generated scene doesn’t have to be photorealistic in order to be convincing or immersive for users. Low-poly (low polygon count) worlds can be just as fun and effective in VR, and are far easier to render on mobile with great performance.
Another aspect to consider is that, despite being untethered and allowing the user freedom of movement without being cabled to a PC, it’s currently not possible to track the position of the headset. No matter how far the user wanders, their viewpoint only ever updates with the rotation of their head, not its position. This means that a lot of the full VR experiences they may have tried on tethered VR, such as leaning in to get up close and personal with characters from any angle, or walking around within a room-scale area, are not possible with mobile VR. However, there is hardware being worked on that will eventually allow this.
So with less processing power available and no positional head tracking currently possible, it’s understandable why mobile VR excels at, and is well supported for, 360º photo and video applications. Without any form of real interaction, though, if you are just looking around at or watching something, there is an argument that this isn’t proper VR. Others would also argue that, given the advances being made in mobile VR and its available processing power, developers should target full, high-end VR, since the gap will only narrow and whatever you develop today for mobile VR will have a limited shelf-life as the hardware advances rapidly over the coming months and years.
Setting Up Development
Whilst it’s easier to publish apps on the Google Play Store for Android, and a great place to start if you are happy to release free content, I’ll mostly be focussing on developing for the curated content store offered by Oculus for the Samsung Gear VR, which will also be relevant for the Google Daydream VR ecosystem when it releases later this year.
Development environment, SDKs and being an Oculus developer
To make a 3D VR app, you need a 3D development environment that supports the necessary Software Development Kits (SDKs) for VR, in this case the Oculus Mobile SDK. The best 3D engine to use for this is Unity, which is available for free or via a paid subscription. The free option has some limitations and usage requirements, but the paid subscription now includes everything you need to publish mobile VR on Android devices (i.e. Samsung Gear VR and Google Daydream VR) and iOS (which doesn’t yet have a strong VR footing, although iPhone VR apps are now supported by Google Cardboard), rather than having to pay for these platform add-ons separately as you did with older versions.
Once you have Unity, you need to get the Oculus Mobile SDK for Unity and set yourself up as an Oculus developer. This is a simple registration that creates an Oculus ID to log into the website with, and sets up the area where you will eventually submit your app for review before publishing. The Mobile SDK has gone through a number of iterations and is now straightforward to work with and very well documented. Alongside the Mobile SDK, you’ll also want the Audio SDK to make the best use of the Oculus positional audio offerings, which really add immersion to your VR apps. Similarly, this is now robust and well documented.
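Once the Mobile SDK is imported into your Unity project, it’s worth confirming that VR support is actually initialising on the device. Here’s a minimal sanity-check sketch using the UnityEngine.VR namespace from Unity 5.4 (the exact property names may differ in other versions, so treat them as an assumption to verify against your install):

```csharp
using UnityEngine;
using UnityEngine.VR; // Unity 5.4-era VR namespace

// Attach to any GameObject in your first scene; check the logcat output
// on-device to confirm the Oculus integration is active.
public class VRSanityCheck : MonoBehaviour
{
    void Start()
    {
        Debug.Log("VR enabled: " + VRSettings.enabled);
        Debug.Log("Loaded VR device: " + VRSettings.loadedDeviceName);
        Debug.Log("Headset model: " + VRDevice.model);
    }
}
```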
Now that you have an Oculus ID set up, you can also join the Oculus forums (and the Unity forums are great too) to find answers to any questions or problems you run into, and be part of the wider VR development community.
Hardware
So that’s the software side; to develop professionally and create great apps, you will need the hardware as well, so you can build, test and run your apps locally before submitting to the store. If money is no object, get as many of the supported Samsung handsets as you can afford, to check your app’s performance across the range of devices. If money is tight, aim for the lowest common denominator, the Samsung Galaxy S6, to ensure that your app runs on at least the minimum-specification handset supported. Having the hardware for testing also means you can take your app out on the road before release and have users try it and give you feedback. Usability and user comfort are key elements of VR design and it is very important to make sure your VR apps do not make users feel ill! But this is a subject for another blog post another day…
Make sure your mobile device has both types of developer mode enabled and allows installation of apps from unknown sources:

1. Enable Android developer mode: go to Settings > About device > Build number and tap Build number seven times.

2. Enable Oculus Gear VR developer mode: go to Settings > Application Manager, select Gear VR Service, then Manage Storage. Tap VR Service Version a number of times until the Developer Mode toggle shows up, then slide it over to enable it.

3. Allow apps from unknown sources (i.e. test builds that aren’t downloaded from the Oculus or Google Play Stores): go to Settings > Security, check Unknown sources and OK the warning prompt.

Now your mobile device can launch in-development apps that you side load onto it for testing and demoing. NB. If you can’t see Gear VR Service as an option, make sure you have previously put the device into a Gear VR headset to trigger the installation and setup of the Oculus Gear VR software, drivers and home app — yes, this means you need to buy one!
One final piece linking hardware to software is generating an OSIG (Oculus Signature) file for your Samsung handset and inserting it into your test builds, so that they run on the Gear VR without being a verified store app. To do this, first download a Device ID app from the Google Play Store and run it to get that specific handset’s Device ID. Then go to the Oculus OSIG generator website and enter the Device ID to get your unique OSIG file. Once downloaded, place it inside your Unity project so that your built apps will run on that device.
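For reference, the Oculus documentation at the time of writing has the signature file living inside the Android plugins folder of the Unity project so that it gets packaged into the APK; double-check the exact location against the Mobile SDK docs for your version:

```
<YourUnityProject>/Assets/Plugins/Android/assets/oculussig_<your_device_id>
```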
That’s the software and the hardware covered; what should you design and develop?
Of course, not every app you develop has to be released or destined for public consumption, so it’s a good idea, especially with VR, to create a series of prototypes and simple ideas first before committing to the full development effort of a polished experience. Google spent some time making loads of simple, effective interaction prototypes for Daydream VR to help developers understand what would be possible with the new hardware and input controller. If you’re starting out in VR development, you will need to do this as well, first to understand what does and, importantly, doesn’t work in VR, and then to understand the current limitations of mobile VR. If your app performs poorly or is uncomfortable to use, it will not get past submission.
Keep it simple
Simple ideas and simple interactions work best for mobile VR because of the limited input options available. Yes, bluetooth Android controllers are available and supported by the Samsung Gear VR, and the Google Daydream VR ecosystem states that a device must come with the controller to be compliant, but the majority of users presently do not own bluetooth gamepads. Therefore, if you design an app (typically a game) that only works with a bluetooth gamepad, rather than the built-in touchpad on the side of the headset, you will dramatically reduce the potential size of your buying audience.
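In practice that means treating the touchpad as your primary input and a gamepad as an optional extra. Below is a minimal sketch of that idea; it relies on the fact that a Gear VR touchpad tap is reported to Unity as a standard mouse click, and assumes a “Submit” button mapping in the Input Manager for the optional gamepad path (you could equally use the OVRInput class from the Oculus Utilities for richer touchpad data):

```csharp
using UnityEngine;

// One "select" action that works for everyone: touchpad tap first,
// bluetooth gamepad as an optional bonus, never a requirement.
public class SelectInput : MonoBehaviour
{
    public static bool SelectDown()
    {
        // Gear VR touchpad taps arrive as mouse button 0.
        if (Input.GetMouseButtonDown(0))
            return true;

        // Optional gamepad support via a "Submit" mapping in the Input Manager.
        if (Input.GetButtonDown("Submit"))
            return true;

        return false;
    }

    void Update()
    {
        if (SelectDown())
            Debug.Log("User selected something");
    }
}
```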
The touchpad on the side of the Gear VR (the v1 retail edition) is bumpy to make it feel like a gamepad D-pad, which reduces the ease of diagonal swipes and interactions but makes it clearer for new VR users where to swipe forwards, backwards, up and down, with a nubbin in the centre marking the button tap area. The newly announced Gear VR 2, available to pre-order for release later in the year, has reverted to the flat touchpad that the early Innovator Editions had. This is a good design decision, since more complex movements can be tracked from users’ fingers.
The main downside of the touchpad on the side of the device is that new VR users tend to get headset-grabby, holding the headset with their hands as they get used to the sensation of having their senses taken over by VR. This often results in the user accidentally quitting or pausing the app, depending on how it’s designed to operate, adding to their confusion. It also makes demoing tricky because, unlike tethered full VR, you can’t see what they’re seeing without the app being mirrored to a display somewhere.
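You can’t stop people grabbing the headset, but you can make sure your app recovers gracefully when it happens. In Unity, a loss of focus (for example the proximity sensor disengaging as the headset is fumbled) comes through the OnApplicationPause callback; a simple defensive sketch:

```csharp
using UnityEngine;

// Freezes the simulation when the app loses focus instead of losing state,
// so a grabbed headset doesn't wreck the demo.
public class PauseHandler : MonoBehaviour
{
    void OnApplicationPause(bool paused)
    {
        // Stop time-driven gameplay and mute audio while paused,
        // then resume cleanly when focus returns.
        Time.timeScale = paused ? 0f : 1f;
        AudioListener.pause = paused;
    }
}
```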
Bear in mind that for most people who try your app, it could be the first VR experience they’ve ever had; simple input makes it a lot easier for them to accept and adapt to the technology, which can be overwhelming for some in its own right. Thankfully, as VR adoption becomes more widespread and more and more people have access to VR, this will become less of a concern over time, and hopefully by next year we won’t have to worry about it so much.
Rock solid performance
In order to pass submission and to provide a comfortable experience for users, your app has to run at a rock-solid 60 frames per second (fps) on the Samsung Gear VR. This has been deemed the lowest framerate at which most users find VR comfortable. Any frame drops, even for the briefest instant, can make users feel ill as the virtual world stutters and jitters, struggling to keep up with their head movements.
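During development it helps to know the moment you blow the frame budget, rather than discovering it in a store rejection. The Unity and Oculus profilers are the proper tools for this, but a crude on-device watchdog like the sketch below (thresholds purely illustrative) will at least shout in logcat when a frame runs long:

```csharp
using UnityEngine;

// Logs a warning whenever a frame takes noticeably longer than the
// ~16.7 ms budget required for 60 fps.
public class FrameTimeWatchdog : MonoBehaviour
{
    const float BudgetMs = 1000f / 60f;

    void Update()
    {
        float frameMs = Time.unscaledDeltaTime * 1000f;
        if (frameMs > BudgetMs * 1.1f) // small tolerance for timer noise
        {
            Debug.LogWarning("Dropped frame: " + frameMs.ToString("F1") + " ms");
        }
    }
}
```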
This can be quite a challenge if you are unused to 3D engine optimisation or geometry simplification. You will only have around 50,000 polygons (100,000 tops) to play with in any given frame of your VR scene, so you are going to have to be clever, think about the overall style and look, and use Unity tricks to get it looking its best whilst remaining stable.
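One such trick, aimed at fill rate rather than polygon count, is lowering the resolution of the eye buffers via Unity’s VR render scale. The sketch below assumes the Unity 5.4 UnityEngine.VR API (the property moved in later versions), and the value is just a starting point to experiment with:

```csharp
using UnityEngine;
using UnityEngine.VR; // Unity 5.4-era VR namespace

// Renders the eye buffers at a fraction of the default resolution,
// trading a little sharpness for extra fill-rate headroom.
public class EyeBufferScale : MonoBehaviour
{
    [Range(0.5f, 1.0f)]
    public float renderScale = 0.9f;

    void Start()
    {
        VRSettings.renderScale = renderScale;
    }
}
```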
Thankfully, the latest version of Unity, 5.4, now supports single-pass stereo rendering, so the engine can render both eye viewpoints in a single pass rather than drawing everything twice (once per eye, slightly offset to achieve the necessary 3D depth effect), reducing the effort required from the hardware.
It also helps that John Carmack, of DOOM fame, is now at Oculus and spends most of his time focused on mobile VR development and tools. As a result, Gear VR has long supported asynchronous timewarp, a technique employed by the SDK to smooth out frame drops and allow developers to get away with the occasional jitter on mobile hardware. But this does not mean you should use it as a crutch and think that you do not need to optimise your codebase! It has limitations and won’t always save you or your users.
Oculus states the following targets to aim for, and limitations to bear in mind, when developing for mobile VR with the Gear VR headset:
50–100 draw calls per frame
50,000–100,000 polygons per frame
As few textures as possible (but they can be large)
1–3 ms in script execution
Throttle the CPU and GPU effectively to control heat, battery usage and scene performance
NB. Other Android APIs and SDKs (such as Google Cardboard) do not usually give you direct control over the CPU and GPU in the mobile device; this is something only Oculus offers, on specific Samsung devices, through the partnership behind the Gear VR and its mobile SDKs.
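With the Oculus integration, that throttling boils down to requesting CPU and GPU clock levels from script. The sketch below assumes the OVRManager.cpuLevel and OVRManager.gpuLevel properties exposed by the Oculus Utilities for Unity at the time of writing; verify the exact names against the SDK version you are using:

```csharp
using UnityEngine;

// Requests fixed CPU/GPU clock levels (0-3): lower levels run cooler and
// save battery, higher levels buy performance but heat the handset faster.
public class ClockLevels : MonoBehaviour
{
    [Range(0, 3)] public int cpuLevel = 2;
    [Range(0, 3)] public int gpuLevel = 2;

    void Start()
    {
        // Property names assumed from the Oculus Utilities (OVRManager).
        OVRManager.cpuLevel = cpuLevel;
        OVRManager.gpuLevel = gpuLevel;
    }
}
```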
General Best Practice & Design Considerations
Now it’s time to look at a series of useful tips and tricks centred around general best practice and VR design, to ensure you are providing a great VR experience for new and expert users alike.
General Guidelines
The VR marketplace is still fairly small, although it is now possible to sell upwards of 100,000 copies of a popular VR game on the Oculus Gear VR store. Don’t expect to become a millionaire overnight; this isn’t Angry Birds levels of adoption, yet.