Mobile SDK Version 1.0.3


Copyrights and Trademarks

© 2017 Oculus VR, LLC. All Rights Reserved.

OCULUS VR, OCULUS, and RIFT are trademarks of Oculus VR, LLC. (C) Oculus VR, LLC. All rights reserved. BLUETOOTH is a registered trademark of Bluetooth SIG, Inc. All other trademarks are the property of their respective owners. Certain materials included in this publication are reprinted with the permission of the copyright holder.


Contents

Mobile SDK Getting Started Guide
    Getting Started with the SDK
    Mobile Development with Unity and Unreal
        Mobile Development with Unity
        Mobile Development with Unreal
    System and Hardware Requirements
    Device Setup
    Android Development Software Setup for Windows
        Java Development Kit (JDK)
        Android Studio Installation
        Standalone Android SDK Tools
        Installing Additional Packages and Tools
        Android Native Development Kit (NDK)
        Gradle
        Setting up your System to Detect your Android Device
    Android Development Software Setup for Mac OS X
        Xcode
        Java Development Kit (JDK)
        Android Studio Installation
        Standalone Android SDK Tools
        Installing Additional Packages and Tools
        Android Native Development Kit (NDK)
        Gradle

Mobile Development Basics
    Oculus Signature File (osig) and Application Signing
    Universal Menu and Volume
        Reserved User Interactions
        Universal Menu Implementation
    Developer Mode: Running Apps Outside of the Gear VR Headset
    Android Studio Basics

Native Development Overview
    Native Source Code
    Native Samples
    Android Manifest Settings

Native Engine Integration
    VrApi
        Lifecycle and Rendering
        Frame Timing
        Latency Controls
        Asynchronous TimeWarp
        Power Management
    Advanced Rendering
        Multi-View

Native Application Framework
    Creating New Apps with the Framework Template
    UI and Input Handling
    Native SoundEffectContext
    Runtime Threads

Other Native Libraries
    VrAppSupport
    LibOVRKernel

Media and Assets
    Mobile VR Media Overview
        Panoramic Stills
        Panoramic Videos
        Movies on Screens
        Movie Meta-data
        Oculus 360 Photos and Videos Meta-data
        Media Locations
    Native VR Media Applications
    Models
        Oculus Cinema Theater Creation
        FBX Converter

Mobile Best Practices
    Rendering Guidelines
        Mobile VR Performance
        Frame Rate
        Scenes
        Resolution
    User Interface Guidelines
        Stereoscopic UI Rendering
        The Infinity Problem
        Depth In-Depth
        Gazing Into Virtual Reality
    Adreno Hardware Profile

Testing and Troubleshooting
    Tools and Procedures
        Android System Properties and Local Preferences
        Screenshot and Video Capture
    Oculus Remote Monitor and VrCapture
        VrCapture
        Oculus Remote Monitor
    Android Debugging
        Adb
        Logcat
    Application Performance Analysis
        Basic Performance Stats through Logcat
        SysTrace
        NDK Profiler
        Rendering Performance: Tracer for OpenGL ES
    Android Studio Native Debugging
        Native Debugging with Android Studio
        Native Debugging with ndk-gdb

Mobile Native SDK Migration Guide
    Migrating to Mobile SDK 1.0.3
    Migrating to Mobile SDK 1.0.0
    Migrating to Mobile SDK 0.6.2.0
    Migrating to Mobile SDK 0.6.1.0
    Migrating to Mobile SDK 0.6.0

Release Notes
    1.0 Release Notes
    0.6 Release Notes
    0.5 Release Notes
    0.4 Release Notes
    System Activities/VrApi Release Notes
        1.10.x Release Notes
        1.0.x Release Notes
    System Driver Release Notes
        1.0.x Release Notes
    Oculus Remote Monitor Release Notes
        1.x Release Notes


Mobile SDK Getting Started Guide

Welcome to the Mobile SDK! The Getting Started Guide gives an overview of the SDK and how to work with it, reviews hardware specifications and system requirements, and provides a walkthrough of the one-time setup of the basic tools for Oculus mobile development.

Getting Started with the SDK

This section orients you to Oculus mobile VR development and the contents of this guide.

SDK Contents

• VrApi for third-party engine integration (not required for Unity or Unreal).
• Native application framework for building high-performance VR applications from scratch.
• Additional libraries providing support for GUI, locale, and other functionality.
• Native project sample applications and source to provide a reference model for creating your own VR applications.
• Tools and resources to assist with native development.

Mobile SDK Intro Documentation

• Getting Started Guide: A one-time guide to environment setup.
• Mobile Development Basics: Information every developer should know about Oculus mobile development. Every developer should read through this guide.

Native Developers

Most of the Mobile SDK guide is written for native developers. Complete the setup described in the Getting Started Guide, then move on to the Native Development Overview on page 28.

Unity and Unreal Developers

Mobile developers working with Unity and Unreal should begin with Mobile Development with Unity and Unreal on page 7, as setup and development differ substantially from native setup and development.

Platform Features

Mobile applications may use our Platform SDK (available separately from our Downloads page) to add features related to security (e.g., entitlements), community (e.g., rooms, matchmaking), revenue (e.g., in-app purchases), and engagement (e.g., leaderboards). For more information, see our Platform SDK documentation.

Application Submission

For information on preparing to submit your mobile VR application to Oculus for distribution through the Oculus Store, see our Publishing Guide.

Thank you for joining us at the forefront of virtual reality!


Questions? Visit our developer support forums at https://developer.oculus.com. Our Support Center can be accessed at https://support.oculus.com/.

Mobile Development with Unity and Unreal

Unity and Unreal provide built-in support for Oculus mobile development. If you wish to use the mobile SDK with other engines, please see Native Engine Integration on page 31.

Mobile Development with Unity

Unity 5.1+ provides built-in VR support for Gear VR, enabled by simply checking a box in Player Settings. We provide a free Utilities for Unity 5 unitypackage that includes supplemental scripts, scenes, and prefabs to assist development. The Utilities package is available from our Downloads page.

The Android SDK is required for mobile development in Unity. However, most Unity developers do not need to install Android Studio or the NDK. Unity developers should follow the instructions in our Device Setup guide, and install the Java Development Kit (JDK) and Android SDK before beginning development. To download the Android SDK without the Android Studio IDE, look for the "Get Just the Command Line Tools" option on the Android Studio Download page.

Once you have installed the necessary Android development tools and the Android SDK and completed the necessary integration steps, you can build Unity Android applications normally and enable virtual reality by enabling VR support in the Player Settings.

For more information on Oculus development using Unity, see our Utilities for Unity 5.x Developer Guide. We recommend reviewing all of the relevant performance and design documentation, especially Unity Best Practices: Mobile. Mobile apps are subject to more stringent limitations and requirements than Rift apps, and those limits should be taken into consideration from the onset.

If you are interested in submitting an app to the Oculus Store, be sure to have a good understanding of our standards and requirements, which you can review in our Publishing Guide.

For information about our minimum-recommended Unity Editor versions, please see Unity Version Compatibility in our Unity guide. Unity's Getting Started with Android Development also has useful information on getting started.

Other sections of the Mobile SDK guide that you will find useful include Mobile Best Practices and Testing and Troubleshooting.

Mobile Unity Resources

We provide additional resources to assist mobile developers using Unity.

Utilities

The Utilities for Unity 5 unitypackage includes scenes and scripts specifically for mobile development. A partial list of Utilities assets includes:

• OVRManager: an interface for controlling VR camera behavior with a number of useful features,
• OVRPlayerController: a VR first-person control prefab,
• OVRInput: provides a unified API for Xbox, Touch, and Oculus Remote,
• OVRHaptics: provides an API for Oculus Touch haptic feedback,
• OVRScreenshot: a tool for taking cubemap screenshots of Unity applications,
• Adaptive Resolution: automatically scales down resolution as GPU utilization exceeds 85%, and
• Basic sample scenes.

For more information, see the Detailed Look section in our Utilities for Unity Developer Guide.

Samples

• The Oculus Sample Framework for Unity 5 includes sample scenes and guidelines for common features such as crosshairs, driving, and video rendering to a 2D textured quad. See Unity Sample Framework for more information.

Audio

• Oculus Native Spatializer Plugin for Unity 5 (included with Oculus Audio SDK Plugins): provides easy-to-use, high-quality audio spatialization (see Oculus Native Spatializer for Unity).
• Oculus OVRLipSync: a Unity 5 plugin for animating avatar mouths to match speech sounds (see OVRLipSync).

Mobile Development with Unreal

The Android SDK is required for mobile development with Unreal. However, most Unreal developers do not need to install Android Studio or the NDK. Unreal developers should follow the instructions in our Device Setup on page 11 guide, and install the Java Development Kit (JDK) and Android SDK before beginning development. To download the Android SDK without the Android Studio IDE, look for the "Get Just the Command Line Tools" option on the Android Studio Download page.

Once you have installed the necessary Android development tools and the Android SDK, you can build Unreal Android applications normally.

For more information on Oculus development using Unreal, see Unreal Mobile Development in our Unreal Developer Guide. We recommend reviewing all of the relevant performance and design documentation in our Mobile SDK guide. Mobile apps are subject to more stringent limitations and requirements than Rift apps, and those limits should be taken into consideration from the onset.

If you're interested in shipping an app through the Oculus Store, be sure to have a good understanding of our standards and requirements, which you can review in our Publishing Guide.

Other sections of the Mobile SDK guide that you will find useful include Mobile Best Practices and Testing and Troubleshooting.

For general information about working with the Oculus integration with Unreal, have a look at our Unreal Developer Guide. Note that there are several versions of the engine binary and source available; our guide focuses on features included in the Oculus Unreal GitHub repository.

Mobile Unreal Resources

We provide additional resources to assist mobile developers using Unreal.

Blueprints

• We provide Blueprints for common operations including Gear VR touchpad support and entitlement checking.

Audio

• For spatialized audio, use FMOD with our Oculus Spatialization Plugin for FMOD. Note that our Wwise Oculus Spatialization Plugin does not currently support Android.

System and Hardware Requirements

Please begin by making sure that you are using supported hardware and devices for this release of the Oculus Mobile SDK.

Operating System Requirements

The Oculus Mobile SDK currently supports the following operating systems:

• Windows 7/8/10
• Mac OS: 10.10+ (x86 only)

Minimum System Requirements

The following computer system requirements for the Oculus Mobile SDK are based on the Android SDK system requirements:

• 2.0+ GHz processor
• 2 GB system RAM

Supported VR Headsets

• Samsung Gear VR

Supported Devices

• Samsung Galaxy S7 Edge
• Samsung Galaxy S7
• Samsung Note5
• Samsung Galaxy S6 Edge+
• Samsung Galaxy S6 Edge
• Samsung Galaxy S6

Gear VR Innovator v2

• Samsung Galaxy S6 Edge
• Samsung Galaxy S6

Gear VR Innovator v1

• Samsung Note4

Target Device Requirements

• API Level 19 (Android 4.4.2) or later

Accessories

Bluetooth Gamepad

We recommend using a compatible Bluetooth gamepad for development, such as the SteelSeries Free or Moga Pro. A gamepad is necessary for testing the sample applications which come with this release.


Compatible gamepads must have the following features:

• Wireless Bluetooth connection (BT3.0)
• Compatible with Android devices
• Start and Select buttons

Typical controls include:

• One Analog Stick
• Action Button (4)
• Trigger Button (2)

Bluetooth Keyboard (optional)

It is useful (but not required) to have a Bluetooth keyboard during development. The Logitech K810 is known to function well.

Hardware Details

• Exynos Note4: SoC Exynos 5433; GPU Mali T760 MP6 (700 MHz); CPU octa-core A57/A53 (1.9/1.3 GHz); Memory LPDDR3 2x 32bit (825 MHz); Bandwidth 12.9 GB/s
• Snapdragon Note4: SoC APQ 8084; GPU Adreno 420 (600 MHz); CPU quad-core Krait 450 (2.7 GHz); Memory LPDDR3 2x 64bit (800 MHz); Bandwidth 25.0 GB/s
• Exynos S6: SoC Exynos 7420; GPU Mali T760 MP8 (772 MHz); CPU octa-core A57/A53 (1.9/1.3 GHz); Memory LPDDR4 2x 64bit (1552 MHz); Bandwidth 24.2 GB/s
• Exynos Note5: SoC Exynos 7420; GPU Mali T760 MP8 (772 MHz); CPU octa-core A57/A53 (1.9/1.3 GHz); Memory LPDDR4 2x 64bit (1552 MHz); Bandwidth 24.2 GB/s
• Exynos S7: SoC Exynos 8890; GPU Mali T880 MP12 (650 MHz); CPU octa-core M1/A53 (2.3/1.5 GHz); Memory LPDDR4 2x 64bit (1794 MHz); Bandwidth 28.0 GB/s
• Snapdragon S7: SoC MSM 8996; GPU Adreno 530 (624 MHz); CPU quad-core Kryo (2.1/1.1 GHz); Memory LPDDR4 2x 64bit (1804 MHz); Bandwidth 28.2 GB/s


Device Setup

This section provides information on how to set up your supported device for running, debugging, and testing your Gear VR application. Please review the System and Hardware Requirements above for the list of supported devices for this SDK release.

Note: This information is accurate at the time of publication. We cannot guarantee the consistency or reliability of third-party applications discussed in these pages, nor can we offer support for any of the third-party applications we describe.

Configuring Your Device for Debugging

In order to test and debug applications on your Android device, you will need to enable specific developer options on the device:

1. Configure Developer Options in Settings
   • Enable USB Debugging
   • Allow mock locations
   • Verify apps via USB
2. Configure Display Options in Settings
   • Disable lock screen
   • Set Display Timeout (optional)

Developer Options

Note: Depending on which mobile device you are using, options and menu names may vary slightly.

Developer options may be found under: Settings -> System -> Developer options. Developer options may be hidden by default. If so, you can expose these options with the following steps:

1. Locate the Build number option in Settings. Android M and later: Go to Settings -> System -> About device -> Software Info. Earlier Android versions: Go to Settings -> System -> About device.
2. Scroll down to Build number.
3. Press Build number seven times. You should be informed that Developer options has been enabled.

Once you have found Developer options, enable the following:

USB Debugging: This will allow the tools to install and launch deployed apps over USB. You should see the screen shown in the accompanying figure.


Note: If the above screen does not appear, ensure that your computer recognizes the device when it is connected. If not, you may need to pull down the notifications bar on your phone, find the USB connection setting, and set USB to Software installation (it may be set to Charging by default). If you still do not see the pictured screen, try a different USB cable. If your phone is recognized by your computer but you do not see the above screen, try toggling USB Debugging off then back on.

Check Always allow this computer and hit OK. To purge the authorized whitelist for USB Debugging, press Revoke USB debugging authorizations from the Developer options menu and press OK.

Allow mock locations: This will allow you to send mock location information to the device (convenient for apps which use Location Based Services).

Verify apps via USB: This will check installed apps from ADB/ADT for harmful behavior.

Display Options

The following display options are found in: Home -> Apps -> Settings -> Sound and Display.

Lock screen/Screen Security/Screen lock: Set to None so the Home screen is instantly available, without swipe or password. Useful to quickly get in and out of the phone.

Display/Screen timeout: Set the time to your desired duration. Useful if you are not actively accessing the device but wish to keep the screen awake longer than the default 30 seconds.

See Android Debugging for more information.

Android Development Software Setup for Windows

In order to develop Android applications, you must have the following software installed on your system:

1. Java Development Kit (JDK)
2. Android Studio Development Bundle or Standalone Android SDK Tools
3. Android Native Development Kit (NDK)

Gradle installation is recommended but not required. See Gradle for more information.


Java Development Kit (JDK)

The Java Development Kit is a prerequisite for Android Studio and Gradle.

The latest version which has been tested with this release is JDK 8u91, available from the Java Archive Downloads page: http://www.oracle.com/technetwork/java/javase/downloads/java-archive-javase8-2177648.html

The latest JDK version is available here: http://www.oracle.com/technetwork/java/javase/downloads/jdk8-downloads-2133151.html

Once downloaded and installed, add the environment variable JAVA_HOME and set its value to the JDK install location. For example, the value may be C:\Program Files\Java\jdk1.8.0_91, if you have installed the x64 version. Based on the default installation path of Java SE 8u91, the correct syntax when using set from the command line is:

set JAVA_HOME="C:\Program Files\Java\jdk1.8.0_91"

Note: The JAVA_HOME value must be your actual path, which may differ from these examples.

Additionally, add the JDK to the value of your PATH, e.g. C:\Program Files\Java\jdk1.8.0_91\bin
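To confirm that the variables are visible to new processes, you can open a fresh Command Prompt and check them; the exact output depends on the JDK version you installed:

rem Run in a new Command Prompt after setting JAVA_HOME and PATH
echo %JAVA_HOME%
java -version
javac -version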

Android Studio Installation

To use Android Studio as your primary IDE, download the Android Studio Bundle from the following location: https://developer.android.com/studio/index.html

The Android Studio Development Bundle includes the basic tools you need to begin developing Java Android Applications:

• Android Studio IDE
• Android SDK tools
• Latest Android Platform
• Latest System Image for Emulator

Once downloaded, refer to the readme.txt included with the Android SDK, or follow Android’s installation instructions: https://developer.android.com/studio/install.html?pkg=studio

Standalone Android SDK Tools

Some developers may wish to develop mobile VR applications without using Android Studio. This section describes how to set up for development using the basic set of tools needed for development with the standalone Android SDK and Gradle.

1. Download the SDK Tools Package from the following location: https://developer.android.com/studio/index.html (under "Get just the command line tools")
2. Save the exe to the directory where you would like to install it, e.g.: C:\Dev\Android\android-sdk-r24.4.1\
3. Once downloaded, double-click the exe to unpack its contents into the parent directory.
4. Add the environment variable ANDROID_HOME, and set the value to your Android SDK location. For example: C:\Dev\Android\android-sdk-r24.4.1
5. Using the above location as an example, the correct syntax for setting your environment variable using set from the command line is: set ANDROID_HOME=C:\Dev\Android\android-sdk-r24.4.1
6. Add the SDK tools and platform tools directories to your PATH. The correct syntax for setting your path using set from the command line is: set PATH=%PATH%;%ANDROID_HOME%\tools;%ANDROID_HOME%\platform-tools
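Note that set only affects the current Command Prompt session. If you want the variable to persist across sessions, one option is to define it once with setx and to add the tools directories to PATH through Control Panel > System > Advanced system settings > Environment Variables; the path below is the example install location from step 2:

rem Persist ANDROID_HOME for future Command Prompt sessions
setx ANDROID_HOME "C:\Dev\Android\android-sdk-r24.4.1"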

Installing Additional Packages and Tools

Android Studio

You must download additional packages required by the Mobile SDK via the Android SDK Manager, found in Tools > Android > SDK Manager. Android Studio may prompt you to take this step automatically the first time you launch it.

Standalone Android SDK Tools

To launch the Android SDK Manager, double-click the SDK Manager.exe file at the root of the SDK Tools Package location.

The following packages are required for native development:

• Android SDK 4.4.2, API level 19 or later
• Android Build Tools

Once the Android SDK is installed, launch the SDK Manager to verify that you have installed the latest stable SDK Tools, Platform-tools, and Build-tools. Also verify that you have at least one SDK Platform installed, preferably the one you intend to target with your application.

Android Native Development Kit (NDK)

The Android Native Development Kit (NDK) is a toolset that allows you to implement parts of your app using native code languages such as C and C++. It is used extensively by the sample applications included with this release.

Note: You may install the NDK during the Android Studio installation process, but we recommend installing it manually to be sure that your command-line environment is set up properly and agrees with your Studio setup.

The last version of the NDK known to work with the Mobile SDK is r11c.

1. Download the appropriate version of the NDK from the following location: https://developer.android.com/ndk/downloads/index.html
2. Save the exe to the directory where you would like to install it, e.g., C:\Dev\Android\android-ndk-r11c\
3. Once downloaded, double-click the exe to unpack its contents into the parent directory.
4. Add the NDK location to your PATH. For example: C:\Dev\Android\android-ndk-r11c\
5. Add the environment variable ANDROID_NDK, and set the value to your Android NDK location. For example: C:\Dev\Android\android-ndk-r11c
6. Using the above location as an example, the correct syntax for setting your environment variable using set from the command line is: set ANDROID_NDK=C:\Dev\Android\android-ndk-r11c


Gradle

Gradle is a build automation suite used by our standalone build scripts and by Android Studio to manage dependencies and allow for custom build logic. Gradle replaces the previous Android build system, Ant, which is now deprecated for use in Android development.

It is not necessary to install Gradle to use the Mobile SDK. Oculus Mobile SDK 1.0+ build scripts use the Gradle Wrapper (gradlew), a small wrapper that automatically downloads and installs Gradle the first time you build a project. However, if you wish to install the full version of Gradle, we have included instructions.

The latest version which has been tested with this release is Gradle 2.10. Choose the Complete download if you want the Gradle source and offline documentation.

To install Gradle:

1. Download version 2.10 from http://gradle.org/gradle-download/
2. Extract into the desired development directory, e.g., C:\Dev\Android\gradle-2.10.
3. Add the Gradle bin directory to your PATH variable, e.g.: C:\Dev\Android\gradle-2.10\bin
4. Restart your terminal window to fetch the updated PATH variable.
5. You should now be able to build with the SDK build scripts, e.g.: build.py -g
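If you install the full Gradle distribution this way, you can confirm that it is on your PATH by checking the reported version from a new Command Prompt; it should match the release you extracted:

gradle -v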

Setting up your System to Detect your Android Device

You must set up your system to detect your Android device over USB in order to run, debug, and test your application on an Android device.

If you are developing on Windows, you may need to install a USB driver for adb after installing the Android SDK. For an installation guide and links to OEM drivers, see the Android OEM USB Drivers document. Samsung Android drivers may be found on their developer site: http://developer.samsung.com/android/tools-sdks/Samsung-Android-USB-Driver-for-Windows

Windows may automatically detect the correct device and install the appropriate driver when you connect your device to a USB port on your computer. Access the Device Manager through the Windows Control Panel. If the device was automatically detected, it will show up under Portable Devices in the Device Manager. Otherwise, look under Other Devices in the Device Manager and select the device to manually update the driver.

To verify that the driver successfully recognized the device, open a command prompt and type the command:

adb devices

Note: You will need to successfully set up your Android development environment in order to use this command. For more information, see Android Development Software Setup for Windows.

If the device does not show up, verify that the device is turned on with enough battery power, and that the driver is installed properly.
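When the driver and adb are working, the connected phone is listed with a device state; the serial number below is only a placeholder:

adb devices
List of devices attached
0a1b2c3d4e5f        device

If the state reads unauthorized instead of device, accept the USB debugging authorization prompt on the phone and run the command again.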

Android Development Software Setup for Mac OS X

In order to develop Android applications, you must have the following software installed on your system:

1. Xcode
2. Java Development Kit (JDK)
3. Android Studio Development Bundle, or Standalone Android SDK Tools
4. Android Native Development Kit (NDK)

Gradle installation is recommended but not required. See Gradle for more information.

Your Samsung device may display a notification recommending you install Android File Transfer, a handy application for transferring files between OS X and Android.

Xcode

Before installing any Android development tools, you must install Xcode. Once installation is complete, some of the following steps (such as installing the JDK) may be unnecessary. To download Xcode, visit https://developer.apple.com/xcode/download/

Java Development Kit (JDK)

Install the JDK if it is not already present on your system. If you have already installed Xcode, this step may be unnecessary.

The latest version tested with this release is JDK 8u91. It may be downloaded at the following location: http://www.oracle.com/technetwork/java/javase/downloads/jdk8-downloads-2133151.html

Android Studio Installation

To use Android Studio as your primary IDE, download the Android Studio Bundle from the following location: https://developer.android.com/sdk/index.html

The Android Studio Development Bundle includes the basic tools you need to begin developing Java Android Applications:

• Android Studio IDE
• Android SDK tools
• Latest Android Platform
• Latest System Image for Emulator

Once downloaded, follow the install instructions located here: https://developer.android.com/sdk/installing/index.html?pkg=studio

Standalone Android SDK Tools

Some developers may wish to develop mobile VR applications without using Android Studio. This section describes how to set up for development using the basic set of tools needed for development with the standalone Android SDK and Gradle.

1. Download the SDK Tools Package from the following location: https://developer.android.com/studio/index.html (under "Get just the command line tools").
2. Once downloaded, unzip the archive into the directory where you would like to install it, e.g., ~/Dev/Android/android-sdk-r24.4.1.
3. Set the environment variable ANDROID_HOME to your Android SDK location. Using the example location above, the correct syntax is: export ANDROID_HOME=~/Dev/Android/android-sdk-r24.4.1
4. Add the SDK tools and platform tools directories to your PATH. The correct syntax for setting your path is: export PATH=${PATH}:$ANDROID_HOME/tools:$ANDROID_HOME/platform-tools
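These export commands only affect the current shell session. To make the variables persistent, you can append the same lines to your shell startup file and open a new terminal; the SDK path below assumes the example install location from step 2:

# Added to ~/.bash_profile (or your shell's equivalent startup file)
export ANDROID_HOME=~/Dev/Android/android-sdk-r24.4.1
export PATH=${PATH}:$ANDROID_HOME/tools:$ANDROID_HOME/platform-tools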


Installing Additional Packages and Tools

Android Studio

You must download additional packages required by the Mobile SDK via the Android SDK Manager, found in Tools > Android > SDK Manager. Android Studio may prompt you to take this step automatically the first time you launch it.

Standalone Android SDK Tools

To launch the Android SDK Manager without Android Studio, open a terminal and navigate to the tools/ directory of the SDK Tools Package location, then execute the program 'android'. If you would rather download packages directly from the command line, instead run android list sdk -a to see a list of packages, then run android update sdk -a -u -t [package id]. For help locating the package id, run android -h update sdk.

The following packages are required for native development:

• Android SDK, API level 19 or later
• Android Build Tools

Once the Android SDK is installed, launch the SDK Manager to verify that you have installed the latest stable SDK Tools, Platform-tools, and Build-tools. Also verify that you have at least one SDK Platform installed, preferably the one you intend to target with your application.

Android Native Development Kit (NDK)

The Android Native Development Kit (NDK) is a toolset that allows you to implement parts of your app using native code languages such as C and C++. It is used extensively by the sample applications which come with this release.

The latest version which has been tested with this release is NDK r11c. It is available for download at the following location: https://developer.android.com/tools/sdk/ndk/index.html

Once downloaded, extract the NDK to your home/dev folder (~/dev).


Note that Android Studio and the NDK are extracted into their own folders. These folders may be given any name you wish, but they must be installed into separate folders to avoid any conflict between the two packages.

You can read more about installation and use of the NDK here: http://developer.android.com/tools/sdk/ndk/index.html#installing

Gradle

Gradle is a build automation suite used by our standalone build scripts and by Android Studio to manage dependencies and allow for custom build logic. Gradle replaces the previous Android build system, Ant, which is now deprecated for use in Android development.

It is not necessary to install Gradle to use the Mobile SDK. Oculus Mobile SDK 1.0+ build scripts use the Gradle Wrapper (gradlew), a small wrapper that automatically downloads and installs Gradle the first time you build a project. However, if you wish to install the full version of Gradle, we have included instructions.

The latest version which has been tested with this release is Gradle 2.10. Choose the Complete download if you want the Gradle source and offline documentation.

Manual Installation

1. Download from http://gradle.org/gradle-download/ (requires version 2.5+)
2. Extract into some common development directory. For example: ~/android/gradle-2.10
3. Add the Gradle bin directory to your PATH variable. For example: ~/android/gradle-2.10/bin
4. Open a new Terminal window to refresh the PATH variable.

You should now be able to build with ./build.py -g

Install using Homebrew:

Requires Homebrew, an optional package manager for OS X.

1. Run brew install gradle in the terminal.
2. Open a new terminal window to refresh the PATH variable.

You should now be able to build with ./build.py -g


Install using MacPorts:

Requires MacPorts, an optional package manager for OS X.

1. Run sudo port install gradle in the terminal.
2. Open a new terminal window to refresh the PATH variable.

You should now be able to build with ./build.py -g


Mobile Development Basics

This guide reviews basic development steps you'll need to know, such as application signing and required support for Universal Menu and volume handling.

Oculus Signature File (osig) and Application Signing

Oculus mobile apps require two distinct signatures at different stages of development:

• Oculus Signature File (required during development, remove for submission)
• Android Application Signature (required for submission)

Oculus Signature File (osig)

During development, your application must be signed with an Oculus-issued Oculus Signature File, or osig. This signature comes in the form of a file that you include in your application in order to access protected low-level VR functionality on your mobile device. Each signature file is tied to a specific device, so you will need to generate osig files for each device that you use for development. When your application is submitted and approved, Oculus will modify the APK so that it can be used on all devices.

Please see our osig self-service portal for more information and instructions on how to request an osig for development: https://dashboard.oculus.com/tools/osig-generator/

Android Application Signing

Android uses a digital certificate (also called a keystore) to cryptographically validate the identity of application authors. All Android applications must be digitally signed with such a certificate in order to be installed and run on an Android device. All developers must create their own unique digital signature and sign their applications before submitting them to Oculus for approval. For more information and instructions, please see Android's "Signing your Applications" documentation: http://developer.android.com/tools/publishing/app-signing.html

Make sure to save the certificate file you use to sign your application. Every subsequent update to your application must be signed with the same certificate file, or the update will fail.

Note: Your application must be signed by an Android certificate before you submit it to the Oculus Store.

Android Application Signing and Unity

Unity automatically signs Android applications with a temporary debug certificate by default. Before building your final release build, create a new Android keystore and assign it with the Use Existing Keystore option, found in Edit > Project Settings > Player > Publishing Options. For more information, see the "Android" section of Unity's documentation here: http://docs.unity3d.com/Manual/class-PlayerSettings.html
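A release keystore, whether for a native or Unity project, can be generated with the JDK's keytool; the keystore file name, alias, and validity period below are placeholders to replace with your own values:

keytool -genkey -v -keystore my-release-key.keystore -alias my-app-alias -keyalg RSA -keysize 2048 -validity 10000

The tool prompts for keystore and key passwords plus identity fields, and the resulting .keystore file is what you keep (and back up) in order to sign every future update with the same certificate.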


Universal Menu and Volume

The Universal Menu provides features such as the Pass-Through Camera, shortcut to Oculus Home, Reorient, and Do Not Disturb options, along with various system state indicators such as Wi-Fi signal strength and battery level.

The Universal Menu is part of the Oculus System Activities application which is installed to the user's device along with Oculus Home and Horizon. The Universal Menu is activated when the user initiates the relevant reserved button interactions described below.

Reserved User Interactions

Back button and volume button behaviors must conform to specific requirements.

Volume Button Interactions

The volume buttons are reserved, and volume adjustment on the Samsung device is handled automatically. Volume control dialog display is also handled automatically by the VrApi as of Mobile SDK 1.0.3. Do not implement your own additional volume display handling, or users will see two juxtaposed displays. You may override automatic volume display handling if necessary by setting VRAPI_FRAME_FLAG_INHIBIT_VOLUME_LAYER as an ovrFrameParm flag, as sketched below.
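A minimal sketch of overriding the automatic volume display follows. It assumes frame parameters initialized with vrapi_DefaultFrameParms() as in the SDK samples, with ovr and java being the application's ovrMobile and ovrJava handles; the exact structure and field names should be verified against VrApi_Types.h in your SDK before use.

// Sketch: suppress the system volume layer because this app draws its own indicator.
// Struct/field names follow the VrApi headers shipped with this SDK; verify in VrApi_Types.h.
ovrFrameParms frameParms = vrapi_DefaultFrameParms( &java, VRAPI_FRAME_INIT_DEFAULT,
                                                    vrapi_GetTimeInSeconds(), NULL );
frameParms.Flags |= VRAPI_FRAME_FLAG_INHIBIT_VOLUME_LAYER;   // app now owns the volume UI
// ... fill in frameParms.Layers with this frame's eye textures ...
vrapi_SubmitFrame( ovr, &frameParms );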

Back Button Interactions

Back button long-presses must always be associated with the Universal Menu. Short-presses are typically (but not necessarily) treated as a generic back action. For example, a short-press on the back button may bring up the application's own menu. In another application, a short-press may act as a generic back navigation in the UI hierarchy until the root is reached, at which point it may bring up an application-specific menu, or enter the Universal Menu with a confirmation dialog, allowing the user to exit the application to Oculus Home.

Long-press

A long-press occurs when a user presses the back button and holds it for more than 0.75 seconds, then releases it.

• A long-press must always open the Universal Menu.
• Apps must implement Universal Menu access through integration with the Oculus Mobile SDK when long-presses are detected.

Short-press

A short-press occurs when a user presses the back button once within a 0.25 second window, then releases it.

• If a single press of the back button is longer than a short-press (0.25 seconds) but shorter than a long-press (0.75 seconds), it results in an aborted long-press and cancels the Universal Menu timer.
• The way in which a back action is handled by an application depends on the application's current state. Back actions usually prompt apps to navigate one level up in an interface hierarchy. For example, if the top-level screen of an app menu is active, a short-press will exit the app menu. If no satisfactory stateful condition is identified by the application, the short-press opens the Universal Menu with a confirmation dialog allowing the user to exit the app and return to Oculus Home.

Universal Menu Implementation

Native apps

For native applications, the Universal Menu may be started with App::StartSystemActivity(). In native apps, the application is responsible for hooking back key short-presses by overloading VrAppInterface::OnKeyEvent() and deciding when the user is at the root of the application's UI, at which point it should ignore the back key event by returning false. This will allow VrAppFramework to handle the back key and start the Universal Menu quit confirmation dialog. A sketch of this pattern appears below.

Unity apps

See OVRPlatformMenu.cs in the Utilities for Unity 5 for sample execution. For more information, see Design Considerations: Universal Menu in our Unity 5 guide.
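The following sketch illustrates the native back-key pattern described above. The override signature, key code, and event-type names follow the VrAppFramework headers in this SDK and should be checked against App.h and the VrCubeWorld samples; MenuStack is a hypothetical application-side UI stack.

// Sketch: consume back short-presses while navigating app UI, and return false at the
// root so VrAppFramework can show the Universal Menu quit confirmation dialog.
bool MyAppInterface::OnKeyEvent( const int keyCode, const int repeatCount, const KeyEventType eventType )
{
    if ( keyCode == OVR_KEY_BACK && eventType == KEY_EVENT_SHORT_PRESS )
    {
        if ( !MenuStack.IsEmpty() )   // hypothetical app-side menu stack
        {
            MenuStack.Pop();          // navigate one level up in the app's UI
            return true;              // consumed: the framework does nothing further
        }
        return false;                 // at the UI root: let the framework handle it
    }
    return false;                     // all other keys go to the framework
}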

Developer Mode: Running Apps Outside of the Gear VR Headset

To run a VR application without inserting your mobile device into the Gear VR headset, which can be a time-consuming process during development, you must enable Developer Mode. When Developer Mode is enabled, any Oculus application on your mobile device will launch with distortion and stereoscopic rendering applied without inserting the device into a headset.

Devices in Developer Mode implement limited orientation tracking using phone sensors. Orientation tracking may be disabled with the appropriate setting in System Properties or Local Preferences (see Android System Properties and Local Preferences on page 79 for details).

Enable Developer Mode

Note: Before enabling Developer Mode, you must have previously built an apk signed with your osig file and installed it to your device. Otherwise, you will see an error message saying "You are not a developer!"

1. Go to Settings > Application Manager.
2. Select Gear VR Service.
3. Select Manage Storage.
4. Tap VR Service Version several times until the Developer Mode toggle shows up.
5. Toggle Developer Mode.

Note: If you do not see the Developer Mode toggle switch after tapping VR Service Version several times, close Gear VR Service and relaunch, and you should see it.

Android Studio Basics

This guide introduces the Android Studio IDE and reviews its basic relevant features.

Getting Started with Oculus Native Samples: Import Gradle Project

When you first launch Android Studio after initial setup, you will be presented with a Welcome screen. Select Import project and point it at our root Gradle project.


You may import any build.gradle project in any of the Mobile SDK VrSamples directories, and Android Studio will automatically import the required dependencies without bringing in all of the samples.

Project Overview

We recommend changing the Project Overview from the default Android to Project, which provides a good overview of the entire directory structure, with imported project directories shown in bold.


Project Settings

After importing your project, verify that the SDK and NDK locations are configured properly in File > Project Structure..., or your build will fail.


Select Target Configuration, Build, and Run

You may build and run your application on your device directly from Android Studio. This will compile your project, build the APK, copy it to the phone via USB/Wi-Fi, and prepare it for launching.

If your phone is set to Developer Mode (see Developer Mode for instructions), your application will play without being inserted into your Gear VR headset. Otherwise, when the process completes you will be prompted to insert your mobile device into the headset to launch the application.

Make sure you have followed the configuration steps in the Mobile SDK Setup Guide to ensure your device is configured appropriately. Applications must be signed with an Oculus Signature File (osig) in order to run (see Application Signing for more information).

Select the target configuration you wish to build before building by selecting Edit Configurations… in the project dropdown menu in the Android Studio toolbar.

Immediately to the right of the project dropdown, you will see Run and Debug icons. Click on either icon to build and run your project.

After clicking Run or Debug, the Choose Device dialog opens. It is usually set to an emulator by default. Select Choose a running device and select the appropriate entry. At this point, the phone may prompt you to enable USB debugging. If the device is not connected and there is no prompt, try the following:

1. Go to Developer Options on your phone.


2. Toggle USB Debugging off and then back on. This should cause the prompt to appear.

You may also build and clean your project without running it by using the Build menu.

Syncing Projects

If you edit a *.gradle file or install an update to the Oculus Mobile SDK that includes updated Gradle projects, you must perform a Gradle sync to update the Android Studio project files.

Click Sync Project with Gradle Files in the Android Studio toolbar.

Migrating Eclipse Projects to Android Studio

For a general discussion of how to import existing Eclipse projects into Android Studio, please see the instructions provided by Android here: http://developer.android.com/sdk/installing/migrate.html


Native Development Overview

Welcome to the Native Development Guide. This guide describes the libraries, tools, samples, and other material provided with this SDK for native development of mobile VR applications.

While native software development is comparatively rudimentary, it is closer to the metal and allows implementing very high performance virtual reality experiences without the overhead of elaborate environments such as you would find with a typical game engine. It is not feature rich, but it provides the basic infrastructure you will need to get started with your own high-performance virtual reality experience.

This SDK includes several sample projects which provide an overview of the native source code. See Native Samples on page 29 for details.

Note: This guide is intended to provide a high-level orientation and discussion of native development with the mobile SDK. Be sure to review the header files of any libraries you use for a more extensive, in-depth discussion.

Native Source Code

This section describes Gear VR native source code development. The native SDK provides four basic native libraries:

• VrApi: the minimal API for VR.
• VrAppFramework: the application framework used by native apps.
• VrAppSupport: support for GUI, locale handling, sound, and more.
• LibOVRKernel: a low-level Oculus library for containers, mathematical operations, and more.

VrApi provides the minimum required API for rendering scenes in VR. Applications query VrApi for orientation data and submit textures to VrApi, which applies distortion, sensor fusion, and compositing. Developers working with a third-party engine other than Unity or Unreal will use VrApi to integrate the mobile SDK. For detailed information, see Native Engine Integration on page 31.

The VrAppFramework handles VrApi integration and provides a wrapper around Android activity that manages the Android lifecycle. The VrAppFramework is the basis for several of our samples and first-party applications, including Oculus Video and Oculus 360 Photos. If you are not using Unity, Unreal, or another integration and you would like a basic framework to help get started, we recommend that you have a look. See Native Application Framework on page 47 for more information.

VrAppSupport and LibOVRKernel provide minimal functionality to applications using VrAppFramework, such as GUI, sound, and locale management. See Other Native Libraries on page 51 for more information.

The VrAppInterface (part of VrAppFramework) has a clearly-defined lifecycle, which may be found in VrAppFramework/Include/App.h.

LibOVRKernel and VrAppFramework ship with full source as well as pre-built libs and jar files. The VrApi is shipped as a set of public include files, a pre-built shared library, and a jar file. Providing the VrApi in a separate shared library allows the VrApi implementation to be updated after an application has been released, making it easy to apply hot fixes, implement new optimizations, and add support for new devices without requiring applications to be recompiled with a new SDK. The VrApi is periodically updated automatically on Samsung phones - for release notes, see System Activities/VrApi Release Notes.

See the VrSamples/Native/VrCubeWorld projects for examples of how to integrate VrApi into third-party engines as well as how to use VrAppFramework. Please see Native Samples on page 29 for more details.


Main Components

Component: VrApi
Description: The Virtual Reality API provides a minimal set of entry points for enabling VR rendering in native applications and third-party engines.
Source code folder: VrApi

Component: VrApi Includes
Description: Header files for the VrApi library.
Source code folder: VrApi/Includes

Component: Application Framework
Description: Framework and support code for native applications. Includes code for rendering, user interfaces, sound playback, and more. It is not meant to provide the functionality of a full-fledged engine, but it does provide structure and a lot of useful building blocks for building native applications.
Source code folder: VrAppFramework/Src

Component: VrAppFramework Includes
Description: Header files for the VrAppFramework library.
Source code folder: VrAppFramework/Include

Component: Native Samples
Description: Sample projects illustrating use of VrApi and VrAppFramework.
Source code folder: VrSamples/Native

Component: LibOVRKernel
Description: The Oculus library.
Source code folder: LibOVRKernel/Src

Native Samples

The mobile SDK includes a set of sample projects that demonstrate virtual reality application development on the Android platform and high-performance virtual reality experiences on mobile devices.

Sample Applications and Media

Note: The sample applications included with the SDK are provided as a convenience for development purposes. Some of these apps are similar to apps available for download from the Oculus Store. Due to the potential for conflict with these versions, we do not recommend running these sample apps on the same device on which you have installed your retail Gear VR experience. If you are using the Samsung Note 4, please take care to secure the retail media content bundled with the SM-R320. It will be difficult if not impossible to replace.

The following samples may be found in \VrSamples\Native\:

• Oculus 360 Photos: A viewer for panoramic stills.
• Oculus 360 Videos: A viewer for panoramic videos.
• Oculus Cinema: Plays 2D and 3D movies in a virtual movie theater.
• VR Cube World: A sample scene of colorful cubes illustrating the basic construction of native apps using different tools provided by the SDK. There are three versions of VR Cube World:
  • VrCubeWorld_SurfaceView is closest to the metal. This sample uses a plain Android SurfaceView and handles all Android Activity and Android Surface lifecycle events in native code. This sample does not use the application framework or LibOVRKernel - it only uses the VrApi. It provides a good example of how to integrate the VrApi in an existing engine. The MULTI_THREADED define encapsulates the code that shows how the VrApi can be integrated into an engine with a separate renderer thread.
  • VrCubeWorld_NativeActivity uses the Android NativeActivity class to avoid manually handling all the lifecycle events. This sample does not use the application framework or LibOVRKernel - it only uses the VrApi. It provides a good example of how to integrate the VrApi in an existing engine that uses a NativeActivity. The MULTI_THREADED define encapsulates the code that shows how the VrApi can be integrated into an engine with a separate renderer thread.
  • VrCubeWorld_Framework uses the Oculus Mobile Application Framework. When starting a new application from scratch without using a pre-existing engine, the application framework can make development significantly easier. The application framework presents a much simplified application lifecycle, allowing a developer to focus on the virtual reality experience without having to deal with many of the platform-specific details.

Installation

To install these sample applications and associated data to your mobile device, perform the following steps:

1. Connect to the device via USB.
2. Run installToPhone.bat from your Oculus Mobile SDK directory, e.g.: C:\Dev\Oculus\Mobile\installToPhone.bat.
3. Issue the following commands from C:\Dev\Oculus\Mobile\:

   adb push sdcard_SDK /sdcard/
   adb install -r *.apk

4. Alternatively, you may copy the files directly onto your mobile device using Windows Explorer, which may be faster in some cases.

Android Manifest Settings

Configure your manifest with the necessary VR settings, as shown in the following manifest segment.

Note: These manifest requirements are intended for development and differ from our submission requirements. Before submitting your application, please be sure to follow the manifest requirements described by our Publishing Guide.

<meta-data android:name="com.samsung.android.vr.application.mode" android:value="vr_only"/>

• Replace the package name with your actual package name, such as "com.oculus.cinema".
• The Android theme should be set to the solid black theme for comfort during application transitioning: Theme.Black.NoTitleBar.Fullscreen
• The vr_only meta data tag should be added for VR mode detection.

• The required screen orientation is landscape: android:screenOrientation="landscape"
• It is recommended that your configChanges are as follows: android:configChanges="screenSize|orientation|keyboardHidden|keyboard"
• The minSdkVersion and targetSdkVersion are set to the API level supported by the device. For the current set of devices, the API level is 19.
• Do not add the noHistory attribute to your manifest.

Application submission requirements may require additional adjustments to the manifest. Please refer to the submission guidelines available in our Publishing Guide.


Native Engine Integration

This guide describes how to integrate the mobile native SDK with a game engine using VrApi. VrApi provides the minimum required API for rendering scenes in VR. Applications query VrApi for orientation data and submit textures to VrApi, which applies distortion, sensor fusion, and compositing.

We have provided the source for VrCubeWorld_NativeActivity and VrCubeWorld_SurfaceView, simple sample applications using VrApi, to serve as references. Please see Native Samples on page 29 for more details.

VrApi Lifecycle and Rendering

Multiple Android activities that live in the same address space can cooperatively use the VrApi. However, only one activity can be in "VR mode" at a time. The following explains when an activity is expected to enter/leave VR mode.

Android Activity lifecycle

An Android Activity can only be in VR mode while the activity is in the resumed state. The following shows how VR mode fits into the Android Activity lifecycle.

1. VrActivity::onCreate() <---------+
2. VrActivity::onStart() <-------+  |
3. VrActivity::onResume() <---+  |  |
4. vrapi_EnterVrMode()        |  |  |
5. vrapi_LeaveVrMode()        |  |  |
6. VrActivity::onPause() -----+  |  |
7. VrActivity::onStop() ---------+  |
8. VrActivity::onDestroy() ---------+

Android Surface lifecycle

An Android Activity can only be in VR mode while there is a valid Android Surface. The following shows how VR mode fits into the Android Surface lifecycle.

1. VrActivity::surfaceCreated() <----+
2. VrActivity::surfaceChanged()      |
3. vrapi_EnterVrMode()               |
4. vrapi_LeaveVrMode()               |
5. VrActivity::surfaceDestroyed() ---+

Note that the lifecycle of a surface is not necessarily tightly coupled with the lifecycle of an activity. These two lifecycles may interleave in complex ways. Usually surfaceCreated() is called after onResume() and surfaceDestroyed() is called between onPause() and onDestroy(). However, this is not guaranteed and, for instance, surfaceDestroyed() may be called after onDestroy() or even before onPause(). An Android Activity is only in the resumed state with a valid Android Surface between surfaceChanged() or onResume(), whichever comes last, and surfaceDestroyed() or onPause(), whichever comes first. In other words, a VR application will typically enter VR mode from surfaceChanged() or onResume(), whichever comes last, and leave VR mode from surfaceDestroyed() or onPause(), whichever comes first.


Android VR lifecycle

This is a high-level overview of the rendering pipeline used by VrApi. For more information, see VrApi\Include\VrApi.h.

1. Initialize the API.
2. Create an EGLContext for the application.
3. Get the suggested resolution for creating the eye texture swap chains with vrapi_GetSystemPropertyInt( &java, VRAPI_SYS_PROP_SUGGESTED_EYE_TEXTURE_WIDTH ).
4. Allocate a texture swap chain for each eye with the application's EGLContext current.
5. Get the suggested FOV to set up a projection matrix.
6. Set up a projection matrix based on the suggested FOV. Note that this is an infinite projection matrix for the best precision.
7. Android Activity/Surface lifecycle loop.
   a. Acquire the ANativeWindow from the Android Surface.
   b. Enter VR mode once the Android Activity is in the resumed state with a valid ANativeWindow.
   c. Frame loop, possibly running on another thread.
   d. Get the HMD pose, predicted for the middle of the time period during which the new eye images will be displayed. The number of frames predicted ahead depends on the pipeline depth of the engine and the synthesis rate. The better the prediction, the less black will be pulled in at the edges.
   e. Apply the head-on-a-stick model if there is no positional tracking.
   f. Advance the simulation based on the predicted display time.
   g. Render eye images and set up ovrFrameParms using 'ovrTracking'.
   h. Render to 'textureId' using the 'eyeViewMatrix' and 'eyeProjectionMatrix'. Insert a 'fence' using eglCreateSyncKHR.
   i. Hand over the eye images to the time warp.
   j. Must leave VR mode when the Android Activity is paused or the Android Surface is destroyed or changed.
   k. Destroy the texture swap chains. Make sure to delete the swap chains before the application's EGLContext is destroyed.
8. Shut down the API.

Integration

The API is designed to work with an Android Activity using a plain Android SurfaceView, where the Activity lifecycle and the Surface lifecycle are managed completely in native code by sending the lifecycle events (onResume, onPause, surfaceChanged, etc.) to native code.

The API does not work with an Android Activity using a GLSurfaceView. The GLSurfaceView class manages the window surface and EGLSurface, and the implementation of GLSurfaceView may unbind the EGLSurface before onPause() gets called. As such, there is no way to leave VR mode before the EGLSurface disappears. Another problem with GLSurfaceView is that it creates the EGLContext using eglChooseConfig(). The Android EGL code pushes in multisample flags in eglChooseConfig() if the user has selected the "force 4x MSAA" option in settings. Using a multisampled front buffer is completely wasted for TimeWarp rendering.

Alternately, an Android NativeActivity can be used to avoid manually handling all the lifecycle events. However, it is important to select the EGLConfig manually without using eglChooseConfig() to make sure the front buffer is not multisampled.

The vrapi_GetSystemProperty* functions can be called at any time from any thread. This allows an application to set up its renderer, possibly running on a separate thread, before entering VR mode.
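To make the EGLConfig guidance above concrete, the following is a minimal sketch (not SDK code) of selecting a config by hand with eglGetConfigs()/eglGetConfigAttrib() instead of eglChooseConfig(), skipping multisampled configs and requiring pbuffer support. The required renderable type and attribute checks are illustrative assumptions; adjust them to your engine's needs.

#include <EGL/egl.h>
#include <EGL/eglext.h>

// Manually select a non-multisampled EGLConfig that supports both window and
// pbuffer surfaces, instead of relying on eglChooseConfig().
static EGLConfig ChooseEglConfig( EGLDisplay display )
{
    EGLint numConfigs = 0;
    EGLConfig configs[1024];
    eglGetConfigs( display, configs, 1024, &numConfigs );

    for ( EGLint i = 0; i < numConfigs; i++ )
    {
        EGLint value = 0;

        // Require OpenGL ES 3.0 support (assumed requirement for this example).
        eglGetConfigAttrib( display, configs[i], EGL_RENDERABLE_TYPE, &value );
        if ( ( value & EGL_OPENGL_ES3_BIT_KHR ) != EGL_OPENGL_ES3_BIT_KHR )
        {
            continue;
        }

        // The config must support window and pbuffer surfaces: after TimeWarp takes
        // over the window surface, the calling context ends up current on a pbuffer.
        eglGetConfigAttrib( display, configs[i], EGL_SURFACE_TYPE, &value );
        if ( ( value & ( EGL_WINDOW_BIT | EGL_PBUFFER_BIT ) ) != ( EGL_WINDOW_BIT | EGL_PBUFFER_BIT ) )
        {
            continue;
        }

        // Reject multisampled configs; a multisampled front buffer is wasted for TimeWarp.
        eglGetConfigAttrib( display, configs[i], EGL_SAMPLES, &value );
        if ( value != 0 )
        {
            continue;
        }

        return configs[i];
    }
    return NULL;
}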


On Android, an application cannot just allocate a new window/frontbuffer and render to it. Android allocates and manages the window/frontbuffer and (after the fact) notifies the application of the state of affairs through lifecycle events (surfaceCreated / surfaceChanged / surfaceDestroyed). The application (or third-party engine) typically handles these events. Since the VrApi cannot just allocate a new window/frontbuffer, and the VrApi does not handle the lifecycle events, the VrApi somehow has to take over ownership of the Android surface from the application.

To allow this, the application can explicitly pass the EGLDisplay, EGLContext, EGLSurface or ANativeWindow to vrapi_EnterVrMode(), where the EGLSurface is the surface created from the ANativeWindow. The EGLContext is used to match the version and config for the context used by the background time warp thread. This EGLContext, and no other context, can be current on the EGLSurface.

If, however, the application does not explicitly pass in these objects, then vrapi_EnterVrMode() must be called from a thread with an OpenGL ES context current on the Android window surface. The context of the calling thread is then used to match the version and config for the context used by the background TimeWarp thread. The TimeWarp will also hijack the Android window surface from the context that is current on the calling thread. On return, the context from the calling thread will be current on an invisible pbuffer, because the time warp takes ownership of the Android window surface. Note that this requires the config used by the calling thread to have an EGL_SURFACE_TYPE with EGL_PBUFFER_BIT.

Before getting sensor input, the application also needs to know when the images that are going to be synthesized will be displayed, because the sensor input needs to be predicted ahead for that time. As it turns out, it is not trivial to get an accurate predicted display time. Therefore the calculation of this predicted display time is part of the VrApi. An accurate predicted display time can only really be calculated once the rendering loop is up and running and submitting frames regularly. In other words, before getting sensor input, the application needs an accurate predicted display time, which in turn requires the renderer to be up and running. As such, it makes sense that sensor input is not available until vrapi_EnterVrMode() has been called. However, once the application is in VR mode, it can call vrapi_GetPredictedDisplayTime() and vrapi_GetPredictedTracking() at any time from any thread.

The VrApi allows for one frame of overlap, which is essential on tiled mobile GPUs. Because there is one frame of overlap, the eye images have typically not completed rendering by the time they are submitted to vrapi_SubmitFrame(). To allow the time warp to check whether the eye images have completed rendering, the application can explicitly pass in a sync object (CompletionFence) for each eye image through vrapi_SubmitFrame(). Note that these sync objects must be EGLSyncKHR because the VrApi still supports OpenGL ES 2.0.

If, however, the application does not explicitly pass in sync objects, then vrapi_SubmitFrame() must be called from the thread with the OpenGL ES context that was used for rendering, which allows vrapi_SubmitFrame() to add a sync object to the current context and check if rendering has completed.
Note that even if no OpenGL ES objects are explicitly passed through the VrApi, vrapi_EnterVrMode() and vrapi_SubmitFrame() can still be called from different threads. vrapi_EnterVrMode() needs to be called from a thread with an OpenGL ES context that is current on the Android window surface. This does not need to be the same context that is also used for rendering. vrapi_SubmitFrame() needs to be called from the thread with the OpenGL ES context that was used to render the eye images. If this is a different context than the context used to enter VR mode, then for stereoscopic rendering this context *never* needs to be current on the Android window surface.

Eye images are passed to vrapi_SubmitFrame() as "texture swap chains" (ovrTextureSwapChain). These texture swap chains are allocated through vrapi_CreateTextureSwapChain(). This is important to allow these textures to be allocated in special system memory. When using a static eye image, the texture swap chain does not need to be buffered and the chain only needs to hold a single texture. When the eye images are dynamically updated, the texture swap chain needs to be buffered. When the texture swap chain is passed to vrapi_SubmitFrame(), the application also passes in the chain index to the most recently updated texture.
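As a summary of the above, a heavily condensed sketch of the calls involved follows. It is illustrative only: it omits EGL setup, error handling, depth buffers, and thread management, and the texture swap chain creation arguments (texture type, format, level count, buffered flag) are assumptions to be verified against vrapi_CreateTextureSwapChain() in VrApi.h for this SDK version.

// Condensed, illustrative VrApi flow (not a complete application).
ovrJava java;   // fill in java.Vm, java.Env, and java.ActivityObject before use

// One color texture swap chain per eye; arguments are assumptions, see VrApi.h.
ovrTextureSwapChain * colorSwapChain[2];
const int width  = vrapi_GetSystemPropertyInt( &java, VRAPI_SYS_PROP_SUGGESTED_EYE_TEXTURE_WIDTH );
const int height = vrapi_GetSystemPropertyInt( &java, VRAPI_SYS_PROP_SUGGESTED_EYE_TEXTURE_HEIGHT );
for ( int eye = 0; eye < 2; eye++ )
{
    colorSwapChain[eye] = vrapi_CreateTextureSwapChain( VRAPI_TEXTURE_TYPE_2D, VRAPI_TEXTURE_FORMAT_8888,
                                                        width, height, 1, true );
}

// Enter VR mode once the activity is resumed and the Android Surface is valid.
// Here no EGL objects are passed explicitly, so this must be called from a thread
// with an OpenGL ES context current on the Android window surface.
ovrModeParms modeParms = vrapi_DefaultModeParms( &java );
ovrMobile * ovr = vrapi_EnterVrMode( &modeParms );

// Per-frame loop.
for ( long long frameIndex = 1; /* while resumed with a valid surface */ ; frameIndex++ )
{
    const double displayTime = vrapi_GetPredictedDisplayTime( ovr, frameIndex );
    const ovrTracking tracking = vrapi_GetPredictedTracking( ovr, displayTime );

    // ... render both eye images into the current swap chain textures here ...

    ovrFrameParms frameParms = vrapi_DefaultFrameParms( &java, VRAPI_FRAME_INIT_DEFAULT, displayTime, NULL );
    frameParms.FrameIndex = frameIndex;
    for ( int eye = 0; eye < VRAPI_FRAME_LAYER_EYE_MAX; eye++ )
    {
        frameParms.Layers[0].Textures[eye].ColorTextureSwapChain = colorSwapChain[eye];
        frameParms.Layers[0].Textures[eye].HeadPose = tracking.HeadPose;
    }
    vrapi_SubmitFrame( ovr, &frameParms );   // blocks until the previous eye images are consumed
}

vrapi_LeaveVrMode( ovr );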


Frame Timing

vrapi_SubmitFrame() controls the synthesis rate through an application-specified ovrFrameParms::MinimumVsyncs. vrapi_SubmitFrame() also controls at which point during a display refresh cycle the calling thread gets released. vrapi_SubmitFrame() only returns when the previous eye images have been consumed by the asynchronous time warp thread, and at least the specified minimum number of V-syncs have passed since the last call to vrapi_SubmitFrame().

The asynchronous time warp thread consumes new eye images and updates the V-sync counter halfway through a display refresh cycle. This is the first time the time warp can start updating the first eye, covering the first half of the display. As a result, vrapi_SubmitFrame() returns and releases the calling thread halfway through a display refresh cycle.

Once vrapi_SubmitFrame() returns, synthesis has a full display refresh cycle to generate new eye images up to the next halfway point. At the next halfway point, the time warp has half a display refresh cycle (up to V-sync) to update the first eye. The time warp then effectively waits for V-sync and then has another half a display refresh cycle (up to the next-next halfway point) to update the second eye. The asynchronous time warp uses a high-priority GPU context and will eat away cycles from synthesis, so synthesis does not have a full display refresh cycle worth of actual GPU cycles. However, the asynchronous time warp tends to be very fast, leaving most of the GPU time for synthesis.

Instead of just using the latest sensor sampling, synthesis uses predicted sensor input for the middle of the time period during which the new eye images will be displayed. This predicted time is calculated using vrapi_GetPredictedDisplayTime(). The number of frames predicted ahead depends on the pipeline depth, the extra latency mode, and the minimum number of V-syncs in between eye image rendering. Less than half a display refresh cycle before each eye image will be displayed, the asynchronous time warp will get new predicted sensor input using the very latest sensor sampling. The asynchronous time warp then corrects the eye images using this new sensor input. In other words, the asynchronous time warp will always correct the eye images even if the predicted sensor input for synthesis was not perfect. However, the better the prediction for synthesis, the less black will be pulled in at the edges by the asynchronous time warp.

The application can improve the prediction by fetching the latest predicted sensor input right before rendering each eye, and passing a possibly different sensor state for each eye to vrapi_SubmitFrame(). However, it is very important that both eyes use a sensor state that is predicted for the exact same display time, so both eyes can be displayed at the same time without causing intra-frame motion judder. While the predicted orientation can be updated for each eye, the position must remain the same for both eyes, or the position would seem to judder "backwards in time" if a frame is dropped.

Latency Controls

Ideally the eye images are only displayed for the MinimumVsyncs display refresh cycles that are centered about the eye image predicted display time. In other words, a set of eye images is first displayed at prediction time minus MinimumVsyncs / 2 display refresh cycles. The eye images should never be shown before this time because that can cause intra-frame motion judder. Ideally the eye images are also not shown after the prediction time plus MinimumVsyncs / 2 display refresh cycles, but this may happen if synthesis fails to produce new eye images in time.

In every configuration the per-frame sequence is the same: synthesis generates GPU commands on the CPU and executes them on the GPU; vrapi_SubmitFrame() must be called before the asynchronous time warp picks up new eye images halfway through a display refresh cycle, at which point it inserts a GPU fence, hands the eye images over to the asynchronous time warp, and releases the renderer thread; the asynchronous time warp checks the fence, uses the new eye images if rendering has completed, and projects the first and then the second eye image onto the display. When MinimumVsyncs is 2, the asynchronous time warp additionally re-projects the first and second eye images onto the display for the second display refresh cycle. The expected latency for each configuration is:

• MinimumVsyncs = 1, ExtraLatencyMode = off: expected single-threaded simulation latency of 33 milliseconds; the asynchronous time warp brings this down to 8-16 milliseconds.
• MinimumVsyncs = 1, ExtraLatencyMode = on: expected single-threaded simulation latency of 49 milliseconds; the asynchronous time warp brings this down to 8-16 milliseconds.
• MinimumVsyncs = 2, ExtraLatencyMode = off: expected single-threaded simulation latency of 58 milliseconds; the asynchronous time warp brings this down to 8-16 milliseconds.
• MinimumVsyncs = 2, ExtraLatencyMode = on: expected single-threaded simulation latency of 91 milliseconds; the asynchronous time warp brings this down to 8-16 milliseconds.

Asynchronous TimeWarp

TimeWarp transforms stereoscopic images based on the latest head-tracking information to significantly reduce the motion-to-photon delay, reducing latency and judder in VR applications.

Overview

In a basic VR game loop, the following occurs:

1. The software requests your head orientation.
2. The CPU processes the scene for each eye.
3. The GPU renders the scenes.
4. The Oculus Compositor applies distortion and displays the scenes on the headset.

The following shows a basic example of a game loop:

Figure 1: Basic Game Loop


When frame rate is maintained, the experience feels real and is enjoyable. When a frame is not rendered in time, the previous frame is shown, which can be disorienting. The following graphic shows an example of judder during the basic game loop:

Figure 2: Basic Game Loop with Judder

When you move your head and the world doesn't keep up, this can be jarring and break immersion. ATW is a technique that shifts the rendered image slightly to adjust for changes in head movement. Although the image is modified, your head does not move much, so the change is slight. Additionally, to smooth issues with the user's computer, game design, or the operating system, ATW can help fix "potholes" or moments when the frame rate unexpectedly drops. The following graphic shows an example of frame drops when ATW is applied:

Figure 3: Game Loop with ATW

At the refresh interval, the Compositor applies TimeWarp to the last rendered frame. As a result, a TimeWarped frame will always be shown to the user, regardless of frame rate. If the frame rate is very bad, flicker will be noticeable at the periphery of the display, but the image will still be stable.

ATW is automatically applied by the Oculus Compositor; you do not need to enable or tune it. However, although ATW reduces latency, make sure that your application or experience maintains frame rate.

Discussion

Stereoscopic eye views are rendered to textures, which are then warped onto the display to correct for the distortion caused by wide angle lenses in the headset.

To reduce the motion-to-photon delay, updated orientation information is retrieved for the headset just before drawing the time warp, and a transformation matrix is calculated that warps eye textures from where they were at the time they were rendered to where they should be at the time they are displayed. Many people are skeptical on first hearing about this, but for attitude changes, the warped pixels are almost exactly correct. A sharp rotation will leave some pixels black at the edges, but this turns out to be minimally distracting.

The time warp is taken a step farther by making it an "interpolated time warp." Because the video is scanned out at a rate of about 120 scan lines a millisecond, scan lines farther to the right have a greater latency than lines to the left. On a sluggish LCD this doesn't really matter, but on a crisp switching OLED, users may feel like the world is subtly stretching or shearing when they turn quickly. This is corrected by predicting the head attitude at the beginning of each eye, a prediction of < 8 milliseconds, and the end of each eye, < 16 milliseconds. These predictions are used to calculate time warp transformations, and the warp is interpolated between these two values for each scan line drawn.

The time warp may be implemented on the GPU by rendering a full screen quad with a fragment program that calculates warped texture coordinates to sample the eye textures. However, for improved performance the time warp renders a uniformly tessellated grid of triangles over the whole screen where the texture coordinates are


set up to sample the eye textures. Rendering a grid of triangles with warped texture coordinates basically results in a piecewise linear approximation of the time warp.

If the time warp runs asynchronously to the stereoscopic rendering, then it may also be used to increase the perceived frame rate and to smooth out inconsistent frame rates. By default, the time warp currently runs asynchronously for both native and Unity applications.

TimeWarp Minimum Vsyncs

The TimeWarp MinimumVsyncs parameter defaults to 1 for a 60 FPS target. Setting it to 2 will reduce the maximum application frame rate to no more than 30 FPS. The asynchronous TimeWarp thread will continue to render new frames with updated head tracking at 60 FPS, but the application will only have an opportunity to generate 30 new stereo pairs of eye buffers per second. You can set higher values for experimental purposes, but the only sane values for shipping apps are 1 and 2.

There are two cases where you might consider explicitly setting this:

• If your application can't hold 60 FPS most of the time, it might be better to clamp at 30 FPS all the time, rather than have the app smoothness or behavior change unpredictably for the user. In most cases, we believe that simplifying the experience to hold 60 FPS is the correct decision, but there may be exceptions.
• Rendering at 30 application FPS will save a significant amount of power and reduce the thermal load on the device. Some applications may be able to hit 60 FPS, but run into thermal problems quickly, which can have catastrophic performance implications -- it may be necessary to target 30 FPS if you want to be able to play for extended periods of time. See Power Management for more information regarding thermal throttle mitigation strategies.
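For illustration, a minimal native sketch of setting this parameter follows. It assumes the application already fills in an ovrFrameParms structure each frame (as in the Frame() examples later in this guide); MinimumVsyncs is the ovrFrameParms field named above.

// Cap the application frame rate at 30 FPS on a 60 Hz display.
// FrameParms is the ovrFrameParms the application builds each frame,
// e.g. starting from vrapi_DefaultFrameParms().
FrameParms.MinimumVsyncs = 2;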

Consequences of not rendering at 60 FPS

These consequences apply whether you have explicitly set MinimumVsyncs or your app is simply going that slow by itself.

If the viewpoint is far away from all geometry, nothing is animating, and the rate of head rotation is low, there will be no visual difference. When any of these conditions are not present, there will be greater or lesser artifacts to balance.

If the head rotation rate is high, black at the edges of the screen will be visibly pulled in by a variable amount depending on how long it has been since an eye buffer was submitted. This still happens at 60 FPS, but because the total time is small and constant from frame to frame, it is almost impossible to notice. At lower frame rates, you can see it snapping at the edges of the screen. There are two mitigations for this:

1. Instead of using either "now" or the time when the frame will start being displayed as the point where the head tracking model is queried, use a time that is at the midpoint of all the frames that the eye buffers will be shown on. This distributes the "unrendered area" on both sides of the screen, rather than piling it up on one side.
2. Coupled with that, increasing the field of view used for the eye buffers gives it more cushion off the edges to pull from. For native applications, we currently add 10 degrees to the FOV when the frame rate is below 60. If the resolution of the eye buffers is not increased, this effectively lowers the resolution in the center of the screen. There may be value in scaling the FOV dynamically based on the head rotation rates, but you would still see an initial pop at the edges, and changing the FOV continuously results in more visible edge artifacts when the view is mostly stable.

TimeWarp does not currently attempt to compensate for changes in position, only attitude. We don't have real position tracking in mobile yet, but we do use a head / neck model that provides some eye movement based on rotation, and apps that allow the user to navigate around explicitly move the eye origin. These values will


not change at all between eye updates, so at 30 eye FPS, TimeWarp would be smoothly updating attitude each frame, but movement would only change every other frame.

Walking straight ahead with nothing really close by works rather better than might be expected, but sidestepping next to a wall makes it fairly obvious. Even just moving your head when very close to objects makes the effect visible.

There is no magic solution for this. We do not have the performance headroom on mobile to have TimeWarp do a depth buffer informed reprojection, and doing so would create new visual artifacts in any case. There is a simplified approach that we may adopt that treats the entire scene as a single depth, but work on it is not currently scheduled. It is safe to say that if your application has a significant graphical element nearly stuck to the view, like an FPS weapon, it is not a candidate for 30 FPS.

Turning your viewpoint with a controller is among the most nauseating things you can do in VR, but some games still require it. When handled entirely by the app, this winds up being like a position change, so a low-frame-rate app would have smooth "rotation" when the user's head was moving, but chunky rotation when they use the controller. To address this, TimeWarp has an "ExternalVelocity" matrix parameter that can allow controller yaw to be smoothly extrapolated on every rendered frame. We do not currently have a Unity interface for this.

In-world animation will be noticeably chunkier at lower frame rates, but in-place animation doesn't wind up being very distracting. Objects on trajectories are more problematic, because they appear to stutter back and forth as they move when you track them with your head.

For many apps, monoscopic rendering may still be a better experience than 30 FPS rendering. The savings are not as large, but it is a clear tradeoff without as many variables.

If you go below 60 FPS, Unity apps may be better off without the multi-threaded renderer, which adds a frame of latency. 30 FPS with the GPU pipeline and multi-threaded renderer adds up to a lot of latency, and while TimeWarp will remove all of it for attitude, position changes, including the head model, will feel very lagged.

Note that this is all bleeding edge, and some of this guidance is speculative.

Power Management

Power management is a crucial consideration for mobile VR development. A current-generation mobile device is amazingly powerful for something that you can stick in your pocket - you can reasonably expect to find four 2.6 GHz CPU cores and a 600 MHz GPU. Fully utilized, they can deliver more performance than an Xbox 360 or PS3 in some cases.

A governor process on the device monitors an internal temperature sensor and tries to take corrective action when the temperature rises above certain levels to prevent malfunctioning or scalding surface temperatures. This corrective action consists of lowering clock rates. If you run hard into the limiter, the temperature will continue climbing even as clock rates are lowered, and CPU clocks may drop all the way down to 300 MHz. The device may even panic under extreme conditions. VR performance will catastrophically drop along the way.

The default clock rate for VR applications is 1.8 GHz on two cores, and 389 MHz on the GPU. If you consistently use most of this, you will eventually run into the thermal governor, even if you have no problem at first. A typical manifestation is poor app performance after ten minutes of good play. If you filter logcat output for "thermal" you will see various notifications of sensor readings and actions being taken. (For more on logcat, see Android Debugging: Logcat.)

A critical difference between mobile and PC/console development is that no optimization is ever wasted. Without power considerations, if you have the frame ready in time, it doesn't matter if you used 90% of


the available time or 10%. On mobile, every operation drains the battery and heats the device. Of course, optimization entails effort that comes at the expense of something else, but it is important to note the tradeoff.

Fixed Clock Level API

The Fixed Clock Level API allows the application to set a fixed CPU level and a fixed GPU level. On current devices, the CPU and GPU clock rates are completely fixed to the application-set values until the device temperature reaches the limit, at which point the CPU and GPU clocks will change to the power save levels. This change can be detected (see Power State Notification and Mitigation Strategy below). Some apps may continue operating in a degraded fashion, perhaps by changing to 30 FPS or monoscopic rendering. Other apps may display a warning screen saying that play cannot continue.

The fixed CPU level and fixed GPU level set by the Fixed Clock Level API are abstract quantities, not MHz / GHz, so some effort can be made to make them compatible with future devices. For current hardware, the levels can be 0, 1, 2, or 3 for CPU and GPU. 0 is the slowest and most power efficient; 3 is the fastest and hottest. Typically the difference between the 0 and 3 levels is about a factor of two.

Not all clock combinations are valid for all devices. For example, the highest GPU level may not be available for use with the two highest CPU levels. If an invalid combination is provided, the system will not acknowledge the request and clock settings will go into dynamic mode. VrApi will assert and issue a warning in this case.

Note: The combinations (2,3) and (3,3) are currently allowed for the initial release of the device. However, we strongly discourage their use, as they are likely to lead quickly to overheating. Settings -1, 4, and 5 are not allowed.
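For illustration, a minimal native sketch of requesting fixed levels follows. It assumes the per-frame performance parameters found in this SDK generation (an ovrPerformanceParms structure filled in by vrapi_DefaultPerformanceParms() and submitted as part of ovrFrameParms); these names are assumptions and should be verified against VrApi.h for your SDK version.

// Assumed API: request fixed CPU/GPU clock levels for this application.
// Verify ovrPerformanceParms / vrapi_DefaultPerformanceParms() in VrApi.h.
ovrPerformanceParms perfParms = vrapi_DefaultPerformanceParms();
perfParms.CpuLevel = 2;                    // abstract level 0..3, not MHz
perfParms.GpuLevel = 2;                    // abstract level 0..3, not MHz
FrameParms.PerformanceParms = perfParms;   // submitted with the frame via vrapi_SubmitFrame()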

Power Management and Performance

There are no magic settings in the SDK to fix power consumption - this is critical. The length of time your application will be able to run before running into the thermal limit depends on two factors: how much work your app is doing, and what the clock rates are. Changing the clock rates all the way down only yields about a 25% reduction in power consumption for the same amount of work, so most power saving has to come from doing less work in your app.

If your app can run at the (0,0) setting, it should never have thermal issues. This is still two cores at around 1 GHz and a 240 MHz GPU, so it is certainly possible to make sophisticated applications at that level, but Unity-based applications might be difficult to optimize for this setting.

There are effective tools for reducing the required GPU performance:

• Don't use chromatic aberration correction on TimeWarp.
• Don't use 4x MSAA.
• Reduce the eye target resolution.
• Using 16-bit color and depth buffers may help.
• It is probably never a good trade to go below 2x MSAA - you should reduce the eye target resolution instead.

These all entail quality tradeoffs which need to be balanced against steps you can take in your application:

• Reduce overdraw (especially blended particles) and complex shaders.
• Always make sure textures are compressed and mipmapped.

In general, CPU load seems to cause more thermal problems than GPU load. Reducing the required CPU performance is much less straightforward. Unity apps should always use the multithreaded renderer option, since two cores running at 1 GHz do work more efficiently than one core running at 2 GHz.


If you find that you just aren't close, then you may need to set MinimumVsyncs to 2 and run your game at 30 FPS, with TimeWarp generating the extra frames. Some things work out okay like this, but some interface styles and scene structures highlight the limitations. For more information on how to set MinimumVsyncs, see the TimeWarp technical note.

In summary, our general advice:

If you are making an app that will probably be used for long periods of time, like a movie player, pick very low levels. Ideally use (0,0), but it is possible to use more graphics if the CPUs are still mostly idle, perhaps up to (0,2). If you are okay with the app being restricted to ten-minute chunks of play, you can choose higher clock levels. If it doesn't work well at (2,2), you probably need to do some serious work.

With the clock rates fixed, observe the reported FPS and GPU times in logcat. The GPU time reported does not include the time spent resolving the rendering back to main memory from on-chip memory, so it is an underestimate. If the GPU times stay under 12 ms or so, you can probably reduce your GPU clock level. If the GPU times are low, but the frame rate isn't 60 FPS, you are CPU limited.

Always build optimized versions of the application for distribution. Even if a debug build performs well, it will draw more power and heat up the device more than a release build. Optimize until it runs well.

For more information on how to improve your Unity application's performance, see Best Practices: Mobile in our Unity documentation.

Power State Notification and Mitigation Strategy

The mobile SDK provides power level state detection and handling. Power level state refers to whether the device is operating at normal clock frequencies or if the device has risen above a thermal threshold and thermal throttling (power save mode) is taking place. In power save mode, CPU and GPU frequencies will be switched to power save levels. The power save levels are equivalent to setting the fixed CPU and GPU clock levels to (0, 0). If the temperature continues to rise, clock frequencies will be set to minimum values which are not capable of supporting VR applications.

Once we detect that thermal throttling is taking place, the app has the choice to either continue operating in a degraded fashion or to immediately exit to the Oculus Menu with a head-tracked error message.

In the first case, when the application first transitions from normal operation to power save mode, the following will occur:

• The Universal Menu will be brought up to display a dismissible warning message indicating that the device needs to cool down.
• Once the message is dismissed, the application will resume in 30Hz TimeWarp mode with correction for chromatic aberration disabled.
• If the device clock frequencies are throttled to minimum levels after continued use, a non-dismissible error message will be shown and the user will have to undock the device.

In this mode, the application may choose to take additional app-specific measures to reduce performance requirements. For Native applications, you may use the following AppInterface call to detect if power save mode is active: GetPowerSaveActive(). For Unity, you may use the following plugin call: OVR_IsPowerSaveActive(). See OVR/Script/Util/OVRModeParms.cs for further details.

In the second case, when the application transitions from normal operation to power save mode, the Universal Menu will be brought up to display a non-dismissible error message and the user will have to undock the device to continue. This mode is intended for applications which may not perform well at reduced levels even with 30Hz TimeWarp enabled.


You may use the following calls to enable or disable the power save mode strategy:

For Native applications, set settings.ModeParms.AllowPowerSave in VrAppInterface::Configure() to true for power save mode handling, or false to immediately show the head-tracked error message.

For Unity, you may enable or disable power save mode handling via OVR_VrModeParms_SetAllowPowerSave(). See OVR/Script/Util/OVRModeParms.cs for further details.
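For illustration, a minimal native sketch follows. It assumes an application class derived from VrAppInterface (as in the framework samples, with OvrApp as the sample class name) and only sets the flag named above.

// Opt in to degraded (30Hz TimeWarp) operation instead of the non-dismissible
// error message when thermal throttling begins.
void OvrApp::Configure( ovrSettings & settings )
{
    settings.ModeParms.AllowPowerSave = true;   // set to false to show the error message instead
}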

Advanced Rendering

This section describes advanced rendering features available through the SDK.

Multi-View

Overview

With stock OpenGL, stereo rendering is achieved by rendering to the two eye buffers sequentially. This typically doubles the application and driver overhead, despite the fact that the command streams and render states are almost identical.

The GL_OVR_multiview extension addresses the inefficiency of sequential multi-view rendering by adding a means to render to multiple elements of a 2D texture array simultaneously. Using the multi-view extension, draw calls are instanced into each corresponding element of the texture array. The vertex program uses a new ViewID variable to compute per-view values, typically the vertex position and view-dependent variables like reflection.

The formulation of the extension is high level in order to allow implementation freedom. On existing hardware, applications and drivers can realize the benefits of a single scene traversal, even if all GPU work is fully duplicated per-view. But future support could enable simultaneous rendering via multi-GPU, tile-based architectures could sort geometry into tiles for multiple views in a single pass, and the implementation could even choose to interleave at the fragment level for better texture cache utilization and more coherent fragment shader branching.

The most obvious use case in this model is to support two simultaneous views: one view for each eye. However, multi-view can also be used for foveated rendering, where two views are rendered per eye, one with a wide field of view and the other with a narrow one. The nature of wide field of view planar projection is that the sample density can become unacceptably low in the view direction. By rendering two inset eye views per eye, the required sample density is achieved in the center of projection without wasting samples, memory, and time by oversampling in the periphery.

Basic Usage

The GL_OVR_multiview extension is not a turn-key solution that can simply be enabled to support multi-view rendering in an application. An application must explicitly support GL_OVR_multiview to get the benefits.

The GL_OVR_multiview extension is used by the application to render the scene, and the VrApi is unaware of its use. The VrApi supports sampling from the layers of a texture array, but is otherwise completely unaware of how the application produced the texture data, whether multi-view rendering was used or not. However, because of various driver problems, an application is expected to query the VrApi to find out whether or not multi-view is properly supported on a particular combination of device and OS.
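To make the ViewID mechanism concrete, the sketch below shows the kind of vertex shader an application might use with this extension, embedded as a C string in the style of the framework samples. It is illustrative only, not SDK code, and the multi-view support query (VRAPI_SYS_PROP_MULTIVIEW_AVAILABLE) is named here as an assumption to be checked against VrApi.h.

// Illustrative only: a multi-view vertex shader embedded as a C string.
// gl_ViewID_OVR selects the per-view transform for each instanced view, and the
// matrix arrays live in a uniform buffer (see the driver note later in this section).
static const char * MultiViewVertexShaderSrc =
    "#version 300 es\n"
    "#extension GL_OVR_multiview2 : require\n"
    "layout( num_views = 2 ) in;\n"
    "uniform SceneMatrices\n"
    "{\n"
    "    uniform mat4 ViewMatrix[2];\n"
    "    uniform mat4 ProjectionMatrix[2];\n"
    "};\n"
    "uniform mat4 ModelMatrix;\n"
    "in vec3 vertexPosition;\n"
    "void main()\n"
    "{\n"
    "    gl_Position = ProjectionMatrix[gl_ViewID_OVR] *\n"
    "                  ( ViewMatrix[gl_ViewID_OVR] * ( ModelMatrix * vec4( vertexPosition, 1.0 ) ) );\n"
    "}\n";

// Assumed property name; verify against VrApi.h for this SDK version.
const bool multiViewAvailable =
    vrapi_GetSystemPropertyInt( &java, VRAPI_SYS_PROP_MULTIVIEW_AVAILABLE ) != 0;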


Restructuring VrAppFramework Rendering For Multi-view

The following section describes how to convert your application to be multi-view compliant based on the VrAppFramework multi-view setup.

In order to set up your rendering path to be multi-view compliant, your app should return a list of surfaces and render state back to the application framework from Frame(). Immediate GL calls inside the app main render pass are not compatible with multi-view rendering and are not allowed. The first section below describes how to transition your app from rendering with DrawEyeView to returning a list of surfaces back to the application framework. The section after that describes multi-view rendering considerations and how to enable it in your app.

Return Surfaces From Frame

Set up the Frame Result: Apps should set up the ovrFrameResult which is returned by Frame with the following steps:

1. Set up the ovrFrameParms - storage for which should be maintained by the application.
2. Set up the FrameMatrices - this includes the CenterEye and the View and Projection matrices for each eye.
3. Generate a list of render surfaces and append to the frame result Surfaces list.
   a. Note: The surface draw order will be the order of the list, from lowest index (0) to highest index.
   b. Note: Do not free any resources which surfaces in the list rely on while the Frame render is in flight.
4. Optionally, specify whether to clear the color or depth buffer with clear color.

OvrSceneView Example

An example using the OvrSceneView library scene matrices and surface generation follows:

ovrFrameResult OvrApp::Frame( const ovrFrameInput & vrFrame )
{
    ...
    // fill in the frameresult info for the frame.
    ovrFrameResult res;

    // Let scene construct the view and projection matrices needed for the frame.
    Scene.GetFrameMatrices( vrFrame.FovX, vrFrame.FovY, res.FrameMatrices );

    // Let scene generate the surface list for the frame.
    Scene.GenerateFrameSurfaceList( res.FrameMatrices, res.Surfaces );

    // Initialize the FrameParms.
    FrameParms = vrapi_DefaultFrameParms( app->GetJava(), VRAPI_FRAME_INIT_DEFAULT,
                                          vrapi_GetTimeInSeconds(), NULL );
    for ( int eye = 0; eye < VRAPI_FRAME_LAYER_EYE_MAX; eye++ )
    {
        FrameParms.Layers[0].Textures[eye].ColorTextureSwapChain = vrFrame.ColorTextureSwapChain[eye];
        FrameParms.Layers[0].Textures[eye].DepthTextureSwapChain = vrFrame.DepthTextureSwapChain[eye];
        FrameParms.Layers[0].Textures[eye].TextureSwapChainIndex = vrFrame.TextureSwapChainIndex;
        FrameParms.Layers[0].Textures[eye].TexCoordsFromTanAngles = vrFrame.TexCoordsFromTanAngles;
        FrameParms.Layers[0].Textures[eye].HeadPose = vrFrame.Tracking.HeadPose;
    }
    FrameParms.ExternalVelocity = Scene.GetExternalVelocity();
    FrameParms.Layers[0].Flags = VRAPI_FRAME_LAYER_FLAG_CHROMATIC_ABERRATION_CORRECTION;

    res.FrameParms = (ovrFrameParmsExtBase *) & FrameParms;
    return res;
}

Custom Rendering Example


First, you need to make sure any immediate GL render calls are represented by an ovrSurfaceDef. In the DrawEyeView path, custom surface rendering was typically done by issuing immediate GL calls:

glActiveTexture( GL_TEXTURE0 );
glBindTexture( GL_TEXTURE_2D, BackgroundTexId );
glDisable( GL_DEPTH_TEST );
glDisable( GL_CULL_FACE );
GlProgram & prog = BgTexProgram;
glUseProgram( prog.Program );
glUniform4f( prog.uColor, 1.0f, 1.0f, 1.0f, 1.0f );
globeGeometry.Draw();
glUseProgram( 0 );
glActiveTexture( GL_TEXTURE0 );
glBindTexture( GL_TEXTURE_2D, 0 );

Instead, with the multi-view compliant path, an ovrSurfaceDef and GlProgram would be defined at initialization time as follows:

static ovrProgramParm BgTexProgParms[] =
{
    { "Texm",         ovrProgramParmType::FLOAT_MATRIX4 },
    { "UniformColor", ovrProgramParmType::FLOAT_VECTOR4 },
    { "Texture0",     ovrProgramParmType::TEXTURE_SAMPLED },
};
BgTexProgram = GlProgram::Build( BgTexVertexShaderSrc, BgTexFragmentShaderSrc,
                                 BgTexProgParms, sizeof( BgTexProgParms ) / sizeof( ovrProgramParm ) );

GlobeSurfaceDef.surfaceName = "Globe";
GlobeSurfaceDef.geo = BuildGlobe();
GlobeSurfaceDef.graphicsCommand.Program = BgTexProgram;
GlobeSurfaceDef.graphicsCommand.GpuState.depthEnable = false;
GlobeSurfaceDef.graphicsCommand.GpuState.cullEnable = false;
GlobeSurfaceDef.graphicsCommand.UniformData[0].Data = &BackGroundTexture;
GlobeSurfaceDef.graphicsCommand.UniformData[1].Data = &GlobeProgramColor;

At Frame time, the uniform values can be updated, changes to the GpuState can be made, and the surface(s) added to the render surface list.

Note: This manner of uniform parm setting requires the application to maintain storage for the uniform data. Future SDKs will provide helper functions for setting up uniform parms and materials.

An example of setting up FrameResult using custom rendering follows:

ovrFrameResult OvrApp::Frame( const ovrFrameInput & vrFrame )
{
    ...
    // fill in the frameresult info for the frame.
    ovrFrameResult res;

    // calculate the scene matrices for the frame.
    res.FrameMatrices.CenterView = vrapi_GetCenterEyeViewMatrix( &app->GetHeadModelParms(), &vrFrame.Tracking, NULL );
    for ( int eye = 0; eye < VRAPI_FRAME_LAYER_EYE_MAX; eye++ )
    {
        res.FrameMatrices.EyeView[eye] = vrapi_GetEyeViewMatrix( &app->GetHeadModelParms(), &CenterEyeViewMatrix, eye );
        res.FrameMatrices.EyeProjection[eye] = ovrMatrix4f_CreateProjectionFov( vrFrame.FovX, vrFrame.FovY, 0.0f, 0.0f, 1.0f, 0.0f );
    }

    // Update uniform variables and add needed surfaces to the surface list.
    BackGroundTexture = GlTexture( BackgroundTexId, 0, 0 );
    GlobeProgramColor = Vector4f( 1.0f, 1.0f, 1.0f, 1.0f );
    res.Surfaces.PushBack( ovrDrawSurface( &GlobeSurfaceDef ) );

    // Initialize the FrameParms.
    FrameParms = vrapi_DefaultFrameParms( app->GetJava(), VRAPI_FRAME_INIT_DEFAULT,
                                          vrapi_GetTimeInSeconds(), NULL );
    for ( int eye = 0; eye < VRAPI_FRAME_LAYER_EYE_MAX; eye++ )
    {
        FrameParms.Layers[0].Textures[eye].ColorTextureSwapChain = vrFrame.ColorTextureSwapChain[eye];
        FrameParms.Layers[0].Textures[eye].DepthTextureSwapChain = vrFrame.DepthTextureSwapChain[eye];
        FrameParms.Layers[0].Textures[eye].TextureSwapChainIndex = vrFrame.TextureSwapChainIndex;
        FrameParms.Layers[0].Textures[eye].TexCoordsFromTanAngles = vrFrame.TexCoordsFromTanAngles;
        FrameParms.Layers[0].Textures[eye].HeadPose = vrFrame.Tracking.HeadPose;
    }
    FrameParms.ExternalVelocity = Scene.GetExternalVelocity();
    FrameParms.Layers[0].Flags = VRAPI_FRAME_LAYER_FLAG_CHROMATIC_ABERRATION_CORRECTION;

    res.FrameParms = (ovrFrameParmsExtBase *) & FrameParms;
    return res;
}

Specify the Render Mode: In your app Configure(), specify the appropriate render mode. To configure the app to render using the surfaces returned by Frame, set the following:

settings.RenderMode = RENDERMODE_STEREO;

Note: A debug render mode option has been provided which alternates rendering between the deprecated DrawEyeView path and the new path for returning a render surface list from Frame. This aids the conversion process by making it easier to spot discrepancies between the two paths.

settings.RenderMode = RENDERMODE_DEBUG_ALTERNATE_STEREO;

Multi-view Render Path

Before enabling the multi-view rendering path, you will want to make sure your render data is multi-view compatible. This involves the following considerations.

Position Calculation

App render programs should no longer specify Mvpm directly and should instead calculate gl_Position using the system-provided TransformVertex() function, which accounts for the correct view and projection matrix for the current viewID.

Per-Eye View Calculations

Apps will need to take per-eye-view calculations into consideration. Examples follow.

Per-Eye Texture Matrices: In the DrawEyeView path, the texture matrix for the specific eye was bound at the start of each eye render pass. For multi-view, an array of texture matrices indexed by VIEW_ID should be used.

Note: Due to a driver issue with the Adreno 420, KitKat, and version 300 programs, a uniform array of matrices should be contained inside a uniform buffer object.

Stereo Images: In the DrawEyeView path, the image specific to the eye was bound at the start of each eye render pass. For multi-view, while an array of textures indexed by VIEW_ID would be preferable, Android KitKat does not support the use of texture arrays. Instead, both textures can be specified in the fragment shader and the selection determined by the VIEW_ID.

External Image Usage


Applications which make use of image_external (i.e., video rendering applications) must take care when constructing image_external shader programs. Not all drivers support image_external as version 300. The good news is that drivers which fully support multi-view will support image_external in the version 300 path, which means image_external programs will work correctly when the multi-view path is enabled. However, for drivers which do not fully support multi-view, these shaders will be compiled as version 100. These shaders must continue to work in both paths, i.e., version-300-only constructs should not be used, and the additional extension specification requirements listed above should be met. In some cases, the cleanest solution may be to use image_external only during Frame to copy the contents of the external image to a regular texture2d, which is then used in the main app render pass (although this could eat into the multi-view performance savings).

Enable Multi-view Rendering

Finally, to enable the multi-view rendering path, set the render mode in your app's Configure() to the following:

settings.RenderMode = RENDERMODE_MULTIVIEW;


Native Application Framework

The VrAppFramework provides a wrapper around the Android Activity that manages the Android lifecycle. We have provided the simple sample scene VrCubeWorld_Framework to illustrate using the application framework with GuiSys, OVR_Locale, and SoundEffectContext. Please see Native Samples on page 29 for details.

Creating New Apps with the Framework Template

This section will get you started with creating new native applications using VrAppFramework.

Template Project Using the Application Framework

VrTemplate is the best starting place for creating your own mobile app using VrAppFramework. The VrTemplate project is set up for exploratory work and as a model for setting up similar native applications using the Application Framework. We include the Python script make_new_project.py (for Mac OS X) and make_new_project.py.bat (with wrapper for Windows) to simplify renaming the project name set in the template.

Usage Example: Windows

To create your own mobile app based on VrTemplate, perform the following steps:

1. Run \VrSamples\Native\VrTemplate\make_new_project.py.bat, passing the name of your new app and your company as arguments. For example: make_new_project.py.bat VrTestApp YourCompanyName
2. Your new project will now be located in \VrSamples\Native\VrTestApp. The packageName will be set to com.YourCompanyName.VrTestApp.
3. Copy your oculussig file to the Project/assets/ folder of your project. (See Application Signing for more information.)
4. Navigate to your new project directory. With your Android device connected, execute the build.bat located inside your test app directory to verify everything is working.
5. build.bat should build your code, install it to the device, and launch your app on the device. One parameter controls the build type:
• clean - cleans the project's build files
• debug - builds a debug application
• release - builds a release application
• -n - skips installation of the application to the phone

The Java file VrSamples\Native\VrTemplate\Projects\Android\src\oculus\MainActivity.java handles loading the native library that was linked against VrApi, then calls nativeSetAppInterface() to allow the C++ code to register the subclass of VrAppInterface that defines the application. See VrAppFramework/Include/App.h for comments on the interface.

The standard Oculus convenience classes for string, vector, matrix, array, et cetera are available in the Oculus LibOVRKernel, located at LibOVRKernel\Src\. You will also find convenience code for OpenGL ES texture, program, geometry, and scene processing that is used by the demos.
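As a quick illustration of steps 4 and 5 above, the commands below build and install the example VrTestApp created earlier; the directory and app name follow that example, so substitute your own project name:

cd \VrSamples\Native\VrTestApp
build.bat release

Passing debug instead of release produces a debug build, and adding -n skips installation to the phone, as described in the parameter list above.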


Integration with Third-Party Engines

It is generally easy to pull code into the VrTemplate project, but some knowledge of the various VR-related systems is necessary to integrate them directly with your own engine. The file VrApi/Include/VrApi.h provides the minimum API for VR applications.

UI and Input Handling

This guide describes resources for handling UI and input for native apps using VrAppFramework.

Native User Interface

Applications using the application framework have access to the VrGUI interface code. The VrGUI system is contained in VrAppSupport/VrGui/Src. Menus are represented as a generic hierarchy of menu objects. Each object has a local transform relative to its parent and local bounds. VrGUI may be used to implement menus in a native application, such as the Folder Browser control used in the Oculus 360 Photos SDK and Oculus 360 Videos SDK (found in FolderBrowser.cpp).

VrGUI allows for functional improvements by implementing a component model. When a menu object is created, any number of components can be specified. These components derive from a common base class, defined in VRMenuComponent.h, that handles events sent from the parent VRMenu. Components can be written to handle any event listed in VRMenuEvent.h. Events are propagated to child objects either by broadcasting to all children or by dispatching them along the path from the menu's root to the currently focused object. When handling an event, a component can consume it by returning MSG_STATUS_CONSUMED, which will halt further propagation of that event instance. See VRMenuEventHandler.cpp for implementation details. Examples of reusable components can be found in the native UI code, including DefaultComponent.h, ActionComponents.h and ScrollBarComponent.h.

Input Handling

Input to the application is intercepted in the Java code in VrActivity.java in the dispatchKeyEvent() method. If the event is NOT of type ACTION_DOWN or ACTION_UP, the event is passed to the default dispatchKeyEvent() handler. If this is a volume up or down action, it is handled in Java code. Otherwise the key is passed to the buttonEvent() method, which passes the event to nativeKeyEvent(). nativeKeyEvent() posts the message to an event queue, which is then handled by the AppLocal::Command() method.

In previous versions of the SDK, AppLocal::Command() called AppLocal::KeyEvent() with the event parameters. However, in 0.6.0 this was changed to buffer the events into the InputEvents array. Up to 16 events can be buffered this way, per frame. Each time the VrThreadFunction executes a frame loop, these buffered events are passed into VrFrameBuilder::AdvanceVrFrame(), which composes the VrFrame structure for that frame. The InputEvents array is then cleared. After composition, the current VrFrame is passed to FrameworkInputProcessing(), which iterates over the input event list and passes individual events to the native application via VrAppInterface::OnKeyEvent().

Native applications using the VrAppFramework should overload this method in their implementation of VrAppInterface to receive key events. The application is responsible for sending events to the GUI or other systems from its overloaded VrAppInterface::OnKeyEvent(), and returning true if the event is consumed at any point. If OnKeyEvent() returns true, VrAppFramework assumes the application consumed the event and will not act upon it.

If the application passes an input event to the VrGUI System, any system menus or menus created by the application have a chance to consume it in their OnEvent_Impl implementation by returning MSG_STATUS_CONSUMED, or pass it to other menus or systems by returning MSG_STATUS_ALIVE.
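A minimal sketch of such an overload follows. The exact OnKeyEvent() parameter list is declared in VrAppFramework/Include/App.h, and the GuiSys member and its forwarding call shown here are assumptions made for illustration only; this is not the framework's verbatim interface.

class MyApp : public OVR::VrAppInterface
{
public:
    // Assumed parameter list -- check App.h for the actual declaration.
    virtual bool OnKeyEvent( const int keyCode, const int repeatCount, const OVR::KeyEventType eventType )
    {
        // Give the GUI (or any other system) a chance to consume the event first.
        if ( GuiSys != NULL && GuiSys->OnKeyEvent( keyCode, repeatCount, eventType ) )
        {
            return true;    // consumed: the framework will not act on it
        }
        // Application-specific key handling goes here.
        return false;       // not consumed: default framework handling applies
    }

private:
    OVR::OvrGuiSys * GuiSys;    // hypothetical member for this sketch
};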

Native SoundEffectContext

Use SoundEffectContext to easily play sound effects and replace sound assets without recompilation.

SoundEffectContext consists of a simple sound asset management class, SoundAssetMapping, and a sound pool class, SoundPool. The SoundAssetMapping is controlled by a JSON file in which sounds are mapped as key-value pairs, where a value is the actual path to the .wav file. For example:

"sv_touch_active" : "sv_touch_active.wav"

In code, we use the key to play the sound, which SoundEffectContext then resolves to the actual asset. For example:

soundEffectContext->Play( "sv_touch_active" );

The string "sv_touch_active" is first passed to SoundEffectContext, which resolves it to an absolute path, as long as the corresponding key was found during initialization. The following two paths specify whether the sound file is in the res/raw folder of VrAppFramework (e.g., sounds that may be played from any app, such as default sounds or Universal Menu sounds), or the assets folder of a specific app:

"res/raw/sv_touch_active.wav"

or

"assets/sv_touch_active.wav"

If SoundEffectContext fails to resolve the passed-in string within the SoundEffectContext::Play function, the string is passed to SoundPooler.play in Java. In SoundPooler.play, we first try to play the passed-in sound from res/raw, and if that fails, from the current assets folder. If that also fails, we attempt to play it as an absolute path. The latter allows for sounds to be played from the phone's internal memory or SD card.

The JSON file loaded by SoundAssetMapping determines which assets are used with the following scheme:

1. Try to load sound_assets.json in the Oculus folder on the sdcard: sdcard/Oculus/sound_assets.json
2. If we fail to find the above file, we load the following two files in this order: res/raw/sound_assets.json, then assets/sound_assets.json

The loading of the sound_assets.json in the first case allows for a definition file and sound assets to be placed on the SD card in the Oculus folder during sound development. The sounds may be placed into folders if desired, as long as the relative path is included in the definition. For example, if we define the following in sdcard/Oculus/sound_assets.json:

"sv_touch_active" : "SoundDev/my_new_sound.wav"

we would replace all instances of that sound being played with our new sound within the SoundDev folder.
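Putting these pieces together, a complete sdcard/Oculus/sound_assets.json might look like the following sketch. The exact top-level layout of the file is not shown in this section, so treat the flat-object structure and the second key as assumptions; only the sv_touch_active mapping comes from the examples above.

{
    "sv_touch_active" : "SoundDev/my_new_sound.wav",
    "sv_touch_up" : ""
}

Here the second (hypothetical) entry illustrates disabling a sound by redefining its asset as the empty string, as described below.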


The loading of the two asset definition files in the second step allows for overriding the framework's built-in sound definitions, including disabling sounds by redefining their asset as the empty string. For example: "sv_touch_active" : ""

The above key-value pair, if defined in an app’s sound_assets.json (placed in its asset folder), will disable that sound completely, even though it is still played by other VrAppSupport code such as VrGUI. You will find the sound effect source code in VrAppSupport/VrSound/.

Runtime Threads

The UI thread is the launch thread that runs the normal Java code.

The VR Thread is spawned by the UI thread and is responsible for the initialization, the regular frame updates, and for drawing the eye buffers. All of the AppInterface functions are called on the VR thread. You should put any heavyweight simulation code in another thread, so this one basically just does drawing code and simple frame housekeeping. Currently this thread can be set to the real-time SCHED_FIFO mode to get more deterministic scheduling, but the time spent in this thread may have to be limited.

Non-trivial applications should create additional threads -- for example, music player apps run decoding and analysis in threads, app launchers load JPG image tiles in threads, et cetera. Whenever possible, do not block the VR thread on any other thread. It is better to have the VR thread at least update the view with new head tracking, even if the world simulation hasn't finished a time step.

The Talk To Java (TTJ) Thread is used by the VR thread to issue Java calls that aren't guaranteed to return almost immediately, such as playing sound pool sounds or rendering a toast dialog to a texture.

Sensors have their own thread so they can be updated at 500 Hz.


Other Native Libraries

This guide describes other native libraries included with the Mobile SDK. Additional supplemental libraries included with this SDK include:

• VrAppSupport
• LibOVRKernel

VrAppSupport

SystemUtils Library

All Gear VR application interactions with the System Activities application are managed through the SystemUtils library under VrAppSupport, which contains the System Activities API. All native applications must now link in this VrAppSupport/SystemUtils library by including the following line in the Android.mk file and adding the SystemUtils.aar (a jar is also provided) as a build dependency:

$(call import-module,VrAppSupport/SystemUtils/Projects/AndroidPrebuilt/jni)
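For orientation, a minimal Android.mk using this import might look like the sketch below. Only the import-module line is taken from this section; the module name used with LOCAL_STATIC_LIBRARIES and the other values are assumptions, so check the SystemUtils project's own Android.mk for the actual names.

LOCAL_PATH := $(call my-dir)

include $(CLEAR_VARS)
LOCAL_MODULE            := myvrapp               # hypothetical app module
LOCAL_SRC_FILES         := ../../../Src/MyApp.cpp
LOCAL_STATIC_LIBRARIES  += systemutils           # assumed module name for SystemUtils
include $(BUILD_SHARED_LIBRARY)

$(call import-module,VrAppSupport/SystemUtils/Projects/AndroidPrebuilt/jni)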

In order to utilize the SystemUtils library, applications must do the following:

• Call SystemActivities_Init() when the application initializes.
• Call SystemActivities_Update() to retrieve a list of pending System Activities events.
• The application may then handle any events in this list that it wishes to process. The application is expected to call SystemActivities_RemoveAppEvent() to remove any events that it handled.
• Call SystemActivities_PostUpdate() during the application's frame update.
• Call SystemActivities_Shutdown() when the application is being destroyed.

Gear VR applications may use this API to send a specially formatted launch intent when launching the System Activities application. All applications may start System Activities in the Universal Menu and Exit to Home menu by calling SystemActivities_StartSystemActivity() with the appropriate PUI_ command. See the SystemActivities.h header file in the VrAppSupport/SystemUtils/Include folder for more details, and for a list of available commands.

In theory, Gear VR applications may receive messages from System Activities at any time, but in practice these are only sent when System Activities returns to the app that launched it. These messages are handled via SystemActivities_Update() and SystemActivities_PostUpdate(). When Update is called, it returns an array of pending events. Applications may handle or consume events in this events array. If an application wishes to consume an event, it must be removed from the events array using SystemActivities_RemoveAppEvent(). This array is then passed through PostUpdate, where default handling is performed on any remaining events that have default behaviors, such as the reorient event.

This example sequence illustrates how System Activities typically interacts with an application:

1. The user long-presses the back button.
2. The application detects the long-press and launches System Activities with SystemActivities_StartSystemActivity and the PUI_GLOBAL_MENU command.
3. In the Universal Menu, the user selects Reorient.


4. System Activities sends a launch intent to the application that launched the Universal Menu.
5. System Activities sends a reorient message to the application that launched it.
6. The application receives the message and adds it to the System Activities event queue.
7. The application resumes.
8. The application calls SystemActivities_Update() to get a list of pending events.
9. If the application has special reorient logic, e.g., to re-align UI elements to be in front of the user after reorienting, it can call vrapi_RecenterPose(), reposition its UI, and then remove the event from the list of events.
10. The application calls SystemActivities_PostUpdate(). If the reorient event was not handled by the application, it is handled inside of PostUpdate.

In the case of an exitToHome message, the SystemUtils library always handles the message with a default action of calling finish() on the application's main Activity object, without passing it on through the Update/PostUpdate queue. This is done to allow the application to exit while still backgrounded.

The VrAppFramework library already handles System Activities events, so native applications using the VrAppFramework do not need to make any System Activities API calls. After VrAppFramework calls SystemActivities_Update(), it places a pointer to the event array in the VrFrame object for that frame. Applications using the framework can then handle those System Activities events in their frame loop, or ignore them entirely. Any handled events should be removed from the event array using SystemActivities_RemoveAppEvent() before the application returns from its VrAppInterface::Frame() method.

The Unity plugin (in Legacy 0.8+ and Utilities 0.1.3+) already includes the SystemUtils library and internally calls the System Activities API functions as appropriate. Unity applications cannot currently consume System Activities events, but can handle them by adding an event delegate using SetVrApiEventDelegate() on the OVRManager class.

VrGui Library

The VrGui library contains functionality for a fully 3D UI implemented as a scene graph. Each object in the scene graph can be functionally extended using components. This library has dependencies on VrAppFramework and must be used in conjunction with it. See the CinemaSDK, Oculus360PhotosSDK and Oculus360Videos SDKs for examples.

VrLocale Library

The VrLocale library is a wrapper for accessing localized string tables. The wrapper allows for custom string tables to be used seamlessly with Android's own string tables. This is useful when localized content is not embedded in the application package itself. This library depends on VrAppFramework.

VrModel Library

The VrModel library implements functions for loading 3D models in the .ovrScene format. These models can be exported from .fbx files using the FbxConverter utility (see FBXConverter). This library depends on rendering functionality included in VrAppFramework.

VrSound

The VrSound library implements a simple wrapper for playing sounds using the android.media.SoundPool class. This library does not provide low latency or 3D positional audio and is only suitable for playing simple UI sound effects.


LibOVRKernel

LibOVRKernel is a reduced set of libraries consisting primarily of low-level functions, containers, and mathematical operations. It is an integral part of VrAppFramework, but may generally be disregarded by developers using VrApi.


Media and Assets

This guide describes how to work with still images, videos, and other media for use with mobile VR applications.

Mobile VR Media Overview

Author all media, such as panoramas and movies, at the highest possible resolution and quality, so they can be resampled to different resolutions in the future. This topic entails many caveats and tradeoffs.

Panoramic Stills

Use 4096x2048 equirectangular projection panoramas for both games and 360 photos. 1024x1024 cube maps are used for games, and 1536x1536 cube maps are used for viewing in 360 Photos with the overlay code.

Panoramic Videos

The Qualcomm H.264 video decoder is spec driven by the ability to decode 4K video at 30 FPS, but it appears to have some headroom above that. The pixel rate can be flexibly divided between resolution and frame rate, so you can play a 3840x1920 video @ 30 FPS or a 2048x2048 video @ 60 FPS (for reference, 2048x2048 @ 60 FPS works out to roughly the same ~250 million pixels per second as 4K @ 30 FPS, and 3840x1920 @ 30 FPS is comfortably under that budget). The Android software layer appears to have an arbitrary limit of 2048 rows on video decode, so you cannot, for example, encode a 4096x4096 video @ 15 FPS. The compressed bit rate does not appear to affect the decoding rate; panoramic videos at 60 Mb/s decode without problems, but most scenes should be acceptable at 20 Mb/s or so.

The conservative specs for panoramic video are: 2880x1440 @ 60 FPS or 2048x2048 @ 60 FPS stereo. If the camera is stationary, 3840x1920 @ 30 FPS video may be considered.

Oculus 360 Video is implemented using equirectangular mapping to render panoramic videos. Top-bottom, bottom-top, left-right, and right-left stereoscopic video support is implemented using the following naming convention for videos:

"_TB.mp4" - Top / bottom stereoscopic panoramic video
"_BT.mp4" - Bottom / top stereoscopic panoramic video
"_LR.mp4" - Left / right stereoscopic panoramic video
"_RL.mp4" - Right / left stereoscopic panoramic video
Default - Non-stereoscopic video if width does not match height, otherwise loaded as top / bottom stereoscopic video


Movies on Screens

Comfortable viewing size for a screen is usually less than 70 degrees of horizontal field of view, which allows the full screen to be viewed without turning your head significantly.

For video playing on a surface in a virtual world using the recommended 1024x1024 eye buffers, anything over 720x480 DVD resolution is wasted, and if you don't explicitly build mipmaps for it, it will alias and look worse than a lower resolution video. With the TimeWarp overlay plane code running in Oculus Cinema on the 1440 devices, 1280x720 HD resolution is a decent choice. The precise optimum depends on seating position and may be a bit lower, but everyone understands 720P, so it is probably best to stick with that. Use more bit rate than a typical web stream at that resolution, as the pixels will be magnified so much. The optimal bit rate is content dependent, and many videos can get by with less, but 5 Mb/s should give good quality. 1080P movies play, but the additional resolution is wasted and power consumption is needlessly increased.

3D movies should be encoded "full side by side" with a 1:1 pixel aspect ratio. Content mastered at 1920x1080 compressed side-by-side 3D should be resampled to 1920x540 full side-by-side resolution.

Movie Meta-data

When loading a movie from the sdcard, Oculus Cinema looks for a sidecar file with metadata. The sidecar file is simply a UTF8 text file with the same filename as the movie, but with the extension .txt. It contains the title, format (2D/3D), and category.

{
    "title":    "Introducing Henry",
    "format":   "2D",
    "category": "trailers",
    "theater":  ""
}

Title is the name of the movie. Oculus Cinema will use this value instead of the filename to display the movie title.

Format describes how the film is formatted. If left blank, it will default to 2D (unless the movie has '3D' in its pathname). Format may be one of the following values:

2D - Full screen 2D movie
3D - 3D movie with left and right images formatted side-by-side
3DLR - 3D movie with left and right images formatted side-by-side
3DLRF - 3D movie with left and right images formatted side-by-side full screen (for movies that render too small in 3DLR)
3DTB - 3D movie with left and right images formatted top-and-bottom
3DTBF - 3D movie with left and right images formatted top-and-bottom full screen (for movies that render too small in 3DTB)

Category can be one of the following values:

Blank - Movie accessible from "My Videos" tab in Oculus Cinema
Trailers - Movie accessible from "Trailers" tab in Oculus Cinema
Multiscreen - Movie accessible from "Multiscreen" tab in Oculus Cinema

Oculus 360 Photos and Videos Meta-data

The retail version of 360 Photos stores all its media attribution information in a meta file that is packaged into the apk. This allows the categories to be dynamically created and/or altered without the need to modify the media contents directly. For the SDK 360 Photos, the meta file is generated automatically using the contents found in Oculus/360Photos.

The meta data has the following structure in a meta.json file which is read in from the assets folder:

{
    "Categories":[
        { "name" : "Category1" },
        { "name" : "Category2" }
    ],
    "Data":[
        {
            "title": "Media title",
            "author": "Media author",
            "url": "relative/path/to/media",
            "tags" : [
                { "category" : "Category2" }
            ]
        },
        {
            "title": "Media title 2",
            "author": "Media author 2",
            "url": "relative/path/to/media2",
            "tags" : [
                { "category" : "Category" },
                { "category" : "Category2" }
            ]
        }
    ]
}

For both the retail and sdk versions of 360 Videos, the meta data structure is not used and instead the categories are generated based on what’s read in from the media found in Oculus/360Videos.

Media Locations

The SDK comes with three applications for viewing stills and movies. The Oculus Cinema application can play both regular 2D movies and 3D movies. The Oculus 360 Photos application can display 360 degree panoramic stills and the Oculus 360 Videos application can display 360 degree panoramic videos. These applications have the ability to automatically search for media in specific folders on the device. Oculus 360 Photos uses metadata which contains the organization of the photos it loads in addition to allowing the data to be dynamically tagged and saved out to its persistent cache. This is how the "Favorite" feature works, allowing the user to mark photos as favorites without any change to the media storage itself. The following list indicates where to place additional media for these applications.

• 2D Movies - Folders: Movies\, DCIM\, Oculus\Movies\My Videos - Application: Oculus Cinema
• 3D Movies - Folders: Movies\3D, DCIM\3D, Oculus\Movies\My Videos\3D - Application: Oculus Cinema
• 360 degree panoramic stills - Folder: Oculus\360Photos (in the app assets\meta.json) - Application: Oculus 360 Photos
• 360 degree panoramic videos - Folder: Oculus\360Videos - Application: Oculus 360 Videos
• Movie Theater .ovrscene - Folder: Oculus\Cinema\Theaters - Application: Oculus Cinema

Native VR Media Applications

Oculus Cinema

Oculus Cinema uses the Android MediaPlayer class to play videos, both conventional (from /sdcard/Movies/ and /sdcard/DCIM/) and side by side 3D (from /sdcard/Movies/3D and /sdcard/DCIM/3D), in a virtual movie theater scene (from sdCard/Oculus/Cinema/Theaters). See Mobile VR Media Overview for more details on supported image and movie formats.

Before entering a theater, Oculus Cinema allows the user to select different movies and theaters. New theaters can be created in Autodesk 3DS Max, Maya, or Luxology MODO, then saved as one or more Autodesk FBX files, and converted using the FBX converter that is included with this SDK. See the FBX Converter guide for more details on how to create and convert new theaters.

The FBX converter is launched with a variety of command-line options to compile these theaters into models that can be loaded in the Oculus Cinema application. To avoid having to re-type all the command-line options, it is common practice to use a batch file that launches the FBX converter with all the command-line options. This package includes two such batch files, one for each example theater:

• SourceAssets/scenes/cinema.bat
• SourceAssets/scenes/home_theater.bat

Each batch file will convert one of the FBX files with associated textures into a model which can be loaded by the Oculus Cinema application. Each batch file will also automatically push the converted FBX model to the device and launch the Oculus Cinema application with the theater.

The FBX file for a theater should include several specially named meshes. One of the meshes should be named screen. This mesh is the surface onto which the movies will be projected. Read the FBX converter documentation to learn more about tags. Up to 8 seats can be set up by creating up to 8 tags named cameraPosX, where X is in the range [1, 8].

A theater is typically one big mesh with two textures: one texture with baked static lighting for when the theater lights are on, and another texture that is modulated based on the movie when the theater lights are off. The lights are gradually turned on or off by blending between these two textures. To save battery, the theater is rendered with only one texture when the lights are completely on or completely off. The texture with baked static lighting is specified in the FBX as the diffuse color texture. The texture that is modulated based on the movie is specified as the emissive color texture.

The two textures are typically 4096 x 4096 with 4-bits/texel in ETC2 format. Using larger textures may not work on all devices. Using multiple smaller textures results in more draw calls and may not allow all geometry to be statically sorted to reduce the cost of overdraw. The theater geometry is statically sorted to guarantee front-to-back rendering on a per-triangle basis, which can significantly reduce the cost of overdraw. Read the FBX Converter guide to learn about optimizing the geometry for the best rendering performance.

In addition to the mesh and textures, Oculus Cinema currently requires a 350x280 icon for the theater selection menu. This is included in the scene with a command-line parameter since it is not referenced by any geometry, or it can be loaded as a .png file with the same filename as the ovrscene file.

Oculus 360 Photos

Oculus 360 Photos is a viewer for panoramic stills. The SDK version of the application presents a single category of panorama thumbnail panels which are loaded in from Oculus/360Photos on the SDK sdcard. Gazing towards the panels and then swiping forward or back on the Gear VR touchpad will scroll through the content. When viewing a panorama still, touch the Gear VR touchpad again to bring back up the panorama menu, which displays the attribution information if properly set up. Additionally, the top button or tapping the back button on the Gear VR touchpad will bring back the thumbnail view. The bottom button will tag the current panorama as a Favorite, which adds a new category at the top of the thumbnail views with the panorama you tagged. Pressing the Favorite button again will untag the photo and remove it from Favorites. Gamepad navigation and selection is implemented via the left stick and d-pad used to navigate the menu; the single-dot button selects and the 2-dot button backs out of a menu. See Mobile VR Media Overview for details on creating custom attribution information for panoramas.

Oculus 360 Videos

Oculus 360 Videos works similarly to 360 Photos, as they share the same menu functionality. The application also presents a thumbnail panel of the movies read in from Oculus/360Videos which can be gaze selected to play. Touch the Gear VR touchpad to pause the movie and bring up a menu. The top button will stop the movie and bring up the movie selection menu. The bottom button restarts the movie. Gamepad navigation and selection is implemented via the left stick and d-pad used to navigate the menu; the single-dot button selects and the 2-dot button backs out of a menu. See Mobile VR Media Overview for details on the supported image and movie formats.

VrScene

By default VrScene loads the Tuscany scene from the Oculus demos, which can be navigated using a gamepad. However, VrScene accepts Android Intents to view different .ovrscene files, so it can also be used as a generic scene viewer during development. New scenes can be created in Autodesk 3DS Max, Maya, or Luxology MODO, then saved as one or more Autodesk FBX files, and converted using the FBX converter that is included with this SDK. See the FBX converter document for more details on creating and converting new FBX scenes.


Models

This section describes creating custom theater environments for Oculus Cinema and using the Oculus FBX Converter tool.

Oculus Cinema Theater Creation

This section describes how to create and compile a movie theater FBX for use with Oculus Cinema.

Figure 4: The cinema scene in MODO and its item names

Steps to Create a Theater:

1. Create a theater mesh.
2. Create a screen mesh.
3. Create two theater textures (one with the lights on and one with the lights off).
4. Create meshes that define the camera/view positions.
5. Create meshes that use an additive material (optional, for dim lights).
6. Run the FBX Converter tool.

Detailed Instructions

1. Create a theater mesh

• Nothing special here; just create a theater mesh. Here are some tips for optimizing it. As a rule, keep the polycount as low as physically possible. Rendering is a taxing operation, and savings in this area will benefit your entire project. As a point of reference, the "cinema" theater mesh has a polycount of 42,000 tris. Review Performance Guidelines for detailed information about per-scene polycount targets.


• Polycount optimization can be executed in a variety of ways. A good example is illustrated with the "cinema" mesh included in the SDK source files for Oculus Cinema. When you load the source mesh, you'll notice that all the chairs near player positions are high poly, while chairs in the distance are low poly. It is important to identify and fix any visual faceting caused by this optimization by placing cameras at defined seating positions in your modeling tool. If certain aspects of the scene look "low poly", add new vertices in targeted areas of those silhouettes to give the impression that everything in the scene is higher poly than it actually is. This process takes a bit of effort, but it is definitely a big win in terms of visual quality and performance. You may also consider deleting any polys that never face the user when a fixed camera position is utilized. Developing a script in your modeling package should help make quick work of this process. For our "cinema" example case, this process reduced the polycount by 40%.

Figure 5: This image shows the poly optimizations of chairs in the "cinema" mesh. [MODO]

• If you've optimized your mesh and still land above the recommended limits, you may be able to push more polys by cutting the mesh into 9 individual meshes arranged in a 3x3 grid. If you can only watch movies from the center grid cell, hopefully some of the meshes for the other grid cells will not draw if they're out of your current view. Mesh visibility is determined by a bounding box, and if you can see the bounding box of a mesh, then you're drawing every single one of its triangles. The figure below illustrates where cut points could be placed if the theater mesh requires this optimization. Note that the cut at the back of the theater is right behind the last camera position. This keeps triangles behind this cut point from drawing while the user is looking forward:

Figure 6: [MODO]

• Another optimization that will help rendering performance is polygon draw order. The best way to get your polygons all sorted is to move your theater geometry such that the average of all the camera positions is near 0,0,0. Then utilize the -sort origin command-line option of the FBX Converter when compiling this mesh (see Step 6 for more information).
• Material: The more materials you apply to your mesh, the more you slow down the renderer. Keep this number low. We apply one material per scene, and it's basically one 4k texture (well, it's actually two, because you have to have one for when the lights are on and one for when the lights are off; this is covered in Step 3 below).
• Textures: You'll need to add your two textures to the material that show the theater with the lights on and lights off. The way you do that is to add the texture with the lights on and set its mode to diffuse color (in MODO), and set the mode of the texture with the lights off to luminous color.

2. Create a screen mesh

• Mesh: Create a quad and have its UVs use the full 0-1 space, or it won't be drawing the whole movie screen.
• Mesh Name: Name the mesh "screen".
• Material: Apply a material to it called "screen". You may add a tiny 2x2 image called "screen.png" to the material, but it is not absolutely necessary.

3. Create two theater textures (one with the lights on and one with the lights off)

Create a CG scene in MODO that is lit as if the lights were turned on. This scene is then baked to a texture. Next, turn off all room lights in the MODO scene and add one area light coming from the screen. This scene is baked out as the second texture. In the cinema that is included with the SDK, the material is named "cinema". The lights-on image is named "cinema_a.png" and the lights-off image is named "cinema_b.png".

Figure 7: Two images for the cinema mesh, with the screen and additive image. [MODO]

4. Create meshes that define the camera/view positions

• A camera/view position for one of the possible seats in the theater is defined by creating a mesh that consists of a single triangle with a 90 degree corner. The vert at the 90 degree corner is used for the camera position. Name the mesh "cameraPos1". When you process the scene using the FBX Converter, use the -tag command-line parameter to specify which meshes are used to create the camera positions, and use the -remove parameter to remove them from the scene so that they're not drawn. For example, when converting cinema.fbx, we use -tag screen cameraPos1 cameraPos2 cameraPos3 -remove cameraPos1 cameraPos2 cameraPos3 on the command line to convert the camera position meshes to tags and remove them.
• Max number of seating positions: The current Cinema application supports up to 8 camera positions.

Figure 8: Three camera position meshes. [MODO]

5. Create meshes that use an additive material (optional, for when the lights are off)

• Different real-world movie theaters leave different lights on while movies are playing. They may dimly light stair lights, walkway lights, or wall lights. To recreate these lights in the virtual theater, bake them to some polys that are drawn additively.
• To make an additive material, simply create a material and append "_additive" to the material name, then add an image and assign it to luminous color.

6. Run the FBX Converter tool

Note: For more information on the FBX Converter tool, see FBX Converter.

• To avoid retyping all the FBX Converter command-line options, it is common practice to create a batch file that launches the FBX Converter. For the example cinema, the batch is placed in the folder above where the FBX file is located:

\OculusSDK\Mobile\Main\SourceAssets\scenes\cinema\cinema.fbx
\OculusSDK\Mobile\Main\SourceAssets\scenes\cinema.bat

• The cinema.bat batch file contains the following:

FbxConvertx64.exe -o cinema -pack -cinema -stripModoNumbers -rotate 90 -scale 0.01 -flipv -attrib position uv0 -sort origin -tag screen cameraPos1 cameraPos2 cameraPos3 -remove cameraPos1 cameraPos2 cameraPos3 -render cinema\cinema.fbx -raytrace screen -include cinema\icon.png

Note: This is a single line in the batch file that we've wrapped for the sake of clarity.

Here is an explanation of the different command-line options used to compile the cinema:

FbxConvertx64.exe - The FBX Converter executable.
-o cinema - The -o option specifies the name of the .ovrscene file.
-pack - Makes the FBX Converter automatically run the cinema_pack.bat batch file that packages everything into the .ovrscene file.
-cinema - The .ovrscene file is automatically loaded into Cinema instead of VrScene when the device is connected.
-stripModoNumbers - MODO often appends " {2}" to item names because it does not allow any duplicate names. This option strips those numbers.
-rotate 90 - Rotates the whole theater 90 degrees about Y because the cinema was built to look down +Z, while Cinema looks down +X.
-scale 0.01 - Scales the theater down by a factor of 100 (when MODO saves an FBX file, it converts meters to centimeters).
-flipv - Flips the textures vertically, because they are flipped when they are compressed.
-attrib position uv0 - Makes the FBX Converter remove all vertex attributes except for the 'position' and first texture coordinate.
-sort origin - Sorts all triangles front-to-back from 0,0,0 to improve rendering performance.
-tag screen cameraPos1 cameraPos2 cameraPos3 - Creates tags for the screen and view positions in the theater. These tags are used by Cinema.
-remove cameraPos1 cameraPos2 cameraPos3 - Keeps the view position meshes from being rendered.
-render cinema\cinema.fbx - Specifies the FBX file to compile.
-raytrace screen - Allows gaze selection on the theater screen.
-include cinema\icon.png - Includes an icon for the theater in the .ovrscene file.

• These are most of the options you'll need for the FbxConvert.exe. Additional options exist, but they are not typically needed to create a theater. For a complete list of options and more details on how to create and convert FBX files, see FBX Converter.
• An example of another command-line option is -discrete <mesh1> [<mesh2> ...]. Use this when you cut your mesh into chunks to reduce the number of polys being drawn when any of these meshes are offscreen. By default, the FBX Converter optimizes the geometry for rendering by merging the geometry into as few meshes as it can. If you cut the scene up into 9 meshes, the FBX Converter will merge them back together again unless you use the -discrete command-line option.

7. Copy your .ovrscene files to sdCard/Oculus/Cinema/Theaters


Theater Design Principles

Theaters are static scenes that we want users to spend dozens of hours in. It is therefore especially important to design them to very high standards. In this section, we offer some guidelines to assist with theater design.

Dimensions and Perspective

The viewpoint should be centered on the screen. The screen should be exactly 16:9 aspect ratio, exactly flat, and exactly axial. The screen in the SDK source files for Oculus Cinema covers 60 degrees horizontal FOV. It is more comfortable to have a larger screen that is farther away, because the eyes will have less focus/vergence mismatch. The sample Oculus Cinema screen is five meters from the viewpoint, which requires the screen to be six meters wide. Three meters away and 3.46 meters x 1.95 meters, or four meters and 4.62 x 2.60, may be more reasonable actual targets. The large screen would go nearly to the floor to remain centered on the viewpoint unless the couch area is raised up.

Do not over-optimize geometry for the viewpoint for any model that will also be used for PC with position tracking. For example, don't remove occluded back sides; just make them cruder and lower texture detail if you need to optimize.

Aliasing

All of the geometry in the scene should be designed to minimize edge aliasing. Aliasing will show up at every silhouette edge and at every texture seam. With a nearly constant viewpoint, it should be possible to place texture seams on the back sides of objects. Venetian blinds are almost the worst case for silhouette edges: many long edges. Surfaces that are very nearly coplanar with the viewpoint are even worse than silhouette edges, causing the entire surface to go from not-drawn to a scattering of pixels using the deepest mipmap as the user's head moves around. This can be worked around in the geometric design. You have a large enough triangle budget that there shouldn't be any starkly visible polygon edges. Keeping the use of complex curved surfaces to a minimum in the design is important.

Color and Texture

Don't use solid black in any environment source textures. OLED displays have technical problems with very deep blacks, so design rooms with fairly bright surfaces. The black leather chairs in the current home theater are problematic for this.

You may compress all textures with ETC2 or ASTC texture compression to conserve power; we recommend using ASTC compression. Make sure that no alpha channel is encoded.

If you place text on any surfaces, make it as close to parallel with the view plane as possible, and make it large enough that it is easy to read. It should either be headlines or a tiny blur; nothing in between at a barely-legible state.

Cut the texture sizes to only what is needed for the sized-to-view-position texture layouts; don't expand them to fit a 2x 4k x 4k budget, since additional resolution will be completely wasted. In theory, arbitrary texture sizes are fine, but we recommend staying on power of two boundaries. No need to remain square; 4k x 2k and so on are fine.

FBX Converter

A tool to convert FBX files into geometry for rendering, collision detection and gaze selection in virtual reality experiences.


Overview

The FBX Converter reads one or more Autodesk FBX files and creates a file with models stored in JSON format accompanied with a binary file with raw data. The JSON file created by the FBX Converter contains the following:

• A render model. This model has a list of textures, a list of joints, a list of tags, and a list of surfaces. Each surface has a material with a type and references to the textures used. The material textures may include a diffuse, specular, normal, emissive and reflection texture. Each surface also has a set of indexed vertices. The vertices may include the following attributes: position, normal, tangent, binormal, color, uv0, uv1, joint weights, joint indices. Two sets of UVs are supported, one set for a diffuse/normal/specular texture and a separate set for an optional emissive texture.
• A wall collision model. This model is used to prevent an avatar from walking through walls. This is a list of polytopes where each polytope is described by a set of planes. The polytopes are not necessarily bounded or convex.
• A ground collision model. This model determines the floor height of the avatar. This model is also no more than a list of polytopes.
• A ray trace model. This is a triangle soup with a Surface Area Heuristic (SAH) optimized KD-tree for fast ray tracing. Meshes or triangles will have to be annotated to allow interactive focus tracking or gaze selection.

Textures for the render model can be embedded in an FBX file and will be extracted by the FBX Converter. Embedded textures are extracted into a folder named <name>.fbm/, which is a sub-folder of the folder where the FBX file <name>.fbx is located. Instead of embedding textures, they can also simply be stored in the same folder as the FBX file. The following source texture formats are supported: BMP, TGA, PNG, JPG. For the best quality, a lossy compression format like JPG should be avoided.

The JSON file with models and the binary file are stored in a temporary folder named <name>_tmp/, which is a sub-folder of the folder where the FBX Converter is launched, where <name> is the output file name specified with the -o command-line option. The FBX Converter will also create a <name>_pack.bat batch file in the folder where the FBX Converter is launched.

This batch file is used to compress the render model textures to a platform specific compression format. A texture will be compressed to ETC2 (with or without alpha) for OpenGL ES mobile platforms and to S3TC (either DXT1/BC1 or DXT5/BC3) for the PC. The batch file uses file time stamps only to compress textures for which there is not already a compressed version that is newer than the source texture. The -clean command-line option may be used to force recompression of all textures.

The batch file will also copy the JSON model file, the binary file with raw data and the platform specific compressed textures into a folder named <name>_tmp/pack/, which is a sub-folder of the aforementioned <name>_tmp/ folder. 7-Zip is then used to zip up the 'pack' folder into a single package that can be loaded by the application. The -pack command-line option can be used to automatically execute the <name>_pack.bat batch file from the FBX Converter.

Coordinate System

The Oculus SDK uses the same coordinate system as the default coordinate system in 3D Studio Max or Luxology MODO. This is a right handed coordinate system with:

+X right
-X left
+Y up
-Y down
+Z backward
-Z forward


The Oculus SDK uses the metric system for measurements, where one unit is equal to one meter. 3D Studio Max and Luxology MODO do not use any specific unit of measure, but one unit in either application maps to one unit in the Oculus SDK. However, when the data from Luxology MODO is saved to an FBX file, all units are automatically multiplied by one hundred. In other words, the unit of measure in the FBX file ends up being centimeters. Therefore a scale of 1/100 is always specified on the FBX Converter command-line when converting FBX files from Luxology MODO: -scale 0.01

The FBX Converter supports several command-line options to transform the FBX geometry (translate, rotate, scale, et cetera). The transformations will be applied to the geometry in the order in which they are listed on the command-line.

Materials

Each render model surface stored in the JSON models file has a material. Such a material has a type and references to the textures used. The material textures may include a diffuse, specular, normal, emissive and reflection texture. These textures are retrieved from the FBX file as:

'DiffuseColor'
'NormalMap'
'SpecularColor'
'EmissiveColor'
'ReflectionColor'

Most modeling tools will map similarly named textures to the above textures in the FBX file. For instance, using Luxology MODO, the 'Emissive color' texture is mapped to the 'EmissiveColor' texture in the FBX file. During rendering the diffuse texture is multiplied with the emissive texture as follows:

color = DiffuseColor(uv0) * EmissiveColor(uv1) * 1.5

Surface reflections look into a cube map (or environment map). The textures for the 6 cube map sides should be named:

<name>_right.<ext>
<name>_left.<ext>
<name>_up.<ext>
<name>_down.<ext>
<name>_backward.<ext>
<name>_forward.<ext>

The reflection texture 'ReflectionColor' should be set to one of these 6 textures used to create the cube map. The FBX Converter automatically picks up the other 5 textures and combines all 6 textures into a cube map. The normal map that is used to calculate the surface reflection is expected to be in local (tangent) space. During rendering the color of reflection mapped materials is calculated as follows:

surfaceNormal = normalize( NormalMap(uv0).x * tangent + NormalMap(uv0).y * binormal + NormalMap(uv0).z * normal )
reflection = dot( eyeVector, surfaceNormal ) * 2.0 * surfaceNormal - eyeVector
color = DiffuseColor(uv0) * EmissiveColor(uv1) * 1.5 + SpecularColor(uv0) * ReflectionColor(reflection)

The material type is one of the following:

1. opaque
2. perforated
3. transparent
4. additive


The first three material types are based on the alpha channel of the diffuse texture. The -alpha command-line option must be used to enable the 'perforated' and 'transparent' material types. Without the -alpha command-line option, the alpha channel of the diffuse texture will be removed. The 'additive' material type cannot be derived from the textures. An additive texture is specified by appending _additive to the material name in the FBX file.

Animations

There is currently not a full-blown animation system, but having vertices weighted to joints is still very useful to programmatically move geometry, while rendering as few surfaces as possible. Think about things like the buttons and joystick on the arcade machines in VrArcade. An artist can set up the vertex weighting for skinning, but the FBX Converter also has an option to rigidly bind the vertices of an FBX source mesh to a single joint. In this case the joint name will be the name of the FBX source mesh. The meshes that need to be rigidly skinned to a joint are specified using the -skin command-line option. There is currently a limit of 16 joints per FBX file.

The FBX Converter can also apply some very basic parametric animations to joints. These simple animations are specified using the -anim command-line option. The types of animation are rotate, sway and bob. One of these types is specified directly following the -anim command-line option. Several parameters that define the animation are specified after the type. For rotate and sway these parameters are pitch, yaw and roll in degrees per second. For bob the parameters are x, y and z in meters per second. Following these parameters, a time offset and scale can be specified. The time offset is typically used to animate multiple joints out of sync, and the time scale can be used to speed up or slow down the animation. Last but not least, one or more joints are specified to which the animation should be applied.

When a mesh is rigidly skinned to a joint using the -skin command-line option, the FBX Converter stores the mesh node transform on the joint. This mesh node transform is used as the frame of reference (pivot and axes) for animations.

Tags

A tag is used to define a position and frame of reference in the world. A tag can, for instance, be used to define a screen or a view position in a cinema. A tag can also be used to place objects in the world. The -tag command-line option is used to turn one or more FBX meshes from the render model into tags. The name of a tag will be the name of the mesh. The position and frame of reference are derived from the first triangle of the mesh and are stored in a 4x4 matrix. The position is the corner of the triangle that is most orthogonal. The edges that come out of this corner define the first two basis vectors of the frame of reference. These basis vectors are not normalized to maintain the dimensions of the frame. The third basis vector is the triangle normal vector.

Multiple tags can be created by specifying multiple FBX mesh names after the -tag command-line option. The -tag command-line option does not remove the listed meshes from the render model. The -remove command-line option can be used to remove the meshes from the render model.
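As a hedged illustration of the -skin and -anim options described above, a command line could look like the following. The mesh/joint name (Joystick), the FBX file, and the parameter values are hypothetical; the parameter order simply follows the description above (animation type, pitch/yaw/roll in degrees per second, time offset, time scale, then the joint names):

FbxConvertx64.exe -o arcade -skin Joystick -anim sway 10 0 5 0.0 1.0 Joystick -render arcade\arcade.fbx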

Command-Line Interface

The FBX Converter is a command-line tool. To run the FBX Converter, open a Windows Command Prompt, which can be found in the Windows Start menu under All Programs -> Accessories. A command prompt can also be opened by typing cmd in the Windows Run prompt in the Start menu. Once a command prompt has been opened, we recommend launching the FBX Converter from the folder where the source FBX files are located.

The FBX Converter comes with the following tools:

FbxConvertx64.exe (from Oculus VR)
TimeStampx64.exe (from Oculus VR)
PVRTexTool/* (version 3.4, from the PowerVR SDK version 3.3)
7Zip/* (version 9.20, from www.7-zip.org)

The FbxConvertx64.exe is the executable that is launched by the user. The other executables are directly or indirectly used by the FbxConvertx64.exe executable.

Options

The FBX Converter supports the following command-line options:

-o <name> - Specify the name for the .ovrscene file. Specify this name without extension.
-render <model.fbx> - Specify model used for rendering.
-collision <model.fbx|meshes> - Specify model or meshes for wall collision.
-ground <model.fbx|meshes> - Specify model or meshes for floor collision.
-raytrace <model.fbx|meshes> - Specify model or meshes for focus tracking.
-translate <x> <y> <z> - Translate the models by x,y,z.
-rotate <degrees> - Rotate the models about the Y axis.
-scale <factor> - Scale the models by the given factor.
-swapXZ - Swap the X and Z axis.
-flipU - Flip the U texture coordinate.
-flipV - Flip the V texture coordinate.
-stripModoNumbers - Strip duplicate name numbers added by MODO.
-sort <+|-><X|Y|Z>|origin - Sort geometry along axis or from origin.
-expand <dist> - Expand collision walls by this distance. Defaults to 0.5.
-remove <mesh> [<mesh> ...] - Remove these source meshes for rendering.
-atlas <mesh> [<mesh> ...] - Create texture atlas for these meshes.
-discrete <mesh> [<mesh> ...] - Keep these meshes separate for rendering.
-skin <mesh> [<mesh> ...] - Skin these source meshes rigidly to a joint.
-tag <mesh> [<mesh> ...] - Turn 1st triangles of these meshes into tags.
-attrib <attr> [<attr> ...] - Only keep these attributes: [position, normal, tangent, binormal, color, uv0, uv1, auto].
-anim rotate <pitch> <yaw> <roll> <timeoffset> <timescale> <joint> [<joint> ...] - Apply parametric animation to joints.
-anim sway <pitch> <yaw> <roll> <timeoffset> <timescale> <joint> [<joint> ...]
-anim bob <x> <y> <z> <timeoffset> <timescale> <joint> [<joint> ...]
-ktx - Compress textures to KTX files (default).
-pvr - Compress textures to PVR files.
-dds - Compress textures to DDS files.
-alpha - Keep texture alpha channels if present.
-clean - Delete previously compressed textures.
-include <file> [<file> ...] - Include these files in the package.
-pack - Automatically run the <name>_pack.bat file.
-zip <level> - 7-Zip compression level (0=none, 9=ultra).
-fullText - Store binary data as text in JSON file.
-noPush - Do not push to device in batch file.
-noTest - Do not run a test scene from batch file.
-cinema - Launch VrCinema instead of VrScene.
-expo - Launch VrExpo instead of VrScene.

The -collision, -ground and -raytrace command-line options may either specify a separate FBX file or a list of meshes from the FBX file specified with the -render command-line option. If the collision and ray-trace meshes are in the same FBX file as the meshes to be rendered, but the collision and ray-trace surfaces should not be rendered, then these meshes can be removed for rendering using the -remove command-line option.

Note that the -collision, -ground, -raytrace, -remove, -atlas, -discrete, -skin and -tag command-line options accept wild cards like * and ?. For instance, to make all surfaces discrete use: -discrete *

Batch Execution

Instead of typing all the command-line options on the command prompt, it is common practice to use a batch file to launch the FBX Converter with a number of options. This allows for quick iteration on the assets while consistently using the same settings. The following is the contents of the batch file that was used to convert the FBX for the home theater:

FbxConvertx64.exe -o home_theater -pack -stripModoNumbers -rotate 180 -scale 0.01 -translate 0.45 0 -3 -swapxz -flipv -sort origin -tag screen -render home_theater\home_theater.fbx -raytrace screen

Troubleshooting

The FBX Converter prints several things on the screen, such as configuration options, warnings, and errors. Warnings (e.g., missing textures) are printed in yellow, and errors (e.g., missing executables) are printed in red.

Optimization

The FBX Converter implements various command-line options that can be used to optimize the geometry for rendering.

Reducing Draw Calls

The FBX Converter automatically merges FBX meshes that use the same material such that they will be rendered as a single surface. At some point it may become necessary to automatically break up surfaces for culling granularity. However, currently it is more important to reduce the number of draw calls due to significant driver overhead on mobile platforms. Source meshes that need to stay separate for some reason can be flagged using the -discrete command-line option of the FBX Converter.

To further reduce the number of draw calls, or to statically sort all geometry into a single surface, the FBX Converter can also create one or more texture atlases using the -atlas option. This option takes a list of FBX source meshes that need to have their textures combined into an atlas. Multiple atlases can be created by specifying the -atlas command-line option multiple times with different mesh names. Textures that are placed in an atlas cannot be tiled (repeated) on a mesh, and the texture coordinates of the source mesh need to all be in the [0, 1] range.


Reducing Vertices

During conversion, the FBX Converter displays the total number of triangles and the total number of vertices of the render geometry. The number of vertices is expected to be in the same ballpark as the number of triangles. Having over two times more vertices than triangles may have performance implications.

The number of unique vertices can be reduced by removing vertex attributes that are not necessary for rendering. Unused vertex attributes are generally wasteful, and removing them may increase rendering performance just by improving GPU vertex cache usage. An FBX file may store vertex attributes that are not used for rendering. For instance, vertex normals may be stored in the FBX file, but they will not be used for rendering unless there is some form of specular lighting. The FBX file may also store a second set of texture coordinates that are not used when there are no emissive textures.

The -attrib command-line option of the FBX Converter can be used to keep only those attributes that are necessary to correctly render the model. For instance, if the model only renders a diffuse texture with baked lighting, then all unnecessary vertex attributes can be removed by using -attrib position uv0.

The -attrib command-line option also accepts the auto keyword. By using the auto keyword, the FBX Converter will automatically determine which vertex attributes need to be kept based on the textures specified per surface material. The auto keyword can be specified in combination with other vertex attributes. For instance, -attrib auto color will make sure that the color attribute is always kept, while the other vertex attributes are only kept if they are needed to correctly render based on the specified textures.

The following table shows how the FBX Converter determines which attributes to keep when the auto keyword is specified:

position    always automatically kept
normal      if NormalMap or SpecularColor texture is specified
tangent     if NormalMap texture is specified
binormal    if NormalMap texture is specified
uv0         if DiffuseColor or SpecularColor texture is specified
uv1         if EmissiveColor texture is specified
color       never automatically kept

Reducing Overdraw

To be able to render many triangles, it is important to minimize overdraw as much as possible. For scenes or models that do have overdraw, it is very important that the opaque geometry is rendered front-to-back to significantly reduce the number of shading operations. Scenes or models that will only be displayed from a single viewpoint, or a limited range of viewpoints, can be statically sorted to guarantee front-to-back rendering on a per-triangle basis.

The FBX Converter has a -sort command-line option to statically sort all the geometry. The -sort option first sorts all the vertices, then sorts all the triangles based on the smallest vertex index. In addition to sorting the triangles, this also improves GPU vertex cache usage. The -sort option can sort all geometry along one of the coordinate axes, or it can sort all geometry from the origin. Sorting along an axis is useful for diorama-like scenes. Sorting from the origin is useful for theater-like scenes with a full 360 view.

Sorting along an axis is done by specifying + or - one of the coordinate axes (X, Y or Z). For instance, to sort all geometry front-to-back along the X axis use: -sort +X

Sorting from the origin can be done by specifying + or - origin. For instance, to sort all geometry front-to-back from the origin use: -sort +origin

For sorting from the origin to be effective, the origin of the FBX model or scene must be the point from which the model or scene will be viewed. Keep in mind that sorting from the origin happens after any translations have been applied to the FBX geometry using the -translate command-line option. In other words, when using the -sort +origin command-line option in combination with the -translate option, the scene will be sorted from the translated position instead of the original FBX origin.

Scenes that can be viewed from multiple vantage points may need to be manually broken up into reasonably sized blocks of geometry that will be dynamically sorted front-to-back at run-time. If the meshes the scene is broken up into use the same material, then the -discrete command-line option can be used to keep the meshes separate for rendering.


Mobile Best Practices

Welcome to the Mobile Best Practices Guide. In this guide, we'll review best practices for rendering and for dealing with user interface components.

Rendering Guidelines

This section contains guidelines for VR application development in the unique domain of mobile development.

Mobile VR Performance

Be conservative on performance. Even though two threads are dedicated to the VR application, a lot happens on Android systems that we can't control, and performance has more of a statistical character than we would like. Some background tasks even use the GPU occasionally. Pushing right up to the limit will undoubtedly cause more frame drops and make the experience less pleasant. Under these performance constraints, you aren't going to be able to pull off graphics effects that people haven't already seen years ago on other platforms, so don't try to compete there. The magic of a VR experience comes from interesting things happening in well-composed scenes, and the graphics should largely try not to call attention to themselves. Even if you consistently hold 60 FPS, more aggressive drawing consumes more battery power, and subtle improvements in visual quality generally aren't worth taking 20 minutes off the battery life of a title.

Keep rendering straightforward. Draw everything to one view, in a single pass for each mesh. Tricks with resetting the depth buffer and multiple camera layers are bad for VR, regardless of their performance issues. If the geometry doesn't work correctly when it is all rendered into a single view (FPS hands, et cetera), then the design will cause perception issues in VR, and you should fix it.

You can't handle a lot of blending for performance reasons. If you have limited navigation capabilities in the title and can guarantee that the effects will never cover the entire screen, then you will be okay. Don't use alpha tested / pixel discard transparency -- the aliasing will be awful, and performance can still be problematic. Coverage from alpha can help, but designing a title that doesn't require a lot of cut-out geometry is even better.

Most VR scenes should be built to work with 16 bit depth buffer resolution and 2x MSAA. If your world is mostly pre-lit to compressed textures, there will be little difference between 16 and 32 bit color buffers.

Favor modest "scenes" instead of "open worlds". There are both theoretical and pragmatic reasons why you should, at least in the near term. The first generation of titles should be all about the low hanging fruit, not the challenges.

The best-looking scenes will be uniquely textured models. You can load quite a lot of textures -- 128 MB of textures is okay. With global illumination baked into the textures, or data actually sampled from the real world, you can make reasonably photo-realistic scenes that still run at 60 FPS in stereo. The contrast with much lower fidelity dynamic elements may be jarring, so there are important stylistic decisions to be made.

Panoramic photos make excellent and efficient backdrops for scenes. If you aren't too picky about global illumination, allowing them to be swapped out is often nice. Full image-based lighting models aren't performance-practical for entire scenes, but are probably okay for characters that can't cover the screen.


Frame Rate

Thanks to Asynchronous TimeWarp, looking around in the Gear VR will always be smooth and judder-free at 60 FPS, regardless of how fast or slow the application is rendering. This does not mean that performance is no longer a concern, but it gives a lot more margin in normal operation, and improves the experience for applications that do not hold perfectly at 60 FPS.

If an application does not consistently run at 60 FPS, animating objects move more choppily, rapid head turns pull some black in at the edges, player movement doesn't feel as smooth, and gamepad turning looks especially bad. However, Asynchronous TimeWarp does not require emptying the GPU pipeline, which makes it easier to hold 60 FPS than it would be otherwise.

Drawing anything that is stuck to the view will look bad if the frame rate is not held at 60 FPS, because it will only move on eye frame updates instead of on every video frame. Don't make heads-up displays. If something needs to stay in front of the player, like a floating GUI panel, leave it stationary most of the time, and have it quickly rush back to center when necessary, instead of dragging it continuously with the head orientation.

Scenes

Per scene targets:

• 50k to 100k triangles
• 50k to 100k vertices
• 50 to 100 draw calls

An application may be able to render more triangles by using very simple vertex and fragment programs, minimizing overdraw, and reducing the number of draw calls down to a dozen. However, lots of small details and silhouette edges may result in visible aliasing despite MSAA. It is good to be conservative! The quality of a virtual reality experience is not just determined by the quality of the rendered images. Low latency and high frame rates are just as important in delivering a high quality, fully immersive experience, if not more so.

Keep an eye on the vertex count, because vertex processing is not free on a mobile GPU with a tiling architecture. The number of vertices in a scene is expected to be in the same ballpark as the number of triangles. In a typical scene, the number of vertices should not exceed twice the number of triangles. To reduce the number of unique vertices, remove vertex attributes that are not necessary for rendering.

Textures are ideally stored with 4 bits per texel in ETC2 format for improved rendering performance and an 8x storage space reduction over 32-bit RGBA textures. Loading up to 512 MB of textures is feasible, but the limited storage space available on mobile devices needs to be considered. For a uniquely textured environment in an application with limited mobility, it is reasonable to load 128 MB of textures.

Baking specular and reflections directly into the textures works well for applications with limited mobility. The aliasing from dynamic shader-based specular on bump-mapped surfaces is often a net negative in VR, but simple, smooth shapes can still benefit from dynamic specular in some cases.

Dynamic lighting with dynamic shadows is usually not a good idea. Many of the good techniques require using the depth buffer, which is particularly expensive on mobile GPUs with a tiling architecture. Rendering a shadow buffer for a single parallel light in a scene is feasible, but baked lighting and shadowing usually results in better quality.

To be able to render many triangles, it is important to reduce overdraw as much as possible. In scenes with overdraw, it is important that the opaque geometry is rendered front-to-back to significantly reduce the number of shading operations. Scenes that will only be displayed from a single viewpoint can be statically sorted to guarantee front-to-back rendering on a per-triangle basis. Scenes that can be viewed from multiple vantage points may need to be broken up into reasonably sized blocks of geometry that will be sorted front-to-back dynamically at run-time.

Resolution

Due to distortion from the optics, the perceived size of a pixel on the screen varies across the screen. Conveniently, the highest resolution is in the center of the screen where it does the most good, but even with a 2560x1440 screen, pixels are still large compared to a conventional monitor or mobile device at typical viewing distances.

With the current screen and optics, central pixels cover about 0.06 degrees of visual arc, so you would want a 6000 pixel long band to wrap 360 degrees around a static viewpoint. Away from the center, the gap between samples would be greater than one, so mipmaps should be created and used to avoid aliasing. For general purpose rendering, this requires 90 degree FOV eye buffers of at least 1500x1500 resolution, plus the creation of mipmaps. While the system is barely capable of doing this with trivial scenes at maximum clock rates, thermal constraints make this unsustainable.

Most game-style 3D VR content should target 1024x1024 eye buffers. At this resolution, pixels will be slightly stretched in the center and only barely compressed at the edges, so mipmap generation is unnecessary. If you have lots of performance headroom, you can experiment with increasing this a bit to take better advantage of the display resolution, but it is costly in power and performance.

Dedicated "viewer" apps (e-book reader, picture viewers, remote monitor view, et cetera) that really do want to focus on peak quality should consider using the TimeWarp overlay plane to avoid the resolution compromise and double-resampling of distorting a separately rendered eye view. Using an sRGB framebuffer and source texture is important to avoid "edge crawling" effects in high contrast areas when sampling very close to optimal resolution.

User Interface Guidelines

Graphical User Interfaces (GUIs) in virtual reality present unique challenges that can be mitigated by following the guidelines in this document. This is not an exhaustive list, but provides some guidance and insight for first-time implementers of Virtual Reality GUIs (VRGUIs).

Stereoscopic UI Rendering

If any single word can help developers understand and address the challenges of VRGUIs, it is "stereoscopic". In VR, everything must be rendered from two points of view -- one for each eye. When designing and implementing VRGUIs, frequent consideration of this fact can help bring problems to light before they are encountered in implementation. It can also aid in understanding the fundamental constraints acting on VRGUIs. For example, stereoscopic rendering essentially makes it impossible to implement an orthographic Heads Up Display (HUD), one of the most common GUI implementations for 3D applications -- especially games.

The Infinity Problem

Neither orthographic projections nor HUDs in themselves are completely ruled out in VR, but their standard implementation, in which the entire HUD is presented via the same orthographic projection for each eye view, generally is.

Projecting the HUD in this manner requires the user to focus on infinity when viewing the HUD. This effectively places the HUD behind everything else that is rendered, as far as the user's brain is concerned. This can confuse the visual system, which perceives the HUD to be further away than all other objects, despite remaining visible in front of them. This generally causes discomfort and may contribute to eyestrain.

Orthographic projection should instead be used on individual surfaces that are then rendered in world space and displayed at a reasonable distance from the viewer. The ideal distance varies, but is usually between 1 and 3 meters. Using this method, a normal 2D GUI can be rendered and placed in the world, and the user's gaze direction can be used as a pointing device for GUI interaction.

In general, an application should drop the idea of orthographically projecting anything directly to the screen while in VR mode. It will always be better to project onto a surface that is then placed in world space, though this provides its own set of challenges.

Depth In-Depth

Placing a VRGUI in world space raises the issue of depth occlusion. In many 3D applications, it is difficult, even impossible, to guarantee that there will always be enough space in front of the user's view to place a GUI without it being coincident with, or occluded by, another rendered surface. If, for instance, the user toggles a menu during gameplay, it is problematic if the menu appears behind the wall that the user is facing. It might seem that rendering without depth testing should solve the problem, but that creates a problem similar to the infinity problem, in which stereoscopic separation suggests to the user that the menu is further away than the wall, yet the menu draws on top of the wall.

There are some practical solutions to this:

• Render the VRGUI surfaces in two passes, once with depth pass and once with depth fail, using a special shader with the fail pass that stipples or blends with any surface that is closer to the view point than the VRGUI (see the sketch at the end of this section).
• Project the VRGUI surfaces onto geometry that is closer than the ideal distance. This may not solve all problems if the geometry can be so close that the VRGUI is out of the user's view, but it may give them an opportunity to back away while fitting well with any game that presupposes the VRGUIs are physical projections of light into world space.
• Move the user to another scene when the VRGUI interface comes up. This might be as simple as fading the world to black as the interface fades in.
• Stop rendering the world stereoscopically, i.e., render both eye views with the same view transform, while the VRGUI is visible, then render the GUI stereoscopically but without depth testing. This will allow the VRGUI surfaces to have depth while the world appears as a 2-dimensional projection behind it.
• Treat VRGUIs as actual in-world objects. In fantasy games, a character might bring up a book or scroll, upon which the GUI is projected. In a modern setting, this might be a smart phone held in the character's hand. In other instances, a character might be required to move to a specific location in the world -- perhaps a desktop computer -- to interact with the interface. In all these cases, the application would still need to support basic functionality, such as the ability to exit and return to the system launcher, at all times.

It is generally not practical to disallow all VRGUI interaction and rendering if there is not enough room for the VRGUI surfaces, unless you have an application that never needs to display any type of menu (even configuration menus) when rendering the world view.
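The two-pass approach from the first bullet can be sketched with plain OpenGL ES state changes. This is a minimal illustration of the idea, not code from the SDK; DrawGuiSurface(), guiProgram, and guiOccludedProgram are hypothetical application-side names.

// Pass 1: normal depth-tested rendering where the GUI is in front of the world.
glEnable( GL_DEPTH_TEST );
glDepthMask( GL_FALSE );                 // do not write depth for GUI surfaces
glDepthFunc( GL_LEQUAL );
glUseProgram( guiProgram );
DrawGuiSurface();

// Pass 2: depth-fail rendering where world geometry is closer; the occluded
// shader stipples or blends so the overlap reads as intentional.
glDepthFunc( GL_GREATER );
glUseProgram( guiOccludedProgram );
DrawGuiSurface();

glDepthFunc( GL_LEQUAL );                // restore default depth state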

Gazing Into Virtual Reality

There is more than one way to interact with a VRGUI, but gaze tracking may be the most intuitive. The direction of the user's gaze can be used to select items in a VRGUI as if it were a mouse or a touch on a touch device. A mouse is a slightly better analogy because, unlike a touch device, the pointer can be moved around the interface without first initiating a "down" event.

Like many things in VR, the use of gaze to place a cursor has a few new properties to consider. First, when using the gaze direction to select items, it is important to have a gaze cursor that indicates where gaze has to be directed to select an item. The cursor should, like all other VR surfaces, be rendered stereoscopically. This gives the user a solid indication of where the cursor is in world space. In testing, implementations of gaze selection without a cursor or crosshair have been reported as more difficult to use and less grounded.

Second, because gaze direction moves the cursor, and because the cursor must move relative to the interface to select different items, it is not possible to present the viewer with an interface that is always within view. In one sense, this is not possible with traditional 2D GUIs either, since the user can always turn their head away from the screen, but there are differences. With a normal 2D GUI, the user does not necessarily expect to be able to interact with the interface when not looking at the device that is presenting it, but in VR the user is always looking at the device -- they just may not be looking at the VRGUI. This can allow the user to "lose" the interface and not realize it is still available and consuming their input, which can further result in confusion when the application doesn't handle input as the user expects (because they do not see a menu in their current view).

There are several approaches to handling this issue:

• Close the interface if it goes outside of some field of view from the user's perspective. This may be problematic for games if the interface pauses gameplay, as gameplay would just resume when the interface closes.
• Automatically drag the interface with the view as the user turns, either keeping the gaze cursor inside of the interface controls, or keeping it at the edge of the screen where it is still visible to the user.
• Place an icon somewhere on the periphery of the screen that indicates the user is in menu mode and then allow this icon to always track with the view.

Another frequently unexpected issue with using a gaze cursor is how to handle default actions. In modern 2D GUIs, a button or option is often selected by default when a GUI dialog appears - possibly the OK button on a dialog where a user is usually expected to proceed without changing current settings, or the CANCEL button on a dialog warning that a destructive action is about to be taken. However, in VRGUIs, the default action is dictated by where the gaze cursor is pointing on the interface when it appears. OK can only be the default in a VRGUI if the dialog pops up with OK under the gaze cursor. If an application does anything other than place a VRGUI directly in front of the viewer (such as placing it on the horizon plane, ignoring the current pitch), then it is not practical to have the concept of a default button, unless there is an additional form of input such as a keyboard that can be used to interact with the VRGUI.
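To make the gaze-as-pointer idea concrete, the hit test against a world-space panel is just a ray/plane intersection followed by a bounds check. The sketch below is purely illustrative and assumes nothing from the SDK; the Vec3 helpers and all names are hypothetical.

#include <math.h>
#include <stdbool.h>

typedef struct { float x, y, z; } Vec3;

static float Dot( Vec3 a, Vec3 b ) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3  Sub( Vec3 a, Vec3 b ) { Vec3 r = { a.x - b.x, a.y - b.y, a.z - b.z }; return r; }
static Vec3  MulAdd( Vec3 a, Vec3 d, float t ) { Vec3 r = { a.x + d.x * t, a.y + d.y * t, a.z + d.z * t }; return r; }

// Returns true when the gaze ray (origin + t * dir) hits a rectangular panel
// described by its center, unit normal, and unit right/up axes.
static bool GazeHitsPanel( Vec3 origin, Vec3 dir,
                           Vec3 center, Vec3 normal, Vec3 right, Vec3 up,
                           float halfWidth, float halfHeight )
{
    const float denom = Dot( dir, normal );
    if ( fabsf( denom ) < 1e-6f )
    {
        return false;                       // gaze is parallel to the panel
    }
    const float t = Dot( Sub( center, origin ), normal ) / denom;
    if ( t <= 0.0f )
    {
        return false;                       // panel is behind the viewer
    }
    const Vec3 local = Sub( MulAdd( origin, dir, t ), center );
    return fabsf( Dot( local, right ) ) <= halfWidth &&
           fabsf( Dot( local, up ) )    <= halfHeight;
}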

Adreno Hardware Profile

The Adreno has a sizable (512k - 1 meg) on-chip memory that framebuffer operations are broken up into. Unlike the PowerVR or Mali tile-based GPUs, the Adreno has a variable bin size based on the bytes per pixel needed for the buffers -- a 4x MSAA, 32 bit color, 4x MRT, 32 bit depth render target will require 40 times as many tiles as a 16-bit depth-only rendering.

Vertex shaders are run at least twice for each vertex, once to determine in which bins the drawing will happen, and again for each bin that a triangle covers. For binning, the regular vertex shader is stripped down to only the code relevant to calculating the vertex positions. To avoid polluting the vertex cache with unused attributes, rendering with separate attribute arrays may provide some benefit. The binning is done on a per-triangle basis, not a per-draw call basis, so there is no benefit to breaking up large surfaces. Because scenes are rendered twice for stereoscopic view, and because the binning process doubles it again (at least), vertex processing is more costly than you might expect.

Avoiding any bin fills from main memory and unnecessary buffer writes is important for performance. The VrApi framework handles this optimally, but if you are doing it yourself, make sure you invalidate color buffers before using them, and discard depth buffers before flushing the eye buffer rendering. Clears still cost some performance, so invalidates should be preferred when possible. There is no dedicated occlusion hardware like one would find in PowerVR chips, but early Z rejection is performed, so sorting draw calls into roughly front-to-back order is beneficial.
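On OpenGL ES 3.0, the invalidate/discard pattern described above can be expressed with glInvalidateFramebuffer. This is a hedged sketch of the general idea rather than the VrApi implementation; it assumes you are rendering to your own framebuffer object (eyeFbo) with a color and depth attachment.

// Before rendering into the eye buffer: tell the driver the previous
// contents are not needed, so no bin fill from main memory is performed.
const GLenum beginAttachments[] = { GL_COLOR_ATTACHMENT0, GL_DEPTH_ATTACHMENT };
glBindFramebuffer( GL_DRAW_FRAMEBUFFER, eyeFbo );
glInvalidateFramebuffer( GL_DRAW_FRAMEBUFFER, 2, beginAttachments );

// ... draw the eye view here ...

// After rendering: discard the depth attachment so it is never written back
// to memory; only the resolved color is needed by TimeWarp.
const GLenum endAttachments[] = { GL_DEPTH_ATTACHMENT };
glInvalidateFramebuffer( GL_DRAW_FRAMEBUFFER, 1, endAttachments );
glFlush();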


Texture compression offers significant performance benefits. Favor ETC2 compressed texture formats, but there is still sufficient performance to render scenes with 32 bit uncompressed textures on every surface if you really want to show off smooth gradients.

glGenerateMipmap() is fast and efficient; you should build mipmaps even for dynamic textures (and of course for static textures). Unfortunately, on Android, many dynamic surfaces (video, camera, UI, etc.) come in as SurfaceTextures / samplerExternalOES, which don't have mip levels at all. Copying to another texture and generating mipmaps there is inconvenient and costs a notable overhead, but is still worth considering.

sRGB correction is free on texture sampling, but has some cost when drawing to an sRGB framebuffer. If you have a lot of high contrast imagery, being gamma correct can reduce aliasing in the rendering. Of course, smoothing sharp contrast transitions in the source artwork can also help.

2x MSAA runs at full speed on chip, but it still increases the number of tiles, so there is some performance cost. 4x MSAA runs at half speed, and is generally not fast enough unless the scene is very undemanding.
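For the dynamic-texture mipmap case above, rebuilding the mip chain after an update is a single call. This small sketch assumes textureId is a GL_TEXTURE_2D you manage yourself (not an external SurfaceTexture).

// Rebuild mipmaps after updating the base level of a dynamic texture.
glBindTexture( GL_TEXTURE_2D, textureId );
glGenerateMipmap( GL_TEXTURE_2D );
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR );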


Testing and Troubleshooting

Welcome to the testing and troubleshooting guide. This guide describes tools and procedures necessary for testing and troubleshooting mobile VR applications.

Tools and Procedures

Android System Properties and Local Preferences

Useful testing and debugging device configuration options may be controlled by Android System Properties or Local Preferences.

Local Preferences are a set of key-value pairs written to /sdcard/.oculusprefs that are useful for storing debug settings tied to a device, rather than to a particular application or user. The values controlled by Local Preferences are mapped to Android System Properties on phones running System Activities 1.0.3.0 or later. We recommend controlling settings with Android System Properties rather than Local Preferences, as System Properties do not require enabling Developer Mode or SD card read access.

Local Preferences and System Properties may both be set independently. If SD card read access is provided by your application and Developer Mode is enabled on your mobile device, then any Local Preferences settings (including defaults) will override the corresponding Android System Properties settings.

System Properties

Note: System Properties reset when the device reboots. All commands are case sensitive.

Basic Usage

Set value:

adb shell setprop <name> <value>

Example:

adb shell setprop debug.oculus.frontbuffer 0

Result: Disables front buffer rendering.

Get current value:

adb shell getprop <name>

Example:

adb shell getprop debug.oculus.gpuLevel

Result: Returns the currently-configured GPU Level, e.g., 1.

Get all current values:

adb shell getprop | fgrep "debug.oculus"

Result: Returns all Oculus-related System Properties set on the device.

System Property                     Values              Function
debug.oculus.enableCapture          0, 1                Enables support for Oculus Remote Monitor to connect to the application.
debug.oculus.cpuLevel               0, 1, 2, 3          Changes the fixed CPU level.
debug.oculus.gpuLevel               0, 1, 2, 3          Changes the fixed GPU level.
debug.oculus.asyncTimewarp          0, 1                Set to 0 to disable Asynchronous TimeWarp/enable Synchronous TimeWarp. Default is 1 (ATW is enabled).
debug.oculus.frontbuffer            0, 1                Disable/enable front buffer.
debug.oculus.calibrationLines       0, 1                Disable/enable calibration lines.
debug.oculus.gpuTimings             0, 1, 2             Turns on GPU timings in logcat (off by default due to instability). A setting of 2 is necessary on phones using the Mali GPU.
debug.oculus.simulateUndock         -1, 0, 1, …, 10     Simulate an undocking event after the specified number of seconds.
debug.oculus.enableVideoCapture     0, 1                When enabled, each enterVrMode generates a new mp4 file in /sdcard/oculus/VideoShots/. Videos are full resolution, undistorted, single-eye, with full-compositing support. Defaults are 1024 resolution at 5 Mb/s.
debug.oculus.phoneSensors           0, 1                Set to 0 to disable limited orientation tracking provided by phone sensors while in Developer Mode (enabled by default).

Local Preferences

Note: Modifying Local Preferences requires that your application have READ_EXTERNAL_STORAGE permission. Modifying System Properties does not require read access, and we generally recommend using that method instead.

For security reasons, Local Preferences are not allowed unless Developer Mode is enabled for your device. To enable Developer Mode for your device, please refer to Developer Mode. All commands are case sensitive.

The following Local Preferences are currently available via the Unity and Unreal integrations and the VR app framework:


Local Preference        Values          Function
dev_enableCapture       0, 1            Enables support for Oculus Remote Monitor to connect to the application.
dev_cpuLevel            0, 1, 2, 3      Changes the fixed CPU level.
dev_gpuLevel            0, 1, 2, 3      Changes the fixed GPU level.
dev_gpuTimings          0, 1, 2         Turns on GPU timings in logcat (off by default due to instability). A setting of 2 is necessary on phones using the Mali GPU.
dev_asyncTimewarp       0, 1            Set to 0 to disable Asynchronous TimeWarp/enable Synchronous TimeWarp. Default is 1 (ATW is enabled).
dev_frontbuffer         0, 1            Disable/enable front buffer.
dev_videoCapture        0, 1            When enabled, each enterVrMode generates a new mp4 file in /sdcard/oculus/VideoShots/. Videos are full resolution, undistorted, single-eye, with full-compositing support. Defaults are 1024 resolution at 5 Mb/s.
dev_usePhoneSensors     0, 1            Set to 0 to disable limited orientation tracking provided by phone sensors while in Developer Mode (enabled by default).

Note: When measuring GPU rendering times, we recommend setting dev_asyncTimewarp to 0 to avoid inflating the eye buffer rendering times with the TimeWarp rendering times. Be sure to re-set dev_asyncTimewarp to 1 when testing is complete, as disabling ATW in normal operation is never advised and may cause judder.

To set a local preference value, you may do the following via command line:

adb shell "echo dev_gpuTimings 1 > /sdcard/.oculusprefs"

You may also set multiple values at once:

adb shell "echo dev_cpuLevel 1 dev_gpuLevel 2 > /sdcard/.oculusprefs"

To append values to an existing set of preferences:

adb shell "echo dev_asyncTimewarp 0 >> /sdcard/.oculusprefs"

To remove all local preference settings:

adb shell "rm /sdcard/.oculusprefs"

After setting the values, pause and then resume your app for the Local Preferences file to be updated. Note: Remember to clear or remove your local prefs file when finished so there are no surprises.

82 | Testing and Troubleshooting | Mobile

For native applications, you may query your own Local Preferences debug option with the following:

const char * myDebugOptionStr = ovr_GetLocalPreferenceValueForKey( "myDebugOption", "0" );
if ( atoi( myDebugOptionStr ) > 0 )
{
    // perform debug code
}

Screenshot and Video Capture

Full-resolution, undistorted, single-eye, full-layer-support 2D screenshots and video capture for VR apps are available through the Universal Menu. Video capture is also available by configuring Local Preferences or System Properties.

Output File Details

Video Capture writes 1024 resolution, 5 Mb per second mp4 files to the /sdcard/oculus/VideoShots/ directory. Capture files do not currently include audio. Screenshot writes 1024x1024 jpg files to the /sdcard/Oculus/Screenshots/ directory.

Note: Despite the occurrence of sdcard in this directory path, this is an internal device storage path.

Requirements

• Android 5.0 or later
• Target applications must be built against Mobile SDK 1.0.0 or later.

Using the Universal Menu

1. Launch the application that you would like to take a screenshot of.
2. Open the Universal Menu and select the Utilities menu as shown with the gaze cursor.


3. To take a screenshot, select Screenshot. The Universal Menu will close, returning immediately to the running application. The screenshot will execute five seconds after you select the option, giving you a moment to position your view within the scene properly. The countdown is indicated by a blinking red dot in the upper left of your view. The output jpg will be written to the directory indicated above.

4. To take a video capture, select Capture Video. The Universal Menu will close, returning immediately to the running application. Video recording begins immediately upon returning to the application, and continues until the user returns to the Universal Menu, switches applications, or exits VR mode. The output mp4 will be written to the directory indicated above.


Video Capture using Android System Properties

Note: Screenshot is not available with Android System Properties.

To enable video capture, set debug.oculus.enableVideoCapture to 1 with the following command:

adb shell setprop debug.oculus.enableVideoCapture 1

When enabled, every vrapi_EnterVrMode() will create a new video file. For example, if you launch an app from Home, you may find one video file for your Home session, one for the app you launch, one for System Activities if you long-press, and so forth.

To help ensure that there is no disruption to the play experience while recording, you may wish to force the GPU level up (and chromatic correction off) before enabling capture:

adb shell setprop debug.oculus.gpuLevel 3
adb shell setprop debug.oculus.enableVideoCapture 1

The FOV is reduced to 80 degrees, so you are unlikely to see any black pull-in at the edges.

Note: Make sure to disable video capture when you are done, or you will fill your phone up with video of every VR session you run!

Oculus Remote Monitor and VrCapture

A library and monitoring client for mobile development.

VrCapture

The Capture library is a low-overhead remote monitoring library designed to help debug behavior and performance issues in mobile VR applications. It is capable of both real-time and offline inspection of collected data. Support is built into VrApi by default, and it is automatically supported in Unity 5.1 and later.

Integration

Integrate VrCapture into your applications for greater visibility into performance and to supplement the default data captured by VrApi's built-in integration.

Linking/Including with ndk-build

If your application uses ndk-build-based makefiles, linking to VrCapture is easy.

1. Include the following at the bottom of your Android.mk:

$(call import-module,VrCapture/Projects/Android/jni)

2. Add LOCAL_STATIC_LIBRARIES += vrcapture between include $(CLEAR_VARS) and include $(BUILD_SHARED_LIBRARY). VrCapture will now be compiled and linked into your application, your header search paths will be set up properly, and VrCapture will be automatically compiled to your target architecture.


Linking/Including manually

If you have a custom build system, follow this procedure instead:

1. Build the VrCapture library:

cd VrCapture/Projects/Android
ndk-build -j16

2. Add the header search path to your compiler line: -IVrCapture/Include

3. Add the library to your link line: -LVrCapture/Projects/Android/obj/local/$(TARGET_ARCH) -lvrcapture

Init and Shutdown

Note that VrCapture will do nothing until it is initialized (e.g., no sockets, threads, allocations).

1. Include the primary VrCapture header (located under VrCapture/Include) in your source file.

2. Near the beginning of execution of your application, add: OVR::Capture::Init();

3. Near the end of execution of your application, add: OVR::Capture::Shutdown();

We recommend not shipping your application with capture turned on by default. VrApi provides a mechanism for you to query if Capture is enabled in VrApi_LocalPrefs.h:

ovr_GetLocalPreferenceValueForKey(LOCAL_PREF_VRAPI_ENABLE_CAPTURE);
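A hedged sketch of how this check might gate initialization; the two-argument form of ovr_GetLocalPreferenceValueForKey with a default value follows the earlier Local Preferences example and should be treated as illustrative.

const char * captureEnabledStr = ovr_GetLocalPreferenceValueForKey( LOCAL_PREF_VRAPI_ENABLE_CAPTURE, "0" );
if ( atoi( captureEnabledStr ) > 0 )
{
    OVR::Capture::Init();   // only start capture when explicitly enabled
}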

CPU Zones

The ability to handle a large number of performance zones is a core feature of Oculus Remote Monitor. We conservatively recommend keeping the number of events per frame under 1000.

A CPU zone begins with a simple, one-line declaration (see below) and ends when it goes out of scope. Zones may be used to measure the execution time of a block of code with extremely low overhead, e.g.:

void SomeFunction( void )
{
    OVR_CAPTURE_CPU_ZONE( SomeFunction );
    // Do Something Expensive Here
}

Parameters

Declare a float constant as a capture parameter to allow users to adjust its value in real-time from the Oculus Remote Monitor parameter screen. Potential use cases include lighting intensity or update frequencies. Note that parameters for CPU and GPU rates are already exposed.


To declare and use a capture variable in your code, use GetVariable() with an identifying label, a default value, and a min/max. When not connected to Oculus Remote Monitor, GetVariable() will immediately return the default value. Otherwise, it returns the latest value from Oculus Remote Monitor.

In this example, SpecularPower is declared with a default value of 16, but Oculus Remote Monitor can override it with user input (between 1 and 128):

OVR_CAPTURE_CREATE_LABEL(SpecularPower);
const float specularPower = OVR::Capture::GetVariable(OVR_CAPTURE_LABEL_NAME(SpecularPower), 16.0f, 1.0f, 128.0f);
glUniform1f(uSpecularPower, specularPower);

Local Capture

The SDK includes support for capturing straight to a local file rather than waiting for a remote connection. This facilitates automated performance testing, startup time benchmarking, and dealing with poor network performance. It currently requires a number of manual steps to enable, pull, and disable.

To enable local capture for VrApi:

1. Make sure your application has WRITE_EXTERNAL_STORAGE permissions.
2. Enable capture to file with the following:

adb shell setprop debug.oculus.enableCapture /sdcard/mycapture.dat

3. Run your app and let it play until you have captured the data that you need, then exit.
4. To compress and download the capture file, run the following on a local terminal:

adb shell gzip -9 /sdcard/mycapture.dat
adb pull /sdcard/mycapture.dat.gz capture.dat

5. Disable capture when complete: adb shell setprop debug.oculus.enableCapture 0

6. Open capture.dat in Oculus Remote Monitor.

Oculus Remote Monitor

The Oculus Remote Monitor client (available for Mac OS X and Windows) connects to VR applications running on remote devices to capture, store, and display the streamed-in data. It is available from our Downloads page.

Setup

Enable Capture Server

To enable the Capture Server on a particular device:

1. Enable Developer Mode and USB Debugging on your device (for instructions, see Device Setup on page 11).
2. Plug your device into your computer via USB.
3. Check your device to see if it is requesting permission to "Allow USB debugging". If so, accept.


4. Open Oculus Remote Monitor and navigate to the Settings Panel gear icon in the upper-right. Verify that the ADB Path is set to your local installation of adb (included with the Android SDK). For example: C:\Users\$username\AppData\Local\Android\sdk\platform-tools\adb.exe
5. The device ID of your connected device should now be listed in the Device Settings list. Select your device ID and click the Enable Capture check box to the right.

After completing those steps, the Capture Server will run automatically whenever a VR application runs.

Note: If you reboot your device, the Device Settings are reset and you must click the Enable Capture check box again.

Network Setup

Oculus Remote Monitor uses a UDP-broadcast-based auto-discovery mechanism to locate remote hosts, and then a TCP connection to access the capture stream. For this reason, the host and the client must be on the same subnet, and the network must not block or filter UDP broadcasts or TCP connections. If you are on a large corporate network that may have such restrictions, we recommend setting up a dedicated network or tethering directly to your device.

Furthermore, frame buffer capturing is extremely bandwidth intensive. If your signal strength is low or you have a lot of interference or traffic on your network, you may need to disable Capture Frame Buffer before connecting to improve capture performance.

While most users won't need to worry about this, Oculus Remote Monitor uses UDP port 2020 and TCP ports 3030-3040.

Application Setup

Because a direct network connection is used, your application's AndroidManifest.xml must request the standard android.permission.INTERNET permission.

Basic Usage

Host List

If the host and client are on the same subnet and the network is configured correctly (see Network Setup above), Oculus Remote Monitor will automatically discover any compatible applications running on the network. Simply select a host, toggle your desired Session Settings, and click Connect to begin capturing and viewing data.


Open Capture File

Upon each connection, Oculus Remote Monitor will automatically compress and store the data stream to disk under a unique filename in the format package-YYYYMMDD-HHMMSS-ID.dat. By default, these files are stored under your Documents directory in the OVRMonitorRecordings folder; you may change the path in Monitor's Settings. To open and play back capture files, click on the File icon in the toolbar.


Frame Buffer

The Frame Buffer Viewer provides a mechanism for inspecting the frame buffer as the data is received in real time, which is particularly useful for monitoring play test sessions. When connected to a remote session or streaming from a capture file, click on the camera icon in the toolbar to instantly access the most recent frame buffer.

When enabled, the Capture library will stream a downscaled pre-distortion eye buffer across the network. We use downscaling rather than a higher-quality compression scheme to reduce overhead on the host device as much as possible. This reduces image quality substantially, but still provides valuable visual context as to what is happening on screen at any given moment. The current default is 192x192 compressed. The Monitor application recompresses the Frame Buffer further to save memory and disk space when dealing with large capture sets.


Performance

The Performance Data Viewer provides real-time and offline inspection of the following on a single, contiguous timeline:

• CPU/GPU events
• Sensor readings
• Console messages, warnings, and errors
• Frame buffer captures

By default, VrApi currently has a small number of events embedded to help diagnose VR-specific scheduling issues (this may change). Press the space bar to toggle between real-time timeline scrolling and freezing at a specific point in time. This allows the user to quickly alternate between watching events unfold in real-time and pausing to focus in on a point of interest without stopping or restarting. Scroll the mouse wheel to zoom the view in and out. Click and drag to pan the timeline forwards or backwards in time.


The Performance Data Viewer screen shows a selected portion of the application timeline:

Frame Buffer    Provides screen captures of the pre-distorted frame buffer, timestamped the moment immediately before the frame was handed off to the TimeWarp context. The left edge of each screenshot represents the point in time at which it was captured from the GPU.

VSync           Displays notches on every driver v-sync event.

GPU Context     GPU Zones inserted into the OpenGL Command Stream via Timer Queries are displayed in a similar manner as CPU events. Each row corresponds to a different OpenGL Context. Typical VR applications will have two contexts: one for TimeWarp, and one for application rendering. Note that on tiler GPUs, these events should be regarded as rough estimates rather than absolute data.

CPU Thread      Hierarchical visualization of wall clock time of various functions inside VrApi along with OpenGL draw calls inside the host application. Log messages are displayed on their corresponding CPU thread as icons. Mouse over each icon to display the corresponding message (blue circles), warning (yellow squares), or error (red diamonds).

Sensor          General sensor data visualizer. CPU and GPU clocks are visualized in the screenshot shown above, but other data may be displayed here, such as thermal sensors, IMU data, et cetera.


Console

VrApi reports various messages and error conditions to Android's logcat as well as to the Oculus Remote Monitor, which provides the thread and timestamp of each message. The Logging Viewer provides raw access to this data. Warnings and errors are color-coded to stand out more easily, and, unlike logcat, thread IDs are tracked so you know exactly when and where each message occurred.


Parameters (new 0.6.1)

Applications may expose user-adjustable parameters and variables in their code. Nearly any constant in your code may be turned into a knob that can be updated in real-time during a play test. By default, VrApi exposes CPU and GPU Levels to allow developers to quickly identify their required clock rates. Applications are free to expose their own.


ShowTimeWarpTextureDensity: This experimental feature toggles a visualization mode that colors the screen based on the texel:pixel ratio when running TimeWarp (green indicates a 1:1 ratio; dark green < 1:1 ratio; red > 2:1 ratio). Currently in alpha.

Settings

Monitor has a few persistent settings that may be adjusted by clicking on the gears icon in the upper-right of the toolbar.

ADB Path                            Configurable any time. If it does not point to a valid executable when Monitor runs, Monitor will attempt to locate a valid copy of adb by checking the environment variable. If that fails, Monitor will search under ANDROID_HOME.

Recordings Directory                Specifies the location in which Monitor will automatically store capture files when connected to a remote host. The default is the current user's Documents directory under OVRMonitorRecordings.

Frame Buffer Compression Quality    Used for client-side recompression of the frame buffer. This helps offload compression load from the host while allowing for significant savings in memory usage on the client. Lower-quality settings provide greater memory savings, but may result in blocking artifacts in the frame buffer viewer.

Device Settings                     Allows you to toggle capture support without manually editing your .oculusprefs file.


Using Oculus Remote Monitor to Identify Common Issues

Tearing

Tearing occurs in VR applications any time TimeWarp rendering fails to render ahead of scanout. VrApi attempts to detect this with a GPU Sync Object to determine when the GPU completes rendering distortion for a given eye. If for any reason it does not complete in time, VrApi prints a warning to logcat, which Oculus Remote Monitor picks up:

V-sync %d: Eye %d, CPU latency %f, GPU latency %f, Total latency %f

If you are running on a Samsung GALAXY S6, you may also see tearing events by looking at the GPU Context that is running WarpToScreen. Example:


In this example, because the refresh rate of the display is 60 Hz, the ideal running time of WarpToScreen is 16.66 ms, but a scheduling/priority issue in the application caused the second eye to be executed 10 ms late, pushing WarpToScreen to run for 26.81 ms. The actual eye distortion draw calls are barely visible as two distinct notches under each WarpToScreen event on the GPU.

Missed Frames

VrApi reports the Frame Index that was submitted inside vrapi_SubmitFrame, as well as the Frame Index that is currently being TimeWarped. This allows you to easily identify missed frames and to easily track latency between app render and distortion. Every time a Frame Index is reported from the API, Oculus Remote Monitor marks it on the timeline for the associated thread with a vertical grey line. If the Frame Index arrives out of order, the line is changed to red, helping you quickly identify problem areas.

In the above example, the TimeWarp thread has two out-of-order frames visible. This typically happens when the GPU was unable to finish rendering a frame from the application in time for TimeWarp to begin sampling from it. Zooming in, we can investigate the cause by looking at the actual Frame Index value.


Just as we thought, the same Frame Index is sampled twice, which results in the first red line. But the next frame is able to catch up, which means it jumps ahead two frames, resulting in the second red line along with a frame that is never displayed. This was all probably caused by the application failing to complete GPU work on time.

Application OpenGL Performance Issues

Oculus Remote Monitor is capable of capturing OpenGL calls across the entire process (enabled with the Graphics API option). Application-specific performance issues can therefore be spotted at times. In the example below, an application was mapping the same buffer several times a frame for a particle effect. On a Note 4 running KitKat, the GL driver triggered a sync point on the second glUnmapBuffer call, causing it to take 2.73 ms; without the sync point, this same call takes around 0.03 ms. After spotting this issue, the developer was able to quickly fix the buffer usage and reclaim that CPU time.
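One common fix for this class of sync-point stall is to cycle through a small ring of buffers and orphan their storage instead of re-mapping the same buffer while the GPU may still be reading it. This is a hedged sketch only; kNumBuffers, particleVbo, and UpdateParticles are illustrative names, not part of any SDK.

// Ring of vertex buffers so the CPU never touches a buffer the GPU is using.
enum { kNumBuffers = 3 };
static GLuint particleVbo[kNumBuffers];
static int    frameIndex = 0;

void UpdateParticles( const void * vertexData, GLsizeiptr numBytes )
{
    frameIndex = ( frameIndex + 1 ) % kNumBuffers;
    glBindBuffer( GL_ARRAY_BUFFER, particleVbo[frameIndex] );
    // Orphan the old storage, then upload fresh data; avoids a CPU/GPU sync point.
    glBufferData( GL_ARRAY_BUFFER, numBytes, NULL, GL_STREAM_DRAW );
    glBufferSubData( GL_ARRAY_BUFFER, 0, numBytes, vertexData );
}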

Unlocked CPU Clocks

VrApi attempts to lock the CPU and GPU clocks at particular frequencies to ensure some level of execution speed and scheduling guarantees. These are configurable via the CPULevel and GPULevel available in the API.

When in VR Developer Mode, the clocks may occasionally unlock when out of the headset for too long. When this happens, the CPU/GPU Clock Sensors go from extremely flat to extremely noisy, typically causing many performance issues like tearing and missed frames, as seen below:


Note that on a Samsung GALAXY S6, we allow the clocks to boost slightly under certain conditions, but only by a small amount in typical cases, and the clock rate should never drop below the requested level. It is also fairly common for some cores to go completely on and offline occasionally.

Network Bandwidth Stalls

VrCapture uses a fixed-size FIFO internally to buffer events before they are streamed over the network. If this buffer fills faster than we can stream its contents, we are left in a tricky situation. If your network connection stalls long enough for any reason, it eventually causes the host application to stall as well. This is easily spotted in Oculus Remote Monitor: look for around two seconds of extremely long events on the OVR::Capture thread followed by other threads stalling as well.

We provide a large internal buffer, so a few network hitches shouldn't affect your application, but if they persist long enough, a random event inside your application will eventually stall until the Capture thread is able to flush the buffer. In the example below, several seconds of poor network connectivity (seen as long AsyncStream_Flush events) eventually caused the MapAndCopy event on the application's render thread to stall until it was eventually released by the Capture thread:


If you find it difficult to capture reliably because of this issue, we recommend disabling Frame Buffer Capturing before connecting, as this feature consumes the bulk of the bandwidth required.

Thermal Limits

If your application requests CPU/GPU Levels that are too high, the internal SoC and battery temperatures will rise slowly, yet uncontrollably, until the device hits its thermal limit. When this happens, GearVR Service will terminate the application and display a thermal warning until the device cools down. It may take quite a long time to encounter this scenario during testing.

Monitoring thermals in Oculus Remote Monitor is a great way to quickly see if your application causes the device temperature to rise perpetually. Mouse over the Sensor graph to get a precise readout at any given time. We recommend keeping an eye on it. If the temperature exceeds the device's first thermal trip point, the graph turns bright red; this typically occurs a few minutes before GearVR Service shuts the application down, and should serve as a stern warning that you should probably lower the CPU/GPU Levels.

Android Debugging

A guide to Android debugging for mobile VR development.


Adb

This document describes utilities, tips and best practices for improving debugging for any application on Android platforms. Most of these tips apply to both native and Unity applications. Android Debug Bridge (adb) is included in the Android SDK and is the main tool used to communicate with an Android device for debugging. We recommend familiarizing yourself with it by reading the official documentation located here: http://developer.android.com/tools/help/adb.html

Using adb

Using adb from the OS shell, it is possible to connect to and communicate with an Android device either directly through USB, or via TCP/IP over a WIFI connection. The Android Software Development Kit and appropriate device drivers must be installed before trying to use adb (see Device Setup on page 11 for more information).

To connect a device via USB, plug the device into the PC with a compatible USB cable. After connecting, open up an OS shell and type:

adb devices

If the device is connected properly, adb will show the device id list such as:

List of devices attached
ce0551e7        device

Adb may not be used if no device is detected. If your device is not listed, the most likely problem is that you do not have the correct Samsung USB driver - see Setting up your System to Detect your Android Device on page 15 for more information. You may also wish to try another USB cable and/or port. If no device is detected, adb commands that require a device may simply wait, reporting:

- waiting for device -

Note that depending on the particular device, detection may be finicky from time to time. In particular, on some devices, connecting while a VR app is running or while adb is waiting for a device may prevent the device from being reliably detected. In those cases, try ending the app and stopping adb using Ctrl-C before reconnecting the device. Alternatively, the adb service can be stopped using the following command, after which the adb service will be automatically restarted when executing the next command:

adb kill-server

Multiple devices may be attached at once, and this is often valuable for debugging client/server applications. Also, when connected via WIFI and also plugged in via USB, adb will show a single device as two devices. In the multiple device case, adb must be told which device to work with using the -s switch. For example, with two devices connected, the adb devices command might show:

List of devices attached
ce0551e7        device
10.0.32.101:5555        device

The listed devices may be two separate devices, or one device that is connected both via WIFI and plugged into USB (perhaps to charge the battery). In this case, all adb commands must take the form:

adb -s <device id> <command>


where <device id> is the id reported by adb devices. So, for example, to issue a logcat command to the device connected via TCP/IP:

adb -s 10.0.32.101:5555 logcat -c

and to issue the same command to the device connected via USB:

adb -s ce0551e7 logcat -c

Connecting adb via WIFI

Connecting to a device via USB is generally faster than using a TCP/IP connection, but a TCP/IP connection is sometimes indispensable, especially when debugging behavior that only occurs when the phone is placed in the Gear VR, in which case the USB connection is occupied by the Gear VR jack.

To connect via TCP/IP, first determine the IP address of the device and make sure the device is already connected via USB. You can find the IP address of the device under Settings -> About device -> Status. Then issue the following commands:

adb tcpip <port>
adb connect <ipaddress>:<port>

For example:

> adb tcpip 5555
restarting in TCP mode port: 5555
> adb connect 10.0.32.101:5555
connected to 10.0.32.101:5555

The device may now be disconnected from the USB port. As long as adb devices shows only a single device, all adb commands will be issued for the device via WIFI. To stop using the WIFI connection, issue the following adb command from the OS shell: adb disconnect
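If the device is plugged back into USB, the standard adb command adb usb switches the daemon back to USB mode (mentioned here as a convenience; it is not required for the workflow above): adb usb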

Logcat

The Android SDK provides the logcat logging utility, which is essential for determining what an application and the Android OS are doing. To use logcat, connect the Android device via USB or WIFI, launch an OS shell, and type: adb logcat

If the device is connected and detected, the log will immediately begin outputting to the shell. In most cases, this raw output is too verbose to be useful. Logcat solves this by supporting filtering by tags. To see only a specific tag, use: adb logcat -s <tag>

This example: adb logcat -s VrApp

will show only output with the “VrApp” tag.
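Several tags can be filtered at once by listing them with priorities and silencing everything else; the tag names below are only examples: adb logcat VrApp:V VrApi:V *:S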


In the native VrAppFramework code, messages can generally be printed using the LOG() macro. In most source files this is defined to pass a tag specific to that file. Log.h defines a few additional logging macros, but all resolve to calling __android_log_print().

Using Logcat to Determine the Cause of a Crash

Logcat will not necessarily be running when an application crashes. Fortunately, it keeps a buffer of recent output, and in many cases a command can be issued to logcat immediately after a crash to capture the log that includes the backtrace for the crash: adb logcat > crash.log

Simply issue the above command, give the shell a moment to copy the buffered output to the log file, and then end adb (Ctrl+C in a Windows cmd prompt or OS X Terminal prompt). Then search the log for “backtrace:” to locate the stack trace beginning with the crash. If too much time has elapsed and the log does not show the backtrace, a full dump state of the crash should still exist. Use the following command to redirect the entire dumpstate to a file: adb shell dumpstate > dumpstate.log

Copying the full dumpstate to a log file usually takes significantly longer than simply capturing the currently buffered logcat log, but it may provide additional information about the crash.
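As a small convenience (not part of the steps above), logcat's -d option dumps the current log buffer and exits immediately, which avoids having to stop adb by hand: adb logcat -d > crash.log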

Getting a Better Stack Trace

The backtrace in a logcat capture or dumpstate generally shows the function where the crash occurred, but does not provide line numbering. To get more information about a crash, the Android Native Development Kit (NDK) must be installed. When the NDK is installed, the ndk-stack utility can be used to parse the logcat log or dumpstate for more detailed information about the state of the stack. To use ndk-stack, issue the following: ndk-stack -sym <path to symbols> -dump <log file> > stack.log

For example, this command: ndk-stack -sym VrNative\Oculus360Photos\obj\local\armeabi-v7a -dump crash.log > stack.log

uses the symbol information for Oculus 360 Photos to output a more detailed stack trace to a file named stack.log, using the backtrace found in crash.log.
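ndk-stack can also be fed a live logcat stream directly, which avoids the intermediate log file; the symbol path below is the same example path used above: adb logcat | ndk-stack -sym VrNative\Oculus360Photos\obj\local\armeabi-v7a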

Application Performance Analysis

A guide to performance analysis during mobile VR application development. While this document is geared toward native application development, most of the tools presented here are also useful for improving performance in Android applications developed in Unity or other engines.


Basic Performance Stats through Logcat

A simple way to get some basic performance numbers is to use logcat with a filter for VrApi. Sample usage: adb logcat -s VrApi

A line resembling this example will be displayed every second: I/VrApi (26422): FPS=60,Prd=49ms,Tear=0,Early=60,Stale=0,VSnc=1,Lat=1,CPU4/GPU=2/1,1000/350MHz,OC=FF,TA=F0/F0/F0/ F0,SP=F/F/N/N,Mem=1026MHz,Free=1714MB,PSM=0,PLS=0,Temp=31.5C/27.

FPS: Frames Per Second. An application that performs well will continuously display 59-60 FPS.

Prd: The number of milliseconds between the latest sensor sampling for tracking and the anticipated display time of new eye images. For a single-threaded application this time will normally be 33 milliseconds. For an application with a separate renderer thread (like Unity) this time will be 49 milliseconds. New eye images are not generated for the time the rendering code is executed, but are instead generated for the time they will be displayed. When an application begins generating new eye images, the time they will be displayed is predicted. The tracking state (head orientation, et cetera) is also predicted ahead for this time.

Tears: The number of tears per second. A well-behaved and well-performing application will display zero tears. Tears can be related to Asynchronous TimeWarp, which takes the last completed eye images and warps them onto the display. The time warp runs on a high-priority thread using a high-priority OpenGL ES context. As such, the time warp should be able to preempt any application rendering and warp the latest eye images onto the display just in time for the display refresh. However, when there are a lot of heavyweight background operations, or the application renders many triangles to a small part of the screen, or uses a very expensive fragment program, then the time warp may have to wait for this work to be completed. This may result in the time warp not executing in time for the display refresh, which, in turn, may result in a visible tear line across one or both eyes.

Early: The number of frames that are completed a whole frame early.

Stale: The number of stale frames per second. A well-behaved application performing well displays zero stale frames. New eye images are generated for a predicted display time. If, however, the new eye images are not completed by this time, then the time warp may have to re-project and display a previous set of eye images. In other words, the time warp displays a stale frame. Even though the time warp re-projects old eye images to make them line up with the latest head orientation, the user may still notice some degree of intra-frame motion judder when displaying old images.

Vsnc: The value of MinimumVsyncs, which is the number of V-Syncs between displayed frames. This value directly controls the frame rate. For instance, MinimumVsyncs = 1 results in 60 FPS and MinimumVsyncs = 2 results in 30 FPS.

CPU0/GPU: The CPU and GPU clock levels and associated clock frequencies, set by the application. Lower clock levels result in less heat and less battery drain.

F/F [Thread Affinity]: This field describes the thread affinity of the main thread (first hex nibble) and renderer thread (second hex nibble). Each bit represents a core, with 1 indicating affinity and 0 indicating no affinity. For example, F/1 (= 1111 0001) indicates the main thread can run on any of the lower four cores, while the renderer thread can only run on the first core. In practice, F/F and F0/F0 are common results.

Free: The amount of available memory, displayed every second. It is important to keep a reasonable amount of memory available to prevent Android from killing backgrounded applications, like Oculus Home.

PLS: Power Level State, where "0" = normal, "1" = throttled, and "2" = undock required.


Temp: Temperature in degrees Celsius. Well-optimized applications do not cause the temperature to rise quickly. There is always room for more optimization, which allows lower clock levels to be used, which, in turn, reduces the amount of heat that is generated.

SysTrace

SysTrace is the profiling tool that comes with the Android Developer Tools (ADT) Bundle. SysTrace can record detailed logs of system activity that can be viewed in the Google Chrome browser. With SysTrace, it is possible to see an overview of what the entire system is doing, rather than just a single app. This can be invaluable for resolving scheduling conflicts or finding out exactly why an app isn’t performing as expected.

Under Windows: the simplest method for using SysTrace is to run the monitor.bat file that was installed with the ADT Bundle. This can be found in the ADT Bundle installation folder (e.g., C:\android\adt-bundle-windows-x86_64-20131030) under the sdk/tools folder. Double-click monitor.bat to start Android Debug Monitor.

Select the desired device in the left-hand column and click the icon highlighted in red above to toggle Systrace logging. A dialog will appear enabling selection of the output .html file for the trace. Once the trace is toggled off, the trace file can be viewed by opening it up in Google Chrome. You can use the WASD keys to pan and zoom while navigating the HTML doc. For additional keyboard shortcuts, please refer to the following documentation: http://developer.android.com/tools/help/systrace.html
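If you prefer the command line to the Android Debug Monitor UI, the same kind of trace can usually be captured with the systrace.py script that ships in the SDK's platform-tools/systrace folder; the category names and trace length below are only examples and may vary by SDK version: python systrace.py --time=10 -o mytrace.html sched gfx view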

NDK Profiler

Use NDK Profiler to generate gprof-compatible profile information. The Android NDK profiler is a port of gprof for Android. The latest version, which has been tested with this release, is 3.2 and can be downloaded from the following location: https://code.google.com/p/android-ndk-profiler/

Once downloaded, unzip the package contents to your NDK sources path, e.g.: C:\Dev\Android\android-ndk-r9c\sources


Add the NDK pre-built tools to your PATH, e.g.: C:\Dev\Android\android-ndk-r9c\toolchains\arm-linux-androideabi-4.8\prebuilt\windows-x86_64\bin

Android Makefile Modifications

1. Compile with profiling information and define NDK_PROFILE: LOCAL_CFLAGS := -pg -DNDK_PROFILE

2. Link with the ndk-profiler library LOCAL_STATIC_LIBRARIES := android-ndk-profiler

3. Import the android-ndk-profiler module $(call import-module,android-ndk-profiler)

Source Code Modifications

Add calls to monstartup and moncleanup to your Init and Shutdown functions. Do not call monstartup or moncleanup more than once during the lifetime of your app.

#if defined( NDK_PROFILE )
extern "C" void monstartup( char const * );
extern "C" void moncleanup();
#endif

extern "C"
{
void Java_oculus_VrActivity2_nativeSetAppInterface( JNIEnv * jni, jclass clazz )
{
#if defined( NDK_PROFILE )
    setenv( "CPUPROFILE_FREQUENCY", "500", 1 );    // interrupts per second, default 100
    monstartup( "libovrapp.so" );
#endif
    app->Init();
}

void Java_oculus_VrActivity2_nativeShutdown( JNIEnv *jni )
{
    app->Shutdown();
#if defined( NDK_PROFILE )
    moncleanup();
#endif
}
} // extern "C"

Manifest File Changes

You will need to add permission for your app to write to the SD card. The gprof output file is saved in /sdcard/gmon.out.
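For example, the standard Android permission for writing to external storage can be declared in AndroidManifest.xml as follows (shown here as an illustration; the original text does not spell out the exact element):

<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />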

Profiling your App

To generate profiling data, run your app and trigger the moncleanup function call by pressing the Back button on your phone. Based on the state of your app, this will be triggered by OnStop() or OnDestroy(). Once moncleanup has been triggered, the profiling data will be written to your Android device at /sdcard/gmon.out.


Copy the gmon.out file to the folder where your project is located on your PC using the following command: adb pull /sdcard/gmon.out

To view the profile information, run the gprof tool, passing to it the non-stripped library, e.g.:

arm-linux-androideabi-gprof obj/local/armeabi/libovrapp.so
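gprof reads gmon.out from the current directory by default; a full invocation that names the profile data explicitly and writes the report to a file might look like the following (the output file name is illustrative): arm-linux-androideabi-gprof obj/local/armeabi/libovrapp.so gmon.out > profile.txt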

For information on interpreting the gprof profile information, see the following: http://sourceware.org/binutils/docs/gprof/Output.html

Rendering Performance: Tracer for OpenGL ES

Use Tracer for OpenGL ES to capture OpenGL ES calls for an application. Tracer is a profiling tool that comes with the ADT Bundle.

1. Under Windows: the simplest method for using Tracer is to run the monitor.bat file that is installed with the ADT Bundle. This can be found in the ADT Bundle installation folder (e.g., C:\android\adt-bundle-windows-x86_64-20131030) under the sdk/tools folder. Just double-click monitor.bat to start Android Debug Monitor.
2. Go to Windows -> Open Perspective… and select Tracer for OpenGL ES.
3. Click the Trace Capture button shown below.

Figure 9: Tracer for OpenGL ES

4. Fill in the Trace Options and select Trace.

Note: A quick way to find the package name and Activity is to launch your app with logcat running. The Package Manager getActivityInfo line will display the information, e.g., com.Oculus.Integration/com.unity3d.player.UnityPlayerNativeActivity.


Note: Selecting any Data Collection Options may cause the trace to become very large and slow to capture. Tracer will launch your app and begin to capture frames.

5. Enable the following options:
• Collect Framebuffer contents on eglSwapBuffers()
• Collect Framebuffer contents on glDraw*()
6. Once you have collected enough frames, choose Stop Tracing.


7. Select the button highlighted above in red to import the .gltrace file you just captured. This can be a lengthy process. Once the trace is loaded, you can view GL commands for every captured frame.

Android Studio Native Debugging

This guide provides basic recommendations for working with the Oculus Mobile SDK in Android Studio and Gradle, and is intended to supplement the relevant Android Studio documentation.

Native Debugging with Android Studio

This section introduces debugging our sample native apps in Android Studio. Note: Native support in Android Studio is still under development. Some developers have reported problems using run-as with the Note 4 with Lollipop (5.0.x) and S6 with 5.0.0, which may cause issues with debugging. If you have problems debugging, try updating to the latest system software, and please let us know on the Oculus Forums. The default configurations created during project import only support Java debugging. Select Edit Configurations… in the Configurations drop-down menu in the Android Studio toolbar.

Create a new Android Native configuration as shown below:


In the General tab of the Run/Debug Configuration dialog box, assign your configuration a name, select the target module, and select the target device mode:

In the Native tab of the Run/Debug Configuration dialog box, add symbol paths:


Note that ndk-build places stripped libraries inside the libs/ directory. You must point the symbol search paths at the obj/local/ directory. This is also not a recursive search path, so you must put the full path to the obj/local/armeabi-v7a directory.


Native Debugging with ndk-gdb

This guide provides basic recommendations for using ndk-gdb to debug native mobile VR projects, and is intended to supplement the relevant Android Studio documentation.

The Android NDK includes a powerful debugging tool called ndk-gdb, a small shell script wrapped around GDB. Using ndk-gdb from the command line adds convenient features to your debugging workflow, such as setting breakpoints, stepping through code, and inspecting variables.

Create Breakpoints

Break On Function
(gdb) break SomeFunctionName()

or (gdb) break SomeClass::SomeMethod()

Example usage: (gdb) break OVR::VrCubeWorld::Frame(OVR::VrFrame const&) Breakpoint 2 at 0xf3f56118: file jni/../../../Src/VrCubeWorld_Framework.cpp, line 292.

Note: GDB supports tab completion for symbol names.

Break On File:Line
(gdb) break SomeFile.cpp:256
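Alongside setting breakpoints, GDB's standard management commands are often useful (these are generic GDB commands, not specific to ndk-gdb):
(gdb) info breakpoints
(gdb) delete 2
(gdb) disable 3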

Conditional Breakpoints

Append an if condition to the end of your break command. Example usage:

(gdb) break OVR::VrCubeWorld::Frame(OVR::VrFrame const&) if vrFrame.PredictedDisplayTimeInSeconds > 24250.0
Breakpoint 6 at 0xf3f58118: file jni/../../../Src/VrCubeWorld_Framework.cpp, line 292.


Break At Current Execution

When an application is actively running, press Ctrl-C to break immediately and bring up the gdb prompt.

Stepping

Step Over
(gdb) next


or (gdb) n

Step Into (gdb) step

or (gdb) s

Continue Execution (gdb) continue

or (gdb) c

Printing

Print Struct

You may enable pretty printing mode to add new lines between struct elements (optional): (gdb) set print pretty on

To print the struct: (gdb) print SomeStructVariable

Example usage: (gdb) print currentRotation $1 = { x = 23185.9961, y = 23185.9961, z = 0, static ZERO = { x = 0, y = 0, z = 0, static ZERO = } }

printf Example usage: (gdb) printf "x = %f\n", currentRotation.x x = 23185.996094

Breakpoint Commands

GDB breakpoints may be set to automatically execute arbitrary commands upon being hit. This feature is useful for inserting print statements into your code without recompiling, or even for modifying data at key points in your program without halting execution entirely. It is controlled by the commands command.


You may specify the breakpoint number with an optional single argument; if omitted, the final breakpoint is used. You are then presented with a series of lines beginning with > where you may enter GDB commands (one per line) that will be executed upon hitting the breakpoint, until the command sequence is terminated with end. In the following example, we create a breakpoint that automatically prints the value of a local variable and then continues execution upon being hit:

(gdb) break OVR::VrCubeWorld::Frame(OVR::VrFrame const&)
Breakpoint 1 at 0xf3f56118: file jni/../../../Src/VrCubeWorld_Framework.cpp, line 292.
(gdb) commands
Type commands for breakpoint(s) 1, one per line.
End with a line saying just "end".
>silent
>printf "time = %f\n", vrFrame.PredictedDisplayTimeInSeconds
>continue
>end

Text User Interface (TUI)

To enable or disable the TUI, press CTRL+x followed by CTRL+a. For more information on TUI keyboard shortcuts, see https://sourceware.org/gdb/onlinedocs/gdb/TUI-Keys.html.


Mobile Native SDK Migration Guide

This section details migrating from earlier versions of the Mobile SDK for native development.

Migrating to Mobile SDK 1.0.3

This section is intended to help you upgrade from the Oculus Mobile SDK version 1.0.0 to 1.0.3.

Overview

Mobile SDK 1.0.3 provides support for multi-view rendering. In order for your application to take advantage of multi-view, you will need to restructure your application rendering to return a list of render surfaces from Frame instead of relying on the DrawEyeView path. Information regarding how to set up your application rendering for multi-view rendering can be found in “Restructuring App Rendering For Multi-view” below.

Note: The DrawEyeView render path is deprecated and will be removed in a future SDK update.

To make the VrApi more explicit and to make integration with heavily threaded engines easier, the EGL objects and ANativeWindow that are used by the VrApi can now be explicitly passed through the API. The VrAppInterface has been refactored to simplify the interface, support multi-view rendering, and enforce per-frame determinism. Build steps have been moved from the Python build script into Gradle.

The following sections provide guidelines for updating your native project to the 1.0.3 Mobile SDK. The native SDK samples which ship with the SDK are also a good reference for reviewing required changes (see VrSamples/Native).

Build System Changes

The SDK now builds using GCC 4.9. For help porting your own code to GCC 4.9, refer to https://gcc.gnu.org/gcc-4.9/porting_to.html.

The SDK now builds using Android SDK Build Tools Revision 23.0.1. All Gradle files specifying buildToolsVersion 22.0.1 should be revised to buildToolsVersion 23.0.1.

Applications that build using build.py (including all projects in VrSamples/Native) will need their build configurations edited to include VrApp.gradle:

1. In the project-level build.gradle (e.g., VrCubeWorld_NativeActivity/Projects/Android/build.gradle), add the following line to the top of the file: apply from: "${rootProject.projectDir}/VrApp.gradle"
2. Delete the brace-enclosed sections beginning with project.afterEvaluate and android.applicationVariants.all.

Make the following changes to all project-level build.gradle files:

1. In the brace-enclosed section marked android, add the following line: project.archivesBaseName = ""

2. Make sure the brace-enclosed section marked android also contains the following: defaultConfig { applicationId "" }

Mobile | Mobile Native SDK Migration Guide | 115

3. In the brace-enclosed section marked dependencies, replace the line: compile name: 'VrAppFramework', ext: 'aar'

with the line: compile project(':VrAppFramework:Projects:AndroidPrebuilt')

and add the following path to your settings.gradle include list: 'VrAppFramework:Projects:AndroidPrebuilt'
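Taken together, a project-level build.gradle after these edits might look roughly like the sketch below; the archive and package names are placeholders, and a real project will typically contain additional settings:

apply from: "${rootProject.projectDir}/VrApp.gradle"

android {
    project.archivesBaseName = "myvrapp"            // placeholder name
    buildToolsVersion '23.0.1'
    defaultConfig {
        applicationId "com.example.myvrapp"         // placeholder package
    }
}

dependencies {
    compile project(':VrAppFramework:Projects:AndroidPrebuilt')
}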

See the VrTemplate gradle files for an example.

VrApi Changes

VrApi now displays volume change overlays automatically. This behavior may be overridden if necessary by setting VRAPI_FRAME_FLAG_INHIBIT_VOLUME_LAYER as an ovrFrameParm flag.

Note: When using the VrApi volume notifier, make sure that you do not render your own!

The ANativeWindow used by the VrApi can now be explicitly passed through the API by specifying the following ovrModeParm flag: parms.Flags |= VRAPI_MODE_FLAG_NATIVE_WINDOW;

Note: While the VrApi currently continues to support the old behavior of not passing EGL objects explicitly with threading restrictions, this functionality will be removed in a future version of VrApi.

ovrModeParms AllowPowerSave is now specified as an ovrModeParm flag: parms.Flags |= VRAPI_MODE_FLAG_ALLOW_POWER_SAVE

ovrFrameLayer ProgramParms are now specified explicitly.

LibOVRKernel Changes

ovrPose member names have changed:
Orientation -> Rotation
Position -> Translation

Size member names have changed:
Width -> w
Height -> h

The Math constants are deprecated in favor of the new MATH_FLOAT_ and MATH_DOUBLE defines.

VrAppFramework Changes

VrAppInterface Restructuring

VrAppInterface has been refactored to simplify the interface, support multi-view, and enforce per-frame determinism.


The VrAppInterface::Frame() function signature has changed. Frame() now takes an ovrFrameInput structure that contains all of the per-frame state information needed for an application. It returns an ovrFrameResult structure that should contain all of the state information produced by a single application frame. In particular, ovrFrameInput now contains key state changes and ovrFrameResult must return a list of surfaces to be rendered. When multi-view rendering is enabled, this list of surfaces is submitted only once, but rendered for both eyes. Prior to multi-view support, all surfaces were both submitted and rendered twice each frame. At a minimum, Frame() must return a frame result with the center view matrix as follows:

ovrFrameResult res;
res.FrameMatrices.CenterView = Scene.GetCenterEyeViewMatrix();
return res;

As a result of the changes to support multi-view, DrawEyeView() is now deprecated. OneTimeInit and NewIntent have been removed from the interface. Android intents are now passed into EnteredVrMode() along with an intentType flag. On the first entry into EnteredVrMode() after the application’s main Activity is created, intentType will be INTENT_LAUNCH. If the application is re-launched while the main Activity is already running (normally when a paused application is resumed) with a new intent, intentType will be INTENT_NEW. If the application was resumed without a new intent, intentType will be INTENT_OLD.

Applications that implemented OneTimeInit() and NewIntent() must now call OneTimeInit() and NewIntent() explicitly from their overloaded EnteredVrMode() method instead. In general, this means:

• When intentType is INTENT_LAUNCH, call OneTimeInit(), then NewIntent().
• When intentType is INTENT_NEW, call NewIntent() only.
• When intentType is INTENT_OLD, do not call OneTimeInit() or NewIntent().

OneTimeShutdown() has been removed from VrAppInterface. Application shutdown code must now be called from the destructor of the VrAppInterface derived class.

OnKeyEvent() was removed from VrAppInterface to allow all input events to be passed through the ovrFrameInput structure. This reduces the number of ways application code receives events to just ovrFrameInput. This requires each application to implement its own code to dispatch key events to OnKeyEvent(). The application’s existing OnKeyEvent() can remain intact and is called from the beginning of Frame() as follows:

// process input events first because this mirrors the behavior when OnKeyEvent was
// a virtual function on VrAppInterface and was called by VrAppFramework.
for ( int i = 0; i < vrFrame.Input.NumKeyEvents; i++ )
{
    const int keyCode = vrFrame.Input.KeyEvents[i].KeyCode;
    const int repeatCount = vrFrame.Input.KeyEvents[i].RepeatCount;
    const KeyEventType eventType = vrFrame.Input.KeyEvents[i].EventType;

    if ( OnKeyEvent( keyCode, repeatCount, eventType ) )
    {
        continue;   // consumed the event
    }
    // If nothing consumed the key and it's a short-press of the back key,
    // then exit the application to OculusHome.
    if ( keyCode == OVR_KEY_BACK && eventType == KEY_EVENT_SHORT_PRESS )
    {
        app->StartSystemActivity( PUI_CONFIRM_QUIT );
        continue;
    }
}

LeavingVrMode() has been added. This will be called any time the application leaves VR mode, i.e., whenever the application is paused, stopped, or destroyed.

App

CreateToast has been removed. Instead, App::ShowInfoText can be used for displaying debug text.


AppLocal

DrawScreenMask() and OverlayScreenFadeMaskProgram are no longer provided and should be implemented by the application which requires it. See VrSamples/Native/CinemaSDK for an example.

ovrDrawSurface

ovrDrawSurface has been refactored and now contains a matrix instead of a pointer to a matrix. The joints member was removed and joints must now be specified explicitly in the GlProgram uniform parms as a uniform buffer object.

ovrMaterialDef

The ovrMaterialDef interface has been merged into ovrGraphicsCommand and is marked for deprecation.
ovrMaterialDef programObject -> ovrGraphicsCommand Program.Program
ovrMaterialDef gpuState -> ovrGraphicsCommand GpuState

ovrSurfaceDef

The materialDef member has been replaced by graphicsCommand. The cullingBounds member was removed from ovrSurfaceDef. GlGeometry now calculates and stores a localBounds.

GlGeometry

GlGeometry now calculates a localBounds on Create in order to guarantee that all geometry submitted to the renderer has valid bounds. The BuildFadedScreenMask() function is no longer provided and should be implemented by the application which requires it. See VrSamples/Native/CinemaSDK for an example.

GlProgram

GlProgram has undergone significant refactoring for multi-view support:

Internal data members were renamed so that the first letter of each member is capitalized. For example, program is now Program.

GlProgram now defaults to building shader programs with version 300 to support multi-view. There is one exception: if a shader requires the use of image_external (typically used for video rendering), the shader may be built with v100 due to driver incompatibility with image_external and version 300. All drivers which fully support multi-view will support image_external with version 300.

Any program which uses image_external will need to make the following change so that the program is compatible with both v100 and v300. Change:

#extension GL_OES_EGL_image_external : require

To:

#extension GL_OES_EGL_image_external : enable
#extension GL_OES_EGL_image_external_essl3 : enable

For more information regarding version 300, see: https://www.khronos.org/registry/gles/specs/3.0/GLSL_ES_Specification_3.00.3.pdf

Shader directives (extensions, optimizations, etc.) must now be specified separately from the main shader source. See VrSamples/Native/CinemaSDK for an example.


As part of multi-view support, uniformMvp is no longer part of GlProgram. To be multi-view compliant, apps must remove usage of Mvpm and instead use TransformVertex() for calculating the projected position, i.e.: gl_Position = TransformVertex( Position );

Note: The TransformVertex function is provided in a default system header block which is added to every GlProgram.

GlProgram now provides explicit support for uniform parm setup. The old path of relying on a small subset of hard-coded system-level uniforms is now deprecated and will be removed in a future SDK. An example of setting up uniform parms with the new path is provided below.

GlTexture

GlTexture was changed to require the width and height of textures. In order to enforce this requirement, the GlTexture constructors were changed and a GlTexture can no longer be implicitly constructed from an integer.

Input

Application input handling was changed so that input is now passed into the application’s Frame() function via the ovrFrameInput structure. Please see the section on VrAppInterface restructuring for an explanation and example code. VrAppFramework now also handles Android key code 82. A short-press of key code 82 opens the Universal Menu, while a long-press goes directly to Home.

Texture Manager

VrAppFramework now implements a Texture Manager. This allows VrGUI objects to share textures across surfaces, which in turn can make for significant load-time improvements for applications using VrGUI. At present, textures are not reference counted and will not be freed automatically when VrGUI objects are released. Projects that load significant numbers of textures onto VrGUI surfaces may need to use the texture manager to free texture memory explicitly. New instances of ovrTextureManager can be created by the application to manage separate texture memory pools.

VrAppSupport

VrGui

Individual instances of VrGUI components can now have names. GetComponentByName() was changed to GetComponentByTypeName() and a new template function, GetComponentByName(), exists for retrieving a VRMenuObject’s components by name. Because the new GetComponentByName() signature is different, old code calling GetComponentByName() will not compile until it is changed to GetComponentByTypeName().

The OvrGuiSys::RenderEyeView() function interface has changed from:

GuiSys->RenderEyeView( Scene.GetCenterEyeViewMatrix(), viewMatrix, projectionMatrix );

To:

GuiSys->RenderEyeView( Scene.GetCenterEyeViewMatrix(), viewMatrix, projectionMatrix, app->GetSurfaceRender() );

Note: This functionality is specific to the DrawEyeView render path and will be removed in a future SDK.


To support multi-view, VrGUI now provides an interface call for adding all gui surfaces to the application’s render surface list: GuiSys->AppendSurfaceList( Scene.GetCenterEyeViewMatrix(), &res.Surfaces );

VrLocale

The ovrLocale::Create function signature has changed from:

Locale = ovrLocale::Create( *app, "default" );

To: Locale = ovrLocale::Create( *java->Env, java->ActivityObject, "default" );

VrModel

The SceneView::DrawEyeView() function signature has changed from:

Scene.DrawEyeView( eye, fovDegreesX, fovDegreesY );

To: Scene.DrawEyeView( eye, fovDegreesX, fovDegreesY, app->GetSurfaceRender() );

Restructuring App Rendering For Multi-view

In order to set up your rendering path to be multi-view compliant, your app should specify a list of surfaces and render state back to App Frame(). Immediate GL calls inside the app main render pass are not compatible with multi-view rendering and are not allowed.

The first section below describes how to transition your app from rendering with DrawEyeView to instead returning a list of surfaces back to the application framework. The second section describes multi-view rendering considerations and how to enable it in your app.

Return Surfaces From Frame

Set up the Frame Result: Apps should set up the ovrFrameResult which is returned by Frame with the following steps:

1. Set up the ovrFrameParms - storage for which should be maintained by the application.
2. Set up the FrameMatrices - this includes the CenterEye and the View and Projection matrices for each eye.
3. Generate a list of render surfaces and append to the frame result Surfaces list.
   a. Note: The surface draw order will be the order of the list, from lowest index (0) to highest index.
   b. Note: Do not free any resources which surfaces in the list rely on while the Frame render is in flight.
4. Optionally, specify whether to clear the color or depth buffer with clear color.

OvrSceneView Example

An example using the OvrSceneView library scene matrices and surface generation follows:

ovrFrameResult OvrApp::Frame( const ovrFrameInput & vrFrame )
{
    ...


    // fill in the frameresult info for the frame.
    ovrFrameResult res;

    // Let scene construct the view and projection matrices needed for the frame.
    Scene.GetFrameMatrices( vrFrame.FovX, vrFrame.FovY, res.FrameMatrices );

    // Let scene generate the surface list for the frame.
    Scene.GenerateFrameSurfaceList( res.FrameMatrices, res.Surfaces );

    // Initialize the FrameParms.
    FrameParms = vrapi_DefaultFrameParms( app->GetJava(), VRAPI_FRAME_INIT_DEFAULT, vrapi_GetTimeInSeconds(), NULL );
    for ( int eye = 0; eye < VRAPI_FRAME_LAYER_EYE_MAX; eye++ )
    {
        FrameParms.Layers[0].Textures[eye].ColorTextureSwapChain = vrFrame.ColorTextureSwapChain[eye];
        FrameParms.Layers[0].Textures[eye].DepthTextureSwapChain = vrFrame.DepthTextureSwapChain[eye];
        FrameParms.Layers[0].Textures[eye].TextureSwapChainIndex = vrFrame.TextureSwapChainIndex;
        FrameParms.Layers[0].Textures[eye].TexCoordsFromTanAngles = vrFrame.TexCoordsFromTanAngles;
        FrameParms.Layers[0].Textures[eye].HeadPose = vrFrame.Tracking.HeadPose;
    }
    FrameParms.ExternalVelocity = Scene.GetExternalVelocity();
    FrameParms.Layers[0].Flags = VRAPI_FRAME_LAYER_FLAG_CHROMATIC_ABERRATION_CORRECTION;

    res.FrameParms = (ovrFrameParmsExtBase *) & FrameParms;
    return res;
}

Custom Rendering Example

First, you need to make sure any immediate GL render calls are represented by an ovrSurfaceDef. In the DrawEyeView path, custom surface rendering was typically done by issuing immediate GL calls:

glActiveTexture( GL_TEXTURE0 );
glBindTexture( GL_TEXTURE_2D, BackgroundTexId );
glDisable( GL_DEPTH_TEST );
glDisable( GL_CULL_FACE );
GlProgram & prog = BgTexProgram;
glUseProgram( prog.Program );
glUniform4f( prog.uColor, 1.0f, 1.0f, 1.0f, 1.0f );
globeGeometry.Draw();
glUseProgram( 0 );
glActiveTexture( GL_TEXTURE0 );
glBindTexture( GL_TEXTURE_2D, 0 );

Instead, with the multi-view compliant path, an ovrSurfaceDef and GlProgram would be defined at initialization time as follows:

static ovrProgramParm BgTexProgParms[] =
{
    { "Texm", ovrProgramParmType::FLOAT_MATRIX4 },
    { "UniformColor", ovrProgramParmType::FLOAT_VECTOR4 },
    { "Texture0", ovrProgramParmType::TEXTURE_SAMPLED },
};
BgTexProgram = GlProgram::Build( BgTexVertexShaderSrc, BgTexFragmentShaderSrc,
    BgTexProgParms, sizeof( BgTexProgParms ) / sizeof( ovrProgramParm ) );

GlobeSurfaceDef.surfaceName = "Globe";
GlobeSurfaceDef.geo = BuildGlobe();
GlobeSurfaceDef.graphicsCommand.Program = BgTexProgram;
GlobeSurfaceDef.graphicsCommand.GpuState.depthEnable = false;
GlobeSurfaceDef.graphicsCommand.GpuState.cullEnable = false;
GlobeSurfaceDef.graphicsCommand.UniformData[0].Data = &BackGroundTexture;
GlobeSurfaceDef.graphicsCommand.UniformData[1].Data = &GlobeProgramColor;

At Frame time, the uniform values can be updated, changes to the gpustate can be made, and the surface(s) added to the render surface list.


Note: This manner of uniform parm setting requires the application to maintain storage for the uniform data. Future SDKs will provide helper functions for setting up uniform parms and materials.

An example of setting up FrameResult using custom rendering follows:

ovrFrameResult OvrApp::Frame( const ovrFrameInput & vrFrame )
{
    ...
    // fill in the frameresult info for the frame.
    ovrFrameResult res;

    // calculate the scene matrices for the frame.
    res.FrameMatrices.CenterView = vrapi_GetCenterEyeViewMatrix( &app->GetHeadModelParms(), &vrFrame.Tracking, NULL );
    for ( int eye = 0; eye < VRAPI_FRAME_LAYER_EYE_MAX; eye++ )
    {
        res.FrameMatrices.EyeView[eye] = vrapi_GetEyeViewMatrix( &app->GetHeadModelParms(), &CenterEyeViewMatrix, eye );
        res.FrameMatrices.EyeProjection[eye] = ovrMatrix4f_CreateProjectionFov( vrFrame.FovX, vrFrame.FovY, 0.0f, 0.0f, 1.0f, 0.0f );
    }

    // Update uniform variables and add needed surfaces to the surface list.
    BackGroundTexture = GlTexture( BackgroundTexId, 0, 0 );
    GlobeProgramColor = Vector4f( 1.0f, 1.0f, 1.0f, 1.0f );
    res.Surfaces.PushBack( ovrDrawSurface( &GlobeSurfaceDef ) );

    // Initialize the FrameParms.
    FrameParms = vrapi_DefaultFrameParms( app->GetJava(), VRAPI_FRAME_INIT_DEFAULT, vrapi_GetTimeInSeconds(), NULL );
    for ( int eye = 0; eye < VRAPI_FRAME_LAYER_EYE_MAX; eye++ )
    {
        FrameParms.Layers[0].Textures[eye].ColorTextureSwapChain = vrFrame.ColorTextureSwapChain[eye];
        FrameParms.Layers[0].Textures[eye].DepthTextureSwapChain = vrFrame.DepthTextureSwapChain[eye];
        FrameParms.Layers[0].Textures[eye].TextureSwapChainIndex = vrFrame.TextureSwapChainIndex;
        FrameParms.Layers[0].Textures[eye].TexCoordsFromTanAngles = vrFrame.TexCoordsFromTanAngles;
        FrameParms.Layers[0].Textures[eye].HeadPose = vrFrame.Tracking.HeadPose;
    }
    FrameParms.ExternalVelocity = Scene.GetExternalVelocity();
    FrameParms.Layers[0].Flags = VRAPI_FRAME_LAYER_FLAG_CHROMATIC_ABERRATION_CORRECTION;

    res.FrameParms = (ovrFrameParmsExtBase *) & FrameParms;
    return res;
}

Specify the Render Mode: In your app Configure(), specify the appropriate render mode. To configure the app to render using the surfaces returned by Frame, set the following: settings.RenderMode = RENDERMODE_STEREO;

Note: A debug render mode option has been provided which alternates rendering between the deprecated DrawEyeView path and the new path for returning a render surface list from Frame. This aids the conversion process, making it easier to spot discrepancies between the two paths. settings.RenderMode = RENDERMODE_DEBUG_ALTERNATE_STEREO;

Multi-view Render Path

Before enabling the multi-view rendering path, you will want to make sure your render data is multi-view compatible. This involves:

Position Calculation


App render programs should no longer specify Mvpm directly and should instead calculate gl_Position using the system-provided TransformVertex() function, which accounts for the correct view and projection matrix for the current viewID.

Per-Eye View Calculations

Apps will need to take into consideration per-eye-view calculations. Examples follow:

Per-Eye Texture Matrices: In the DrawEyeView path, the texture matrix for the specific eye was bound at the start of each eye render pass. For multi-view, an array of texture matrices indexed by VIEW_ID should be used. Note: Due to a driver issue with the Adreno 420, KitKat, and version 300 programs, a uniform array of matrices should be contained inside a uniform buffer object.

Stereo Images: In the DrawEyeView path, the image specific to the eye was bound at the start of each eye render pass. For multi-view, while an array of textures indexed by VIEW_ID would be preferable, Android KitKat does not support the use of texture arrays. Instead, both textures can be specified in the fragment shader and the selection determined by the VIEW_ID.

External Image Usage

Applications which make use of image_external, i.e., video rendering applications, must take care when constructing image_external shader programs. Not all drivers support image_external as version 300. The good news is that drivers which fully support multi-view will support image_external in the version 300 path, which means image_external programs will work correctly when the multi-view path is enabled. However, for drivers which do not fully support multi-view, these shaders will be compiled as version 100. These shaders must continue to work in both paths, i.e., version 300 only constructs should not be used, and the additional extension specification requirements, listed above, should be followed. For some cases, the cleanest solution may be to only use image_external during Frame to copy the contents of the external image to a regular texture2d which is then used in the main app render pass (although this could eat into the multi-view performance savings).

Enable Multi-view Rendering

Finally, to enable the multi-view rendering path, set the render mode in your app Configure() to the following: settings.RenderMode = RENDERMODE_MULTIVIEW;

Migrating to Mobile SDK 1.0.0

This section is intended to help you upgrade from the 0.6.2 SDK to 1.0.0.

VrApi Changes

The function vrapi_Initialize now returns an ovrInitializeStatus. It is important to verify that the initialization is successful by checking that VRAPI_INITIALIZE_SUCCESS is returned.

The function vrapi_GetHmdInfo has been replaced with vrapi_GetSystemPropertyInt and vrapi_GetSystemPropertyFloat. These functions use the ovrSystemProperty enumeration to get the individual properties that were previously returned on the ovrHmdInfo structure.
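As a sketch of the initialization check mentioned above (assuming the 1.0-era entry points vrapi_DefaultInitParms and vrapi_Initialize; consult VrApi.h in your SDK for the exact signatures), the startup code might look like:

// 'Java' is the application's ovrJava struct, assumed to be filled in already.
const ovrInitParms initParms = vrapi_DefaultInitParms( &Java );
if ( vrapi_Initialize( &initParms ) != VRAPI_INITIALIZE_SUCCESS )
{
    // Initialization failed; the application cannot enter VR mode.
    abort();
}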


The functions ovr_DeviceIsDocked, ovr_HeadsetIsMounted, ovr_GetPowerLevelStateThrottled, and ovr_GetPowerLevelStateMinimum have been replaced with vrapi_GetSystemStatusInt and vrapi_GetSystemStatusFloat. These functions use the ovrSystemStatus enumeration to select the individual status to be queried. These functions may now also be used to query various performance metrics.

The other functions from VrApi_Android.h wrapped Android functionality or dealt with launching or returning from System Activities (Universal Menu, et cetera). These functions were removed from VrApi because they are not considered part of the core minimal API for VR rendering. The functions to get/set brightness, comfort mode, and do-not-disturb mode have been removed. The other functions are now available through the VrAppSupport/SystemUtils library. See the VrAppSupport Changes section for more detailed information about using the SystemUtils library.

The layer textures are passed to vrapi_SubmitFrame() as "texture swap chains" (ovrTextureSwapChain). These texture swap chains are allocated through vrapi_CreateTextureSwapChain(). It is important to allocate these textures through the VrApi to allow them to be allocated in special system memory. When using a static layer texture, the texture swap chain does not need to be buffered and the chain only needs to hold a single texture. When the layer textures are dynamically updated, the texture swap chain needs to be buffered. When the texture swap chain is passed to vrapi_SubmitFrame(), the application also passes in the chain index to the most recently updated texture.

The behavior of the TimeWarp compositor is no longer specified by selecting a warp program with a predetermined composition layout. The ovrFrameLayers in ovrFrameParms now determine how composition is performed. The application must specify the number of layers to be composited by setting ovrFrameParms::LayerCount, and the application must initialize each ovrFrameLayer in ovrFrameParms::Layers to achieve the desired compositing. See CinemaSDK in the SDK native samples for an example of how to set up and configure layer composition.

The ovrFrameOption enumeration has been renamed to ovrFrameFlags, and the VRAPI_FRAME_OPTION_* flags have been renamed to VRAPI_FRAME_FLAG_*. The VRAPI_FRAME_OPTION_INHIBIT_CHROMATIC_ABERRATION_CORRECTION flag has been replaced with VRAPI_FRAME_LAYER_FLAG_CHROMATIC_ABERRATION_CORRECTION, which is now set on ovrFrameLayer::Flags. Note that chromatic aberration correction is now disabled by default. The VRAPI_FRAME_LAYER_FLAG_CHROMATIC_ABERRATION_CORRECTION flag must be set on each layer that needs chromatic aberration correction. Only enable chromatic aberration correction on the layers that need it, to maximize performance and minimize power draw.

The ovrFrameLayer::WriteAlpha and ovrFrameLayer::FixedToView booleans have been replaced with the ovrFrameLayer::Flags VRAPI_FRAME_LAYER_FLAG_WRITE_ALPHA and VRAPI_FRAME_LAYER_FLAG_FIXED_TO_VIEW, respectively.

The ovrFrameLayerTexture::TextureRect member now affects composition. When there are opportunities to eliminate fragment shading work in the compositor, regions outside the TextureRect may be skipped. However, rendering must still work correctly even if the regions are not skipped, because in some cases regions outside the rect must still be rendered.

VrApi now allows several OpenGL objects to be explicitly passed through vrapi_EnterVrMode and vrapi_SubmitFrame.
The structure ovrModeParms now allows the Display, WindowSurface and ShareContext to be passed explicitly to vrapi_EnterVrMode. If these OpenGL objects are explicitly set on the ovrModeParms, then it is not necessary to call vrapi_EnterVrMode from a thread that has an OpenGL context current on the active Android window surface. If these objects are not explicitly set on ovrModeParms (i.e., they are set to the default value of 0), then vrapi_EnterVrMode will behave the same way it used to, and vrapi_EnterVrMode must be called from a thread that has an OpenGL context current on the active Android window surface. The ovrFrameLayerTexture structure, as part of the ovrFrameParms, now allows a CompletionFence to be passed explicitly to vrapi_SubmitFrame. If this OpenGL object is explicitly set on all layer textures, then it


is not necessary to call vrapi_SubmitFrame from the thread with the OpenGL context current that was used to render the eye images. If this object is not explicitly set on all layer textures, then vrapi_SubmitFrame will behave the same way it used to, and vrapi_SubmitFrame must be called from the thread with the OpenGL context current that was used to render the eye images.

VrAppSupport Changes

The support modules in VrAppSupport are now available as prebuilt libraries.

SystemUtils Library

All Gear VR application interactions with the System Activities application are managed through the SystemUtils library under VrAppSupport, which contains the System Activities API. All native applications must now link in this VrAppSupport/SystemUtils library by including the following line in the Android.mk file and adding the SystemUtils.aar (jar additionally provided) as a build dependency:

$(call import-module,VrAppSupport/SystemUtils/Projects/AndroidPrebuilt/jni)

In order to utilize the SystemUtils library, applications must do the following:

• Call SystemActivities_Init() when the application initializes.
• Call SystemActivities_Update() to retrieve a list of pending System Activities events.
• The application may then handle any events in this list that it wishes to process. The application is expected to call SystemActivities_RemoveAppEvent() to remove any events that it handled.
• Call SystemActivities_PostUpdate() during the application’s frame update.
• Call SystemActivities_Shutdown() when the application is being destroyed.

Gear VR applications may use this API to send a specially formatted launch intent when launching the System Activities application. All applications may start System Activities in the Universal Menu and Exit to Home menu by calling SystemActivities_StartSystemActivity() with the appropriate PUI_ command. See the SystemActivities.h header file in the VrAppSupport/SystemUtils/Include folder for more details, and for a list of available commands.

In theory, Gear VR applications may receive messages from System Activities at any time, but in practice these are only sent when System Activities returns to the app that launched it. These messages are handled via SystemActivities_Update() and SystemActivities_PostUpdate(). When Update is called, it returns an array of pending events. Applications may handle or consume events in this events array. If an application wishes to consume an event, it must be removed from the events array using SystemActivities_RemoveAppEvent(). This array is then passed through PostUpdate, where default handling is performed on any remaining events that have default behaviors, such as the reorient event.

This example sequence illustrates how System Activities typically interacts with an application:

1. The user long-presses the back button.
2. The application detects the long-press and launches System Activities with SystemActivities_StartSystemActivity and the PUI_GLOBAL_MENU command.
3. In the Universal Menu, the user selects Reorient.
4. System Activities sends a launch intent to the application that launched the Universal Menu.
5. System Activities sends a reorient message to the application that launched it.
6. The application receives the message and adds it to the System Activities event queue.
7. The application resumes.
8. The application calls SystemActivities_Update() to get a list of pending events.


9. If the application has special reorient logic, e.g., to re-align UI elements to be in front of the user after reorienting, it can call vrapi_RecenterPose(), reposition its UI, and then remove the event from the list of events.
10. The application calls SystemActivities_PostUpdate(). If the reorient event was not handled by the application, it is handled inside of PostUpdate.

In the case of an exitToHome message, the SystemUtils library always handles the message with a default action of calling finish() on the application’s main Activity object, without passing it on through the Update/PostUpdate queue. This is done to allow the application to exit while still backgrounded.

The VrAppFramework library already handles System Activities events, so native applications using the VrAppFramework do not need to make any System Activities API calls. After VrAppFramework calls SystemActivities_Update(), it places a pointer to the event array in the VrFrame object for that frame. Applications using the framework can then handle those System Activities events in their frame loop, or ignore them entirely. Any handled events should be removed from the event array using SystemActivities_RemoveAppEvent() before the application returns from its VrAppInterface::Frame() method.

The Unity plugin (in Legacy 0.8+ and Utilities 0.1.3+) already includes the SystemUtils library and internally calls the System Activities API functions as appropriate. Unity applications cannot currently consume System Activities events, but can handle them by adding an event delegate using SetVrApiEventDelegate() on the OVRManager class.

VrGui Library

The VrGui library contains functionality for a fully 3D UI implemented as a scene graph. Each object in the scene graph can be functionally extended using components. This library has dependencies on VrAppFramework and must be used in conjunction with it. See the CinemaSDK, Oculus360PhotosSDK, and Oculus360Videos SDKs for examples.

VrLocale Library

The VrLocale library is a wrapper for accessing localized string tables. The wrapper allows for custom string tables to be used seamlessly with Android’s own string tables. This is useful when localized content is not embedded in the application package itself. This library depends on VrAppFramework.

VrModel Library

The VrModel library implements functions for loading 3D models in the .ovrScene format. These models can be exported from .fbx files using the FbxConverter utility (see FBXConverter). This library depends on rendering functionality included in VrAppFramework.

VrSound

The VrSound library implements a simple wrapper for playing sounds using the android.media.SoundPool class. This library does not provide low latency or 3D positional audio and is only suitable for playing simple UI sound effects.

Migrating to Mobile SDK 0.6.2.0

This document is intended to help you upgrade from the 0.6.1 SDK to 0.6.2.

Note: If you are upgrading from an 0.5.x SDK version to 0.6.0, see Migrating to Mobile SDK 0.6.0. If you are upgrading from a version older than 0.5.0, please refer to previous SDK documentation for migration instructions to 0.5.0.


Overview

Mobile SDK v. 0.6.2 continues the restructuring of the native (C/C++) SDK started in 0.6.0.

Native Development

The following sections provide guidelines for updating your native project to the 0.6.2 Mobile SDK. The native SDK samples which ship with the SDK are also a good reference for reviewing required changes (see VrSamples/Native).

LibOVRKernel/Include/LibOVR.h has been renamed to LibOVRKernel/Include/OVR_Kernel.h and marked for future deprecation. The library project VrGUI now includes Java source and resource files.

Makefile and build system changes

project.properties

If your project relies on the VrGUI library project, you will need to add a Java library reference to the project.properties file: android.library.reference.2=../../../../../VrGUI/Projects/Android

In general, if the library’s Projects/Android/src folder hierarchy contains any .java source files, you should add a library reference for that project to your project.properties file. .classpath, .project, .cproject For VrGUI Java library dependencies: 1. Go to Project -> Properties -> Java Build Path -> Projects tab. 2. Select Add and choose VrGUI. For Project Reference Dependencies: 1. Go to Project -> Properties -> Project References. 2. Select VrGUI. custom_rules.xml For projects which use the ant build system, you will need to add a pathelement entry to the project.library.res.folder.path for the VrGui resource folder:

See VrSamples/Native/VrTemplate for an example.

VrAppInterface Changes

VrAppInterface::DrawEyeView() now takes an additional parameter:

Matrix4f DrawEyeView( const int eye, const float fovDegrees, ovrFrameParms & frameParms );

is now: Matrix4f DrawEyeView( const int eye, const float fovDegreesX, const float fovDegreesY, ovrFrameParms & frameParms );


Migrating to Mobile SDK 0.6.1.0

This document is intended to help you upgrade from the 0.6.0 SDK to 0.6.1.0.

Note: If you are upgrading from an 0.5.x SDK version to 0.6.0, see Migrating to Mobile SDK 0.6.0. If you are upgrading from a version older than 0.5.0, please refer to previous SDK documentation for migration instructions to 0.5.0.

Overview

Mobile SDK v. 0.6.1 continues the restructuring of the native (C/C++) SDK started in 0.6.0. Many components that were previously part of VrAppFramework in 0.6.0 have been split out into new libraries that are located in the VrAppSupport folder in this version. LibOVR has now been renamed to LibOVRKernel to better represent its intended functionality and to maintain parity with the PC SDK. VrAppFramework now ships with a public includes folder - see VrAppFramework/Include. LibOVRKernel and VrAppFramework now ship with pre-built libraries in addition to full source. The VRPlatform library has now moved to 1stParty/VrPlatform.

Native Development

The following sections provide guidelines for updating your native project to the 0.6.1 Mobile SDK. The native SDK samples which ship with the SDK are also a good reference for reviewing required changes (see VrSamples/Native).

Makefile and build system changes

Android.mk

For projects which rely on LibOVR (now LibOVRKernel) and VrAppFramework, the following changes should be made:

1. Rename LibOVR -> LibOVRKernel.
2. Specify the pre-built build path:
$(call import-module,LibOVR/Projects/AndroidPrebuilt/jni)
$(call import-module,VrAppFramework/Projects/AndroidPrebuilt/jni)

For projects which rely on the 3rdParty libraries, minizip or stb, the following changes should be made:
$(call import-module,3rdParty/minizip/build/androidprebuilt/jni)
$(call import-module,3rdParty/stb/build/androidprebuilt/jni)

For projects which rely on the VrAppSupport libraries, a reference to each needed library must be added to the project. Each of the VrAppSupport libraries has its own Projects/Android/Android.mk file which specifies the library’s name. For instance, the VrLocale library’s Android.mk file contains:

LOCAL_MODULE := vrlocale    # generate libvrlocale.a


The library name, in this case, vrlocale, must be used in the application’s Android.mk file to include the VrLocale library. First, add the name of the library to your project’s LOCAL_STATIC_LIBRARIES line as follows: LOCAL_STATIC_LIBRARIES += vrlocale vrappframework libovrkernel

Next, at the bottom of your Android.mk file, import the library module with: $(call import-module,VrAppSupport/VrLocale/Projects/Android/jni)

project.properties

If your project relies on VrAppSupport, note that some of the VrAppSupport libraries include Java packages and require a Java library reference to be added to your project.properties file:

android.library.reference.2=../../../../../VrAppSupport/VrLocale/Projects/Android
android.library.reference.3=../../../../../VrAppSupport/VrSound/Projects/Android

In general, if the library's Projects/Android/src folder hierarchy contains any .java source files, you should add a library reference for that project to your project.properties file.

Eclipse Project Files (.classpath, .project, .cproject)

For VrAppSupport Java library dependencies:

1. Go to Project -> Properties -> Java Build Path -> Projects tab.
2. Select Add and choose VrSound and VrLocale.

Now that VrAppFramework ships with a pre-built .jar, remove it as a Java library dependency:

1. Go to Project -> Properties -> Java Build Path -> Projects tab.
2. Deselect VrAppFramework.

For Project Reference Dependencies:

1. Go to Project -> Properties -> Project References.
2. Select VrSound and VrLocale.

Now that LibOVR (LibOVRKernel) and VrAppFramework both ship with pre-built .so and .jar files, both projects should be removed from the project references:

1. Go to Project -> Properties -> Project References.
2. Deselect LibOVR and VrAppFramework.

VrMenu Changes

If your project previously used the native GUI implementation (VRMenu, OvrGuiSys, OvrGazeCursor, or related files), your make files must be modified to include the VrAppSupport/VrGui library. Most of the interface changes required for the new library were made in the 0.6.0 SDK, but some initialization steps have changed now that the GUI system is in its own library. VrGui depends on the VrAppFramework library and cannot be used without it.

In order to use VrGui, create an OvrGuiSys pointer in your application's overloaded VrAppInterface class. In your OneTimeInit method, call OvrGuiSys::Create() to create an OvrGuiSys object. Call OvrGuiSys::Init on this object, passing a pointer to the VrAppFramework's App object, a reference to an ovrSoundEffectPlayer object, the name of the SDF font to be used as the GUI's default font, and a reference to the App's debug line object:

GuiSys = OvrGuiSys::Create();
GuiSys->Init( this->app, *SoundEffectPlayer, "efigs.fnt", app->GetDebugLines() );
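Putting those steps together, a minimal sketch of the declarations and OneTimeInit body is shown below. The class and member names are placeholders, and the OneTimeInit parameter list follows the samples from this SDK generation (check App.h in your SDK for the exact signature); only the OvrGuiSys calls above come from the migration notes:

class MyVrApp : public OVR::VrAppInterface
{
public:
    virtual void OneTimeInit( const char * fromPackage, const char * launchIntentJSON,
                              const char * launchIntentURI );
private:
    OVR::OvrGuiSys *                    GuiSys;             // created in OneTimeInit
    OVR::OvrGuiSys::SoundEffectPlayer * SoundEffectPlayer;  // see VrSound Changes below
};

void MyVrApp::OneTimeInit( const char *, const char *, const char * )
{
    // The dummy player only logs; replace it with your own SoundEffectPlayer subclass.
    SoundEffectPlayer = new OVR::OvrGuiSys::ovrDummySoundEffectPlayer();
    GuiSys = OVR::OvrGuiSys::Create();
    GuiSys->Init( this->app, *SoundEffectPlayer, "efigs.fnt", app->GetDebugLines() );
}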


VrSound Changes

The SoundEffectPlayer is an object derived from the OvrGuiSys::SoundEffectPlayer class. OvrGuiSys requires an OvrGuiSys::SoundEffectPlayer reference in order to be initialized. For applications that play sounds from the GUI, create an overloaded OvrGuiSys::SoundEffectPlayer class, implement the virtual methods on the class to play any sound library, and pass an object of this class to the OvrGuiSys::Init method.

For migration convenience, OvrGuiSys also provides a stubbed version of an overloaded OvrGuiSys::SoundEffectPlayer class named OvrGuiSys::ovrDummySoundEffectPlayer. This class will simply output log messages when its virtual methods are called and can be used as a temporary step in migrating to 0.6.1.
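A minimal sketch of such a subclass is shown below. The Has/Play virtual methods follow the shape used by the SDK samples of this era, but check GuiSys.h in your SDK for the exact signatures; the logging body is a placeholder for calls into your own sound library:

#include <android/log.h>

class MySoundEffectPlayer : public OVR::OvrGuiSys::SoundEffectPlayer
{
public:
    virtual bool Has( const char * name ) const
    {
        // Report whether the named effect is available to the application.
        return name != NULL;
    }
    virtual void Play( const char * name )
    {
        // Replace this log call with a call into your application's sound library.
        __android_log_print( ANDROID_LOG_INFO, "MyVrApp", "play sound: %s", name );
    }
};

An instance of this class is then passed by reference to OvrGuiSys::Init, as shown in the VrMenu Changes section above.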

Migrating to Mobile SDK 0.6.0

This document is intended to help you upgrade from the 0.5.x SDK to 0.6.0.

Note: If you are upgrading from a version older than 0.5.0, please refer to previous SDK documentation for migration instructions to 0.5.0.

Overview

Mobile SDK v. 0.6.0 introduces several major changes that necessitate updates to the VRLib structure, native app interface, and development workflow. VRLib has been restructured into three separate libraries in order to make the code more modular and provide for a smoother workflow:

• LibOVR - the Oculus Library
• VrApi - the minimal API for VR
• VrAppFramework - the application framework used by native apps

Both LibOVR and VrAppFramework ship with full source. The VrApi is shipped as a set of public include files, a pre-built shared library, and a jar file. Having the VrApi in a separate shared library allows the VrApi implementation to be updated and/or changed after an application has been released. This is important, as it makes it easy to apply hot fixes, implement new optimizations, and add support for new devices without requiring all applications to be recompiled with a new SDK.

The Vr App Interface (now part of VrAppFramework) has been significantly simplified and now has a clearly-defined lifecycle. Previously, the order in which functions were called was unclear, and some functions could be called either in VR mode or outside of VR mode. The lifecycle can be found in VrAppFramework/Src/App.h.

The VRMenu code has been refactored in preparation for being moved into its own static library, VrGUI.

Native Developers

The following sections provide guidelines for updating your native project to the 0.6.0 mobile SDK. The native SDK samples which ship with the SDK are also a good reference for reviewing required changes (see VrSamples/Native).

Makefile and build system changes

The VRLib restructure requires several changes to the Android and Eclipse build files.

Android.mk


Remove import_vrlib.mk from your makefile and replace it with the following import commands:

$(call import-module,LibOVR/Projects/Android/jni)
$(call import-module,VrApi/Projects/AndroidPrebuilt/jni)
$(call import-module,VrAppFramework/Projects/Android/jni)

The cflags.mk file is now located at the top level of the SDK directory:

include ../../../cflags.mk

You must add the following library dependencies (at minimum):

LOCAL_STATIC_LIBRARIES := vrappframework libovr
LOCAL_SHARED_LIBRARIES := vrapi

If your project relies on stb or minizip, you must now specify those dependencies explicitly:

LOCAL_STATIC_LIBRARIES += stb
$(call import-module,3rdParty/stb)

LOCAL_STATIC_LIBRARIES += minizip
$(call import-module,3rdParty/minizip)

project.properties

Remove VRLib as an Android library reference and instead, use:

android.library.reference.1=../../../VrAppFramework/Projects/Android

.classpath, .project, .cproject

Remove all references to VRLib in your Eclipse build files and replace them with the new project libraries.

For Java library dependencies:

1. Go to Project -> Properties -> Java Build Path -> Projects tab.
2. Select Add and choose VrAppFramework.

For Project Reference Dependencies:

1. Go to Project -> Properties -> Project References.
2. Select LibOVR and VrAppFramework.

Note: VrApi ships the pre-built .so and .jar at the location VrApi/Libs/Android. VrApi source is no longer included with the SDK.

New Build Scripts

For developers who prefer to use command-line scripts to build native projects instead of Eclipse or other IDEs, the 0.6.0 Mobile SDK provides a robust cross-platform set of command-line scripts. Earlier versions of the SDK shipped with separate build scripts for Windows and Mac, which could lead to disjointed functionality between the platforms. We now provide a common Python build script located in bin/scripts/build/ovrbuild.py, and light-weight wrapper scripts in the individual project folders.

To build using the new Python scripts, navigate to your project directory from a command or terminal window and invoke the build.bat or build.py script from that project directory. The script takes the following arguments:

• clean - cleans the project's build files


• debug - builds a debug application

• release - builds a release application
• -n - skips installation of the application to the phone

After building the application, the build script queries adb for the device signatures of any connected devices and verifies that the application has valid Oculus Signature Files for those devices before installation. If no matching signature is found, the script will output a warning and skip the installation step.

General

Absolute paths for any of the VRLib include paths in your project must be changed to relative paths. For example, instead of #include "LibOVR/Src/Kernel/OVR_Threads.h", use #include "Kernel/OVR_Threads.h".

ovr_GetTimeInSeconds() has been replaced with vrapi_GetTimeInSeconds(). However, do not use this time as a seed for simulations, animations, or other logic. For example, animations should not be updated based on the "real time" at which the animation code is executed. Instead, animations should be updated based on the estimated time at which they will be displayed. Using the "real time" introduces intra-frame motion judder when the code is not executed at a consistent point in time every frame.

In other words, for simulations, animations, and other logic, use the time returned by vrapi_GetPredictedDisplayTime(). This function is not directly available in the application framework. The time returned by vrapi_GetPredictedDisplayTime() is stored on VrFrame, and passed to applications as PredictedDisplayTimeInSeconds. VrFrame also provides DeltaSeconds, which is the delta between the PredictedDisplayTimeInSeconds from the previous frame and the current frame. VrFrame::DeltaSeconds can be used to incrementally update a simulation (see the sketch after the Java VrActivity changes below).

Java VrActivity Changes

If your application subclasses VrActivity, you will need to make the following changes to your Java activity. Instead of:

import com.oculusvr.vrlib.VrActivity;
import com.oculusvr.vrlib.VrLib;

use:

import com.oculus.vrappframework.VrActivity;

You must also replace any references to the VrLib class name with VrActivity:

String commandString = VrActivity.getCommandStringFromIntent( intent );
String fromPackageNameString = VrActivity.getPackageStringFromIntent( intent );
String uriString = VrActivity.getUriStringFromIntent( intent );

appPtr is no longer directly exposed on VrActivity. Replace any references to appPtr with the appropriate accessor call:

long getAppPtr();
void setAppPtr(long appPtr);
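Returning to the timing guidance in the General section above, here is a minimal sketch of advancing a simulation from VrFrame::DeltaSeconds rather than from wall-clock time. The surrounding class, the AnimationPhase/AnimationSpeed members, and the Scene object are illustrative placeholders; the Frame signature and the DeltaSeconds member are the ones described above:

Matrix4f MyVrApp::Frame( const VrFrame & vrFrame )
{
    // Advance the animation by the delta between the previous and current
    // predicted display times, not by time sampled mid-frame.
    AnimationPhase += AnimationSpeed * vrFrame.DeltaSeconds;

    // Remaining per-frame work (input handling, scene update) goes here.

    return Scene.CenterViewMatrix();
}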

VrAppInterface Changes

VrAppInterface Lifecycle

The new VrAppInterface lifecycle is as follows:

1.  Configure(settings)                    <--------+
2.  if (firstTime) OneTimeInit(intent)              |
3.  if (newIntent) NewIntent(intent)                |
4.  EnteredVrMode()                                 |
                                           <--+     |
5.  while(keyEvent) OnKeyEvent()              |     |
6.  Frame()                                   |     |
7.  DrawEyeView(left)                         |     |
8.  DrawEyeView(right)                        |     |
                                           ---+     |
9.  LeavingVrMode()                                 |
                                           ---------+
10. OneTimeShutdown()

All VrAppInterface virtual functions are called from the VR application thread. All functions except for VrAppInterface::Configure() and VrAppInterface::OneTimeShutdown() are now called while the application is in VR mode.

VrAppInterface::Configure

VrAppInterface::ConfigureVrMode() was renamed to VrAppInterface::Configure().

virtual void ConfigureVrMode( ovrModeParms & modeParms );

is now:

virtual void Configure( ovrSettings & settings );

The new VrAppInterface::Configure() uses an in-out struct parameter that allows applications to change a lot more settings:

struct ovrSettings
{
    bool               ShowLoadingIcon;
    bool               UseSrgbFramebuffer;
    bool               UseProtectedFramebuffer;
    int                FramebufferPixelsWide;
    int                FramebufferPixelsHigh;
    ovrModeParms       ModeParms;
    ovrEyeBufferParms  EyeBufferParms;
    ovrHeadModelParms  HeadModelParms;
};

• The ovrSettings::ShowLoadingIcon bool replaces VrAppInterface::ShouldShowLoadingIcon().
• The ovrSettings::UseSrgbFramebuffer bool replaces VrAppInterface::GetWantSrgbFramebuffer().
• The ovrSettings::UseProtectedFramebuffer bool replaces VrAppInterface::GetWantProtectedFramebuffer().

ovrSettings::FramebufferPixelsWide and ovrSettings::FramebufferPixelsHigh are new, and allow an application to set a lower resolution frame buffer which will be automatically scaled up by the Android hwcomposer.

VrAppInterface::EnteredVrMode / VrAppInterface::LeavingVrMode

The following lifecycle notifications have been replaced:

• virtual void WindowCreated();
• virtual void WindowDestroyed();
• virtual void Paused();
• virtual void Resumed();

Instead, the VrAppInterface now provides the following functions:

• virtual void EnteredVrMode();
• virtual void LeavingVrMode();


The functions VrAppInterface::WindowCreated(), VrAppInterface::WindowDestroyed(), VrAppInterface::Resumed() and VrAppInterface::Paused() were confusing because they were hooked up to two different Android lifecycles (Activity/Surface). Any of these functions could be called outside of VR mode, which is bad, because various VrApi functions will not be available. Even though the application may not call VrApi functions directly, they may be called indirectly. An application basically had to figure out whether it was in VR mode by checking if both VrAppInterface::WindowCreated() and VrAppInterface::Resumed() had been called without VrAppInterface::WindowDestroyed() or VrAppInterface::Paused() being called in between.

What the application really cares about is when it enters VR mode, and when it leaves VR mode. Therefore the VrAppInterface::WindowCreated(), VrAppInterface::WindowDestroyed(), VrAppInterface::Resumed() and VrAppInterface::Paused() functions have been replaced with VrAppInterface::EnteredVrMode(), which is called just after the application has entered VR mode, and VrAppInterface::LeavingVrMode(), which is called right before the application leaves VR mode. In other words, both functions are called while in VR mode, which allows the application to directly or indirectly call any of the VrApi functions.

VrAppInterface::NewIntent

VrAppInterface::NewIntent() used to be called outside of VR mode, which is bad because various VrApi functions will not be available. Even though the application may not call VrApi functions directly from NewIntent(), they may be called indirectly. VrAppInterface::NewIntent() is now called when the application is in VR mode, even though the application may not yet have been notified through VrAppInterface::EnteredVrMode(). Now that VrAppInterface::NewIntent() is called while in VR mode, a time warped loading icon can be shown in case VrAppInterface::NewIntent() needs to load assets or perform other time consuming work.

VrAppInterface::Command

The functions VrAppInterface::Command() and App::GetMessageQueue() have been removed. These functions basically hijacked the message queue used to handle Android lifecycle events. Using this message queue also did not provide any guarantee as to when VrAppInterface::Command() would be called. As a result, VrAppInterface::Command() could be called while the application was not in VR mode. This is bad, because various VrApi functions are not available. Even though the application may not call VrApi functions directly from VrAppInterface::Command(), they may be called indirectly. Applications that need to receive messages from other threads can easily add their own message queue and process these messages at the beginning of VrAppInterface::Frame() (see the sketch below).

AppInterface::Frame

The declaration for Frame has changed:

virtual Matrix4f Frame( VrFrame vrFrame );

is now:

virtual Matrix4f Frame( const VrFrame & vrFrame );
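As mentioned under VrAppInterface::Command above, applications that previously relied on Command() can drain their own queue at the top of Frame(). A minimal sketch follows, assuming a simple mutex-protected container; neither the queue class nor the message format is part of the SDK:

#include <deque>
#include <mutex>
#include <string>

class AppMessageQueue
{
public:
    // May be called from any thread.
    void Post( const std::string & message )
    {
        std::lock_guard<std::mutex> lock( Mutex );
        Messages.push_back( message );
    }

    // Called from the VR thread at the start of Frame(); returns false when drained.
    bool Poll( std::string & outMessage )
    {
        std::lock_guard<std::mutex> lock( Mutex );
        if ( Messages.empty() )
        {
            return false;
        }
        outMessage = Messages.front();
        Messages.pop_front();
        return true;
    }

private:
    std::mutex              Mutex;
    std::deque<std::string> Messages;
};

// At the beginning of the application's Frame() override:
//   std::string message;
//   while ( Queue.Poll( message ) ) { HandleMessage( message ); }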

AppInterface::OnKeyEvent

The declaration for OnKeyEvent has changed:

virtual bool OnKeyEvent( const int keyCode, const KeyState::eKeyEventType eventType );


is now:

virtual bool OnKeyEvent( const int keyCode, const int repeatCount, const KeyEventType eventType );

App/AppLocal Changes

We have made significant changes to the App/AppLocal interface. Many of these should be transparent to most applications, but a partial list of changes includes:

• Some App and AppLocal methods were re-ordered and grouped for clarity.
• Some App and AppLocal methods relating to VRMenu were removed or moved to OvrGuiSys (see the "VRMenu Changes" section that follows this section).
• Added App::LoadFontForLocale() so applications can load a localized signed-distance-field font without knowing specifics about the current locale. This is now normally passed to OvrGuiSys at initialization time. Applications using VrAppFramework will always load an EFIGS (English, French, Italian, German, Spanish) font for debugging purposes.
• App::GetUiJni() was removed, as it was almost always incorrect to use the UI JNIEnv* in application code. Instead, use the JNIEnv* allocated for the VrThread, returned by GetVrJni().
• App::GetSwapParms() was renamed to App::GetFrameParms() and App::GetSensorForNextWarp() has been replaced with App::GetHeadPoseForNextWarp().
• Added App::FatalError(). This sends an intent to System Activities instructing it to show the specified fatal error in VR. When paired with the latest version of System Activities, running an app with an invalid Oculus signature file will trigger this error.

VRMenu Changes

The VRMenu code has been refactored in order to extricate it fully from the application framework. Applications using VRMenu must now instantiate a single OvrGuiSys object which contains OvrGazeCursor, DefaultFont, DefaultFontSurface and the OvrVRMenuMgr. Previously, these objects were allocated by AppLocal and passed individually in many cases. These components have been removed from the App Interface and should now be referenced through OvrGuiSys.

If your application uses VRMenu, make the following adjustments to your application:

Several functions on the VRMenu virtual interface have changed. If your application overloads any of them, you will need to adjust the function declarations:

virtual void Frame_Impl( OvrGuiSys & guiSys, VrFrame const & vrFrame, gazeCursorUserId_t const gazeUserId );
virtual bool OnKeyEvent_Impl( OvrGuiSys & guiSys, int const keyCode, const int repeatCount, KeyEventType const eventType );
virtual void Open_Impl( OvrGuiSys & guiSys );
virtual void Close_Impl( OvrGuiSys & guiSys );
virtual void OnItemEvent_Impl( OvrGuiSys & guiSys, VRMenuId_t const itemId, class VRMenuEvent const & event );
virtual void ResetMenuOrientation_Impl( Matrix4f const & viewMatrix );

In your application's constructor, you will need to create an OvrGuiSys object:

GuiSys( OvrGuiSys::Create() )


And, destroy the OvrGuiSys in the destructor:

OvrGuiSys::Destroy( GuiSys );

In OneTimeInit, initialize the OvrGuiSys:

GuiSys->Init( app, &app->GetSoundMgr(), app->LoadFontForLocale(), &app->GetDebugLines() );

Render the OvrGuiSys in your application's DrawEyeView call:

GuiSys->RenderEyeView( Scene.CenterViewMatrix(), mvp );

In OnKeyEvent, call the key event handling on the OvrGuiSys:

if ( GuiSys->OnKeyEvent( keyCode, repeatCount, eventType ) )
{
    return true;
}

At the beginning of your application's Frame call, call OvrGuiSys::BeginFrame():

// Reset any VR menu submissions from previous frame.
GuiSys->BeginFrame();

At the end of your application's Frame call, call OvrGuiSys::Frame():

// update gui systems after the app frame, but before rendering anything
GuiSys->Frame( vrFrame, Scene.CenterViewMatrix() );
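Taken together, the per-frame and key-event calls above might sit in an application like this. This is only a sketch: the class name and the Scene member are placeholders, and the return values follow the pattern used by the SDK samples; the OvrGuiSys calls themselves are the ones listed above:

Matrix4f MyVrApp::Frame( const VrFrame & vrFrame )
{
    // Reset any VR menu submissions from the previous frame.
    GuiSys->BeginFrame();

    // Application simulation and scene update for this frame goes here.

    // Update GUI systems after the app frame, but before rendering anything.
    GuiSys->Frame( vrFrame, Scene.CenterViewMatrix() );
    return Scene.CenterViewMatrix();
}

bool MyVrApp::OnKeyEvent( const int keyCode, const int repeatCount, const KeyEventType eventType )
{
    // Give the GUI system the first chance to consume the event.
    if ( GuiSys->OnKeyEvent( keyCode, repeatCount, eventType ) )
    {
        return true;
    }
    return false;   // application-specific key handling would go here
}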


Release Notes

This section describes changes for each version release.

1.0 Release Notes

This document provides an overview of new features, improvements, and fixes included in the latest version of the Oculus Mobile SDK.

1.0.3

Overview of Major Changes

Multi-view

Mobile SDK 1.0.3 adds multi-view rendering support. Multi-view rendering allows drawing to both eye views simultaneously, significantly reducing driver API overhead, and includes GPU optimizations for geometry processing. Preliminary testing has shown that multi-view can provide:

• 25-50% reduction in CPU time consumed by the application
• 5% reduction in GPU time on the ARM Mali
• 5%-10% reduction in power draw

The freed-up CPU time could be used to issue more draw calls. However, instead of issuing more draw calls, we recommend that applications maintain the freed-up CPU time for use by the driver threads to reduce or eliminate screen tears.

While current driver implementations of multi-view primarily reduce CPU usage, GPU usage can also benefit. On Exynos-based devices, multi-view not only reduces the CPU load, but slightly reduces the GPU load by computing the view-independent vertex attributes once for both eyes, instead of separately for each eye.

Even though there are significant savings in CPU time, these savings do not directly translate into a similar reduction in power draw, because the power drawn by the CPU is only a fraction of the total power drawn by the device (which includes the GPU, memory bandwidth, display, etc.). Although every application has its own set of challenges to consider, multi-view should allow most applications to lower the CPU clock frequency (CPU level), which will in turn improve power usage and the thermal envelope. However, this does not help on Exynos-based devices, where CPU levels 1, 2 and 3 all use the same clock frequency.

Multi-view will not be available on all Gear VR devices until driver and system updates become available. The current set of supported devices as of the date of this release is:

• S6 / Android M
• S6+ / Android M
• S6 Edge / Android M
• Note 5 / Android M
• Exynos S7 / Android M
• Exynos S7+ / Android M


For detailed instructions on how to structure a native application for multi-view rendering, see Migrating to Mobile SDK 1.0.3 on page 114. We are working with Unity and Epic to support multi-view in Unity and Unreal Engine.

VrAppInterface

VrAppInterface has been refactored to simplify the interface, support multi-view rendering, and enforce per-frame determinism. We highly recommend updating your VrAppInterface based application to support multi-view. However, even if you are not planning on supporting multi-view, it would be good to adopt the VrAppInterface changes because they also pave the way for Vulkan support in the future.

VrApi

VrAppFramework-based applications now explicitly pass EGL objects to VrApi. Previously, the various VrApi functions had to be called from a thread with a specific EGLContext current. The current EGLContext and EGLSurface were basically invisible parameters to the VrApi functions. By explicitly passing the necessary EGL objects to the API, there are no threading restrictions.

Volume notifier is now rendered automatically in VrApi as a TimeWarp layer - the application is no longer responsible for generating and displaying this notification. Be sure not to render your own volume interface!

Build process

Various build steps have been moved from the Python build scripts into Gradle.

New Features

• Volume Notifier now rendered automatically in VrApi as a TimeWarp Layer.
• VrAppFramework now supports multi-view rendering path.
• VrAppFramework now uses explicit EGL objects.
• GlTexture now supports RGBA ASTC compressed textures.

API Changes

• VrAppInterface::OneTimeInit and VrAppInterface::NewIntent have been replaced by VrAppInterface::EnteredVrMode. This function is called right after an application enters VR mode.
• VrAppInterface::OneTimeShutdown has been removed in favor of moving shutdown code to the destructor of the VrAppInterface derived class.
• VrAppInterface::LeavingVrMode is now called right before the application is about to leave VR mode.
• VrAppInterface::Frame now takes an ovrFrameInput structure and returns an ovrFrameResult structure.
• VrAppInterface::OnKeyEvent was removed. Key events are now explicitly handled in VrAppInterface::Frame.
• VrApi ovrModeParmFlags now provide VRAPI_MODE_FLAG_NATIVE_WINDOW for specifying the ANativeWindow explicitly.

Bug Fixes

• Fixed docked / mounted queries to be accurate without requiring an initial event.
• Sample apps no longer prompt for an SD card on devices that don't support external memory.

Known Issues

• When converting your app to be multi-view compliant, ensure that your System Activities version is at least 1.0.3.1 or you will receive a required system update message.


1.0.0.1

Overview of Major Changes

This minor patch release fixes problems with OS X development and splits Oculus Remote Monitor from the Mobile SDK to make it easier to access for developers using a third-party game engine.

New Features

• Oculus Remote Monitor
  • VR Developer Mode is no longer required if you have System Activities 1.0.2 or greater and an app built with Oculus Mobile SDK 1.0 or greater.
  • Exposed experimental layer texel density and complexity visualizers (supported by apps built with Oculus Mobile SDK 1.0 or later).
  • Now available as a separate downloadable package from the full Mobile SDK download.

Bug Fixes

• Fixed OS X "No such file or directory" build problem.
• Oculus Remote Monitor
  • Improved network stability on Windows.
  • Increased VR thread stack size.

1.0.0

Overview of Major Changes

VrApi is now dynamically loaded from a separate APK, allowing it to be updated without requiring developers to recompile applications. This will allow us to fix bugs, support new devices, and add new OS versions without disrupting development.

VrApi now presents the core minimal API for VR rendering. System-level functionality not necessary for VR rendering has been moved to a VrAppSupport/SystemUtils library. This library primarily deals with loading, and with receiving events from the Universal Menu.

Various OpenGL objects may now be explicitly passed to the VrApi. In this case, functions such as vrapi_EnterVrMode and vrapi_SubmitFrame do not need to be called from a thread with a specific OpenGL context current.

All native applications are now built and developed using Gradle and Android Studio instead of ANT and Eclipse. It is important to note that while the command-line Gradle build path is mature, Android Studio support for native development should still be considered experimental. Feedback on our developer forums is appreciated!

Important Change to SystemActivities

The VrApi implementation is now deployed as part of SystemActivities, which is automatically updated. When updating to the latest SDK, it is important to make sure the device is online to allow the SystemActivities to be automatically updated. If for some reason the SystemActivities is not up to date when launching a new application, you may get the message "Oculus updates needed." To allow the application to run, the SystemActivities must be updated.

If the auto-update function has not delivered the latest SystemActivities to your device, please ensure that you are connected to a working Wi-Fi network. It may take up to 24 hours for the update to trigger, so please be patient. If your development schedule requires a timely update, you may download the SystemActivities APK


from this location as temporary relief during this transition period. Future updates should always be processed by the automatic update system on the Gear VR platform.

New Features

• VrApi now dynamically loads from a separate APK.
• Various OpenGL objects may now be explicitly passed to VrApi, lifting some threading restrictions.
• Added support for Gradle and experimental support for Android Studio.
• Added support for the Samsung Galaxy S6 Edge+ and Note 5.
• TimeWarp Debug Graph may now be toggled on and off during runtime via VrApi Frame flags.

API Changes

• vrapi_Initialize now returns an ovrInitializeStatus.
• vrapi_GetHmdInfo has been replaced with vrapi_GetSystemPropertyInt and vrapi_GetSystemPropertyFloat.
• ovr_DeviceIsDocked, ovr_HeadsetIsMounted, ovr_GetPowerLevelStateThrottled and ovr_GetPowerLevelStateMinimum have been replaced with vrapi_GetSystemStatusInt and vrapi_GetSystemStatusFloat.
• Various functions from VrApi_Android.h have been moved to the VrAppSupport/SystemUtils library.
• Various performance metrics can now be queried through vrapi_GetSystemStatusInt and vrapi_GetSystemStatusFloat.

Bug Fixes

• Fixed reorient on mount.
• Fixed spurious incorrect long-press event after a short-press plus a long frame.
• Fixed invalid input clock levels resulting in dynamic clock frequency mode.

0.6 Release Notes

This document provides an overview of new features, improvements, and fixes included in the latest version of the Oculus Mobile SDK.

0.6.2.0

Overview of Major Changes

The 0.6.2.0 version of the Oculus Mobile SDK includes a change to how we link with OpenGL ES. Previous versions of the SDK had a hard link to libGLESv3.so, which is problematic for engines which integrate the VrApi and must support non-VR devices. Beginning with 0.6.2, the mobile SDK dynamically loads all OpenGL ES symbols.

The source for the Unity SDKExample MediaSurface Plugin is now provided. The Media Surface Plugin provides a native library which is used with the Android MediaPlayer for hardware decoding of a single video to a texture. Note that the SDKExample MediaSurface Plugin is not intended to be production quality, and is provided for reference purposes. We have made the source available in case you would like to modify it for your use. The source and build files are located at the following SDK path: VrAppSupport/MediaSurfacePlugin.

For detailed instructions on updating native projects to this SDK version, see Mobile Native SDK: Migration.


New Features

• VrApi
  • Dynamically loads OpenGL ES symbols instead of hard-linking to libGLESv3.so.
• VrCapture
  • Added mode to capture straight to local flash storage instead of waiting for a remote connection. Useful for automated testing, capturing the startup sequence, and working around issues caused by low-bandwidth WiFi networks.

API Changes

• VrApi
  • Remove several non-VR android-specific interface calls.
• Native Application Framework
  • VrGUI Library project now contains Java source and resources.
  • VrAppInterface::DrawEyeView() takes an additional parameter, see Migrating to Mobile SDK 0.6.2 for more information.

Bug Fixes

• VrApi
  • Fixed thread affinity failing to be set for the TimeWarp and DeviceManager threads.
  • Fixed device reset when repeatedly plugging and unplugging the Travel Adapter on the Samsung GALAXY S6.
• VrAppFramework
  • Optimized GazeCursor to render in two draw calls per eye instead of 32.
  • Fixed GlGeometry BuildGlobe so that equirect center is at -Z (forward).

0.6.1.0

Overview of Major Changes

The 0.6.1.0 version of the Oculus Mobile SDK includes major structural changes to the native VrAppFramework library. Several subsystems that were previously part of VrAppFramework have been moved into individual libraries under the VrAppSupport folder, making the application framework leaner and reducing its focus to handling the Android application lifecycle.

LibOVR has now been renamed to LibOVRKernel to better represent its intended functionality and to maintain parity with the PC SDK.

The Java activity code has been refactored to remove dependencies on the VrActivity class for applications that do not use VrAppFramework.

The SDK Native Samples were updated to a cross-platform friendly structure. Android Library Projects are now found at [ProjectName]/Projects/Android.

Changes to Unity Integration

The Oculus Unity Integration is no longer bundled with the Mobile SDK and must now be downloaded separately from our Downloads page: https://developer.oculus.com/downloads/


We now provide a Utilities for Unity package for use with Unity's first-party VR support (available in Unity 5.1+). For legacy project development, we currently offer a legacy Unity Integration package for use with Unity 4.6.7 and later. Please see our Unity documentation for more information. If you are using the Utilities for Unity Package with Unity 5.1+, the SDKExamples are now available for download separately.

If you are using the Legacy Unity Integration, update to Oculus Runtime for OS X 0.5.0.1. Before updating your runtime, be sure to run uninstall.app (found in /Applications/Oculus/) to remove your previous installation.

New Features

• VrApi
  • Images composited by the time warp are now allocated through the VrApi as "texture swap chains".
  • Performance params (CPU/GPU level, thread ids) can now be adjusted every frame through vrapi_SubmitFrame.
  • adb logcat -s VrApi now reports the thread affinity.
  • ovr_GetSystemProperty now provides options for querying GpuType, Device external memory, and max fullspeed framebuffer samples; see ovrSystemProperty in VrApi_Types.h.
• VrCubeWorld
  • Added example to VrCubeWorld_SurfaceView and VrCubeWorld_NativeActivity samples to reduce the latency by re-sampling the tracking state later in the frame.
• VrTemplate
  • make_new_project script is now converted to Python for cross-compatibility.
• VrCapture / OVRMonitor
  • VrCapture may now be integrated into and collect data from multiple shared libraries in your app simultaneously (previously you could capture from VrApi or from your app, but not both at the same time).
  • OpenGL and logcat calls are now captured throughout the entire process.
  • Applications may now expose user-adjustable variables via OVR::Capture::GetVariable() and tweak the values in real-time in OVRMonitor.
  • Frame Buffer capturing now does basic block-based compression on the GPU, reducing network bandwidth by 50%.
  • GPU Zones are enabled, but we recommend only using them on the Samsung GALAXY S6.
  • Added Settings View for toggling VR Developer Mode.
  • Sensor Graphs now turn red when values exceed max defined by SetSensorRange().

API Changes

• Native Application Framework
  • VRMenu, OvrGuiSys, OvrGazeCursor and related classes have been moved to the VrAppSupport/VrGui library.
  • OvrSceneView, ModelFile and related classes have been moved to the VrAppSupport/VrModel library.
  • Localization-related functionality has been moved to the VrAppSupport/VrLocale library.
  • The sound pool and related classes have been moved to the VrAppSupport/VrSound library.
  • The VrGui library now uses the SoundEffectPlayer interface for sound playback, replacing SoundManager. This simple interface can be overloaded to allow VrGui sounds to be played by any sound library.
  • VrActivity java class now subclasses Android Activity instead of ActivityGroup.


Bug Fixes

• VrAPI
  • Fixed adb logcat -s VrApi failure to report memory stats.
• Native Application Framework
  • Fixed a bug where missing font glyphs were skipped instead of rendering as an asterisk.
• Cinema SDK
  • Fixed last media poster showing up as the first poster for another category.
  • Play/Pause icon does not function correctly after unmount/mount.
  • Unmount/Mount does not pause media immediately.
  • Fixed bad camera orientation in Void theater when auto-switching from another theater due to starting a video with a resolution greater than 1920x1080.
• 360 Photos SDK
  • Fixed Favorites and Folder Browser icon switching in the attribution menu.
  • Fixed menu state bug causing background scene not to be drawn.
  • Fixed menu orientations not resetting on reorient.
  • Increased vertical spacing between categories in Folder Browser to improve thumbnail scrollbar fit.
• 360 Videos SDK
  • Fixed media failure to pause immediately when unmounted.
  • Fixed movie not pausing on launching system activities.
  • Fixed menu orientation when resuming app.
  • Fixed gaze cursor not showing up when in Browser.
• Build Scripts
  • Fix for devices over adb tcpip: If the phone was connected over TCP, it was trying to find oculussig_WWW.XXX.YYY.ZZZ:PPP when checking for the oculussig file.
  • If an install and run was requested but no devices found, now reports to user rather than quitting silently.
  • Change directories in an exception-safe manner.
• VrCapture / Remote Monitor
  • Fixed rare crash when disconnecting from remote host on OS X.
  • Reconnecting to an app multiple times no longer puts the capture library in an undefined state.

Known Issues

• Unity 4 with Oculus Runtime for OS X 0.4.4 and Legacy Integration 0.6.1.0 or 0.6.0.2
  • Editor crashes when building APK or pressing play in Play View; Mac standalone player crashes. To fix, update to Oculus Runtime for OS X 0.5.0.1. Before updating your runtime, be sure to run uninstall.app (found in /Applications/Oculus/) to remove your previous installation.
• VrApi implicitly links to libGLESv3.so, so currently you cannot load libvrapi.so on devices without OpenGL ES 3.0 support.
• VrCapture / Remote Monitor
  • GPU Zones currently work on the Galaxy S6 only.
  • Timer Queries are not functional on Adreno based devices.
  • Integrated systrace support is under development and is currently disabled.
  • Some VPNs break auto-discovery.


0.6.0.1

Overview

This release of the Mobile SDK includes a fix to performance issues related to our Universal Menu and a hitching problem associated with data logging in VrApi, as well as some other minor updates. If you are upgrading from Mobile SDK v 0.5, be sure to review the 0.6.0 release notes for additional information on significant changes to our native development libraries as well as other updates.

Note that our Mobile SDK documentation is now available online here: https://developer.oculus.com/documentation/mobilesdk/latest/

New Features

• Allow Unity MediaSurface dimensions to be modified via plugin interface.

Bug Fixes

• Fixed performance regression triggered when coming back from the Universal Menu.
• Fixed not being able to enable chromatic aberration correction in the Unity plugin.
• Reduced once per second frame drop due to gathering stats.
• Fixed Do Not Disturb setting.
• Fixed adjusting clock levels from Unity during load.

Known Issues

• adb logcat -s VrApi always reports the amount of available memory as 0.

0.6.0

Overview of Native Changes

The 0.6.0 version of the Oculus Mobile SDK introduces several major changes that necessitate updates to the VRLib structure, native app interface, and development workflow. If you are migrating from a previous SDK, please refer to the "Migrating from Earlier Versions" sections of the Native Development guide.

VRLib has been restructured into three separate libraries in order to make the code more modular and to provide a smoother workflow:

• LibOVR – the Oculus Library
• VrApi – the minimal API for VR
• VrAppFramework – the application framework used by native apps

Both LibOVR and VrAppFramework ship with full source. The VrApi is shipped as a set of public include files, a pre-built shared library, and a jar file. Shipping VrApi as a separate shared library allows the VrApi implementation to be updated and/or changed after an application has been released. This allows us to apply hot fixes, implement new optimizations, and add support for new devices without requiring applications to be recompiled with a new SDK. VrApi source is no longer included with the SDK.

The Vr App Interface (now part of VrAppFramework) has been simplified and now has a clearly-defined lifecycle. The order in which functions are called has been clarified – previously, some functions could be called either in VR mode or outside of VR mode. The lifecycle can be found in VrAppFramework/Src/App.h.

The VRMenu code has been refactored in preparation for moving it into its own static library. User Interface-related interfaces that were previously passed to functions individually are now part of OvrGuiSys.


There are three new native samples. These samples implement the same simple scene in three different ways, illustrating three approaches to native application development:

• VrCubeWorld_SurfaceView – uses a plain Android SurfaceView and handles all Activity and Surface lifecycle events in native code. This sample uses only the VrApi and uses neither the Oculus Mobile Application Framework nor LibOVR.
• VrCubeWorld_NativeActivity – uses the Android NativeActivity class. This sample uses only the VrApi and uses neither the Oculus Mobile Application Framework nor LibOVR.
• VrCubeWorld_Framework – uses the Oculus Mobile Application Framework.

For developers who prefer to use command-line scripts to build native projects, this SDK provides a robust cross-platform set of Python build scripts to replace the platform-specific build scripts provided with previous SDKs.

Overview of Unity Integration Changes

• Oculus Runtime is no longer required for mobile development.
• Synced with the Oculus PC SDK 0.6.0.0 beta.
• Allows clients to re-map plugin event IDs.

For both the PC and Mobile SDKs we recommend the following Unity versions or higher: Unity Pro 4.6.3, Unity Professional 5.0.2.p2, Unity Free 4.6, or Unity Personal 5.0.2.p2. For mobile development, compatibility issues are known to exist with Unity 5 and OpenGL ES 3.0 – please check back for updates. Earlier versions of Unity 5 should not be used with the Mobile SDK.

Note: Before installing or integrating this distribution, we strongly recommend backing up your project before attempting any merge operations.

New Features

• VrAPI
  • Improved frame prediction, in particular for Unity.
  • Leaving the CPU clock unlocked until the application starts rendering frames to make application loading/resuming faster.
  • Improved Performance Metrics via Logcat (see Basic Performance Stats through Logcat section of the Native Development Guide for more information).
• Native Application Framework
  • Improved Android Activity and Android Surface lifecycle handling.
  • Fixed volume bar not showing on first click of the volume adjustment.
• 360 Photos SDK
  • Gaze Cursor now disappears when looking away from Attribution menu.
• Blocksplosion
  • Added OS X input mappings.

API Changes

• Native Application Framework
  • Automatic caching of files extracted from apk.

Bug Fixes

• VrAPI


  • Removed additional frame of latency between synthesis and display.
  • Fixed intra frame object motion judder due to TimeWarp displaying eye buffers too early when eye buffer rendering completed early.
  • Fixed TimeWarp getting more than one frame behind after a bad hitch.
  • Workaround for "loss of head tracking" after closing and re-opening the device 96 times.
• Native Application Framework
  • Fixed volume bar not showing on first click of the volume adjustment.
• Unity Integration
  • Fixed prediction glitch every 64 frames.
  • Use correct prediction for OVR_GetCameraPositionOrientation.
  • Fixed location of the PlatformMenu Gaze Cursor Timer.
• Cinema SDK
  • Fixed playback control reorienting screen in Void theater when user clicks on controls when they're off the screen on portrait videos.
  • Fixed divide by zero in SceneManager::GetFreeScreenScale() which caused Void theater to crash when starting a movie.
• 360 Photos SDK
  • Fixed Favorites button not creating Favorites folder.
• Blocksplosion
  • Fixed launch blocks falling straight down when launched when built with Unity 5.
  • Fixed touch triggering "next level" after returning from the System Activity.
  • Fixed launch block being offset when looking left or right.

Known Issues

• Initial launch of 360Photos SDK Sample can crash if a duplicate category folder name is present on the target device's sdcard. Subsequent launches of the app will not crash. A fix is in the works for the next release.

0.5 Release Notes

This document provides an overview of new features, improvements, and fixes included in version 0.5 of the Oculus Mobile SDK.

0.5.1

Overview of Major Changes

This document provides an overview of new features, improvements and fixes that are included in this distribution of the Oculus Mobile SDK.

The most significant change in 0.5.1 is to System Activities event handling in Unity. The 0.5.0 code for handling System Activities events in Unity was doing heap allocations each frame. Though this was not a leak, it would cause garbage collection to trigger much more frequently. Even in simple applications, garbage collection routinely takes 1 to 2 milliseconds. In applications that were already close to dropping below 60 Hz, the


increased garbage collection frequency could cause notable performance drops. The event handling now uses a single buffer allocated at start up.

Other notable changes were to HMT sensor prediction — specifically, clamping of the delta time used for prediction. Without this change, delta times on application startup could sometimes be huge, causing an apparent jump in screen orientation.

Unity Developers: As with Mobile SDK v 0.5.0, Unity developers using this SDK version must install the Oculus Runtime for Windows or OS X. This requirement will be addressed in a future release of the SDK.

Note: Before installing or integrating this distribution, we strongly recommend that you back up your project before attempting any merge operations.

API Changes

• Sensor Prediction: Make sure Predicted deltaTime can never be negative or become huge.
• Sensor Prediction: Clamp delta time used for sensor prediction to 1/10th of a second instead of 1/60th so that we don't under-predict if the target frame rate is not being met.
• Better handling for a case where SystemActivities resumes without an explicit command. This can happen if the top app crashes or does a finish() instead of launching Home to exit.

Bug Fixes

• Unity Integration
  • Rework System Activities Event handling to prevent any per-frame allocations that could trigger the Garbage Collector.
• Native Framework
  • Fixed potential MessageQueue deadlock.
  • Bitmapfont - Fix case where billboarded text is right on top of the view position and results in a zero-length normal vector.
  • Bitmapfont - Fix for font info height not being y-scaled.
  • Renamed VERTICAL_BOTTOM to VERTICAL_BASELINE because it aligns to the first row's baseline rather than the bottom of the entire text bounds.
  • Bitmapfont - Fix for VERTICAL_CENTER_FIXEDHEIGHT to correctly account for the ascent / descent when rendering single and multi-line text.
  • VrMenu Fader - Update only performed if frame time is > 0.0f.
  • VrMenu - Add ProgressBar component.
  • VrMenu - Parent / child rotation order in menus was backwards, causing confusion when local rotations were used.
  • VrMenu - Don't use an old view matrix to reposition menus on a reorient. Since we reorient to identity (with respect to yaw), we should reposition with respect to identity instead of the last frame's view matrix.
  • AppLocal::RecenterYaw() now adjusts lastViewMatrix so that it instantly reflects the recenter of the sensor fusion state.
  • FolderBrowser - Allow implementers to create their own panel object.

Known Issues

• Application version number remains 0.5.0 and was not incremented to 0.5.1. This does not affect app functionality and will be addressed in a future release.
• For use with the Mobile SDK, we recommend Unity version 4.6.3. The Mobile SDK is compatible with Unity 5.0.1p2, which addresses a problem with OpenGL ES 3.0, but there is still a known Android ION memory leak. Please check back for updates.


0.5.0

Overview of Major Changes

The Universal Menu has been removed from VrLib, allowing modifications to the Universal Menu without requiring each app to upgrade to the latest SDK. The Universal Menu is now part of the Oculus System Activities application and is downloaded and updated alongside Oculus Home and Horizon. Make sure you update your version of Home in order to test your application with the new Universal Menu.

If you are migrating from a previous SDK, please refer to the "Migrating from Earlier Versions" sections of the Native Development and Unity Integration guides.

The Mobile Unity Integration is now synced with the Oculus PC SDK 0.5.0.1 Beta. Please ensure you have installed the corresponding 0.5.0.1 Oculus runtime; it can be found at the following location: https://developer.oculus.com/downloads/

VrPlatform entitlement checking is now disabled by default in Unity; handling for native development is unchanged. If your application requires this feature, please refer to the Mobile SDK Documentation for information on how to enable entitlement checking.

Applications built with Mobile SDK 0.5.0 or later will be compatible with the Samsung GALAXY S6.

Note: Before installing or integrating this distribution, we strongly recommend that you back up your project before attempting any merge operations.

New Features

• Android Manifest
  • Mobile SDK 0.5.0 no longer requires PlatformActivity in the AndroidManifest.xml file. If you have previously worked with an earlier SDK, the following block must be removed:

  • The camera permission is also no longer required and can be removed from your manifest if your app does not rely on it:

  • For additional information on manifest requirements, see the relevant documentation in the Native Development Guide, Unity Integration Guide, and Mobile App Submission Guide.
• Native Framework
  • Folder Browser
    • Added support for dynamically loaded categories.
    • Factored out MetaData from FolderBrowser into MetaDataManager.h/cpp.
    • Improved wrap-around controls.
  • Sound Limiter
    • Application sound_asset.json files may now override specific menu sounds.
  • VrMenu
    • Added hit test result to VRMenuEvent.
    • Added debugMenuHierarchy console command for debug drawing of VrMenu hierarchy.
    • Now uses current view matrix for gaze cursor and menu positions.


    • Added options for horizontal and vertical text justification.
    • Multi-Line text justification.
    • Added option to allow text to line up horizontally with different descenders.
• Unity Integration
  • Synced with the Oculus PC SDK 0.5.0.1 Beta.
  • VrPlatform entitlement checking is now disabled by default.
• Cinema SDK
  • UI reworked using new UI components.
• 360 Photos SDK
  • Added libjpeg.a directly to projects in order to avoid dependency on libjpeg source.
  • Metadata is now app-extensible. Added functionality for reading and writing extended metadata during app loading and saving.
• 360 Videos SDK
  • Added libjpeg.a directly to projects in order to avoid dependency on libjpeg source.
  • Metadata is now app-extensible. Added functionality for reading and writing extended metadata during app loading and saving.

API Changes

• VrLib
  • Universal Menu moved from VrLib into a separate application.
  • Universal Menu specific functionality removed from VrLib.
  • Adds Oculus Remote Monitor support.
  • VrApi restructured for future modularity and ease of development.
  • Local preferences are now allowed in Developer Mode. Please refer to the Mobile SDK Documentation for more information.
  • Default eye height and interpupillary distance have been changed to conform to the default values used by the PC SDK.
  • The native head-and-neck model has been re-parameterized to use a depth/height pair rather than angle/length to conform to the PC SDK.
  • HMDState sensor acquisition code has been re-written to make it reliable and thread safe.
  • Now restores last-known good HMD sensor yaw when recreating the HMD sensor.

Bug Fixes

• Unity Integration
  • Health and Safety Warning no longer displays in editor Play Mode if a DK2 is not attached.
• Cinema SDK
  • Fixed playback controls reorienting screen in void theater when user clicks on controls when they are off the screen on portrait videos.
• OvrGuiSys
  • RemoveMenu is now DestroyMenu and will now free the menu.

Known Issues

• Unity Integration


  • For use with the Mobile SDK, we recommend Unity version 4.6.3, which includes Android 5.0 Lollipop support as well as important Android bug fixes. While the Mobile SDK is compatible with Unity 5.0.0p2 and higher, several issues are still known to exist, including an Android ION memory leak and compatibility issues with OpenGL ES 3.0. Please check back for updates.

0.4 Release Notes

This document provides an overview of new features, improvements, and fixes included in version 0.4 of the Oculus Mobile SDK.

0.4.3.1

Overview of Major Changes

This release adds support for Unity 5.0.0p2. Developers using Unity 5 must update to this version, and make sure that they are using the latest patch release from Unity.

We would like to highlight the inclusion of the new Mobile Unity Integration with full DK2 support based on the Oculus PC SDK 0.4.4. As this is a significant API refactor, please refer to the Unity Development Guide: Migrating From Earlier Versions section for information on how to upgrade projects built with previous versions of the Mobile Unity Integration.

Note: Before installing or integrating this distribution, we strongly recommend that you back up your project before attempting any merge operations.

0.4.3

New Features

• Android Manifest
  • Applications will now be required to specify the following permission to support distortion configuration updates by the system service.

  • Note: Always refer to the Oculus Mobile Submission Guidelines for the latest information regarding the submission process.
• VrPlatform
  • Support for entitlement checking with VrPlatform. Integration steps and instructions are included in the Oculus Mobile Developer Guide's Mobile SDK Setup section.
• Unity Integration
  • New Mobile Unity Integration Based on Oculus PC SDK 0.4.4.
• Miscellaneous
  • The Mobile SDK Documentation folder hierarchy has been re-organized into a single document.

API Changes

• VrLib
  • Localized string updates for the Universal Menu.


  • Improvements to yaw drift correction.
  • Fixed vsync possibly being turned off by the Universal Menu when selecting reorient.
  • Pre-register nativeSetAppInterface to work around a JNI bug where JNI functions are not always linked.
  • Do not allow nativeSurfaceChanged to use a deleted AppLocal in case surfaceDestroyed is executed after onDestroy.
  • Removed resetting the time warp when sensor device information is not populated on application launch.
  • Improved Passthrough Camera latency by disabling Optical Image Stabilization (Exynos chipset only).
  • Free EGL sync objects on time warp thread shutdown.

Bug Fixes

• 360 Videos SDK
  • Fixed bug where a few 360 videos would not play.
  • Fixed several UI bugs.
  • Added extra error handling.
• 360 Photos SDK
  • Fixed several UI bugs.

0.4.2

Overview of Major Changes

If you are developing with Unity, we recommend updating to Unity 4.6.1, which contains Android 5.0 – Lollipop support.

We would like to highlight the inclusion of the new Mobile Unity Integration with full DK2 support based on the Oculus PC SDK 0.4.4. As this is a significant API refactor, please refer to the Unity Development Guide: Migrating From Earlier Versions section for information on how to upgrade projects built with previous versions of the Mobile Unity Integration.

Note: Before installing or integrating this distribution, we strongly recommend that you back up your project before attempting any merge operations.

API Changes

• VrLib
  • Universal Menu localization support: English, French, Italian, German, Spanish, Korean.
  • Move Direct Render out of VrApi and into TimeWarp.
  • Print battery temperature to logcat.
  • Fix rendering of TimeWarp Debug Graph.
• Unity Integration
  • Fix for camera height discrepancies between the Editor and Gear VR device.
  • Moonlight Debug Util class names now prefixed with OVR to prevent namespace pollution.
  • Provide callback for configuring VR Mode Parms on OVRCameraController; see OVRModeParms.cs for an example.
• Native Framework


  • Fixed bug in which Volume toast is not dismissed if user transitions to Universal Menu while the toast is active.
  • Allow for app-specific handling when the user selects Reorient in the Universal Menu.
  • SearchPaths: Now correctly queries Android storage paths.
  • SearchPaths: Refactored to OvrStoragePaths.
  • FolderBrowser: Improved load time by removing check for thumbnails in the application package.
  • FolderBrowser: Improved scrolling and swiping physics.
  • FolderBrowser: Added bound back and wrap around effects.
• Sample Project Changes
  • 360 Photos SDK
    • Fixed bug in which the user could easily close the menu unintentionally when returning from a photo.
    • Fixed crash that occurred when photos stored in the physical "Favorites" folder were untagged as "Favorites".
    • Fixed crash caused by swiping on the "no media found" screen.
  • 360 Videos SDK
    • Background star scene now fades to black when starting a video.
    • Changed corrupted media message to show only filename so it fits in the view.
    • Fixed rendering artifact that occurred when starting to play a video.

0.4.1

Overview of Major Changes

Added support for Android 5.0 (Lollipop) and Unity Free.

New Features

• Mobile SDK
  • Added support for Android 5.0 (Lollipop).
• Unity
  • Added Unity Free support for Gear VR developers.

0.4.0

Overview of Major Changes

First public release of the Oculus Mobile SDK.

New Features

• First public release of the Oculus Mobile SDK.

API Changes

• The Mobile SDK is now using API Level 19. Please make the following change to your manifest file:


Bug Fixes

• Health and Safety Message no longer required on mount.
• Lens distortion updated for final Gear VR lenses.
• Improved Mag Yaw drift correction.
• Optional ability to update distortion file without the need for app rebuild.
• Changed default font to Clear Sans.
• Unity vignette rendering updated to match native (slightly increases effective FOV).
• Unity volume pop-up distance updated to match native.

• Fix Gaze Cursor Timer scale.

System Activities/VrApi Release Notes

This section describes changes to System Activities and VrApi.

1.10.x Release Notes

System Activities 1.10.0 Release Notes

Beginning with this release, the VrApi implementation has been renamed System Driver and now ships separately from System Activities. For System Driver release notes, see System Driver Release Notes on page 156.

1.0.x Release Notes
1.0.3.7 Release Notes
New Features
• Added new getting started tutorial support.
1.0.3.6 Release Notes
Bug Fixes
• Removed VrApi interface threading restrictions when using explicit EGL objects.
• Prevent the wrong JNIEnv from being used in vrapi_LeaveVrMode when VrApi interface methods are called from different threads.
1.0.3.5 Release Notes
Bug Fixes
• Fixed latency and tearing issues.
• VrApi’s framebuffer capture now only streams buffered layers, preventing Oculus Remote Monitor from streaming a single cursor layer.
• Modified Developer Mode so that VRAPI_TRACKING_STATUS_HMD_CONNECTED is not set when the device is undocked with phone sensors enabled, allowing code that checks this flag before executing while undocked to run properly in Developer Mode.


1.0.3.4 Release Notes
New Features
• Phones in Developer Mode now implement limited orientation tracking using phone sensors. This may be disabled with a Local Preferences or System Properties setting. For more information, see Local Preferences and Android System Properties.
• Volume indicator now turns red at volume 11 when headphones are detected.
• Redesigned icons for volume, Bluetooth, Wi-Fi, airplane mode, notifications, brightness, reorient, and battery.
• Moved Pass-Through Camera from Settings to Utilities in the Universal Menu.
• Reorient function now opens the Pass-Through Camera and displays instructional text.
• Polished UI elements for friend requests and the status bar.
• Changed brightness/volume UI to a slider bar.

1.0.3.3 Release Notes
New Features
• The Universal Menu (UM) may now be realigned to any direction except straight down. To realign, look away from the UM and tap the touchpad when the message “Tap to reorient menu” appears.
• Added SMS text preview. To view message contents, hover over the notification and click “View.”
• Game invites from friends may now be accepted from the UM notifications page. When an invite is received through a supported app, a gamepad icon will appear in the upper left corner. The user may enter the UM to see details and choose to join or clear. If ignored, game invites expire in ten minutes.
• Gamepad B button now acts as the Back button.
• A notification count badge now appears over the Notifications button when new notifications are received.
• Long text is now auto-scaled or clipped to fit in UI elements.
• SMS now displays the sender’s phone number.
• Volume and brightness indicators can now be changed by gazing over them and swiping forward or backward.
• Added UI for inviting users to games to System Activities and the UM. This interface will be exposed to developers in a future release of the Oculus Platform SDK library.
Bug Fixes
• Fixed duplicate sounds when hovering over items.
• Fixed misaligned time when using the 24-hour time format.
• Fixed a text aliasing bug.
• Fixed a crash when receiving SMS messages.
• Improved text wrapping in Chinese, Japanese, and Korean.
• Added missing Korean translation for “incoming calls”.
• All fonts now support the Unicode replacement character. This character looks like a diamond with a question mark in it and is rendered when a character is requested but not found in the font.
• Brightness level indicator no longer changes when re-entering the UM.
• Fixed issues related to the Android OS sending phantom MENU key events on a back press.
1.0.3.2 Release Notes
New Features
• User’s real name now displayed on profile page.


Bug Fixes
• Fixed minor word-wrapping bug.
• Fixed subtle overlay layer jittering associated with large, odd-sized textures.
1.0.3.1 Release Notes
New Features
• Android N compatibility update in VrApi Loader to work with library namespacing.
Bug Fixes
• Updated stb library to fix JPEG loading bugs.
• Font rendering improvements.
• Fixed automatic word wrapping for Chinese text.
• Fixed incorrect word wrapping in Japanese.

1.0.3.0 Release Notes
New Features
• Added Universal Menu support for Profile, Friends, Notifications, and Game Invites (app must support this feature).
• No longer require an OpenGL context to be bound during vrapi_EnterVrMode.
• Added ability to change clock levels from a background thread.
• Defer deleting texture swapchains.
• A fatal error is now triggered if vrapi_Shutdown is called before vrapi_LeaveVrMode.
• Made improvements to Oculus Remote Monitor; see the OVRMonitor release notes for details.
• Local Preferences are now also mapped to Android system properties.
Bug Fixes
• Return NULL from vrapi_EnterVrMode when eglCreateWindowSurface fails.
• Fix for vsync handling taking an extremely large amount of CPU time on S7+Snapdragon devices running certain OS versions.
1.0.2.5 Release Notes
New Features
• Enabled distortion mesh clipping by default: WARP_MESH_CLIP_MODE_SQUARE.
• Updated V-sync timing and events.
• Added support for different rendering completion fences.
1.0.2.3 Release Notes
Bug Fixes
• Added workaround to determine the external SD card path if the Gear VR Service version is lower than 2.4.18.
1.0.2.2 Release Notes
New Features
• Enabled video capture and screenshot buttons in the Utilities submenu of the Universal Menu. See Screenshot and Video Capture for details.


• Upgraded icon resolutions for all Universal Menu icons.
Bug Fixes
• Fix to maintain consistent text size for the Passthrough Camera Off button.
1.0.2.1 Release Notes
This release of System Activities adds new permission requirements for features in development. This update will prompt the user to accept camera and network access.
New Features
• Added two additional weights to the EFIGS font.
• Improved text word wrap integration in VRMenuObject.
Bug Fixes
• Fixed update for buttons in Universal Menu.
• Fixed VRMenuObjects to prevent freeing textures that were allocated via the texture manager.
• Fixed Universal Menu performance regression.
• Fixed out-of-range font weight making text invisible.

1.0.2.0 Release Notes
New Features
• Redesigned Universal Menu.
• Video Capture (Alpha) video files are now written into the app’s cache directory. If permissions permit, the video file is then moved into /sdcard/Oculus/VideoShots/ with the app’s package name appended for easy sorting.
Bug Fixes
• Fixed Unity support with Video Capture (Alpha).
• getApplicationName no longer uses the string table to look up the application name, in case the name is not in the string table.
1.0.1.4 Release Notes
Bug Fixes
• Fixed HMD Device Creation Failure.
• Implemented workaround for Android abandoning the buffer queue when using a NativeActivity.
1.0.1.3 Release Notes
New Features
• Added logging for the vrapi version requested by the app, making it easier to determine which SDK a given application was built against.


• Added warning if loading the default distortion file fails.
• Modified getdevice and getgputype checks to run after the VR permissions check.
• Distortion file is now loaded directly from VRSVC without requiring a temp file in /sdcard/Oculus/, eliminating the need for the READ_EXTERNAL_STORAGE Android permission (requires Mobile SDK 1.0 or later; see the manifest line after the Known Issues list below).
• “Do Not Disturb” mode in the Universal Menu now only checks for a state change when toggled.
Bug Fixes
• Fixed button rendering issue in Universal Menu.
Known Issues
• On a clean boot of the target device, distortion correction is missing when running a VR app outside of the headset with Developer Mode enabled. The target device must be docked to a Gear VR headset at least once after every clean boot in order to cache the distortion file.
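For reference, the permission that is no longer required corresponds to the following manifest entry; this is a minimal sketch using the standard Android permission name, and apps built against older Mobile SDK versions may still need it:

    <uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />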

System Driver Release Notes
This section describes changes to System Driver (previously named VrApi).

1.0.x Release Notes
System Driver 1.0.4.0 Release Notes
Beginning with this release, VrApi has been renamed System Driver and now ships separately from System Activities. For System Activities release notes, see System Activities/VrApi Release Notes on page 152.
New Features
• Moved SysUtils library from the Mobile SDK into System Driver for applications built with Mobile SDK 1.0.4 or later.
• Removed TimeWarp Debug Graph support.
• Added support for Prologue startup experience.

Oculus Remote Monitor Release Notes
This section describes changes to Oculus Remote Monitor.


1.x Release Notes
Oculus Remote Monitor for PC and OS X 1.0.3
For more information and usage instructions, please see Oculus Remote Monitor and VrCapture on page 84.
New Features
• CPU Scheduler events are now available on Galaxy S7.
• Added memory allocation tracking. Every malloc/free can now be charted in the profiler view.
• Added Head Tracking graph.
Bug Fixes
• Fixed corrupt data streams that could occur on slow networks.
• Improved profiler view performance.
• Fixed miscellaneous bugs.
Oculus Remote Monitor for PC and OS X 1.0.0
This is the initial stand-alone release. Release notes for earlier versions may be found in the Mobile SDK Release Notes.
New Features
• VR Developer Mode is no longer required if you have System Activities 1.0.2 or greater and an app built with Oculus Mobile SDK 1.0 or greater.
• Added experimental layer texel density and complexity visualizers (supported by apps built with Oculus Mobile SDK 1.0 or later).
• Improved network stability on Windows.
• Now available as a separate downloadable package from the full Mobile SDK download.
