
Mobile SDK Version 1.14


Copyrights and Trademarks

© 2017 Oculus VR, LLC. All Rights Reserved.

OCULUS VR, OCULUS, and RIFT are trademarks of Oculus VR, LLC. BLUETOOTH is a registered trademark of Bluetooth SIG, Inc. All other trademarks are the property of their respective owners. Certain materials included in this publication are reprinted with the permission of the copyright holder.


Contents

Mobile SDK Getting Started Guide ..... 6
    Mobile Development with Unity and Unreal ..... 7
    System and Hardware Requirements ..... 8
    Device Setup - Gear VR ..... 9
    Device Setup - Oculus Go ..... 11
    Android Development Software Setup ..... 12

Mobile Development Basics ..... 14
    Oculus Signature File (osig) and Application Signing ..... 14
    Android Studio Basics ..... 15
    SD Card Support - Gear VR ..... 22

Native Development Overview ..... 23
    Native Source Code ..... 23
    Native Samples ..... 24
    Android Manifest Settings ..... 25
    Reserved User Interactions ..... 26

Native Engine Integration ..... 28
    VrApi ..... 28
        Lifecycle and Rendering ..... 28
        Frame Timing ..... 31
        Latency Examples ..... 31
    VrApi Input API ..... 34
    Asynchronous TimeWarp (ATW) ..... 37
        TimeWarp Minimum Vsyncs ..... 38
        Consequences of not rendering at 60 FPS ..... 39
    Power Management ..... 40
        Managing Power Consumption ..... 40
        Power Management and Performance ..... 41
        Power State Notification and Mitigation Strategy ..... 42
    Advanced Rendering ..... 43
        Multi-View ..... 43
        Fixed Foveated Rendering ..... 47
        72 Hz Mode ..... 49

Native Application Framework ..... 51
    Creating New Apps with the Framework Template ..... 51
    UI and Input Handling ..... 52
    Native SoundEffectContext ..... 53
    Runtime Threads ..... 54

Other Native Libraries ..... 55

Media and Assets ..... 56
    Mobile VR Media Overview ..... 56
        Panoramic Stills ..... 56
        Panoramic Videos ..... 56
        Movies on Screens ..... 57
        Movie Meta-data ..... 57
        Oculus 360 Photos and Videos Meta-data ..... 58
        Media Locations ..... 58
    Native VR Media Applications ..... 59

Testing and Troubleshooting ..... 61
    Tools and Procedures ..... 61
        Android System Properties ..... 61
        Screenshot and Video Capture ..... 63
    Oculus Remote Monitor ..... 64
        Setup ..... 65
        Basic Usage ..... 65
        Using Oculus Remote Monitor to Identify Common Issues ..... 72
    OVR Metrics Tool ..... 78
    Android Debugging ..... 82
        Adb ..... 82
        Logcat ..... 84
    Application Performance Analysis ..... 85
        Basic Performance Stats through Logcat ..... 85
        SysTrace ..... 86
        NDK Profiler ..... 87
        Snapdragon Profiler ..... 89
    Native Debugging ..... 89
        Native Debugging with Android Studio ..... 90
        Native Debugging with ndk-gdb ..... 94

Mobile Native SDK Migration Guide ..... 97
    Migrating to Mobile SDK 1.14.0 ..... 97
    Migrating to Mobile SDK 1.12.0 ..... 98
    Migrating to Mobile SDK 1.9.0 ..... 99
    Migrating to Mobile SDK 1.7.0 ..... 101
    Migrating to Mobile SDK 1.5.0 ..... 103
    Migrating to Mobile SDK 1.0.4 ..... 104
    Migrating to Mobile SDK 1.0.3 ..... 106
    Migrating to Mobile SDK 1.0.0 ..... 114

Release Notes ..... 118
    1.14 Release Notes ..... 118
    1.12 Release Notes ..... 119
    1.9 Release Notes ..... 120
    1.7 Release Notes ..... 121
    1.5 Release Notes ..... 122
    1.0 Release Notes ..... 122
    0.6 Release Notes ..... 127
    0.5 Release Notes ..... 133
    0.4 Release Notes ..... 136
    System Activities/VrApi Release Notes ..... 139
        1.22.x Release Notes ..... 139
        1.21.x Release Notes ..... 140
        1.20.x Release Notes ..... 140
        1.19.x Release Notes ..... 140
        1.16.x Release Notes ..... 140
        1.15.x Release Notes ..... 140
        1.14.x Release Notes ..... 140
        1.13.x Release Notes ..... 141
        1.12.x Release Notes ..... 141
        1.11.x Release Notes ..... 141
        1.10.x Release Notes ..... 142
        1.0.x Release Notes ..... 142
    System Driver Release Notes ..... 146
        1.14.x Release Notes ..... 146
        1.13.x Release Notes ..... 147
        1.12.x Release Notes ..... 147
        1.11.x Release Notes ..... 147
        1.10.x Release Notes ..... 147
        1.9.x Release Notes ..... 147
        1.8.x Release Notes ..... 147
        1.7.x Release Notes ..... 148
        1.6.x Release Notes ..... 148
        1.5.x Release Notes ..... 148
        1.0.x Release Notes ..... 149
    Oculus Remote Monitor Release Notes ..... 150
        1.x Release Notes ..... 150

Mobile SDK Documentation Archive ..... 152


Mobile SDK Getting Started Guide

The Oculus Mobile SDK includes libraries, tools, and resources for native development for Oculus Go and Gear VR.

SDK Contents

• VrApi for third-party engine integration (not required for Unity or Unreal).
• Native application framework for building high-performance VR applications from scratch.
• Additional libraries providing support for GUI, locale, and other functionality.
• Native project sample applications and source to provide a reference model for creating your own VR applications.
• Tools and resources to assist with native development.

Mobile SDK Intro Documentation

• Getting Started Guide: A one-time guide to environment setup.
• Mobile Development Basics: Information every developer should know about Oculus mobile development. Every developer should read through this guide.

Native Developers

Most of the Mobile SDK guide is written for native developers. Complete the setup described in the Getting Started Guide, then move on to the Native Development Overview on page 23.

Unity and Unreal Developers

Mobile developers working with Unity and Unreal should begin with Mobile Development with Unity and Unreal on page 7, as setup and development differ substantially from native setup and development.

Developing for Go and Gear VR

For the most part, developing apps for the Go and Gear VR is the same. However, you should be aware of some key differences between building for the two platforms, especially if you've previously built an app for Gear VR. The Oculus Go has the following restrictions:

• No Google Play Services. Unlike the Samsung Galaxy devices that run Gear VR, Oculus Go does not ship with Google Play Services installed. You cannot rely on Google Play Services (e.g., Google Firebase, Google Cloud Messaging) when running on Oculus Go.
• No 2D Surface. Oculus Go does not have a 2D phone display, so some app behaviors (such as push notifications, or authentication via a separate Android application) do not make sense on Oculus Go.
• No Camera. Oculus Go does not have a camera, and cannot run applications that rely upon access to a camera.
• No HMD Touchpad. Oculus Go does not have a touchpad on the HMD. Your app should not refer to an HMD touchpad when running on Oculus Go.
• Different Controller. The Oculus Go Controller and Gear VR Controller share the same inputs: both are 3DOF controllers with clickable trackpads and an index finger trigger. Though these two devices provide the same inputs, the physical design of each is distinct. If your app displays a visible controller, you should change the model displayed depending on whether you are running on Gear VR or Oculus Go (see the sketch below). Alternatively, a stylized controller model that is distinct from both the Oculus Go Controller and the Gear VR Controller is acceptable.
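For illustration, a device check along these lines can drive the controller model selection. This is a minimal sketch that assumes a VrApi version exposing VRAPI_SYS_PROP_DEVICE_TYPE and the Oculus Go device-type range (check VrApi_Types.h in your SDK for the exact enums); the model names are hypothetical placeholders.

    #include <stdbool.h>
    #include "VrApi.h"
    #include "VrApi_Types.h"

    // Returns true when running on an Oculus Go class device (assumes the
    // VRAPI_DEVICE_TYPE_OCULUSGO_* enums from VrApi_Types.h are available).
    static bool IsOculusGo( const ovrJava * java )
    {
        const int deviceType = vrapi_GetSystemPropertyInt( java, VRAPI_SYS_PROP_DEVICE_TYPE );
        return deviceType >= VRAPI_DEVICE_TYPE_OCULUSGO_START &&
               deviceType <= VRAPI_DEVICE_TYPE_OCULUSGO_END;
    }

    // Hypothetical usage: pick which controller mesh to render.
    // const char * model = IsOculusGo( &java ) ? "controller_go.ovrscene"
    //                                          : "controller_gearvr.ovrscene";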


Platform Features

Mobile applications may use our Platform SDK (available separately from our Downloads page) to add features related to security (e.g., entitlements), community (e.g., rooms, matchmaking), revenue (e.g., in-app purchases), and engagement (e.g., leaderboards). For more information, see our Platform SDK documentation.

Application Submission

For information on preparing to submit your mobile VR application to Oculus for distribution through the Oculus Store, see our Publishing Guide.

Thank you for joining us at the forefront of virtual reality!

To become acquainted with Gear VR, we recommend starting with the Gear VR Documentation, which covers topics including:

• Device Features and Functionality
• Connecting the Headset
• Navigation and App Selection

Questions?

Visit our developer support forums at https://developer.oculus.com. Our Support Center can be accessed at https://support.oculus.com/.

Mobile Development with Unity and Unreal

Unity and Unreal provide built-in support for Oculus mobile development. If you wish to use the mobile SDK with other engines, please see Native Engine Integration on page 28.

Unity Mobile Development

Unity 5.1 and later provides built-in VR support for Oculus Go and Gear VR, enabled by checking a box in Player Settings. The Oculus Mobile SDK is not required. We provide a Utilities for Unity 5 unitypackage that includes supplemental scripts, scenes, and prefabs to assist development. The Utilities package is available from our Downloads page. For more information, see Preparing for Mobile Development in our Oculus Unity Guide.

Unreal Mobile Development

Unreal versions 4.10 and later provide built-in support for Oculus Go and Gear VR. The Oculus Mobile SDK is not required. The Android SDK is required for mobile development with Unreal; however, most Unreal developers do not need to install Android Studio or the NDK. Unreal developers should follow the instructions in our Device Setup - Gear VR on page 9 guide, and install the Java Development Kit (JDK) and Android SDK before beginning development. For more information, see Preparing for Unreal Mobile Development in our Unreal Developer Guide.


System and Hardware Requirements

Please begin by making sure that you are using supported hardware and devices for this release of the Oculus Mobile SDK.

Operating System Requirements

The Oculus Mobile SDK currently supports the following operating systems:

• Windows 7/8/10
• Mac OS: 10.10+ (x86 only)

Minimum System Requirements

The following computer system requirements for the Oculus Mobile SDK are based on the Android SDK system requirements:

• 2.0+ GHz processor
• 2 GB system RAM

Supported VR Headsets

• Oculus Go
• Samsung Gear VR

Supported Devices (Gear VR only)

• Samsung Galaxy S9+
• Samsung Galaxy S9
• Samsung Galaxy A8+ (2018)
• Samsung Galaxy A8 (2018)
• Samsung Galaxy Note 8
• Samsung Galaxy S8+
• Samsung Galaxy S8
• Samsung Galaxy S7 Edge
• Samsung Galaxy S7
• Samsung Galaxy Note FE
• Samsung Galaxy Note 5
• Samsung Galaxy S6 Edge+
• Samsung Galaxy S6 Edge
• Samsung Galaxy S6

Gear VR Innovator v2

• Samsung Galaxy S6 Edge
• Samsung Galaxy S6

Target Device Requirements

• API Level 21 (Android 5.0) or later


Accessories

Samsung Gear VR Controller

The Gear VR Controller orientation-tracked input device is the primary Gear VR controller going forward. We recommend that developers take advantage of its capabilities if it makes sense to do so with your application or game.

Oculus Go Controller

The Oculus Go Controller is the orientation-tracked input device for the Go.

Bluetooth Gamepad

Bluetooth gamepads are also supported. However, not all brands have been tested for compliance. Developers should perform appropriate due diligence for key code compatibility when utilizing gamepad input in their application. A gamepad is necessary for testing the sample applications which come with this release.

Compatible gamepads must have the following features:

• Wireless Bluetooth connection (BT3.0)
• Compatible with Android devices
• Start and Select buttons

Typical controls include:

• One Analog Stick
• Action Button (4)
• Trigger Button (2)

Device Setup - Gear VR

This section provides information on how to set up your supported Gear VR device for running, debugging, and testing your mobile application. Please review the System and Hardware Requirements above for the list of supported devices for this SDK release.

Note: This information is accurate at the time of publication. We cannot guarantee the consistency or reliability of third-party applications discussed in these pages, nor can we offer support for any of the third-party applications we describe.

Configuring Your Device for Debugging

In order to test and debug applications on your Android device, you will need to enable specific developer options on the device:

1. Configure Developer Options in Settings
   • Enable USB Debugging
   • Allow mock locations
   • Verify apps via USB
2. Configure Display Options in Settings
   • Disable lock screen


   • Set Display Timeout (optional)

Developer Options

Note: Depending on which mobile device you are using, options and menu names may vary slightly.

Developer options may be found under: Settings -> System -> Developer options. Developer options may be hidden by default. If so, you can expose these options with the following steps:

1. Locate the Build number option in Settings. On Android M and later, go to Settings -> System -> About device -> Software Info. On earlier Android versions, go to Settings -> System -> About device.
2. Scroll down to Build number.
3. Press Build number seven times. You should be informed that Developer options has been enabled.

Once you have found Developer options, enable the following:

USB Debugging: This will allow the tools to install and launch deployed apps over USB. You should see the screen shown in the accompanying figure.

Note: If the above screen does not appear, ensure that your computer recognizes the device when it is connected. If not, you may need to pull down the notifications bar on your phone and find the USB connection setting, and set USB to Software installation (it may be set to Charging by default). If you still do not see the pictured screen, try a different USB cable. If your phone is recognized by your computer but you do not see the above screen, try toggling USB Debugging off then back on.


Check Always allow this computer and hit OK.

To purge the authorized whitelist for USB Debugging, press Revoke USB debugging authorizations in the Developer options menu and press OK.

Allow mock locations: This will allow you to send mock location information to the device (convenient for apps which use Location Based Services).

Verify apps via USB: This will check installed apps from ADB/ADT for harmful behavior.

Display Options

The following display options are found in: Home -> Apps -> Settings -> Sound and Display.

Lock screen/Screen Security/Screen lock: Set to None so that the Home screen is instantly available, without swipe or password. Useful to quickly get in and out of the phone.

Display/Screen timeout: Set the time to your desired duration. Useful if you are not actively accessing the device but wish to keep the screen awake longer than the default 30 seconds.

See Android Debugging for more information.

Device Setup - Oculus Go

This section provides information on how to set up your Oculus Go device for running, debugging, and testing your application.

To begin development locally for Oculus Go, you must enable Developer Mode in the companion app. Before you can put your device in Developer Mode, you need to have created (or belong to) a developer organization on the Oculus Dashboard.

To join an existing organization:

1. Request access to the existing Organization from the admin.
2. You'll receive an email invite. Once accepted, you'll be a member of the Organization.

To create a new organization:

1. Go to: https://dashboard.oculus.com/organization/create
2. Fill in the appropriate information.

Enable Developer Mode

Then, to put your Oculus Go in developer mode:

1. Open the Oculus app on your mobile device.
2. In the Settings menu, select the Go headset that you're using for development.
3. Select More Settings.
4. Toggle Developer Mode on.

Once complete, you'll be able to use adb to push builds to your Go, as shown below.
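For example, once Developer Mode is on and the headset is connected over USB, a typical sideload with adb looks like the following; the APK name is a placeholder:

    adb devices                    # the Go should be listed as "device"
    adb install -r YourApp.apk     # install or replace your build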


Android Development Software Setup

This guide describes how to install the Android Studio Development Bundle that you'll use to build mobile VR apps. The Android Studio Development Bundle includes all the necessary tools you need to begin developing Android applications:

• Android Studio IDE (recommended IDE)
• Android Platforms
• Android SDK tools
• Android NDK
• Open JDK

Getting Started

If you're planning to use a Mac for development, first install Xcode: https://developer.apple.com/xcode/download/. If you're using another platform, you may skip this installation.

To get started, download Android Studio: https://developer.android.com/studio/index.html. Please refer to the Install Android Studio guide (https://developer.android.com/studio/install.html?pkg=studio) for detailed installation steps.

Install Additional Packages and Tools

Once Android Studio has been installed, you can then install the following packages:

• Android SDK Platform, API level 21
• Android SDK Build Tools, v 26.0.2
• Android NDK
• LLDB
• Open JDK

These packages are installed through the Android SDK Manager. To access the manager, either navigate to Tools > Android > SDK Manager or click the SDK Manager icon in the toolbar. If you do not have a project loaded, you may navigate to Configure > SDK Manager.

Verify that the correct packages and versions have been installed

In the Android SDK Manager, select the SDK Platforms tab. Verify that the following Android platform is installed, as it is the minimum version required by the SDK: Android 5.0 (Lollipop).

Then select the SDK Tools tab. Verify that the NDK and LLDB components are installed, and that the following tool version is installed: Android SDK Build Tools 26.0.2.

Note: A copy of the latest OpenJDK comes bundled with Android Studio. No additional installation is necessary.

Please refer to the following pages for information about the Android NDK (https://developer.android.com/studio/projects/add-native-code.html#download-ndk) and OpenJDK (https://developer.android.com/studio/intro/studio-config.html#jdk).


Configure Android Studio

Once you've finished installing the required packages, you can configure your development environment.

Android Studio Project Structure

To verify your settings in Android Studio, navigate to File > Project Structure > SDK Location. If you do not have a project loaded, navigate to Configure > Project > Defaults > Project Structure > SDK Location. Verify that Use embedded JDK is checked and that the following properties are set to appropriate values:

• Android SDK location
• JDK location
• Android NDK location

Make note of these locations; you'll use them to set your environment variables in the next section.

Environment Variables and Path

With the locations recorded in the previous step, set the following environment variables (an example follows the list):

• Set the environment variable JAVA_HOME to the JDK location.
• Set the environment variable ANDROID_HOME to the Android SDK location.
• Set the environment variable ANDROID_NDK_HOME to the Android NDK location.
• Add the JDK tools directory to your PATH, e.g., C:\Program Files\Android Studio\jre\bin.
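On Windows, for example, these variables could be set from a command prompt along these lines; the paths are placeholders for the locations you recorded above:

    setx JAVA_HOME "C:\Program Files\Android Studio\jre"
    setx ANDROID_HOME "C:\Users\you\AppData\Local\Android\Sdk"
    setx ANDROID_NDK_HOME "C:\Users\you\AppData\Local\Android\Sdk\ndk-bundle"
    setx PATH "%PATH%;C:\Program Files\Android Studio\jre\bin"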

Setting up your System to Detect your Android Device (Windows Only)

You must set up your system to detect your Android device over USB in order to run, debug, and test your application on an Android device.

If you are developing on Windows, you may need to install a USB driver for adb after installing the Android SDK. For an installation guide and links to OEM drivers, see the Android OEM USB Drivers document. Samsung Android drivers may be found on their developer site: http://developer.samsung.com/android/tools-sdks/Samsung-Android-USB-Driver-for-Windows

Windows may automatically detect the correct device and install the appropriate driver when you connect your device to a USB port on your computer. Access the Device Manager through the Windows Control Panel. If the device was automatically detected, it will show up under Portable Devices in the Device Manager. Otherwise, look under Other Devices in the Device Manager and select the device to manually update the driver. To verify that the driver successfully recognized the device, open a command prompt and type the command:

    adb devices

Note: You will need to successfully set up your Android development environment in order to use this command.

If the device does not show up, verify that the device is turned on with enough battery power, and that the driver is installed properly.


Mobile Development Basics

This guide reviews basic development steps you'll need to know, such as application signing and required support for the Settings Menu and volume handling.

For instructions on how to install an APK to your mobile device, see "Using adb to Install Applications" in Adb on page 82.

Oculus Signature File (osig) and Application Signing

Oculus mobile apps may require two distinct signatures at different stages of development:

• Oculus Signature File (required during development, remove for submission; only required for Gear VR apps)
• Android Application Signature (required for submission)

Oculus Signature File (osig) - Gear VR only

During development, your application must be signed with an Oculus-issued Oculus Signature File, or osig. This signature comes in the form of a file that you include in your application in order to access protected low-level VR functionality on your mobile device. Each signature file is tied to a specific device, so you will need to generate osig files for each device that you use for development. When your application is submitted and approved, Oculus will modify the APK so that it can be used on all devices.

Please see our osig self-service portal for more information and instructions on how to request an osig for development: https://dashboard.oculus.com/tools/osig-generator/

Android Application Signing

Android uses a digital certificate (also called a keystore) to cryptographically validate the identity of application authors. All Android applications must be digitally signed with such a certificate in order to be installed and run on an Android device. All developers must create their own unique digital signature and sign their applications before submitting them to Oculus for approval. For more information and instructions, please see Android's "Signing your Applications" documentation: http://developer.android.com/tools/publishing/app-signing.html

Make sure to save the certificate file you use to sign your application. Every subsequent update to your application must be signed with the same certificate file, or it will fail.

Note: Your application must be signed by an Android certificate before you submit it to the Oculus Store.

Android Application Signing and Unity

Unity automatically signs Android applications with a temporary debug certificate by default. Before building your final release build, create a new Android keystore by following the "Sign Your App Manually" instructions in Android's Sign your Applications guide, as sketched below. Then assign it with the Use Existing Keystore option, found in Edit > Project Settings > Player > Publishing Options. For more information, see the "Android" section of Unity's documentation here: http://docs.unity3d.com/Manual/class-PlayerSettings.html.
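For reference, the Android guide's manual path generates a keystore with keytool; a typical invocation looks like this, with the file name, alias, and validity period as placeholders:

    keytool -genkey -v -keystore my-release-key.keystore -alias my-app-alias -keyalg RSA -keysize 2048 -validity 10000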


Android Application Signing and Unreal

Once you add an osig to the appropriate Unreal directory, it will be added automatically to every APK that you build. You will need one osig for each mobile device.

To add your osig to Unreal for development:

1. Download an osig as described above.
2. Navigate to the directory \Engine\Build\Android\Java\.
3. Create a new directory inside \Engine\Build\Android\Java\ and name it assets. The name must not be capitalized.
4. Copy your osig to this directory.

When you are ready to build an APK for submission to release, we recommend that you exclude the osig from your APK. To do so, select Edit > Project Settings > Android, scroll down to Advanced APKPackaging, and verify that Remove Oculus Signature Files from Distribution APK is checked.

Before building your final release build, create a new Android keystore by following the "Sign Your App Manually" instructions in Android's Sign your Applications guide. Once you have generated your distribution keystore, go to Edit > Project Settings > Platforms > Android, scroll down to Distribution Signing, and enter the required information.

Android Studio Basics

This guide introduces the Android Studio IDE and reviews some basic features.

Getting Started with Oculus Native Samples: Import Gradle Project

1. If this is the first time you are launching Android Studio, select Open an existing Android Studio project. If you have launched Android Studio before, click File > Open instead.


2. Open any build.gradle project file from the Mobile SDK VrSamples folders. For example, VrSamples/Native/VrCubeWorld_Framework/Projects/Android/build.gradle.


3. When asked if you would like the project to use the Gradle wrapper, click OK.


Note: If this is your first time opening a native project in Android Studio, you are likely to be asked to install some dependencies. Follow the on-screen instructions to install all the required dependencies before you continue.

Troubleshooting Gradle Sync Errors

Here are some possible solutions if Android Studio reports a Gradle sync or configuration error:

• The most common cause of such an error is that the Android SDK or NDK locations are wrong. Verify that the SDK and NDK locations are specified correctly in File > Project Structure. If either is wrong or missing, you cannot continue until you fill in the correct path.
• On macOS, Android Studio sometimes reports a missing SDK location error even when the correct paths are listed in the Project Structure dialog box. To correct this problem, copy the local.properties file from your project folder up to the root of your Oculus Mobile SDK folder; a typical local.properties is sketched below.
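For reference, local.properties simply records machine-specific tool paths; a minimal sketch, with placeholder paths:

    # local.properties - machine-specific paths, not checked into version control
    sdk.dir=/Users/you/Library/Android/sdk
    ndk.dir=/Users/you/Library/Android/sdk/ndk-bundle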


Project Overview

Android Studio displays project files in the Android view by default. We recommend changing it to the Project view, which provides a good overview of the entire directory structure and highlights imported project directories in bold.


Select Target Configuration, Build, and Run

You can build and run your application on your device directly from Android Studio. This will compile your project, build the APK, copy it to the phone over USB or Wi-Fi, and prepare it for launching.

If you are developing for Gear VR and your phone is set to Developer Mode (see Developer Mode for instructions), your application can launch without being inserted into your Gear VR headset. Otherwise, when the process completes you will be prompted to insert your mobile device into the headset to launch the application. Make sure you have followed the configuration steps in the Mobile SDK Setup Guide to ensure your device is configured appropriately.

Select the target configuration you wish to build before building by selecting Edit Configurations in the project menu in the Android Studio toolbar.


Note: Before you can run the application, you must create and then copy the oculussig file for your device to the assets folder of your project. See Application Signing for more information.

To build and run your app:

1. Click Run in the toolbar.
2. The Select Deployment Target dialog box appears. This is sometimes set to an emulator by default.
3. Select a device listed under Connected Device instead.
4. If your device asks you to Allow USB debugging, click OK.

Troubleshooting: If USB debugging does not seem to be working:

1. Go to Developer Options on your phone.


2. Toggle USB Debugging off and then back on.

Syncing Projects

If you edit a *.gradle file or install an update to the Oculus Mobile SDK which includes updated Gradle projects, click Sync Project with Gradle Files to update your Android Studio project files.

SD Card Support - Gear VR

All Gear VR applications submitted to the Oculus Store must support installation to SD Card. SD Card installation support is enabled with the manifest setting android:installLocation="auto", as described in Android Manifest Settings on page 25.

To test application performance after installing to an SD Card:

1. Verify that AndroidManifest.xml is set to android:installLocation="auto".
2. Build an APK of your application.
3. Sign your application as described in Oculus Signature File (osig) and Application Signing on page 14.
4. Sideload your app as described in "Using adb to Install Applications" in our adb guide. Be sure to install the application to internal storage.
5. Go to the Android Application Manager and move the application to an SD Card. For more details, see our Developer Blog post Gear VR SD Card Support.
6. Launch your application and verify that it works properly.

Users who install mobile applications from the Oculus Store and wish to store them on their SD Card must move them manually, following the steps described in our Developer Blog post Gear VR SD Card Support.


Native Development Overview

Welcome to the Native Development Guide. This guide describes the libraries, tools, samples, and other material provided with this SDK for native development of mobile VR applications.

While native software development is comparatively rudimentary, it is closer to the metal and allows implementing very high performance virtual reality experiences without the overhead of elaborate environments such as you would find with a typical game engine. It is not feature rich, but it provides the basic infrastructure you will need to get started with your own high-performance virtual reality experience.

This SDK includes several sample projects which provide an overview of the native source code. See Native Samples on page 24 for details.

Note: This guide is intended to provide a high-level orientation and discussion of native development with the mobile SDK. Be sure to review the header files for any libraries you use for more extensive, in-depth discussion.

Native Source Code

This section describes mobile native source code development. The native SDK provides four basic native libraries:

• VrApi: the minimal API for VR.
• VrAppFramework: the application framework used by native apps.
• VrAppSupport: support for GUI, locale handling, sound, and more.
• LibOVRKernel: a low-level Oculus library for containers, mathematical operations, and more.

VrApi provides the minimum required API for rendering scenes in VR. Applications may query VrApi for orientation data, and submit textures to apply distortion, sensor fusion, and compositing. The VrApi Input API allows developers to query the state of connected devices, such as the Gear VR Controller. Developers working with a third-party engine other than Unity or Unreal will use VrApi to integrate the mobile SDK. For detailed information, see Native Engine Integration on page 28.

VrAppFramework handles VrApi integration and provides a wrapper around the Android activity that manages the Android lifecycle. VrAppFramework is the basis for several of our samples and first-party applications, including Oculus Video and Oculus 360 Photos. If you are not using Unity, Unreal, or another integration and you would like a basic framework to help get started, we recommend that you have a look. See Native Application Framework on page 51 for more information.

VrAppSupport and LibOVRKernel provide minimal functionality to applications using VrAppFramework, such as GUI, sound, and locale management. See Other Native Libraries on page 55 for more information.

The VrAppInterface (part of VrAppFramework) has a clearly-defined lifecycle, which may be found in VrAppFramework/Include/App.h.

LibOVRKernel and VrAppFramework ship with full source as well as pre-built libs and aar files. The VrApi is shipped as a set of public include files and a pre-built shared library. Providing the VrApi in a separate shared library allows the VrApi implementation to be updated after an application has been released, making it easy to apply hot fixes, implement new optimizations, and add support for new devices without requiring applications to be recompiled with a new SDK. The VrApi is periodically updated automatically on Samsung phones; for release notes, see System Activities/VrApi Release Notes.

See the VrSamples/Native/VrCubeWorld projects for examples of how to integrate VrApi into third-party engines as well as how to use VrAppFramework; a minimal initialization sketch follows. Please see Native Samples on page 24 for more details.
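As a quick orientation, every VrApi application begins by initializing the library from its Android activity. A minimal sketch, with error handling abbreviated (vrapi_DefaultInitParms and vrapi_Initialize are declared in the VrApi headers):

    #include <jni.h>
    #include "VrApi.h"
    #include "VrApi_Helpers.h"

    // 'vm', 'env', and 'activity' come from the app's JNI glue.
    static void InitVrApi( JavaVM * vm, JNIEnv * env, jobject activity )
    {
        ovrJava java;
        java.Vm = vm;
        java.Env = env;
        java.ActivityObject = activity;

        // Initialize with default parameters before any other vrapi_* call.
        const ovrInitParms initParms = vrapi_DefaultInitParms( &java );
        if ( vrapi_Initialize( &initParms ) != VRAPI_INITIALIZE_SUCCESS )
        {
            // Initialization failed; a real application should abort here.
        }
    }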


Main Components

VrApi (source code folder: VrApi)
    The Virtual Reality API provides a minimal set of entry points for enabling VR rendering in native applications and third-party engines.

VrApi Includes (source code folder: VrApi/Includes)
    Header files for the VrApi library.

Application Framework (source code folder: VrAppFramework/Src)
    Framework and support code for native applications. Includes code for rendering, user interfaces, sound playback, and more. It is not meant to provide the functionality of a full-fledged engine, but it does provide structure and a lot of useful building blocks for building native applications.

VrAppFramework Includes (source code folder: VrAppFramework/Include)
    Header files for the VrAppFramework library.

Native Samples (source code folder: VrSamples/Native)
    Sample projects illustrating use of VrApi and VrAppFramework.

LibOVRKernel (source code folder: LibOVRKernel/Src)
    The Oculus kernel library.

Native Samples

The mobile SDK includes a set of sample projects that prove out virtual reality application development on the Android platform and demonstrate high-performance virtual reality experiences on mobile devices.

Sample Applications and Media

The sample projects included with the SDK are provided as a convenience for development purposes. Some of these projects are similar to apps available for download from the Oculus Store. Due to the potential for conflict with these versions, we do not recommend running these sample apps on the same device on which you have installed your retail experience.

Note: Due to limitations of Android ndk-build, your Oculus Mobile SDK must not contain spaces in its path. If you have placed your Oculus Mobile SDK in a path or folder name that contains spaces, you must move or rename the folder before you build our samples.

The following samples can be found in \VrSamples\Native\:

• Oculus 360 Photos: A viewer for panoramic stills.
• Oculus 360 Videos: A viewer for panoramic videos.
• Oculus Cinema: Plays 2D and 3D movies in a virtual movie theater.
• VrController: A simple scene illustrating use of the VrApi Input API.
• VR Compositor: A simple scene illustrating use of the different layer types available with vrapi_SubmitFrame2.
• VR Cube World: A simple scene with colorful cubes illustrating construction of basic native apps using different tools provided by the SDK. There are three versions of VR Cube World:
  • VrCubeWorld_SurfaceView is closest to the metal. This sample uses a plain Android SurfaceView and handles all Android Activity and Android Surface lifecycle events in native code. This sample does not use the application framework or LibOVRKernel - it only uses the VrApi. It provides a good example of how to integrate the VrApi in an existing engine. The MULTI_THREADED define encapsulates the code that shows how the VrApi can be integrated into an engine with a separate renderer thread.
  • VrCubeWorld_NativeActivity uses the Android NativeActivity class to avoid manually handling all the lifecycle events. This sample does not use the application framework or LibOVRKernel - it only uses the VrApi. It provides a good example of how to integrate the VrApi in an existing engine that uses a NativeActivity. The MULTI_THREADED define encapsulates the code that shows how the VrApi can be integrated into an engine with a separate renderer thread.
  • VrCubeWorld_Framework uses the Oculus Mobile Application Framework. When starting a new application from scratch without using a pre-existing engine, the application framework can make development significantly easier. The application framework presents a much simplified application lifecycle, allowing a developer to focus on the virtual reality experience without having to deal with many of the platform-specific details.

How to Build a Sample App

Information about how to build apps in Android Studio can be found on the Android Studio Basics on page 15 page.

How to Install Sample Media

Use one of the following methods to install the sample media on your mobile device.

1. Use the supplied script:
   • Connect to the device via USB.
   • Ensure you have permissions to transfer files to the device.
   • Run installToPhone.bat from your Oculus Mobile SDK directory, e.g., C:\Dev\Oculus\Mobile\installToPhone.bat.
2. Use an adb command:
   • Connect to the device via USB.
   • Ensure you have permissions to transfer files to the device.
   • Issue the following command from your development folder, e.g., C:\Dev\Oculus\Mobile\:

     adb push sdcard_SDK /sdcard/

3. Use Windows Explorer:
   • Connect to the device via USB.
   • Ensure you have permissions to transfer files to the device.
   • Open a Windows Explorer window and transfer the contents of the sample media folder, e.g., C:\Dev\Oculus\Mobile\sdcard_SDK\oculus\, to the mobile device location, e.g., \MobileDevice\Phone\Oculus\.

Android Manifest Settings

Configure your manifest with the necessary VR settings, as shown in the following manifest segment.

Note: These manifest requirements are intended for development and differ from our submission requirements. Before submitting your application, please be sure to follow the manifest requirements described by our Publishing Guide.

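A minimal sketch of the manifest segment described by the notes below; the package and activity names are placeholders:

    <manifest xmlns:android="http://schemas.android.com/apk/res/android"
        package="com.yourcompany.yourapp"
        android:installLocation="auto">
        <application android:theme="@android:style/Theme.Black.NoTitleBar.Fullscreen">
            <meta-data android:name="com.samsung.android.vr.application.mode" android:value="vr_only"/>
            <activity
                android:name=".MainActivity"
                android:screenOrientation="landscape"
                android:configChanges="density|keyboard|keyboardHidden|navigation|orientation|screenLayout|screenSize|uiMode"/>
        </application>
        <uses-sdk android:minSdkVersion="21"/>
    </manifest>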



• Replace the package attribute value with your actual package name, such as "com.oculus.cinema".

• The Android theme should be set to the solid black theme for comfort during application transitioning: Theme.Black.NoTitleBar.Fullscreen.
• The vr_only meta-data tag should be added for VR mode detection.
• The required screen orientation is landscape: android:screenOrientation="landscape".
• It is recommended that your configChanges are as follows: android:configChanges="density|keyboard|keyboardHidden|navigation|orientation|screenLayout|screenSize|uiMode". Note that the density config change is only required when targeting API level 24 or greater.
• Setting android:resizeableActivity is only required when targeting API level 24 or greater.
• The minSdkVersion is set to API level 21. This ensures that the app will run on all supported mobile devices.
• Do not add the noHistory attribute to your manifest.

Application submission requirements may require additional adjustments to the manifest. Please refer to Application Manifests for Release Versions in our Publishing Guide.

Reserved User Interactions

This section describes input actions that are reserved for system-level functionality. These include the following physical buttons: Volume, Back, and Home. The Back button, Home button, and Volume button behaviors must conform to specific requirements.

Volume Button Interactions

Volume adjustment on the Samsung Gear VR and Oculus Go devices is handled automatically. The volume control dialog display is also handled automatically by the VrApi as of Mobile SDK 1.0.3. Do not implement your own volume display handling, or users will see two juxtaposed displays. You may override automatic volume display handling if necessary by setting VRAPI_FRAME_FLAG_INHIBIT_VOLUME_LAYER as an ovrFrameParm flag. Volume buttons are not exposed through the vrapi interfaces.

Back Button Interactions

Back button presses are of three types: long-press, short-press, and aborted long-press.

Short-press back button behavior is determined by the application. It is typically (but not necessarily) treated as a generic back action appropriate to the application's current state. Back actions usually prompt apps to navigate one level up in an interface hierarchy. For example, a short-press on the back button may bring up the application's menu. In another application, a short-press may act as a generic back navigation in the UI hierarchy until the root is reached, at which point it may bring up an application-specific menu, or enter the Quit Confirmation dialog, allowing the user to exit the application.

In applications built with Unity, if no satisfactory stateful condition is identified by the application, the short-press opens the Quit Confirmation dialog, allowing the user to exit the app and return to Oculus Home.


Applications built with other engines must implement this handling - see the VrCubeWorld_NativeActivity sample for an example.

An aborted long-press results in no action, and cancels the timer when one is being shown.

When using the vrapi interfaces, the back button will be shown to be down for a short-press only on the frame that it is actually released. This prevents the app from having to implement its own short-press detection; a polling sketch follows below.

Home Button Interactions

A Home button press always opens a dialog to return the user to Oculus Home. As of Mobile SDK 1.0.4, this behavior is handled automatically by the VrApi. If the Home button is held down on a 3DoF controller, it starts a timer for a controller recenter. The Home button must be held down for 0.75 seconds for the recenter action to complete. The Home button is not exposed through the vrapi interfaces, and no Android events will be passed to the app for the Home button.

Hardware Platform Differences

Oculus Go

• A short-press occurs when the button is down less than 0.5 seconds.
• There is no response to a long-press on the Back button, and no animation timer is shown.

Samsung Gear VR

• A short-press occurs when the button is down less than 0.25 seconds.
• A timer animation is displayed from 0.25 to 0.75 seconds, indicating to the user that a long-press is underway.
• A long-press occurs when the back button is held longer than 0.75 seconds.
• Long-press on the back button is reserved, and always opens the Settings Menu. As of Mobile SDK 1.0.4, this behavior is handled automatically by the VrApi. You should no longer implement your own long-press back button handling, including the gaze timer.
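For illustration, a per-frame poll of a tracked remote's back button might look like this. It is a minimal sketch using the ovrInputStateTrackedRemote path from VrApi_Input.h; device enumeration and error handling are abbreviated:

    #include "VrApi_Input.h"

    // 'ovr' is the ovrMobile context; 'deviceID' was obtained earlier via
    // vrapi_EnumerateInputDevices().
    static void PollBackButton( ovrMobile * ovr, const ovrDeviceID deviceID )
    {
        ovrInputStateTrackedRemote remoteState;
        remoteState.Header.ControllerType = ovrControllerType_TrackedRemote;
        if ( vrapi_GetCurrentInputState( ovr, deviceID, &remoteState.Header ) >= 0 )
        {
            // The back button reads as down only on the frame of release for a
            // short-press, so a single test per frame is sufficient.
            if ( remoteState.Buttons & ovrButton_Back )
            {
                // Handle the short-press back action, e.g. go up one menu level.
            }
        }
    }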


Native Engine Integration

This guide describes how to integrate the mobile native SDK with a game engine using VrApi. VrApi provides the minimum required API for rendering scenes in VR. Applications may query VrApi for orientation data, and submit textures to which distortion, sensor fusion, and compositing are applied.

We have provided the source for VrCubeWorld_NativeActivity and VrCubeWorld_SurfaceView, simple sample applications using VrApi, to serve as references. Please see Native Samples on page 24 for more details.

VrApi Lifecycle and Rendering

Multiple Android activities that live in the same address space can cooperatively use the VrApi. However, only one activity can be in "VR mode" at a time. The following explains when an activity is expected to enter and leave VR mode.

Android Activity lifecycle

An Android Activity can only be in VR mode while the activity is in the resumed state. The following shows how VR mode fits into the Android Activity lifecycle:

1. VrActivity::onCreate() <---------+
2. VrActivity::onStart() <-------+  |
3. VrActivity::onResume() <---+  |  |
4. vrapi_EnterVrMode()        |  |  |
5. vrapi_LeaveVrMode()        |  |  |
6. VrActivity::onPause() -----+  |  |
7. VrActivity::onStop() ---------+  |
8. VrActivity::onDestroy() ---------+

Android Surface lifecycle

An Android Activity can only be in VR mode while there is a valid Android Surface. The following shows how VR mode fits into the Android Surface lifecycle:

1. VrActivity::surfaceCreated() <----+
2. VrActivity::surfaceChanged()      |
3. vrapi_EnterVrMode()               |
4. vrapi_LeaveVrMode()               |
5. VrActivity::surfaceDestroyed() ---+

Note that the lifecycle of a surface is not necessarily tightly coupled with the lifecycle of an activity. These two lifecycles may interleave in complex ways. Usually surfaceCreated() is called after onResume() and surfaceDestroyed() is called between onPause() and onDestroy(). However, this is not guaranteed and, for instance, surfaceDestroyed() may be called after onDestroy() or even before onPause(). An Android Activity is only in the resumed state with a valid Android Surface between surfaceChanged() or onResume(), whichever comes last, and surfaceDestroyed() or onPause(), whichever comes first. In other words, a VR application will typically enter VR mode from surfaceChanged() or onResume(), whichever comes last, and leave VR mode from surfaceDestroyed() or onPause(), whichever comes first.
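A minimal sketch of this logic, modeled loosely on the VrCubeWorld samples (the App struct and its Resumed/NativeWindow bookkeeping are assumptions, not SDK types):

static void TryEnterOrLeaveVrMode( App * app )
{
    if ( app->Resumed && app->NativeWindow != NULL && app->Ovr == NULL )
    {
        // Resumed state and valid surface: enter VR mode.
        ovrModeParms parms = vrapi_DefaultModeParms( &app->Java );
        parms.Flags |= VRAPI_MODE_FLAG_NATIVE_WINDOW;
        parms.Display = (size_t)app->EglDisplay;
        parms.WindowSurface = (size_t)app->NativeWindow;
        parms.ShareContext = (size_t)app->EglContext;
        app->Ovr = vrapi_EnterVrMode( &parms );
    }
    else if ( ( !app->Resumed || app->NativeWindow == NULL ) && app->Ovr != NULL )
    {
        // Paused or surface lost: leave VR mode immediately.
        vrapi_LeaveVrMode( app->Ovr );
        app->Ovr = NULL;
    }
}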


Android VR lifecycle

This is a high-level overview of the rendering pipeline used by VrApi. For more information, see VrApi\Include\VrApi.h.

1. Initialize the API.
2. Create an EGLContext for the application.
3. Get the suggested resolution for the eye texture swap chains with vrapi_GetSystemPropertyInt( &java, VRAPI_SYS_PROP_SUGGESTED_EYE_TEXTURE_WIDTH ).
4. Allocate a texture swap chain for each eye with the application's EGLContext current.
5. Android Activity/Surface lifecycle loop:
   a. Acquire the ANativeWindow from the Android Surface.
   b. Enter VR mode once the Android Activity is in the resumed state with a valid ANativeWindow.
   c. Set the tracking transform to use (eye level by default).
   d. Frame loop, possibly running on another thread.
   e. Get the HMD pose, predicted for the middle of the time period during which the new eye images will be displayed. The number of frames predicted ahead depends on the pipeline depth of the engine and the synthesis rate. The better the prediction, the less black will be pulled in at the edges.
   f. Advance the simulation based on the predicted display time.
   g. Render eye images and set up the ovrSubmitFrameDesc using ovrTracking2.
   h. Render to textureId using the ViewMatrix and ProjectionMatrix from ovrTracking2. Insert a fence using eglCreateSyncKHR.
   i. Submit the frame with vrapi_SubmitFrame2.
   j. Leave VR mode when the Android Activity is paused or the Android Surface is destroyed or changed.
   k. Destroy the texture swap chains. Make sure to delete the swap chains before the application's EGLContext is destroyed.
6. Shut down the API.
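Steps 1 and 6 bracket everything else. A minimal sketch of initialization and shutdown, assuming the ovrJava fields are filled in from your JNI environment:

ovrJava java;
java.Vm = /* JavaVM pointer obtained via JNI */;
java.Env = /* JNIEnv attached to the current thread */;
java.ActivityObject = /* global reference to the Android Activity */;

const ovrInitParms initParms = vrapi_DefaultInitParms( &java );
if ( vrapi_Initialize( &initParms ) != VRAPI_INITIALIZE_SUCCESS )
{
    // Initialization failed; the app should abort.
}

// ... create the EGLContext, allocate swap chains, run the lifecycle loop ...

vrapi_Shutdown();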

Integration

The API is designed to work with an Android Activity using a plain Android SurfaceView, where the Activity lifecycle and the Surface lifecycle are managed completely in native code by sending the lifecycle events (onResume, onPause, surfaceChanged, etc.) to native code.

The API does not work with an Android Activity using a GLSurfaceView. The GLSurfaceView class manages the window surface and EGLSurface, and the implementation of GLSurfaceView may unbind the EGLSurface before onPause() gets called. As such, there is no way to leave VR mode before the EGLSurface disappears. Another problem with GLSurfaceView is that it creates the EGLContext using eglChooseConfig(). The Android EGL code pushes in multisample flags in eglChooseConfig() if the user has selected the "force 4x MSAA" option in settings. A multisampled front buffer is completely wasted for TimeWarp rendering.

Alternatively, an Android NativeActivity can be used to avoid manually handling all the lifecycle events. However, it is important to select the EGLConfig manually, without using eglChooseConfig(), to make sure the front buffer is not multisampled.

The vrapi_GetSystemProperty* functions can be called at any time from any thread. This allows an application to set up its renderer, possibly running on a separate thread, before entering VR mode.

On Android, an application cannot just allocate a new window/frontbuffer and render to it. Android allocates and manages the window/frontbuffer and (after the fact) notifies the application of the state of affairs through lifecycle events (surfaceCreated / surfaceChanged / surfaceDestroyed). The application (or third-party engine) typically handles these events. Since the VrApi cannot just allocate a new window/frontbuffer, and the VrApi does not handle the lifecycle events, the VrApi has to take over ownership of the Android surface from the application.


To allow this, the application can explicitly pass the EGLDisplay, EGLContext, and EGLSurface or ANativeWindow to vrapi_EnterVrMode(), where the EGLSurface is the surface created from the ANativeWindow. The EGLContext is used to match the version and config for the context used by the background TimeWarp thread. This EGLContext, and no other context, can be current on the EGLSurface.

If, however, the application does not explicitly pass in these objects, then vrapi_EnterVrMode() must be called from a thread with an OpenGL ES context current on the Android window surface. The context of the calling thread is then used to match the version and config for the context used by the background TimeWarp thread. The TimeWarp will also hijack the Android window surface from the context that is current on the calling thread. On return, the context from the calling thread will be current on an invisible pbuffer, because the TimeWarp takes ownership of the Android window surface. Note that this requires the config used by the calling thread to have an EGL_SURFACE_TYPE with EGL_PBUFFER_BIT.

Before getting sensor input, the application also needs to know when the images that are going to be synthesized will be displayed, because the sensor input needs to be predicted ahead for that time. As it turns out, it is not trivial to get an accurate predicted display time. Therefore the calculation of this predicted display time is part of the VrApi. An accurate predicted display time can only really be calculated once the rendering loop is up and running and submitting frames regularly. In other words, before getting sensor input, the application needs an accurate predicted display time, which in turn requires the renderer to be up and running. As such, it makes sense that sensor input is not available until vrapi_EnterVrMode() has been called. However, once the application is in VR mode, it can call vrapi_GetPredictedDisplayTime() and vrapi_GetPredictedTracking() at any time from any thread.

How Eye Images are Synchronized

The VrApi allows for one frame of overlap, which is essential on tiled mobile GPUs. Because there is one frame of overlap, the eye images have typically not completed rendering by the time they are submitted to vrapi_SubmitFrame(). To allow the TimeWarp to check whether the eye images have completed rendering, the application can explicitly pass in a sync object (CompletionFence) for each eye image through vrapi_SubmitFrame(), or, in the case of vrapi_SubmitFrame2, one CompletionFence specified for the whole frame via the ovrSubmitFrameDescription structure. Note that these sync objects must be EGLSyncKHR because the VrApi still supports OpenGL ES 2.0.

If, however, the application does not explicitly pass in sync objects, then vrapi_SubmitFrame() must be called from the thread with the OpenGL ES context that was used for rendering, which allows vrapi_SubmitFrame() to add a sync object to the current context and check if rendering has completed.

Note that even if no OpenGL ES objects are explicitly passed through the VrApi, vrapi_EnterVrMode() and vrapi_SubmitFrame() can still be called from different threads. vrapi_EnterVrMode() needs to be called from a thread with an OpenGL ES context that is current on the Android window surface. This does not need to be the same context that is also used for rendering. vrapi_SubmitFrame() needs to be called from the thread with the OpenGL ES context that was used to render the eye images. If this is a different context than the context used to enter VR mode, then for stereoscopic rendering this context never needs to be current on the Android window surface.

Eye images are passed to vrapi_SubmitFrame() as "texture swap chains" (ovrTextureSwapChain). These texture swap chains are allocated through vrapi_CreateTextureSwapChain(). This is important to allow these textures to be allocated in special system memory. When using a static eye image, the texture swap chain does not need to be buffered, and the chain only needs to hold a single texture. When the eye images are dynamically updated, the texture swap chain needs to be buffered. When the texture swap chain is passed to vrapi_SubmitFrame(), the application also passes in the chain index to the most recently updated texture.
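For instance, allocating a buffered swap chain per eye at the suggested resolution might look like this sketch (the texture format choice is illustrative):

const int width = vrapi_GetSystemPropertyInt( &java, VRAPI_SYS_PROP_SUGGESTED_EYE_TEXTURE_WIDTH );
const int height = vrapi_GetSystemPropertyInt( &java, VRAPI_SYS_PROP_SUGGESTED_EYE_TEXTURE_HEIGHT );
ovrTextureSwapChain * chains[VRAPI_FRAME_LAYER_EYE_MAX];
for ( int eye = 0; eye < VRAPI_FRAME_LAYER_EYE_MAX; eye++ )
{
    // Buffered chain because the eye images are dynamically updated each frame.
    chains[eye] = vrapi_CreateTextureSwapChain( VRAPI_TEXTURE_TYPE_2D, VRAPI_TEXTURE_FORMAT_8888,
                                                width, height, 1, true );
}

// ... render and submit frames ...

// Destroy the chains before the application's EGLContext is destroyed.
for ( int eye = 0; eye < VRAPI_FRAME_LAYER_EYE_MAX; eye++ )
{
    vrapi_DestroyTextureSwapChain( chains[eye] );
}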


Frame Timing

It is critical in VR that we never show the user a stale frame. vrapi_SubmitFrame() controls the synthesis rate through an application-specified frame parameter, SwapInterval. It also determines the point where the calling thread gets released, currently the halfway point of the predicted display refresh cycle. vrapi_SubmitFrame() only returns when both of these conditions are met:

• the previous eye images have been consumed by the asynchronous time warp (ATW) thread, and
• at least the specified minimum number of V-syncs have passed since the last call to vrapi_SubmitFrame().

The ATW thread consumes new eye images and updates the V-sync counter halfway through a display refresh cycle. This is the first time ATW can start updating the first eye, covering the first half of the display. As a result, vrapi_SubmitFrame() returns and releases the calling thread at the halfway point of the display refresh cycle.

Once vrapi_SubmitFrame() returns, synthesis has a full display refresh cycle to generate new eye images up to the next midway point. At the next halfway point, the time warp has half a display refresh cycle (up to V-sync) to update the first eye. The time warp then effectively waits for V-sync and has another half a display refresh cycle (up to the next-next halfway point) to update the second eye. The asynchronous time warp uses a high-priority GPU context and will eat away cycles from synthesis, so synthesis does not have a full display refresh cycle worth of actual GPU cycles. However, the asynchronous time warp tends to be very fast, leaving most of the GPU time for synthesis.

Instead of just using the latest sensor sampling, synthesis uses predicted sensor input for the middle of the time period during which the new eye images will be displayed. This predicted time is calculated using vrapi_GetPredictedDisplayTime(). The number of frames predicted ahead depends on the pipeline depth, the extra latency mode, and the minimum number of V-syncs in between eye image rendering. Less than half a display refresh cycle before each eye image will be displayed, ATW gets new predicted sensor input using the very latest sensor sampling. ATW then corrects the eye images using this new sensor input. In other words, ATW always corrects the eye images, even if the predicted sensor input for synthesis is not perfect. However, the better the prediction for synthesis, the less black will be pulled in at the edges by the asynchronous time warp.

The application can improve the prediction by fetching the latest predicted sensor input right before rendering each eye, and passing a possibly different sensor state for each eye to vrapi_SubmitFrame(). However, it is very important that both eyes use a sensor state that is predicted for the exact same display time, so both eyes can be displayed at the same time without causing intra-frame motion judder. While the predicted orientation can be updated for each eye, the position must remain the same for both eyes, or the position would seem to judder "backwards in time" if a frame is dropped.

Ideally the eye images are only displayed for the SwapInterval display refresh cycles that are centered about the eye image predicted display time. In other words, a set of eye images is first displayed at Predicted Display Time - (SwapInterval / 2) display refresh cycles. The eye images should never be shown before this time, because that can cause intra-frame motion judder. Ideally the eye images are also not shown after Predicted Display Time + (SwapInterval / 2) display refresh cycles, but this may happen if synthesis fails to produce new eye images in time.
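For example, a typical frame obtains its prediction like this (frameIndex is an app-maintained counter, incremented once per submitted frame):

const double predictedDisplayTime = vrapi_GetPredictedDisplayTime( ovr, frameIndex );
const ovrTracking2 tracking = vrapi_GetPredictedTracking2( ovr, predictedDisplayTime );
// Both eyes must use sensor state predicted for this same display time.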

Latency Examples



SwapInterval = 1, ExtraLatencyMode = off
• Expected single-threaded simulation latency is 33 milliseconds.
• ATW reduces this to 8-16 milliseconds.

SwapInterval = 1, ExtraLatencyMode = on
• Expected single-threaded simulation latency is 49 milliseconds.
• ATW reduces this to 8-16 milliseconds.


SwapInterval = 2, ExtraLatencyMode = off
• Expected single-threaded simulation latency is 58 milliseconds.
• ATW reduces this to 8-16 milliseconds.

SwapInterval = 2, ExtraLatencyMode = on
• Expected single-threaded simulation latency is 91 milliseconds.
• ATW reduces this to 8-16 milliseconds.


VrApi Input API

This document describes using the VrApi Input API. The VrApi Input API allows applications linked to VrApi to enumerate and query the state of devices connected to a Samsung Gear VR or Oculus Go. When a device is enumerated, its current state can be queried using the input API.

The Input API is defined in VrApi/Src/VrApi_Input.h. For sample usage, see VrSamples/Native/VrController.

Input Devices Supported

For all devices listed below, there are buttons and interactions that are unavailable or reserved for specific interactions. The Home button, Back button long-press, and the volume up and volume down buttons on the controller and headset are reserved for system use and will not appear in the button state of either input device. Please see Reserved User Interactions on page 26 for more information about reserved interactions.

Samsung Gear VR Headset

Gear VR Headset controls include the touchpad and Back button short-press.

Samsung Gear VR Controller and Oculus Go Controller

The controllers are orientation-tracked input devices. The controller is positioned relative to the user by using a body model to estimate its location based upon the orientation of the device. Left-handedness versus right-handedness is specified by users during controller pairing, and is used to determine which side of the user's body to position the controller.

Both controllers have a touchpad, a trigger, and Home and Back buttons. The Samsung Gear VR Controller also has volume up and down buttons.

Bluetooth Gamepads


Bluetooth gamepads are exposed through the VrApi input API, which attempts to map the device to the classic gamepad model of X, Y, B, and A buttons, left and right triggers, left and right bumpers, a d-pad, and two joysticks.

Enumerating Devices

To find a device, an application should call vrapi_EnumerateInputDevices. This function takes a pointer to the ovrMobile context, an index, and a pointer to an ovrInputCapabilityHeader structure. If a device exists for the specified index, the ovrInputCapabilityHeader's Type and DeviceID members are set upon return.

Once a device is enumerated, its full capabilities can be queried with vrapi_GetInputDeviceCapabilities. This function also takes a pointer to an ovrInputCapabilityHeader structure, but the caller must pass a structure that is appropriate for the ovrControllerType that was returned by vrapi_EnumerateInputDevices. For instance, if vrapi_EnumerateInputDevices returns a Type of ovrControllerType_TrackedRemote when passed an index of 0, then the call to vrapi_GetInputDeviceCapabilities should pass a pointer to the Header field inside of an ovrInputTrackedRemoteCapabilities structure. For example:

ovrInputCapabilityHeader capsHeader;
if ( vrapi_EnumerateInputDevices( ovr, 0, &capsHeader ) >= 0 )
{
    if ( capsHeader.Type == ovrControllerType_TrackedRemote )
    {
        ovrInputTrackedRemoteCapabilities remoteCaps;
        // Copy the header so Type and DeviceID are set for the capabilities query.
        remoteCaps.Header = capsHeader;
        if ( vrapi_GetInputDeviceCapabilities( ovr, &remoteCaps.Header ) >= 0 )
        {
            // The remote is connected.
        }
    }
}

After successful enumeration, the ovrInputCapabilityHeader structure that was passed to vrapi_EnumerateInputDevices will have its DeviceID field set to the device ID of the enumerated controller. The device state can then be queried by calling vrapi_GetInputTrackingState as described below.

Device Connection and Disconnection

Devices are considered connected once they are enumerated through vrapi_EnumerateInputDevices, and while vrapi_GetInputTrackingState and vrapi_GetCurrentInputState return valid results. vrapi_EnumerateInputDevices does not do any significant work and may be called each frame to check whether a device is present.

Querying Device Input State

The state of an input device can be queried via the vrapi_GetCurrentInputState function, which takes a device ID and a pointer to an ovrInputStateHeader structure. Before calling, fill in the header's Type field with the type of device that is associated with the passed device ID. Make sure the structure passed is not just a header, but the appropriate state structure for the device type. For instance, when querying a controller, pass an ovrInputStateTrackedRemote structure with the Header.Type field set to ovrControllerType_TrackedRemote.

ovrInputStateTrackedRemote remoteState;
remoteState.Header.Type = ovrControllerType_TrackedRemote;
if ( vrapi_GetCurrentInputState( ovr, controllerDeviceID, &remoteState.Header ) >= 0 )
{
    // Act on the device state returned in remoteState.
}


vrapi_GetCurrentInputState returns the controller's current button and trackpad state.

Querying Device Tracking State

To query the orientation tracking state of a device, call vrapi_GetInputTrackingState and pass it a predicted pose time. Passing a predicted pose time of 0 will return the most recently-sampled pose.

ovrTracking trackingState;
if ( vrapi_GetInputTrackingState( ovr, controllerDeviceID, 0.0 /* predicted pose time */, &trackingState ) >= 0 )

VrApi implements an arm model that uses the controller's orientation to synthesize a plausible hand position each frame. The tracking state will return this position in the Position field of the predicted tracking state's HeadPose.Pose member. Controller handedness may be queried using vrapi_GetInputDeviceCapabilities, as described in Enumerating Devices above. Applications that implement their own arm models are free to ignore this position and calculate a position based on the Orientation field that is returned in the predicted tracking state's pose.

Recentering the Controller

Users may experience some orientation drift in the yaw axis, causing the physical controller's orientation to go out of alignment with its VR representation. To synchronize the physical controller's orientation with the VR representation, users should:

1. Point the controller in the direction of the forward axis of their headset, and
2. Press and hold the Home button for one second.

When a recenter occurs, the VrApi arm model is notified and the arm model's shoulders are repositioned to align to the headset's forward vector. This is necessary because the shoulders do not automatically rotate with the head. Applications that implement their own arm models can poll the device input state's RecenterCount field to determine when the controller is recentered. RecenterCount increments only when a recenter is performed. We recommend recentering arm models based on the head pose when this field changes.

Headset Emulation

Emulation mode is convenient for applications that have not been rebuilt to use the new controller API. When enabled, Gear VR and Oculus Go Controller touch values send Android touch events using the same mapping as Gear VR headset touch values, and applications cannot distinguish headset inputs from controller inputs.

Headset emulation for the controller can be toggled on or off by calling vrapi_SetRemoteEmulation. It is disabled by default. When emulation is enabled, applications that load a new VrApi with Gear VR Controller support will receive input from the controller through the Android Activity's dispatchKeyEvent and dispatchTouchEvent methods.

New applications and applications that are specifically updated to use the controller should use the VrApi Input API to enumerate the controller and query its state directly. Applications may also want to enumerate the headset and query its state through the same API.

Touchpad Swiping Gestures

For touchpads, the user interface of your VR experience should follow these natural scrolling and swiping gestures:

• Swipe up: Pull content upward. Equivalent to scrolling down.


• Swipe down: Pull content downward. Equivalent to scrolling up. • Swipe left: Pull content left or go to the next item or page. • Swipe right: Pull content right or go to the previous item or page.

Asynchronous TimeWarp (ATW)

Asynchronous TimeWarp (ATW) transforms stereoscopic images based on the latest head-tracking information to significantly reduce the motion-to-photon delay, reducing latency and judder in VR applications.

Overview

In a basic VR game loop, the following occurs:

1. The software requests your head orientation.
2. The CPU processes the scene for each eye.
3. The GPU renders the scenes.
4. The Oculus Compositor applies distortion and displays the scenes on the headset.

The following shows a basic example of a game loop: Figure 1: Basic Game Loop

When frame rate is maintained, the experience feels real and is enjoyable. When a frame is not completed in time, the previous frame is shown instead, which can be disorienting. The following graphic shows an example of judder during the basic game loop: Figure 2: Basic Game Loop with Judder

When you move your head and the world doesn’t keep up, this can be jarring and break immersion. ATW is a technique that shifts the rendered image slightly to adjust for changes in head movement. Although the image is modified, your head does not move much, so the change is slight. Additionally, to smooth issues with the user’s computer, game design or the operating system, ATW can help fix "potholes" or moments when the frame rate unexpectedly drops. The following graphic shows an example of frame drops when ATW is applied: Figure 3: Game Loop with ATW


At the refresh interval, the Compositor applies TimeWarp to the last rendered frame. As a result, a TimeWarped frame will always be shown to the user, regardless of frame rate. If the frame rate is very bad, flicker will be noticeable at the periphery of the display, but the image will still be stable.

ATW is automatically applied by the Oculus Compositor; you do not need to enable or tune it. Although ATW reduces latency, make sure that your application or experience maintains frame rate.

Discussion

Stereoscopic eye views are rendered to textures, which are then warped onto the display to correct for the distortion caused by wide-angle lenses in the headset. To reduce the motion-to-photon delay, updated orientation information is retrieved for the headset just before drawing the time warp, and a transformation matrix is calculated that warps eye textures from where they were at the time they were rendered to where they should be at the time they are displayed. Many people are skeptical on first hearing about this, but for attitude changes, the warped pixels are almost exactly correct. A sharp rotation will leave some pixels black at the edges, but this turns out to be minimally distracting.

The time warp is taken a step farther by making it an "interpolated time warp." Because the video is scanned out at a rate of about 120 scan lines a millisecond, scan lines farther to the right have a greater latency than lines to the left. On a sluggish LCD this doesn't really matter, but on a crisp-switching OLED, users may feel like the world is subtly stretching or shearing when they turn quickly. This is corrected by predicting the head attitude at the beginning of each eye (a prediction of < 8 milliseconds) and at the end of each eye (< 16 milliseconds). These predictions are used to calculate time warp transformations, and the warp is interpolated between these two values for each scan line drawn.

The time warp may be implemented on the GPU by rendering a full-screen quad with a fragment program that calculates warped texture coordinates to sample the eye textures. However, for improved performance the time warp renders a uniformly tessellated grid of triangles over the whole screen, where the texture coordinates are set up to sample the eye textures. Rendering a grid of triangles with warped texture coordinates basically results in a piecewise linear approximation of the time warp.

If the time warp runs asynchronously to the stereoscopic rendering, then it may also be used to increase the perceived frame rate and to smooth out inconsistent frame rates. By default, the time warp currently runs asynchronously for both native and Unity applications.

TimeWarp Minimum Vsyncs

The TimeWarp MinimumVsyncs parameter default value is 1 for a 60 FPS target. Setting it to 2 will reduce the maximum application frame rate to no more than 30 FPS. The asynchronous TimeWarp thread will continue to render new frames with updated head tracking at 60 FPS, but the application will only have an opportunity to generate 30 new stereo pairs of eye buffers per second. You can set higher values for experimental purposes, but the only sane values for shipping apps are 1 and 2.

There are two cases where you might consider explicitly setting this:

• If your application can't hold 60 FPS most of the time, it might be better to clamp at 30 FPS all the time, rather than have the app smoothness or behavior change unpredictably for the user. In most cases, we believe that simplifying the experience to hold 60 FPS is the correct decision, but there may be exceptions.
• Rendering at 30 application FPS will save a significant amount of power and reduce the thermal load on the device. Some applications may be able to hit 60 FPS but run into thermal problems quickly, which can have catastrophic performance implications -- it may be necessary to target 30 FPS if you want to be able to play for extended periods of time. See Power Management for more information regarding thermal throttle mitigation strategies.
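With the vrapi_SubmitFrame2 path, this value corresponds to the SwapInterval member of the frame description. A hedged sketch, assuming the ovrSubmitFrameDescription2 layout from VrApi.h:

ovrSubmitFrameDescription2 frameDesc = { 0 };
frameDesc.SwapInterval = 2;            // 60 Hz display / 2 = at most 30 app FPS
frameDesc.FrameIndex = frameIndex;     // app-maintained frame counter
frameDesc.DisplayTime = predictedDisplayTime;
frameDesc.LayerCount = 1;
frameDesc.Layers = layers;             // pointer to this frame's layer headers
vrapi_SubmitFrame2( ovr, &frameDesc );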


Consequences of not rendering at 60 FPS

These consequences apply whether you have explicitly set MinimumVsyncs or your app is simply going that slow by itself.

If the viewpoint is far away from all geometry, nothing is animating, and the rate of head rotation is low, there will be no visual difference. When any of these conditions are not present, there will be greater or lesser artifacts to balance.

If the head rotation rate is high, black at the edges of the screen will be visibly pulled in by a variable amount depending on how long it has been since an eye buffer was submitted. This still happens at 60 FPS, but because the total time is small and constant from frame to frame, it is almost impossible to notice. At lower frame rates, you can see it snapping at the edges of the screen. There are two mitigations for this:

1) Instead of using either "now" or the time when the frame will start being displayed as the point where the head tracking model is queried, use a time that is at the midpoint of all the frames that the eye buffers will be shown on. This distributes the "unrendered area" on both sides of the screen, rather than piling it up on one.

2) Coupled with that, increasing the field of view used for the eye buffers gives it more cushion off the edges to pull from. For native applications, we currently add 10 degrees to the FOV when the frame rate is below 60. If the resolution of the eye buffers is not increased, this effectively lowers the resolution in the center of the screen. There may be value in scaling the FOV dynamically based on the head rotation rates, but you would still see an initial pop at the edges, and changing the FOV continuously results in more visible edge artifacts when mostly stable.

TimeWarp does not currently attempt to compensate for changes in position, only attitude. We don't have real position tracking in mobile yet, but we do use a head / neck model that provides some eye movement based on rotation, and apps that allow the user to navigate around explicitly move the eye origin. These values will not change at all between eye updates, so at 30 eye FPS, TimeWarp would be smoothly updating attitude each frame, but movement would only change every other frame. Walking straight ahead with nothing really close by works rather better than might be expected, but sidestepping next to a wall makes it fairly obvious. Even just moving your head when very close to objects makes the effect visible. There is no magic solution for this. We do not have the performance headroom on mobile to have TimeWarp do a depth-buffer-informed reprojection, and doing so would create new visual artifacts in any case. There is a simplified approach that we may adopt that treats the entire scene as a single depth, but work on it is not currently scheduled. It is safe to say that if your application has a significant graphical element nearly stuck to the view, like an FPS weapon, it is not a candidate for 30 FPS.

Turning your viewpoint with a controller is among the most nauseating things you can do in VR, but some games still require it. When handled entirely by the app, this winds up being like a position change, so a low-frame-rate app would have smooth "rotation" when the user's head was moving, but chunky rotation when they use the controller. To address this, TimeWarp has an "ExternalVelocity" matrix parameter that can allow controller yaw to be smoothly extrapolated on every rendered frame. We do not currently have a Unity interface for this.
In-world animation will be noticeably chunkier at lower frame rates, but in-place animation doesn't wind up being very distracting. Objects on trajectories are more problematic, because they appear to stutter back and forth as they move when you track them with your head.

For many apps, monoscopic rendering may still be a better experience than 30 FPS rendering. The savings are not as large, but it is a clear tradeoff without as many variables.


If you go below 60 FPS, Unity apps may be better off without the multi-threaded renderer, which adds a frame of latency. 30 FPS with the GPU pipeline and multi-threaded renderer is getting to be a lot of latency, and while TimeWarp will remove all of it for attitude, position changes, including the head model, will feel very lagged. Note that this is all bleeding edge, and some of this guidance is speculative.

Power Management

Power management is a crucial consideration for mobile VR development. A governor process on the device monitors an internal temperature sensor and takes corrective action when the temperature rises above certain levels, to prevent malfunctioning or scalding surface temperatures. This corrective action consists of lowering clock rates.

If you run hard into the limiter, the temperature will continue climbing even as clock rates are lowered, and CPU clocks may drop, resulting in a significantly degraded VR experience. If your app consistently uses most of the available processing power, you will eventually run into the thermal governor, even if you have no problem at first. A typical manifestation is poor app performance after ten minutes of play. If you filter logcat output for "thermal", you will see various notifications of sensor readings and actions being taken. (For more on logcat, see Android Debugging: Logcat.)

A difference to note between mobile and PC/console development is that no optimization is ever wasted. Without power considerations, if you have the frame ready in time, it doesn't matter if you used 90% of the available time or 10%. On mobile, every operation drains the battery and heats the device. Of course, optimization entails effort that comes at the expense of something else, but it is important to note the tradeoff.

Managing Power Consumption

To deal with the power and heat issues identified above, the Fixed Clock Level API on Gear VR and Dynamic Clock Throttling on Go allow your app to manage heat and power consumption.

Fixed Clock Level API

On current devices, the CPU and GPU clock rates are fixed to the application-set values until the device temperature reaches the limit, at which point the CPU and GPU clocks change to the power save levels. This change can be detected (see Power State Notification and Mitigation Strategy below). Apps may choose to continue operating in a degraded fashion, perhaps by changing to 30 FPS or monoscopic rendering. Others may display a warning screen saying that play cannot continue.

The fixed CPU level and fixed GPU level set by the Fixed Clock Level API are abstract quantities, not MHz / GHz, so some effort can be made to make them compatible with future devices. For current hardware, the levels can be 0, 1, 2, or 3 for CPU and GPU. 0 is the slowest and most power efficient; 3 is the fastest and hottest. Not all clock combinations are valid for all devices. For example, the highest GPU level may not be available for use with the two highest CPU levels. If an invalid combination is provided, the system will not acknowledge the request and clock settings will go into dynamic mode. VrApi asserts and issues a warning in this case.

The Samsung Galaxy S8 and higher devices have a CPU frequency range instead of a set clock. CPU levels will be set to the minimum frequency in that range and scale up as needed.

Note: Use caution with combinations (2,3) and (3,3), because they are likely to lead quickly to overheating. For most apps, we recommend ensuring the app runs well at (2,2).


To set the CPU and GPU clock levels, call:

vrapi_SetClockLevels( ovrMobile * ovr, const int32_t cpuLevel, const int32_t gpuLevel );

Pass your desired clock levels; the defaults are cpuLevel = 2, gpuLevel = 2.

Dynamic Clock Throttling on Go

Oculus Go introduces Dynamic Clock Throttling, which scales the performance of the Go's CPU and GPU up as necessary to maintain performance. As they use similar components, clock frequencies of the Go will be similar to the Samsung Galaxy S7. Go apps still set levels using the Fixed Clock Level API as described above, but these levels are now treated as a baseline, and the system can choose to dynamically increase the CPU and GPU clocks based on the app and system performance. Dynamic Clock Throttling will never downclock your app's performance. As the Oculus Go is able to manage thermal issues much more efficiently than a phone, we've opened CPU and GPU level 4 as an even higher performance benchmark. More information about how Dynamic Clock Throttling works is available in the Optimizing Oculus Go for Performance blog post.

When testing and debugging your Go app, you should disable Dynamic Throttling so that it does not interfere with performance timing. You can do this via adb:

adb shell setprop debug.oculus.adaclocks.force 0

Dynamic Throttling will remain off until you restart the device or turn it back on by setting the above property to 1. Alternatively, you can test your app by using the VrApi logs to review its clock levels. Please review the Testing and Troubleshooting on page 61 guide for more information.

Power Management and Performance

There are no magic settings in the SDK to fix power consumption - this is critical. The length of time your application will be able to run before hitting the thermal limit depends on two factors: how much work your app is doing, and what the clock rates are. Lowering the clock rates all the way only yields about a 25% reduction in power consumption for the same amount of work, so most power savings have to come from doing less work in your app.

If your app can run at the (0,0) setting, it should never have thermal issues. It is certainly possible to make sophisticated applications at that level, but Unity-based applications might be difficult to optimize for this setting.

There are effective tools for reducing the required GPU performance:

• Don't use chromatic aberration correction on TimeWarp.
• Don't use 4x MSAA.
• Reduce the eye target resolution.
• Using 16-bit color and depth buffers may help.
• It is probably never a good trade to go below 2x MSAA - reduce the eye target resolution instead.

These all entail quality tradeoffs which need to be balanced against steps you can take in your application:

• Reduce overdraw (especially blended particles) and complex shaders.
• Always make sure textures are compressed and mipmapped.


In general, CPU load seems to cause more thermal problems than GPU load. Reducing the required CPU performance is much less straightforward. Unity apps should always use the multithreaded renderer option, since two cores running at 1 GHz do work more efficiently than one core running at 2 GHz.

If you find that you just aren't close, then you may need to set MinimumVsyncs to 2 and run your game at 30 FPS, with TimeWarp generating the extra frames. Some things work out okay like this, but some interface styles and scene structures highlight the limitations. For more information on how to set MinimumVsyncs, see the TimeWarp technical note.

In summary, our general advice:

• If you are making an app that will probably be used for long periods of time, like a movie player, pick very low levels. Ideally use (0,0), but it is possible to use more graphics if the CPUs are still mostly idle, perhaps up to (0,2).
• If you are okay with the app being restricted to ten-minute chunks of play, you can choose higher clock levels. If it doesn't work well at (2,2), you probably need to do some serious work.

With the clock rates fixed, observe the reported FPS and GPU times in logcat. The GPU time reported does not include the time spent resolving the rendering back to main memory from on-chip memory, so it is an underestimate. If the GPU times stay under 12 ms or so, you can probably reduce your GPU clock level. If the GPU times are low but the frame rate isn't 60 FPS, you are CPU limited.

Always build optimized versions of the application for distribution. Even if a debug build performs well, it will draw more power and heat up the device more than a release build. Optimize until it runs well.

For more information on how to improve your Unity application's performance, see Best Practices and Performance Targets in our Unity documentation.

Power State Notification and Mitigation Strategy

The VrApi provides power level state detection and handling.

Power level state refers to whether the device is operating at normal clock frequencies or has risen above a thermal threshold, with thermal throttling (power save mode) taking place. In power save mode, CPU and GPU frequencies are switched to power save levels. The power save levels are equivalent to setting the fixed CPU and GPU clock levels to (0, 0). If the temperature continues to rise, clock frequencies will be set to minimum values which are not capable of supporting VR applications.

Once we detect that thermal throttling is taking place, the app has the choice to either continue operating in a degraded fashion or to immediately exit to the Oculus Menu with a head-tracked error message.

In the first case, when the application first transitions from normal operation to power save mode, the following will occur:

• A system menu will be brought up to display a dismissible warning message indicating that the device needs to cool down.
• Once the message is dismissed, the application will resume in 30Hz TimeWarp mode with correction for chromatic aberration disabled.
• If the device clock frequencies are throttled to minimum levels after continued use, a non-dismissible error message will be shown and the user will have to undock the device.

In this mode, the application may choose to take additional app-specific measures to reduce performance requirements. For native applications, you may use the following VrApi call to detect whether power save mode is active:

vrapi_GetSystemStatusInt( &java, VRAPI_SYS_STATUS_THROTTLED ) != VRAPI_FALSE


In the second case, when the application transitions from normal operation to power save mode, a system menu will be brought up to display a non-dismissible error message, and the user will have to undock the device to continue. This mode is intended for applications which may not perform well at reduced levels even with 30Hz TimeWarp enabled.

To allow the application to continue operating in power save mode (the first case above), the native application should specify the ovrModeFlags flag VRAPI_MODE_FLAG_ALLOW_POWER_SAVE in the ovrModeParms passed to vrapi_EnterVrMode.
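A minimal sketch of opting in to continued (degraded) operation when entering VR mode:

ovrModeParms parms = vrapi_DefaultModeParms( &java );
parms.Flags |= VRAPI_MODE_FLAG_ALLOW_POWER_SAVE;  // continue, degraded, when throttled
ovrMobile * ovr = vrapi_EnterVrMode( &parms );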

Advanced Rendering

This section describes advanced rendering features available through the SDK.

Multi-View

Overview

With stock OpenGL, stereo rendering is achieved by rendering to the two eye buffers sequentially. This typically doubles the application and driver overhead, despite the fact that the command streams and render states are almost identical.

The GL_OVR_multiview extension addresses the inefficiency of sequential multi-view rendering by adding a means to render to multiple elements of a 2D texture array simultaneously. Using the multi-view extension, draw calls are instanced into each corresponding element of the texture array. The vertex program uses a new ViewID variable to compute per-view values, typically the vertex position and view-dependent variables like reflection.

The formulation of the extension is high level in order to allow implementation freedom. On existing hardware, applications and drivers can realize the benefits of a single scene traversal, even if all GPU work is fully duplicated per view. But future support could enable simultaneous rendering via multi-GPU, tile-based architectures could sort geometry into tiles for multiple views in a single pass, and the implementation could even choose to interleave at the fragment level for better texture cache utilization and more coherent fragment shader branching.

The most obvious use case in this model is to support two simultaneous views: one view for each eye. However, multi-view can also be used for foveated rendering, where two views are rendered per eye, one with a wide field of view and the other with a narrow one. The nature of wide-field-of-view planar projection is that the sample density can become unacceptably low in the view direction. By rendering two inset eye views per eye, the required sample density is achieved in the center of projection without wasting samples, memory, and time by oversampling in the periphery.

Basic Usage

The GL_OVR_multiview extension is not a turn-key solution that can simply be enabled to support multi-view rendering in an application. An application must explicitly support GL_OVR_multiview to get the benefits. The extension is used by the application to render the scene, and the VrApi is unaware of its use. The VrApi supports sampling from the layers of a texture array, but is otherwise completely unaware of how the application produced the texture data, whether multi-view rendering was used or not.

However, because of various driver problems, an application is expected to query the VrApi to find out whether or not multi-view is properly supported on a particular combination of device and OS. For example:

vrapi_GetSystemPropertyInt( &Java, VRAPI_SYS_PROP_MULTIVIEW_AVAILABLE ) == VRAPI_TRUE;
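For illustration, a multi-view vertex program declares the number of views and indexes per-view data with gl_ViewID_OVR. This is a hedged sketch (the uniform block layout is an assumption, not the VrAppFramework TransformVertex() path):

static const char * MULTI_VIEW_VERTEX_SHADER =
    "#version 300 es\n"
    "#extension GL_OVR_multiview2 : require\n"
    "layout( num_views = 2 ) in;\n"
    "uniform ViewMatrices { mat4 ViewProjection[2]; };\n"  // one view-projection matrix per eye, in a UBO
    "in vec3 vertexPosition;\n"
    "void main()\n"
    "{\n"
    "    gl_Position = ViewProjection[ gl_ViewID_OVR ] * vec4( vertexPosition, 1.0 );\n"
    "}\n";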


Restructuring VrAppFramework Rendering For Multi-view

The following section describes how to convert your application to be multi-view compliant, based on the VrAppFramework multi-view setup. In order to set up your rendering path to be multi-view compliant, your app should return a list of surfaces and render state from Frame(). Immediate GL calls inside the app main render pass are not compatible with multi-view rendering and are not allowed.

The first section below describes how to transition your app away from rendering with DrawEyeView and instead return a list of surfaces back to the application framework. The section after that describes multi-view rendering considerations and how to enable multi-view in your app.

Return Surfaces From Frame

Set up the Frame Result: Apps should set up the ovrFrameResult which is returned by Frame() with the following steps:

1. Set up the ovrFrameParms - storage for which should be maintained by the application.
2. Set up the FrameMatrices - this includes the CenterEye and the View and Projection matrices for each eye.
3. Generate a list of render surfaces and append them to the frame result Surfaces list.
   a. Note: The surface draw order will be the order of the list, from lowest index (0) to highest index.
   b. Note: Do not free any resources which surfaces in the list rely on while the Frame render is in flight.
4. Optionally, specify whether to clear the color or depth buffer with clear color.

OvrSceneView Example

An example using the OvrSceneView library scene matrices and surface generation follows:

ovrFrameResult OvrApp::Frame( const ovrFrameInput & vrFrame )
{
    ...
    // Fill in the frame result info for the frame.
    ovrFrameResult res;

    // Let the scene construct the view and projection matrices needed for the frame.
    Scene.GetFrameMatrices( vrFrame.FovX, vrFrame.FovY, res.FrameMatrices );

    // Let the scene generate the surface list for the frame.
    Scene.GenerateFrameSurfaceList( res.FrameMatrices, res.Surfaces );

    // Initialize the FrameParms.
    FrameParms = vrapi_DefaultFrameParms( app->GetJava(), VRAPI_FRAME_INIT_DEFAULT,
                                          vrapi_GetTimeInSeconds(), NULL );
    for ( int eye = 0; eye < VRAPI_FRAME_LAYER_EYE_MAX; eye++ )
    {
        FrameParms.Layers[0].Textures[eye].ColorTextureSwapChain = vrFrame.ColorTextureSwapChain[eye];
        FrameParms.Layers[0].Textures[eye].DepthTextureSwapChain = vrFrame.DepthTextureSwapChain[eye];
        FrameParms.Layers[0].Textures[eye].TextureSwapChainIndex = vrFrame.TextureSwapChainIndex;
        FrameParms.Layers[0].Textures[eye].TexCoordsFromTanAngles = vrFrame.TexCoordsFromTanAngles;
        FrameParms.Layers[0].Textures[eye].HeadPose = vrFrame.Tracking.HeadPose;
    }
    FrameParms.ExternalVelocity = Scene.GetExternalVelocity();
    FrameParms.Layers[0].Flags = VRAPI_FRAME_LAYER_FLAG_CHROMATIC_ABERRATION_CORRECTION;

    res.FrameParms = (ovrFrameParmsExtBase *) & FrameParms;
    return res;
}

Custom Rendering Example


First, you need to make sure any immediate GL render calls are represented by an ovrSurfaceDef. In the DrawEyeView path, custom surface rendering was typically done by issuing immediate GL calls:

glActiveTexture( GL_TEXTURE0 );
glBindTexture( GL_TEXTURE_2D, BackgroundTexId );
glDisable( GL_DEPTH_TEST );
glDisable( GL_CULL_FACE );
GlProgram & prog = BgTexProgram;
glUseProgram( prog.Program );
glUniform4f( prog.uColor, 1.0f, 1.0f, 1.0f, 1.0f );
glBindVertexArray( globeGeometry.vertexArrayObject );
glDrawElements( globeGeometry.primitiveType, globeGeometry.indexCount,
                globeGeometry.IndicesType, NULL );
glUseProgram( 0 );
glActiveTexture( GL_TEXTURE0 );
glBindTexture( GL_TEXTURE_2D, 0 );

Instead, with the multi-view compliant path, an ovrSurfaceDef and GlProgram would be defined at initialization time as follows:

static ovrProgramParm BgTexProgParms[] =
{
    { "Texm",         ovrProgramParmType::FLOAT_MATRIX4 },
    { "UniformColor", ovrProgramParmType::FLOAT_VECTOR4 },
    { "Texture0",     ovrProgramParmType::TEXTURE_SAMPLED },
};
BgTexProgram = GlProgram::Build( BgTexVertexShaderSrc, BgTexFragmentShaderSrc,
                                 BgTexProgParms, sizeof( BgTexProgParms ) / sizeof( ovrProgramParm ) );

GlobeSurfaceDef.surfaceName = "Globe";
GlobeSurfaceDef.geo = BuildGlobe();
GlobeSurfaceDef.graphicsCommand.Program = BgTexProgram;
GlobeSurfaceDef.graphicsCommand.GpuState.depthEnable = false;
GlobeSurfaceDef.graphicsCommand.GpuState.cullEnable = false;
GlobeSurfaceDef.graphicsCommand.UniformData[0].Data = &BackGroundTexture;
GlobeSurfaceDef.graphicsCommand.UniformData[1].Data = &GlobeProgramColor;

At Frame time, the uniform values can be updated, changes to the GpuState can be made, and the surface(s) added to the render surface list. Note: This manner of uniform parm setting requires the application to maintain storage for the uniform data. Future SDKs will provide helper functions for setting up uniform parms and materials.

An example of setting up FrameResult using custom rendering follows:

ovrFrameResult OvrApp::Frame( const ovrFrameInput & vrFrame )
{
    ...
    // Fill in the frame result info for the frame.
    ovrFrameResult res;

    // Calculate the scene matrices for the frame.
    res.FrameMatrices.CenterView = vrapi_GetCenterEyeViewMatrix( &app->GetHeadModelParms(),
                                                                 &vrFrame.Tracking, NULL );
    for ( int eye = 0; eye < VRAPI_FRAME_LAYER_EYE_MAX; eye++ )
    {
        // Derive each eye view from the center eye view computed above.
        res.FrameMatrices.EyeView[eye] = vrapi_GetEyeViewMatrix( &app->GetHeadModelParms(),
                                                                 &res.FrameMatrices.CenterView, eye );
        res.FrameMatrices.EyeProjection[eye] = ovrMatrix4f_CreateProjectionFov( vrFrame.FovX, vrFrame.FovY,
                                                                                0.0f, 0.0f, 1.0f, 0.0f );
    }

    // Update uniform variables and add the needed surfaces to the surface list.
    BackGroundTexture = GlTexture( BackgroundTexId, 0, 0 );
    GlobeProgramColor = Vector4f( 1.0f, 1.0f, 1.0f, 1.0f );
    res.Surfaces.PushBack( ovrDrawSurface( &GlobeSurfaceDef ) );

    // Initialize the FrameParms.
    FrameParms = vrapi_DefaultFrameParms( app->GetJava(), VRAPI_FRAME_INIT_DEFAULT,
                                          vrapi_GetTimeInSeconds(), NULL );
    for ( int eye = 0; eye < VRAPI_FRAME_LAYER_EYE_MAX; eye++ )
    {
        FrameParms.Layers[0].Textures[eye].ColorTextureSwapChain = vrFrame.ColorTextureSwapChain[eye];
        FrameParms.Layers[0].Textures[eye].DepthTextureSwapChain = vrFrame.DepthTextureSwapChain[eye];
        FrameParms.Layers[0].Textures[eye].TextureSwapChainIndex = vrFrame.TextureSwapChainIndex;
        FrameParms.Layers[0].Textures[eye].TexCoordsFromTanAngles = vrFrame.TexCoordsFromTanAngles;
        FrameParms.Layers[0].Textures[eye].HeadPose = vrFrame.Tracking.HeadPose;
    }
    FrameParms.ExternalVelocity = Scene.GetExternalVelocity();
    FrameParms.Layers[0].Flags = VRAPI_FRAME_LAYER_FLAG_CHROMATIC_ABERRATION_CORRECTION;

    res.FrameParms = (ovrFrameParmsExtBase *) & FrameParms;
    return res;
}

Specify the Render Mode: In your app's Configure(), specify the appropriate render mode. To configure the app to render using the surfaces returned by Frame, set the following:

settings.RenderMode = RENDERMODE_STEREO;

Multi-view Render Path

Before enabling the multi-view rendering path, you will want to make sure your render data is multi-view compatible. This involves the following.

Position Calculation

App render programs should no longer specify Mvpm directly and should instead calculate gl_Position using the system-provided TransformVertex() function, which accounts for the correct view and projection matrix for the current viewID.

Per-Eye View Calculations

Apps will need to take into consideration per-eye-view calculations. Examples follow.

Per-Eye Texture Matrices: In the DrawEyeView path, the texture matrix for the specific eye was bound at the start of each eye render pass. For multi-view, an array of texture matrices indexed by VIEW_ID should be used. Note: Due to a driver issue with the Adreno 420 and version 300 programs, uniform arrays of matrices should be contained inside a uniform buffer object.

Stereo Images: In the DrawEyeView path, the image specific to the eye was bound at the start of each eye render pass. For multi-view, while an array of textures indexed by VIEW_ID would be preferable, not all supported platforms support texture arrays. Instead, specify both textures in the fragment shader, with the selection determined by VIEW_ID.

External Image Usage

Applications which make use of image_external, i.e. video rendering applications, must take care when constructing image_external shader programs. Not all drivers support image_external as version 300. The good news is that drivers which fully support multi-view will support image_external in the version 300 path, which means image_external programs will work


correctly when the multi-view path is enabled. However, for drivers which do not fully support multi-view, these shaders will be compiled as version 100. These shaders must continue to work in both paths, i.e., version 300-only constructs should not be used, and the additional extension specification requirements, listed above, should be met. For some cases, the cleanest solution may be to use image_external only during Frame to copy the contents of the external image to a regular texture2d, which is then used in the main app render pass (though this could eat into the multi-view performance savings).

Enable Multi-view Rendering

Finally, to enable the multi-view rendering path, set the render mode in your app's Configure() to the following:

settings.RenderMode = RENDERMODE_MULTIVIEW;

Fixed Foveated Rendering

Fixed Foveated Rendering (FFR) renders the edges of your eye textures at a lower resolution than the center. The effect, which is nearly imperceptible, lowers the fidelity of the scene in the viewer's peripheral vision. This reduces the GPU load as a result of the reduction in pixel shading requirements. The savings can be used to dramatically increase the resolution of the eye texture, improving the image shown in the headset. Complex fragment shaders also benefit from this form of multi-resolution rendering.

Unlike some other foveation technologies, Oculus Go's fixed foveation system is not based on eye tracking: the high-resolution pixels are fixed in the center of the eye texture. Fixed Foveated Rendering is only available on the Oculus Go. A detailed look at the benefits of using FFR can be found in our Optimizing Oculus Go for Performance blog post.

Implementing Fixed Foveated Rendering

First, you may wish to check if the device supports foveated rendering. Check the system property VRAPI_SYS_PROP_FOVEATION_AVAILABLE, which will return VRAPI_TRUE if foveated rendering is supported. Then, to use FFR, call the following to set the degree of foveation:

vrapi_SetPropertyInt( &Java, VRAPI_FOVEATION_LEVEL, level );

where level can be 0, 1, 2, or 3:

• 0 - disables multi-resolution
• 1 - low FFR setting
• 2 - medium FFR setting
• 3 - high FFR setting

These values set the degree of foveation. The images below demonstrate the degree to which the resolution will be affected.


Low FFR

Medium FFR


High FFR

In the images above, in the white areas at the center of the FOV, the resolution is native: every pixel of the texture will be computed independently by the GPU. However, in the red areas only 1/2 of the pixels will be calculated, 1/4 for the green areas, 1/8 for the blue areas, and 1/16 for the magenta tiles. The missing pixels will be interpolated from the calculated pixels at resolve time, when the GPU stores the result of its computation in general memory.

You may choose to change the degree of foveation based on the scene elements. Apps or scenes with high pixel shader costs will see the most benefit from using FFR. Apps with very simple shaders may see a net performance loss from the overhead of using FFR. Proper implementation of FFR requires testing and tuning to balance visual quality and GPU performance.
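For example, an app might vary the level per scene, as in this sketch (sceneIsGpuHeavy is a hypothetical app-side flag):

if ( vrapi_GetSystemPropertyInt( &Java, VRAPI_SYS_PROP_FOVEATION_AVAILABLE ) == VRAPI_TRUE )
{
    // Higher foveation for shader-heavy scenes, lower for simple or UI-heavy ones.
    vrapi_SetPropertyInt( &Java, VRAPI_FOVEATION_LEVEL, sceneIsGpuHeavy ? 3 : 1 );
}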

72 Hz Mode

The Oculus Go can optionally render your application at 72 frames per second rather than the normal 60. The resulting output is brighter, with improved color clarity. Note: This feature is only available on the Oculus Go.

To change the refresh rate, call vrapi_SetDisplayRefreshRate(). The available refresh rates can be queried with vrapi_GetSystemPropertyInt() using VRAPI_SYS_PROP_NUM_SUPPORTED_DISPLAY_REFRESH_RATES to get the number of supported rates, and vrapi_GetSystemPropertyFloatArray() with VRAPI_SYS_PROP_SUPPORTED_DISPLAY_REFRESH_RATES to get an array of the actual supported rates. An app rendering at 72 Hz requires roughly 20% more power than rendering the same content at 60 Hz.
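A minimal sketch of the query-then-set sequence described above (assuming `java` is the app's ovrJava struct and `ovr` its ovrMobile handle; error handling omitted):

#include <vector>
#include "VrApi.h"

void TryEnable72Hz( ovrMobile * ovr, const ovrJava * java )
{
    const int numRates = vrapi_GetSystemPropertyInt( java,
            VRAPI_SYS_PROP_NUM_SUPPORTED_DISPLAY_REFRESH_RATES );
    std::vector< float > rates( numRates );
    vrapi_GetSystemPropertyFloatArray( java,
            VRAPI_SYS_PROP_SUPPORTED_DISPLAY_REFRESH_RATES, rates.data(), numRates );

    // Request 72 Hz only if the device reports it as a supported rate.
    for ( const float rate : rates )
    {
        if ( rate == 72.0f )
        {
            vrapi_SetDisplayRefreshRate( ovr, 72.0f );
            break;
        }
    }
}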


Our Optimizing Oculus Go for Performance blog post contains recommendations for when to use 72 Hz mode.


Native Application Framework

The VrAppFramework provides a wrapper around the Android activity that manages the Android lifecycle. We have provided the simple sample scene VrCubeWorld_Framework to illustrate using the application framework with GuiSys, OVR_Locale, and SoundEffectContext. Please see Native Samples on page 24 for details.

Creating New Apps with the Framework Template

This section will get you started with creating new native applications using VrAppFramework.

Template Project Using the Application Framework

VrTemplate is the best starting place for creating your own mobile app using VrAppFramework. The VrTemplate project is set up for exploratory work and serves as a model for setting up similar native applications using the Application Framework. We include the Python script make_new_project.py (for Mac OS X) and make_new_project.bat (a wrapper for Windows) to simplify renaming the project name set in the template.

Usage Example: Windows

To create your own mobile app based on VrTemplate, perform the following steps:

1. Run \VrSamples\Native\VrTemplate\make_new_project.bat, passing the name of your new app and your company as arguments. For example:

make_new_project.bat VrTestApp YourCompanyName

2. Your new project will now be located in \VrSamples\Native\VrTestApp. The packageName will be set to com.YourCompanyName.VrTestApp.
3. Copy your oculussig file to the Project/assets/ folder of your project. (See Application Signing for more information.)
4. Navigate to your new project directory. With your Android device connected, execute the build.bat located inside your test app directory to verify everything is working.
5. build.bat should build your code, install it to the device, and launch your app on the device.

One parameter controls the build type:

• clean - cleans the project's build files
• debug - builds a debug application
• release - builds a release application
• -n - skips installation of the application to the phone

The Java file VrSamples/Native/VrTestApp/java/com/YourCompanyName/vrtestapp/MainActivity.java handles loading the native library that was linked against VrApi, then calls nativeSetAppInterface() to allow the C++ code to register the subclass of VrAppInterface that defines the application. See VrAppFramework/Include/App.h for comments on the interface.

The standard Oculus convenience classes for string, vector, matrix, array, et cetera are available in Oculus LibOVRKernel, located at LibOVRKernel\Src\. You will also find convenience code for OpenGL ES texture, program, geometry, and scene processing that is used by the demos.


UI and Input Handling

This guide describes resources for handling UI and input for native apps using VrAppFramework.

Native User Interface

Applications using the application framework have access to the VrGUI interface code. The VrGUI system is contained in VrAppSupport/VrGui/Src. Menus are represented as a generic hierarchy of menu objects. Each object has a local transform relative to its parent and local bounds. VrGUI may be used to implement menus in a native application, such as the Folder Browser control used in Oculus 360 Photos SDK and Oculus 360 Videos SDK (found in FolderBrowser.cpp).

VrGUI allows for functional improvements by implementing a component model. When a menu object is created, any number of components can be specified. These components derive from a common base class, defined in VRMenuComponent.h, that handles events sent from the parent VRMenu. Components can be written to handle any event listed in VRMenuEvent.h. Events are propagated to child objects either by broadcasting to all children or by dispatching them along the path from the menu's root to the currently focused object. When handling an event, a component can consume it by returning MSG_STATUS_CONSUMED, which will halt further propagation of that event instance. See VRMenuEventHandler.cpp for implementation details. Examples of reusable components can be found in the native UI code, including DefaultComponent.h, ActionComponents.h, and ScrollBarComponent.h.

Input Handling

Input to the application is intercepted in the Java code in VrActivity.java in the dispatchKeyEvent() method. If the event is NOT of type ACTION_DOWN or ACTION_UP, the event is passed to the default dispatchKeyEvent() handler. If this is a volume up or down action, it is handled in Java code. Otherwise the key is passed to the buttonEvent() method, which passes the event to nativeKeyEvent(). nativeKeyEvent() posts the message to an event queue, which is then handled by the AppLocal::Command() method.

In previous versions of the SDK, AppLocal::Command() called AppLocal::KeyEvent() with the event parameters. However, in 0.6.0 this was changed to buffer the events into the InputEvents array. Up to 16 events can be buffered this way, per frame. Each time the VrThreadFunction executes a frame loop, these buffered events are passed into VrFrameBuilder::AdvanceVrFrame(), which composes the VrFrame structure for that frame. The InputEvents array is then cleared. After composition, the current VrFrame is passed to FrameworkInputProcessing(), which iterates over the input event list and passes individual events to the native application via VrAppInterface::OnKeyEvent().

Native applications using the VrAppFramework should overload this method in their implementation of VrAppInterface to receive key events. The application is responsible for sending events to the GUI or other systems from its overloaded VrAppInterface::OnKeyEvent(), and returning true if the event is consumed at any point. If OnKeyEvent() returns true, VrAppFramework assumes the application consumed the event and will not act upon it. If the application passes an input event to the VrGUI System, any system menus or menus created by the application have a chance to consume it in their OnEvent_Impl implementation by returning MSG_STATUS_CONSUMED, or to pass it to other menus or systems by returning MSG_STATUS_ALIVE. A sketch of this pattern follows.
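A minimal sketch of the OnKeyEvent() flow (MyApp and its GuiSys member are illustrative; the exact OnKeyEvent() signature varies between SDK versions, so check VrAppFramework/Include/App.h for your release):

bool MyApp::OnKeyEvent( const int keyCode, const int repeatCount, const KeyEventType eventType )
{
    // Give the VrGUI system the first chance to consume the event
    // (e.g., a focused menu returning MSG_STATUS_CONSUMED).
    if ( GuiSys->OnKeyEvent( keyCode, repeatCount, eventType ) )
    {
        return true;    // consumed: the framework will not act on this event
    }
    return false;       // not consumed: the framework may apply default handling
}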


Native SoundEffectContext

Use SoundEffectContext to easily play sound effects and replace sound assets without recompilation. SoundEffectContext consists of a simple sound asset management class, SoundAssetMapping, and a sound pool class, SoundPool. The SoundAssetMapping is controlled by a JSON file in which sounds are mapped as key-value pairs, where a value is the actual path to the .wav file. For example:

"sv_touch_active" : "sv_touch_active.wav"

In code, we use the key to play the sound, which SoundEffectContext then resolves to the actual asset. For example:

soundEffectContext->Play( "sv_touch_active" );

The string "sv_touch_active" is first passed to SoundEffectContext, which resolves it to an absolute path, as long as the corresponding key was found during initialization. The following two paths specify whether the sound file is in the res/raw folder of VrAppFramework (e.g., sounds that may be played from any app, such as default sounds or Settings Menu sounds) or the assets folder of a specific app:

"res/raw/sv_touch_active.wav"

or

"assets/sv_touch_active.wav"

If SoundEffectContext fails to resolve the passed-in string within the SoundEffectContext::Play function, the string is passed to SoundPooler.play in Java. In SoundPooler.play, we first try to play the passed-in sound from res/raw, and if that fails, from the current assets folder. If that also fails, we attempt to play it as an absolute path. The latter allows for sounds to be played from the phone's internal memory or SD card.

The JSON file loaded by SoundAssetMapping determines which assets are used with the following scheme:

1. Try to load sound_assets.json in the Oculus folder on the sdcard: sdcard/Oculus/sound_assets.json
2. If the above file is not found, load the following two files, in this order: res/raw/sound_assets.json, then assets/sound_assets.json.

The loading of the sound_assets.json in the first case allows for a definition file and sound assets to be placed on the SD card in the Oculus folder during sound development. The sounds may be placed into folders if desired, as long as the relative path is included in the definition. For example, if we define the following in sdcard/Oculus/sound_assets.json:

"sv_touch_active" : "SoundDev/my_new_sound.wav"

we would replace all instances of that sound being played with our new sound within the SoundDev folder. The loading of the two asset definition files in the second step allows for overriding the framework's built-in sound definitions, including disabling sounds by redefining their asset as the empty string. For example:

"sv_touch_active" : ""

The above key-value pair, if defined in an app’s sound_assets.json (placed in its asset folder), will disable that sound completely, even though it is still played by other VrAppSupport code such as VrGUI.


You will find the sound effect source code in VrAppSupport/VrSound/.

Runtime Threads

The UI thread is the launch thread that runs the normal Java code. The VR thread is spawned by the UI thread and is responsible for initialization, regular frame updates, and drawing the eye buffers. All of the AppInterface functions are called on the VR thread. You should put any heavyweight simulation code in another thread, so the VR thread basically just runs drawing code and simple frame housekeeping. Currently this thread can be set to the real-time SCHED_FIFO mode to get more deterministic scheduling, but the time spent in this thread may have to be limited.

Non-trivial applications should create additional threads; for example, music player apps run decode and analysis in threads, app launchers load JPG image tiles in threads, et cetera. Whenever possible, do not block the VR thread on any other thread. It is better to have the VR thread at least update the view with new head tracking, even if the world simulation hasn't finished a time step. A sketch of this pattern is shown below.

The Talk To Java (TTJ) thread is used by the VR thread to issue Java calls that aren't guaranteed to return almost immediately, such as playing sound pool sounds or rendering a toast dialog to a texture.

Sensors have their own thread so they can be updated at 500 Hz.
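A minimal sketch of keeping heavyweight simulation off the VR thread (illustrative C++11 code; the class and method names are hypothetical, not part of VrAppFramework):

#include <atomic>
#include <mutex>
#include <thread>

struct SimState { /* world state produced by one simulation step */ };

class BackgroundSim
{
public:
    void Start()
    {
        Running = true;
        Worker = std::thread( [this]()
        {
            while ( Running )
            {
                SimState s = StepSimulation();              // heavyweight work, off the VR thread
                std::lock_guard< std::mutex > lock( Mutex );
                Latest = s;
            }
        } );
    }

    void Stop()
    {
        Running = false;
        if ( Worker.joinable() ) { Worker.join(); }
    }

    // Called from the VR thread each frame: returns immediately with the most
    // recent completed state, so head tracking is never blocked on simulation.
    SimState GetLatest()
    {
        std::lock_guard< std::mutex > lock( Mutex );
        return Latest;
    }

private:
    SimState StepSimulation();      // hypothetical; defined by the application

    std::thread         Worker;
    std::atomic< bool > Running{ false };
    std::mutex          Mutex;
    SimState            Latest;
};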


Other Native Libraries

This guide describes other native libraries included with the Mobile SDK.

Overview

Supplemental libraries included with this SDK are:

• VrAppSupport
• LibOVRKernel

VrAppSupport

The VrGui library contains functionality for a fully 3D UI implemented as a scene graph. Each object in the scene graph can be functionally extended using components. This library has dependencies on VrAppFramework and must be used in conjunction with it. See the CinemaSDK, Oculus360PhotosSDK, and Oculus360Videos SDKs for examples.

The VrLocale library is a wrapper for accessing localized string tables. The wrapper allows custom string tables to be used seamlessly with Android's own string tables. This is useful when localized content is not embedded in the application package itself. This library depends on VrAppFramework.

The VrModel library implements functions for loading 3D models in the .ovrScene format. These models can be exported from .fbx files using the FbxConverter utility (see FBXConverter). This library depends on rendering functionality included in VrAppFramework.

The VrSound library implements a simple wrapper for playing sounds using the android.media.SoundPool class. This library does not provide low-latency or 3D positional audio and is only suitable for playing simple UI sound effects.

LibOVRKernel

LibOVRKernel is a reduced set of libraries consisting primarily of low-level functions, containers, and mathematical operations. It is an integral part of VrAppFramework, but may generally be disregarded by developers using VrApi.


Media and Assets

This guide describes how to work with still images, videos, and other media for use with mobile VR applications.

Mobile VR Media Overview

Author all media, such as panoramas and movies, at the highest-possible resolution and quality, so they can be resampled to different resolutions in the future. This topic entails many caveats and tradeoffs.

Panoramic Stills

Use 4096x2048 equirectangular projection panoramas for both games and 360 photos. 1024x1024 cube maps are for games, and 1536x1536 cube maps are for viewing in 360 Photos with the overlay code.

Panoramic Videos

The Qualcomm H.264 video decoder is spec-driven by the ability to decode 4K video at 30 FPS, but it appears to have some headroom above that. The pixel rate can be flexibly divided between resolution and frame rate, so you can play a 3840x1920 video at 30 FPS or a 2048x2048 video at 60 FPS (see the arithmetic sketch below). The Android software layer appears to have an arbitrary limit of 2048 rows on video decode, so you may not choose to encode, say, a 4096x4096 video at 15 FPS. The compressed bit rate does not appear to affect the decoding rate; panoramic videos at 60 Mb/s decode without problems, but most scenes should be acceptable at 20 Mb/s or so.

The conservative specs for panoramic video are: 2880x1440 at 60 FPS, or 2048x2048 at 60 FPS stereo. If the camera is stationary, 3840x1920 at 30 FPS video may be considered. The GALAXY S7 decoder allows 4K at 10-bit color, 3840x2160 at 60 FPS (HEVC), or 8-bit, 3840x2160 at 60 FPS (VP9). The GALAXY S8 decoder allows 4K at 3840x2160 at 60 FPS.

Oculus 360 Video is implemented using equirectangular mapping to render panoramic videos. Top-bottom, bottom-top, left-right, and right-left stereoscopic video support is implemented using the following naming convention for videos:

"_TB.mp4"   Top / bottom stereoscopic panoramic video
"_BT.mp4"   Bottom / top stereoscopic panoramic video
"_LR.mp4"   Left / right stereoscopic panoramic video
"_RL.mp4"   Right / left stereoscopic panoramic video
Default     Non-stereoscopic video if width does not match height; otherwise loaded as top / bottom stereoscopic video
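As a rough worked check of the pixel-rate tradeoff mentioned above (approximate arithmetic, using the document's own figures):

3840 x 1920 x 30 FPS ≈ 221 Mpixels/s
2048 x 2048 x 60 FPS ≈ 252 Mpixels/s
4096 x 2048 x 30 FPS ≈ 252 Mpixels/s (the 4K @ 30 FPS reference budget)

All three fall in the same ballpark, which is why resolution and frame rate can be traded against each other within the decoder's budget.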


Movies on Screens

Comfortable viewing size for a screen is usually less than 70 degrees of horizontal field of view, which allows the full screen to be viewed without turning your head significantly. For video playing on a surface in a virtual world using the recommended 1024x1024 eye buffers, anything over 720x480 DVD resolution is wasted, and if you don't explicitly build mipmaps for it, it will alias and look worse than a lower-resolution video.

With the TimeWarp overlay plane code running in Oculus Cinema on the 1440 devices, 1280x720 HD resolution is a decent choice. The precise optimum depends on seating position and may be a bit lower, but everyone understands 720P, so it is probably best to stick with that. Use more bit rate than a typical web stream at that resolution, as the pixels will be magnified so much. The optimal bit rate is content-dependent, and many videos can get by with less, but 5 Mb/s should give good quality. 1080P movies play, but the additional resolution is wasted and power consumption is needlessly increased.

3D movies should be encoded "full side by side" with a 1:1 pixel aspect ratio. Content mastered at 1920x1080 compressed side-by-side 3D should be resampled to 1920x540 full side-by-side resolution.

Movie Meta-data

When loading a movie from the sdcard, Oculus Cinema looks for a sidecar file with metadata. The sidecar file is simply a UTF8 text file with the same filename as the movie, but with the extension .txt. It contains the title, format (2D/3D), and category:

{
    "title":    "Introducing Henry",
    "format":   "2D",
    "category": "trailers",
    "theater":  ""
}

Title is the name of the movie. Oculus Cinema will use this value instead of the filename to display the movie title.

Format describes how the film is formatted. If left blank, it will default to 2D (unless the movie has '3D' in its pathname). Format may be one of the following values:

2D      Full screen 2D movie
3D      3D movie with left and right images formatted side-by-side
3DLR    3D movie with left and right images formatted side-by-side
3DLRF   3D movie with left and right images formatted side-by-side full screen (for movies that render too small in 3DLR)
3DTB    3D movie with left and right images formatted top-and-bottom
3DTBF   3D movie with left and right images formatted top-and-bottom full screen (for movies that render too small in 3DTB)

Category can be one of the following values:

Blank        Movie accessible from "My Videos" tab in Oculus Cinema
Trailers     Movie accessible from "Trailers" tab in Oculus Cinema
Multiscreen  Movie accessible from "Multiscreen" tab in Oculus Cinema

Oculus 360 Photos and Videos Meta-data

The retail version of 360 Photos stores all its media attribution information in a meta file that is packaged into the apk. This allows the categories to be dynamically created and/or altered without the need to modify the media contents directly. For the SDK 360 Photos, the meta file is generated automatically using the contents found in Oculus/360Photos. The meta data has the following structure in a meta.json file, which is read in from the assets folder:

{
    "Categories" : [
        { "name" : "Category1" },
        { "name" : "Category2" }
    ],
    "Data" : [
        {
            "title" : "Media title",
            "author" : "Media author",
            "url" : "relative/path/to/media",
            "tags" : [
                { "category" : "Category2" }
            ]
        },
        {
            "title" : "Media title 2",
            "author" : "Media author 2",
            "url" : "relative/path/to/media2",
            "tags" : [
                { "category" : "Category" },
                { "category" : "Category2" }
            ]
        }
    ]
}

For both the retail and sdk versions of 360 Videos, the meta data structure is not used and instead the categories are generated based on what’s read in from the media found in Oculus/360Videos.

Media Locations

The SDK comes with three applications for viewing stills and movies. The Oculus Cinema application can play both regular 2D movies and 3D movies. The Oculus 360 Photos application can display 360 degree panoramic stills, and the Oculus 360 Videos application can display 360 degree panoramic videos. These applications can automatically search for media in specific folders on the device. Oculus 360 Photos uses metadata which contains the organization of the photos it loads, in addition to allowing the data to be dynamically tagged and saved out to its persistent cache. This is how the "Favorite" feature works, allowing the user to mark photos as favorites without any change to the media storage itself. The following table indicates where to place additional media for these applications.

Media                          Folders                          Application
2D Movies                      Movies\                          Oculus Cinema
                               DCIM\
                               Oculus\Movies\My Videos
3D Movies                      Movies\3D                        Oculus Cinema
                               DCIM\3D
                               Oculus\Movies\My Videos\3D
360 degree panoramic stills    Oculus\360Photos                 Oculus 360 Photos
                               (In the app assets\meta.json)
360 degree panoramic videos    Oculus\360Videos                 Oculus 360 Videos
Movie Theater .ovrscene        Oculus\Cinema\Theaters           Oculus Cinema

Native VR Media Applications

Oculus Cinema

Oculus Cinema uses the Android MediaPlayer class to play videos, both conventional (from /sdcard/Movies/ and /sdcard/DCIM/) and side-by-side 3D (from /sdcard/Movies/3D and /sdcard/DCIM/3D), in a virtual movie theater scene (from /sdcard/Oculus/Cinema/Theaters). See Mobile VR Media Overview for more details on supported image and movie formats.

Before entering a theater, Oculus Cinema allows the user to select different movies and theaters. A theater is typically one big mesh with two textures: one texture with baked static lighting for when the theater lights are on, and another texture that is modulated based on the movie when the theater lights are off. The lights are gradually turned on or off by blending between these two textures. To save battery, the theater is rendered with only one texture when the lights are completely on or completely off. The texture with baked static lighting is specified as the diffuse color texture. The texture that is modulated based on the movie is specified as the emissive color texture.

The two textures are typically 4096x4096 with 4 bits/texel in ETC2 format. Using larger textures may not work on all devices. Using multiple smaller textures results in more draw calls and may not allow all geometry to be statically sorted to reduce the cost of overdraw. The theater geometry is statically sorted to guarantee front-to-back rendering on a per-triangle basis, which can significantly reduce the cost of overdraw. In addition to the mesh and textures, Oculus Cinema currently requires a 350x280 icon for the theater selection menu.


Oculus 360 Photos

Oculus 360 Photos is a viewer for panoramic stills. The SDK version of the application presents a single category of panorama thumbnail panels, which are loaded in from Oculus/360Photos on the SDK sdcard. Gazing toward the panels and then swiping forward or back on the Gear VR touchpad will scroll through the content. When viewing a panorama still, touch the Gear VR touchpad again to bring up the panorama menu, which displays the attribution information if properly set up. Additionally, the top button or tapping the back button on the Gear VR touchpad will bring back the thumbnail view. The bottom button will tag the current panorama as a Favorite, which adds a new category at the top of the thumbnail views with the panorama you tagged. Pressing the Favorite button again will untag the photo and remove it from Favorites. Gamepad navigation and selection is implemented via the left stick and d-pad used to navigate the menu; the single-dot button selects, and the 2-dot button backs out of a menu. See Mobile VR Media Overview for details on creating custom attribution information for panoramas.

Oculus 360 Videos

Oculus 360 Videos works similarly to 360 Photos, as the two share the same menu functionality. The application also presents a thumbnail panel of the movies read in from Oculus/360Videos, which can be gaze-selected to play. Touch the Gear VR touchpad to pause the movie and bring up a menu. The top button will stop the movie and bring up the movie selection menu. The bottom button restarts the movie. Gamepad navigation and selection is implemented via the left stick and d-pad used to navigate the menu; the single-dot button selects, and the 2-dot button backs out of a menu. See Mobile VR Media Overview for details on the supported image and movie formats.


Testing and Troubleshooting

Welcome to the testing and troubleshooting guide. This guide describes tools and procedures necessary for testing and troubleshooting mobile VR applications.

Tools and Procedures

Android System Properties

Use Android System Properties to set various device configuration options for testing and debugging. Local Preferences are being deprecated and should no longer be used for setting debug options.

System Properties

Note: System Properties reset when the device reboots. All commands are case sensitive.

Basic Usage

Set value:

adb shell setprop <name> <value>

Example: adb shell setprop debug.oculus.frontbuffer 0

Result: Disables front buffer rendering.

Get current value:

adb shell getprop <name>

Example: adb shell getprop debug.oculus.gpuLevel

Result: Returns currently-configured GPU Level, e.g., 1. Get all current values: adb shell getprop | fgrep "debug.oculus"

Result: Returns all Oculus-related System Property settings on the device.


The following properties are available (accepted values in parentheses):

debug.oculus.enableCapture (0, 1): Enables support for Oculus Remote Monitor to connect to the application.

debug.oculus.cpuLevel (0, 1, 2, 3): Changes the fixed CPU level.

debug.oculus.gpuLevel (0, 1, 2, 3): Changes the fixed GPU level.

debug.oculus.asyncTimewarp (0, 1): Set to 0 to disable Asynchronous TimeWarp and enable Synchronous TimeWarp. Default is 1 (ATW is enabled).

debug.oculus.frontbuffer (0, 1): Disables/enables front buffer rendering.

debug.oculus.gpuTimings (0, 1, 2): Turns on GPU timings in logcat (off by default due to instability). A setting of 2 is necessary on phones using the Mali GPU.

debug.oculus.simulateUndock (-1, 0, 1, ..., 10): Simulates an undocking event after the specified number of seconds.

debug.oculus.enableVideoCapture (0, 1): When enabled, each enterVrMode generates a new mp4 file in /sdcard/oculus/VideoShots/. Videos are full resolution, undistorted, single-eye, with full compositing support. Defaults are 1024 resolution at 5 Mb/s.

debug.oculus.phoneSensors (0, 1): Set to 0 to disable the limited orientation tracking provided by phone sensors while in Developer Mode (enabled by default).

debug.oculus.textureWidth (1-max): Sets the default VrApi texture width (default = 1024). Allows resetting apps to different resolutions for testing without repackaging.

debug.oculus.textureHeight (1-max): Sets the default VrApi texture height (default = 1024). Allows resetting apps to different resolutions for testing without repackaging.

debug.oculus.videoBitrate (no defined values): Sets the video capture bitrate (default = 5 Mb/s). Note: bitrates above 5 Mb/s may have a significant performance cost.

debug.oculus.videoResolution (no defined values): Sets the video capture resolution (default = 1024). Common resolutions are 512, 720, and 1024. Resolutions above 1024 may have a significant performance cost.


Screenshot and Video Capture

Full-resolution, undistorted, single-eye, full-layer-support 2D screenshots and video capture for VR apps are available through the sharing menu. Video capture is also available by setting an Android System Property (see below).

Photo and Video Capture Using the Sharing Menu

The sharing menu, available by pressing the Oculus button and selecting Sharing, allows you to take screenshots and record video anywhere in VR.

These captures and recordings are stored locally until you export them.

Video Capture using Android System Properties

Note: Screenshot capture is not available through Android System Properties. To enable video capture, set the debug.oculus.enableVideoCapture property to 1 with the following command:

adb shell setprop debug.oculus.enableVideoCapture 1

When enabled, every vrapi_EnterVrMode() creates a new mp4 file. For example, if you launch an app from Home, you may find one video file for your Home session, one for the app you launch, one for System Activities if you long-press, and so forth. To help ensure that there is no disruption to the play experience while recording, you may wish to force the GPU level up and chromatic correction off:

adb shell setprop debug.oculus.enableVideoCapture 1
adb shell setprop debug.oculus.gpuLevel 3

The FOV is reduced to 80 degrees, so you are unlikely to see any black pull-in at the edges.


Note: Make sure to disable video capture when you are done, or you will fill your phone up with video of every VR session you run!

Oculus Remote Monitor

The Oculus Remote Monitor client connects to VR applications running on remote devices to capture, store, and analyze data streams. Oculus Remote Monitor is compatible with any Oculus mobile application built with Unity, Unreal Engine, or native development tools.

Download the Oculus Remote Monitor client:

• Oculus Remote Monitor for Windows
• Oculus Remote Monitor for macOS


Setup

Enable Capture Server

Note: Some of the menu locations described in this process might differ slightly depending on your Samsung model.

To enable the Capture Server on a device:

1. Enable Developer Mode (instructions here) and USB Debugging on your device (instructions here).
2. Plug your device into your computer via USB.
3. If your device requests permission to "Allow USB debugging", accept.
4. Open Oculus Remote Monitor and select Settings > Set ADB Path. Verify that the ADB Path is set to your local installation of adb (included with the Android SDK). For example: C:\Users\$username\AppData\Local\Android\sdk\platform-tools\adb.exe
5. Click Enable/Disable Remote Capture Server in the Remote Capture menu. You should now see the device ID of your connected device in the Device Settings list. Select your device ID and click OK.

The Capture Server will now run automatically whenever a VR application starts. You may need to restart your VR app for it to appear for debugging.

Note: If you reboot your device, the Device Settings are reset and you must click the Enable Capture check box again.

Network Setup

Oculus Remote Monitor uses a UDP-broadcast-based auto-discovery mechanism to locate remote hosts, and then a TCP connection to access the capture stream. The host and the client must be on the same subnet, and the network must not block or filter UDP broadcasts or TCP connections. If you are on a network that may have such restrictions (e.g., a corporate network), we recommend setting up a dedicated network or tethering directly to your device. Furthermore, frame buffer capturing is extremely bandwidth-intensive. If your signal strength is low or you have a lot of interference or traffic on your network, you may need to disable Capture Frame Buffer before connecting to improve capture performance. Oculus Remote Monitor uses UDP port 2020 and TCP ports 3030-3040.

Application Setup

Because we use a direct network connection, the following permission is required in your application's AndroidManifest.xml:

<uses-permission android:name="android.permission.INTERNET" />

Basic Usage

Stream Remote Capture

If the host and client are on the same subnet and the network is configured correctly (see Network Setup above), Oculus Remote Monitor automatically discovers any compatible applications running on the network. To begin capturing and viewing data:

1. Click Start Remote Capture in the Remote Capture menu.
2. Select the host you want to monitor.
3. (Optional) Click Session Settings and select the session settings you want to monitor.
4. Click OK.

Open Capture File

Each time you connect to a host, Oculus Remote Monitor automatically compresses and saves the incoming data stream to disk under a unique filename in the format package-YYYYMMDD-HHMMSS-ID.dat. The default recordings directory for these files is the OVRMonitorRecordings subfolder of your Documents folder. You can change this directory in the Settings menu. To open saved capture files, click File > Open Local Capture File.

Performance Overview

The Performance Overview plots a graphical summary of the VrApi messages and error conditions against a timeline. To view it, click the Overview tab, then click the Performance Overview icon. Move the pointer over the performance overview to reveal details of the collected data. Double-click anywhere in the overview to open the Profiler Data view at that precise point in the timeline.

Frame Buffer

Screen captures of the pre-distorted frame buffer. Move the pointer over this section to view the screenshots captured at that point in time.


Frames per Second

• Delivered Frames. A well-performing application continuously delivers 60 frames per second. • Screen Tears. The number of tears per second. A well-performing application does not exhibit any tearing. • Early Frames. The number of frames that are completed a whole frame early. • Stale Frames. The number of stale frames per second. A well-performing application does not exhibit any stale frames. For more information, see Basic Performance Stats through Logcat on page 85.

Head-Pose Prediction Latency (ms): The number of milliseconds between the latest sensor sampling for tracking and the anticipated display time of new eye images.

Performance Levels: The CPU and GPU clock levels and associated clock frequencies, set by the application. Lower clock levels result in less heat and less battery drain.

Thermal (°C): Temperatures in degrees Celsius. Well-optimized applications do not cause the temperature to rise quickly. There is always room for more optimization, which allows lower clock levels to be used, which, in turn, reduces the amount of heat that is generated.

Available Memory (GB): The amount of available memory, displayed every second. It is important to keep a reasonable amount of memory available to prevent Android from killing backgrounded applications, like Oculus Home.

Console

VrApi reports various messages and error conditions to Android's logcat as well as to the Oculus Remote Monitor, which provides the thread and timestamp of each message. The Logging Viewer provides raw access to this data. To view the log, click the Logs tab. Warnings and errors are color-coded to stand out more easily, and unlike logcat, thread IDs are tracked so you know exactly when and where each message originated.


Remote Variables Applications may expose user-adjustable parameters and variables in their code. Nearly any constant in your code may be turned into a knob that can be updated in real-time during a play test. When streaming a capture, a control panel allowing you to view and adjust the available remote variables will automatically appear.

VrApi exposes CPU and GPU Levels to allow developers to quickly identify their required clock rates. Applications are also free to expose their own options that you can select or adjust. ShowTimeWarpTextureDensity: this experimental feature toggles a visualization mode that colors the screen based on the texel:pixel ratio when running TimeWarp (green indicates a 1:1 ratio; dark green, less than 1:1; red, greater than 2:1).

Settings

You may view and adjust the following settings in the Settings menu.

ADB Path: Configurable at any time. If it does not point to a valid executable when Monitor runs, Monitor will attempt to locate a valid copy of adb by checking the environment variable. If that fails, Monitor will search under ANDROID_HOME.

Recordings Directory: Specifies the location in which Monitor will automatically store capture files when connected to a remote host. The default is the current user's Documents directory, under OVRMonitorRecordings.


Frame Buffer Compression Quality: Used for client-side recompression of the frame buffer. This helps offload compression load from the host while allowing for significant savings in memory usage on the client. Lower-quality settings provide greater memory savings, but may result in blocking artifacts in the frame buffer viewer.

Profiler Data

The Profiler Data view provides both real-time and offline inspection of the following data streams on a single, contiguous timeline:

• CPU/GPU events
• Sensor readings
• Console messages, warnings, and errors
• Frame buffer captures

VrApi also has a number of other events embedded to help diagnose VR-specific scheduling issues. To view the data streams, click the Profiler tab. The Profiler Data view has a number of controls to help you analyze the timeline:

• The space bar toggles between real-time timeline scrolling and freezing at a specific point in time. This lets you quickly alternate between watching events unfold in real-time and pausing to focus on a point of interest without stopping or restarting.
• The mouse wheel zooms the view in and out.
• Right-click to save a screenshot of the current view or to hide a data stream. To un-hide hidden data streams, click Show Hidden Data Streams.
• Click and drag to pan the timeline forward or backward in time.


The Performance Data Viewer screen shows a selected portion of the application timeline:

Frame Buffer: Provides screen captures of the pre-distorted frame buffer, timestamped the moment immediately before the frame was handed off to the TimeWarp context. The left edge of each screenshot represents the point in time at which it was captured from the GPU.

VSync: Displays notches on every driver v-sync event.

GPU Context: GPU Zones inserted into the OpenGL command stream via Timer Queries are displayed in a similar manner as CPU events. Each row corresponds to a different OpenGL context. Typical VR applications will have two contexts: one for TimeWarp, and one for application rendering. Note that on tiler GPUs, these events should be regarded as rough estimates rather than absolute data.

CPU Thread: Hierarchical visualization of wall-clock time of various functions inside VrApi, along with OpenGL draw calls inside the host application. Log messages are displayed on their corresponding CPU thread as icons. Mouse over each icon to display the corresponding message (blue circles), warning (yellow squares), or error (red diamonds).

Sensor: General sensor data visualizer. CPU and GPU clocks are visualized in the screenshot shown above, but other data may be displayed here, such as thermal sensors, IMU data, et cetera.


Using Oculus Remote Monitor to Identify Common Issues

Tearing

Tearing occurs in VR applications any time TimeWarp rendering fails to render ahead of scanout. VrApi attempts to detect this with a GPU Sync Object to determine when the GPU completes rendering distortion for a given eye. If for any reason it does not complete in time, VrApi prints a warning to logcat, which Oculus Remote Monitor picks up:

V-sync %d: Eye %d, CPU latency %f, GPU latency %f, Total latency %f

If you are running on a Samsung GALAXY S6, you may also see tearing events by looking at the GPU Context that is running WarpToScreen. Example:


In this example, because the refresh rate of the display is 60 Hz, the ideal running time of WarpToScreen is 16.66 ms, but a scheduling/priority issue in the application caused the second eye to be executed 10 ms late, pushing WarpToScreen to run for 26.81 ms. The actual eye distortion draw calls are barely visible as two distinct notches under each WarpToScreen event on the GPU.

Dropped Frames

Your Oculus Remote Monitor capture will show you any lost frames during the capture session, along with information about how the app was performing at the time the frame was lost. Use this data and the corresponding screenshot to troubleshoot problem areas in your app. Information about CPU/GPU levels can be found in Power Management on page 40.


Application OpenGL Performance Issues

Oculus Remote Monitor is capable of capturing OpenGL calls across the entire process (enabled with the Graphics API option). Application-specific performance issues can therefore be spotted at times. In the example below, an application was mapping the same buffer several times a frame for a particle effect. On a Note 4 running KitKat, the GL driver triggered a sync point on the second glUnmapBuffer call, causing it to take 2.73 ms; without the sync point, the same call takes around 0.03 ms. After spotting this issue, the developer was able to quickly fix the buffer usage and reclaim that CPU time.


Unlocked CPU Clocks

VrApi attempts to lock the CPU and GPU clocks at particular frequencies to ensure some level of execution speed and scheduling guarantees. These are configurable via the CPULevel and GPULevel available in the API. When in VR Developer Mode, the clocks may occasionally unlock if the device is out of the headset for too long. When this happens, the CPU/GPU Clock Sensors go from extremely flat to extremely noisy, typically causing many performance issues such as tearing and missed frames, as seen below:


Note that on a Samsung GALAXY S6, we allow the clocks to boost slightly under certain conditions, but only by a small amount in typical cases, and it should never drop below the requested level. It is also fairly common for some cores to go completely on and offline occasionally. Network Bandwidth Stalls VrCapture uses a fixed-size FIFO internally to buffer events before they are streamed over the network. If this buffer fills faster than we can stream its contents, we are left in a tricky situation. If your network connection stalls long enough for any reason, it eventually causes the host application to stall as well. This is easily spotted in Oculus Remote Monitor - look for around two seconds of extremely long events on the OVR::Capture thread followed by other threads stalling as well. We provide a large internal buffer, so a few network hitches shouldn’t affect your application, but if they persist long enough, a random event inside your application will eventually stall until the Capture thread is able to flush the buffer. In the example below, several seconds of poor network connectivity (see by long AsyncStream_Flush events) eventually caused the MapAndCopy event on the application’s render thread to stall until it was eventually released by the Capture thread:


If you find it difficult to capture reliably because of this issue, we recommend disabling Frame Buffer Capturing before connecting, as this feature consumes the bulk of the bandwidth required.

Thermal Limits

If the CPU/GPU levels your application requests are too high, the internal SoC and battery temperatures will rise slowly but uncontrollably until they hit the thermal limit. When this happens, GearVR Service will terminate the application and display a thermal warning until the device cools down. It may take quite a long time to encounter this scenario during testing. Monitoring thermals in Oculus Remote Monitor is a great way to quickly see whether your application causes the device temperature to rise perpetually.


Mouse over the Sensor graph to see the precise readout at any given time. We recommend keeping an eye on it. If the temperature exceeds the device's first thermal trip point, the graph turns bright red; this typically occurs a few minutes before GearVR Service shuts the application down, and should serve as a warning that you should probably lower the CPU/GPU levels.

OVR Metrics Tool

OVR Metrics Tool is an application that provides performance metrics for Oculus mobile applications. OVR Metrics Tool reports application frame rate, heat, GPU and CPU throttling values, utilization, and the number of tears and stale frames per second. It is available for download from our Downloads page.

OVR Metrics Tool can be run in two modes. In Report Mode, it displays a performance report about a VR session after the session is complete. Report data may be easily exported as CSV data and PNG graphs. In Performance HUD Mode, OVR Metrics Tool renders performance graphs as a VR overlay over any running Oculus application. OVR Metrics Tool may be used with any Oculus mobile application, including those built with Unity, Unreal, or our native mobile SDK.

Installation

Install OVRMetricsTool.apk on any Gear VR-compatible Android phone. For more details, see "Using adb to Install Applications" in our adb guide or the "Install an App" section of Google's adb documentation.

Report Mode Usage

Run your VR application and conduct a session you wish to gather data for. Note that data will be logged from every application you run, including Oculus Home. After you have finished your session and exited VR, open OVR Metrics Tool from your phone's launcher and click the log entry that corresponds to your application. You will see a series of graphs describing the performance of your session. Use the buttons at the bottom of the report to save the information or share an image of the report.


Performance HUD Usage

Connect to your device using adb (for general instructions, see our adb guide), then enable Performance HUD Mode with the following commands:

adb shell setprop debug.oculus.notifPackage com.oculus.ovrmonitormetricsservice
adb shell setprop debug.oculus.notifClass com.oculus.ovrmonitormetricsservice.PanelService
adb shell am force-stop com.oculus.ovrmonitormetricsservice

After setting these properties, restart OVR Metrics Tool. This can be done over adb with the following command:

adb shell am force-stop com.oculus.ovrmonitormetricsservice

Note that these properties must be set after each phone restart. Once Performance HUD Mode is enabled, you can customize the HUD itself using the Options screen in the OVR Metrics Tool toolbar menu or by using the following adb commands: adb shell setprop debug.oculus.omms.enableGraph (true|false) // show or hide the graph adb shell setprop debug.oculus.omms.enableGraph2 (true|false) // show or hide a secondary graph stat (dependent on stat)

80 | Testing and Troubleshooting | Mobile

adb shell adb shell center) adb shell adb shell adb shell adb shell position

setprop debug.oculus.omms.enableStats (true|false) // show or hide the stats setprop debug.oculus.omms.pitch (number) // set the pitch of the perf hud (degrees from setprop debug.oculus.omms.yaw (number) // set the yaw of the perf hud (degrees from center) setprop debug.oculus.omms.distance (number) // set the distance of the perf hud (meters) setprop debug.oculus.omms.scale (number) // set the scale of the perf hud (1,2, or 3) setprop debug.oculus.omms.headLocked (true|false) // whether to head lock the hud or it in space

The app needs to be restarted for changes to take effect. To disable Performance HUD Mode, restart your phone or send the following three adb commands:

adb shell setprop debug.oculus.notifPackage ''
adb shell setprop debug.oculus.notifClass ''
adb shell am force-stop com.oculus.ovrmonitormetricsservice


CPU/GPU Utilization Monitoring

Enable CPU and GPU utilization monitoring with the following property:

adb shell setprop debug.oculus.vrapilayers UtilPoller

Note: GPU utilization is not available on devices with Mali chipsets.

Android Debugging

This document describes utilities, tips, and best practices for improving debugging for any application on Android platforms. Most of these tips apply to both native and Unity applications.

Adb

This guide describes how to perform common tasks using adb. Android Debug Bridge (adb) is included in the Android SDK and is the main tool used to communicate with an Android device for debugging. We recommend familiarizing yourself with it by reading the official documentation located here: http://developer.android.com/tools/help/adb.html

For a list of available commands and options, make a connection as described below and enter:

adb help

Connecting to a Device with adb

Using adb from the OS shell, it is possible to connect to and communicate with an Android device either directly through USB, or via TCP/IP over a Wi-Fi connection. You must install the Android SDK and appropriate device drivers to use adb (see Device Setup - Gear VR on page 9 for more information). To connect a device via USB, plug the device into the PC with a compatible USB cable. After connecting, open up an OS shell and type:

adb devices

If the device is connected properly, adb will show the device id list, such as:

List of devices attached
ce0551e7    device

Adb cannot be used if no device is detected. If your device is not listed, the most likely problem is that you do not have the correct Samsung USB driver; see Device Setup - Gear VR on page 9 for more information. You may also wish to try another USB cable and/or port.

Connecting adb via Wi-Fi

Connecting to a device via USB is generally faster than using a TCP/IP connection, but a TCP/IP connection is sometimes indispensable, especially when debugging behavior that only occurs when the phone is placed in the Gear VR, in which case the USB connection is occupied by the Gear VR jack.


To connect via TCP/IP, first determine the IP address of the device and make sure the device is already connected via USB. You can find the IP address of the device under Settings -> About device -> Status. Then issue the following commands:

adb tcpip <port>
adb connect <ipaddress>:<port>

For example:

> adb tcpip 5555
restarting in TCP mode port: 5555
> adb connect 10.0.32.101:5555
connected to 10.0.32.101:5555

The device may now be disconnected from the USB port. As long as adb devices shows only a single device, all adb commands will be issued for the device via Wi-Fi. To stop using the Wi-Fi connection, issue the following adb command from the OS shell:

adb disconnect

Using adb to Install Applications

To install an APK on your mobile device using adb, connect to the target device and verify connection using adb devices as described above. Then install the APK with the following command:

adb install <apk-path>

Use the -r option to overwrite an existing APK of the same name already installed on the target device. For example:

adb install -r C:\Dev\Android\MyProject\VrApp.apk

For more information, see the Installing an Application section of Android's Android Debug Bridge guide.

Connection Troubleshooting

Note that depending on the particular device, detection may be finicky from time to time. In particular, on some devices, connecting while a VR app is running or when adb is waiting for a device may prevent the device from being reliably detected. In those cases, try ending the app and stopping adb using Ctrl-C before reconnecting the device. Alternatively, the adb service can be stopped using the following command, after which the adb service will be automatically restarted when executing the next command:

adb kill-server

Multiple devices may be attached at once, and this is often valuable for debugging client/server applications. Also, when connected via Wi-Fi and also plugged in via USB, adb will show a single device as two devices. In the multiple device case, adb must be told which device to work with using the -s switch. For example, with two devices connected, the adb devices command might show:

List of devices attached
ce0551e7            device
10.0.32.101:5555    device


The listed devices may be two separate devices, or one device that is connected both via Wi-Fi and plugged into USB (perhaps to charge the battery). In this case, all adb commands must take the form:

adb -s <device-id> <command>

where <device-id> is the id reported by adb devices. So, for example, to issue a logcat command to the device connected via TCP/IP:

adb -s 10.0.32.101:5555 logcat -c

and to issue the same command to the device connected via USB:

adb -s ce0551e7 logcat -c

Logcat

The Android SDK provides the logcat logging utility, which is essential for determining what an application and the Android OS are doing. To use logcat, connect the Android device via USB or Wi-Fi, launch an OS shell, and type:

adb logcat

If the device is connected and detected, the log will immediately begin outputting to the shell. In most cases, this raw output is too verbose to be useful. Logcat solves this by supporting filtering by tags. To see only a specific tag, use:

adb logcat -s <tag>

This example: adb logcat -s VrApp

will show only output with the "VrApp" tag. In the native VrAppFramework code, messages can generally be printed using the LOG() macro. In most source files this is defined to pass a tag specific to that file. Log.h defines a few additional logging macros, but all resolve to calling __android_log_print().

Using Logcat to Determine the Cause of a Crash

Logcat will not necessarily be running when an application crashes. Fortunately, it keeps a buffer of recent output, and in many cases a command can be issued to logcat immediately after a crash to capture the log that includes the backtrace for the crash:

adb logcat > crash.log

Simply issue the above command, give the shell a moment to copy the buffered output to the log file, and then end adb (Ctrl+C in a Windows cmd prompt or OS X Terminal prompt). Then search the log for "backtrace:" to locate the stack trace beginning with the crash. If too much time has elapsed and the log does not show the backtrace, a full dump state of the crash should still exist. Use the following command to redirect the entire dumpstate to a file:

adb shell dumpstate > dumpstate.log


Copying the full dumpstate to a log file usually takes significantly longer than simply capturing the currently buffered logcat log, but it may provide additional information about the crash.

Getting a Better Stack Trace

The backtrace in a logcat capture or dumpstate generally shows the function where the crash occurred, but does not provide line numbering. To get more information about a crash, the Android Native Development Kit (NDK) must be installed. When the NDK is installed, the ndk-stack utility can be used to parse the logcat log or dumpstate for more detailed information about the state of the stack. To use ndk-stack, issue the following:

ndk-stack -sym <path-to-symbols> -dump <log-file> > stack.log

For example, this command:

ndk-stack -sym VrNative\Oculus360Photos\obj\local\armeabi-v7a -dump crash.log > stack.log

uses the symbol information for Oculus 360 Photos to output a more detailed stack trace to a file named stack.log, using the backtrace found in crash.log.

Application Performance Analysis

A guide to performance analysis during mobile VR application development. While this document is geared toward native application development, most of the tools presented here are also useful for improving performance in Android applications developed in Unity or other engines.

Basic Performance Stats through Logcat

A simple way to get some basic performance numbers is to use logcat with a filter for VrApi. Sample usage:

adb logcat -s VrApi

A line resembling this example will be displayed every second:

I/VrApi (26422): FPS=60,Prd=49ms,Tear=0,Early=60,Stale=0,VSnc=1,Lat=1,CPU4/GPU=2/1,1000/350MHz,OC=FF,TA=F0/F0/F0/F0,SP=F/F/N/N,Mem=1026MHz,Free=1714MB,PSM=0,PLS=0,Temp=31.5C/27.

FPS: Frames Per Second. An application that performs well will continuously display 59-60 FPS.

Prd: The number of milliseconds between the latest sensor sampling for tracking and the anticipated display time of new eye images. For a single-threaded application this time will normally be 33 milliseconds. For an application with a separate renderer thread (like Unity) this time will be 49 milliseconds. New eye images are not generated for the time the rendering code is executed, but are instead generated for the time they will be displayed. When an application begins generating new eye images, the time they will be displayed is predicted. The tracking state (head orientation, et cetera) is also predicted ahead for this time.

Tears: The number of tears per second. A well-behaved and well-performing application will display zero tears. Tears can be related to Asynchronous TimeWarp, which takes the last completed eye images and warps them onto the display. The time warp runs on a high-priority thread using a high-priority OpenGL ES context. As such, the time warp should be able to preempt any application rendering and warp the latest eye images onto the display just in time for the display refresh. However, when there are a lot of heavyweight background operations, or the application renders many triangles to a small part of the screen, or uses a very expensive fragment program, then the time warp may have to wait for this work to be completed. This may result in the time warp not executing in time for the display refresh, which, in turn, may result in a visible tear line across one or both eyes.

Early: The number of frames that are completed a whole frame early.

Stale: The number of stale frames per second. A well-behaved application performing well displays zero stale frames. New eye images are generated for a predicted display time. If, however, the new eye images are not completed by this time, the time warp may have to re-project and display a previous set of eye images; in other words, the time warp displays a stale frame. Even though the time warp re-projects old eye images to make them line up with the latest head orientation, the user may still notice some degree of intra-frame motion judder when old images are displayed.

Vsnc: The value of MinimumVsyncs, which is the number of V-syncs between displayed frames. This value directly controls the frame rate. For instance, MinimumVsyncs = 1 results in 60 FPS and MinimumVsyncs = 2 results in 30 FPS.

CPU4/GPU: The CPU and GPU clock levels and associated clock frequencies, set by the application. Lower clock levels result in less heat and less battery drain.

F/F [Thread Affinity]: This field describes the thread affinity of the main thread (first hex nibble) and renderer thread (second hex nibble). Each bit represents a core, with 1 indicating affinity and 0 indicating no affinity. For example, F/1 (= 1111 0001) indicates the main thread can run on any of the lower four cores, while the renderer thread can only run on the first core. In practice, F/F and F0/F0 are common results.

Free: The amount of available memory, displayed every second. It is important to keep a reasonable amount of memory available to prevent Android from killing backgrounded applications, like Oculus Home.

PLS: Power Level State, where "0" = normal, "1" = throttled, and "2" = undock required.

Temp: Temperature in degrees Celsius. Well-optimized applications do not cause the temperature to rise quickly. There is always room for more optimization, which allows lower clock levels to be used, which, in turn, reduces the amount of heat generated.

SysTrace
SysTrace is the profiling tool that comes with the Android Developer Tools (ADT) Bundle. SysTrace can record detailed logs of system activity that can be viewed in the Google Chrome browser. With SysTrace, it is possible to see an overview of what the entire system is doing, rather than just a single app. This can be invaluable for resolving scheduling conflicts or finding out exactly why an app isn't performing as expected.

Under Windows: the simplest method for using SysTrace is to run the monitor.bat file that was installed with the ADT Bundle. This can be found in the ADT Bundle installation folder (e.g., C:\android\adt-bundle-windows-x86_64-20131030) under the sdk/tools folder. Double-click monitor.bat to start Android Debug Monitor.
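SysTrace can also be driven from the command line; a minimal sketch, assuming the systrace.py script in the SDK's tools/systrace folder (script location and trace categories may vary by SDK version):

python systrace.py --time=10 -o mytrace.html sched gfx view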


Select the desired device in the left-hand column and toggle Systrace logging via the Systrace icon in the Android Debug Monitor toolbar. A dialog will appear enabling selection of the output .html file for the trace. Once the trace is toggled off, the trace file can be viewed by opening it in Google Chrome. You can use the WASD keys to pan and zoom while navigating the HTML doc. For additional keyboard shortcuts, please refer to the following documentation: http://developer.android.com/tools/help/systrace.html

NDK Profiler

Use NDK Profiler to generate gprof-compatible profile information. The Android NDK profiler is a port of gprof for Android. The latest version, which has been tested with this release, is 3.2 and can be downloaded from the following location: https://code.google.com/p/android-ndk-profiler/


Once downloaded, unzip the package contents to your NDK sources path, e.g.: C:\Dev\Android\android-ndk-r9c\sources. Add the NDK pre-built tools to your PATH, e.g.: C:\Dev\Android\android-ndk-r9c\toolchains\arm-linux-androideabi-4.8\prebuilt\windows-x86_64\bin.

Android Makefile Modifications
1. Compile with profiling information and define NDK_PROFILE:
   LOCAL_CFLAGS := -pg -DNDK_PROFILE

2. Link with the ndk-profiler library:
   LOCAL_STATIC_LIBRARIES := android-ndk-profiler

3. Import the android-ndk-profiler module:
   $(call import-module,android-ndk-profiler)

Source Code Modifications
Add calls to monstartup and moncleanup to your Init and Shutdown functions. Do not call monstartup or moncleanup more than once during the lifetime of your app.

#if defined( NDK_PROFILE )
extern "C" void monstartup( char const * );
extern "C" void moncleanup();
#endif

extern "C"
{
void Java_oculus_VrActivity2_nativeSetAppInterface( JNIEnv * jni, jclass clazz )
{
#if defined( NDK_PROFILE )
    setenv( "CPUPROFILE_FREQUENCY", "500", 1 );  // interrupts per second, default 100
    monstartup( "libovrapp.so" );
#endif
    app->Init();
}

void Java_oculus_VrActivity2_nativeShutdown( JNIEnv * jni )
{
    app->Shutdown();
#if defined( NDK_PROFILE )
    moncleanup();
#endif
}
} // extern "C"

Manifest File Changes
You will need to add permission for your app to write to the SD card. The gprof output file is saved in /sdcard/gmon.out.
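A minimal sketch of what this typically looks like in AndroidManifest.xml, using the standard Android storage permission (exact requirements may vary by target OS version):

<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />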


Profiling your App
To generate profiling data, run your app and trigger the moncleanup function call by pressing the Back button on your phone. Depending on the state of your app, this will be triggered by OnStop() or OnDestroy(). Once moncleanup has been triggered, the profiling data will be written to your Android device at /sdcard/gmon.out.

Copy the gmon.out file to the folder where your project is located on your PC using the following command:

adb pull /sdcard/gmon.out

To view the profile information, run the gprof tool, passing it the non-stripped library, e.g.:

arm-linux-androideabi-gprof obj/local/armeabi/libovrapp.so

For information on interpreting the gprof profile information, see: http://sourceware.org/binutils/docs/gprof/Output.html

Snapdragon Profiler

The Qualcomm Snapdragon Profiler allows developers to analyze performance on Android devices with Snapdragon processors over USB, including CPU, GPU, memory, power, and thermal performance. Samsung Snapdragon phones running VrDriver 1.5.3 and later auto-detect when the Snapdragon Profiler is running and configure themselves for best capture.

Basic Usage
1. Download and install the Snapdragon Profiler from Qualcomm.
2. Attach a Snapdragon-based S7 with Android M or N via USB, with VR developer mode enabled (see the Device Setup instructions).
3. Run the Snapdragon Profiler. It is important to do this before starting the app you want to profile.
4. Select Connect to a Device. If you do this right after starting, you may need to wait a few seconds for the phone icon to turn green (driver files are being transferred).
5. Click Connect.
6. Run the VR app that you want to profile. Note that you will see poor performance while profiling because of the capture-related configuration; this will not affect your session results.
7. Select either trace or snapshot capture mode.
8. In the Data Sources panel, select the app name. Note that only applications with the OpenGL requirement set in the manifest will show up. If the application has the required manifest setting but does not appear, try rebooting the phone and restarting the Snapdragon Profiler.
9. For traces, enable OpenGL ES -> Rendering Stages for the most useful information, then click Start Capture, wait a second or two, and click Stop Capture.
10. For snapshots, you can capture a frame of commands without any extra options checked. The capture process can take tens of seconds if there is a lot of texture data to transfer.
11. We recommend shutting down and restarting the Snapdragon Profiler between sets of tests.
12. Quit the Snapdragon Profiler before unplugging your phone so that it can clean up. Do not forget this step!

Native Debugging
This guide provides basic recommendations for working with the Oculus Mobile SDK in Android Studio and Gradle, and is intended to supplement the relevant Android Studio documentation.


Native Debugging with Android Studio

This section introduces debugging our sample native apps in Android Studio.

Note: Native support in Android Studio is still under development. Some developers have reported problems using run-as on the Note 4 with Lollipop (5.0.x) and the S6 with 5.0.0, which may cause issues with debugging. If you have problems debugging, try updating to the latest system software, and please let us know on the Oculus Forums.

The default configurations created during project import only support Java debugging. Select Edit Configurations… in the Configurations drop-down menu in the Android Studio toolbar.

Create a new Android Native configuration as shown below:

In the General tab of the Run/Debug Configuration dialog box, assign your configuration a name, select the target module, and select the target device mode:


In the Native tab of the Run/Debug Configuration dialog box, add symbol paths:


Note that ndk-build places stripped libraries inside the libs/ directory. You must point the symbol search paths at the obj/local/ directory. This is also not a recursive search path, so you must put the full path to the obj/local/armeabi-v7a directory.



Native Debugging with ndk-gdb

This guide provides basic recommendations for using ndk-gdb to debug native mobile VR projects, and is intended to supplement the relevant Android Studio documentation. The Android NDK includes a powerful debugging tool called ndk-gdb, a small shell script wrapped around GDB. Using ndk-gdb from the command line adds convenient features to your debugging workflow by allowing you, for example, to add breakpoints, step through code, and inspect variables.
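Before using any of the commands below, the debugger must be attached. ndk-gdb is typically launched from the project's root directory; its standard --start flag launches the app's main activity before attaching:

ndk-gdb --start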

Create Breakpoints

Break On Function
(gdb) break SomeFunctionName()
or
(gdb) break SomeClass::SomeMethod()

Example usage:
(gdb) break OVR::VrCubeWorld::Frame(OVR::VrFrame const&)
Breakpoint 2 at 0xf3f56118: file jni/../../../Src/VrCubeWorld_Framework.cpp, line 292.

Note: GDB supports tab completion for symbol names.

Break On File:Line
(gdb) break SomeFile.cpp:256

Conditional Breakpoints
Add an if condition to the end of your break command. Example usage:
(gdb) break OVR::VrCubeWorld::Frame(OVR::VrFrame const&) if vrFrame.PredictedDisplayTimeInSeconds > 24250.0
Breakpoint 6 at 0xf3f58118: file jni/../../../Src/VrCubeWorld_Framework.cpp, line 292.


Break At Current Execution
When an application is actively running, press control-c to break immediately and bring up the gdb prompt.

Stepping

Step Over
(gdb) next
or
(gdb) n

Step Into
(gdb) step
or
(gdb) s

Continue Execution
(gdb) continue
or
(gdb) c

Printing

Print Struct
You may optionally enable pretty-printing mode to add new lines between struct elements:
(gdb) set print pretty on

To print the struct:
(gdb) print SomeStructVariable

Example usage:
(gdb) print currentRotation
$1 = {
  x = 23185.9961,
  y = 23185.9961,
  z = 0,
  static ZERO = {
    x = 0,
    y = 0,
    z = 0,
    static ZERO =
  }
}

printf
Example usage:
(gdb) printf "x = %f\n", currentRotation.x
x = 23185.996094

Breakpoint Commands
GDB breakpoints may be set to automatically execute arbitrary commands upon being hit. This feature is useful for inserting print statements into your code without recompiling, or even for modifying data at key points in your program without halting execution entirely. It is controlled by the commands command.


You may specify the breakpoint number with an optional single argument; if omitted, the final breakpoint is used. You are then presented with a series of lines beginning with > where you may enter GDB commands (one per line) that will be executed upon hitting the breakpoint, until the command sequence is terminated with end.

In the following example, we create a breakpoint that, upon being hit, automatically prints the value of a local variable and then continues execution:

(gdb) break OVR::VrCubeWorld::Frame(OVR::VrFrame const&)
Breakpoint 1 at 0xf3f56118: file jni/../../../Src/VrCubeWorld_Framework.cpp, line 292.
(gdb) commands
Type commands for breakpoint(s) 1, one per line.
End with a line saying just "end".
>silent
>printf "time = %f\n", vrFrame.PredictedDisplayTimeInSeconds
>continue
>end

Text User Interface (TUI)
To enable or disable the TUI, press CTRL+x followed by CTRL+a. For more information on TUI keyboard shortcuts, see https://sourceware.org/gdb/onlinedocs/gdb/TUI-Keys.html.


Mobile Native SDK Migration Guide
This section details migrating from earlier versions of the Mobile SDK for native development.

Migrating to Mobile SDK 1.14.0
This section is intended to help you upgrade from Oculus Mobile SDK version 1.12.0 to 1.14.0.

Overview
This release adds support for the Samsung Galaxy S9 smartphone and a new gamepad input API.

VrApi Changes
• vrapi_Initialize can now return a new error code on failure: VRAPI_INITIALIZE_ALREADY_INITIALIZED.
• Added Samsung S9 device types to the API.
• VRAPI_FRAME_FLAG_INHIBIT_SRGB_FRAMEBUFFER has been deprecated in favor of the per-layer flag VRAPI_FRAME_LAYER_FLAG_INHIBIT_SRGB_FRAMEBUFFER.
• The input API now exposes and enumerates gamepads.
• vrapi_ReturnToHome has been removed.
• vrapi_ShowSystemUIWithExtra is marked deprecated and should no longer be used.
• The VRAPI_REORIENT_HMD_ON_CONTROLLER_RECENTER property allows apps to opt into a behavior that combines the controller recenter action with reorienting the headset. To enable it, set the property to 1 using vrapi_SetPropertyInt, as shown below. This feature is disabled by default.
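For example, opting into the combined recenter/reorient behavior uses the same property-setting pattern as other VrApi properties (a sketch, assuming a valid ovrJava struct named Java):

vrapi_SetPropertyInt( &Java, VRAPI_REORIENT_HMD_ON_CONTROLLER_RECENTER, 1 );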

Gamepad Input API
The VrApi input API now exposes gamepads in addition to the tracked remotes and the headset.
• The new enumeration for the gamepad is ovrControllerType_Gamepad.
• The capabilities struct is ovrInputGamepadCapabilities.
• The input state struct is ovrInputStateGamepad.
• The queries are handled in the same way as for the tracked remote and the headset, using:
  • vrapi_EnumerateInputDevices
  • vrapi_GetInputDeviceCapabilities
  • vrapi_GetCurrentInputState

Example:

for ( int i = 0; ; i++ )
{
    ovrInputCapabilityHeader cap;
    ovrResult result = vrapi_EnumerateInputDevices( app->Ovr, i, &cap );
    if ( result < 0 )
    {
        break;
    }
    if ( cap.Type == ovrControllerType_Gamepad )
    {
        ovrInputStateGamepad gamepadState;
        gamepadState.Header.ControllerType = ovrControllerType_Gamepad;
        result = vrapi_GetCurrentInputState( app->Ovr, i, &gamepadState.Header );
        if ( result == ovrSuccess )
        {
            backButtonDownThisFrame |= gamepadState.Buttons & ovrButton_Back;
            aButtonDownThisFrame |= gamepadState.Buttons & ovrButton_A;
            rightTriggerValueThisFrame = gamepadState.RightTrigger;
            leftJoystickValueThisFrame = gamepadState.LeftJoyStick;
        }
        break;
    }
}

Migrating to Mobile SDK 1.12.0
This section is intended to help you upgrade from Oculus Mobile SDK version 1.9.0 to 1.12.0.

Overview
This release provides support for the Oculus Go and Samsung Galaxy A8/A8+ (2018) smartphones.

Build System Changes
The following build tool versions have changed:
• Android NDK r16b
• Gradle 4.3.1
• Android Plugin for Gradle 3.0.1
• Android SDK Build-Tools 26.0.2

Manifest requirements have been updated to account for new Android OS versions; please see Android Manifest Settings on page 25 for more information.

VrApi Changes
• Added a mechanism to specify the foveation level for the eye-buffer SwapChain textures.
• Added Oculus Go device types to the API.
• Added Samsung A-series (2018) device types to the API.
• Added a new ovrModeFlags flag, VRAPI_MODE_FLAG_CREATE_CONTEXT_NO_ERROR, to support applications which want to create a no-error GL context.
• VRAPI_TEXTURE_SWAPCHAIN_FULL_MIP_CHAIN has been removed. Applications now need to explicitly pass in the number of mip levels on SwapChain creation.
• Controllers are now affected by the application-specified tracking transform.
• The SwapChain represented by VRAPI_DEFAULT_TEXTURE_SWAPCHAIN now defaults to white instead of black, to support solid-color frames of more than just black. The application layer's ColorScale parameter determines the solid color used.
• The ovrMobile structure will now always be freed on vrapi_LeaveVrMode.
• Applications are now required to pass explicit EGL objects (Display, ShareContext, NativeWindow) to vrapi_EnterVrMode, otherwise the call will fail.
• VRAPI_SYS_PROP_BACK_BUTTON_DOUBLETAP_TIME has been removed. If an application implements double-tap logic, it can still detect a double-tap by checking whether the time between presses is less than VRAPI_SYS_PROP_BACK_BUTTON_SHORTPRESS_TIME (see the sketch after this list).


Foveation
The new foveation API allows the application to adjust the multi-resolution level for the eye-texture SwapChain. This value may be specified using the following API call:

vrapi_SetPropertyInt( &Java, VRAPI_FOVEATION_LEVEL, level );

Where:
• level = 0 disables multi-resolution
• level = 1 is the low setting (good image quality, smaller performance gains)
• level = 2 is the medium setting
• level = 3 is the maximum setting

Currently, this is only available for the Oculus Go developer kits. More information can be found in the High Fidelity OC4 talk (https://youtu.be/pjg309WSzlM?t=1677).

LibOVRKernel Changes
OVR_Geometry source files have been moved to VrAppFramework.

VrAppFramework Changes
OVR_Geometry source files have been added to VrAppFramework. LoadTextureBuffer now generates mipmaps after loading png / jpg / bmp files. Double-tap back-button logic has been removed.

Migrating to Mobile SDK 1.9.0
This section is intended to help you upgrade from Oculus Mobile SDK version 1.7.0 to 1.9.0.

Overview
This release provides support for a new frame submission path which allows for new layer types such as Cylinder, Cube, and Equirect; it enables you to specify a user-defined tracking transform; and it adds support for a new performance API.

Build System Changes
• We recommend NDK version r15c. With NDK r15, app_dummy is now deprecated in favor of force-exporting ANativeActivity_onCreate.
• Pre-built libraries are no longer provided by default for the VrAppFramework and VrAppSupport libraries. If your application build files rely on the pre-built libraries, you will need to change your build files to reference the library's build path as Projects/Android instead of Projects/AndroidPrebuilt.

VrApi Changes
• This version adds a new entry point, vrapi_SubmitFrame2, which adds support for flexible layer lists, new layer types, and a single fence to signal completion for the frame. This new frame submission API no longer takes performance parameters as input; therefore, the new performance API should be used when moving to this path.


• New entry points were added for specifying performance parameters:
  • vrapi_SetClockLevels
  • vrapi_SetPerfThread
  • vrapi_SetExtraLatencyMode
• ovrHeadModelParms and corresponding helper functions were removed from the API. Applications should no longer apply the head model to the tracking head pose, as this is done by the VrApi runtime.
• New entry points for specifying a user-defined tracking transform have been added. The default tracking transform is Eye-Level.
• vrapi_RecenterInputPose has been marked deprecated and should not be used.

• VRAPI_SYS_STATUS_HEADPHONES_PLUGGED_IN is no longer provided on the API. For an example of how to query the headphone plugged state, see VrFrameBuilder in the native SDK library, VrAppFramework.
• The default swapchain provided by VrApi has been renamed to VRAPI_DEFAULT_TEXTURE_SWAPCHAIN. Note that if you want an application to create a frame of solid black, the ColorScale parameter on the layer must be set to { 0.0f, 0.0f, 0.0f, 0.0f } to prevent any compatibility issues with newer runtimes which expect the ColorScale to be set.

Frame Submission Updates
The new frame submission API adds support for flexible layer lists and now requires only one completion fence to represent the entire frame. Performance parameters are no longer specified through this API, and should instead be specified through the new performance API (see below).

The following is an example of how to construct frame submission with the new API for the simple case of basic eye buffer rendering:

ovrLayerProjection2 & worldLayer = Layers[LayerCount++].Projection;
worldLayer = vrapi_DefaultLayerProjection2();
worldLayer.HeadPose = Tracking.HeadPose;
for ( int eye = 0; eye < VRAPI_FRAME_LAYER_EYE_MAX; eye++ )
{
    worldLayer.Textures[eye].ColorSwapChain = ColorTextureSwapChain[eye];
    worldLayer.Textures[eye].SwapChainIndex = TextureSwapChainIndex;
    worldLayer.Textures[eye].TexCoordsFromTanAngles = TexCoordsFromTanAngles;
}

ovrLayerHeader2 * LayerHeaderList[ovrMaxLayerCount] = {};
for ( int i = 0; i < LayerCount; i++ )
{
    LayerHeaderList[i] = &Layers[i].Header;
}

ovrSubmitFrameDescription2 frameDesc = {};
frameDesc.Flags = 0;
frameDesc.SwapInterval = 1;
frameDesc.FrameIndex = FrameIndex;
frameDesc.CompletionFence = (size_t)Fence->Sync;
frameDesc.DisplayTime = PredictedDisplayTime;
frameDesc.LayerCount = LayerCount;
frameDesc.Layers = LayerHeaderList;

vrapi_SubmitFrame2( OvrMobile, &frameDesc );

Performance API
Performance parameters such as clock levels, application high-performance threads, and extra latency mode are now specified independently of frame submission via the following performance API:

vrapi_SetClockLevels( ovr, PerformanceParms.CpuLevel, PerformanceParms.GpuLevel );
vrapi_SetPerfThread( ovr, VRAPI_PERF_THREAD_TYPE_MAIN, PerformanceParms.MainThreadTid );
vrapi_SetPerfThread( ovr, VRAPI_PERF_THREAD_TYPE_RENDERER, PerformanceParms.RenderThreadTid );
vrapi_SetExtraLatencyMode( ovr, VRAPI_EXTRA_LATENCY_MODE_OFF );

Note that these entry points take an ovrMobile pointer and therefore should only be called while in VR mode, i.e., between vrapi_EnterVrMode and vrapi_LeaveVrMode. The performance parameters provided by the application will take effect on the next call to vrapi_SubmitFrame(2).

Tracking Transform API
The new tracking transform API allows applications to specify the space in which tracking poses are reported. The default tracking transform is Eye-Level. To change the transform space, the application may call the following:

vrapi_SetTrackingTransform( ovr, vrapi_GetTrackingTransform( ovr, VRAPI_TRACKING_TRANSFORM_SYSTEM_CENTER_FLOOR_LEVEL ) );

Head Model Updates
The head model is now applied internally in the VrApi runtime, and ovrHeadModelParms and the corresponding helper functions are no longer present on the API. Applications migrating from SDK 1.7.0 or earlier should no longer apply their own head model. Helper functions for deriving IPD and eye height from the ovrTracking2 data have been added to VrApi_Helpers.h.

If you previously built an application with SDK 1.7.0 and your app assumes 1) that poses will have the head model applied, and 2) that the y-position is relative to the floor, then update your application to issue the following call immediately after vrapi_EnterVrMode() to get the expected behavior:

vrapi_SetTrackingTransform( ovr, vrapi_GetTrackingTransform( ovr, VRAPI_TRACKING_TRANSFORM_SYSTEM_CENTER_FLOOR_LEVEL ) );

VrAppFramework Changes
VrAppFramework has been ported to work with the new frame submission path, performance API, and user-defined tracking transform API.

Migrating to Mobile SDK 1.7.0
This section is intended to help you upgrade from Oculus Mobile SDK version 1.5.0 to 1.7.0.

Overview
Mobile SDK 1.7.0 provides a new VrApi interface method for obtaining predicted tracking information along with the corresponding view and projection matrices, as well as build system improvements and native debugging support with externalNativeBuild.

Build System Changes
• The SDK now uses externalNativeBuild for ndkBuild instead of the deprecated ndkCompile path. externalNativeBuild provides a more robust native debugging mechanism that works out of the box and allows stepping seamlessly between Java and native code. Ensure you set the ANDROID_NDK_HOME environment variable to your NDK path location.
• Android KitKat support has been deprecated. Both minSdkVersion and compileSdkVersion must now be 21 (Lollipop 5.0) or later.
• The build tools have been updated to the following versions:


  • Android SDK Build Tools Revision 25.0.1
  • Android Plugin for Gradle 2.3.2
  • Gradle 3.3
• We recommend NDK version r14b.

VrApi Changes
• A new entry point, vrapi_GetPredictedTracking2(), was added for querying the predicted tracking info along with the corresponding view and projection matrices for each eye (see the sketch at the end of this section).
• The default head model is now applied in vrapi_GetPredictedTracking() and vrapi_GetPredictedTracking2() for apps targeting SDK 1.7.0 and higher. Apps should no longer apply the head model themselves.
• For apps targeting SDK 1.7.0 and higher, the head pose Y translation will now include eye height above floor.
• The vrapi_GetCenterEye*() helper functions have been removed and replaced with vrapi_Get*FromPose() helper functions to remove the notion of a 'center eye'.
• VrApi_LocalPrefs.h has been removed. Applications can use Android system properties for any development debug needs.
• VRAPI_FRAME_LAYER_FLAG_WRITE_ALPHA and DST_ALPHA layer blend modes have been deprecated.
• vrapi_Initialize now returns an error code when the Oculus System Driver is not found on the device instead of forcing the app to exit(0).

LibOVRKernel Changes
Deprecated OVR_Math constants have been removed. See OVR_Math.h for the equivalent replacements.

VrAppFramework Changes
VrAppFramework now queries the predicted tracking state using the new vrapi_GetPredictedTracking2 method and, as such, no longer explicitly applies the head model or manages head model parameters. The following methods are therefore no longer provided:
• const ovrHeadModelParms & GetHeadModelParms() const;
• void SetHeadModelParms( const ovrHeadModelParms & parms );

VrModel Changes
• ModelFile now has additional data structures that describe a full scene of data, instead of just a single ModelDef with a list of draw surfaces.
• LoadModelFile(...) now returns nullptr if the system failed to load the file.
• OvrSceneView.GetWorldModel(...) now returns a pointer instead of a reference.
• The correct way to iterate over all the draw surfaces in a model file is now:

for ( int i = 0; i < modelFile->Models.GetSizeI(); i++ )
{
    for ( int j = 0; j < modelFile->Models[i].surfaces.GetSizeI(); j++ )
    {
        // work
    }
}

• ModelState.modelMatrix is now private and must be accessed through the Get and Set commands.
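A sketch of the vrapi_GetPredictedTracking2() query described under VrApi Changes above, assuming a valid ovrMobile pointer (ovr) and a predicted display time obtained from the frame loop:

// Query the predicted tracking state along with per-eye view and
// projection matrices in a single call.
const ovrTracking2 tracking = vrapi_GetPredictedTracking2( ovr, predictedDisplayTime );
for ( int eye = 0; eye < VRAPI_FRAME_LAYER_EYE_MAX; eye++ )
{
    const ovrMatrix4f viewMatrix = tracking.Eye[eye].ViewMatrix;
    const ovrMatrix4f projectionMatrix = tracking.Eye[eye].ProjectionMatrix;
    // Render the eye using these matrices...
}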


Migrating to Mobile SDK 1.5.0
This section is intended to help you upgrade from Oculus Mobile SDK version 1.0.4 to 1.5.0.

Overview
Mobile SDK 1.5.0 provides:
• VrApi Input API for the Gear VR Controller and Gear VR headset
• Experimental 64-bit SDK libraries
• Removal of the deprecated DrawEyeView render path
• Removal of deprecated VrApi layer types
• Removal of deprecated VrApi frame flags

Build System Changes
We now recommend NDK version r13b. We now provide experimental 64-bit SDK libraries at the following library path: Libs/Android/arm64-v8a/

VrApi Changes
VrApi now provides a new input API which includes support for the Gear VR Controller and headset. See VrApi Input API on page 34 and VrApi/Include/VrApi_Input.h. For example usage, see VrSamples/Native/VrController/ and the sketch at the end of this section.

The deprecated ovrFrameLayerType types are now removed. Explicit indices should be used to index the layer list instead.

The deprecated TimeWarp debug graph ovrFrameFlags types are now removed. OvrMonitor should be used for performance analysis instead. For more information, see Oculus Remote Monitor on page 64.

LibOVRKernel Changes
OVR_GlUtils files have been moved to VrAppFramework.

VrAppFramework Changes
OVR_GlUtils files have been added to VrAppFramework. VrAppFramework no longer overrides application-specified frame parameters: FrameIndex, MinimumVsyncs, PerformanceParms.

Note: If your application was previously relying on this behavior, make sure to add the following to your ovrFrameParms per-frame setup:

frameParms.FrameIndex = vrFrame.FrameNumber;
frameParms.MinimumVsyncs = app->GetMinimumVsyncs();
frameParms.PerformanceParms = app->GetPerformanceParms();

VrAppFramework no longer provides the deprecated DrawEyeView render path. See "Restructuring App Rendering For Multi-view" in the 1.0.3 SDK migration section for more information on how to restructure your application to use the multi-view compatible render path. The VrSamples included with the SDK also provide examples of this render path.
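A sketch of the new input API mentioned above, enumerating devices to find a Gear VR Controller (tracked remote). The query loop mirrors the gamepad example in the 1.14.0 notes; the full pattern lives in VrApi_Input.h and the VrController sample:

// Enumerate input devices and identify a Gear VR Controller.
for ( uint32_t deviceIndex = 0; ; deviceIndex++ )
{
    ovrInputCapabilityHeader capsHeader;
    if ( vrapi_EnumerateInputDevices( ovr, deviceIndex, &capsHeader ) < 0 )
    {
        break;  // no more devices
    }
    if ( capsHeader.Type == ovrControllerType_TrackedRemote )
    {
        ovrInputTrackedRemoteCapabilities remoteCaps;
        remoteCaps.Header = capsHeader;
        vrapi_GetInputDeviceCapabilities( ovr, &remoteCaps.Header );
        // remoteCaps now describes the controller's buttons and trackpad.
    }
}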


Migrating to Mobile SDK 1.0.4
This section is intended to help you upgrade from Oculus Mobile SDK version 1.0.3 to 1.0.4.

Overview
Mobile SDK 1.0.4 provides:
• The System Utilities library dependency is removed, and its functionality is handled directly within VrApi.
• Long-press back button handling (including gaze timer rendering) is now detected and handled directly within VrApi. Applications should no longer implement this logic.
• The VrApi loader is now Java-free.
• The VrApi implementation is now distributed through the Oculus System Driver application.

Build System Changes
The SDK now defaults to building with clang, as GCC is no longer supported starting with NDK r13 and higher. See the NDK Release Notes rev 13 for more information. We now recommend NDK version r12b.

SDK libraries no longer provide jar file equivalents. The aar files should be used instead. The SDK now provides both debug and release versions of the SDK libraries. Note the new paths to these libraries:
• Libs/Android/armeabi-v7a/Release/
• Libs/Android/aar/Release/

VrApi Changes
VrApi now handles long-press back button detection and the logic to show the Universal Menu, as well as the gaze timer rendering on back button down.

VrApi now handles System Utilities functionality behind the scenes. A new API has been provided for interacting with system menus; see VrApi_SystemUtils.h.

vrapi_Initialize now handles the display of fatal error cases internally. The application should only need to check for failure cases on return and handle them appropriately, i.e.:

int32_t initResult = vrapi_Initialize( &initParms );
if ( initResult != VRAPI_INITIALIZE_SUCCESS )
{
    // As of 1.0.4, vrapi directly triggers the display of fatal errors in vrapi_Initialize.
    vrapi_Shutdown();
    exit( 0 );
}

ovrFrameLayerType types are now deprecated. Explicit indices should be used to index the layer list instead.

The TimeWarp debug graph has been removed. OvrMonitor should be used for performance analysis instead.

The VrApi loader is now Java-free. Build files should no longer specify VrApi-Loader.aar as a dependency or VrApi:Projects:AndroidPrebuilt as a dependency path.

VrAppFramework Changes
Input.h has been renamed to OVR_Input.h.


Calls to StartSystemActivity( PUI_XXX ) should be replaced with ShowSystemUI( VRAPI_SYS_UI_XXX ).

Apps should remove back button long-press handling, as this is now handled directly by VrApi. The following input key event is no longer provided: KEY_EVENT_LONG_PRESS.

VrAppSupport Changes

SystemUtils
The System Utils library has been removed and all of its functionality subsumed by VrApi. Build files should no longer specify systemutils.a or SystemUtils.aar as build dependencies. Any references to "SystemActivities.h" should be replaced with "VrApi_SystemUtils.h".

Values for BACK_BUTTON_DOUBLE_TAP_TIME_IN_SECONDS and BACK_BUTTON_SHORT_PRESS_TIME_IN_SECONDS are now exposed on the VrApi system property interface and can be queried as:

float doubleTapTimeInSeconds = vrapi_GetSystemPropertyFloat( &app->Java, VRAPI_SYS_PROP_BACK_BUTTON_DOUBLETAP_TIME );
float shortPressTimeInSeconds = vrapi_GetSystemPropertyFloat( &app->Java, VRAPI_SYS_PROP_BACK_BUTTON_SHORTPRESS_TIME );

Applications are no longer responsible for managing the SystemUtils app events and should remove the following function calls:
• SystemActivities_Init
• SystemActivities_Shutdown
• SystemActivities_Update
• SystemActivities_PostUpdate

Note: Recenter-on-mount is now handled directly within VrApi. Apps do not need to implement this logic any longer. If an application needs to detect that a recenter has occurred, it can now do so by checking a recenter count through VrApi, e.g.:

const int currentRecenterCount = vrapi_GetSystemStatusInt( &Java, VRAPI_SYS_STATUS_RECENTER_COUNT );
// Determine if the recenter count has changed since last frame.
if ( currentRecenterCount != RecenterCount )
{
    // Reset menu orientations.
    RecenterCount = currentRecenterCount;
}

SystemActivities_SendIntent and SystemActivities_SendLaunchIntent are no longer provided. Instead, the versions provided by VrAppFramework can be used: SendIntent and SendLaunchIntent.

SystemActivities_ReturnToHome is no longer provided. vrapi_ReturnToHome should be used instead.

VrGui
The gaze timer has been removed, as it is now handled directly by VrApi. ShowInfoText and DebugFont are now provided through VrGui.


Migrating to Mobile SDK 1.0.3
This section is intended to help you upgrade from Oculus Mobile SDK version 1.0.0 to 1.0.3.

Overview
Mobile SDK 1.0.3 provides support for multi-view rendering. In order for your application to take advantage of multi-view, you will need to restructure your application rendering to return a list of render surfaces from Frame instead of relying on the DrawEyeView path. Information on how to set up your application rendering for multi-view can be found in "Restructuring App Rendering For Multi-view" below.

Note: The DrawEyeView render path is deprecated and will be removed in a future SDK update.

To make the VrApi more explicit and to make integration with heavily threaded engines easier, the EGL objects and ANativeWindow used by the VrApi can now be explicitly passed through the API. The VrAppInterface has been refactored to simplify the interface, support multi-view rendering, and enforce per-frame determinism. Build steps have been moved from the Python build script into Gradle.

The following sections provide guidelines for updating your native project to the 1.0.3 Mobile SDK. The native SDK samples which ship with the SDK are also a good reference for reviewing required changes (see VrSamples/Native).

Build System Changes
The SDK now builds using gcc 4.9. For help porting your own code to gcc 4.9, refer to https://gcc.gnu.org/gcc-4.9/porting_to.html.

The SDK now builds using Android SDK Build Tools Revision 23.0.1. All Gradle files specifying buildToolsVersion 22.0.1 should be revised to buildToolsVersion 23.0.1.

Applications that build using build.py (including all projects in VrSamples/Native) will need their build configurations edited to include VrApp.gradle:
1. In the project-level build.gradle (e.g., VrCubeWorld_NativeActivity/Projects/Android/build.gradle), add the following line to the top of the file:
   apply from: "${rootProject.projectDir}/VrApp.gradle"
2. Delete the brace-enclosed sections beginning with project.afterEvaluate and android.applicationVariants.all.

Make the following changes to all project-level build.gradle files:
1. In the brace-enclosed section marked android, add the following line:
   project.archivesBaseName = ""

2. Make sure the brace-enclosed section marked android also contains the following:
   defaultConfig {
       applicationId ""
   }

3. In the brace-enclosed section marked dependencies, replace the line:
   compile name: 'VrAppFramework', ext: 'aar'
   with the line:
   compile project(':VrAppFramework:Projects:AndroidPrebuilt')
   and add the following path to your settings.gradle include list:
   'VrAppFramework:Projects:AndroidPrebuilt'

See the VrTemplate/ gradle files for an example.

VrApi Changes
VrApi now displays volume change overlays automatically. This behavior may be overridden if necessary by setting VRAPI_FRAME_FLAG_INHIBIT_VOLUME_LAYER as an ovrFrameParm flag.

Note: When using the VrApi volume notifier, make sure that you do not render your own!

The ANativeWindow used by the VrApi can now be explicitly passed through the API by specifying the following ovrModeParm flag:

parms.Flags |= VRAPI_MODE_FLAG_NATIVE_WINDOW;

Note: While the VrApi currently continues to support the old behavior of not passing EGL objects explicitly, with its threading restrictions, this functionality will be removed in a future version of VrApi.

ovrModeParms AllowPowerSave is now specified as an ovrModeParm flag:

parms.Flags |= VRAPI_MODE_FLAG_ALLOW_POWER_SAVE;

ovrFrameLayer ProgramParms are now specified explicitly.

LibOVRKernel Changes
ovrPose member names have changed:
• Orientation -> Rotation
• Position -> Translation
Size member names have changed:
• Width -> w
• Height -> h
The Math constants are deprecated in favor of the new MATH_FLOAT_* and MATH_DOUBLE_* defines.

VrAppFramework Changes

VrAppInterface Restructuring
VrAppInterface has been refactored to simplify the interface, support multi-view, and enforce per-frame determinism.

The VrAppInterface::Frame() function signature has changed. Frame() now takes an ovrFrameInput structure that contains all of the per-frame state information needed for an application. It returns an ovrFrameResult structure that should contain all of the state information produced by a single application frame. In particular, ovrFrameInput now contains key state changes, and ovrFrameResult must return a list of surfaces to be rendered. When multi-view rendering is enabled, this list of surfaces is submitted only once, but rendered for both eyes. Prior to multi-view support, all surfaces were both submitted and rendered twice each frame.


At a minimum, Frame() must return a frame result with the center view matrix as follows:

ovrFrameResult res;
res.FrameMatrices.CenterView = Scene.GetCenterEyeViewMatrix();
return res;

As a result of the changes to support multi-view, DrawEyeView() is now deprecated.

OneTimeInit and NewIntent have been removed from the interface. Android intents are now passed into EnteredVrMode() along with an intentType flag. On the first entry into EnteredVrMode() after the application's main Activity is created, intentType will be INTENT_LAUNCH. If the application is re-launched while the main Activity is already running (normally when a paused application is resumed) with a new intent, intentType will be INTENT_NEW. If the application was resumed without a new intent, intentType will be INTENT_OLD.

Applications that implemented OneTimeInit() and NewIntent() must now call OneTimeInit() and NewIntent() explicitly from their overloaded EnteredVrMode() method instead. In general, this means the following (a dispatch sketch is shown after the key-event example below):
• When intentType is INTENT_LAUNCH, call OneTimeInit(), then NewIntent().
• When intentType is INTENT_NEW, call NewIntent() only.
• When intentType is INTENT_OLD, do not call OneTimeInit() or NewIntent().

OneTimeShutdown() has been removed from VrAppInterface. Application shutdown code must now be called from the destructor of the VrAppInterface-derived class.

OnKeyEvent() was removed from VrAppInterface to allow all input events to be passed through the ovrFrameInput structure. This reduces the number of ways application code receives events to just ovrFrameInput. This requires each application to implement its own code to dispatch key events to OnKeyEvent(). The application's existing OnKeyEvent() can remain intact and is called from the beginning of Frame() as follows:

// Process input events first because this mirrors the behavior when OnKeyEvent was
// a virtual function on VrAppInterface and was called by VrAppFramework.
for ( int i = 0; i < vrFrame.Input.NumKeyEvents; i++ )
{
    const int keyCode = vrFrame.Input.KeyEvents[i].KeyCode;
    const int repeatCount = vrFrame.Input.KeyEvents[i].RepeatCount;
    const KeyEventType eventType = vrFrame.Input.KeyEvents[i].EventType;

    if ( OnKeyEvent( keyCode, repeatCount, eventType ) )
    {
        continue;   // consumed the event
    }
    // If nothing consumed the key and it's a short-press of the back key, then exit the application to Oculus Home.
    if ( keyCode == OVR_KEY_BACK && eventType == KEY_EVENT_SHORT_PRESS )
    {
        app->StartSystemActivity( PUI_CONFIRM_QUIT );
        continue;
    }
}
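A minimal sketch of the intent dispatch described above. The EnteredVrMode() parameter list shown here follows the SDK samples, and the OneTimeInit()/NewIntent() signatures are assumed to match your existing implementations:

void OvrApp::EnteredVrMode( const ovrIntentType intentType, const char * intentFromPackage,
                            const char * intentJSON, const char * intentURI )
{
    if ( intentType == INTENT_LAUNCH )
    {
        // First entry after the main Activity was created.
        OneTimeInit( intentFromPackage, intentJSON, intentURI );
        NewIntent( intentFromPackage, intentJSON, intentURI );
    }
    else if ( intentType == INTENT_NEW )
    {
        // Re-launched with a new intent while already running.
        NewIntent( intentFromPackage, intentJSON, intentURI );
    }
    // INTENT_OLD: resumed without a new intent; nothing to do here.
}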

LeavingVrMode() has been added. This will be called any time the application leaves VR mode, i.e., whenever the application is paused, stopped, or destroyed.

App
CreateToast has been removed. Instead, App::ShowInfoText can be used for displaying debug text.

AppLocal
DrawScreenMask() and OverlayScreenFadeMaskProgram are no longer provided and should be implemented by any application which requires them. See VrSamples/Native/CinemaSDK for an example.

ovrDrawSurface


ovrDrawSurface has been refactored and now contains a matrix instead of a pointer to a matrix. The joints member was removed; joints must now be specified explicitly in the GlProgram uniform parms as a uniform buffer object.

ovrMaterialDef
The ovrMaterialDef interface has been merged into ovrGraphicsCommand and is marked for deprecation:
• ovrMaterialDef programObject -> ovrGraphicsCommand Program.Program
• ovrMaterialDef gpuState -> ovrGraphicsCommand GpuState

ovrSurfaceDef
The materialDef member has been replaced by graphicsCommand. The cullingBounds member was removed from ovrSurfaceDef. GlGeometry now calculates and stores a localBounds.

GlGeometry
GlGeometry now calculates a localBounds on Create in order to guarantee that all geometry submitted to the renderer has valid bounds. The BuildFadedScreenMask() function is no longer provided and should be implemented by any application which requires it. See VrSamples/Native/CinemaSDK for an example.

GlProgram
GlProgram has undergone significant refactoring for multi-view support:
• Internal data members were renamed so that the first letter of each member is capitalized. For example, program is now Program.
• GlProgram now defaults to building shader programs with version 300 to support multi-view. There is one exception: if a shader requires the use of image_external (typically used for video rendering), the shader may be built with v100 due to driver incompatibility between image_external and version 300. All drivers which fully support multi-view will support image_external with version 300.

Any program which uses image_external will need to make the following change so that the program is compatible with both v100 and v300. Change:

#extension GL_OES_EGL_image_external : require

to:

#extension GL_OES_EGL_image_external : enable
#extension GL_OES_EGL_image_external_essl3 : enable

For more information regarding version 300, see: https://www.khronos.org/registry/gles/specs/3.0/GLSL_ES_Specification_3.00.3.pdf

Shader directives (extensions, optimizations, etc.) must now be specified separately from the main shader source. See VrSamples/Native/CinemaSDK for an example.

As part of multi-view support, uniformMvp is no longer part of GlProgram. To be multi-view compliant, apps must remove usage of Mvpm and instead use TransformVertex() for calculating the projected position, i.e.:

gl_Position = TransformVertex( Position );


Note: The TransformVertex function is provided in a default system header block which is added to every GlProgram.

GlProgram now provides explicit support for uniform parm setup. The old path of relying on a small subset of hard-coded system-level uniforms is now deprecated and will be removed in a future SDK. An example of setting up uniform parms with the new path is provided below.

GlTexture
GlTexture was changed to require the width and height of textures. In order to enforce this requirement, the GlTexture constructors were changed, and a GlTexture can no longer be implicitly constructed from an integer.

Input
Application input handling was changed so that input is now passed into the application's Frame() function via the ovrFrameInput structure. Please see the section on VrAppInterface restructuring for an explanation and example code. VrAppFramework now also handles Android key code 82. A short-press of key code 82 opens the Universal Menu, while a long-press goes directly to Home.

Texture Manager
VrAppFramework now implements a texture manager. This allows VrGUI objects to share textures across surfaces, which in turn can yield significant load-time improvements for applications using VrGUI. At present, textures are not reference counted and will not be freed automatically when VrGUI objects are released. Projects that load significant numbers of textures onto VrGUI surfaces may need to use the texture manager to free texture memory explicitly. New instances of ovrTextureManager can be created by the application to manage separate texture memory pools.

VrAppSupport

VrGui
Individual instances of VrGUI components can now have names. GetComponentByName() was changed to GetComponentByTypeName(), and a new template function, GetComponentByName(), exists for retrieving a VRMenuObject's components by name. Because the new GetComponentByName() signature is different, old code calling GetComponentByName() will not compile until it is changed to GetComponentByTypeName().

The OvrGuiSys::RenderEyeView() function interface has changed from:

GuiSys->RenderEyeView( Scene.GetCenterEyeViewMatrix(), viewMatrix, projectionMatrix );

to:

GuiSys->RenderEyeView( Scene.GetCenterEyeViewMatrix(), viewMatrix, projectionMatrix, app->GetSurfaceRender() );

Note: This functionality is specific to the DrawEyeView render path and will be removed in a future SDK.

To support multi-view, VrGUI now provides an interface call for adding all GUI surfaces to the application's render surface list:

GuiSys->AppendSurfaceList( Scene.GetCenterEyeViewMatrix(), &res.Surfaces );

VrLocale


The ovrLocale::Create function signature has changed from:

Locale = ovrLocale::Create( *app, "default" );

to:

Locale = ovrLocale::Create( *java->Env, java->ActivityObject, "default" );

VrModel
The SceneView::DrawEyeView() function signature has changed from:

Scene.DrawEyeView( eye, fovDegreesX, fovDegreesY );

to:

Scene.DrawEyeView( eye, fovDegreesX, fovDegreesY, app->GetSurfaceRender() );

Restructuring App Rendering For Multi-view
In order to set up your rendering path to be multi-view compliant, your app should specify a list of surfaces and render state back to App Frame(). Immediate GL calls inside the app main render pass are not compatible with multi-view rendering and are not allowed.

The first section below describes how to transition your app from rendering with DrawEyeView to instead returning a list of surfaces back to the application framework. The second section describes multi-view rendering considerations and how to enable multi-view in your app.

Return Surfaces From Frame

Set up the Frame Result: Apps should set up the ovrFrameResult which is returned by Frame with the following steps:
1. Set up the ovrFrameParms - storage for which should be maintained by the application.
2. Set up the FrameMatrices - this includes the CenterEye matrix and the View and Projection matrices for each eye.
3. Generate a list of render surfaces and append them to the frame result Surfaces list.
   a. Note: The surface draw order will be the order of the list, from lowest index (0) to highest index.
   b. Note: Do not free any resources which surfaces in the list rely on while the Frame render is in flight.
4. Optionally, specify whether to clear the color or depth buffer with clear color.

OvrSceneView Example
An example using the OvrSceneView library scene matrices and surface generation follows:

ovrFrameResult OvrApp::Frame( const ovrFrameInput & vrFrame )
{
    ...
    // Fill in the frame result info for the frame.
    ovrFrameResult res;

    // Let the scene construct the view and projection matrices needed for the frame.
    Scene.GetFrameMatrices( vrFrame.FovX, vrFrame.FovY, res.FrameMatrices );

    // Let the scene generate the surface list for the frame.
    Scene.GenerateFrameSurfaceList( res.FrameMatrices, res.Surfaces );

    // Initialize the FrameParms.
    FrameParms = vrapi_DefaultFrameParms( app->GetJava(), VRAPI_FRAME_INIT_DEFAULT, vrapi_GetTimeInSeconds(), NULL );
    for ( int eye = 0; eye < VRAPI_FRAME_LAYER_EYE_MAX; eye++ )
    {
        FrameParms.Layers[0].Textures[eye].ColorTextureSwapChain = vrFrame.ColorTextureSwapChain[eye];
        FrameParms.Layers[0].Textures[eye].DepthTextureSwapChain = vrFrame.DepthTextureSwapChain[eye];
        FrameParms.Layers[0].Textures[eye].TextureSwapChainIndex = vrFrame.TextureSwapChainIndex;
        FrameParms.Layers[0].Textures[eye].TexCoordsFromTanAngles = vrFrame.TexCoordsFromTanAngles;
        FrameParms.Layers[0].Textures[eye].HeadPose = vrFrame.Tracking.HeadPose;
    }
    FrameParms.ExternalVelocity = Scene.GetExternalVelocity();
    FrameParms.Layers[0].Flags = VRAPI_FRAME_LAYER_FLAG_CHROMATIC_ABERRATION_CORRECTION;

    res.FrameParms = (ovrFrameParmsExtBase *) & FrameParms;
    return res;
}

Custom Rendering Example
First, you need to make sure any immediate GL render calls are represented by an ovrSurfaceDef. In the DrawEyeView path, custom surface rendering was typically done by issuing immediate GL calls:

glActiveTexture( GL_TEXTURE0 );
glBindTexture( GL_TEXTURE_2D, BackgroundTexId );
glDisable( GL_DEPTH_TEST );
glDisable( GL_CULL_FACE );
GlProgram & prog = BgTexProgram;
glUseProgram( prog.Program );
glUniform4f( prog.uColor, 1.0f, 1.0f, 1.0f, 1.0f );
globeGeometry.Draw();
glUseProgram( 0 );
glActiveTexture( GL_TEXTURE0 );
glBindTexture( GL_TEXTURE_2D, 0 );

Instead, with the multi-view compliant path, an ovrSurfaceDef and GlProgram would be defined at initialization time as follows:

static ovrProgramParm BgTexProgParms[] =
{
    { "Texm",         ovrProgramParmType::FLOAT_MATRIX4 },
    { "UniformColor", ovrProgramParmType::FLOAT_VECTOR4 },
    { "Texture0",     ovrProgramParmType::TEXTURE_SAMPLED },
};
BgTexProgram = GlProgram::Build( BgTexVertexShaderSrc, BgTexFragmentShaderSrc,
                                 BgTexProgParms, sizeof( BgTexProgParms ) / sizeof( ovrProgramParm ) );

GlobeSurfaceDef.surfaceName = "Globe";
GlobeSurfaceDef.geo = BuildGlobe();
GlobeSurfaceDef.graphicsCommand.Program = BgTexProgram;
GlobeSurfaceDef.graphicsCommand.GpuState.depthEnable = false;
GlobeSurfaceDef.graphicsCommand.GpuState.cullEnable = false;
GlobeSurfaceDef.graphicsCommand.UniformData[0].Data = &BackGroundTexture;
GlobeSurfaceDef.graphicsCommand.UniformData[1].Data = &GlobeProgramColor;

At Frame time, the uniform values can be updated, changes to the GpuState can be made, and the surface(s) added to the render surface list.

Note: This manner of uniform parm setting requires the application to maintain storage for the uniform data. Future SDKs will provide helper functions for setting up uniform parms and materials.

An example of setting up the FrameResult using custom rendering follows:

ovrFrameResult OvrApp::Frame( const ovrFrameInput & vrFrame )
{
    ...
    // Fill in the frame result info for the frame.
    ovrFrameResult res;

    // Calculate the scene matrices for the frame.
    res.FrameMatrices.CenterView = vrapi_GetCenterEyeViewMatrix( &app->GetHeadModelParms(), &vrFrame.Tracking, NULL );
    for ( int eye = 0; eye < VRAPI_FRAME_LAYER_EYE_MAX; eye++ )
    {
        res.FrameMatrices.EyeView[eye] = vrapi_GetEyeViewMatrix( &app->GetHeadModelParms(), &CenterEyeViewMatrix, eye );
        res.FrameMatrices.EyeProjection[eye] = ovrMatrix4f_CreateProjectionFov( vrFrame.FovX, vrFrame.FovY, 0.0f, 0.0f, 1.0f, 0.0f );
    }

    // Update uniform variables and add the needed surfaces to the surface list.
    BackGroundTexture = GlTexture( BackgroundTexId, 0, 0 );
    GlobeProgramColor = Vector4f( 1.0f, 1.0f, 1.0f, 1.0f );
    res.Surfaces.PushBack( ovrDrawSurface( &GlobeSurfaceDef ) );

    // Initialize the FrameParms.
    FrameParms = vrapi_DefaultFrameParms( app->GetJava(), VRAPI_FRAME_INIT_DEFAULT, vrapi_GetTimeInSeconds(), NULL );
    for ( int eye = 0; eye < VRAPI_FRAME_LAYER_EYE_MAX; eye++ )
    {
        FrameParms.Layers[0].Textures[eye].ColorTextureSwapChain = vrFrame.ColorTextureSwapChain[eye];
        FrameParms.Layers[0].Textures[eye].DepthTextureSwapChain = vrFrame.DepthTextureSwapChain[eye];
        FrameParms.Layers[0].Textures[eye].TextureSwapChainIndex = vrFrame.TextureSwapChainIndex;
        FrameParms.Layers[0].Textures[eye].TexCoordsFromTanAngles = vrFrame.TexCoordsFromTanAngles;
        FrameParms.Layers[0].Textures[eye].HeadPose = vrFrame.Tracking.HeadPose;
    }
    FrameParms.ExternalVelocity = Scene.GetExternalVelocity();
    FrameParms.Layers[0].Flags = VRAPI_FRAME_LAYER_FLAG_CHROMATIC_ABERRATION_CORRECTION;

    res.FrameParms = (ovrFrameParmsExtBase *) & FrameParms;
    return res;
}

Specify the Render Mode: In your app Configure(), specify the appropriate render mode. To configure the app to render using the surfaces returned by Frame, set the following:

settings.RenderMode = RENDERMODE_STEREO;

Note: A debug render mode option has been provided which alternates rendering between the deprecated DrawEyeView path and the new path for returning a render surface list from Frame. This aids the conversion process, making it easier to spot discrepancies between the two paths:

settings.RenderMode = RENDERMODE_DEBUG_ALTERNATE_STEREO;

Multi-view Render Path
Before enabling the multi-view rendering path, you will want to make sure your render data is multi-view compatible. This involves:

Position Calculation
App render programs should no longer specify Mvpm directly and should instead calculate gl_Position using the system-provided TransformVertex() function, which accounts for the correct view and projection matrix for the current viewID.

Per-Eye View Calculations
Apps will need to take per-eye-view calculations into consideration. Examples follow:

Per-Eye Texture Matrices:


In the DrawEyeView path, the texture matrix for the specific eye was bound at the start of each eye render pass. For multi-view, an array of texture matrices indexed by VIEW_ID should be used (see the sketch at the end of this section).

Note: Due to a driver issue with the Adreno 420, KitKat, and version 300 programs, a uniform array of matrices should be contained inside a uniform buffer object.

Stereo Images:
In the DrawEyeView path, the image specific to the eye was bound at the start of each eye render pass. For multi-view, while an array of textures indexed by VIEW_ID would be preferable, Android KitKat does not support the use of texture arrays. Instead, both textures can be specified in the fragment shader and the selection determined by the VIEW_ID.

External Image Usage
Applications which make use of image_external, i.e., video rendering applications, must take care when constructing image_external shader programs. Not all drivers support image_external as version 300. The good news is that drivers which fully support multi-view will support image_external in the version 300 path, which means image_external programs will work correctly when the multi-view path is enabled. However, for drivers which do not fully support multi-view, these shaders will be compiled as version 100. These shaders must continue to work in both paths, i.e., version 300-only constructs should not be used, and the additional extension specification requirements listed above should be met. For some cases, the cleanest solution may be to use image_external only during Frame to copy the contents of the external image to a regular texture2d, which is then used in the main app render pass (though this could eat into the multi-view performance savings).

Enable Multi-view Rendering
Finally, to enable the multi-view rendering path, set the render mode in your app Configure() to the following:

settings.RenderMode = RENDERMODE_MULTIVIEW;
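A sketch of the per-eye texture matrix approach described under Per-Eye Texture Matrices above (vertex shader fragment). NUM_VIEWS, VIEW_ID, the Position/TexCoord attributes, and TransformVertex() are the conventions provided by the framework's multi-view shader header; the uniform block name is illustrative:

uniform TextureMatrices
{
    highp mat4 Texm[NUM_VIEWS];
};
out highp vec2 oTexCoord;
void main()
{
    gl_Position = TransformVertex( Position );
    // Select the texture matrix for the eye (view) being rendered.
    oTexCoord = vec2( Texm[VIEW_ID] * vec4( TexCoord, 0.0, 1.0 ) );
}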

Migrating to Mobile SDK 1.0.0 This section is intended to help you upgrade from the 0.6.2 SDK to 1.0.0. VrApi Changes The function vrapi_Initialize now returns an ovrInitializeStatus. It is important to verify that the initialization is successful by checking that VRAPI_INITIALIZE_SUCCESS is returned. The function vrapi_GetHmdInfo has been replaced with vrapi_GetSystemPropertyInt and vrapi_GetSystemPropertyFloat. These functions use the ovrSystemProperty enumeration to get the individual properties that were previously returned on the ovrHmdInfo structure. The functions ovr_DeviceIsDocked, ovr_HeadsetIsMounted, ovr_GetPowerLevelStateThrottled, and ovr_GetPowerLevelStateMinimum have been replaced with vrapi_GetSystemStatusInt and vrapi_GetSystemStatusFloat. These functions use the ovrSystemStatus enumeration to select the individual status to be queried. These functions may now be used to also query various performance metrics. The other functions from VrApi_Android.h wrapped Android functionality or dealt with launching or returning from System Activities (Universal Menu, et cetera). These functions were removed from VrApi because they are not considered part of the core minimal API for VR rendering. The functions to get/set brightness, comfort mode and do-not-disturb mode have been removed. The other functions are now available through the

Mobile | Mobile Native SDK Migration Guide | 115

VrAppSupport/SystemUtils library. See the VrAppSupport Changes section for more detailed information about using the SystemUtils library. The layer textures are passed to vrapi_SubmitFrame() as "texture swap chains" (ovrTextureSwapChain). These texture swap chains are allocated through vrapi_CreateTextureSwapChain(). It is important to allocate these textures through the VrApi to allow them to be allocated in special system memory. When using a static layer texture, the texture swap chain does not need to be buffered and the chain only needs to hold a single texture. When the layer textures are dynamically updated, the texture swap chain needs to be buffered. When the texture swap chain is passed to vrapi_SubmitFrame(), the application also passes in the chain index to the most recently updated texture. The behavior of the TimeWarp compositor is no longer specified by selecting a warp program with a predetermined composition layout. The ovrFrameLayers in ovrFrameParms now determine how composition is performed. The application must specify the number of layers to be composited by setting ovrFrameParms::LayerCount, and the application must initialize each ovrFrameLayer in ovrFrameParms::Layers to achieve the desired compositing. See CinemaSDK in the SDK native samples for an example of how to setup and configure layer composition. The ovrFrameOption enumeration has been renamed to ovrFrameFlags, and the VRAPI_FRAME_OPTION_* flags have been renamed to VRAPI_FRAME_FLAG_*. The VRAPI_FRAME_OPTION_INHIBIT_CHROMATIC_ABERRATION_CORRECTION flag has been replaced with VRAPI_FRAME_LAYER_FLAG_CHROMATIC_ABERRATION_CORRECTION. which is now set on ovrFrameLayer::Flags. Note that chromatic aberration correction is now disabled by default. The VRAPI_FRAME_LAYER_FLAG_CHROMATIC_ABERRATION_CORRECTION flag must be set on each layer that needs chromatic aberration correction. Only enable chromatic aberration correction on the layers that need it, to maximize performance and minimize power draw. The ovrFrameLayer::WriteAlpha and ovrFrameLayer::FixedToView booleans have been replaced with the ovrFrameLayer::Flags VRAPI_FRAME_LAYER_FLAG_WRITE_ALPHA and VRAPI_FRAME_LAYER_FLAG_FIXED_TO_VIEW, respectively. The ovrFrameLayerTexture::TextureRect member now affects composition. When there are opportunities to eliminate fragment shading work in the compositor, regions outside the TextureRect may be skipped. However, rendering must still work correctly, even if the regions are not skipped, because in some cases, regions outside the rect must still be rendered. VrApi now allows several OpenGL objects to be explicitly passed through vrapi_EnterVrMode and vrapi_SubmitFrame. The structure ovrModeParms now allows the Display, WindowSurface and ShareContext to be passed explicitly to vrapi_EnterVrMode. If these OpenGL objects are explicitly set on the ovrModeParms, then it is not necessary to call vrapi_EnterVrMode from a thread that has an OpenGL context current on the active Android window surface. If these objects are not explicitly set on ovrModeParms (i.e., they are set to the default value of 0), then vrapi_EnterVrMode will behave the same way it used to, and vrapi_EnterVrMode must be called from a thread that has an OpenGL context current on the active Android window surface. The ovrFrameLayerTexture structure, as part of the ovrFrameParms, now allows a CompletionFence to be passed explicitly to vrapi_SubmitFrame. 
VrAppSupport Changes
The support modules in VrAppSupport are now available as prebuilt libraries.


SystemUtils Library
All Gear VR application interactions with the System Activities application are managed through the SystemUtils library under VrAppSupport, which contains the System Activities API. All native applications must now link in the VrAppSupport/SystemUtils library by including the following line in the Android.mk file and adding the SystemUtils.aar (jar additionally provided) as a build dependency:

$(call import-module,VrAppSupport/SystemUtils/Projects/AndroidPrebuilt/jni)

In order to utilize the SystemUtils library, applications must do the following (a code skeleton follows below):
• Call SystemActivities_Init() when the application initializes.
• Call SystemActivities_Update() to retrieve a list of pending System Activities events.
• Handle any events in this list that the application wishes to process. The application is expected to call SystemActivities_RemoveAppEvent() to remove any events that it handled.
• Call SystemActivities_PostUpdate() during the application's frame update.
• Call SystemActivities_Shutdown() when the application is being destroyed.

Gear VR applications may use this API to send a specially formatted launch intent when launching the System Activities application. All applications may start System Activities in the Universal Menu and Exit to Home menu by calling SystemActivities_StartSystemActivity() with the appropriate PUI_ command. See the SystemActivities.h header file in the VrAppSupport/SystemUtils/Include folder for more details and for a list of available commands.

In theory, Gear VR applications may receive messages from System Activities at any time, but in practice these are only sent when System Activities returns to the app that launched it. These messages are handled via SystemActivities_Update() and SystemActivities_PostUpdate(). When Update is called, it returns an array of pending events. Applications may handle or consume events in this array. If an application wishes to consume an event, it must be removed from the events array using SystemActivities_RemoveAppEvent(). This array is then passed through PostUpdate, where default handling is performed on any remaining events that have default behaviors, such as the reorient event.

This example sequence illustrates how System Activities typically interacts with an application:
1. The user long-presses the back button.
2. The application detects the long-press and launches System Activities with SystemActivities_StartSystemActivity and the PUI_GLOBAL_MENU command.
3. In the Universal Menu, the user selects Reorient.
4. System Activities sends a launch intent to the application that launched the Universal Menu.
5. System Activities sends a reorient message to the application that launched it.
6. The application receives the message and adds it to the System Activities event queue.
7. The application resumes.
8. The application calls SystemActivities_Update() to get a list of pending events.
9. If the application has special reorient logic, e.g., to re-align UI elements to be in front of the user after reorienting, it can call vrapi_RecenterPose(), reposition its UI, and then remove the event from the list of events.
10. The application calls SystemActivities_PostUpdate(). If the reorient event was not handled by the application, it is handled inside of PostUpdate.

In the case of an exitToHome message, the SystemUtils library always handles the message with a default action of calling finish() on the application's main Activity object, without passing it through the Update/PostUpdate queue. This is done to allow the application to exit while still backgrounded.
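The skeleton below summarizes the required calls in context. The function names are those listed above; the parameter lists and event-list field names are assumptions, so consult SystemActivities.h for the real signatures.

    // On application init:
    SystemActivities_Init( &java );

    // Each frame:
    SystemActivitiesAppEventList_t eventList;           // assumed type name
    SystemActivities_Update( ovr, &java, &eventList );  // fetch pending events
    for ( int i = 0; i < eventList.NumEvents; i++ )     // assumed field names
    {
        if ( AppHandlesEvent( &eventList.Events[i] ) )  // app-specific check
        {
            SystemActivities_RemoveAppEvent( &eventList, i );
            i--;  // re-check this slot, since removal shifts later events down
        }
    }
    // Default handling (e.g. reorient) for events the app left in the list.
    SystemActivities_PostUpdate( ovr, &java, &eventList );

    // On application destroy:
    SystemActivities_Shutdown( &java );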


The VrAppFramework library already handles System Activities events, so native applications using the VrAppFramework do not need to make any System Activities API calls. After VrAppFramework calls SystemActivities_Update(), it places a pointer to the event array in the VrFrame object for that frame. Applications using the framework can then handle those System Activities events in their frame loop, or ignore them entirely. Any handled events should be removed from the event array using SystemActivities_RemoveAppEvent() before the application returns from its VrAppInterface::Frame() method.

The Unity plugin (in Legacy 0.8+ and Utilities 0.1.3+) already includes the SystemUtils library and internally calls the System Activities API functions as appropriate. Unity applications cannot currently consume System Activities events, but can handle them by adding an event delegate using SetVrApiEventDelegate() on the OVRManager class.

VrGui Library
The VrGui library contains functionality for a fully 3D UI implemented as a scene graph. Each object in the scene graph can be functionally extended using components. This library has dependencies on VrAppFramework and must be used in conjunction with it. See the CinemaSDK, Oculus360PhotosSDK, and Oculus360VideosSDK samples for examples.

VrLocale Library
The VrLocale library is a wrapper for accessing localized string tables. The wrapper allows custom string tables to be used seamlessly with Android's own string tables. This is useful when localized content is not embedded in the application package itself. This library depends on VrAppFramework.

VrModel Library
The VrModel library implements functions for loading 3D models in the .ovrScene format. These models can be exported from .fbx files using the FbxConverter utility (see FBXConverter). This library depends on rendering functionality included in VrAppFramework.

VrSound Library
The VrSound library implements a simple wrapper for playing sounds using the android.media.SoundPool class. This library does not provide low-latency or 3D positional audio and is only suitable for playing simple UI sound effects.


Release Notes
This section describes changes for each version release.

1.14 Release Notes
This document provides an overview of new features, improvements, and fixes included in the latest version of the Oculus Mobile SDK.
1.14.0
For additional information, see Migrating to Mobile SDK 1.14.0 on page 97.
New Features
The following new features can be found in 1.14:
• Support for the Samsung Galaxy S9 and S9+ smartphones.
• Gamepads are now exposed and enumerated through the input API.
• Opt-in ability to combine the controller recenter and headset reorient actions. This new behavior provides the most benefit for experiences that are focused in front of the user (for instance, UI-centric applications). Full 360 degree experiences may wish to retain the old behavior if the developer feels it is more intuitive to leave the recenter of the controller independent of the headset reorient.
API Changes
• vrapi_Initialize can now return a new error code on failure, VRAPI_INITIALIZE_ALREADY_INITIALIZED.
• Added a mechanism to adjust the display refresh rate for Oculus Go: vrapi_SetDisplayRefreshRate.
• Added Samsung Galaxy S9 Device Types to the API.
• VRAPI_FRAME_FLAG_INHIBIT_SRGB_FRAMEBUFFER has been deprecated in favor of the per-layer flag VRAPI_FRAME_LAYER_FLAG_INHIBIT_SRGB_FRAMEBUFFER.
• The input API now exposes and enumerates gamepads.
• vrapi_ReturnToHome has been removed.
• vrapi_ShowSystemUIWithExtra is marked deprecated and should no longer be used.
• The VRAPI_REORIENT_HMD_ON_CONTROLLER_RECENTER property has been provided to allow apps to opt into a behavior that combines the controller recenter action with reorienting the headset. To enable this, set the property to 1 using vrapi_SetPropertyInt, as shown in the sketch below. This feature is disabled by default.
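Minimal sketches of the two new calls, assuming an initialized ovrJava structure and, for the refresh-rate call, a valid ovrMobile pointer; 72 Hz is an illustrative value, not a guaranteed mode:

    // Opt in to combining controller recenter with headset reorient
    // (disabled by default).
    vrapi_SetPropertyInt( &java, VRAPI_REORIENT_HMD_ON_CONTROLLER_RECENTER, 1 );

    // Request a different display refresh rate on Oculus Go.
    vrapi_SetDisplayRefreshRate( ovr, 72.0f );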

Bug Fixes
• Removed code path in samples that always forced a maximum screen brightness.
Known Issues
• There are no known issues in this release.


1.12 Release Notes
This document provides an overview of new features, improvements, and fixes included in the latest version of the Oculus Mobile SDK.
1.12.0
This release provides support for the Oculus Go and Samsung Galaxy A8/A8+ (2018) smartphones. For additional information, see Migrating to Mobile SDK 1.12.0 on page 98.
New Features
The following build tool versions have been changed to:
• Android NDK r16b
• Gradle 4.3.1
• Android Plugin for Gradle 3.0.1
• Android SDK Build-Tools 26.0.2

API Changes
• Added a mechanism to specify the Foveation Level for the Eye-Buffer SwapChain Textures.
• Added Oculus Go Device Types to the API.
• Added Samsung A-series (2018) Device Types to the API.
• Added a new ovrModeFlags flag, VRAPI_MODE_FLAG_CREATE_CONTEXT_NO_ERROR, to support applications which want to create a no-error GL context.
• VRAPI_TEXTURE_SWAPCHAIN_FULL_MIP_CHAIN has been removed. Applications will need to explicitly pass in the number of mipLevels on SwapChain creation.
• Controllers are now affected by the application-specified Tracking Transform.
• The SwapChain represented by VRAPI_DEFAULT_TEXTURE_SWAPCHAIN now defaults to white instead of black. This is to support solid-color frames of more than just black. The application layer's ColorScale parameter will determine the solid color used.
• The ovrMobile structure will now always be freed on vrapi_LeaveVrMode.
• Applications are now required to pass explicit EGL objects (Display, ShareContext, NativeWindow) to vrapi_EnterVrMode, otherwise the call will fail.
• VRAPI_SYS_PROP_BACK_BUTTON_DOUBLETAP_TIME has been removed. If applications implement double-tap logic, they can still detect a double-tap by checking whether the time between taps is less than VRAPI_SYS_PROP_BACK_BUTTON_SHORTPRESS_TIME, as sketched below.
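A possible replacement for the removed double-tap property, assuming the application records the time of the previous back-button tap itself (whether this property is read via the int or float accessor is an assumption; check VrApi_Types.h):

    const double shortPressTime = vrapi_GetSystemPropertyFloat(
        &java, VRAPI_SYS_PROP_BACK_BUTTON_SHORTPRESS_TIME );

    if ( backTapTime - previousBackTapTime < shortPressTime )
    {
        // Treat the pair of taps as a double-tap.
    }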

Bug Fixes
• Fixed a bug in VrSamples where an incorrect texture target was specified for GL_TEXTURE_BORDER_COLOR when multi-view was enabled.
• Fixed the "ndk-build not found" error in the ovrbuild.py script when building solely from Android Studio.
Known Issues
• There are no known issues in this release.


1.9 Release Notes
This document provides an overview of new features, improvements, and fixes included in the latest version of the Oculus Mobile SDK.
1.9.0
This release provides support for a new frame submission path which allows for new layer types such as Cylinder, Cube, and Equirect, enables you to specify a user-defined tracking transform, and adds support for a new performance API. FBX Converter has been removed from the SDK.
For additional information, see Migrating to Mobile SDK 1.9.0 on page 99.
New Features
• The Android NDK build version has changed to r15c.
API Changes
• Added a new frame submission API which supports flexible layer lists and only requires a single fence for signaling frame completion.
• Added a new layer API which supports the following layer types: Projection, Cylinder, Cube, and Equirect.
• Added a new performance API for specifying clock levels, application performance threads, and the extra latency mode setting.
• Added a new tracking transform API which allows the application to specify the space that tracked poses are reported in, i.e., Floor Level or Eye Level. The default tracking transform is Eye Level (see the sketch following these notes).
• Head Model Parameters are no longer provided on the API. Since the VrApi runtime is responsible for applying the head model, applications should no longer do so. The new tracking transform API can be used to achieve the desired Floor Level or Eye Level space.
• MinimumVsyncs was renamed to SwapInterval.
• Marked vrapi_RecenterInputPose as deprecated.
• Removed VRAPI_SYS_STATUS_HEADPHONES_PLUGGED_IN, VRAPI_SYS_UI_GLOBAL_MENU, and VRAPI_SYS_STATUS_THROTTLED2 from the API.
Bug Fixes
• Fixed a bug in VrAppFramework where MemBufferT took ownership of a buffer allocated with malloc and freed the buffer using delete[].
Known Issues
• The 1.7.0 SDK changed how head poses were reported. Before 1.7.0, applications were expected to apply a head model if necessary. With 1.7.0, head poses were returned with the head model already applied, but also with a pose y position relative to the floor instead of relative to the nominal eye height. With 1.9.0, head poses are reported with a head model applied, but the default space they are reported in is again relative to nominal eye height. The space that poses are reported in can be changed with vrapi_{Get/Set}TrackingTransform().
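A sketch of selecting the floor-level tracking space with the new API; the enum spelling VRAPI_TRACKING_TRANSFORM_SYSTEM_CENTER_FLOOR_LEVEL is an assumption to verify against VrApi_Types.h.

    // Report subsequent poses relative to the system floor-level origin.
    const ovrPosef floorLevel = vrapi_GetTrackingTransform(
        ovr, VRAPI_TRACKING_TRANSFORM_SYSTEM_CENTER_FLOOR_LEVEL );
    vrapi_SetTrackingTransform( ovr, floorLevel );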


1.7 Release Notes
This document provides an overview of new features, improvements, and fixes included in the latest version of the Oculus Mobile SDK.
1.7.0
Overview of Major Changes
Mobile SDK 1.7.0 provides a new VrApi interface method for obtaining predicted tracking information, build system improvements, and native debugging support with externalNativeBuild. For details on migrating to Mobile SDK 1.7.0 from previous versions, see Migrating to Mobile SDK 1.7.0 on page 101.
New Features
• The build tools versions have changed to:
  • Android NDK r14b
  • Gradle 3.3
  • Android Plugin for Gradle 2.3.3
  • Android SDK Build-Tools 25.0.1
• Added the system status enumeration VRAPI_SYS_STATUS_SYSTEM_UX_ACTIVE to detect if either the long-press timer or recenter timer system layers are active.
• vrapi_Initialize now returns an error code if the system driver is not found on the device instead of terminating the app with an exit(0).
API Changes
• Added a new entry point, vrapi_GetPredictedTracking2, for querying the predicted tracking information along with corresponding view and projection matrices for each eye (see the sketch following these notes).
• A default head model is now automatically applied in both vrapi_GetPredictedTracking() and vrapi_GetPredictedTracking2() for apps targeting SDK 1.7.0 and later. Because these tracking methods no longer explicitly apply the head model or manage head model parameters, the following methods have been removed from the VrAppFramework library:
  • const ovrHeadModelParms & GetHeadModelParms() const;
  • void SetHeadModelParms( const ovrHeadModelParms & parms );
• The predicted tracking methods now return the head pose Y translation as height above the floor. Previously, the Y translation was relative to the head position in its canonical pose, that is, it generally hovered around 0.0m. Apps that previously applied a bias to place the view in the virtual world space must be adjusted if targeting SDK 1.7.0 or later.
• The vrapi_GetCenterEye() helper functions have been removed and replaced with vrapi_Get*FromPose() helper functions to remove the notion of a 'center eye'.
• VRAPI_FRAME_LAYER_FLAG_WRITE_ALPHA from ovrFrameFlags has been deprecated.
• DST_ALPHA from ovrLayerType has been deprecated.
• VrApi_LocalPrefs.h has been removed. Applications can use Android system properties for any development debug needs.
• ovr_GetLocalPreferenceValueForKey and ovr_SetLocalPreferenceValueForKey are no longer provided on the interface. Applications that need similar development testing functionality should instead use Android system properties directly.
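A minimal sketch of the new tracking query, assuming a valid ovrMobile pointer and a monotonically increasing frame index:

    // Predicted tracking state plus per-eye view/projection matrices.
    const double displayTime = vrapi_GetPredictedDisplayTime( ovr, frameIndex );
    const ovrTracking2 tracking = vrapi_GetPredictedTracking2( ovr, displayTime );

    // Left-eye matrices; Eye[1] holds the right eye.
    const ovrMatrix4f view = tracking.Eye[0].ViewMatrix;
    const ovrMatrix4f proj = tracking.Eye[0].ProjectionMatrix;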


1.5 Release Notes
This document provides an overview of new features, improvements, and fixes included in the latest version of the Oculus Mobile SDK.
1.5.0
Overview of Major Changes
This version adds support for the Samsung Gear VR Controller. For details on migrating to Mobile SDK 1.5.0 from previous versions, see Migrating to Mobile SDK 1.5.0 on page 103.
Note: With release 1.5.0, the Mobile SDK versioning scheme has changed to MAJOR.MINOR.PATCH.
New Features
• Added support for Gear VR Controller. For more information, see VrApi Input API on page 34.
• Added VrController sample illustrating the VrApi Input API.
API Changes
• Added VrApi Input API for Gear VR Controller and Gear VR headset. For more information, see VrApi Input API.
• Added an ovrFrameLayerFlags flag for clipping fragments outside the layer's TextureRect: VRAPI_FRAME_LAYER_FLAG_CLIP_TO_TEXTURE_RECT.
• Removed deprecated ovrLayerType.
• Removed deprecated ovrFrameFlags.

1.0 Release Notes
This document provides an overview of new features, improvements, and fixes included in the latest version of the Oculus Mobile SDK.
1.0.4
Overview of Major Changes
The VrApi implementation is now distributed through the Oculus System Driver application.
Long-press back button handling (including gaze timer rendering) and recenter-on-mount are now detected and handled directly by VrApi. Applications should no longer implement this logic.
The System Utilities library dependency has been removed and its functionality is now handled directly within VrApi. Applications which require VrApi only no longer need to link to anything else. The VrApi Loader is now Java free, further reducing the number of dependencies an application is required to link against.
TimeWarp Debug Graph has been removed. Please use OVRMonitor instead.


For details on migrating to Mobile SDK 1.0.4 from previous versions, see Migrating to Mobile SDK 1.0.4 on page 104.
New Features
• The Gaze Cursor Timer is now rendered automatically in VrApi as a TimeWarp Layer.
API Changes
• Back-button long-press handling and recenter-on-mount are now handled directly by VrApi. Apps should not implement this logic any longer.
• VrApi now provides an interface for displaying System UIs, returning to Home, and displaying fatal error messages. See VrApi_SystemUtils.h and the sketch following this list.
• System-level button timings are now exposed on the VrApi System Properties interface. See VRAPI_SYS_PROP_BACK_BUTTON_SHORTPRESS_TIME and VRAPI_SYS_PROP_BACK_BUTTON_DOUBLETAP_TIME.
• A recenter count has been added to the VrApi System Status interface. See VRAPI_SYS_STATUS_RECENTER_COUNT.
• Applications are no longer responsible for managing the SystemUtils app events and should remove the following function calls:
  • SystemActivities_Init
  • SystemActivities_Shutdown
  • SystemActivities_Update
  • SystemActivities_PostUpdate
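A sketch of the replacement path through VrApi (the enum value name is an assumption; see VrApi_SystemUtils.h for the full list):

    #include "VrApi_SystemUtils.h"

    // Launch the Universal Menu through VrApi instead of the old
    // SystemActivities_* calls.
    vrapi_ShowSystemUI( &java, VRAPI_SYS_UI_GLOBAL_MENU );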

1.0.3
Overview of Major Changes
Multi-view
Mobile SDK 1.0.3 adds multi-view rendering support. Multi-view rendering allows drawing to both eye views simultaneously, significantly reducing driver API overhead. It includes GPU optimizations for geometry processing. Preliminary testing has shown that multi-view can provide:
• a 25-50% reduction in CPU time consumed by the application
• a 5% reduction in GPU time on the ARM Mali
• a 5-10% reduction in power draw
The freed-up CPU time could be used to issue more draw calls. However, instead of issuing more draw calls, we recommend that applications leave the freed-up CPU time for use by the driver threads to reduce or eliminate screen tears.
While current driver implementations of multi-view primarily reduce CPU usage, GPU usage is not always unaffected. On Exynos-based devices, multi-view not only reduces the CPU load, but slightly reduces the GPU load by computing the view-independent vertex attributes only once for both eyes, instead of separately for each eye.
Even though there are significant savings in CPU time, these savings do not directly translate into a similar reduction in power draw. The power drawn by the CPU is only a fraction of the total power drawn by the device (which includes the GPU, memory bandwidth, display, etc.). Although all applications will have their own unique set of challenges to consider, multi-view should allow most applications to lower the CPU clock frequency (CPU level), which will in turn improve power usage and the thermal envelope. However, this does not help on Exynos-based devices, where CPU levels 1, 2, and 3 all use the same clock frequency.
Multi-view will not be available on all Gear VR devices until driver and system updates become available. The current set of supported devices as of the date of this release is:
• S6 / Android M
• S6+ / Android M
• S6 Edge / Android M
• Note 5 / Android M
• Exynos S7 / Android M
• Exynos S7+ / Android M
For detailed instructions on how to structure a native application for multi-view rendering, see Migrating to Mobile SDK 1.0.3 on page 106. We are working with Unity and Epic to support multi-view in Unity and Unreal Engine.
VrAppInterface
VrAppInterface has been refactored to simplify the interface, support multi-view rendering, and enforce per-frame determinism. We highly recommend updating your VrAppInterface-based application to support multi-view. However, even if you are not planning on supporting multi-view, it would be good to adopt the VrAppInterface changes because they also pave the way for Vulkan support in the future.
VrApi
VrAppFramework-based applications now explicitly pass EGL objects to VrApi. Previously, the various VrApi functions had to be called from a thread with a specific EGLContext current. The current EGLContext and EGLSurface were essentially invisible parameters to the VrApi functions. By explicitly passing the necessary EGL objects to the API, there are no threading restrictions.
The volume notifier is now rendered automatically in VrApi as a TimeWarp layer; the application is no longer responsible for generating and displaying this notification. Be sure not to render your own volume interface!
Build Process
Various build steps have been moved from the Python build scripts into Gradle.
New Features
• Volume Notifier now rendered automatically in VrApi as a TimeWarp Layer.
• VrAppFramework now supports a multi-view rendering path.
• VrAppFramework now uses explicit EGL objects.
• GlTexture now supports RGBA ASTC compressed textures.
API Changes
• VrAppInterface::OneTimeInit and VrAppInterface::NewIntent have been replaced by VrAppInterface::EnteredVrMode. This function is called right after an application enters VR mode.
• VrAppInterface::OneTimeShutdown has been removed in favor of moving shutdown code to the destructor of the VrAppInterface-derived class.
• VrAppInterface::LeavingVrMode is now called right before the application is about to leave VR mode.
• VrAppInterface::Frame now takes an ovrFrameInput structure and returns an ovrFrameResult structure.
• VrAppInterface::OnKeyEvent was removed. Key events are now explicitly handled in VrAppInterface::Frame.


• VrApi ovrModeParmFlags now provides VRAPI_MODE_FLAG_NATIVE_WINDOW for specifying the ANativeWindow explicitly.
Bug Fixes
• Fixed docked / mounted queries to be accurate without requiring an initial event.
• Sample apps no longer prompt for an SD card on devices that don't support external memory.
Known Issues
• When converting your app to be multi-view compliant, ensure that your System Activities version is at least 1.0.3.1 or you will receive a required system update message.
1.0.0.1
Overview of Major Changes
This minor patch release fixes problems with OS X development and splits Oculus Remote Monitor from the Mobile SDK to make it easier to access for developers using a third-party game engine.
New Features
• Oculus Remote Monitor
  • VR Developer Mode is no longer required if you have System Activities 1.0.2 or greater and an app built with Oculus Mobile SDK 1.0 or greater.
  • Exposed experimental layer texel density and complexity visualizers (supported by apps built with Oculus Mobile SDK 1.0 or later).
  • Now available as a separate downloadable package from the full Mobile SDK download.
Bug Fixes
• Fixed OS X "No such file or directory" build problem.
• Oculus Remote Monitor
  • Improved network stability on Windows.
  • Increased VR thread stack size.
1.0.0
Overview of Major Changes
VrApi is now dynamically loaded from a separate APK, allowing it to be updated without requiring developers to recompile applications. This will allow us to fix bugs, support new devices, and add new OS versions without disrupting development.
VrApi now presents the core minimal API for VR rendering. System-level functionality not necessary for VR rendering has been moved to a VrAppSupport/SystemUtils library. This library primarily deals with loading, and with receiving events from, the Universal Menu.
Various OpenGL objects may now be explicitly passed to the VrApi. In this case, functions such as vrapi_EnterVrMode and vrapi_SubmitFrame do not need to be called from a thread with a specific OpenGL context current.
All native applications are now built and developed using Gradle and Android Studio instead of ANT and Eclipse. It is important to note that while the command-line Gradle build path is mature, Android Studio support for native development should still be considered experimental. Feedback on our developer forums is appreciated!
Important Change to SystemActivities
The VrApi implementation is now deployed as part of SystemActivities, which is automatically updated. When updating to the latest SDK, it is important to make sure the device is online to allow SystemActivities to be automatically updated. If for some reason SystemActivities is not up to date when launching a new application, you may get the message "Oculus updates needed." To allow the application to run, SystemActivities must be updated.
If the auto-update function has not delivered the latest SystemActivities to your device, please ensure that you are connected to a working Wi-Fi network. It may take up to 24 hours for the update to trigger, so please be patient. If your development schedule requires a timely update, you may download the SystemActivities APK from this location as temporary relief during this transition period. Future updates should always be processed by the automatic update system on the Gear VR platform.
New Features
• VrApi now dynamically loads from a separate APK.
• Various OpenGL objects may now be explicitly passed to VrApi, lifting some threading restrictions.
• Added support for Gradle and experimental support for Android Studio.
• Added support for the Samsung Galaxy S6 Edge+ and Note 5.
• TimeWarp Debug Graph may now be toggled on and off during runtime via VrApi Frame flags.

API Changes
• vrapi_Initialize now returns an ovrInitializeStatus.
• vrapi_GetHmdInfo has been replaced with vrapi_GetSystemPropertyInt and vrapi_GetSystemPropertyFloat.
• ovr_DeviceIsDocked, ovr_HeadsetIsMounted, ovr_GetPowerLevelStateThrottled, and ovr_GetPowerLevelStateMinimum have been replaced with vrapi_GetSystemStatusInt and vrapi_GetSystemStatusFloat.
• Various functions from VrApi_Android.h have been moved to the VrAppSupport/SystemUtils library.
• Various performance metrics can now be queried through vrapi_GetSystemStatusInt and vrapi_GetSystemStatusFloat.
Bug Fixes
• Fixed reorient on mount.
• Fixed spurious incorrect long-press event after a short-press plus a long frame.
• Fixed invalid input clock levels resulting in dynamic clock frequency mode.


0.6 Release Notes
This document provides an overview of new features, improvements, and fixes included in the latest version of the Oculus Mobile SDK.
0.6.2.0
Overview of Major Changes
The 0.6.2.0 version of the Oculus Mobile SDK includes a change to how we link with OpenGL ES. Previous versions of the SDK had a hard link to libGLESv3.so, which is problematic for engines which integrate the VrApi and must support non-VR devices. Beginning with 0.6.2, the Mobile SDK dynamically loads all OpenGL ES symbols.
The source for the Unity SDKExample MediaSurface Plugin is now provided. The MediaSurface Plugin provides a native library which is used with the Android MediaPlayer for hardware decoding of a single video to a texture. Note that the SDKExample MediaSurface Plugin is not intended to be production quality and is provided for reference purposes. We have made the source available in case you would like to modify it for your use. The source and build files are located at the following SDK path: VrAppSupport/MediaSurfacePlugin.
For detailed instructions on updating native projects to this SDK version, see Mobile Native SDK: Migration.
New Features
• VrApi
  • Dynamically loads OpenGL ES symbols instead of hard-linking to libGLESv3.so.
• VrCapture
  • Added a mode to capture straight to local flash storage instead of waiting for a remote connection. Useful for automated testing, capturing the startup sequence, and working around issues caused by low-bandwidth Wi-Fi networks.
API Changes
• VrApi
  • Removed several non-VR Android-specific interface calls.
• Native Application Framework
  • The VrGUI Library project now contains Java source and resources.
  • VrAppInterface::DrawEyeView() takes an additional parameter.
Bug Fixes
• VrApi
  • Fixed thread affinity failing to be set for the TimeWarp and DeviceManager threads.
  • Fixed device reset when repeatedly plugging and unplugging the Travel Adapter on the Samsung GALAXY S6.
• VrAppFramework
  • Optimized GazeCursor to render in two draw calls per eye instead of 32.
  • Fixed GlGeometry BuildGlobe so that the equirect center is at -Z (forward).


0.6.1.0
Overview of Major Changes
The 0.6.1.0 version of the Oculus Mobile SDK includes major structural changes to the native VrAppFramework library. Several subsystems that were previously part of VrAppFramework have been moved into individual libraries under the VrAppSupport folder, making the application framework leaner and reducing its focus to handling the Android application lifecycle.
LibOVR has been renamed to LibOVRKernel to better represent its intended functionality and to maintain parity with the PC SDK.
The Java activity code has been refactored to remove dependencies on the VrActivity class for applications that do not use VrAppFramework.
The SDK native samples were updated to a cross-platform friendly structure. Android Library Projects are now found at [ProjectName]/Projects/Android.
Changes to Unity Integration
The Oculus Unity Integration is no longer bundled with the Mobile SDK and must now be downloaded separately from our Downloads page: https://developer.oculus.com/downloads/
We now provide a Utilities for Unity package for use with Unity's first-party VR support (available in Unity 5.1+). For legacy project development, we currently offer a legacy Unity Integration package for use with Unity 4.6.7 and later. Please see our Unity documentation for more information.
If you are using the Utilities for Unity package with Unity 5.1+, the SDKExamples are now available for download separately. If you are using the Legacy Unity Integration, update to Oculus Runtime for OS X 0.5.0.1. Before updating your runtime, be sure to run uninstall.app (found in /Applications/Oculus/) to remove your previous installation.
New Features
• VrApi
  • Images composited by the time warp are now allocated through the VrApi as "texture swap chains".
  • Performance params (CPU/GPU level, thread IDs) can now be adjusted every frame through vrapi_SubmitFrame.
  • adb logcat -s VrApi now reports the thread affinity.
  • ovr_GetSystemProperty now provides options for querying GpuType, device external memory, and max full-speed framebuffer samples; see ovrSystemProperty in VrApi_Types.h.
• VrCubeWorld
  • Added an example to the VrCubeWorld_SurfaceView and VrCubeWorld_NativeActivity samples that reduces latency by re-sampling the tracking state later in the frame.
• VrTemplate
  • The make_new_project script is now converted to Python for cross-compatibility.
• VrCapture / OVRMonitor
  • VrCapture may now be integrated into, and collect data from, multiple shared libraries in your app simultaneously (previously you could capture from VrApi or from your app, but not both at the same time).
  • OpenGL and logcat calls are now captured throughout the entire process.
  • Applications may now expose user-adjustable variables via OVR::Capture::GetVariable() and tweak the values in real time in OVRMonitor.
  • Frame buffer capturing now does basic block-based compression on the GPU, reducing network bandwidth by 50%.
  • GPU Zones are enabled, but we recommend only using them on the Samsung GALAXY S6.
  • Added a Settings View for toggling VR Developer Mode.
  • Sensor Graphs now turn red when values exceed the max defined by SetSensorRange().
API Changes
• Native Application Framework
  • VRMenu, OvrGuiSys, OvrGazeCursor, and related classes have been moved to the VrAppSupport/VrGui library.
  • OvrSceneView, ModelFile, and related classes have been moved to the VrAppSupport/VrModel library.
  • Localization-related functionality has been moved to the VrAppSupport/VrLocale library.
  • The sound pool and related classes have been moved to the VrAppSupport/VrSound library.
  • The VrGui library now uses the SoundEffectPlayer interface for sound playback, replacing SoundManager. This simple interface can be overloaded to allow VrGui sounds to be played by any sound library.
  • The VrActivity Java class now subclasses Android Activity instead of ActivityGroup.
Bug Fixes
• VrApi
  • Fixed adb logcat -s VrApi failure to report memory stats.
• Native Application Framework
  • Fixed a bug where missing font glyphs were skipped instead of rendering as an asterisk.
• Cinema SDK
  • Fixed last media poster showing up as the first poster for another category.
  • Fixed Play/Pause icon not functioning correctly after unmount/mount.
  • Fixed unmount/mount not pausing media immediately.
  • Fixed bad camera orientation in the Void theater when auto-switching from another theater due to starting a video with a resolution greater than 1920x1080.
• 360 Photos SDK
  • Fixed Favorites and Folder Browser icon switching in the attribution menu.
  • Fixed a menu state bug causing the background scene not to be drawn.
  • Fixed menu orientations not resetting on reorient.
  • Increased vertical spacing between categories in Folder Browser to improve thumbnail scrollbar fit.
• 360 Videos SDK
  • Fixed media failure to pause immediately when unmounted.
  • Fixed movie not pausing on launching System Activities.
  • Fixed menu orientation when resuming the app.
  • Fixed gaze cursor not showing up when in Browser.
• Build Scripts
  • Fix for devices over adb tcpip: if the phone was connected over TCP, the build was trying to find oculussig_WWW.XXX.YYY.ZZZ:PPP when checking for the oculussig file.
  • If an install and run was requested but no devices were found, the script now reports this to the user rather than quitting silently.
  • Changed directories in an exception-safe manner.
• VrCapture / Remote Monitor
  • Fixed a rare crash when disconnecting from a remote host on OS X.
  • Reconnecting to an app multiple times no longer puts the capture library in an undefined state.
Known Issues
• Unity 4 with Oculus Runtime for OS X 0.4.4 and Legacy Integration 0.6.1.0 or 0.6.0.2: the Editor crashes when building an APK or pressing play in Play View, and the Mac standalone player crashes. To fix, update to Oculus Runtime for OS X 0.5.0.1. Before updating your runtime, be sure to run uninstall.app (found in /Applications/Oculus/) to remove your previous installation.
• VrApi implicitly links to libGLESv3.so, so currently you cannot load libvrapi.so on devices without OpenGL ES 3.0 support.
• VrCapture / Remote Monitor
  • GPU Zones currently work on the Galaxy S6 only.
  • Timer Queries are not functional on Adreno-based devices.
  • Integrated systrace support is under development and is currently disabled.
  • Some VPNs break auto-discovery.

0.6.0.1
Overview
This release of the Mobile SDK includes a fix to performance issues related to our Universal Menu and a hitching problem associated with data logging in VrApi, as well as some other minor updates.
If you are upgrading from Mobile SDK v0.5, be sure to review the 0.6.0 release notes for additional information on significant changes to our native development libraries as well as other updates.
Note that our Mobile SDK documentation is now available online here: https://developer.oculus.com/documentation/mobilesdk/latest/
New Features
• Allow Unity MediaSurface dimensions to be modified via the plugin interface.
Bug Fixes
• Fixed performance regression triggered when coming back from the Universal Menu.
• Fixed not being able to enable chromatic aberration correction in the Unity plugin.
• Reduced once-per-second frame drop due to gathering stats.
• Fixed Do Not Disturb setting.
• Fixed adjusting clock levels from Unity during load.
Known Issues
• adb logcat -s VrApi always reports the amount of available memory as 0.


0.6.0
Overview of Native Changes
The 0.6.0 version of the Oculus Mobile SDK introduces several major changes that necessitate updates to the VRLib structure, native app interface, and development workflow. If you are migrating from a previous SDK, please refer to the "Migrating from Earlier Versions" sections of the Native Development guide.
VRLib has been restructured into three separate libraries in order to make the code more modular and to provide a smoother workflow:
• LibOVR – the Oculus Library
• VrApi – the minimal API for VR
• VrAppFramework – the application framework used by native apps
Both LibOVR and VrAppFramework ship with full source. The VrApi is shipped as a set of public include files, a pre-built shared library, and a jar file. Shipping VrApi as a separate shared library allows the VrApi implementation to be updated and/or changed after an application has been released. This allows us to apply hot fixes, implement new optimizations, and add support for new devices without requiring applications to be recompiled with a new SDK. VrApi source is no longer included with the SDK.
The Vr App Interface (now part of VrAppFramework) has been simplified and now has a clearly-defined lifecycle. The order in which functions are called has been clarified – previously, some functions could be called either in VR mode or outside of VR mode. The lifecycle can be found in VrAppFramework/Src/App.h.
The VRMenu code has been refactored in preparation for moving it into its own static library. User Interface-related interfaces that were previously passed to functions individually are now part of OvrGuiSys.
There are three new native samples. These samples implement the same simple scene in three different ways, illustrating three approaches to native application development:
• VrCubeWorld_SurfaceView – uses a plain Android SurfaceView and handles all Activity and Surface lifecycle events in native code. This sample uses only the VrApi and uses neither the Oculus Mobile Application Framework nor LibOVR.
• VrCubeWorld_NativeActivity – uses the Android NativeActivity class. This sample uses only the VrApi and uses neither the Oculus Mobile Application Framework nor LibOVR.
• VrCubeWorld_Framework – uses the Oculus Mobile Application Framework.
For developers who prefer to use command-line scripts to build native projects, this SDK provides a robust cross-platform set of Python build scripts to replace the platform-specific build scripts provided with previous SDKs.
Overview of Unity Integration Changes
• The Oculus Runtime is no longer required for mobile development.
• Synced with the Oculus PC SDK 0.6.0.0 beta.
• Allows clients to re-map plugin event IDs.
For both the PC and Mobile SDKs we recommend the following Unity versions or higher: Unity Pro 4.6.3, Unity Professional 5.0.2.p2, Unity Free 4.6, or Unity Personal 5.0.2.p2. For mobile development, compatibility issues are known to exist with Unity 5 and OpenGL ES 3.0 – please check back for updates. Earlier versions of Unity 5 should not be used with the Mobile SDK.
Note: Before installing or integrating this distribution, we strongly recommend backing up your project before attempting any merge operations.


New Features
• VrApi
  • Improved frame prediction, in particular for Unity.
  • The CPU clock is left unlocked until the application starts rendering frames, to make application loading/resuming faster.
  • Improved performance metrics via logcat (see the Basic Performance Stats through Logcat section of the Native Development Guide for more information).
• Native Application Framework
  • Improved Android Activity and Android Surface lifecycle handling.
  • Fixed volume bar not showing on the first click of the volume adjustment.
• 360 Photos SDK
  • Gaze Cursor now disappears when looking away from the Attribution menu.
• Blocksplosion
  • Added OS X input mappings.
API Changes
• Native Application Framework
  • Automatic caching of files extracted from the apk.
Bug Fixes
• VrApi
  • Removed an additional frame of latency between synthesis and display.
  • Fixed intra-frame object motion judder due to TimeWarp displaying eye buffers too early when eye buffer rendering completed early.
  • Fixed TimeWarp getting more than one frame behind after a bad hitch.
  • Workaround for "loss of head tracking" after closing and re-opening the device 96 times.
• Native Application Framework
  • Fixed volume bar not showing on the first click of the volume adjustment.
• Unity Integration
  • Fixed prediction glitch every 64 frames.
  • Use correct prediction for OVR_GetCameraPositionOrientation.
  • Fixed location of the PlatformMenu Gaze Cursor Timer.
• Cinema SDK
  • Fixed playback control reorienting the screen in the Void theater when the user clicks on controls while they are off the screen on portrait videos.
  • Fixed divide-by-zero in SceneManager::GetFreeScreenScale() which caused the Void theater to crash when starting a movie.
• 360 Photos SDK
  • Fixed Favorites button not creating the Favorites folder.
• Blocksplosion
  • Fixed launch blocks falling straight down when launched when built with Unity 5.
  • Fixed touch triggering "next level" after returning from the System Activity.
  • Fixed launch block being offset when looking left or right.
Known Issues
• Initial launch of the 360 Photos SDK sample can crash if a duplicate category folder name is present on the target device's sdcard. Subsequent launches of the app will not crash. A fix is in the works for the next release.

0.5 Release Notes
This document provides an overview of new features, improvements, and fixes included in version 0.5 of the Oculus Mobile SDK.
0.5.1
Overview of Major Changes
The most significant change in 0.5.1 is to System Activities event handling in Unity. The 0.5.0 code for handling System Activities events in Unity was doing heap allocations each frame. Though this was not a leak, it would cause garbage collection to trigger much more frequently. Even in simple applications, garbage collection routinely takes 1 to 2 milliseconds. In applications that were already close to dropping below 60 Hz, the increased garbage collection frequency could cause notable performance drops. The event handling now uses a single buffer allocated at startup.
Other notable changes were to HMT sensor prediction – specifically, clamping of the delta time used for prediction. Without this change, delta times on application startup could sometimes be huge, causing an apparent jump in screen orientation.
Unity Developers: As with Mobile SDK v0.5.0, Unity developers using this SDK version must install the Oculus Runtime for Windows or OS X. This requirement will be addressed in a future release of the SDK.
Note: Before installing or integrating this distribution, we strongly recommend that you back up your project before attempting any merge operations.
API Changes
• Sensor Prediction: Make sure the predicted deltaTime can never be negative or become huge.
• Sensor Prediction: Clamp the delta time used for sensor prediction to 1/10th of a second instead of 1/60th so that we don't under-predict if the target frame rate is not being met.
• Better handling for a case where SystemActivities resumes without an explicit command. This can happen if the top app crashes or does a finish() instead of launching Home to exit.
Bug Fixes
• Unity Integration
  • Reworked System Activities event handling to prevent any per-frame allocations that could trigger the garbage collector.
• Native Framework
  • Fixed a potential MessageQueue deadlock.
  • Bitmapfont – fix for a case where billboarded text is right on top of the view position and results in a zero-length normal vector.
  • Bitmapfont – fix for font info height not being y-scaled.
  • Renamed VERTICAL_BOTTOM to VERTICAL_BASELINE because it aligns to the first row's baseline rather than the bottom of the entire text bounds.
  • Bitmapfont – fix for VERTICAL_CENTER_FIXEDHEIGHT to correctly account for the ascent / descent when rendering single- and multi-line text.
  • VrMenu Fader – update is only performed if the frame time is > 0.0f.
  • VrMenu – added a ProgressBar component.
  • VrMenu – parent / child rotation order in menus was backwards, causing confusion when local rotations were used.
  • VrMenu – don't use an old view matrix to reposition menus on a reorient. Since we reorient to identity (with respect to yaw), we should reposition with respect to identity instead of the last frame's view matrix.
  • AppLocal::RecenterYaw() now adjusts lastViewMatrix so that it instantly reflects the recenter of the sensor fusion state.
  • FolderBrowser – allow implementers to create their own panel object.
Known Issues
• The application version number remains 0.5.0 and was not incremented to 0.5.1. This does not affect app functionality and will be addressed in a future release.
• For use with the Mobile SDK, we recommend Unity version 4.6.3. The Mobile SDK is compatible with Unity 5.0.1p2, which addresses a problem with OpenGL ES 3.0, but there is still a known Android ION memory leak. Please check back for updates.
0.5.0
Overview of Major Changes
The Universal Menu has been removed from VrLib, allowing modifications to the Universal Menu without requiring each app to upgrade to the latest SDK. The Universal Menu is now part of the Oculus System Activities application and is downloaded and updated alongside Oculus Home and Horizon. Make sure you update your version of Home in order to test your application with the new Universal Menu.
If you are migrating from a previous SDK, please refer to the "Migrating from Earlier Versions" sections of the Native Development and Unity Integration guides.
The Mobile Unity Integration is now synced with the Oculus PC SDK 0.5.0.1 Beta. Please ensure you have installed the corresponding 0.5.0.1 Oculus runtime; it can be found at the following location: https://developer.oculus.com/downloads/
VrPlatform entitlement checking is now disabled by default in Unity; handling for native development is unchanged. If your application requires this feature, please refer to the Mobile SDK Documentation for information on how to enable entitlement checking.
Applications built with Mobile SDK 0.5.0 or later will be compatible with the Samsung GALAXY S6.
Note: Before installing or integrating this distribution, we strongly recommend that you back up your project before attempting any merge operations.
New Features
• Android Manifest
  • Mobile SDK 0.5.0 no longer requires PlatformActivity in the AndroidManifest.xml file. If you have previously worked with an earlier SDK, the following block must be removed (a reconstructed example is shown below):
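A reconstructed example of the removed declaration; the attribute set shown is an assumption, so remove whatever PlatformActivity <activity> element actually appears in your manifest:

    <!-- Reconstructed example; your entry may carry additional attributes. -->
    <activity android:name="com.oculusvr.vrlib.PlatformActivity" />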

  • The camera permission is also no longer required and can be removed from your manifest if your app does not rely on it:
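For reference, the standard Android camera permission element is:

    <uses-permission android:name="android.permission.CAMERA" />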

  • For additional information on manifest requirements, see the relevant documentation in the Native Development Guide, Unity Integration Guide, and Mobile App Submission Guide.
• Native Framework
  • Folder Browser
    • Added support for dynamically loaded categories.
    • Factored MetaData out of FolderBrowser into MetaDataManager.h/cpp.
    • Improved wrap-around controls.
  • Sound Limiter
    • Application sound_asset.json files may now override specific menu sounds.
  • VrMenu
    • Added hit test result to VRMenuEvent.
    • Added debugMenuHierarchy console command for debug drawing of the VrMenu hierarchy.
    • Now uses the current view matrix for gaze cursor and menu positions.
    • Added options for horizontal and vertical text justification.
    • Multi-line text justification.
    • Added an option to allow text to line up horizontally with different descenders.
• Unity Integration
  • Synced with the Oculus PC SDK 0.5.0.1 Beta.
  • VrPlatform entitlement checking is now disabled by default.
• Cinema SDK
  • UI reworked using new UI components.
• 360 Photos SDK
  • Added libjpeg.a directly to projects in order to avoid a dependency on libjpeg source.
  • Metadata is now app-extensible. Added functionality for reading and writing extended metadata during app loading and saving.
• 360 Videos SDK
  • Added libjpeg.a directly to projects in order to avoid a dependency on libjpeg source.
  • Metadata is now app-extensible. Added functionality for reading and writing extended metadata during app loading and saving.


API Changes
• VrLib
  • Universal Menu moved from VrLib into a separate application.
  • Universal Menu specific functionality removed from VrLib.
  • Adds Oculus Remote Monitor support.
  • VrApi restructured for future modularity and ease of development.
  • Local preferences are now allowed in Developer Mode. Please refer to the Mobile SDK Documentation for more information.
  • Default eye height and interpupillary distance have been changed to conform to the default values used by the PC SDK.
  • The native head-and-neck model has been re-parameterized to use a depth/height pair rather than angle/length to conform to the PC SDK.
  • HMDState sensor acquisition code has been re-written to make it reliable and thread safe.
  • Now restores the last-known good HMD sensor yaw when recreating the HMD sensor.

Bug Fixes
• Unity Integration
  • The Health and Safety Warning no longer displays in editor Play Mode if a DK2 is not attached.
• Cinema SDK
  • Fixed playback controls reorienting the screen in the Void theater when the user clicks on controls while they are off the screen on portrait videos.
• OvrGuiSys
  • RemoveMenu is now DestroyMenu and will now free the menu.
Known Issues
• Unity Integration
  • For use with the Mobile SDK, we recommend Unity version 4.6.3, which includes Android 5.0 Lollipop support as well as important Android bug fixes. While the Mobile SDK is compatible with Unity 5.0.0p2 and higher, several issues are still known to exist, including an Android ION memory leak and compatibility issues with OpenGL ES 3.0. Please check back for updates.

0.4 Release Notes
This document provides an overview of new features, improvements, and fixes included in version 0.4 of the Oculus Mobile SDK.
0.4.3.1
Overview of Major Changes
This release adds support for Unity 5.0.0p2. Developers using Unity 5 must update to this version, and make sure that they are using the latest patch release from Unity.
We would like to highlight the inclusion of the new Mobile Unity Integration with full DK2 support based on the Oculus PC SDK 0.4.4. As this is a significant API refactor, please refer to the Unity Development Guide: Migrating From Earlier Versions section for information on how to upgrade projects built with previous versions of the Mobile Unity Integration.


Note: Before installing or integrating this distribution, we strongly recommend that you back up your project before attempting any merge operations.
0.4.3
New Features
• Android Manifest
  • Applications will now be required to specify the following permission to support distortion configuration updates by the system service.
  • Note: Always refer to the Oculus Mobile Submission Guidelines for the latest information regarding the submission process.
• VrPlatform
  • Support for entitlement checking with VrPlatform. Integration steps and instructions are included in the Oculus Mobile Developer Guide's Mobile SDK Setup section.
• Unity Integration
  • New Mobile Unity Integration based on Oculus PC SDK 0.4.4.
• Miscellaneous
  • The Mobile SDK Documentation folder hierarchy has been re-organized into a single document.
API Changes
• VrLib
  • Localized string updates for the Universal Menu.
  • Improvements to yaw drift correction.
  • Fixed vsync possibly being turned off by the Universal Menu when selecting reorient.
  • Pre-register nativeSetAppInterface to work around a JNI bug where JNI functions are not always linked.
  • Do not allow nativeSurfaceChanged to use a deleted AppLocal in case surfaceDestroyed is executed after onDestroy.
  • Removed resetting the time warp when sensor device information is not populated on application launch.
  • Improved Passthrough Camera latency by disabling Optical Image Stabilization (Exynos chipset only).
  • Free EGL sync objects on time warp thread shutdown.
Bug Fixes
• 360 Videos SDK
  • Fixed a bug where a few 360 videos would not play.
  • Fixed several UI bugs.
  • Added extra error handling.
• 360 Photos SDK
  • Fixed several UI bugs.


0.4.2
Overview of Major Changes
If you are developing with Unity, we recommend updating to Unity 4.6.1, which contains Android 5.0 – Lollipop support.
We would like to highlight the inclusion of the new Mobile Unity Integration with full DK2 support based on the Oculus PC SDK 0.4.4. As this is a significant API refactor, please refer to the Unity Development Guide: Migrating From Earlier Versions section for information on how to upgrade projects built with previous versions of the Mobile Unity Integration.
Note: Before installing or integrating this distribution, we strongly recommend that you back up your project before attempting any merge operations.
API Changes
• VrLib
  • Universal Menu localization support: English, French, Italian, German, Spanish, Korean.
  • Moved Direct Render out of VrApi and into TimeWarp.
  • Print battery temperature to logcat.
  • Fixed rendering of the TimeWarp Debug Graph.
• Unity Integration
  • Fix for camera height discrepancies between the Editor and the Gear VR device.
  • Moonlight Debug Util class names are now prefixed with OVR to prevent namespace pollution.
  • Provide a callback for configuring VR Mode Parms on OVRCameraController; see OVRModeParms.cs for an example.
• Native Framework
  • Fixed a bug in which the Volume toast was not dismissed if the user transitioned to the Universal Menu while the toast was active.
  • Allow for app-specific handling when the user selects Reorient in the Universal Menu.
  • SearchPaths: now correctly queries Android storage paths.
  • SearchPaths: refactored to OvrStoragePaths.
  • FolderBrowser: improved load time by removing the check for thumbnails in the application package.
  • FolderBrowser: improved scrolling and swiping physics.
  • FolderBrowser: added bounce-back and wrap-around effects.
• Sample Project Changes
  • 360 Photos SDK
    • Fixed a bug in which the user could easily close the menu unintentionally when returning from a photo.
    • Fixed a crash that occurred when photos stored in the physical "Favorites" folder were untagged as "Favorites".
    • Fixed a crash caused by swiping on the "no media found" screen.
  • 360 Videos SDK
    • The background star scene now fades to black when starting a video.
    • Changed the corrupted media message to show only the filename so it fits in the view.
    • Fixed a rendering artifact that occurred when starting to play a video.


0.4.1
Overview of Major Changes
Added support for Android 5.0 (Lollipop) and Unity Free.
New Features
• Mobile SDK
  • Added support for Android 5.0 (Lollipop).
• Unity
  • Added Unity Free support for Gear VR developers.
0.4.0
Overview of Major Changes
First public release of the Oculus Mobile SDK.
New Features
• First public release of the Oculus Mobile SDK.
API Changes
• The Mobile SDK is now using API Level 19. Please make the following change to your manifest file:
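A reconstructed example of that declaration (the exact attributes required by the submission guidelines are an assumption):

    <uses-sdk android:minSdkVersion="19" android:targetSdkVersion="19" />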

Bug Fixes
• The Health and Safety Message is no longer required on mount.
• Lens distortion updated for final Gear VR lenses.
• Improved Mag Yaw drift correction.
• Added the ability to update the distortion file without the need for an app rebuild.
• Changed the default font to Clear Sans.
• Unity vignette rendering updated to match native (slightly increases effective FOV).
• Unity volume pop-up distance updated to match native.
• Fixed Gaze Cursor Timer scale.

System Activities/VrApi Release Notes
This section describes changes to System Activities and VrApi.

1.22.x Release Notes
System Activities 1.22.0 Release Notes
New Features:


• Android O support.

1.21.x Release Notes
System Activities 1.21.0 Release Notes
Maintenance updates for System Activities.

1.20.x Release Notes
System Activities 1.20.0 Release Notes
Maintenance updates for System Activities.

1.19.x Release Notes
System Activities 1.19.0 Release Notes
This release adds support for the New Universal Menu, plus maintenance updates for System Activities.

1.16.x Release Notes

System Activities 1.16.0 Release Notes

Maintenance update for System Activities.

1.15.x Release Notes

System Activities 1.15.1 Release Notes

Maintenance update for System Activities.

System Activities 1.15.0 Release Notes

Maintenance update for System Activities.

1.14.x Release Notes

System Activities 1.14.3 Release Notes

Maintenance update for System Activities.

System Activities 1.14.2 Release Notes

Maintenance update for System Activities.


System Activities 1.14.1 Release Notes

• Fixed an issue where a leader's name didn't appear in party text if there was no profile image.
• Made localization updates to Oculus Rooms.
• Updated the Gear VR Controller model.

System Activities 1.14.0 Release Notes

Maintenance update for System Activities.

1.13.x Release Notes

System Activities 1.13.3 Release Notes

New Features

• Updated Gear VR Controller directional indicator ray.

System Activities 1.13.2 Release Notes

Bug Fixes

• Properly ignore release on swipe for Gear VR Controller.

System Activities 1.13.1 Release Notes

New Features

• Updated Universal Menu UI for better consistency with new Home.
• Improved performance in the Universal Menu.

1.12.x Release Notes

System Activities 1.12.1 Release Notes

New Features

• Added support for Gear VR Controller. Application support requires Mobile SDK 1.5.0 (available mid-March).

Bug Fixes

• Fixed a System Activities crashing issue affecting phones with certain language settings.

1.11.x Release Notes

System Activities 1.11.2 Release Notes

Bug Fixes

• Localization fixes.


System Activities 1.11.1 Release Notes

Bug Fixes

• Fixed minor Party bugs.
• Fixed microphone text going outside of bounds.

System Activities 1.11.0 Release Notes

New Features

• Added support for Parties and Rooms. For more information, see our blog post announcement.

1.10.x Release Notes

System Activities 1.10.3 Release Notes

New Features

• Android N support.

System Activities 1.10.2 Release Notes

Bug Fixes

• Fix for overflowing job thread.

System Activities 1.10.1 Release Notes

New Features

• Localization updates.

System Activities 1.10.0 Release Notes

Beginning with this release, the VrApi implementation has been renamed System Driver and now ships separately from System Activities. For System Driver release notes, see System Driver Release Notes on page 146.

1.0.x Release Notes

1.0.3.7 Release Notes

New Features

• Added new getting started tutorial support.

1.0.3.6 Release Notes

Bug Fixes

• Removed VrApi interface threading restrictions when using explicit EGL objects.
• Prevent the wrong JNIEnv from being used on vrapi_LeaveVrMode when calling VrApi interface methods from different threads.


1.0.3.5 Release Notes

Bug Fixes

• Fixed latency and tearing issues.
• VrApi's framebuffer capture now only streams buffered layers, preventing Oculus Remote Monitor from streaming a single cursor layer.
• Modified Developer Mode to not set VRAPI_TRACKING_STATUS_HMD_CONNECTED when the device is undocked with phone sensors enabled, allowing code that checks this flag to run properly while undocked in Developer Mode.

1.0.3.4 Release Notes

New Features

• Phones in Developer Mode now implement limited orientation tracking using phone sensors. This may be disabled with a setting in Local Preferences or System Properties. For more information, see Local Preferences and Android System Properties.
• Volume indicator now turns red at volume 11 when headphones are detected.
• Redesigned icons for volume, Bluetooth, Wi-Fi, airplane mode, notifications, brightness, reorient, and battery.
• Moved Pass-Through Camera from Settings to Utilities in the Universal Menu.
• Reorient function now opens the Pass-Through Camera and displays instructional text.
• Polished UI elements for friend requests and the status bar.
• Changed brightness/volume UI to a slider bar.

1.0.3.3 Release Notes

New Features

• The Universal Menu (UM) may now be realigned to any direction except straight down. To realign, look away from the UM and tap the touchpad when the message "Tap to reorient menu" appears.
• Added SMS text preview. To view message contents, hover over the notification and click "View."
• Game invites from friends may now be accepted from the UM notifications page. When an invite is received through a supported app, a gamepad icon will appear in the upper left corner. The user may enter the UM to see details and choose to join or clear. If ignored, game invites expire in ten minutes.
• Gamepad B button now acts as a Back button.
• A notification count badge now appears over the Notifications button when new notifications are received.
• Long text is now auto-scaled or clipped to fit in UI elements.
• SMS now displays the sender's phone number.
• Volume and brightness indicators can now be changed by gazing over them and swiping forward or backward.
• Added UI for inviting users to games to System Activities and the UM. This interface will be exposed to developers in a future release of the Oculus Platform SDK library.

Bug Fixes

• Fixed duplicate sounds when hovering over items.
• Fixed misaligned time when using the 24-hour time format.
• Fixed a text aliasing bug.
• Fixed a crash when receiving SMS messages.
• Improved text wrapping in Chinese, Japanese, and Korean.
• Added missing Korean translation for "incoming calls".


• All fonts now support the Unicode replacement character. This character looks like a diamond with a question mark in it and is rendered when a character is requested but not found in the font.
• Brightness level indicator no longer changes when re-entering the UM.
• Fixed issues related to Android OS sending phantom MENU key events on a back press.

1.0.3.2 Release Notes

New Features

• User's real name is now displayed on the profile page.

Bug Fixes

• Fixed minor word-wrapping bug.
• Fixed subtle overlay layer jittering associated with large, odd-sized textures.

1.0.3.1 Release Notes

New Features

• Android N compatibility update in VrApi Loader to work with library namespacing.

Bug Fixes

• Updated stb library to fix JPEG loading bugs.
• Font rendering improvements.
• Fixed automatic word wrapping for Chinese text.
• Fixed incorrect word wrapping in Japanese.

1.0.3.0 Release Notes

New Features

• Added Universal Menu support for Profile, Friends, Notifications, and Game Invites (app must support this feature).
• No longer require an OpenGL context to be bound during vrapi_EnterVrMode (see the sketch after this list).
• Added ability to change clock levels from a background thread (also shown in the sketch below).
• Defer deleting texture swapchains.
• A fatal error is now triggered if vrapi_Shutdown is called before vrapi_LeaveVrMode.
• Made improvements to Oculus Remote Monitor; see OVRMonitor release notes for details.
• Local Preferences are now also mapped to Android system properties.

Bug Fixes

• Return NULL from vrapi_EnterVrMode when eglCreateWindowSurface fails.
• Fix for vsync handling taking an extremely large amount of CPU time on S7 + Snapdragon devices running certain OS versions.

1.0.2.5 Release Notes

New Features

• Enabled distortion mesh clipping by default: WARP_MESH_CLIP_MODE_SQUARE.
• Updated V-sync timing and events.
• Added support for different rendering completion fences.
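The following is a minimal sketch (not taken from the original release notes) of how the two 1.0.3.0 changes above might look in a native app. It assumes the explicit EGL-object fields on ovrModeParms (Display, WindowSurface, ShareContext) and that java, eglDisplay, nativeWindow, and eglContext have already been set up by the app; check VrApi.h in your SDK version for the exact structure layout.

    #include "VrApi.h"
    #include "VrApi_Helpers.h"

    // Enter VR mode with explicitly provided EGL objects rather than
    // relying on an OpenGL context being bound on the calling thread.
    ovrModeParms modeParms = vrapi_DefaultModeParms( &java );
    modeParms.Flags |= VRAPI_MODE_FLAG_NATIVE_WINDOW;
    modeParms.Display = (size_t)eglDisplay;          // app's EGLDisplay
    modeParms.WindowSurface = (size_t)nativeWindow;  // app's ANativeWindow
    modeParms.ShareContext = (size_t)eglContext;     // app's EGLContext

    ovrMobile * ovr = vrapi_EnterVrMode( &modeParms );

    // Clock levels may now also be adjusted from a background thread.
    vrapi_SetClockLevels( ovr, 2 /* CPU level */, 2 /* GPU level */ );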


1.0.2.3 Release Notes

Bug Fixes

• Added workaround to determine the external SD card path if the Gear VR Service version is lower than 2.4.18.

1.0.2.2 Release Notes

New Features

• Enabled video capture and screenshot buttons in the Utilities submenu of the Universal Menu. See Screenshot and Video Capture for details.
• Upgraded icon resolutions for all Universal Menu icons.

Bug Fixes

• Fix to maintain consistent text size for the Passthrough Camera Off button.

1.0.2.1 Release Notes

This release of System Activities adds new permission requirements for features in development. This update will prompt the user to accept camera and network access.

New Features

• Added two additional weights to the EFIGS font.
• Improved text word-wrap integration in VRMenuObject.

Bug Fixes

• Fixed update for buttons in the Universal Menu.
• Fixed VRMenuObjects to prevent freeing textures that were allocated via the texture manager.
• Fixed a Universal Menu performance regression.
• Fixed out-of-range font weight making text invisible.

1.0.2.0 Release Notes

New Features

• Redesigned Universal Menu.
• Video Capture (Alpha) video files are now written into the app's cache directory. If permissions permit, the video file is then moved into /sdcard/Oculus/VideoShots/, with the app's package name appended for easy sorting. (See the retrieval example after these entries.)

Bug Fixes

• Fixed Unity support with Video Capture (Alpha).
• getApplicationName no longer uses the string table to look up the application name when the name is not in the string table.

1.0.1.4 Release Notes

Bug Fixes

• Fixed HMD Device Creation Failure.
• Implemented workaround for Android abandoning the buffer queue when using a NativeActivity.
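As a usage note (not part of the original release notes), captures written to that directory can be copied to a development machine with adb; pulling the whole directory sidesteps exactly how the package name is appended:

    adb pull /sdcard/Oculus/VideoShots/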


1.0.1.3 Release Notes

New Features

• Added logging for the vrapi version requested by the app, making it easier to determine which SDK a given application was built against (see the example after these entries).
• Added warning if loading the default distortion file fails.
• Modified getdevice and getgputype checks to run after the vr permissions check.
• Distortion file is now loaded directly from VRSVC without requiring a temp file in /sdcard/Oculus/, eliminating the need for the READ_EXTERNAL_STORAGE Android permission (requires Mobile SDK 1.0 or later).
• "Do Not Disturb" mode in the Universal Menu now only checks for state changes when toggled.

Bug Fixes

• Fixed button rendering issue in the Universal Menu.

Known Issues

• On a clean boot of the target device, distortion correction is missing when running a VR app outside of the headset with Developer Mode enabled. The target device must be docked to a Gear VR headset at least once after every clean boot in order to cache the distortion file.

1.0.1.2 Release Notes

Bug Fixes

• Fixed the front buffer extension on Adreno Lollipop Note 4 when the EGLSurface for the front buffer is not created by TimeWarp.

1.0.1.1 Release Notes

Bug Fixes

• Ensured that the initial swapbuffer is called for Oculus Home.
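As a usage note (not part of the original release notes), this version logging can be inspected by filtering logcat on the VrApi log tag while the app starts; the tag name is our assumption based on how the driver typically logs:

    adb logcat -s VrApi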

System Driver Release Notes

This section describes changes to System Driver (previously named VrApi).

1.14.x Release Notes

System Driver 1.14.0 Release Notes

Maintenance updates for System Driver.


1.13.x Release Notes

System Driver 1.13.0 Release Notes

Maintenance updates for System Driver.

1.12.x Release Notes

System Driver 1.12.0 Release Notes

Maintenance updates for System Driver.

1.11.x Release Notes

System Driver 1.11.0 Release Notes

This release adds support for the New Universal Menu. Maintenance updates for System Driver.

1.10.x Release Notes

System Driver 1.10.0 Release Notes

This is a maintenance release.

1.9.x Release Notes

System Driver 1.9.0 Release Notes

Bug Fixes

• Fixed video capture playback issue with skipping frames.
• Fixed potential video capture crash on shutdown.
• Fixed tracking service crash while undocking and redocking.
• Fixed German Gear VR Controller recentering instructions issue affecting Marshmallow phones.

1.8.x Release Notes

System Driver 1.8.0 Release Notes

• Maintenance update for VrDriver.
• Localized the message for holding still while recentering the Gear VR Controller.


1.7.x Release Notes

System Driver 1.7.1 Release Notes

• Maintenance update for VrDriver.

System Driver 1.7.0 Release Notes

• vrapi_Initialize now returns an error code if the system driver is not found on the device, instead of terminating the app with an exit(0).
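A minimal sketch (not from the original release notes) of checking that return value in a native app, using the standard VrApi initialization helpers; the exact status codes are defined in VrApi_Types.h for your SDK version:

    #include <android/log.h>
    #include "VrApi.h"
    #include "VrApi_Helpers.h"

    // 'java' is an ovrJava struct assumed to already hold the app's
    // JavaVM, JNIEnv, and activity object.
    const ovrInitParms initParms = vrapi_DefaultInitParms( &java );
    const ovrInitializeStatus status = vrapi_Initialize( &initParms );
    if ( status != VRAPI_INITIALIZE_SUCCESS ) {
        // The system driver is missing or incompatible; fail gracefully
        // (for example, show an error and finish the activity) rather
        // than continuing into VR mode.
        __android_log_print( ANDROID_LOG_ERROR, "VrApp",
                             "vrapi_Initialize failed: %d", (int)status );
        return;
    }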

1.6.x Release Notes

System Driver 1.6.3 Release Notes

Maintenance update for VrDriver.

System Driver 1.6.2 Release Notes

Maintenance update for VrDriver.

System Driver 1.6.1 Release Notes

Maintenance update for VrDriver.

System Driver 1.6.0 Release Notes

VrDriver minimum and target Android SDK versions are now 21.

New Features

• More robust Gear VR Controller tracking.

Bug Fixes

• Fixed race condition on shutdown.

1.5.x Release Notes

System Driver 1.5.6 Release Notes

Bug Fixes

• Maintenance update for Gear VR Controller.

System Driver 1.5.5 Release Notes

New Features

• Added Chinese localization.
• Usability improvements for Gear VR Controller.

System Driver 1.5.4 Release Notes

Bug Fixes

• Gear VR Controller improvements.
• Reduced visible wiggle artifacts for some equirect images.
• Fixed missing default loading icon affecting some UE4 apps.

System Driver 1.5.3 Release Notes

New Features

• Improved Snapdragon Profiler support. For more information, see Snapdragon Profiler on page 89.
• Improved Gear VR Controller usability.

System Driver 1.5.0 Release Notes

New Features

• Added support for Gear VR Controller. Application support requires Mobile SDK 1.5.0 (available mid-March).

1.0.x Release Notes

System Driver 1.0.4.6 Release Notes

Maintenance release.

System Driver 1.0.4.5 Release Notes

Maintenance release.

System Driver 1.0.4.4 Release Notes

Bug Fixes

• Android N fixes.

System Driver 1.0.4.3 Release Notes

New Features

• Added Android N support.

Bug Fixes

• Fixed headphone state notification update.

System Driver 1.0.4.2 Release Notes

New Features

• Added support for Parties and Rooms. For more information, see our blog post announcement.

System Driver 1.0.4.0 Release Notes

Beginning with this release, VrApi has been renamed System Driver and now ships separately from System Activities. For System Activities release notes, see System Activities/VrApi Release Notes on page 139.

New Features


• Moved SysUtils library from the Mobile SDK into System Driver for applications built with Mobile SDK 1.0.4 or later.
• Removed TimeWarp Debug Graph support.
• Added support for Prologue startup experience.

Oculus Remote Monitor Release Notes

This section describes changes to Oculus Remote Monitor.

1.x Release Notes

The Oculus Remote Monitor client connects to mobile VR applications running on remote devices to capture, store, and display streamed-in data for performance evaluation and testing. Oculus Remote Monitor works with any Oculus mobile application built with Unity, Unreal Engine, or native development tools. For more information and usage instructions, see Oculus Remote Monitor on page 64.

Oculus Remote Monitor for PC and OS X 1.12

New Features

• OVRMonitor's new lost-frame capture will now show a screenshot of any frames dropped during the capture session, along with information about the app's performance at the time the frame was lost.

Oculus Remote Monitor for PC and OS X 1.7

New Features

• Added a high-level Performance Overview. It plots a graphical summary of the VrApi messages and error conditions against the timeline. Double-click any spot in the overview to open the Profiler Data and zoom in to that precise point in the timeline.

Oculus Remote Monitor for PC and OS X 1.0.3

New Features

• CPU Scheduler events are now available on Galaxy S7.
• Added memory allocation tracking. Every malloc/free can now be charted in the profiler view.
• Added Head Tracking graph.

Bug Fixes

• Fixed corrupt data streams that would occur on slow networks.
• Improved profiler view performance.
• Fixed miscellaneous bugs.

Oculus Remote Monitor for PC and OS X 1.0.0

This is the initial stand-alone release. Release notes for earlier versions may be found in the Mobile SDK Release Notes.

New Features


• VR Developer Mode is no longer required if you have System Activities 1.0.2 or greater and an app built with Oculus Mobile SDK 1.0 or greater.
• Added experimental layer texel density and complexity visualizers (supported by apps built with Oculus Mobile SDK 1.0 or later).
• Improved network stability on Windows.
• Now available as a separate downloadable package from the full Mobile SDK download.


Mobile SDK Documentation Archive

This section provides links to legacy documentation. Select from the following:

Version   HTML                        PDFs
Latest    Mobile SDK Documentation    Mobile SDK
1.12.0    Mobile SDK Documentation    Mobile SDK
1.9.0     Mobile SDK Documentation    Mobile SDK
1.7.0     Mobile SDK Documentation    Mobile SDK
1.5.0     Mobile SDK Documentation    Mobile SDK
1.0.4     Mobile SDK Documentation    Mobile SDK
1.0.3     Mobile SDK Documentation    Mobile SDK
0.6       Mobile SDK Documentation    Mobile SDK
0.5       Mobile SDK Documentation    Mobile SDK
0.4       Mobile SDK Documentation    Mobile SDK
