Avatar SDK - Oculus

Avatar SDK Version 1.0

2 | Introduction | Avatar

Copyrights and Trademarks

© 2017 Oculus VR, LLC. All Rights Reserved.

OCULUS VR, OCULUS, and RIFT are trademarks of Oculus VR, LLC. BLUETOOTH is a registered trademark of Bluetooth SIG, Inc. All other trademarks are the property of their respective owners. Certain materials included in this publication are reprinted with the permission of the copyright holder.


Contents

Avatar SDK Getting Started Guide .... 4
  Unity (Rift) Getting Started .... 4
  Unity (Gear VR) Getting Started .... 7
  Native C/C++ (Rift) Getting Started .... 10
    Rendering Avatars .... 11
    Translating Touch Controllers To Avatar Movements .... 13
  Native C/C++ (Gear VR) Getting Started .... 14

Avatar SDK Guide .... 17
  Unity Features .... 17
  Adding C++ Avatar Support .... 22
    Rendering Avatar Components .... 23
    Voice Visualization .... 24
    Pose Recording and Playback .... 24

Avatar SDK C/C++ Developer Reference .... 26


Avatar SDK Getting Started Guide

Welcome to Oculus Avatars, a powerful and flexible way to increase presence within VR apps. Avatars deliver true social presence with persistent identities you can take with you across the Oculus ecosystem. Bring other people's avatars into your app, game, or experience so they can feel like themselves and even recognize their friends.

Avatars also provide hand presence for Touch controllers, letting you integrate Oculus Touch interaction into Rift apps. Integrating Touch lets your users interact with their environment and enhances their perception of themselves within your virtual world.

Add Social Presence, Touch Interaction, or Both

As a developer, you are free to mix and match Avatar SDK features. You can use the SDK to provide Touch interaction in one app and social interaction in another. The SDK is extensible and customizable and includes:

• a Unity package with scripts, prefabs, art assets, and sample scenes for PC and Gear VR development.
• native C/C++ samples, header files, and libraries for PC and Gear VR development.

Unity (Rift) Getting Started

The Avatar Unity package contains several prefabs you can drop into your existing Unity projects. This tutorial shows you how to start using them.

Download the Oculus Avatars SDK

The SDK is packaged in a .zip archive file on our developer website.

1. Download the Oculus Avatars SDK .zip from https://developer.oculus.com/downloads/.
2. If you do not have the Oculus Utilities for Unity 5, download its .zip file too.


3. Extract the contents of the .zip archive files to your local drive.

Set Up Unity for Oculus Avatar Development

To set up, import the Oculus Unity packages into a project.

1. Create a New Project in Unity named "Unity Avatar Demo Project".
2. Import the Oculus Avatars (OvrAvatar.unityPackage) and Oculus Utilities (OculusUtilities.unitypackage) packages. For each package:
   a. Click Assets > Import Package > Custom Package.
   b. Select the package file from your local drive.
   c. Click All and then click Import.
3. Select the Virtual Reality Supported check box in Edit > Project Settings > Player.
4. Delete Main Camera from your scene and then drag OVRCameraRig from OVR > PreFabs.
5. Reset the transform on OVRCameraRig.

Note: You may ignore any No Oculus Rift App ID warnings you see during development. While an App ID is required to retrieve Oculus avatars for specific users, you can prototype and test experiences that make use of Touch and Avatars with just the default blue avatar.

Adding an Avatar to the Scene

The LocalAvatar prefab renders the player's avatar and hands. Check box options in the Inspector let you choose which parts of the avatar you want to render: body, hands, and Touch controllers.

To render avatar hands with controllers:

1. Drag OvrAvatar > Content > Prefabs > LocalAvatar to the Hierarchy window.
2. In the Inspector window, select the Start With Controllers check box.

Click Play to test. Experiment with the built-in hand poses and animations by playing with the Touch controllers.

To render avatar hands: 1. In the Hierarchy window, select LocalAvatar. 2. In the Inspector window, clear the Start With Controllers check box.


Click Play to test. Note how the finger joints transform to change hand poses as you squeeze and release the grips and triggers on the Touch controllers. You might sometimes want to use hand poses outside of these movements; we talk more about this in Custom Touch Grip Poses.

To render the avatar body:

1. In the Hierarchy window, select LocalAvatar.
2. In the Inspector window, select the Show Third Person check box.
3. Change Transform > Position to X:0 Y:0 Z:1.5.
4. Change Transform > Rotation to X:0 Y:180 Z:0.

Recording and Playing Back Avatar Pose Updates

The avatar packet recording system saves avatar movement data as packets you can send across a network to play back on a remote system. To see a demonstration, open the RemoteLoopback scene in OvrAvatar > Samples > RemoteLoopback. Let us have a look at the RemoteLoopbackManager script.


Setting RecordPackets to true starts the avatar packet recording system. We also subscribe to the PacketRecorded event handler so that we can do something useful each time a packet is recorded.

void Start () {
    LocalAvatar.RecordPackets = true;
    LocalAvatar.PacketRecorded += OnLocalAvatarPacketRecorded;
}

Each time a packet is recorded, our code places the packet into a memory stream we are using as a stand-in for a real network layer.

void OnLocalAvatarPacketRecorded(object sender, OvrAvatar.PacketEventArgs args) {
    using (MemoryStream outputStream = new MemoryStream()) {
        BinaryWriter writer = new BinaryWriter(outputStream);
        writer.Write(packetSequence++);
        args.Packet.Write(outputStream);
        SendPacketData(outputStream.ToArray());
    }
}

The remainder of our code receives the packet from the memory stream and plays it back on our loopback avatar object.

void SendPacketData(byte[] data) {
    ReceivePacketData(data);
}

void ReceivePacketData(byte[] data) {
    using (MemoryStream inputStream = new MemoryStream(data)) {
        BinaryReader reader = new BinaryReader(inputStream);
        int sequence = reader.ReadInt32();
        OvrAvatarPacket packet = OvrAvatarPacket.Read(inputStream);
        LoopbackAvatar.GetComponent<OvrAvatarRemoteDriver>().QueuePacket(sequence, packet);
    }
}
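The loopback frames each packet as a 4-byte sequence number followed by the packet payload. That framing can be sketched without the Unity types; in this C++ sketch, Packet is a hypothetical stand-in for OvrAvatarPacket and a byte vector stands in for the memory stream:

```cpp
#include <cstdint>
#include <cstring>
#include <vector>

// Hypothetical stand-in for an avatar pose packet: just raw bytes here.
struct Packet { std::vector<uint8_t> payload; };

// Frame a packet as [int32 sequence][payload], as the sample does.
static std::vector<uint8_t> writeFrame(int32_t sequence, const Packet& p) {
    std::vector<uint8_t> out(sizeof(sequence) + p.payload.size());
    std::memcpy(out.data(), &sequence, sizeof(sequence));
    if (!p.payload.empty()) {
        std::memcpy(out.data() + sizeof(sequence), p.payload.data(), p.payload.size());
    }
    return out;
}

// "Receive" side: split the frame back into sequence + payload.
static Packet readFrame(const std::vector<uint8_t>& data, int32_t* sequence) {
    std::memcpy(sequence, data.data(), sizeof(*sequence));
    return Packet{ { data.begin() + sizeof(*sequence), data.end() } };
}
```

In a real app, the bytes returned by writeFrame() would cross the network; the sequence number lets the receiver order or drop late packets.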

Unity (Gear VR) Getting Started

The Avatar Unity packages contain several prefabs you can drop into your existing Unity projects. This tutorial shows you how to start using them.

Download the Oculus Avatars SDK

The SDK is packaged in a .zip archive file on our developer website.

1. Download the Oculus Avatars SDK .zip from https://developer.oculus.com/downloads/.
2. If you do not have the Oculus Utilities for Unity 5, download its .zip file too.
3. Extract the contents of the .zip archive files to your local drive.

Set Up Unity for Oculus Avatar Gear VR Development

The setup includes importing the Oculus Unity packages and setting up Unity for Android development and debugging.

1. Create a New Project in Unity named "gearvr-avatar".


2. Click File > Build Settings and select Android. Download and install Unity Android Support and then restart Unity if necessary.
3. Click Switch Platform to switch to the Android platform.
4. Click Add Open Scenes.
5. Set Texture Compression to ASTC.
6. Click Edit > Project Settings > Player, click the little Android Settings robot, and then set the following options:
   a. Select the Virtual Reality Supported check box.
   b. In Bundle Identifier, enter a unique package name.
   c. Set Minimum API Level to Android 5.0 'Lollipop' (API level 21).
   d. Set Install Location to Automatic.
7. Import the Oculus Avatars (OvrAvatar.unityPackage) and Oculus Utilities (OculusUtilities.unitypackage) packages. For each package:
   a. Click Assets > Import Package > Custom Package.
   b. Select the package file from your local drive.
   c. Click All and then click Import.
8. Connect your Android device to your computer.
9. Create an Oculus Signature File for your Android device at https://dashboard.oculus.com/tools/osiggenerator/ and then copy it to the folder gearvr-avatar/Assets/Plugins/Android/assets. Create this folder if it doesn't exist.

Adding the VR Camera

Because the avatar has a default height of 170 cm, we should raise our VR camera rig to the same height.

1. Delete Main Camera from your scene and then drag OVRCameraRig from OVR > PreFabs.
2. Set the Position transform on OVRCameraRig to X:0, Y:1.70, Z:0.

Adding an Avatar

As the player cannot see his or her own Gear VR avatar, Gear VR avatars should all be of the "third person" type. To make sure the avatar is visible, we place the avatar 50 cm in front of the camera and rotate it 180 degrees so that its front faces us.

Note: The "local" in the prefab name "LocalAvatar" refers to how the avatar object gets its motion data. "Local" means the avatar object is driven by the local headset orientation.

1. Drag OvrAvatar > Content > Prefabs > LocalAvatar to the Hierarchy window.

2. In the Inspector, clear the Show First Person check box and select the Show Third Person check box.
3. Select the Combine Meshes check box. This reduces total draw calls per avatar from 22 to 6 per frame. Gear VR apps typically need to stay within 50 to 100 draw calls per frame.
4. Set the Position transform on LocalAvatar to X:0, Y:0, Z:0.50.
5. Set the Rotation transform on LocalAvatar to X:0, Y:180, Z:0.
6. Click File > Build & Run to build an .apk from this scene and have Unity launch it on your Android device.


What to Explore Next?

• Loading Personalized Avatars: See Unity Features for instructions on how to modify the sample scenes to retrieve Oculus user IDs and display personalized avatars.
• Recording and Playing Back Avatar Pose Updates: Build our RemoteLoopback example scene and read the accompanying write-up in our Unity (Rift) Getting Started topic.


Native C/C++ (Rift) Getting Started

Get started using Oculus Avatars in your own native Rift code by experimenting with our sample Visual Studio 2013 C++ solution.

Download the Oculus Avatars SDK

The SDK is packaged in a .zip archive file on our developer website.

1. Download the Oculus Avatars SDK .zip from https://developer.oculus.com/downloads/.
2. Extract the contents of the .zip archive files to your local drive.

OpenGL is Required

The current version of the Avatar SDK contains only OpenGL shaders.

Running the Mirror Sample on Microsoft Visual Studio 2013

Our Mirror sample serves as a good foundation for implementing avatars in your own code. Let us take a little tour of it.

To set up the Microsoft Visual Studio 2013 solution for our Mirror sample:

1. Download and install CMake from https://cmake.org/download.
2. In Windows Explorer, locate the OVRAvatarSDK\Samples folder and double-click generate_projects.cmd.
3. Wait for the script to finish creating the VS2013 folder and solution.
4. Open and build the solution: Samples\VS2013\Mirror.sln.
5. Press F5 to start debugging.

Key Bindings

The Mirror sample illustrates several features of the Avatar SDK by letting you toggle them:

Press...   to...
1          show/hide the avatar body.
2          show/hide the hands.
3          show/hide the base cone.
4          show/hide the voice visualization.
c          show/hide the Touch controllers.
f          freeze/unfreeze the current hand pose.
s          set the hand pose to 'grip sphere'.
u          set the hand pose to 'grip cube'.
j          show/hide the joint lines.
r          start avatar packet recording. Press 'r' again to play recorded packets in a loop.


Exploring the Sample Code

Open the Mirror.cpp file and follow along from main(). Our tour explores only the portions of code specific to Oculus avatars.

Rendering Avatars

Compiling Shaders

We compile the Avatar vertex and fragment shaders for our regular shader and our physically based shader (PBS) using a helper function _compileProgramFromFiles.

_skinnedMeshProgram = _compileProgramFromFiles("AvatarVertexShader.glsl",
    "AvatarFragmentShader.glsl", sizeof(errorBuffer), errorBuffer);
...
_skinnedMeshPBSProgram = _compileProgramFromFiles("AvatarVertexShader.glsl",
    "AvatarFragmentShaderPBS.glsl", sizeof(errorBuffer), errorBuffer);

Retrieving Avatar Data From a User Profile

The appearance of every person's avatar is stored in his or her Oculus user profile as an Avatar Specification. The Avatar Specification identifies the meshes and textures that make up a person's avatar. Before we retrieve this specification data, we have to initialize both the Platform SDK and the Avatar SDK using our app ID. To get your own app ID, see Developer Dashboard and Oculus App Setup.

#define MIRROR_SAMPLE_APP_ID "958062084316416"
...
ovrPlatformInitializeWindows(MIRROR_SAMPLE_APP_ID);
...
ovrAvatar_Initialize(MIRROR_SAMPLE_APP_ID);

Avatar Specifications are indexed by Oculus user ID. An app has easy access to the Oculus user ID of the currently logged-in user.

Tip: If you wanted to create a social experience, you would write code to share user IDs between instances of your app so that you could load and render the appearance of other avatars too.

ovrID userID = ovr_GetLoggedInUserID();
ovrAvatar_RequestAvatarSpecification(userID);

The function ovrAvatar_RequestAvatarSpecification() is asynchronous. We use a message queue to determine when the function has finished obtaining our data (ovrAvatarMessageType_AvatarSpecification).

// Pump avatar messages
while (ovrAvatarMessage* message = ovrAvatarMessage_Pop())
{
    switch (ovrAvatarMessage_GetType(message))
    {
        case ovrAvatarMessageType_AvatarSpecification:
            _handleAvatarSpecification(ovrAvatarMessage_GetAvatarSpecification(message));
            break;
        case ovrAvatarMessageType_AssetLoaded:
            _handleAssetLoaded(ovrAvatarMessage_GetAssetLoaded(message));
            break;
    }
    ovrAvatarMessage_Free(message);
}
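The pop-and-dispatch pattern itself is independent of the SDK. A minimal sketch with standard containers, where MessageType and the counters are hypothetical stand-ins for the ovrAvatar message types and handler functions:

```cpp
#include <queue>
#include <utility>

enum class MessageType { AvatarSpecification, AssetLoaded };
struct Message { MessageType type; };

// Drain the queue, dispatching on message type; returns how many
// messages of each type were handled.
static std::pair<int, int> pumpMessages(std::queue<Message>& q) {
    int specs = 0, assets = 0;
    while (!q.empty()) {
        Message m = q.front();               // ovrAvatarMessage_Pop()
        q.pop();
        switch (m.type) {                    // ovrAvatarMessage_GetType()
            case MessageType::AvatarSpecification: ++specs;  break;
            case MessageType::AssetLoaded:         ++assets; break;
        }
        // No explicit ovrAvatarMessage_Free(): the queue owns the storage.
    }
    return {specs, assets};
}
```

The real loop differs only in that messages arrive from the SDK's internal queue rather than one the app owns, which is why each popped message must be freed explicitly.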

With the Avatar Specification in hand, we then use our helper function _handleAvatarSpecification to:

• Create an avatar instance (ovrAvatar_Create).
• Load all the relevant avatar assets into that instance.

Loading avatar assets is also asynchronous and we again rely on popping our message queue to determine when an asset for an avatar has finished loading (ovrAvatarMessageType_AssetLoaded).

static void _handleAvatarSpecification(const ovrAvatarMessage_AvatarSpecification* message)
{
    // Create the avatar instance
    _avatar = ovrAvatar_Create(message->avatarSpec, ovrAvatarCapability_All);

    // Trigger load operations for all of the assets referenced by the avatar
    uint32_t refCount = ovrAvatar_GetReferencedAssetCount(_avatar);
    for (uint32_t i = 0; i < refCount; ++i)
    {
        ovrAvatarAssetID id = ovrAvatar_GetReferencedAsset(_avatar, i);
        ovrAvatarAsset_BeginLoading(id);
        ++_loadingAssets;
    }
    printf("Loading %d assets...\r\n", _loadingAssets);
}

static void _handleAssetLoaded(const ovrAvatarMessage_AssetLoaded* message)
{
    // Determine the type of the asset that got loaded
    ovrAvatarAssetType assetType = ovrAvatarAsset_GetType(message->asset);
    void* data = nullptr;

    // Call the appropriate loader function
    switch (assetType)
    {
        case ovrAvatarAssetType_Mesh:
            data = _loadMesh(ovrAvatarAsset_GetMeshData(message->asset));
            break;
        case ovrAvatarAssetType_Texture:
            data = _loadTexture(ovrAvatarAsset_GetTextureData(message->asset));
            break;
    }

    // Store the data that we loaded for the asset in the asset map
    _assetMap[message->assetID] = data;
    --_loadingAssets;
    printf("Loading %d assets...\r\n", _loadingAssets);
}
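Together, the two handlers implement simple outstanding-load bookkeeping: each BeginLoading increments a counter, each AssetLoaded decrements it, and rendering waits for the counter to reach zero. A self-contained sketch of that bookkeeping (AssetTracker is hypothetical, not an SDK type):

```cpp
#include <cstdint>
#include <map>

// Track which assets have finished loading and how many are outstanding.
struct AssetTracker {
    std::map<uint64_t, bool> loaded;  // assetID -> finished?
    int outstanding = 0;

    void beginLoading(uint64_t id) { loaded[id] = false; ++outstanding; }  // BeginLoading
    void onLoaded(uint64_t id)     { loaded[id] = true;  --outstanding; }  // AssetLoaded
    bool ready() const             { return outstanding == 0 && !loaded.empty(); }
};
```

This mirrors the sample's `_loadingAssets` counter and `_assetMap`: rendering is gated on `ready()` just as the Mirror sample gates on `!_loadingAssets`.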

Rendering the Avatar

Our sample code is called Mirror and it calls the avatar rendering helper function _renderAvatar() twice. The first call renders a first-person avatar. A first-person avatar can depict the player's hands and world position.

// If we have the avatar and have finished loading assets, render it
if (_avatar && !_loadingAssets)
{
    _renderAvatar(_avatar, ovrAvatarVisibilityFlag_FirstPerson, view, proj, eyeWorld, renderJoints);

The second call renders a third-person avatar, transformed so that it faces you as if looking in a mirror. A third-person avatar can depict hands, body, and base cone.

    glm::vec4 reflectionPlane = glm::vec4(0.0, 0.0, -1.0, 0.0);
    glm::mat4 reflection = _computeReflectionMatrix(reflectionPlane);

    glFrontFace(GL_CW);
    _renderAvatar(_avatar, ovrAvatarVisibilityFlag_ThirdPerson, view * reflection, proj,
                  glm::vec3(reflection * glm::vec4(eyeWorld, 1.0f)), renderJoints);
    glFrontFace(GL_CCW);
}


Hiding and Displaying Avatar Capabilities

When we first created our avatar instance, we created it with all capabilities active:

_avatar = ovrAvatar_Create(message->avatarSpec, ovrAvatarCapability_All);

You can enable different capabilities by calling ovrAvatar_SetActiveCapabilities(). In our sample, we toggle different capabilities in real time using the bit masks defined in ovrAvatarCapabilities:

case '1':
    capabilities ^= ovrAvatarCapability_Body;
    ovrAvatar_SetActiveCapabilities(_avatar, static_cast<ovrAvatarCapabilities>(capabilities));
    break;
case '2':
    capabilities ^= ovrAvatarCapability_Hands;
    ovrAvatar_SetActiveCapabilities(_avatar, static_cast<ovrAvatarCapabilities>(capabilities));
    break;
case '3':
    capabilities ^= ovrAvatarCapability_Base;
    ovrAvatar_SetActiveCapabilities(_avatar, static_cast<ovrAvatarCapabilities>(capabilities));
    break;
case '4':
    capabilities ^= ovrAvatarCapability_Voice;
    ovrAvatar_SetActiveCapabilities(_avatar, static_cast<ovrAvatarCapabilities>(capabilities));
    break;
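The XOR toggling is plain bit arithmetic: flipping one bit hides a feature, flipping it again shows it. A minimal sketch with made-up flag values (the real constants live in the SDK's ovrAvatarCapabilities enum):

```cpp
#include <cstdint>

// Hypothetical capability flags; the real values come from ovrAvatarCapabilities.
enum Capability : uint32_t {
    Cap_Body  = 1u << 0,
    Cap_Hands = 1u << 1,
    Cap_Base  = 1u << 2,
    Cap_Voice = 1u << 3,
    Cap_All   = Cap_Body | Cap_Hands | Cap_Base | Cap_Voice,
};

// XOR flips exactly one bit: pressing a key once disables the
// capability, pressing it again re-enables it.
static uint32_t toggle(uint32_t caps, Capability bit) { return caps ^ bit; }
```

The result of `toggle()` is what the sample passes to ovrAvatar_SetActiveCapabilities() after the cast.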

Translating Touch Controllers To Avatar Movements

Our sample code translates Touch controller input into avatar movements in two parts:

1. Processing the Touch inputs
2. Updating the avatar

Processing the Touch Controller Inputs

We translate the position and orientation of the head-mounted display and the left and right Touch controllers to avatar body and hand positions using our helper function _ovrAvatarTransformFromGlm(). We call our helper function _ovrAvatarHandInputStateFromOvr() to translate the various Touch button, trigger, and touch states.

// Convert the OVR inputs into Avatar SDK inputs
ovrInputState touchState;
ovr_GetInputState(ovr, ovrControllerType_Active, &touchState);
ovrTrackingState trackingState = ovr_GetTrackingState(ovr, 0.0, false);

glm::vec3 hmdP   = _glmFromOvrVector(trackingState.HeadPose.ThePose.Position);
glm::quat hmdQ   = _glmFromOvrQuat(trackingState.HeadPose.ThePose.Orientation);
glm::vec3 leftP  = _glmFromOvrVector(trackingState.HandPoses[ovrHand_Left].ThePose.Position);
glm::quat leftQ  = _glmFromOvrQuat(trackingState.HandPoses[ovrHand_Left].ThePose.Orientation);
glm::vec3 rightP = _glmFromOvrVector(trackingState.HandPoses[ovrHand_Right].ThePose.Position);
glm::quat rightQ = _glmFromOvrQuat(trackingState.HandPoses[ovrHand_Right].ThePose.Orientation);

ovrAvatarTransform hmd;
_ovrAvatarTransformFromGlm(hmdP, hmdQ, glm::vec3(1.0f), &hmd);
ovrAvatarTransform left;
_ovrAvatarTransformFromGlm(leftP, leftQ, glm::vec3(1.0f), &left);
ovrAvatarTransform right;
_ovrAvatarTransformFromGlm(rightP, rightQ, glm::vec3(1.0f), &right);

ovrAvatarHandInputState inputStateLeft;
_ovrAvatarHandInputStateFromOvr(left, touchState, ovrHand_Left, &inputStateLeft);
ovrAvatarHandInputState inputStateRight;
_ovrAvatarHandInputStateFromOvr(right, touchState, ovrHand_Right, &inputStateRight);
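The body of a helper like _ovrAvatarTransformFromGlm() is not shown in the tour, but judging from its call sites it only has to copy a position, an orientation, and a uniform scale into the SDK transform. A self-contained sketch with stand-in types (Vec3, Quat, and Transform are hypothetical, not the SDK's):

```cpp
// Hypothetical plain structs standing in for the glm / ovrAvatar types.
struct Vec3 { float x, y, z; };
struct Quat { float x, y, z, w; };
struct Transform { Vec3 position; Quat orientation; Vec3 scale; };

// Copy position and orientation, and expand a uniform scale to a
// per-axis scale, as the call sites (glm::vec3(1.0f)) suggest.
static void transformFrom(const Vec3& p, const Quat& q, float uniformScale, Transform* out) {
    out->position    = p;
    out->orientation = q;
    out->scale       = { uniformScale, uniformScale, uniformScale };
}
```

Tracking-space poses flow through unchanged; the avatar system interprets them relative to the same origin the HMD is tracked in.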


Updating the Avatar

Everything that can be changed has a related update function in the SDK. Our helper function _updateAvatar() calls the individual pose update functions:

ovrAvatarPose_UpdateBody(avatar, hmd);
ovrAvatarPose_UpdateHands(avatar, left, right);

It also closes the update by finalizing the updates to the avatar's pose with a timestamp, deltaSeconds. This timestamp is used for avatar playback and recording, as discussed in Pose Recording and Playback.

ovrAvatarPose_Finalize(avatar, deltaSeconds);
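Those per-frame deltas are what give recorded poses a timeline. As an illustration only (not SDK API), here is how a recording of a single float channel, timestamped by accumulating deltaSeconds, could be sampled back with linear interpolation:

```cpp
#include <cstddef>
#include <vector>

struct Sample { double t; float value; };

// Record one value per finalized frame; play back by interpolating
// between the two samples that bracket the requested time.
struct Recorder {
    std::vector<Sample> samples;
    double clock = 0.0;

    void finalize(double deltaSeconds, float value) {
        clock += deltaSeconds;                 // accumulate frame deltas
        samples.push_back({clock, value});
    }

    float sampleAt(double t) const {
        if (t <= samples.front().t) return samples.front().value;
        for (std::size_t i = 1; i < samples.size(); ++i) {
            if (t <= samples[i].t) {
                double a = (t - samples[i - 1].t) / (samples[i].t - samples[i - 1].t);
                return float(samples[i - 1].value + a * (samples[i].value - samples[i - 1].value));
            }
        }
        return samples.back().value;
    }
};
```

The SDK records whole avatar poses rather than a single float, but the timestamp bookkeeping is the same idea: playback speed is governed by the recorded deltas, not by the playback frame rate.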

Native C/C++ (Gear VR) Getting Started

Get started using Oculus Avatars in native Gear VR code by creating an avatar project using the Native Application Framework Template.

Download and Prepare Oculus SDKs

Our SDKs are packaged in .zip files on our developer website.

1. Download the Oculus Avatars SDK .zip package from https://developer.oculus.com/downloads/ and then extract the contents to C:\dev.
2. Download the Oculus Mobile SDK .zip package from https://developer.oculus.com/downloads/, extract the contents to C:\dev, and then rename the ovr_sdk_mobile_x.x.x folder to ovr_sdk_mobile.
3. Download the Oculus Platform SDK .zip package from https://developer.oculus.com/downloads/, extract the contents to C:\dev, and then rename the OVRPlatformSDK_vx.x.x folder to OVRPlatformSDK.
4. Save the following code as C:\dev\OVRPlatformSDK\Android\jni\Android.mk:

LOCAL_PATH := $(call my-dir)
include $(CLEAR_VARS)

LOCAL_MODULE := libovrplatformloader
LOCAL_SRC_FILES := ../libs/$(TARGET_ARCH_ABI)/$(LOCAL_MODULE).so
LOCAL_EXPORT_C_INCLUDES := $(LOCAL_PATH)/../../Include

ifneq (,$(wildcard $(LOCAL_PATH)/$(LOCAL_SRC_FILES)))
include $(PREBUILT_SHARED_LIBRARY)
endif

Create a New App Using the Application Framework

Use the Native Application Framework Template to create a new Gear VR project called "mirror" and place your Android device OSIG file inside the assets folder.

1. Run these commands from a Windows command prompt:

cd C:\dev\ovr_sdk_mobile\VrSamples\Native\VrTemplate
make_new_project.bat mirror oculus

2. Connect your Android device to your computer.
3. Create an Oculus Signature File for your Android device at https://dashboard.oculus.com/tools/osiggenerator/ and then copy it to the folder C:\dev\ovr_sdk_mobile\VrSamples\Native\mirror\assets.

For more information, see Creating New Apps with the Framework Template.


Modify the Sample Code with a Gear VR App ID

The Avatar SDK Samples folder contains a Gear VR version of our Rift Mirror sample. Because this sample uses Oculus Platform calls, you must add your own Gear VR app ID to the sample code. This app ID must be from a Gear VR app owned by your developer organization, and your Oculus user must be subscribed to at least one release channel in that app.

1. Copy the contents of C:\dev\OVRAvatarSDK\Samples\MirrorAndroid to C:\dev\ovr_sdk_mobile\VrSamples\Native\mirror\Src.
2. Change #define APP_ID "1221388694657274" in Src\OvrApp.cpp so that it contains the Gear VR app ID of an app that belongs to your developer organization.

Modify the Android.mk Makefile

We need to modify the Android.mk makefile with the paths to our sources and our Avatar and Platform SDK library files.

1. Locate the Android.mk file in C:\dev\ovr_sdk_mobile\VrSamples\Native\mirror\Projects\Android\jni.
2. Modify the contents of Android.mk as follows:

LOCAL_PATH := $(call my-dir)
include $(CLEAR_VARS)
include ../../../../../cflags.mk

LOCAL_MODULE           := ovrapp
LOCAL_SRC_FILES        := ../../../Src/OvrApp.cpp ../../../Src/AvatarManager.cpp
LOCAL_STATIC_LIBRARIES := vrsound vrmodel vrlocale vrgui vrappframework libovrkernel
LOCAL_SHARED_LIBRARIES := vrapi libovrplatformloader libovravatarloader

include $(BUILD_SHARED_LIBRARY)

$(call import-module,LibOVRKernel/Projects/AndroidPrebuilt/jni)
$(call import-module,VrApi/Projects/AndroidPrebuilt/jni)
$(call import-module,VrAppFramework/Projects/AndroidPrebuilt/jni)
$(call import-module,VrAppSupport/VrGUI/Projects/AndroidPrebuilt/jni)
$(call import-module,VrAppSupport/VrLocale/Projects/AndroidPrebuilt/jni)
$(call import-module,VrAppSupport/VrModel/Projects/AndroidPrebuilt/jni)
$(call import-module,VrAppSupport/VrSound/Projects/AndroidPrebuilt/jni)
$(call import-module,../OVRPlatformSDK/Android/jni)
$(call import-module,../OVRAvatarSDK/Android/jni)

Build and Launch the Project

Run C:\dev\ovr_sdk_mobile\VrSamples\Native\mirror\Projects\Android\build.bat to build and launch the app on your device.


Avatar SDK Guide

This document describes how to install, configure, and use the Oculus Avatar SDK. The Avatar SDK consists of native C++ and Unity documentation, samples, plugins, source code, and libraries to help developers implement Oculus Avatars in their own VR experiences.

Unity Features

Loading Personalized Avatars

You can replace the default blue avatar with a personalized avatar using the Oculus Platform package. The base Avatar SDK OvrAvatar.cs class is already set up to load the avatar specifications of users, but we need to call Oculus Platform functions to request valid user IDs. After getting a user ID, we can set the oculusUserID of the avatar accordingly. The timing is important, because this has to happen before the Start() function in OvrAvatar.cs gets called.

Note: For security reasons, Oculus Avatars and Oculus Platform must be initialized with a valid App ID before accessing user ID information. You can create a new application and obtain an App ID from the developer dashboard. For more information, see Oculus Platform Setup.


The example below shows one way of doing this. It defines a new class called PlatformManager and extends our existing Getting Started sample. When run, it replaces the default blue avatar with the personalized avatar of the user logged on to Oculus Home.

1. Import the Oculus Platform SDK Unity package into your Unity project.
2. Specify valid App IDs for both the Oculus Avatars and Oculus Platform plugins:
   a. Click Oculus Avatars > Edit Configuration and paste your Oculus Rift App Id or Gear VR App Id into the field.
   b. Click Oculus Platform > Edit Settings and paste your Oculus Rift App Id or Gear VR App Id into the field.
3. Create an empty game object named PlatformManager:
   a. Click GameObject > Create Empty.
   b. Rename the game object PlatformManager.
4. Click Add Component, enter New Script in the search field, and then select New Script.
5. Name the new script PlatformManager and set Language to C Sharp.
6. Copy and save the following text as Assets\PlatformManager.cs.

using UnityEngine;
using Oculus.Avatar;
using Oculus.Platform;
using Oculus.Platform.Models;
using System.Collections;

public class PlatformManager : MonoBehaviour {

    public OvrAvatar myAvatar;

    void Awake () {
        Oculus.Platform.Core.Initialize();
        Oculus.Platform.Users.GetLoggedInUser().OnComplete(GetLoggedInUserCallback);
        Oculus.Platform.Request.RunCallbacks(); // avoids race condition with OvrAvatar.cs Start()
    }

    private void GetLoggedInUserCallback(Message<User> message) {
        if (!message.IsError) {
            myAvatar.oculusUserID = message.Data.ID;
        }
    }
}

7. In the Unity Editor, select PlatformManager from the Hierarchy. The My Avatar field appears in the Inspector.
8. Drag LocalAvatar from the Hierarchy to the My Avatar field.

Handling Multiple Personalized Avatars

If you have a multi-user scene where each avatar has different personalizations, you probably already have the user IDs of all the users in your scene, because you had to retrieve that data to invite them in the first place. Set the oculusUserID for each user's avatar accordingly.

If your scene contains multiple avatars of the same person, you can iterate through all the avatar objects in the scene to change all their oculusUserID values. For example, the LocalAvatar and RemoteLoopback sample scenes both contain two avatars of the same player. Here is an example of how to modify the callback of our PlatformManager class to personalize the avatars in the sample scenes:

using UnityEngine;
using Oculus.Avatar;
using Oculus.Platform;
using Oculus.Platform.Models;
using System.Collections;

public class PlatformManager : MonoBehaviour {

    void Awake () {
        Oculus.Platform.Core.Initialize();
        Oculus.Platform.Users.GetLoggedInUser().OnComplete(GetLoggedInUserCallback);
        Oculus.Platform.Request.RunCallbacks(); // avoids race condition with OvrAvatar.cs Start()
    }

    private void GetLoggedInUserCallback(Message<User> message) {
        if (!message.IsError) {
            OvrAvatar[] avatars = FindObjectsOfType(typeof(OvrAvatar)) as OvrAvatar[];
            foreach (OvrAvatar avatar in avatars) {
                avatar.oculusUserID = message.Data.ID;
            }
        }
    }
}

Avatar Prefabs

The Avatar Unity package contains two prefabs for avatars: LocalAvatar and RemoteAvatar. They are located in OvrAvatar > Content > Prefabs. The difference between LocalAvatar and RemoteAvatar is in the driver, the control mechanism behind avatar movements. The LocalAvatar driver is the OvrAvatarDriver script, which derives avatar movement from the logged-in user's Touch controllers and HMD. The RemoteAvatar driver is the OvrAvatarRemoteDriver script, which gets its avatar movement from the packet recording and playback system.

Sample Scenes

There are four sample scenes in the Avatar Unity package:

• Controllers: Demonstrates how first-person avatars can be used to enhance the sense of presence for Touch users.
• GripPoses: A helper scene for creating custom grip poses. See Custom Touch Grip Poses.
• LocalAvatar: Demonstrates the capabilities of both first-person and third-person avatars. Does not yet include microphone voice visualization or loading an Avatar Specification using Oculus Platform.
• RemoteLoopback: Demonstrates the avatar packet recording and playback system. See Recording and Playing Back Avatar Pose Updates.

Reducing Draw Calls with the Combine Meshes Option

Each avatar in your scene requires 11 draw calls per eye per frame (22 total). The Combine Meshes option reduces this to 3 draw calls per eye (6 total) by combining all the mesh parts into a single mesh. This is an important performance gain for Gear VR, as most apps need to stay within a draw call budget of 50 to 100 draw calls per frame. Without this option, just having 4 avatars in your scene would use most or all of that budget. You should almost always select this option when using avatars. The only drawback is that you can no longer access mesh parts individually, but that is a rare use case.


Custom Touch Grip Poses

The GripPoses sample lets you change the hand poses by rotating the finger joints until you get the pose you want. You can then save these finger joint positions as a Unity prefab that you can load at a later time. In this example, we pose the left hand to make it look like a scissors or bunny rabbit gesture.

Creating the left hand pose:

1. Open the Samples > GripPoses > GripPoses scene.
2. Click Play.
3. Press E to select the Rotate transform tool.
4. In the Hierarchy window, expand LocalAvatar > hand_left > LeftHandPoseEditHelp > hands_l_hand_world > hands:b_l_hand.
5. Locate all the joints of the fingers you want to adjust. Joint 0 is closest to the palm; subsequent joints are toward the fingertip. To adjust the pinky finger joints, for example, expand hands:b_l_pinky0 > hands:b_l_pinky1 > hands:b_l_pinky2 > hands:b_l_pinky3.
6. In the Hierarchy window, select the joint you want to rotate.


7. In the Scene window, click a rotation orbit and drag the joint to the desired angle.

8. Repeat steps 6 and 7 until you achieve the desired pose.

Saving the left hand pose:

1. In the Hierarchy window, drag hands_l_hand_world to the Project window.
2. In the Project window, rename this transform to something descriptive, for example: poseBunnyRabbitLeft.

Using the left hand pose:

1. In the Hierarchy window, select LocalAvatar.
2. Drag poseBunnyRabbitLeft from the Project window to the Left Hand Custom Pose field in the Inspector window.
3. Click Play again. The left hand is now frozen in our custom bunny grip pose.

Settings for Rift Stand-alone Builds

To make Rift avatars appear in stand-alone executable builds, we need to change two settings:

• Add the Avatar shaders to the Always Included Shaders list in your project settings:


1. Click Edit > Project Settings > Graphics.
2. Under Always Included Shaders, add 3 to the Size value and then press Enter.
3. Add the following shader elements: AvatarSurfaceShader, AvatarSurfaceShaderPBS, AvatarSurfaceShaderSelfOccluding.

• Build as a 64-bit application:

1. Click File > Build Settings.
2. Set Architecture to x86_64.

Making Rift Hands Interact with the Environment

To allow avatars to interact with objects in their environment, use the OVRGrabber and OVRGrabbable components. For a working example, see the AvatarWithGrab sample scene included in the Oculus Unity Sample Framework.

Adding C++ Avatar Support

This guide outlines Avatar SDK support for a C/C++ game engine or application. The source code samples used in this guide are taken from the Mirror demo, available in OVRAvatarSDK\Samples\Mirror.

To add Avatar support to your Visual C++ project:

1. Add Oculus Platform support to your project (https://developer.oculus.com/documentation/platform/latest/concepts/dg-development/).
2. Open the project's Properties > Configuration Properties > VC++ Directories page.
3. In Include Directories, add the location of the Avatar SDK include folder (InstallFolder\include).
4. In Library Directories, add the location of the Avatar SDK library folder (InstallFolder\Windows).
5. Add the Avatar library file as linker input:
   a. Expand Linker > Input.
   b. In Additional Dependencies, add InstallFolder\Windows\libovravatar.lib.
6. Add the #include directives for the Platform and Avatar SDK headers.
7. Initialize the Platform module with your app ID through ovr_PlatformInitializeWindows(appID).
8. Initialize the Oculus SDK through ovr_Initialize.
9. Compile the Oculus Avatar OpenGL fragment and vertex reference shaders into a shader program.
10. Initialize the Avatar module through ovrAvatar_Initialize(appID).

Avatar Message Queue

The functions ovrAvatar_RequestAvatarSpecification() and ovrAvatarAsset_BeginLoading() are asynchronous. The avatar message queue contains the results of these operations. You can retrieve the most recent message with ovrAvatarMessage_Pop(). After you finish processing a message on the queue, be sure to call ovrAvatarMessage_Free() to free the memory used by the message.


Rendering Avatar Components

Avatars are composed of avatar components (body, base, hands, controller), which are themselves composed of render parts. Each Oculus user has an Avatar Specification that indicates the mesh and texture assets that need to be loaded to recreate the avatar. Our Mirror.cpp example code demonstrates the entire process and includes helper functions, prefixed with _, that we have written to make it easier to render complete avatars.

The complete process goes something like this:

1. Retrieve the avatar specification for the Oculus user: ovrAvatar_RequestAvatarSpecification(userID);
2. Set the avatar capabilities: ovrAvatar_Create(message->avatarSpec, ovrAvatarCapability_All);
3. Iterate through the avatar specification to load the static avatar assets (mesh and textures) into the avatar: ovrAvatar_GetReferencedAsset(_avatar);
4. Apply the vertex transforms to determine the position of the avatar component.
5. Apply the material states to determine the appearance of the avatar component.
6. For each render part of an avatar component:
   a. Get the OpenGL mesh data and tell the renderer to use the Avatar shader program you compiled earlier.
   b. Calculate the inputs on the vertex uniforms.
   c. Set the view position, the world matrix, the view matrix, and the array of mesh poses.
   d. Transform everything in the joint hierarchy.
   e. Set the material state.
   f. Draw the mesh, depth first so that it self-occludes.
   g. Render to the color buffer.
7. When there are no more components to render, the avatar render is complete.

Visible Controllers

To render avatar hands without controllers:

ovrAvatar_SetLeftControllerVisibility(_avatar, 0);
ovrAvatar_SetRightControllerVisibility(_avatar, 0);

To render avatar hands with controllers:

ovrAvatar_SetLeftControllerVisibility(_avatar, 1);
ovrAvatar_SetRightControllerVisibility(_avatar, 1);

Setting a Custom Touch Grip Pose

You can pass your own custom transforms to the hand pose functions or use our cube and sphere preset hand poses. Here is an example of a custom pose made by freezing the hands in their current pose:

const ovrAvatarHandComponent* handComp = ovrAvatarPose_GetLeftHandComponent(_avatar);
const ovrAvatarComponent* comp = handComp->renderComponent;
const ovrAvatarRenderPart* renderPart = comp->renderParts[0];
const ovrAvatarRenderPart_SkinnedMeshRender* meshRender = ovrAvatarRenderPart_GetSkinnedMeshRender(renderPart);
ovrAvatar_SetLeftHandCustomGesture(_avatar, meshRender->skinnedPose.jointCount, meshRender->skinnedPose.jointTransform);

handComp = ovrAvatarPose_GetRightHandComponent(_avatar);


comp = handComp->renderComponent;
renderPart = comp->renderParts[0];
meshRender = ovrAvatarRenderPart_GetSkinnedMeshRender(renderPart);
ovrAvatar_SetRightHandCustomGesture(_avatar, meshRender->skinnedPose.jointCount, meshRender->skinnedPose.jointTransform);

To pose the hands as if to grip cubes:

ovrAvatar_SetLeftHandGesture(_avatar, ovrAvatarHandGesture_GripCube);
ovrAvatar_SetRightHandGesture(_avatar, ovrAvatarHandGesture_GripCube);

To pose the hands as if to grip spheres:

ovrAvatar_SetLeftHandGesture(_avatar, ovrAvatarHandGesture_GripSphere);
ovrAvatar_SetRightHandGesture(_avatar, ovrAvatarHandGesture_GripSphere);

To unfreeze the hand poses:

ovrAvatar_SetLeftHandGesture(_avatar, ovrAvatarHandGesture_Default);
ovrAvatar_SetRightHandGesture(_avatar, ovrAvatarHandGesture_Default);

Voice Visualization

Voice visualization is an avatar component. It is created as a projection on top of an existing mesh.

Create the microphone:

ovrMicrophoneHandle mic = ovr_Microphone_Create();
if (mic) {
    ovr_Microphone_Start(mic);
}

Pass an array of voice samples to ovrAvatarPose_UpdateVoiceVisualization():

float micSamples[48000];
size_t sampleCount = ovr_Microphone_ReadData(mic, micSamples, sizeof(micSamples) / sizeof(micSamples[0]));
if (sampleCount > 0) {
    ovrAvatarPose_UpdateVoiceVisualization(_avatar, (uint32_t)sampleCount, micSamples);
}

The render parts of the voice visualization component are of the ProjectorRender type.

Pose Recording and Playback

The Avatar SDK contains a complete avatar pose recording and playback system. You can save pose data to packets at regular intervals and then transmit these packets to a remote computer to drive the avatar poses there.

Pose Recording

Call ovrAvatarPacket_BeginRecording() to begin recording:

ovrAvatarPacket_BeginRecording(_avatar);


After you record as many frames' worth of pose changes as you want, stop the recording with ovrAvatarPacket_EndRecording() and then write your packet out with ovrAvatarPacket_Write():

ovrAvatarPacket* recordedPacket = ovrAvatarPacket_EndRecording(_avatar);

// Write the packet to a byte buffer to exercise the packet writing code
uint32_t packetSize = ovrAvatarPacket_GetSize(recordedPacket);
uint8_t* packetBuffer = (uint8_t*)malloc(packetSize);
ovrAvatarPacket_Write(recordedPacket, packetSize, packetBuffer);
ovrAvatarPacket_Free(recordedPacket);

Transmit your data to your destination using your own network code.

Pose Playback

To read your pose data back into packets:

// Read the buffer back into a packet
playbackPacket = ovrAvatarPacket_Read(packetSize, packetBuffer);
free(packetBuffer);

To play the packets back:

float packetDuration = ovrAvatarPacket_GetDurationSeconds(packet);
*packetPlaybackTime += deltaSeconds;
if (*packetPlaybackTime > packetDuration) {
    ovrAvatarPose_Finalize(avatar, 0.0f);
    *packetPlaybackTime = 0;
}
ovrAvatar_UpdatePoseFromPacket(avatar, packet, *packetPlaybackTime);

The playback routine uses the elapsed time deltaSeconds to interpolate a tween pose in case the frames on the remote computer are offset by a different amount.


Avatar SDK C/C++ Developer Reference

The Oculus Avatar SDK Developer Reference contains detailed information about the data structures and files included with the SDK. See the Oculus Avatar SDK Reference Manual 1.13.
