
Avatar SDK Version 1.24


Copyrights and Trademarks

© 2017 Oculus VR, LLC. All Rights Reserved.

OCULUS VR, OCULUS, and RIFT are trademarks of Oculus VR, LLC. (C) Oculus VR, LLC. All rights reserved. BLUETOOTH is a registered trademark of Bluetooth SIG, Inc. All other trademarks are the property of their respective owners. Certain materials included in this publication are reprinted with the permission of the copyright holder.


Contents

Avatar SDK Getting Started Guide
    Unity (Rift) Getting Started
    Unity (Gear VR) Getting Started
    Native C/C++ (Rift) Getting Started
        Rendering Avatars
        Translating Touch Controllers To Avatar Movements
    Native C/C++ (Gear VR) Getting Started
    Unreal (Rift) Getting Started

Avatar Developer Guide
    Using Unity Features
        Sample Scenes
        Loading Personalized Avatars
        Avatar Prefabs
        Reducing Draw Calls with the Combine Meshes Option
        Custom Touch Grip Poses
        Grabbing Objects with Rift Hands
        Making Rift Stand-alone Builds
        Getting the Position of Avatar Components
    Unity Social Starter Example
    Adding C++ Avatar Support
        Rendering Avatar Components
        Voice Visualization
        Pose Recording and Playback

Avatar SDK C/C++ Developer Reference


Avatar SDK Getting Started Guide

Welcome to Oculus Avatars, a powerful and flexible way to increase presence within VR apps. Avatars deliver true social presence with persistent identities you can take with you across the Oculus ecosystem. Bring other people's avatars into your app, game, or experience so they can feel like themselves and even recognize their friends.

Avatars also provide hand presence for Touch controllers, letting you integrate Oculus Touch interaction into Rift apps. Integrating Touch lets your users interact with their environment and enhances their perception of themselves within your virtual world.

In the Oculus Avatars SDK download, you get:

• a Unity package with scripts, prefabs, art assets, and sample scenes for PC and Gear VR development.
• native C/C++ samples, header files, and libraries for PC and Gear VR development.
• an Unreal plugin with sample code for PC development.

Add Social Presence, Touch Interaction, or Both

As a developer, you are free to mix and match Avatars SDK features. You can use it to provide Touch interaction in one app and social interaction in another. The SDK is extensible and customizable.

Unity (Rift) Getting Started

The Avatar Unity package contains several prefabs you can drop into your existing Unity projects. This tutorial shows you how to start using them.

Set Up the Unity Project for Oculus VR and Avatars

1. Create a new project in Unity named "Unity Avatar Demo Project".


2. There are two ways to import the Oculus APIs into the Unity Editor. You can either:

   • Navigate to the Oculus Integration page and select Import.
   • In the Editor, select the Asset Store tab, search for 'Oculus Integration', and select Import.

   Note: We recommend importing the complete integration package. This enables the core Oculus APIs, the Platform and Avatar APIs, and the Social Starter sample scene. Read about the Social Starter, a sample scene that demonstrates how the Avatar and Platform APIs complement each other to create an engaging social experience.

3. Select the Virtual Reality Supported check box in Edit > Project Settings > Player.
4. Delete Main Camera from your scene and then drag OVRCameraRig from OVR > Prefabs.
5. Reset the transform on OVRCameraRig.

Note: You may ignore any No Oculus Rift App ID warnings you see during development. While an App ID is required to retrieve Oculus avatars for specific users, you can prototype and test experiences that make use of Touch and Avatars with just the default blue avatar.

Adding an Avatar to the Scene

The LocalAvatar prefab renders the player's avatar and hands. You can choose which parts of the avatar you want to render: body, hands, and Touch controllers.

To render avatar hands with controllers:

1. Drag OvrAvatar > Content > Prefabs > LocalAvatar to the Unity Hierarchy window.
2. In the Unity Inspector window, select the Start With Controllers check box.

Click Play to test. Try out the built-in hand poses and animations by playing with the Touch controllers.


To render avatar hands without controllers:

1. In the Hierarchy window, select LocalAvatar.
2. In the Inspector window, clear the Start With Controllers check box.

Click Play to test. Squeeze and release the grips and triggers on the Touch controllers and observe how the finger joints transform to change hand poses. It is possible to add hand poses outside the range of these movements; we talk more about this in Custom Touch Grip Poses.


To render an avatar body:

1. In the Hierarchy window, select LocalAvatar.
2. In the Inspector window, select the Show Third Person check box.
3. Change Transform > Position to X:0 Y:0 Z:1.5.
4. Change Transform > Rotation to X:0 Y:180 Z:0.


Recording and Playing Back Avatar Pose Updates

The avatar packet recording system saves avatar movement data as packets you can send across a network to play back on a remote system. Let's take a quick tour of the RemoteLoopbackManager script. Open the RemoteLoopback scene in OvrAvatar > Samples > RemoteLoopback.

We set RecordPackets to true to start the avatar packet recording system. We also subscribe to the PacketRecorded event so that we can trigger other actions each time a packet is recorded.

    void Start()
    {
        LocalAvatar.RecordPackets = true;
        LocalAvatar.PacketRecorded += OnLocalAvatarPacketRecorded;
    }
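The sample itself never tears the avatar down, but if your own scene disables recording or destroys the listener, it is good C# event hygiene to unsubscribe again. A minimal sketch (not part of the sample script):

    void OnDestroy()
    {
        LocalAvatar.PacketRecorded -= OnLocalAvatarPacketRecorded;
    }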


Each time a packet is recorded, our code places the packet into a memory stream we are using as a stand-in for a real network layer.

    void OnLocalAvatarPacketRecorded(object sender, OvrAvatar.PacketEventArgs args)
    {
        using (MemoryStream outputStream = new MemoryStream())
        {
            BinaryWriter writer = new BinaryWriter(outputStream);
            writer.Write(packetSequence++);
            args.Packet.Write(outputStream);
            SendPacketData(outputStream.ToArray());
        }
    }

The remainder of our code receives the packet from the memory stream for playback on our loopback avatar object.

    void SendPacketData(byte[] data)
    {
        ReceivePacketData(data);
    }

    void ReceivePacketData(byte[] data)
    {
        using (MemoryStream inputStream = new MemoryStream(data))
        {
            BinaryReader reader = new BinaryReader(inputStream);
            int sequence = reader.ReadInt32();
            OvrAvatarPacket packet = OvrAvatarPacket.Read(inputStream);
            LoopbackAvatar.GetComponent<OvrAvatarRemoteDriver>().QueuePacket(sequence, packet);
        }
    }
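In a real application, SendPacketData() would hand these bytes to your networking layer instead of looping them back locally. As a rough sketch only (the UdpClient field, host, and port below are placeholders for your own transport, not part of the Avatar SDK), the same byte array could be sent over UDP:

    using System.Net.Sockets;

    UdpClient udpClient = new UdpClient();  // hypothetical transport field
    string remoteHost = "192.0.2.1";        // placeholder address
    int remotePort = 9000;                  // placeholder port

    void SendPacketData(byte[] data)
    {
        // Send the same serialized bytes the loopback version handed to ReceivePacketData().
        udpClient.Send(data, data.Length, remoteHost, remotePort);
    }

On the receiving side, you would pass the received byte array to ReceivePacketData() unchanged.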

Unity (Gear VR) Getting Started

The Avatar Unity package contains several prefabs you can drop into your existing Unity projects. This tutorial shows you how to start using them.

Set Up the Unity Project for Oculus VR and Avatars

The setup includes importing the Oculus Unity packages and also setting up Unity for Android development and debugging.

1. Create a new project in Unity named gearvr-avatar.
2. Click File > Build Settings and select Android. Download and install Unity Android Support and then restart Unity if necessary.
3. Click Switch Platform to switch to the Android platform.

4. Click Add Open Scenes.
5. Set Texture Compression to ASTC.
6. Click Edit > Project Settings > Player, click the little Android Settings robot, and then set the following options:

   a. Select the Virtual Reality Supported check box.
   b. In Bundle Identifier, enter a unique package name.
   c. Set Minimum API Level to Android 5.0 'Lollipop' (API level 21).
   d. Set Install Location to Automatic.

7. There are two ways to import the Oculus APIs into the Unity Editor. You can either:

   • Navigate to the Oculus Integration page and select Import.


   • In the Editor, select the Asset Store tab, search for 'Oculus Integration', and select Import.

   Note: We recommend importing the complete integration package. This enables the core Oculus APIs, the Platform and Avatar APIs, and the Social Starter sample scene. Read about the Social Starter, a sample scene that demonstrates how the Avatar and Platform APIs complement each other to create an engaging social experience.

8. Connect your Android device to your computer.
9. Create an Oculus Signature File for your Android device at https://dashboard.oculus.com/tools/osig-generator/ and then copy it to the folder gearvr-avatar/Assets/Plugins/Android/assets. Create this folder if it doesn't exist.

Adding the VR Camera

Because the avatar has a default height of 170 cm, we must raise our VR camera rig to the same height.

1. Delete Main Camera from your scene and then drag OVRCameraRig from OVR > Prefabs.
2. Set the Position transform on OVRCameraRig to X:0, Y:1.70, Z:0.

Adding an Avatar

As the player cannot see his or her own Gear VR avatar, Gear VR avatars should all be of the "third person" type. To make sure the avatar is visible, we can place the avatar 50 cm in front of the camera and rotate the avatar 180 degrees so that its front faces us.

Note: The "local" in the prefab name "LocalAvatar" refers to how the avatar object gets its motion data. "Local" means the avatar object is driven by the local headset orientation.

1. Drag OvrAvatar > Content > Prefabs > LocalAvatar to the Hierarchy window.
2. In the Inspector, clear the Show First Person check box and select the Show Third Person check box.
3. Select the Combine Meshes check box. This reduces total draw calls per frame per avatar from 22 to 6. Gear VR apps typically need to stay within 50 to 100 draw calls per frame.
4. Set the Position transform on LocalAvatar to X:0, Y:0, Z:0.50.
5. Set the Rotation transform on LocalAvatar to X:0, Y:180, Z:0.
6. Click File > Build & Run to build an .apk from this scene and have Unity launch it on your Android device.


Add an Avatar with the Gear VR Controller

In addition to the steps above, select the Start With Controllers check box in the Inspector. If a Gear VR Controller is connected as the main controller, the controller is rendered in the scene with the corresponding hand animations.

What to Explore Next?

• Loading Personalized Avatars: see Using Unity Features for instructions on how to modify the sample scenes to retrieve Oculus user IDs and display personalized avatars.
• Recording and Playing Back Avatar Pose Updates: build our RemoteLoopback example scene and read the accompanying write-up in our Unity (Rift) Getting Started topic.


Native C/C++ (Rift) Getting Started

Get started using Oculus Avatars in your own native Rift code by experimenting with our sample Visual Studio 2013 C++ solution.

Download the Oculus Avatars SDK

The SDK is packaged in a .zip archive file on our developer website.

1. Download the Oculus Avatars SDK from https://developer.oculus.com/downloads/package/oculus-avatar-sdk/.
2. Extract the contents of the .zip archive file to your local drive.

OpenGL is Required

The current version of the Avatar SDK contains only OpenGL shaders.

Running the Mirror Sample on Microsoft Visual Studio 2013

Our Mirror sample serves as a good foundation for implementing avatars in your own code. Let's take a tour of its features and its code.

To set up the Microsoft Visual Studio 2013 solution for our Mirror sample:

1. Download and install CMake from https://cmake.org/download.
2. In Windows Explorer, locate the OVRAvatarSDK\Samples folder and double-click generate_projects.cmd.
3. Wait for the script to finish creating the VS2013 folder and solution.
4. Open and build the solution: Samples\VS2013\Mirror.sln.
5. Press F5 to start debugging.

Key Bindings

The Mirror sample illustrates several features of the Avatar SDK by letting you toggle them:

    Press...   to...
    1          show/hide the avatar body.
    2          show/hide the hands.
    3          show/hide the base cone.
    4          show/hide the voice visualization.
    c          show/hide the Touch controllers.
    f          freeze/unfreeze the current hand pose.
    s          set the hand pose to 'grip sphere'.
    u          set the hand pose to 'grip cube'.
    j          show/hide the joint lines.
    r          start avatar packet recording. Press 'r' again to play recorded packets in a loop.


Exploring the Sample Code

Open the Mirror.cpp file and follow along from main. Our tour explores only the portions of code specific to Oculus avatars.

Rendering Avatars

We compile the Avatar vertex and fragment shaders for our regular shader and our physically based shader (PBS) using a helper function, _compileProgramFromFiles:

    _skinnedMeshProgram =
        _compileProgramFromFiles("AvatarVertexShader.glsl", "AvatarFragmentShader.glsl",
                                 sizeof(errorBuffer), errorBuffer);
    ...
    _skinnedMeshPBSProgram =
        _compileProgramFromFiles("AvatarVertexShader.glsl", "AvatarFragmentShaderPBS.glsl",
                                 sizeof(errorBuffer), errorBuffer);

Retrieving Avatar Data From a User Profile

The appearance of every person's avatar is stored in his or her Oculus user profile as an Avatar Specification. The Avatar Specification identifies the meshes and textures that make up a person's avatar.

Before we retrieve this specification data, we have to initialize both the Platform SDK and the Avatar SDK using our app ID. To get your own app ID, see Developer Dashboard and Oculus App Setup.

    #define MIRROR_SAMPLE_APP_ID "958062084316416"
    ...
    ovrPlatformInitializeWindows(MIRROR_SAMPLE_APP_ID);
    ...
    ovrAvatar_Initialize(MIRROR_SAMPLE_APP_ID);

Avatar Specifications are indexed by Oculus user ID. An app has easy access to the Oculus user ID of the currently logged-in user.

Tip: If you wanted to create a social experience, you would write code to share user IDs between instances of your app so that you could load and render the appearance of other avatars too.

    ovrID userID = ovr_GetLoggedInUserID();
    ovrAvatar_RequestAvatarSpecification(userID);

The function ovrAvatar_RequestAvatarSpecification() is asynchronous. We use a message queue to determine when the function has finished obtaining our data (ovrAvatarMessageType_AvatarSpecification).

    // Pump avatar messages
    while (ovrAvatarMessage* message = ovrAvatarMessage_Pop())
    {
        switch (ovrAvatarMessage_GetType(message))
        {
            case ovrAvatarMessageType_AvatarSpecification:
                _handleAvatarSpecification(ovrAvatarMessage_GetAvatarSpecification(message));
                break;
            case ovrAvatarMessageType_AssetLoaded:
                _handleAssetLoaded(ovrAvatarMessage_GetAssetLoaded(message));
                break;
        }
        ovrAvatarMessage_Free(message);
    }

With the Avatar Specification in hand, we then use our helper function _handleAvatarSpecification to:

• create an avatar instance (ovrAvatar_Create).
• load all the relevant avatar assets into that instance.


Loading avatar assets is also asynchronous, and we again rely on popping our message queue to determine when an asset for an avatar has finished loading (ovrAvatarMessageType_AssetLoaded).

    static void _handleAvatarSpecification(const ovrAvatarMessage_AvatarSpecification* message)
    {
        // Create the avatar instance
        _avatar = ovrAvatar_Create(message->avatarSpec, ovrAvatarCapability_All);

        // Trigger load operations for all of the assets referenced by the avatar
        uint32_t refCount = ovrAvatar_GetReferencedAssetCount(_avatar);
        for (uint32_t i = 0; i < refCount; ++i)
        {
            ovrAvatarAssetID id = ovrAvatar_GetReferencedAsset(_avatar, i);
            ovrAvatarAsset_BeginLoading(id);
            ++_loadingAssets;
        }
        printf("Loading %d assets...\r\n", _loadingAssets);
    }

    static void _handleAssetLoaded(const ovrAvatarMessage_AssetLoaded* message)
    {
        // Determine the type of the asset that got loaded
        ovrAvatarAssetType assetType = ovrAvatarAsset_GetType(message->asset);
        void* data = nullptr;

        // Call the appropriate loader function
        switch (assetType)
        {
            case ovrAvatarAssetType_Mesh:
                data = _loadMesh(ovrAvatarAsset_GetMeshData(message->asset));
                break;
            case ovrAvatarAssetType_Texture:
                data = _loadTexture(ovrAvatarAsset_GetTextureData(message->asset));
                break;
        }

        // Store the data that we loaded for the asset in the asset map
        _assetMap[message->assetID] = data;
        --_loadingAssets;
        printf("Loading %d assets...\r\n", _loadingAssets);
    }

Rendering the Avatar

Our sample code is called Mirror, and it calls the avatar rendering helper function _renderAvatar() twice. The first call renders a first-person avatar. A first-person avatar can depict the player's hands and world position.

    // If we have the avatar and have finished loading assets, render it
    if (_avatar && !_loadingAssets)
    {
        _renderAvatar(_avatar, ovrAvatarVisibilityFlag_FirstPerson, view, proj, eyeWorld, renderJoints);

The second call renders a third-person avatar, transformed so that it faces you as if looking in a mirror. A third-person avatar can depict hands, body, and base cone.

        glm::vec4 reflectionPlane = glm::vec4(0.0, 0.0, -1.0, 0.0);
        glm::mat4 reflection = _computeReflectionMatrix(reflectionPlane);

        glFrontFace(GL_CW);
        _renderAvatar(_avatar, ovrAvatarVisibilityFlag_ThirdPerson, view * reflection, proj,
                      glm::vec3(reflection * glm::vec4(eyeWorld, 1.0f)), renderJoints);
        glFrontFace(GL_CCW);
    }

Hiding and Displaying Avatar Capabilities

When we first created our avatar instance, we created it with all capabilities active:

    _avatar = ovrAvatar_Create(message->avatarSpec, ovrAvatarCapability_All);


You can enable different capabilities by calling ovrAvatar_SetActiveCapabilities(). In our sample, we toggle different capabilities in real time using the bit masks defined in ovrAvatarCapabilities:

    case '1':
        capabilities ^= ovrAvatarCapability_Body;
        ovrAvatar_SetActiveCapabilities(_avatar, static_cast<ovrAvatarCapabilities>(capabilities));
        break;
    case '2':
        capabilities ^= ovrAvatarCapability_Hands;
        ovrAvatar_SetActiveCapabilities(_avatar, static_cast<ovrAvatarCapabilities>(capabilities));
        break;
    case '3':
        capabilities ^= ovrAvatarCapability_Base;
        ovrAvatar_SetActiveCapabilities(_avatar, static_cast<ovrAvatarCapabilities>(capabilities));
        break;
    case '4':
        capabilities ^= ovrAvatarCapability_Voice;
        ovrAvatar_SetActiveCapabilities(_avatar, static_cast<ovrAvatarCapabilities>(capabilities));
        break;

Translating Touch Controllers To Avatar Movements

Our sample code translates Touch controller input into avatar movements in two parts:

1. Processing the Touch inputs
2. Updating the avatar

Processing the Touch Controller Inputs

We translate the position and orientation of the head-mounted display and the left and right Touch controllers to avatar body and hand positions using our helper function _ovrAvatarTransformFromGlm(). We call our helper function _ovrAvatarHandInputStateFromOvr() to translate the various Touch button, trigger, and touch states.

    // Convert the OVR inputs into Avatar SDK inputs
    ovrInputState touchState;
    ovr_GetInputState(ovr, ovrControllerType_Active, &touchState);
    ovrTrackingState trackingState = ovr_GetTrackingState(ovr, 0.0, false);

    glm::vec3 hmdP   = _glmFromOvrVector(trackingState.HeadPose.ThePose.Position);
    glm::quat hmdQ   = _glmFromOvrQuat(trackingState.HeadPose.ThePose.Orientation);
    glm::vec3 leftP  = _glmFromOvrVector(trackingState.HandPoses[ovrHand_Left].ThePose.Position);
    glm::quat leftQ  = _glmFromOvrQuat(trackingState.HandPoses[ovrHand_Left].ThePose.Orientation);
    glm::vec3 rightP = _glmFromOvrVector(trackingState.HandPoses[ovrHand_Right].ThePose.Position);
    glm::quat rightQ = _glmFromOvrQuat(trackingState.HandPoses[ovrHand_Right].ThePose.Orientation);

    ovrAvatarTransform hmd;
    _ovrAvatarTransformFromGlm(hmdP, hmdQ, glm::vec3(1.0f), &hmd);

    ovrAvatarTransform left;
    _ovrAvatarTransformFromGlm(leftP, leftQ, glm::vec3(1.0f), &left);

    ovrAvatarTransform right;
    _ovrAvatarTransformFromGlm(rightP, rightQ, glm::vec3(1.0f), &right);

    ovrAvatarHandInputState inputStateLeft;
    _ovrAvatarHandInputStateFromOvr(left, touchState, ovrHand_Left, &inputStateLeft);

    ovrAvatarHandInputState inputStateRight;
    _ovrAvatarHandInputStateFromOvr(right, touchState, ovrHand_Right, &inputStateRight);


Updating the Avatar

Everything that can be changed has a related update function in the SDK. Our helper function _updateAvatar() calls the individual pose update functions:

    ovrAvatarPose_UpdateBody(avatar, hmd);
    ovrAvatarPose_UpdateHands(avatar, left, right);

It also closes the update by finalizing the updates to the avatar's pose with a timestamp, deltaSeconds. This timestamp is used for avatar playback and recording, as discussed in Pose Recording and Playback.

    ovrAvatarPose_Finalize(avatar, deltaSeconds);

Native C/C++ (Gear VR) Getting Started

Get started using Oculus Avatars in native Gear VR code by creating an avatar project using the Native Application Framework Template.

Download and Prepare Oculus SDKs

Our SDKs are packaged in .zip files on our developer website.

1. Download the Oculus Avatars SDK from https://developer.oculus.com/downloads/package/oculus-avatar-sdk/ and then extract the contents to C:\dev.
2. Download the Oculus Mobile SDK from https://developer.oculus.com/downloads/package/oculus-mobile-sdk/, extract the contents to C:\dev, and then rename the ovr_sdk_mobile_x.x.x folder to ovr_sdk_mobile.
3. Download the Oculus Platform SDK from https://developer.oculus.com/downloads/package/oculus-platform-sdk/, extract the contents to C:\dev, and then rename the OVRPlatformSDK_vx.x.x folder to OVRPlatformSDK.
4. Save the text below as a text file named C:\dev\OVRPlatformSDK\Android\jni\Android.mk:

    LOCAL_PATH := $(call my-dir)

    include $(CLEAR_VARS)

    LOCAL_MODULE := libovrplatformloader
    LOCAL_SRC_FILES := ../libs/$(TARGET_ARCH_ABI)/$(LOCAL_MODULE).so
    LOCAL_EXPORT_C_INCLUDES := $(LOCAL_PATH)/../../Include

    ifneq (,$(wildcard $(LOCAL_PATH)/$(LOCAL_SRC_FILES)))
    include $(PREBUILT_SHARED_LIBRARY)
    endif

Create a New App Using the Application Framework

Use the Native Application Framework Template to create a new Gear VR project called mirror and place your Android device OSIG file inside the assets folder.

1. Run these commands from a Windows command prompt:

    cd C:\dev\ovr_sdk_mobile\VrSamples\Native\VrTemplate
    make_new_project.bat mirror oculus

2. Connect your Android device to your computer.
3. Create an Oculus Signature File for your Android device at https://dashboard.oculus.com/tools/osig-generator/ and then copy it to the folder C:\dev\ovr_sdk_mobile\VrSamples\Native\mirror\assets.

For more information, see Creating New Apps with the Framework Template.


Modify the Sample Code with a Gear VR App ID

The Avatar SDK Samples folder contains a Gear VR version of our Rift Mirror sample. Because this sample uses Oculus Platform calls, you must add your own Gear VR app ID to the sample code.

Note: This app ID must be from a Gear VR app owned by your developer organization, and your Oculus user must be subscribed to at least one release channel in that app.

1. Copy the contents of C:\dev\OVRAvatarSDK\Samples\MirrorAndroid to C:\dev\ovr_sdk_mobile\VrSamples\Native\mirror\Src.
2. Change #define APP_ID "1221388694657274" in Src\OvrApp.cpp so that it contains the Gear VR app ID of an app that belongs to your developer organization.

Modify the Android.mk Makefile

We need to modify the Android.mk makefile with the paths to our sources and our Avatar and Platform SDK library files.

1. Locate the Android.mk file in C:\dev\ovr_sdk_mobile\VrSamples\Native\mirror\Projects\Android\jni.
2. Modify the contents of Android.mk as follows:

    LOCAL_PATH := $(call my-dir)

    include $(CLEAR_VARS)

    include ../../../../../cflags.mk

    LOCAL_MODULE           := ovrapp
    LOCAL_SRC_FILES        := ../../../Src/OvrApp.cpp ../../../Src/AvatarManager.cpp
    LOCAL_STATIC_LIBRARIES := vrsound vrmodel vrlocale vrgui vrappframework libovrkernel
    LOCAL_SHARED_LIBRARIES := vrapi libovrplatformloader libovravatarloader

    include $(BUILD_SHARED_LIBRARY)

    $(call import-module,LibOVRKernel/Projects/AndroidPrebuilt/jni)
    $(call import-module,VrApi/Projects/AndroidPrebuilt/jni)
    $(call import-module,VrAppFramework/Projects/AndroidPrebuilt/jni)
    $(call import-module,VrAppSupport/VrGUI/Projects/AndroidPrebuilt/jni)
    $(call import-module,VrAppSupport/VrLocale/Projects/AndroidPrebuilt/jni)
    $(call import-module,VrAppSupport/VrModel/Projects/AndroidPrebuilt/jni)
    $(call import-module,VrAppSupport/VrSound/Projects/AndroidPrebuilt/jni)
    $(call import-module,../OVRPlatformSDK/Android/jni)
    $(call import-module,../OVRAvatarSDK/Android/jni)

Build and Launch the Project

Run C:\dev\ovr_sdk_mobile\VrSamples\Native\mirror\Projects\Android\build.bat to build and launch the app on your device.


Unreal (Rift) Getting Started

The Oculus Avatar SDK for Unreal Beta download file contains an Unreal Engine (UE) C++ sample project illustrating and implementing all the features available to Oculus Avatars in UE. The example project demonstrates:

• using avatar classes to create and destroy UE avatar objects.
• changing hand poses to custom hand poses and rendering Touch controllers.
• recording local avatar movement packets and replaying the packets back on remote avatars (including voice visualizations).

Note: Oculus Avatars for UE are for C++ projects. A Blueprints version is not available at this time.


Requirements

• Unreal Editor 4.15 (the Avatar SDK Unreal beta is currently supported only in version 4.15)
• Microsoft Visual Studio 2015 with C++
• Oculus Avatar SDK for Unreal Beta from https://developer.oculus.com/downloads/package/oculus-avatar-sdk-unreal-beta/

Architecture of a UE Avatar Project

Oculus Avatars for UE are implemented as a plugin. Avatars are embodied within UOvrAvatar ActorComponents that you can attach to the UE actors you desire. This lets you keep your game-side code separate from our avatar implementation.

Some of the files and folders in our example project and their primary functions include:

• Config/DefaultInput.ini - contains the avatar input settings for Touch controllers.
• Config/DefaultEngine.ini - contains your app ID and adds Oculus Platform as a subsystem.
• Content/Avatars/ - contains the material assets used by the avatars.
• Plugins/ - contains Oculus Avatars implemented as an Unreal plugin.
• Source/LocalAvatar.cpp and RemoteAvatar.cpp - contain the "game-side" classes that demonstrate how to attach avatar components to actor classes.
• AvatarSamples.uproject - enables the OvrAvatar plugin.

Launching the Avatar Samples Unreal Project

1. Extract the contents of the Oculus Avatar SDK for Unreal Beta .zip file.
2. Launch AvatarSamples.uproject.
3. Click Play > VR Preview.
4. Wear your Rift.

You should see the hands of your avatar. This first-person view where you only see your hands is referred to as the local avatar.

Note: The Oculus Avatar SDK Unreal Beta sample project might have reflection effects that are not appropriate for VR rendering techniques. To fix this, select the Forward Shading check box in Project Settings > Engine > Rendering > Forward Renderer.


The code that spawns your first-person avatar is in LocalAvatar.cpp:

    ALocalAvatar::ALocalAvatar()
    {
        RootComponent = CreateDefaultSubobject<USceneComponent>(TEXT("LocalAvatarRoot"));
        PrimaryActorTick.bCanEverTick = true;
        AutoPossessPlayer = EAutoReceiveInput::Player0;
        BaseEyeHeight = 170.f;

        AvatarComponent = CreateDefaultSubobject<UOvrAvatar>(TEXT("LocalAvatar"));
        AvatarComponent->SetVisibilityType(ovrAvatarVisibilityFlag_FirstPerson);
        AvatarComponent->SetPlayerHeightOffset(BaseEyeHeight / 100.f);
    }

Spawning and Destroying Remote Avatars

Squeeze the right Touch trigger to spawn avatars in a circle around you. Squeeze the left Touch trigger to destroy them. These third-person avatars with hands, heads, and base cones represent other people and are called remote avatars.


The remote avatars in this sample mimic your movements because we hooked them up to our avatar packet recording and playback system. This system records both your movements and your microphone amplitude, letting us transmit them to remote avatars and animate them accordingly. Speak or sing to see the voice animations on the remote avatars.

The packet recording is handled by ALocalAvatar::UpdatePacketRecording(float DeltaTime) in LocalAvatar.cpp. Packet playback on remote avatars is handled by ARemoteAvatar::Tick in RemoteAvatar.cpp.

You might notice a small delay between your local avatar movements and the corresponding movement in the remote avatars. This is an artificial delay we added to the sample to simulate network latency.

To toggle packet recording and playback, press A on your right Touch.

Creating Custom Hand Poses

Press the thumbsticks to cycle through the following hand poses:


• a built-in pose for gripping a sphere:

      AvatarComponent->SetRightHandPose(ovrAvatarHandGesture_GripSphere);

• a built-in pose for gripping a cube:

      AvatarComponent->SetRightHandPose(ovrAvatarHandGesture_GripCube);

• a custom hand gesture built from an array of joint transforms, gAvatarRightHandTrans:

      AvatarComponent->SetCustomGesture(ovrHand_Right, gAvatarRightHandTrans, HAND_JOINTS);

• a built-in pose depicting Touch controllers:

      AvatarComponent->SetRightHandPose(ovrAvatarHandGesture_Default);
      AvatarComponent->SetControllerVisibility(ovrHand_Right, true);

The code snippets above from LocalAvatar.cpp set the poses for the right hand. For the left hand, substitute the appropriate left hand functions and constants.

Detaching and Moving Hands Independent of Tracking

Press Y on the left Touch or B on the right Touch to detach the avatar hands from Touch tracking. You can then use the thumbsticks to drive the avatar hand movements. The following code in LocalAvatar.cpp detaches the hands:

    AvatarHands[ovrHand_Right] = AvatarComponent->DetachHand(ovrHand_Right);

ALocalAvatar::DriveHand drives the hand movement after detaching.

Adding Avatars to an Existing Project

Copy the Plugins folder to the root folder of your project. It contains our OvrAvatar plugin.

Update your project's Config/DefaultInput.ini file with content from the sample project's Config/DefaultInput.ini file.

Update the Modules and Plugins sections of your .uproject file with additional items. Remember to add a comma (,) to the last item in any existing Modules or Plugins sections before pasting the additional lines:

    "Modules": [
        {
            "AdditionalDependencies": [
                "Engine",
                "OnlineSubsystem",
                "OnlineSubsystemUtils"
            ]
        }
    ],
    "Plugins": [
        {
            "Name": "OnlineSubsystemOculus",
            "Enabled": true
        },
        {
            "Name": "OvrAvatar",
            "Enabled": true
        }
    ]

Update your project's Config/DefaultEngine.ini file with the following:

    [OnlineSubsystem]
    DefaultPlatformService=Oculus

    [OnlineSubsystemOculus]
    bEnabled=true
    OculusAppId=YOUR_APP_ID

Get YOUR_APP_ID from the Oculus Developer Dashboard: https://dashboard.oculus.com

In your code, the local user actor must implement the SetupPlayerInputComponent function, as the component needs controller input sent to it to animate the hands properly. A set of macros defines these repetitive functions. Two things to consider:

• The macros depend on the component member variable being named AvatarComponent.
• They stub out member functions for the Actor class. You also need to replace the ALocalAvatar:: entries with the name of your own Actor class.

    // LocalAvatar.cpp
    void ALocalAvatar::SetupPlayerInputComponent(UInputComponent* Input)
    {
        Super::SetupPlayerInputComponent(Input);

    #define INPUT_ENTRY(entry, hand, flag) \
        Input->BindAction(#entry, IE_Pressed, this, &ALocalAvatar::##entry##_Pressed); \
        Input->BindAction(#entry, IE_Released, this, &ALocalAvatar::##entry##_Released);
        INPUT_COMMAND_TUPLE
    #undef INPUT_ENTRY

    #define AXIS_ENTRY(entry, hand, flag) \
        Input->BindAxis(#entry, this, &ALocalAvatar::##entry##_Value);
        AXIS_INPUT_TUPLE
    #undef AXIS_ENTRY

    #define CUSTOM_ENTRY(entry, hand, field, invert) \
        Input->BindAxis(#entry, this, &ALocalAvatar::##entry##_Value);
        CUSTOM_AXIS_TUPLE
    #undef CUSTOM_ENTRY
    }

    #define CUSTOM_ENTRY(entry, hand, field, invert) \
        void ALocalAvatar::##entry##_Value(float value) { AvatarComponent->##entry##_Value(value); }
        CUSTOM_AXIS_TUPLE
    #undef CUSTOM_ENTRY

    #define INPUT_ENTRY(entry, hand, flag) \
        void ALocalAvatar::##entry##_Pressed() { AvatarComponent->##entry##_Pressed(); } \
        void ALocalAvatar::##entry##_Released() { AvatarComponent->##entry##_Released(); }
        INPUT_COMMAND_TUPLE
    #undef INPUT_ENTRY

    #define AXIS_ENTRY(entry, hand, flag) \
        void ALocalAvatar::##entry##_Value(float value) { AvatarComponent->##entry##_Value(value); }
        AXIS_INPUT_TUPLE
    #undef AXIS_ENTRY

Note in LocalAvatar.h where these functions are declared:

    private:
    #define INPUT_ENTRY(entry, hand, flag) \
        void entry##_Pressed(); \
        void entry##_Released();
        INPUT_COMMAND_TUPLE
    #undef INPUT_ENTRY

    #define AXIS_ENTRY(entry, hand, flag) \
        void entry##_Value(float value);
        AXIS_INPUT_TUPLE
    #undef AXIS_ENTRY

    #define CUSTOM_ENTRY(entry, hand, field, invert) \
        void entry##_Value(float value);
        CUSTOM_AXIS_TUPLE
    #undef CUSTOM_ENTRY

Place your request to fetch the avatar wherever you have set up online login functionality. For example:

    void ALocalAvatar::OnLoginComplete(int32 LocalUserNum, bool bWasSuccessful,
                                       const FUniqueNetId& UserId, const FString& Error)
    {
        IOnlineIdentityPtr OculusIdentityInterface = Online::GetIdentityInterface();
        OculusIdentityInterface->ClearOnLoginCompleteDelegate_Handle(0, OnLoginCompleteDelegateHandle);

        if (AvatarComponent)
        {
            AvatarComponent->RequestAvatar(10149999027226798);
        }
    }


Avatar Developer Guide

This document describes how to install, configure, and use the Oculus Avatar SDK. The Avatar SDK consists of native C++ and Unity documentation, samples, plugins, source code, and libraries to help developers implement Oculus Avatars in their own VR experiences.

Using Unity Features

These topics describe the contents and features of Oculus Avatars for Unity development.

Sample Scenes

There are five sample scenes in the Avatar Unity package:

• Controllers: demonstrates how first-person avatars can be used to enhance the sense of presence for Touch users.
• GripPoses: a helper scene for creating custom grip poses. See Custom Touch Grip Poses.
• LocalAvatar: demonstrates the capabilities of both first-person and third-person avatars. Does not yet include microphone voice visualization or loading an Avatar Specification using Oculus Platform.
• RemoteLoopback: demonstrates the avatar packet recording and playback system. See Recording and Playing Back Avatar Pose Updates.
• Social Starter: demonstrates using Oculus Avatars together with other Oculus Platform features such as invites, peer-to-peer networking, and VoIP. See Unity Social Starter Example.

Loading Personalized Avatars

You can replace the default blue avatar with a personalized avatar using the Oculus Platform package. The base Avatar SDK OvrAvatar.cs class is already set up to load the avatar specifications of users, but we need to call Oculus Platform functions to get valid user IDs. After getting a user ID, we can then set the oculusUserID of the avatar accordingly. The timing is important, because we have to set the user ID before the Start() function in OvrAvatar.cs gets called.

Note: For security reasons, Oculus Avatars and Oculus Platform must be initialized with a valid App ID before accessing user ID information. You can create a new application and obtain an App ID from the developer dashboard. For more information, see Oculus Platform Setup.


The example below shows one way of doing this. It defines a new class called PlatformManager to extend our existing Unity (Rift) Getting Started sample. After modifying the sample with our new class, the Avatar SDK shows you the personalized avatar of the current Oculus Home user instead of the default blue avatar.

1. Import the Oculus Platform SDK Unity package into your Unity project.
2. Specify valid App IDs for both the Oculus Avatars and Oculus Platform plugins:
   a. Click Oculus Avatars > Edit Configuration and paste your Oculus Rift App Id or Gear VR App Id into the field.
   b. Click Oculus Platform > Edit Settings and paste your Oculus Rift App Id or Gear VR App Id into the field.
3. Create an empty game object named PlatformManager:
   a. Click GameObject > Create Empty.
   b. Rename the game object PlatformManager.
4. Click Add Component, enter New Script in the search field, and then select New Script.
5. Name the new script PlatformManager and set Language to C Sharp.
6. Save the text below as Assets\PlatformManager.cs:

    using UnityEngine;
    using Oculus.Avatar;
    using Oculus.Platform;
    using Oculus.Platform.Models;
    using System.Collections;

    public class PlatformManager : MonoBehaviour
    {
        public OvrAvatar myAvatar;

        void Awake()
        {
            Oculus.Platform.Core.Initialize();
            Oculus.Platform.Users.GetLoggedInUser().OnComplete(GetLoggedInUserCallback);
            Oculus.Platform.Request.RunCallbacks();  // avoids race condition with OvrAvatar.cs Start()
        }

        private void GetLoggedInUserCallback(Message<User> message)
        {
            if (!message.IsError)
            {
                myAvatar.oculusUserID = message.Data.ID;
            }
        }
    }

7. In the Unity Editor, select PlatformManager from the Hierarchy. The My Avatar field appears in the Inspector.
8. Drag LocalAvatar from the Hierarchy to the My Avatar field.

Handling Multiple Personalized Avatars

In a multi-user scene where each avatar has different personalizations, you already have the user IDs of all the users in your scene because you had to retrieve that data to invite them in the first place. Set the oculusUserID for each user's avatar accordingly.

If your scene contains multiple avatars of the same person, such as in our LocalAvatar and RemoteLoopback sample scenes, you can iterate through all the avatar objects in the scene to change all their oculusUserID values. Here is an example of how to modify the callback of our PlatformManager class to personalize the avatars in those two sample scenes:

    using UnityEngine;
    using Oculus.Avatar;
    using Oculus.Platform;
    using Oculus.Platform.Models;
    using System.Collections;

    public class PlatformManager : MonoBehaviour
    {
        void Awake()
        {
            Oculus.Platform.Core.Initialize();
            Oculus.Platform.Users.GetLoggedInUser().OnComplete(GetLoggedInUserCallback);
            Oculus.Platform.Request.RunCallbacks();  // avoids race condition with OvrAvatar.cs Start()
        }

        private void GetLoggedInUserCallback(Message<User> message)
        {
            if (!message.IsError)
            {
                OvrAvatar[] avatars = FindObjectsOfType(typeof(OvrAvatar)) as OvrAvatar[];
                foreach (OvrAvatar avatar in avatars)
                {
                    avatar.oculusUserID = message.Data.ID;
                }
            }
        }
    }

Avatar Prefabs

The Avatar Unity package contains two prefabs for avatars: LocalAvatar and RemoteAvatar. They are located in OvrAvatar > Content > Prefabs. The difference between LocalAvatar and RemoteAvatar is in the driver, the control mechanism behind avatar movements.

The LocalAvatar driver is the OvrAvatarDriver script, which derives avatar movement from the logged-in user's controllers and HMD. The RemoteAvatar driver is the OvrAvatarRemoteDriver script, which gets its avatar movement from the packet recording and playback system.
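For example, a networked app might instantiate a RemoteAvatar-style prefab for each player who joins a session and personalize it before its Start() runs. This is only a sketch; the prefab field and the OnPlayerJoined() callback are assumptions standing in for your own session code, not SDK APIs:

    public OvrAvatar remoteAvatarPrefab;  // assign the RemoteAvatar prefab in the Inspector

    void OnPlayerJoined(ulong joinedUserId)  // hypothetical callback from your session layer
    {
        OvrAvatar remote = Instantiate(remoteAvatarPrefab);
        remote.oculusUserID = joinedUserId;  // set before OvrAvatar.Start() runs so it personalizes
    }

Packets received from that player are then queued on the instance's OvrAvatarRemoteDriver, as shown in the RemoteLoopback walkthrough.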


Reducing Draw Calls with the Combine Meshes Option

Each avatar in your scene requires 11 draw calls per eye per frame (22 total). The Combine Meshes option reduces this to 3 draw calls per eye (6 total) by combining all the mesh parts into a single mesh. This is an important performance gain for Gear VR, as most apps typically need to stay within a draw call budget of 50 to 100 draw calls per frame. Without this option, just having 4 avatars in your scene would use most or all of that budget (4 × 22 = 88 draw calls).

You should almost always select this option when using avatars. The only drawback is that you can no longer access mesh parts individually, but that is a rare use case.

Custom Touch Grip Poses

The GripPoses sample lets you change the hand poses by rotating the finger joints until you get the pose you want. You can then save these finger joint positions as a Unity prefab that you can load at a later time.

In this example, we pose the left hand to make it look like a scissors or bunny rabbit gesture.

Creating the left hand pose:

1. Open the Samples > GripPoses > GripPoses scene.
2. Click Play.
3. Press E to select the Rotate transform tool.
4. In the Hierarchy window, expand LocalAvatar > hand_left > LeftHandPoseEditHelp > hands_l_hand_world > hands:b_l_hand.


5. Locate all the joints of the fingers you want to adjust. Joint 0 is closest to the palm; subsequent joints are towards the fingertip. To adjust the pinky finger joints, for example, expand hands:b_l_pinky0 > hands:b_l_pinky1 > hands:b_l_pinky2 > hands:b_l_pinky3.
6. In the Hierarchy window, select the joint you want to rotate.

7. In the Scene window, click a rotation orbit and drag the joint to the desired angle.


8. Repeat these two steps until you achieve the desired pose.

Saving the left hand pose:

1. In the Hierarchy window, drag hands_l_hand_world to the Project window.
2. In the Project window, rename this transform to something descriptive, for example: poseBunnyRabbitLeft.

Using the left hand pose:

1. In the Hierarchy window, select LocalAvatar.
2. Drag poseBunnyRabbitLeft from the Project window to the Left Hand Custom Pose field in the Inspector window.

Click Play again. You see that the left hand is now frozen in our custom bunny grip pose.

Grabbing Objects with Rift Hands

To let avatars interact with objects in their environment, use the OVRGrabber and OVRGrabbable components. For a working example, see the AvatarWithGrab sample scene included in the Oculus Unity Sample Framework.
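A minimal setup sketch, assuming a scene object that already has a Collider and that OVRGrabber components sit on the avatar's hands as in the sample scene. The script itself is ours, not part of the SDK:

    using UnityEngine;

    public class MakeGrabbable : MonoBehaviour
    {
        void Start()
        {
            // Give the object physics so an OVRGrabber on the avatar's hand can pick it up.
            gameObject.AddComponent<Rigidbody>();
            gameObject.AddComponent<OVRGrabbable>();
        }
    }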

Making Rift Stand-alone Builds

To make Rift avatars appear in stand-alone executable builds, we need to change two settings.

• Add the Avatar shaders to the Always Included Shaders list in your project settings:
  1. Click Edit > Project Settings > Graphics.
  2. Under Always Included Shaders, increase the Size value by 3 and then press Enter.
  3. Add the following shader elements: AvatarSurfaceShader, AvatarSurfaceShaderPBS, AvatarSurfaceShaderSelfOccluding.

• Build as a 64-bit application:
  1. Click File > Build Settings.
  2. Set Architecture to x86_64.


Getting the Position of Avatar Components

You can use our accessor functions to get the transforms for the avatar hands and mouth without having to walk the hierarchy. You can specify the hand and joint you want and then use GetHandTransform() to get its transform:

    public Transform GetHandTransform(HandType hand, HandJoint joint)

The enums for HandType are:

• Right
• Left

The enums for HandJoint are:

• HandBase
• IndexBase
• IndexTip
• ThumbBase
• ThumbTip

You can also get the forward and up directions of an avatar hand as vectors so you know where the avatar hand is pointing. Use GetPointingDirection(). Forward and up are perpendicular to each other.

    public void GetPointingDirection(HandType hand, ref Vector3 forward, ref Vector3 up)

To get the transform of the avatar's mouth, use GetMouthTransform(). This is useful when you want to spatialize avatar speech as point-source audio located at the mouth.

    public Transform GetMouthTransform()
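The short MonoBehaviour below sketches all three accessors together. It assumes the HandType and HandJoint enums are nested in OvrAvatar and that you assign the fields in the Inspector; check OvrAvatar.cs in your SDK version for the exact signatures:

    using UnityEngine;

    public class AvatarProbes : MonoBehaviour
    {
        public OvrAvatar avatar;        // the avatar to query
        public AudioSource voiceAudio;  // spatialized source for this avatar's voice

        void LateUpdate()
        {
            // Track the right index fingertip.
            Transform tip = avatar.GetHandTransform(OvrAvatar.HandType.Right,
                                                    OvrAvatar.HandJoint.IndexTip);

            // Draw the pointing direction of the right hand for debugging.
            Vector3 forward = Vector3.zero, up = Vector3.zero;
            avatar.GetPointingDirection(OvrAvatar.HandType.Right, ref forward, ref up);
            if (tip != null)
                Debug.DrawRay(tip.position, forward, Color.green);

            // Keep the point-source voice audio on the mouth.
            Transform mouth = avatar.GetMouthTransform();
            if (mouth != null)
                voiceAudio.transform.position = mouth.position;
        }
    }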

Unity Social Starter Example

The Social Starter example scene demonstrates using Oculus Avatars together with other Oculus Platform features such as invites, peer-to-peer networking, and VoIP.

This scene depicts a virtual room into which you can invite your friends who are also running the scene. As your friends join your room, the scene sets up a VoIP connection to transmit voice packets and a peer-to-peer connection to transmit avatar packets.

Requirements

This scene requires the following Unity packages:

• Oculus Unity Utilities
• Oculus Avatar SDK
• Oculus Platform SDK

Because this scene uses Oculus Platform features, you must paste an Oculus App ID in both the Oculus Platform and the Oculus Avatars settings. For more information, see Using Unity Features.


Demonstrating the Social Features

In the middle of the room is a sphere. The sphere is white if you are in an online room or black if the room creation fails for any reason. The floor is blue if you are the owner of the room, green if you have joined someone else's room.

Things you can do:

• To send an invitation, press X on your Touch controller or Back on your Gear VR controller.
• To view the scene from a different camera, press Y on your Touch or click the trackpad on your Gear VR controller.
• To move and turn your avatar around the room, use the thumbsticks on your Touch or the trackpad on your Gear VR controller.
• To turn off the help text, click the thumbstick on your left Touch or pull the trigger on your Gear VR controller.

Adding C++ Avatar Support

This guide outlines Avatar SDK support with a C/C++ game engine or application. The source code samples used in this guide are taken from the Mirror demo, available in OVRAvatarSDK\Samples\Mirror.

To add Avatar support to your Visual C++ project:

1. Add Oculus Platform support to your project (https://developer.oculus.com/documentation/platform/latest/concepts/pgsg-native-gsg/).
2. Open the project's Properties > Configuration Properties > VC++ Directories page.
3. In Include Directories, add the location of the Avatar SDK includes folder (InstallFolder\include).
4. In Library Directories, add the location of the Avatar SDK library folder (InstallFolder\Windows).
5. Add the Avatar library file as linker input:
   a. Expand Linker > Input.
   b. In Additional Dependencies, add InstallFolder\Windows\libovravatar.lib.
6. Add #include <OVR_Platform.h> and #include <OVR_Avatar.h>.
7. Initialize the Platform module using your app ID through ovr_PlatformInitializeWindows(appID).
8. Initialize the Oculus SDK through ovr_Initialize.
9. Compile the Oculus Avatar OpenGL fragment and vertex reference shaders into a shader program.
10. Initialize the Avatar module through ovrAvatar_Initialize(appID).

Avatar Message Queue

The functions ovrAvatar_RequestAvatarSpecification() and ovrAvatarAsset_BeginLoading() are asynchronous. The avatar message queue contains the results of these operations. You can retrieve the most recent message with ovrAvatarMessage_Pop(). After you finish processing a message on the queue, be sure to call ovrAvatarMessage_Free() to free the memory used by the message.


Rendering Avatar Components

Avatars are composed of avatar components (body, base, hands, controller), which are themselves composed of render parts. Each Oculus user has an Avatar Specification that indicates the mesh and texture assets that need to be loaded to recreate the avatar. Our Mirror.cpp example code contains good examples of the entire process and includes helper functions, prefixed with _, that we have written to make it easier to render complete avatars.

The complete process goes something like this:

1. Retrieve the avatar specification for the Oculus user: ovrAvatar_RequestAvatarSpecification(userID);
2. Set the avatar capabilities: ovrAvatar_Create(message->avatarSpec, ovrAvatarCapability_All);
3. Iterate through the avatar specification to load the static avatar assets (mesh and textures) into the avatar: ovrAvatar_GetReferencedAsset(_avatar);
4. Apply the vertex transforms to determine the position of the avatar component.
5. Apply the material states to determine the appearance of the avatar component.
6. For each render part of an avatar component:
   a. Get the OpenGL mesh data and tell the renderer to use the Avatar shader program you compiled earlier.
   b. Calculate the inputs on the vertex uniforms.
   c. Set the view position, the world matrix, the view matrix, and the array of mesh poses.
   d. Transform everything in the joint hierarchy.
   e. Set the material state.
   f. Draw the mesh, depth first so that it self-occludes.
   g. Render to the color buffer.
7. When there are no more components to render, the avatar render is complete.

Rendering Controllers

You can render Touch and Gear VR Controllers with the user's avatar.

Rift Applications

To render avatar hands without controllers:

    ovrAvatar_SetLeftControllerVisibility(_avatar, 0);
    ovrAvatar_SetRightControllerVisibility(_avatar, 0);

To render avatar hands with controllers:

    ovrAvatar_SetLeftControllerVisibility(_avatar, 1);
    ovrAvatar_SetRightControllerVisibility(_avatar, 1);

Gear VR Applications

When using the VrAppFramework, there is a bone count limitation which makes us unable to render hands, so you can only render the controller model. After you've created the avatar, turn off the hands:

    ovrAvatar_SetLeftHandVisibility( sAvatar, false );
    ovrAvatar_SetRightHandVisibility( sAvatar, false );


Then, turn on the controller visibility:

    ovrAvatar_SetRightControllerVisibility( sAvatar, true );
    ovrAvatar_SetLeftControllerVisibility( sAvatar, true );

In your file that manages the avatar:

    void AvatarManager::UpdateTrackedControllerInputState(
        const ovrPosef& headPose,
        const ovrInputStateTrackedRemote& remoteInputState,
        const ovrTracking& trackingState,
        const ovrInputTrackedRemoteCapabilities& capabilities,
        const bool isActive )

Finally, see the VrApi Input API for information about detecting and sampling the controller properly.

Setting a Custom Touch Grip Pose

You can pass your own custom transforms to the hand pose functions or use our cube and sphere preset hand poses. Here is an example of a custom pose made from freezing the hands in their current pose:

    const ovrAvatarHandComponent* handComp = ovrAvatarPose_GetLeftHandComponent(_avatar);
    const ovrAvatarComponent* comp = handComp->renderComponent;
    const ovrAvatarRenderPart* renderPart = comp->renderParts[0];
    const ovrAvatarRenderPart_SkinnedMeshRender* meshRender =
        ovrAvatarRenderPart_GetSkinnedMeshRender(renderPart);
    ovrAvatar_SetLeftHandCustomGesture(_avatar,
                                       meshRender->skinnedPose.jointCount,
                                       meshRender->skinnedPose.jointTransform);

    handComp = ovrAvatarPose_GetRightHandComponent(_avatar);
    comp = handComp->renderComponent;
    renderPart = comp->renderParts[0];
    meshRender = ovrAvatarRenderPart_GetSkinnedMeshRender(renderPart);
    ovrAvatar_SetRightHandCustomGesture(_avatar,
                                        meshRender->skinnedPose.jointCount,
                                        meshRender->skinnedPose.jointTransform);

To pose the hands as if to grip cubes:

    ovrAvatar_SetLeftHandGesture(_avatar, ovrAvatarHandGesture_GripCube);
    ovrAvatar_SetRightHandGesture(_avatar, ovrAvatarHandGesture_GripCube);

To pose the hands as if to grip spheres:

    ovrAvatar_SetLeftHandGesture(_avatar, ovrAvatarHandGesture_GripSphere);
    ovrAvatar_SetRightHandGesture(_avatar, ovrAvatarHandGesture_GripSphere);

To unfreeze the hand poses:

    ovrAvatar_SetLeftHandGesture(_avatar, ovrAvatarHandGesture_Default);
    ovrAvatar_SetRightHandGesture(_avatar, ovrAvatarHandGesture_Default);

Voice Visualization

Voice visualization is an avatar component. It is created as a projection on top of an existing mesh.

Create the microphone:

    ovrMicrophoneHandle mic = ovr_Microphone_Create();
    if (mic)
    {
        ovr_Microphone_Start(mic);
    }

Pass an array of voice samples to ovrAvatarPose_UpdateVoiceVisualization():

    float micSamples[48000];
    size_t sampleCount = ovr_Microphone_ReadData(mic, micSamples,
                                                 sizeof(micSamples) / sizeof(micSamples[0]));
    if (sampleCount > 0)
    {
        ovrAvatarPose_UpdateVoiceVisualization(_avatar, (uint32_t)sampleCount, micSamples);
    }

The render parts of the voice visualization component are a ProjectorRender type.

Pose Recording and Playback

The Avatar SDK contains a complete avatar pose recording and playback system. You can save pose data to packets at regular intervals and then transmit these packets to a remote computer to drive the avatar poses there.

Pose Recording

Call ovrAvatarPacket_BeginRecording() to begin recording:

    ovrAvatarPacket_BeginRecording(_avatar);

After you record as many frames' worth of pose changes as you want, stop the recording with ovrAvatarPacket_EndRecording() and then write your packet out with ovrAvatarPacket_Write():

    ovrAvatarPacket* recordedPacket = ovrAvatarPacket_EndRecording(_avatar);

    // Write the packet to a byte buffer to exercise the packet writing code
    uint32_t packetSize = ovrAvatarPacket_GetSize(recordedPacket);
    uint8_t* packetBuffer = (uint8_t*)malloc(packetSize);
    ovrAvatarPacket_Write(recordedPacket, packetSize, packetBuffer);
    ovrAvatarPacket_Free(recordedPacket);

Transmit your data to your destination using your own network code.

Pose Playback

To read your pose data back into packets:

    // Read the buffer back into a packet
    playbackPacket = ovrAvatarPacket_Read(packetSize, packetBuffer);
    free(packetBuffer);

To play the packets back:

    float packetDuration = ovrAvatarPacket_GetDurationSeconds(packet);
    *packetPlaybackTime += deltaSeconds;
    if (*packetPlaybackTime > packetDuration)
    {
        ovrAvatarPose_Finalize(avatar, 0.0f);
        *packetPlaybackTime = 0;
    }
    ovrAvatar_UpdatePoseFromPacket(avatar, packet, *packetPlaybackTime);

The playback routine uses the timestamp deltaSeconds to interpolate a tween pose in case the frames on the remote computer are offset by a different amount.


Avatar SDK C/C++ Developer Reference

The Oculus Avatar SDK Developer Reference contains detailed information about the data structures and files included with the SDK. See the Oculus Avatar SDK Reference Manual 1.24.
