
Avatar SDK

Version 1.15.0


Copyrights and Trademarks© 2017 Oculus VR, LLC. All Rights Reserved.

OCULUS VR, OCULUS, and RIFT are trademarks of Oculus VR, LLC. (C) Oculus VR, LLC. All rights reserved.BLUETOOTH is a registered trademark of Bluetooth SIG, Inc. All other trademarks are the property of theirrespective owners. Certain materials included in this publication are reprinted with the permission of thecopyright holder.


Contents

Avatar SDK Getting Started Guide
    Unity (Rift) Getting Started
    Unity (Gear VR) Getting Started
    Native C/C++ (Rift) Getting Started
        Rendering Avatars
        Translating Touch Controllers To Avatar Movements
    Native C/C++ (Gear VR) Getting Started
    Unreal (Rift) Getting Started
Avatar SDK Guide
    Unity Features
    Adding C++ Avatar Support
        Rendering Avatar Components
        Voice Visualization
        Pose Recording and Playback
Avatar SDK C/C++ Developer Reference
Avatar SDK Documentation Archive


Avatar SDK Getting Started Guide

Welcome to Oculus Avatars, a powerful and flexible way to increase presence within VR apps.

Avatars deliver true social presence with persistent identities you can take with you across the Oculus ecosystem. Bring other people's avatars into your app, game, or experience so they can feel like themselves and even recognize their friends.

Avatars also provide hand presence for Touch controllers, letting you integrate Oculus Touch interaction into Rift apps. Integrating Touch lets your users interact with their environment and enhances their perception of themselves within your virtual world.

Add Social Presence, Touch Interaction, or Both

As a developer, you are free to mix and match Avatar SDK features. You can use it to provide Touch interaction in one app and social interaction in another. The SDK is extensible and customizable and includes:

• a Unity package with scripts, prefabs, art assets, and sample scenes for PC and Gear VR development.
• native C/C++ samples, header files, and libraries for PC and Gear VR development.
• an Unreal plugin with sample code for PC development.

Unity (Rift) Getting Started

The Avatar Unity package contains several prefabs you can drop into your existing Unity projects. This tutorial shows you how to start using them.

Download the Oculus Avatars SDK

The SDK is packaged in a .zip archive file on our developer website.

1. Download the Oculus Avatars SDK .zip from https://developer.oculus.com/downloads/.


2. If you do not have the Oculus Utilities for Unity 5, download its .zip file too.
3. Extract the contents of the .zip archive files to your local drive.

Set Up Unity for Oculus Avatar Development

To set up, import the Oculus Unity packages into a project.

1. Create a New Project in Unity named "Unity Avatar Demo Project".
2. Import the Oculus Avatars (OvrAvatar.unityPackage) and Oculus Utilities (OculusUtilities.unitypackage) packages. For each package:

   a. Click Assets > Import Package > Custom Package.
   b. Select the package file from your local drive.
   c. Click All and then click Import.

3. Select the Virtual Reality Supported check box in Edit > Project Settings > Player.
4. Delete Main Camera from your scene and then drag OVRCameraRig from OVR > Prefabs.
5. Reset the transform on OVRCameraRig.

Note: You may ignore any No Oculus Rift App ID warnings you see during development. While an App ID is required to retrieve Oculus avatars for specific users, you can prototype and test experiences that make use of Touch and Avatars with just the default blue avatar.

Adding an Avatar to the Scene

The LocalAvatar prefab renders the player's avatar and hands. Check box options in the Inspector let you choose which parts of the avatar you want to render: body, hands, and Touch controllers.

To render avatar hands with controllers:

1. Drag OvrAvatar > Content > Prefabs > LocalAvatar to the Hierarchy window.
2. In the Inspector window, select the Start With Controllers check box.

Click Play to test. Experiment with the built-in hand poses and animations by playing with the Touch controllers.

To render avatar hands:

1. In the Hierarchy window, select LocalAvatar.
2. In the Inspector window, clear the Start With Controllers check box.


Click Play to test. Note how the finger joints transform to change hand poses as you squeeze and release the grips and triggers on the Touch controllers. You might sometimes want to use hand poses outside of these movements and we talk more about this in Custom Grip Poses.

To render the avatar body:

1. In the Hierarchy window, select LocalAvatar.
2. In the Inspector window, select the Show Third Person check box.
3. Change Transform > Position to X:0 Y:0 Z:1.5.
4. Change Transform > Rotation to X:0 Y:180 Z:0.

Recording and Playing Back Avatar Pose Updates

The avatar packet recording system saves avatar movement data as packets you can send across a network to play back on a remote system.

To see a demonstration, open the RemoteLoopback scene in OvrAvatar > Samples > RemoteLoopback.

Let us have a look at the RemoteLoopbackManager script.


Setting RecordPackets to true starts the avatar packet recording system. We also subscribe to the event handler PacketRecorded so that we can do something useful each time a packet is recorded.

void Start () {
    LocalAvatar.RecordPackets = true;
    LocalAvatar.PacketRecorded += OnLocalAvatarPacketRecorded;
}

Each time a packet is recorded, our code places the packet into a memory stream we are using as a stand-in for a real network layer.

void OnLocalAvatarPacketRecorded(object sender, OvrAvatar.PacketEventArgs args)
{
    using (MemoryStream outputStream = new MemoryStream())
    {
        BinaryWriter writer = new BinaryWriter(outputStream);
        writer.Write(packetSequence++);
        args.Packet.Write(outputStream);
        SendPacketData(outputStream.ToArray());
    }
}

The remainder of our code receives the packet from the memory stream and plays it back on our loopback avatar object.

void SendPacketData(byte[] data)
{
    ReceivePacketData(data);
}

void ReceivePacketData(byte[] data)
{
    using (MemoryStream inputStream = new MemoryStream(data))
    {
        BinaryReader reader = new BinaryReader(inputStream);
        int sequence = reader.ReadInt32();
        OvrAvatarPacket packet = OvrAvatarPacket.Read(inputStream);
        LoopbackAvatar.GetComponent<OvrAvatarRemoteDriver>().QueuePacket(sequence, packet);
    }
}

Unity (Gear VR) Getting Started

The Avatar Unity packages contain several prefabs you can drop into your existing Unity projects. This tutorial shows you how to start using them.

Download the Oculus Avatars SDK

The SDK is packaged in a .zip archive file on our developer website.

1. Download the Oculus Avatars SDK .zip from https://developer.oculus.com/downloads/.
2. If you do not have the Oculus Utilities for Unity 5, download its .zip file too.
3. Extract the contents of the .zip archive files to your local drive.

Set Up Unity for Oculus Avatar Gear VR Development

Setup includes importing the Oculus Unity packages and also setting up Unity for Android development and debugging.

1. Create a New Project in Unity named "gearvr-avatar".


2. Click File > Build Settings and select Android. Download and install Unity Android Support and then restart Unity if necessary.
3. Click Switch Platform to switch to the Android platform.
4. Click Add Open Scenes.
5. Set Texture Compression to ASTC.
6. Click Edit > Project Settings > Player, click the little Android Settings robot, and then set the following options:

   a. Select the Virtual Reality Supported check box.
   b. In Bundle Identifier, enter a unique package name.
   c. Set Minimum API Level to Android 5.0 'Lollipop' (API level 21).
   d. Set Install Location to Automatic.

7. Import the Oculus Avatars (OvrAvatar.unityPackage) and Oculus Utilities (OculusUtilities.unitypackage) packages. For each package:

   a. Click Assets > Import Package > Custom Package.
   b. Select the package file from your local drive.
   c. Click All and then click Import.

8. Connect your Android device to your computer.
9. Create an Oculus Signature File for your Android device at https://dashboard.oculus.com/tools/osig-generator/ and then copy it to the folder gearvr-avatar/Assets/Plugins/Android/assets. Create this folder if it doesn't exist.

Adding the VR Camera

Because the avatar has a default height of 170 cm, we should raise our VR camera rig to the same height.

1. Delete Main Camera from your scene and then drag OVRCameraRig from OVR > Prefabs.
2. Set the Position transform on OVRCameraRig to X:0, Y:1.70, Z:0.

Adding an Avatar

As the player cannot see his or her own Gear VR avatar, Gear VR avatars should all be of the "third person" type. To make sure the avatar is visible, we can place the avatar 50 cm in front of the camera and rotate the avatar 180 degrees so that its front faces us.

Note: The "local" in the prefab name "LocalAvatar" refers to how the avatar object gets its motion data. "Local" means the avatar object is driven by the local headset orientation.

1. Drag OvrAvatar > Content > Prefabs > LocalAvatar to the Hierarchy window.
2. In the Inspector, clear the Show First Person check box and select the Show Third Person check box.
3. Select the Combine Meshes check box. This reduces total draw calls per frame per avatar from 22 to 6. Gear VR apps typically need to stay within 50 to 100 draw calls per frame.
4. Set the Position transform on LocalAvatar to X:0, Y:0, Z:0.50.
5. Set the Rotation transform on LocalAvatar to X:0, Y:180, Z:0.
6. Click File > Build & Run to build an .apk from this scene and have Unity launch it on your Android device.


What to Explore Next?

• Loading Personalized Avatars

See Unity Features for instructions on how to modify the sample scenes to retrieve Oculus User IDs and display personalized avatars.

• Recording and Playing Back Avatar Pose Updates

Build our RemoteLoopback example scene and read the accompanying write-up in our Unity (Rift) Getting Started topic.


Native C/C++ (Rift) Getting Started

Get started using Oculus Avatars in your own native Rift code by experimenting with our sample Visual Studio 2013 C++ solution.

Download the Oculus Avatars SDK

The SDK is packaged in a .zip archive file on our developer website.

1. Download the Oculus Avatars SDK .zip from https://developer.oculus.com/downloads/.
2. Extract the contents of the .zip archive files to your local drive.

OpenGL is Required

The current version of the Avatar SDK only contains OpenGL shaders.

Running the Mirror Sample on Microsoft Visual Studio 2013

Our Mirror sample serves as a good foundation for implementing avatars in your own code. Let us take a little tour of it.

To set up the Microsoft Visual Studio 2013 solution for our Mirror sample:

1. Download and install CMake from https://cmake.org/download.
2. In Windows Explorer, locate the OVRAvatarSDK\Samples folder and double-click generate_projects.cmd.
3. Wait for the script to finish creating the VS2013 folder and solution.
4. Open and build the solution: Samples\VS2013\Mirror.sln.
5. Press F5 to start debugging.

Key Bindings

The Mirror sample illustrates several features of the Avatar SDK by letting you toggle them:

Press...    to...

1           show/hide the avatar body.
2           show/hide the hands.
3           show/hide the base cone.
4           show/hide the voice visualization.
c           show/hide the Touch controllers.
f           freeze/unfreeze the current hand pose.
s           set the hand pose to 'grip sphere'.
u           set the hand pose to 'grip cube'.
j           show/hide the joint lines.
r           start avatar packet recording. Press 'r' again to play recorded packets in a loop.


Exploring the Sample Code

Open the Mirror.cpp file and follow along from main. Our tour explores only the portions of code specific to Oculus avatars.

Rendering Avatars

Compiling Shaders

We compile the Avatar vertex and fragment shaders for our regular shader and our physically based shader (PBS) using a helper function _compileProgramFromFiles.

_skinnedMeshProgram = _compileProgramFromFiles("AvatarVertexShader.glsl",
    "AvatarFragmentShader.glsl", sizeof(errorBuffer), errorBuffer);
...
_skinnedMeshPBSProgram = _compileProgramFromFiles("AvatarVertexShader.glsl",
    "AvatarFragmentShaderPBS.glsl", sizeof(errorBuffer), errorBuffer);

Retrieving Avatar Data From a User Profile

The appearance of every person's avatar is stored in his or her Oculus user profile as an Avatar Specification. The Avatar Specification identifies the meshes and textures that make up a person's avatar. Before we retrieve this specification data, we have to initialize both the Platform SDK and the Avatar SDK using our app ID. To get your own app ID, see Developer Dashboard and Oculus App Setup.

#define MIRROR_SAMPLE_APP_ID "958062084316416"
...
ovrPlatformInitializeWindows(MIRROR_SAMPLE_APP_ID);
...
ovrAvatar_Initialize(MIRROR_SAMPLE_APP_ID);

Avatar Specifications are indexed by Oculus user ID. An app has easy access to the Oculus user ID of the currently logged in user.

Tip: If you wanted to create a social experience, you would write code to share user IDs between instances of your app so that you could load and render the appearance of other avatars too.

ovrID userID = ovr_GetLoggedInUserID();
ovrAvatar_RequestAvatarSpecification(userID);

The function ovrAvatar_RequestAvatarSpecification() is asynchronous. We use a message queue to determine when the function has finished obtaining our data (ovrAvatarMessageType_AvatarSpecification).

// Pump avatar messages
while (ovrAvatarMessage* message = ovrAvatarMessage_Pop())
{
    switch (ovrAvatarMessage_GetType(message))
    {
    case ovrAvatarMessageType_AvatarSpecification:
        _handleAvatarSpecification(ovrAvatarMessage_GetAvatarSpecification(message));
        break;
    case ovrAvatarMessageType_AssetLoaded:
        _handleAssetLoaded(ovrAvatarMessage_GetAssetLoaded(message));
        break;
    }
    ovrAvatarMessage_Free(message);
}

With the Avatar Specification in hand, we then use our helper function _handleAvatarSpecification to:


• Create an avatar instance (ovrAvatar_Create).
• Load all the relevant avatar assets into that instance.

Loading avatar assets is also asynchronous and we again rely on popping our message queue to determine when an asset for an avatar has finished loading (ovrAvatarMessageType_AssetLoaded).

static void _handleAvatarSpecification(const ovrAvatarMessage_AvatarSpecification* message)
{
    // Create the avatar instance
    _avatar = ovrAvatar_Create(message->avatarSpec, ovrAvatarCapability_All);

    // Trigger load operations for all of the assets referenced by the avatar
    uint32_t refCount = ovrAvatar_GetReferencedAssetCount(_avatar);
    for (uint32_t i = 0; i < refCount; ++i)
    {
        ovrAvatarAssetID id = ovrAvatar_GetReferencedAsset(_avatar, i);
        ovrAvatarAsset_BeginLoading(id);
        ++_loadingAssets;
    }
    printf("Loading %d assets...\r\n", _loadingAssets);
}

static void _handleAssetLoaded(const ovrAvatarMessage_AssetLoaded* message)
{
    // Determine the type of the asset that got loaded
    ovrAvatarAssetType assetType = ovrAvatarAsset_GetType(message->asset);
    void* data = nullptr;

    // Call the appropriate loader function
    switch (assetType)
    {
    case ovrAvatarAssetType_Mesh:
        data = _loadMesh(ovrAvatarAsset_GetMeshData(message->asset));
        break;
    case ovrAvatarAssetType_Texture:
        data = _loadTexture(ovrAvatarAsset_GetTextureData(message->asset));
        break;
    }

    // Store the data that we loaded for the asset in the asset map
    _assetMap[message->assetID] = data;
    --_loadingAssets;
    printf("Loading %d assets...\r\n", _loadingAssets);
}

Rendering the Avatar

Our sample code is called Mirror and it calls the avatar rendering helper function _renderAvatar() twice. The first call renders a first-person avatar. A first-person avatar can depict the player's hands and world position.

// If we have the avatar and have finished loading assets, render it
if (_avatar && !_loadingAssets)
{
    _renderAvatar(_avatar, ovrAvatarVisibilityFlag_FirstPerson, view, proj, eyeWorld, renderJoints);

The second call renders a third-person avatar, transformed so that it faces you as if looking in a mirror. A third-person avatar can depict hands, body, and base cone.

    glm::vec4 reflectionPlane = glm::vec4(0.0, 0.0, -1.0, 0.0);
    glm::mat4 reflection = _computeReflectionMatrix(reflectionPlane);

    glFrontFace(GL_CW);
    _renderAvatar(_avatar, ovrAvatarVisibilityFlag_ThirdPerson, view * reflection, proj,
        glm::vec3(reflection * glm::vec4(eyeWorld, 1.0f)), renderJoints);
    glFrontFace(GL_CCW);
}


Hiding and Displaying Avatar Capabilities

When we first created our avatar instance, we created it with all capabilities active:

_avatar = ovrAvatar_Create(message->avatarSpec, ovrAvatarCapability_All);

You can enable different capabilities by calling ovrAvatar_SetActiveCapabilities(). In our sample, we toggle different capabilities in real time using the bit masks defined in ovrAvatarCapabilities:

case '1':
    capabilities ^= ovrAvatarCapability_Body;
    ovrAvatar_SetActiveCapabilities(_avatar, static_cast<ovrAvatarCapabilities>(capabilities));
    break;
case '2':
    capabilities ^= ovrAvatarCapability_Hands;
    ovrAvatar_SetActiveCapabilities(_avatar, static_cast<ovrAvatarCapabilities>(capabilities));
    break;
case '3':
    capabilities ^= ovrAvatarCapability_Base;
    ovrAvatar_SetActiveCapabilities(_avatar, static_cast<ovrAvatarCapabilities>(capabilities));
    break;
case '4':
    capabilities ^= ovrAvatarCapability_Voice;
    ovrAvatar_SetActiveCapabilities(_avatar, static_cast<ovrAvatarCapabilities>(capabilities));
    break;

Translating Touch Controllers To Avatar Movements

Our sample code translates Touch controller input into Avatar movements in two parts:

1. Processing the Touch inputs
2. Updating the Avatar

Processing the Touch Controller Inputs

We translate the position and orientation of the head-mounted display and the left and right Touch controllers to avatar body and hand positions using our helper function _ovrAvatarTransformFromGlm().

We call our helper function _ovrAvatarHandInputStateFromOvr() to translate the various Touch button,trigger, and touch states.

// Convert the OVR inputs into Avatar SDK inputs
ovrInputState touchState;
ovr_GetInputState(ovr, ovrControllerType_Active, &touchState);
ovrTrackingState trackingState = ovr_GetTrackingState(ovr, 0.0, false);

glm::vec3 hmdP = _glmFromOvrVector(trackingState.HeadPose.ThePose.Position);
glm::quat hmdQ = _glmFromOvrQuat(trackingState.HeadPose.ThePose.Orientation);
glm::vec3 leftP = _glmFromOvrVector(trackingState.HandPoses[ovrHand_Left].ThePose.Position);
glm::quat leftQ = _glmFromOvrQuat(trackingState.HandPoses[ovrHand_Left].ThePose.Orientation);
glm::vec3 rightP = _glmFromOvrVector(trackingState.HandPoses[ovrHand_Right].ThePose.Position);
glm::quat rightQ = _glmFromOvrQuat(trackingState.HandPoses[ovrHand_Right].ThePose.Orientation);

ovrAvatarTransform hmd;
_ovrAvatarTransformFromGlm(hmdP, hmdQ, glm::vec3(1.0f), &hmd);

ovrAvatarTransform left;
_ovrAvatarTransformFromGlm(leftP, leftQ, glm::vec3(1.0f), &left);

ovrAvatarTransform right;
_ovrAvatarTransformFromGlm(rightP, rightQ, glm::vec3(1.0f), &right);

ovrAvatarHandInputState inputStateLeft;
_ovrAvatarHandInputStateFromOvr(left, touchState, ovrHand_Left, &inputStateLeft);

ovrAvatarHandInputState inputStateRight;
_ovrAvatarHandInputStateFromOvr(right, touchState, ovrHand_Right, &inputStateRight);


Updating the Avatar

Everything that can be changed has a related update function in the SDK. Our helper function _updateAvatar() calls the individual pose update functions:

ovrAvatarPose_UpdateBody(avatar, hmd);
ovrAvatarPose_UpdateHands(avatar, left, right);

It also closes the update by finalizing the updates to the avatar's pose with a timestamp deltaSeconds. This timestamp is used for avatar playback and recording as discussed in Pose Recording and Playback.

ovrAvatarPose_Finalize(avatar, deltaSeconds);

Native C/C++ (Gear VR) Getting Started

Get started using Oculus Avatars in native Gear VR code by creating an avatar project using the Native Application Framework Template.

Download and Prepare Oculus SDKs

Our SDKs are packaged in .zip files on our developer website.

1. Download the Oculus Avatars SDK .zip package from https://developer.oculus.com/downloads/ and then extract the contents to C:\dev.

2. Download the Oculus Mobile SDK .zip package from https://developer.oculus.com/downloads/, extract the contents to C:\dev, and then rename the ovr_sdk_mobile_x.x.x folder to ovr_sdk_mobile.

3. Download the Oculus Platform SDK .zip package from https://developer.oculus.com/downloads/, extract the contents to C:\dev, and then rename the OVRPlatformSDK_vx.x.x folder to OVRPlatformSDK.

4. Save the following code as C:\dev\OVRPlatformSDK\Android\jni\Android.mk:

LOCAL_PATH := $(call my-dir)
include $(CLEAR_VARS)
LOCAL_MODULE := libovrplatformloader
LOCAL_SRC_FILES := ../libs/$(TARGET_ARCH_ABI)/$(LOCAL_MODULE).so
LOCAL_EXPORT_C_INCLUDES := $(LOCAL_PATH)/../../Include
ifneq (,$(wildcard $(LOCAL_PATH)/$(LOCAL_SRC_FILES)))
  include $(PREBUILT_SHARED_LIBRARY)
endif

Create a New App Using the Application Framework

Use the Native Application Framework Template to create a new Gear VR project called "mirror" and place your Android device OSIG file inside the assets folder.

1. Run these commands from a Windows command prompt:

cd C:\dev\ovr_sdk_mobile\VrSamples\Native\VrTemplate
make_new_project.bat mirror oculus

2. Connect your Android device to your computer.
3. Create an Oculus Signature File for your Android device at https://dashboard.oculus.com/tools/osig-generator/ and then copy it to the folder C:\dev\ovr_sdk_mobile\VrSamples\Native\mirror\assets.

For more information, see Creating New Apps with the Framework Template.


Modify the Sample Code with a Gear VR App ID

The Avatar SDK Samples folder contains a Gear VR version of our Rift mirror sample. Because this sample uses Oculus platform calls, you must add your own Gear VR app ID to the sample code. This app ID must be from a Gear VR app owned by your developer organization and your Oculus user must be subscribed to at least one release channel in that app.

1. Copy the contents of C:\dev\OVRAvatarSDK\Samples\MirrorAndroid to C:\dev\ovr_sdk_mobile\VrSamples\Native\mirror\Src.

2. Change #define APP_ID "1221388694657274" in Src\OvrApp.cpp so that it contains the Gear VR app ID of an app that belongs to your developer organization.

Modify the Android.mk Makefile

We need to modify the Android.mk makefile with the paths to our sources and our Avatar and Platform SDK library files.

1. Locate the Android.mk file in C:\dev\ovr_sdk_mobile\VrSamples\Native\mirror\Projects\Android\jni.

2. Modify the contents of Android.mk as follows:

LOCAL_PATH := $(call my-dir)

include $(CLEAR_VARS)

include ../../../../../cflags.mk

LOCAL_MODULE := ovrapp
LOCAL_SRC_FILES := ../../../Src/OvrApp.cpp ../../../Src/AvatarManager.cpp
LOCAL_STATIC_LIBRARIES := vrsound vrmodel vrlocale vrgui vrappframework libovrkernel
LOCAL_SHARED_LIBRARIES := vrapi libovrplatformloader libovravatarloader

include $(BUILD_SHARED_LIBRARY)

$(call import-module,LibOVRKernel/Projects/AndroidPrebuilt/jni)
$(call import-module,VrApi/Projects/AndroidPrebuilt/jni)
$(call import-module,VrAppFramework/Projects/AndroidPrebuilt/jni)
$(call import-module,VrAppSupport/VrGUI/Projects/AndroidPrebuilt/jni)
$(call import-module,VrAppSupport/VrLocale/Projects/AndroidPrebuilt/jni)
$(call import-module,VrAppSupport/VrModel/Projects/AndroidPrebuilt/jni)
$(call import-module,VrAppSupport/VrSound/Projects/AndroidPrebuilt/jni)
$(call import-module,../OVRPlatformSDK/Android/jni)
$(call import-module,../OVRAvatarSDK/Android/jni)

Build and Launch the Project

Run C:\dev\ovr_sdk_mobile\VrSamples\Native\mirror\Projects\Android\build.bat to build and launch the app on your device.


Unreal (Rift) Getting Started

The Oculus Avatar SDK for Unreal Beta download file contains an Unreal Engine (UE) C++ sample project illustrating and implementing all the features available to Oculus Avatars in UE.

The example project demonstrates:

• using avatar classes to create and destroy UE avatar objects.
• changing hand poses to custom hand poses and rendering Touch controllers.
• recording local avatar movement packets and replaying the packets back on remote avatars (including voice visualizations).

Note: Oculus Avatars for UE are for C++ projects. A blueprints version is not available at this time.


Requirements

• Unreal Editor 4.15 or later
• Microsoft Visual Studio 2015 with C++
• Oculus Avatar SDK for Unreal Beta

Architecture of a UE Avatar Project

Oculus Avatars for UE are implemented as a plugin. Avatars are embodied within UOvrAvatar actor components that you can attach to the UE actors you desire. This lets you keep your game-side code separate from our implementation of avatars.

Some of the files and folders in our example project and their primary functions include:

• Config/DefaultInput.ini - contains the avatar input settings for Touch controllers.
• Config/DefaultEngine.ini - contains your app ID and adds Oculus Platform as a subsystem.
• Content/Avatars/ - contains the material assets used by the avatars.
• Plugins/ - contains Oculus Avatars implemented as an Unreal plugin.
• Source/LocalAvatar.cpp and RemoteAvatar.cpp - contain the "game-side" classes that demonstrate how to attach avatar components to actor classes.
• AvatarSamples.uproject - enables the OvrAvatar plugin.

Launching the Avatar Samples Unreal Project

1. Extract the contents of the Oculus Avatar SDK for Unreal Beta .zip file.
2. Launch AvatarSamples.uproject.
3. Click Play > VR Preview.
4. Put your Rift on.

You should see the hands of your avatar. This first person view is also called the local avatar.

Note: The Oculus Avatar SDK Unreal Beta sample project has reflection effects that are not appropriate for VR rendering techniques. To fix, select the Forward Shading check box in Project Settings > Engine > Rendering > Forward Render.


The code that spawns your first-person avatar is in LocalAvatar.cpp:

ALocalAvatar::ALocalAvatar()
{
    RootComponent = CreateDefaultSubobject<USceneComponent>(TEXT("LocalAvatarRoot"));

    PrimaryActorTick.bCanEverTick = true;
    AutoPossessPlayer = EAutoReceiveInput::Player0;

    BaseEyeHeight = 170.f;

    AvatarComponent = CreateDefaultSubobject<UOvrAvatar>(TEXT("LocalAvatar"));
    AvatarComponent->SetVisibilityType(ovrAvatarVisibilityFlag_FirstPerson);
    AvatarComponent->SetPlayerHeightOffset(BaseEyeHeight / 100.f);
}

Spawning and Destroying Remote Avatars

Squeeze the right Touch trigger to spawn avatars in a circle around you. Squeeze the left Touch trigger to destroy them. These third-person avatars with hands, heads, and base cones are called remote avatars.


The remote avatars in this sample mimic your movements because we hooked them up to our avatar packet recording and playback system. This system records both your movements and your microphone amplitude, letting us transmit them to remote avatars and animate them accordingly. Speak or sing to see the voice animations on the remote avatars.

The packet recording is handled by ALocalAvatar::UpdatePacketRecording(float DeltaTime) in LocalAvatar.cpp.

Packet playback on remote avatars is handled by ARemoteAvatar::Tick in RemoteAvatar.cpp. You might notice a small delay between your local avatar movements and the corresponding movement in the remote avatars. This is an artificial delay we added to the sample to simulate network latency.

To toggle packet recording and playback:

• Press A on your right Touch.

Creating Custom Hand Poses

Press the thumbsticks to cycle through the following hand poses:


• a built-in pose for gripping a sphere:

AvatarComponent->SetRightHandPose(ovrAvatarHandGesture_GripSphere);

• a built-in pose for gripping a cube:

AvatarComponent->SetRightHandPose(ovrAvatarHandGesture_GripCube);

• a custom hand gesture built from an array of joint transforms, gAvatarRightHandTrans:

AvatarComponent->SetCustomGesture(ovrHand_Right, gAvatarRightHandTrans, HAND_JOINTS);

• a built-in pose depicting Touch controllers:

AvatarComponent->SetRightHandPose(ovrAvatarHandGesture_Default);
AvatarComponent->SetControllerVisibility(ovrHand_Right, true);

The code snippets above are from LocalAvatar.cpp and set the poses for the right hand. For the left hand, substitute the appropriate left hand functions and constants.

Detaching and Moving Hands Independent of Tracking

Press Y on the left Touch or press B on the right Touch to detach the avatar hands from Touch tracking. You can then use the thumbsticks to drive the avatar hand movements.

The following code in LocalAvatar.cpp detaches the hands:

AvatarHands[ovrHand_Right] = AvatarComponent->DetachHand(ovrHand_Right);

ALocalAvatar::DriveHand drives the hand movement after detaching.

Adding Avatars to an Existing Project

Copy the Plugins folder to the root folder of your project. It contains our OvrAvatar plugin.

Update your project's Config/DefaultInput.ini file with content from the sample project's Config/DefaultInput.ini file.

Update the Modules and Plugins sections of your .uproject file with additional items. Remember to add a comma (,) to the last item in any existing Modules or Plugins sections before pasting the additional lines:

"Modules": [
    {
        "AdditionalDependencies": [
            "Engine",
            "OnlineSubsystem",
            "OnlineSubsystemUtils"
        ]
    }
],
"Plugins": [
    {
        "Name": "OnlineSubsystemOculus",
        "Enabled": true
    },
    {
        "Name": "OvrAvatar",
        "Enabled": true
    }
]

Update your project's Config/DefaultEngine.ini file with the following:

[OnlineSubsystem]
DefaultPlatformService=Oculus

[OnlineSubsystemOculus]
bEnabled=true
OculusAppId=YOUR_APP_ID

Get YOUR_APP_ID from the Oculus Developer Dashboard: https://dashboard.oculus.com

In your code, the local user actor must implement the SetupPlayerInputComponent function, as the component needs controller input sent to it to animate the hands properly. A set of macros define these repetitive functions. Two things to consider:

• the macros depend on the component member variable being named AvatarComponent
• the macros stub out member functions for the Actor class.

You also need to replace the ALocalAvatar:: entries with the name of your own Actor class.

// LocalAvatar.cpp
void ALocalAvatar::SetupPlayerInputComponent(UInputComponent* Input)
{
    Super::SetupPlayerInputComponent(Input);

#define INPUT_ENTRY(entry, hand, flag) \
    Input->BindAction(#entry, IE_Pressed, this, &ALocalAvatar::##entry##_Pressed); \
    Input->BindAction(#entry, IE_Released, this, &ALocalAvatar::##entry##_Released);
    INPUT_COMMAND_TUPLE
#undef INPUT_ENTRY

#define AXIS_ENTRY(entry, hand, flag) \
    Input->BindAxis(#entry, this, &ALocalAvatar::##entry##_Value);
    AXIS_INPUT_TUPLE
#undef AXIS_ENTRY

#define CUSTOM_ENTRY(entry, hand, field, invert) \
    Input->BindAxis(#entry, this, &ALocalAvatar::##entry##_Value);
    CUSTOM_AXIS_TUPLE
#undef CUSTOM_ENTRY
}

#define CUSTOM_ENTRY(entry, hand, field, invert) \
    void ALocalAvatar::##entry##_Value(float value) { AvatarComponent->##entry##_Value(value); }
CUSTOM_AXIS_TUPLE
#undef CUSTOM_ENTRY

#define INPUT_ENTRY(entry, hand, flag) \
    void ALocalAvatar::##entry##_Pressed() { AvatarComponent->##entry##_Pressed(); } \
    void ALocalAvatar::##entry##_Released() { AvatarComponent->##entry##_Released(); }
INPUT_COMMAND_TUPLE
#undef INPUT_ENTRY

#define AXIS_ENTRY(entry, hand, flag) \
    void ALocalAvatar::##entry##_Value(float value) { AvatarComponent->##entry##_Value(value); }
AXIS_INPUT_TUPLE
#undef AXIS_ENTRY

Note in LocalAvatar.h where these functions are declared:

private:

#define INPUT_ENTRY(entry, hand, flag) \
    void entry##_Pressed(); \
    void entry##_Released();
    INPUT_COMMAND_TUPLE
#undef INPUT_ENTRY

#define AXIS_ENTRY(entry, hand, flag) \
    void entry##_Value(float value);
    AXIS_INPUT_TUPLE
#undef AXIS_ENTRY

#define CUSTOM_ENTRY(entry, hand, field, invert) \
    void entry##_Value(float value);
    CUSTOM_AXIS_TUPLE
#undef CUSTOM_ENTRY

Place your request to fetch the avatar wherever you have set up online login functionality. For example:

void ALocalAvatar::OnLoginComplete(int32 LocalUserNum, bool bWasSuccessful, const FUniqueNetId& UserId, const FString& Error)
{
    IOnlineIdentityPtr OculusIdentityInterface = Online::GetIdentityInterface();
    OculusIdentityInterface->ClearOnLoginCompleteDelegate_Handle(0, OnLoginCompleteDelegateHandle);

    if (AvatarComponent)
    {
        AvatarComponent->RequestAvatar(10149999027226798);
    }
}


Avatar SDK Guide

This document describes how to install, configure, and use the Oculus Avatar SDK.

The Avatar SDK consists of native C++ and Unity documentation, samples, plugins, source code, and libraries to help developers implement Oculus Avatars in their own VR experiences.

Unity Features

Loading Personalized Avatars

You can replace the default blue avatar with a personalized avatar using the Oculus Platform package. The base Avatar SDK OvrAvatar.cs class is already set up to load the avatar specifications of users, but we need to call Oculus Platform functions to request valid user IDs.

After getting a user ID, we can set the oculusUserID of the avatar accordingly. The timing is important,because this has to happen before the Start() function in OvrAvatar.cs gets called.

Note: For security reasons, Oculus Avatars and Oculus Platform must be initialized with a valid App ID before accessing user ID information. You can create a new application and obtain an App ID from the developer dashboard. For more information, see Oculus Platform Setup.


The example below shows one way of doing this. It defines a new class called PlatformManager that extends our existing Getting Started sample. When run, it replaces the default blue avatar with the personalized avatar of the user logged on to Oculus Home.

1. Import the Oculus Platform SDK Unity package into your Unity project.
2. Specify valid App IDs for both the Oculus Avatars and Oculus Platform plugins:
   a. Click Oculus Avatars > Edit Configuration and paste your Oculus Rift App Id or Gear VR App Id into the field.
   b. Click Oculus Platform > Edit Settings and paste your Oculus Rift App Id or Gear VR App Id into the field.
3. Create an empty game object named PlatformManager:
   a. Click GameObject > Create Empty.
   b. Rename the game object PlatformManager.
4. Click Add Component, enter New Script in the search field, and then select New Script.
5. Name the new script PlatformManager and set Language to C Sharp.
6. Copy and save the following text as Assets\PlatformManager.cs.

using UnityEngine;
using Oculus.Avatar;
using Oculus.Platform;
using Oculus.Platform.Models;
using System.Collections;

public class PlatformManager : MonoBehaviour {

    public OvrAvatar myAvatar;

    void Awake () {
        Oculus.Platform.Core.Initialize();
        Oculus.Platform.Users.GetLoggedInUser().OnComplete(GetLoggedInUserCallback);
        Oculus.Platform.Request.RunCallbacks(); // avoids race condition with OvrAvatar.cs Start()
    }

    private void GetLoggedInUserCallback(Message<User> message) {
        if (!message.IsError) {
            myAvatar.oculusUserID = message.Data.ID;
        }
    }
}

7. In the Unity Editor, select PlatformManager from the Hierarchy. The My Avatar field appears in the Inspector.

8. Drag LocalAvatar from the Hierarchy to the My Avatar field.

Handling Multiple Personalized Avatars

If you have a multi-user scene where each avatar has different personalizations, you probably already have the user IDs of all the users in your scene because you had to retrieve that data to invite them in the first place. Set the oculusUserID for each user's avatar accordingly.

If your scene contains multiple avatars of the same person, you can iterate through all the avatar objects in the scene to change all their oculusUserID values. For example, the LocalAvatar and RemoteLoopback sample scenes both contain two avatars of the same player.

Here is an example of how to modify the callback of our PlatformManager class to personalize the avatars in the sample scenes:

using UnityEngine;
using Oculus.Avatar;
using Oculus.Platform;
using Oculus.Platform.Models;
using System.Collections;


public class PlatformManager : MonoBehaviour {

    void Awake () {
        Oculus.Platform.Core.Initialize();
        Oculus.Platform.Users.GetLoggedInUser().OnComplete(GetLoggedInUserCallback);
        Oculus.Platform.Request.RunCallbacks(); // avoids race condition with OvrAvatar.cs Start()
    }

    private void GetLoggedInUserCallback(Message<User> message) {
        if (!message.IsError) {
            OvrAvatar[] avatars = FindObjectsOfType(typeof(OvrAvatar)) as OvrAvatar[];
            foreach (OvrAvatar avatar in avatars) {
                avatar.oculusUserID = message.Data.ID;
            }
        }
    }
}

Avatar Prefabs

The Avatar Unity package contains two prefabs for Avatars: LocalAvatar and RemoteAvatar. They are located in OvrAvatar > Content > Prefabs. The difference between LocalAvatar and RemoteAvatar is in the driver, the control mechanism behind avatar movements.

The LocalAvatar driver is the OvrAvatarDriver script, which derives avatar movement from the logged-in user's Touch controllers and HMD.

The RemoteAvatar driver is the OvrAvatarRemoteDriver script, which gets its avatar movement from the packet recording and playback system.

Sample Scenes

There are four sample scenes in the Avatar Unity package:

• Controllers

  Demonstrates how first-person avatars can be used to enhance the sense of presence for Touch users.

• GripPoses

  A helper scene for creating custom grip poses. See Custom Touch Grip Poses.

• LocalAvatar

  Demonstrates the capabilities of both first-person and third-person avatars. Does not yet include microphone voice visualization or loading an Avatar Specification using Oculus Platform.

• RemoteLoopback

  Demonstrates the avatar packet recording and playback system. See Recording and Playing Back Avatar Pose Updates.

Reducing Draw Calls with the Combine Meshes Option

Each avatar in your scene requires 11 draw calls per eye per frame (22 total). The Combine Meshes option reduces this to 3 draw calls per eye (6 total) by combining all the mesh parts into a single mesh. This is an important performance gain for Gear VR, as most apps typically need to stay within a draw call budget of 50 to 100 draw calls per frame. Without this option, just having 4 avatars in your scene would use most or all of that budget.

You should almost always select this option when using avatars. The only drawback to using this option is that you are no longer able to access mesh parts individually, but that is a rare use case.


Custom Touch Grip Poses

The GripPoses sample lets you change the hand poses by rotating the finger joints until you get the pose you want. You can then save these finger joint positions as a Unity prefab that you can load at a later time.

In this example, we will pose the left hand to make it look like a scissors or bunny rabbit gesture.

Creating the left hand pose:

1. Open the Samples > GripPoses > GripPoses scene.
2. Click Play.
3. Press E to select the Rotate transform tool.
4. In the Hierarchy window, expand LocalAvatar > hand_left > LeftHandPoseEditHelp > hands_l_hand_world > hands:b_l_hand.
5. Locate all the joints of the fingers you want to adjust. Joint 0 is closest to the palm; subsequent joints are towards the finger tip. To adjust the pinky finger joints for example, expand hands:b_l_pinky0 > hands:b_l_pinky1 > hands:b_l_pinky2 > hands:b_l_pinky3.
6. In the Hierarchy window, select the joint you want to rotate.


7. In the Scene window, click a rotation orbit and drag the joint to the desired angle.

8. Repeat these two steps until you achieve the desired pose.

Saving the left hand pose:

1. In the Hierarchy window, drag hands_l_hand_world to the Project window.
2. In the Project window, rename this transform to something descriptive, for example: poseBunnyRabbitLeft.

Using the left hand pose:

1. In the Hierarchy window, select LocalAvatar.
2. Drag poseBunnyRabbitLeft from the Project window to the Left Hand Custom Pose field in the Inspector window.

Click Play again. You will see that the left hand is now frozen in our custom bunny grip pose.

Settings for Rift Stand-alone Builds

To make Rift avatars appear in stand-alone executable builds, we need to change two settings:

• Add the Avatar shaders to the Always Included Shaders list in your project settings:


  1. Click Edit > Project Settings > Graphics.
  2. Under Always Included Shaders, add +3 to the Size and then press Enter.
  3. Add the following shader elements: AvatarSurfaceShader, AvatarSurfaceShaderPBS, AvatarSurfaceShaderSelfOccluding.

• Build as a 64-bit application:

  1. Click File > Build Settings.
  2. Set Architecture to x86_64.

Making Rift Hands Interact with the Environment

To allow avatars to interact with objects in their environment, use the OVRGrabber and OVRGrabbable components. For a working example, see the AvatarWithGrab sample scene included in the Oculus Unity Sample Framework.

Accessing Transforms for the Hands and Mouth

You can use our accessor functions to get the transforms for the avatar hands and mouth without having to walk the hierarchy. You can specify the hand and joint and use GetHandTransform() to get its transform:

public Transform GetHandTransform(HandType hand, HandJoint joint)

The enums for HandType are: Right, Left

The enums for HandJoint are: HandBase, IndexBase, IndexTip, ThumbBase, ThumbTip

To get the forward direction of the avatar hand as a vector, so you know where the avatar hand is pointing, use GetPointingDirection(). Forward and Up are perpendicular to each other.

public void GetPointingDirection(HandType hand, ref Vector3 forward, ref Vector3 up)

To get the transform of the avatar's mouth, use GetMouthTransform(). This is useful when you want to spatialize avatar speech as point-source audio located at the mouth.

public Transform GetMouthTransform()

Adding C++ Avatar Support

This guide outlines Avatar SDK support with a C/C++ game engine or application. The source code samples used in this guide are taken from the Mirror demo, available in OVRAvatarSDK\Samples\Mirror.

To add Avatar support to your Visual C++ project:

1. Add Oculus Platform Support to your Project. (https://developer.oculus.com/documentation/platform/latest/concepts/dg-development/)

2. Open the project's Properties > Configuration Properties > VC++ Directories page.
3. In Include Directories, add the location of the Avatar SDK includes folder (InstallFolder\include).
4. In Library Directories, add the location of the Avatar SDK library folder (InstallFolder\Windows).
5. Add the Avatar library file as linker input:
   a. Expand Linker > Input.
   b. In Additional Dependencies, add InstallFolder\Windows\libovravatar.lib.


6. Add #include <OVR_Avatar.h> and #include <OVR_Platform.h>.
7. Initialize the Platform module using your app ID through ovr_PlatformInitializeWindows(appID).
8. Initialize the Oculus SDK through ovr_Initialize().
9. Compile the Oculus Avatar OpenGL fragment and vertex reference shaders into a shader program.
10. Initialize the Avatar module through ovrAvatar_Initialize(appID).

Avatar Message Queue

The functions ovrAvatar_RequestAvatarSpecification() and ovrAvatarAsset_BeginLoading() are asynchronous. The avatar message queue contains the results of these operations.

You can retrieve the most recent message with ovrAvatarMessage_Pop(). After you finish processing a message on the queue, be sure to call ovrAvatarMessage_Free() to free the memory used by the popped message.

Rendering Avatar Components

Avatars are composed of avatar components (body, base, hands, controller), which are themselves composed of render parts. Each Oculus user has an Avatar Specification that indicates the mesh and texture assets that need to be loaded to recreate the avatar.

Our Mirror.cpp example code contains good examples of the entire process and includes helper functions, prefixed with _, that we have written to make it easier to render complete avatars.

The complete process goes something like this:

1. Retrieve the avatar specification for the Oculus user.

   ovrAvatar_RequestAvatarSpecification(userID);

2. Set the Avatar capabilities.

   ovrAvatar_Create(message->avatarSpec, ovrAvatarCapability_All);

3. Iterate through the avatar specification to load the static avatar assets (mesh and textures) into the avatar.

   ovrAvatar_GetReferencedAsset(_avatar);

4. Apply the vertex transforms to determine the position of the avatar component.
5. Apply the material states to determine the appearance of the avatar component.
6. For each render part of an avatar component:

   a. Get the OpenGL mesh data and tell the renderer to use the Avatar shader program you compiled earlier.
   b. Calculate the inputs on the vertex uniforms.
   c. Set the view position, the world matrix, the view matrix, and the array of mesh poses.
   d. Transform everything in the joint hierarchy.
   e. Set the material state.
   f. Draw the mesh, depth first so that it self-occludes.
   g. Render to the color buffer.

7. When there are no more components to render, the avatar render is complete.

Visible Controllers

To render avatar hands without controllers:

ovrAvatar_SetLeftControllerVisibility(_avatar, 0);
ovrAvatar_SetRightControllerVisibility(_avatar, 0);

To render avatar hands with controllers:

ovrAvatar_SetLeftControllerVisibility(_avatar, 1);
ovrAvatar_SetRightControllerVisibility(_avatar, 1);

Setting a Custom Touch Grip Pose

You can pass your own custom transforms to the hand pose functions or use our cube and sphere preset hand poses. Here is an example of a custom pose made from freezing the hands in their current pose:

const ovrAvatarHandComponent* handComp = ovrAvatarPose_GetLeftHandComponent(_avatar);
const ovrAvatarComponent* comp = handComp->renderComponent;
const ovrAvatarRenderPart* renderPart = comp->renderParts[0];
const ovrAvatarRenderPart_SkinnedMeshRender* meshRender = ovrAvatarRenderPart_GetSkinnedMeshRender(renderPart);
ovrAvatar_SetLeftHandCustomGesture(_avatar, meshRender->skinnedPose.jointCount, meshRender->skinnedPose.jointTransform);

handComp = ovrAvatarPose_GetRightHandComponent(_avatar);
comp = handComp->renderComponent;
renderPart = comp->renderParts[0];
meshRender = ovrAvatarRenderPart_GetSkinnedMeshRender(renderPart);
ovrAvatar_SetRightHandCustomGesture(_avatar, meshRender->skinnedPose.jointCount, meshRender->skinnedPose.jointTransform);

To pose the hands as if to grip cubes:

ovrAvatar_SetLeftHandGesture(_avatar, ovrAvatarHandGesture_GripCube);
ovrAvatar_SetRightHandGesture(_avatar, ovrAvatarHandGesture_GripCube);

To pose the hands as if to grip spheres:

ovrAvatar_SetLeftHandGesture(_avatar, ovrAvatarHandGesture_GripSphere);
ovrAvatar_SetRightHandGesture(_avatar, ovrAvatarHandGesture_GripSphere);

To unfreeze the hand poses:

ovrAvatar_SetLeftHandGesture(_avatar, ovrAvatarHandGesture_Default);
ovrAvatar_SetRightHandGesture(_avatar, ovrAvatarHandGesture_Default);

Voice Visualization

Voice visualization is an avatar component. It is created as a projection on top of an existing mesh.

Create the microphone:

ovrMicrophoneHandle mic = ovr_Microphone_Create();
if (mic)
{
    ovr_Microphone_Start(mic);
}

Pass an array of voice samples to ovrAvatarPose_UpdateVoiceVisualization().

float micSamples[48000];
size_t sampleCount = ovr_Microphone_ReadData(mic, micSamples, sizeof(micSamples) / sizeof(micSamples[0]));
if (sampleCount > 0)
{
    ovrAvatarPose_UpdateVoiceVisualization(_avatar, (uint32_t)sampleCount, micSamples);
}

The render parts of the voice visualization component are a ProjectorRender type.


Pose Recording and Playback

The Avatar SDK contains a complete avatar pose recording and playback system. You can save pose data to packets at regular intervals and then transmit these packets to a remote computer to drive the avatar poses there.

Pose Recording

Call ovrAvatarPacket_BeginRecording() to begin recording:

ovrAvatarPacket_BeginRecording(_avatar);

After you record as many frames' worth of pose changes as you want, stop the recording with ovrAvatarPacket_EndRecording() and then write your packet out with ovrAvatarPacket_Write().

ovrAvatarPacket* recordedPacket = ovrAvatarPacket_EndRecording(_avatar);

// Write the packet to a byte buffer to exercise the packet writing code
uint32_t packetSize = ovrAvatarPacket_GetSize(recordedPacket);
uint8_t* packetBuffer = (uint8_t*)malloc(packetSize);
ovrAvatarPacket_Write(recordedPacket, packetSize, packetBuffer);
ovrAvatarPacket_Free(recordedPacket);

Transmit your data to your destination using your own network code.

Pose Playback

To read your pose data back into packets:

// Read the buffer back into a packet
playbackPacket = ovrAvatarPacket_Read(packetSize, packetBuffer);
free(packetBuffer);

To play the packets back:

float packetDuration = ovrAvatarPacket_GetDurationSeconds(packet);
*packetPlaybackTime += deltaSeconds;
if (*packetPlaybackTime > packetDuration)
{
    ovrAvatarPose_Finalize(avatar, 0.0f);
    *packetPlaybackTime = 0;
}
ovrAvatar_UpdatePoseFromPacket(avatar, packet, *packetPlaybackTime);

The playback routine uses the timestamp deltaSeconds to interpolate a tween pose in case the frames on the remote computer are offset by a different amount.


Avatar SDK C/C++ Developer Reference

The Oculus Avatar SDK Developer Reference contains detailed information about the data structures and files included with the SDK.

See Oculus Avatar SDK Reference Manual 1.15.


Avatar SDK Documentation Archive

This section provides links to legacy documentation.

Select from the following:

Version    PDF
Latest     Avatar SDK Documentation
1.14       Avatar SDK Documentation
1.13       Avatar SDK Documentation
1.12       Avatar SDK Documentation