To experiment with this interaction, I created a demo consisting of a single character idling, minding his own business. When the user raises their hand, the character waves back and a speech bubble appears saying “Hello, there!”
Let’s take a look at the demo setup, and then at how testing the user experience went.
Setup
To create the scene, I used basic 3D objects (planes, cubes) and a directional light to build a simple room. The character in the scene is a rigged human character ("Carl") from the Male Character Pack by Mixamo. The speech bubble is created using a world space canvas (see: Thought Bubbles in a Rift scene). To get my hands in the scene, I used the LeapOVRPlayerController from the Leap Unity Core Assets v2.2.4 (see: Seeing your hands in VR).
For the character animation, I used the Idle and Wave animations from the Raw Mocap Data package for Mecanim by Unity Technologies (free animations created from motion capture data), and I created an animation controller for the character to control when he is idling and when he waves back at you. The animation controller has two animation states, Idle and Wave. It also has two triggers that drive the transition between the states:
The animation controller for the waving character has two states and two triggers.
And, of course, I wrote a script (wavinghello.cs) to detect when the user has raised their hand. The interesting part of this script is working out where the user's hands are and when a hand has been raised high enough to trigger the appropriate animation. Let's take a look at the script's Update() function:
void Update () {
    // Get all of the Leap physics hand models currently in the scene.
    HandModel[] userHands = handController.GetAllPhysicsHands();
    if (userHands.Length > 0) {
        foreach (HandModel models in userHands) {
            // A hand counts as raised if its palm is at or above eye level.
            if (models.GetPalmPosition().y >= centerEyeAnchor.transform.position.y) {
                anim.SetTrigger("Wave");
                changeMenuDisplay(speechbubble, 1);
            } else {
                anim.SetTrigger("Idle");
                changeMenuDisplay(speechbubble, 0);
            }
        }
    } else {
        // No hands tracked: treat it as though all hands are below eye level.
        anim.SetTrigger("Idle");
        changeMenuDisplay(speechbubble, 0);
    }
}
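For context, here is a sketch of how the rest of wavinghello.cs might be laid out. The field names match the ones used in Update() above, but the declarations and the changeMenuDisplay() helper are my assumptions rather than the original source:

using UnityEngine;

// Sketch only: field declarations and the speech bubble helper are guesses,
// wired up to match the Update() function shown above.
public class wavinghello : MonoBehaviour {

    public HandController handController;   // Leap HandController, a child of the CenterEyeAnchor
    public GameObject centerEyeAnchor;       // CenterEyeAnchor from the OVR camera rig
    public GameObject speechbubble;          // world space canvas holding the "Hello, there!" text
    public Animator anim;                    // Carl's Animator, with the Idle and Wave triggers

    // Show the speech bubble when display is 1, hide it when display is 0.
    void changeMenuDisplay (GameObject menu, int display) {
        menu.SetActive(display == 1);
    }

    // Update() as shown above goes here.
}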
To get all of the hands in the scene, the script uses GetAllPhysicsHands() from HandController.cs:
HandModel[] userHands = handController.GetAllPhysicsHands();
GetAllPhysicsHands() returns an array of all Leap physics HandModels for the specified HandController. To get each hand's position, the script uses GetPalmPosition() which returns the Vector3 position of the HandModel relative to the HandController. The HandController is located at 0, 0, 0 relative to its parent object, the CenterEyeAnchor.
The HandController is a child of the CenterEyeAnchor.
The HandController is located at 0, 0, 0 relative to its parent, the CenterEyeAnchor.
The CenterEyeAnchor object is used by the Oculus Rift integration scripts to maintain a position directly between the two eye cameras. As the cameras are the user’s eyes, if the Y value of a HandModel object's position is greater than the Y value of the centerEyeAnchor, we know the user's hand has been raised above eye level.
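While tuning this, a quick way to see where that comparison lands for a given user is to log each palm's height relative to eye level. This little helper is a debugging sketch of my own, not part of the demo:

// Debugging sketch (not in wavinghello.cs): log how far each detected palm
// is above or below eye level, in meters.
void LogHandHeights () {
    HandModel[] userHands = handController.GetAllPhysicsHands();
    foreach (HandModel hand in userHands) {
        float offset = hand.GetPalmPosition().y - centerEyeAnchor.transform.position.y;
        Debug.Log(string.Format("Palm is {0:F2} m {1} eye level",
                                Mathf.Abs(offset), offset >= 0 ? "above" : "below"));
    }
}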
The user experience
When testing this demo I was looking at how high the user should raise their hand to trigger the interaction, how long the user should have their hand raised, and what the application should do when it loses hand tracking. Initially, I went with what seemed comfortable for me: I required users to raise their hand (measured from the center of the palm) above eye level, and I did not require the hand to be raised for any specific amount of time. If the Leap lost hand tracking, the application treated it as though all hands were below eye level.
I then grabbed three people to do some testing. The only instruction I gave them was to “raise your hand to get the guy’s attention.” For my first user, the demo worked quite well. He raised his hand and the character waved back as expected. Great so far. My second user was resistant to raising his hand any higher than his nose, and he quickly got frustrated because he could not get the guy’s attention. My third user raised his hand and then waved it around so wildly that the speech bubble flickered and was unreadable. Quite a range of results for only three users.
For my next iteration, I set the threshold for raising one’s hand a few centimeters below eye level.
models.GetPalmPosition().y >= centerEyeAnchor.transform.position.y - 0.03f
This worked for my second user as it was low enough that he would trigger the interaction, but not so low that he would accidentally trigger it.
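Since the right offset clearly varies from person to person, it might be worth pulling it out into a field that can be tuned in the Inspector instead of hard coding it. This is just a sketch of that idea, not something the current demo does:

// Hypothetical tweak: expose the eye-level offset (in meters) so it can be
// tuned per user in the Inspector. 0.03 reproduces the threshold used above.
public float eyeLevelOffset = 0.03f;

bool IsHandRaised (HandModel hand) {
    return hand.GetPalmPosition().y >= centerEyeAnchor.transform.position.y - eyeLevelOffset;
}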
I haven’t done anything to address the third user yet, but whatever I do, waving my hands like a maniac is now part of my own testing checklist.
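One possible fix would be to require the hand to stay above (or below) the threshold for a short dwell time before switching states, so brief dips in hand position don't toggle the speech bubble. Here is an untested sketch of that idea, not something the demo currently does:

// Untested sketch of a dwell-time check: only switch between Idle and Wave
// after the hand has been on one side of the threshold for dwellTime seconds,
// so rapid waving doesn't make the speech bubble flicker.
public float dwellTime = 0.3f;   // seconds the hand must stay raised (or lowered)
private float timeRaised = 0f;
private float timeLowered = 0f;

void UpdateWaveState (bool handRaised) {
    if (handRaised) {
        timeRaised += Time.deltaTime;
        timeLowered = 0f;
        if (timeRaised >= dwellTime) {
            anim.SetTrigger("Wave");
            changeMenuDisplay(speechbubble, 1);
        }
    } else {
        timeLowered += Time.deltaTime;
        timeRaised = 0f;
        if (timeLowered >= dwellTime) {
            anim.SetTrigger("Idle");
            changeMenuDisplay(speechbubble, 0);
        }
    }
}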
I’d love to hear if anyone else is using this type of mechanic and what their experiences are.
Hi Karen, this might be a silly question, but I am just wondering why you have used the "Raw Mocap Data package for Mecanim" instead of the "Mecanim Locomotion Starter Kit" because, to my limited knowledge, the Mocap data package is a bunch of animations only. Of course, we can use them, but they do not replace the locomotion animator. Could you please elaborate on this part, as I have just started learning about it? Thank you.
No real reason. I only needed a simple animation to test the user experience and it worked for that.