Anatomy of a Button Press in VR

Sarthak Ghosh
7 min read · Aug 30, 2019


Buttons form an important part of many VR experiences. Virtual buttons can be placed in the environment to activate different game levels, or to collect user input, for example through virtual keyboards. You might have seen the virtual keyboard in the VR application “Rec Room”, and it works pretty well, doesn’t it?

Keyboard in Rec Room. The controllers appear as hands.

Interactions based on button presses can be quite effective in situations where the margin for human error is minimal and the “learnability” of the interaction needs to be high. A button by design indicates the “affordance” of pressing, so in most cases it doesn’t need any interaction prompts, even for first-time VR users. So today I am going to go into the very basics of how to create a simple, functional button in VR.

3D models

First we need two 3D models:

  1. The base — This is the stationary part relative to which the button moves. I am going to use a flat, platform-like structure with a hole in it for the button to fit into.
  2. The button — This is the movable part. I am going to use a basic brick shape that fits into the hole of the base.
Some simple models made using Blender

We can export these models separately as FBX files, import them into our Unity scene, and scale them down for use in VR — it is better to use the Scale Factor attribute in the model’s import settings. Make sure the pivots of the models are set to their respective centers.

Models imported into Unity with some simple materials.

Text

Next we need to add some text to the surfaces. There are a few ways of doing this. For our purposes, we can use the 3D Text option; you can also try Text Mesh Pro. The button text has to be placed under the button’s hierarchy so that it moves with the button. For the text to look sharp, increase the font size and then scale down the whole GameObject.

Scene hierarchy (left). 3D text on the surfaces (right)

Rigidbody and Colliders

We need to make sure that when we touch the button’s mesh with our controllers, the button moves back along one of its local axes. So we add a Rigidbody component to the button, and box colliders on both the button and the controllers’ ends.

Box Collider on the head of the controller
Inspector view of a controller

Preventing rotation: We need to freeze rotation on the button’s Rigidbody component and also deselect both the “Use Gravity” and “Is Kinematic” options.
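The Rigidbody setup above can also be done from code. Here is a minimal sketch (the class name is my own choice; the property values mirror the inspector settings just described):

```csharp
using UnityEngine;

// Hypothetical helper: configures the button's Rigidbody at startup so it
// can be pushed by collisions but never rotates or falls under gravity.
public class ButtonRigidbodySetup : MonoBehaviour
{
    void Awake()
    {
        Rigidbody rb = GetComponent<Rigidbody>();
        rb.useGravity = false;   // "Use Gravity" deselected
        rb.isKinematic = false;  // "Is Kinematic" deselected
        rb.constraints = RigidbodyConstraints.FreezeRotation;
    }
}
```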

Detecting controller collisions

We can now create a script that looks for collision events on the controller. This script has a Boolean variable that stays true for the duration of a collision. Furthermore, whenever the controller is colliding, we can initiate a slight haptic vibration on it. Not only does this act as a good feedback mechanism, it also acts as a debugging aid that tells us whether Unity is detecting the collision at all — detecting collisions with small objects in Unity can be very tricky sometimes.

This script has to be attached to both controllers.
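A sketch of what such a controller script could look like — the haptics call is SDK-specific, and I am assuming the Oculus OVRInput API here as an example:

```csharp
using UnityEngine;

// Sketch of the controller-side collision detector. Attach to each controller.
public class ControllerCollisionDetector : MonoBehaviour
{
    public bool isColliding;                // true for the duration of a collision
    public OVRInput.Controller controller;  // which hand this script sits on

    void OnCollisionEnter(Collision other) { isColliding = true; }
    void OnCollisionExit(Collision other)  { isColliding = false; }

    void FixedUpdate()
    {
        // A slight vibration while touching anything doubles as debug feedback.
        float amplitude = isColliding ? 0.1f : 0f;
        OVRInput.SetControllerVibration(1f, amplitude, controller);
    }
}
```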

Constraining the button motion

Whenever the button is disturbed by a collision, we need a script to make sure that the button only moves along one of its local axes. We leave free the axis along which the button should be pressable and lock all the others. Specifying a return speed lets the button ease back to its original position after the user withdraws her hand (controller).
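The constraint could be sketched like this, assuming local Z is the pressing axis (your model may use a different one) and that `maxTravel` and `returnSpeed` are tuned in the inspector:

```csharp
using UnityEngine;

// Sketch: locks the button to its local Z axis, clamps its travel, and eases
// it back toward the rest position at returnSpeed when released.
public class ButtonAxisConstraint : MonoBehaviour
{
    public float returnSpeed = 2f;   // units per second
    public float maxTravel = 0.02f;  // how far the button may be pressed

    Vector3 restPosition;            // local rest position, captured at start

    void Start() { restPosition = transform.localPosition; }

    void FixedUpdate()
    {
        Vector3 p = transform.localPosition;
        // Lock every axis except local Z, and clamp the travel distance.
        float z = Mathf.Clamp(p.z, restPosition.z, restPosition.z + maxTravel);
        transform.localPosition = new Vector3(restPosition.x, restPosition.y, z);

        // Ease back toward the rest position; collisions push it out again
        // while the controller is still holding it down.
        transform.localPosition = Vector3.MoveTowards(
            transform.localPosition, restPosition, returnSpeed * Time.fixedDeltaTime);
    }
}
```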

Finally, our button game object looks like this in the inspector view

Scene Hierarchy

Button press detection

For any button, we need to know whether it has been pressed down far enough to initiate an action or event. We can do this in a number of ways. The best way is to place a box collider beneath the button at a certain distance along its motion axis. This collider needs to be set to receive trigger events (check “Is Trigger”). It is a good idea to make the collider thick enough that the button cannot pass through to its other side. Using this method, we can have any number of buttons on a keyboard, and every press can be detected with a single, large enough collider at the back.

The fluorescent green wire-frame box indicates the box-collider for detecting button presses.

Next, we write a script to control what happens when a button enters this collider:
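A minimal sketch of that script, assuming the buttons are tagged “Button” (a tag of my own choosing) and that the press handler is still to be filled in:

```csharp
using UnityEngine;

// Sketch of the base collider's script: fires when a button travels far
// enough to enter the trigger volume.
public class ButtonPressDetector : MonoBehaviour
{
    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Button"))   // assumes buttons are tagged "Button"
        {
            Debug.Log(other.name + " pressed");
            // ...initiate the button's action or event here
        }
    }
}
```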

The inspector view of the base collider

Designing feedback

From an end-user’s point of view, this is probably the most important step. We need to be able to communicate the current state of the button to the user. In their blog, Leap Motion defines six stages of a button press:

  • Approach [HoverBegin]
  • Contact [ContactBegin]
  • Depression [ContactStay]
  • Engagement [OnPress]
  • Ending contact [ContactEnd]
  • Recession [HoverEnd]

Feedback design for approach and recession may be useful in certain cases, but for our purposes we concentrate on the other four states. For each of the four states (Contact, Depression, Engagement and Ending Contact), we can design visual, auditory and haptic feedback, used either individually or in combination.

Visual

  • Contact: Change of button color/transparency
  • Depression: The movement of the button is itself a form of visual feedback. We can also slightly change its scale as it is pressed further, for example flattening it out a bit.
  • Engagement: We can change the text color upon engagement, or even use particle effects radiating from the button base. Depending on the complexity of the scene, we may decide to keep it as simple as possible.
  • Ending contact: Change the color/transparency back to its initial state
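The color-change feedback could be sketched as follows, assuming the button’s material exposes a standard color property and that the Contact/Ending Contact handlers are wired up by whichever script detects those states:

```csharp
using UnityEngine;

// Sketch: tints the button on contact and restores the original color on
// ending contact.
public class ButtonVisualFeedback : MonoBehaviour
{
    public Color contactColor = Color.cyan;  // example tint; pick your own

    Renderer rend;
    Color originalColor;

    void Start()
    {
        rend = GetComponent<Renderer>();
        originalColor = rend.material.color;
    }

    public void OnContact()       { rend.material.color = contactColor; }
    public void OnEndingContact() { rend.material.color = originalColor; }
}
```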

Auditory

If we are designing for accessibility, it is best to include auditory feedback for all four states. For now, we can just include a soft click on Contact and a louder click on Engagement.
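A sketch of that two-click scheme, assuming an AudioSource on the button and two click clips assigned in the inspector:

```csharp
using UnityEngine;

// Sketch: a quiet click plays on contact, a louder one on engagement.
public class ButtonAudioFeedback : MonoBehaviour
{
    public AudioSource source;
    public AudioClip contactClip;     // soft click
    public AudioClip engagementClip;  // louder click

    public void OnContact()    { source.PlayOneShot(contactClip, 0.3f); }
    public void OnEngagement() { source.PlayOneShot(engagementClip, 1f); }
}
```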

Haptic

  • Contact: A short, mild buzz on the colliding controller
  • Depression: We can design discrete mild pulses as the button is pressed further. This can give the feeling of pressing against a mechanical spring
  • Engagement: A quick, stronger (greater magnitude) buzz on the colliding controller
  • Ending contact: The same short, mild buzz used for the Contact state.

Final Output

Discussion

Before ending, I would like to point out a few other things to consider.

Debouncing: Sometimes, if the meshes are complicated, we can receive multiple “OnTriggerEnter” events from Unity. In such cases, it may be useful to include a debouncing technique, such as introducing a time delay after every button press so that subsequent events are ignored.
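A time-delay debounce could look like this (the `debounceTime` value is a placeholder to tune per keyboard):

```csharp
using UnityEngine;

// Sketch: ignore trigger events that arrive within debounceTime of the
// last accepted press.
public class DebouncedTrigger : MonoBehaviour
{
    public float debounceTime = 0.3f;  // seconds; tune to taste
    float lastPressTime = -1f;

    void OnTriggerEnter(Collider other)
    {
        if (Time.time - lastPressTime < debounceTime) return;  // too soon, skip
        lastPressTime = Time.time;
        Debug.Log(other.name + " pressed (debounced)");
    }
}
```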

Preventing Drag: If the user moves her hand upward from underneath the button, the button can move along with the hand, away from its base. This is usually undesirable, so it makes sense to include another script to prevent this kind of drag.
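One way to sketch such a guard is to clamp the button so it can never move past its rest position in the outward direction (again assuming local Z as the press axis):

```csharp
using UnityEngine;

// Sketch: prevents the button from being dragged out of its base.
public class ButtonDragGuard : MonoBehaviour
{
    Vector3 restPosition;  // local rest position, captured at start

    void Start() { restPosition = transform.localPosition; }

    void LateUpdate()
    {
        Vector3 p = transform.localPosition;
        if (p.z < restPosition.z)  // moved outward, past the rest point
            transform.localPosition = new Vector3(p.x, p.y, restPosition.z);
    }
}
```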

Optimal Interaction Zone: Any keyboard that requires the user to reach out and press a button should be displayed in an optimal zone near the user’s current position. One should determine the keyboard’s optimal position and orientation by testing and adjusting a number of times. I like to do it relative to the camera position, to account for the user’s height.

Spring Joints: The movement of buttons can also be modeled using spring joints in Unity, but that discussion is for another time.

Using Unity Events: If we have multiple keys, we can use Unity Events to fire an event carrying the key name every time a key is pressed far enough to enter the trigger collider.
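A sketch of that idea, where the event payload is simply the key GameObject’s name and listeners (say, a text field) are wired up in the inspector:

```csharp
using UnityEngine;
using UnityEngine.Events;

// A serializable UnityEvent carrying the pressed key's name.
[System.Serializable]
public class KeyPressedEvent : UnityEvent<string> { }

// Sketch: attach to the keyboard's base trigger collider.
public class KeyboardBaseCollider : MonoBehaviour
{
    public KeyPressedEvent onKeyPressed;

    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Button"))       // assumes keys are tagged "Button"
            onKeyPressed.Invoke(other.name);  // key name = GameObject name
    }
}
```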

Hands instead of controllers: Button presses can work better if the controller models are replaced with hand models in which the index finger is raised while the other fingers are curled. Some basic models can be found on the Oculus website, provided under a Creative Commons license.

Lastly, a shout-out to Felix Noller for his blog, which inspired this one.
