The Headjack SDK offers a simple cross-platform VR input system covering controller tracking, headset gaze tracking and a small set of button states. By limiting the input system to these essentials, it is possible to get consistent and predictable input handling across multiple VR platforms.
Laser Pointer & Gaze Pointer
To allow the user to select an option from the menu, you can either display a laser pointer originating from the current dominant VR controller, or display a crosshair in the middle of the user’s view that the user aims by moving their head.
Controller Tracking
To turn on and show the laser pointer coming from the active controller, set Headjack.VRInput.MotionControllerLaser to true. Before using the laser pointer, it is recommended to check whether an active controller is detected by the SDK using Headjack.VRInput.MotionControllerAvailable and to fall back on gaze tracking (see below) if a controller is not currently available/connected.
If you want to get the world position and orientation of the currently active VR controller directly, for instance to show a virtual representation of the controller, it can be found at Headjack.VRInput.LaserTransform.
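For example, a virtual controller model could be kept in line with the tracked controller by copying the pose of this transform every frame. The sketch below assumes the properties behave as described above; the ControllerModel field and component name are purely illustrative:

```csharp
using UnityEngine;

// Illustrative sketch: keeps a visual controller model aligned with the
// tracked controller pose exposed by the Headjack SDK.
public class ControllerModelFollower : MonoBehaviour
{
    // Visual stand-in for the physical controller (assigned in the inspector).
    public Transform ControllerModel;

    void Update()
    {
        if (!Headjack.VRInput.MotionControllerAvailable || ControllerModel == null)
        {
            return;
        }

        Transform laser = Headjack.VRInput.LaserTransform;
        if (laser != null)
        {
            // Copy the tracked pose onto the visual model.
            ControllerModel.SetPositionAndRotation(laser.position, laser.rotation);
        }
    }
}
```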
Head Tracking
To show the crosshair for gaze/head tracking, set Headjack.App.ShowCrosshair to true. While it is possible to enable both the gaze crosshair and the controller laser pointer, to avoid confusion it is highly recommended to only enable one of the two and to ensure the other is properly disabled.
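A minimal sketch of keeping the two pointers mutually exclusive, enabling the laser whenever a controller is tracked and the gaze crosshair otherwise (assuming both flags can be toggled at runtime as described above):

```csharp
using UnityEngine;

// Illustrative sketch: enables exactly one pointer at a time.
public class PointerModeSelector : MonoBehaviour
{
    void Update()
    {
        bool hasController = Headjack.VRInput.MotionControllerAvailable;

        // Laser pointer when a motion controller is connected...
        Headjack.VRInput.MotionControllerLaser = hasController;
        // ...gaze crosshair otherwise, never both at once.
        Headjack.App.ShowCrosshair = !hasController;
    }
}
```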
Check Selected Object
Both the laser pointer and the gaze crosshair use Unity’s physics ray casting system to determine which GameObject is selected, so make sure that every button or other GameObject that can be selected in the app has an appropriately sized Collider component.
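For instance, a menu button created at runtime could be given a simple box-shaped collider so the pointer ray cast can hit it (the sizes below are placeholder values):

```csharp
using UnityEngine;

// Illustrative sketch: ensures a selectable menu button has a Collider,
// because without one the laser/gaze ray cast cannot select it.
public class MenuButtonSetup : MonoBehaviour
{
    void Start()
    {
        if (GetComponent<Collider>() == null)
        {
            BoxCollider box = gameObject.AddComponent<BoxCollider>();
            // Size the collider roughly to the visible button (example values).
            box.size = new Vector3(0.3f, 0.1f, 0.01f);
        }
    }
}
```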
Checking if the laser or crosshair is currently pointing at a particular GameObject/Collider is as simple as calling Headjack.App.IsCrosshairHit() and passing the GameObject/Collider and which ray cast source you want to test (MotionController, Gaze or Touch). Touch can be used to test for smartphone touch input on the Cardboard platforms in much the same way as the other input sources.
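As a sketch, a menu button script might poll for selection like this. Note that the exact IsCrosshairHit overload and the name/location of the ray cast source enum are assumptions based on the description above; consult the Headjack SDK reference for the precise signature:

```csharp
using UnityEngine;

// Illustrative sketch: polls whether the controller laser is pointing at this
// GameObject. The enum name Headjack.App.RaycastSource.MotionController is an
// assumption; check the SDK reference for the actual parameter type.
public class MenuButtonHighlight : MonoBehaviour
{
    void Update()
    {
        if (Headjack.App.IsCrosshairHit(gameObject, Headjack.App.RaycastSource.MotionController))
        {
            // React to being pointed at, e.g. highlight the button.
            Debug.Log("Pointer is over " + name);
        }
    }
}
```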
If you want more data about the physics ray cast hit (or miss), you can retrieve this ray cast data directly for each source using Headjack.App.CrosshairHit, Headjack.App.LaserHit or Headjack.App.TouchHit.
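As a hedged sketch of inspecting that data, assuming these properties expose a nullable Unity RaycastHit that is null on a miss (the exact return type may differ, so check the SDK reference):

```csharp
using UnityEngine;

// Illustrative sketch: logs the raw ray cast data of the controller laser.
// Assumption: LaserHit exposes a nullable RaycastHit that is null on a miss.
public class LaserHitLogger : MonoBehaviour
{
    void Update()
    {
        RaycastHit? hit = Headjack.App.LaserHit;
        if (hit.HasValue)
        {
            Debug.Log("Laser hit " + hit.Value.collider.name +
                " at distance " + hit.Value.distance);
        }
    }
}
```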
Button Input
The button input system in the Headjack SDK only identifies two buttons: Confirm and Back. Together with a pointer this should be enough for most simple menu interactions. These virtual buttons are mapped on each supported VR platform to the buttons users might intuitively associate with those two actions. On the Oculus platform, for instance, Confirm is mapped to the A and X buttons, as well as the trigger on both controllers. The Back button is mapped to the B and Y buttons. The other platforms and controller types have similar mappings.
These Confirm and Back button states are easily checked by getting Headjack.VRInput.Confirm and Headjack.VRInput.Back respectively. This will return a Headjack.VRInput.VrButton object which contains the button Hold state, Pressed state (only true on the first frame the button is pressed) and Released state (only true on the first frame the button is released).
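For example, a per-frame check of the two virtual buttons might look like the following sketch, based on the states described above:

```csharp
using UnityEngine;

// Illustrative sketch: reacts to the cross-platform Confirm and Back buttons.
public class ButtonInputExample : MonoBehaviour
{
    void Update()
    {
        // Pressed is only true on the first frame the button goes down.
        if (Headjack.VRInput.Confirm.Pressed)
        {
            Debug.Log("Confirm pressed");
        }

        // Hold stays true for as long as the button is held down.
        if (Headjack.VRInput.Back.Hold)
        {
            Debug.Log("Back is being held");
        }

        // Released is only true on the first frame the button comes back up.
        if (Headjack.VRInput.Confirm.Released)
        {
            Debug.Log("Confirm released");
        }
    }
}
```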
If you need more sophisticated VR input like haptic feedback/rumble, thumbstick/touchpad handling or multiple controller tracking, please refer directly to the vendor-specific SDK documentation of the VR platform(s) you are looking to support.