Imagine you want to start developing for the HTC Vive. You want to do things like:
- reacting to the controllers
- using pointers
- grabbing and throwing objects
- manipulating things
- using menus and UI
- gaze, etc.
This means a big effort. Here is where the Virtual Reality Toolkit comes into play. It’s a collection of scripts that was brought to life by a great Brit who also does an amazing job of growing a great community of by now over 180 people. The code is free to use (and contribute to) on GitHub, and for easy consumption it is also available on the Unity Asset Store. There’s a Slack channel where nearly all communication happens, and YouTube tutorials showing all features in action.
Since I like this project so much, I decided early on to become a part of it and contribute. For a game I have in mind, I need to interact a lot with the environment: think of things like buttons, levers, and drawers. So I introduced these 3D controls to the toolkit. So far I have a button, slider, dial, lever, drawer, and chest; next up is a door. Example scene 25 shows all of these in action. Check it out!
What are the advantages over assigning the needed physics components yourself? Many. The controls provide a full out-of-the-box experience. Take the button: add the script to any game object and it will auto-detect the push direction and the distance at which it should trigger, visualize itself with scene gizmos, and, most importantly, provide a common interface with events and a mapped value range. If you want to set up certain components manually, you can still do so.
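To give a rough idea of what consuming those events looks like, here is a minimal Unity C# sketch of a listener sitting next to a button control. The class name `VRTK_Button` and the `events.OnPush` member are assumptions based on the toolkit's conventions and may differ between versions; the example scene is the authoritative reference for the exact API.

```csharp
using UnityEngine;
using VRTK; // namespace of the Virtual Reality Toolkit scripts

// Attach this to the same game object as a VRTK_Button to react to pushes.
// Instead of polling the underlying physics components yourself, you simply
// subscribe to the events the 3D control exposes.
public class ButtonListener : MonoBehaviour
{
    private VRTK_Button button;

    private void Start()
    {
        button = GetComponent<VRTK_Button>();
        if (button != null)
        {
            // Hypothetical event name; check your toolkit version.
            button.events.OnPush.AddListener(HandlePush);
        }
    }

    private void HandlePush()
    {
        Debug.Log("Button was pushed");
    }
}
```

The same pattern applies to the other controls (slider, dial, lever, drawer, chest): each maps its physical state onto a value range and raises events, so game logic never has to touch the joints and rigidbodies directly.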
Here are some examples:
The following video shows the scene in action: