Friday, January 9, 2015

Unity 4.6: Creating a look-based GUI for VR

In a previous post, I talked about creating GUIs for VR using world space canvases. In that example, the GUI only displayed text - it didn't have any input components (buttons, sliders, etc.). I wanted to add a button above each thought bubble that the user could click to hear the text read aloud.
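For reference, here's a minimal sketch of how such a button could be wired up. The class name SpeakOnClick, the speakButton field, and the speechClip AudioClip are placeholders of my own - the post doesn't specify how the spoken audio is produced.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Hypothetical sketch: wiring a world-space canvas Button so that clicking
// (or gaze-clicking) it plays a recorded reading of the thought bubble's text.
[RequireComponent(typeof(AudioSource))]
public class SpeakOnClick : MonoBehaviour
{
    public Button speakButton;   // the button shown above the thought bubble (assumed)
    public AudioClip speechClip; // pre-recorded reading of the bubble's text (assumed)

    void Start()
    {
        AudioSource source = GetComponent<AudioSource>();

        // Play the clip whenever the button is clicked.
        speakButton.onClick.AddListener(() =>
        {
            source.clip = speechClip;
            source.Play();
        });
    }
}
```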



As I had used a look-based interaction to toggle the visibility of the GUI, this raised the obvious question: how do I use a similar interaction for GUI input? And, importantly, how do I do it in a way that takes advantage of Unity's GUI EventSystem?

It turns out that what's needed is a custom input module that detects where the user is looking. There is an excellent tutorial posted on the Oculus forums by css that is a great place to start. That tutorial includes the code for a sample input module and walks you through setting up the GUI event camera. (You need to assign an event camera to each canvas, and one twist is that the OVR cameras don't seem to work with the GUI.) By following that tutorial, I was able to get look-based input working very quickly.
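To give a flavour of the approach, here is my own minimal sketch (not the code from the tutorial) of a gaze input module: it extends BaseInputModule, raycasts from the centre of the view each frame, and sends a click to whatever UI element is under the gaze. The clickKey field is an arbitrary placeholder for whatever trigger you use to confirm a selection.

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Minimal gaze-based input module sketch. Assumes each canvas has an event
// camera assigned and that the centre of the screen corresponds to the gaze.
public class GazeInputModule : BaseInputModule
{
    // Placeholder trigger for confirming a selection on the gazed-at element.
    public KeyCode clickKey = KeyCode.Space;

    private PointerEventData pointerData;

    public override void Process()
    {
        if (pointerData == null)
            pointerData = new PointerEventData(eventSystem);

        // Treat the centre of the screen as the "pointer" position,
        // i.e. where the user is currently looking.
        pointerData.Reset();
        pointerData.position = new Vector2(Screen.width * 0.5f, Screen.height * 0.5f);

        // Raycast against the UI and keep the closest hit.
        eventSystem.RaycastAll(pointerData, m_RaycastResultCache);
        pointerData.pointerCurrentRaycast = FindFirstRaycast(m_RaycastResultCache);
        m_RaycastResultCache.Clear();

        GameObject target = pointerData.pointerCurrentRaycast.gameObject;

        // Send enter/exit events so buttons highlight while being looked at.
        HandlePointerExitAndEnter(pointerData, target);

        // Treat the key press as a click on the gazed-at element.
        if (target != null && Input.GetKeyDown(clickKey))
        {
            ExecuteEvents.ExecuteHierarchy(target, pointerData, ExecuteEvents.pointerClickHandler);
        }
    }
}
```

Attach a module like this to the EventSystem object in your scene (in place of the StandaloneInputModule) and it drives the standard UI event callbacks, so Buttons and other Selectables behave as they normally would.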

While look-based interactions are immersive and fairly intuitive to use, keep in mind that look-based input won't work in every situation. For example, if you have attached the GUI to CenterEyeCamera to ensure that the user always sees it, the GUI will follow the user's view, meaning the user won't be able to look at any one specific option.
