Wednesday, March 26, 2014

The Facebook acquisition of Oculus VR

So, yesterday the entire world blew up, apparently.  My initial reaction was something along the lines of most people's, and I think it went something like this:

  • April 1st isn't for a few days yet
  • 'Facebook'... really?  But why?
  • Well, let's go see how the community is reacting to it and... OH MY GOD!

But you know what?  I've come to terms with it, and on the whole I believe this is a good thing.  I think that the concerns, complaints and observations people have been making deserve addressing.


Tuesday, March 25, 2014

Working with the latency tester

The Oculus latency tester is a device that lets you empirically measure the time between issuing the commands to render something on the Rift screen and the screen actually changing in response.  Someone on the Oculus forums recently asked why there aren't any test applications you can download in order to use the device.  The reason is that such an application would defeat the purpose of the device, which is to let you measure the performance and latency of your program on your hardware.

Specifically, it lets you measure the amount of time you should be feeding into the sensor fusion prediction mechanism.
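
For readers with a tester in hand, the basic flow in the C++ SDK looks roughly like the sketch below.  This is a minimal sketch assuming the 0.2.x-era OVR::Util::LatencyTest utility class; the class and method names are from memory, so check them against your own SDK headers.

    #include <cstdio>
    #include <OVR.h>

    using namespace OVR;

    int main() {
        System::Init();
        Ptr<DeviceManager> manager = *DeviceManager::Create();
        Ptr<LatencyTestDevice> device =
            *manager->EnumerateDevices<LatencyTestDevice>().CreateDevice();

        // The utility class drives the test state machine for us.
        Util::LatencyTest latencyUtil;
        latencyUtil.SetDevice(device);

        for (;;) {  // your render loop
            latencyUtil.ProcessInputs();

            // While a test is running, the utility dictates a color to
            // draw under the tester's photosensor.
            Color colorToDisplay;
            if (latencyUtil.DisplayScreenColor(colorToDisplay)) {
                // ... render a quad of colorToDisplay where the tester sits ...
            }

            // ... render the rest of the frame and swap buffers ...

            // When a run completes, the measured latency is reported here.
            const char * results = latencyUtil.GetResultsString();
            if (results) {
                printf("%s\n", results);
            }
        }
    }

The number that comes back is the figure you'd then feed into the prediction call on the sensor fusion object (SetPrediction in that SDK, if memory serves).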

Thursday, March 20, 2014

Getting started with Unity Pro

Unity Pro is a popular game engine, so let’s take a look at getting started with Rift development using it. The basic integration steps look pretty easy:
  1. Create a scene.
  2. Import the Oculus integration package into the project’s assets.
  3. Add the Oculus Character Controller prefab (OVRPlayerController) to the scene or add the Oculus Camera prefab (OVRCameraController) to the character controller in your scene.
To get started with these steps, I first downloaded the OVR Unity 4 Pro Integration package from developer.oculusvr.com. It isn’t the smallest of downloads, so while that was in progress, I fired up Unity and got started on the first step of Rift integration: creating a scene.

For a minimal Rift scene, all you need is a plane to stand on, a light to see by, and some objects to look at. Not being someone who is happy doing the bare minimum, I used the Terrain Assets standard Unity package to add a little bit of flair to my scene and created a sand beach with palm trees.

[Screenshot: the sand beach scene with palm trees]

With my scene ready, the next step was to import the Oculus integration package into my project by using Assets->Import Package->Custom package and selecting the OculusUnityIntegration.unitypackage that I downloaded earlier from OculusVR.  After importing the package and saving the scene, I had an OVR folder in my project’s assets and an Oculus menu item in my main menu listing two prefabs: OVRPlayerController and OVRCameraController. The OVRCameraController prefab is a stereo camera that is used in place of a single Unity camera, and the OVRPlayerController prefab is an OVRCameraController prefab attached to a character controller.

I wanted to see my scene running on the Rift right away, so I went with the more complete option and dragged the OVRPlayerController prefab onto my scene. Then all I needed to do was position the prefab so that it sat above the plane. After resetting the prefab's location to the origin, I looked at the character controller on the prefab and saw that it had a default Height of 2 and a Radius of 0.5. Not wanting my character buried in the sand, I set the Transform Y position to 1, putting the bottom of the 2-unit-tall capsule right at ground level. When I switched to the Game view, the scene showed left and right views.

[Screenshot: the Game view showing the side-by-side left and right views]

With the prefab in place, I selected “Maximize on play” and clicked play. And I got an error: "There are two Audio Listeners in the scene. Please ensure there is always exactly one audio listener in the scene." Both the Oculus prefab and the default Main Camera that I still had in the scene had Audio Listeners attached. After deleting the default Main Camera, I was ready to give it another whirl. This time, pressing play with the Rift attached displayed my scene in the Rift’s oval views.

[Screenshot: the scene displayed in the Rift’s oval per-eye views]

I put my Rift on and there I was standing on the sand beneath the palm trees. Now, I just need to add some surf and some sun....


Monday, March 10, 2014

Testing your matrix setup with the Oculus Rift

Working with the Rift requires per-eye manipulation of both the modelview and projection matrices.  The modelview matrix needs a horizontal translation that gives objects a sense of depth by providing a different viewpoint for each eye.  The projection matrix takes a different horizontal translation, one that provides the correct per-eye field of view and also serves to center the scene under the lens axis, instead of at the center of the viewport.
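
To make that concrete, here is a minimal sketch of the two adjustments using GLM.  The half-IPD view offset and the projection-center-offset formula follow the stereo rendering approach described in the SDK documentation of that era, but the hard-coded values are illustrative assumptions; the real ones come from the SDK's HMDInfo.

    #include <glm/glm.hpp>
    #include <glm/gtc/matrix_transform.hpp>

    // Illustrative DK1-ish values (assumptions; query HMDInfo for the real ones).
    const float IPD             = 0.064f;   // interpupillary distance, meters
    const float H_SCREEN_SIZE   = 0.14976f; // physical screen width, meters
    const float LENS_SEPARATION = 0.0635f;  // distance between lens centers, meters

    // Each eye views the world from half the IPD to the left or right
    // of the player's centerline.
    glm::mat4 eyeModelview(const glm::mat4 & playerView, bool leftEye) {
        float offset = (leftEye ? 1.0f : -1.0f) * (IPD / 2.0f);
        return glm::translate(glm::mat4(1.0f), glm::vec3(offset, 0, 0)) * playerView;
    }

    // The projection is shifted horizontally so the scene is centered
    // under the lens axis rather than at the center of the per-eye viewport.
    glm::mat4 eyeProjection(const glm::mat4 & perspective, bool leftEye) {
        float viewCenter       = H_SCREEN_SIZE * 0.25f;  // center of each half-screen
        float eyeShift         = viewCenter - LENS_SEPARATION / 2.0f;
        float projectionOffset = 4.0f * eyeShift / H_SCREEN_SIZE;  // in NDC units
        float offset = (leftEye ? 1.0f : -1.0f) * projectionOffset;
        return glm::translate(glm::mat4(1.0f), glm::vec3(offset, 0, 0)) * perspective;
    }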

Matrix manipulation can be a bit of a pain, though, and getting it wrong is pretty easy to do.  For this reason my initial examples of working with the Rift in any language or toolkit always involve rendering a small colored cube of a specific size very close to the viewer.  This type of scene serves as a useful test for the variety of ways it's possible to screw up the matrices or their application to the scene.
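
For reference, the per-eye pass for such a test scene is only a few lines.  This sketch reuses the eyeModelview and eyeProjection helpers above; drawCube is a placeholder for whatever draw call your toolkit provides.

    #include <GL/gl.h>
    #include <glm/glm.hpp>

    // Supplied elsewhere: renders the test cube with the given matrices.
    void drawCube(const glm::mat4 & modelview, const glm::mat4 & projection);

    void renderFrame(const glm::mat4 & playerView, const glm::mat4 & perspective,
                     int screenWidth, int screenHeight) {
        for (int i = 0; i < 2; ++i) {
            bool leftEye = (i == 0);
            // Each eye renders to its own half of the screen...
            glViewport(leftEye ? 0 : screenWidth / 2, 0, screenWidth / 2, screenHeight);
            // ...with its own modelview and projection matrices.
            drawCube(eyeModelview(playerView, leftEye),
                     eyeProjection(perspective, leftEye));
        }
    }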

I'm going to look at why that is, and at what the effects of incorrect matrix manipulation look like.

Saturday, March 8, 2014

Field of view in the Oculus VR

User Pogo on the Oculus VR forums asked the other day why some applications don't cover more of the available screen real estate on the Rift, and consequently occupy a smaller field of view than is possible.

Every pixel on the Rift display has a fixed relationship with the lenses (discounting the slight differences between the three sets of lenses that come with the Rift).  Each pixel has a specific θ (pronounced theta), which represents the angle between the lens axis and the line from the pixel through your eye.  Each pixel also has a specific 'perceived θ', which is the angle between the lens axis and where the pixel appears to be because of the lens distortion.

What this means is that if you're drawing on a smaller area of the Rift screen, you're essentially providing a smaller field of view.  
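
You can see the relationship numerically with the distortion polynomial the SDK uses to model the lenses.  This is a small sketch assuming the commonly cited DK1 coefficients; the authoritative values come from the SDK's HMDInfo and vary with the lens cup.

    #include <cstdio>

    // Commonly cited DK1 distortion coefficients (an assumption; the
    // real values come from the SDK's HMDInfo).
    const float K[4] = { 1.0f, 0.22f, 0.24f, 0.0f };

    // The SDK models the lens by scaling a point's distance r from the
    // lens axis (in normalized units) by a polynomial in r^2.
    float distortionScale(float r) {
        float rSq = r * r;
        return K[0] + rSq * (K[1] + rSq * (K[2] + rSq * K[3]));
    }

    int main() {
        // Pixels farther from the lens axis are magnified ever more
        // aggressively, so the outermost pixels buy a disproportionate
        // amount of field of view -- and leaving them black throws that
        // field of view away.
        for (float r = 0.25f; r <= 1.0f; r += 0.25f) {
            printf("r = %.2f -> perceived r = %.3f\n", r, r * distortionScale(r));
        }
        return 0;
    }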

As an example of the issue, Pogo posted the following images.  The first is from the Oculus configuration utility, used to test the IPD and height settings a user has entered.

[Image: the Oculus configuration utility test scene]

The second is from the lauded RedFrame environment demo.

[Image: the RedFrame environment demo]

And the last is from Half-Life 2.

[Image: Half-Life 2]

All three applications are using native Rift support, but are rendering different fields of view.  But why?  Well, there are a number of reasons that this might happen.