The Devices
The devices you want to detect basically boil down to the head tracker and the HMD. Perhaps once the Latency Tester gains wider adoption, we might include that, but for now we'll focus on these two devices.
The Head Tracker
The head tracker shows up as a Human Interface Device (HID) on the system, just like most mice, keyboards, and a variety of other USB input devices, such as the Logitech Space Navigator, a 6-DOF controller. The HID specification is designed to allow operating systems to interact with common input devices without the hassle of having to install drivers for each and every one. It can be particularly difficult to install a mouse or keyboard driver when you don't have a functioning mouse or keyboard. That's why no matter what special customizable buttons, bells and whistles your AwesomeTech GG-2500 keyboard might have, when you plug it into a computer, the basic keys still work.
That said, simply plugging in an Oculus Rift to your computer isn't going to let you do anything like control the cursor with your head without some additional effort. Being an HID device means that you don't actually need driver software for it, but since it's not recognizable as a mouse, keyboard, or other well known input device, your OS isn't going to do anything particularly useful with the information it can get back from the Rift, hence the need for an SDK to mediate communication with it.
Initializing the head tracker in the SDK commonly looks something like this:
OVR::Ptr<OVR::DeviceManager> pManager = *OVR::DeviceManager::Create();
OVR::Ptr<OVR::SensorDevice> pSensor =
    *pManager->EnumerateDevices<OVR::SensorDevice>().CreateDevice();
if (!pSensor) {
    // Enumeration found no head tracker; pSensor remains null.
}
Based on the last article, and with a lot of patience, you could put a breakpoint inside the command processing code and drill down into the actual implementation to find out how the SDK is finding (or failing to find) your Rift hardware. It's non-trivial to follow because there's a lot of abstraction and division of labor between the DeviceManager class and the array of classes it owns and maintains. Follow the path of the code and eventually it leads you to an OS-specific sub-device manager class. In the case of the head tracker this is a class named HIDDeviceManager in a file OVR_<OS>_HIDDevice.cpp, where <OS> is OSX, Win32, or Linux. Each operating system exposes the underlying HID API in a completely different fashion, so there is very little crossover between these files. However, each class must implement an Enumerate function, as this is how the owning device manager queries for the actual hardware it can find.
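To make the shape of that contract concrete, here's a simplified, self-contained sketch of the pattern: a per-OS manager exposing an Enumerate function whose callers pick devices out by vendor and product ID. The class name echoes the SDK's, but the bodies are stand-ins for the real platform code, and the 0x2833 vendor ID is the one commonly reported for Oculus VR hardware, stated here as an assumption rather than taken from the SDK source.

```cpp
#include <functional>
#include <utility>
#include <vector>

// Simplified stand-in for the SDK's per-OS HID manager. The real
// HIDDeviceManager in OVR_<OS>_HIDDevice.cpp wraps a platform HID API;
// here the platform layer is faked with a fixed list so the shape of
// the Enumerate contract is visible.
struct HidDeviceDesc {
    unsigned short vendorId;
    unsigned short productId;
};

class HIDDeviceManager {
public:
    explicit HIDDeviceManager(std::vector<HidDeviceDesc> platformDevices)
        : devices(std::move(platformDevices)) {}

    // The owning device manager passes in a visitor; the per-OS layer
    // invokes it once for each device the platform reports.
    void Enumerate(const std::function<void(const HidDeviceDesc&)>& visit) const {
        for (const auto& d : devices) visit(d);
    }

private:
    std::vector<HidDeviceDesc> devices;
};

// Count devices matching a vendor/product pair, roughly the way a
// device factory picks the tracker out of the enumeration. 0x2833 as
// the Oculus vendor ID is an assumption; verify against your hardware.
int countMatching(const HIDDeviceManager& mgr,
                  unsigned short vid, unsigned short pid) {
    int n = 0;
    mgr.Enumerate([&](const HidDeviceDesc& d) {
        if (d.vendorId == vid && d.productId == pid) ++n;
    });
    return n;
}
```

A breakpoint at the top of the real Enumerate corresponds to the loop in this sketch: if the visitor is never invoked for your device, the platform layer never reported it in the first place.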
Putting a breakpoint at the beginning of this Enumerate function is probably the best first step in debugging if your code is having trouble finding the head tracker, with the caveat that it won't help you at all if the hardware isn't being detected by the host OS in the first place. The mechanisms for determining whether that's the case vary by operating system and are outside the scope of this document. On Linux it typically involves running 'lsusb', while on Windows it involves the Device Manager or some similar development tool with more advanced features. I'm not familiar enough with OSX to even guess where to start on that platform.
It's actually rare in my (admittedly limited) experience to find this particular technique useful. If the hardware is detected by the OS, then it's typically found by the SDK. On the other hand it has proven useful for me as a first step in determining that there's a permissions issue with the HID device, a common ailment on the Linux platform which by default doesn't grant non-superusers access to arbitrary HID devices.
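For the permissions case specifically, the usual remedy on Linux is a udev rule granting access to the Rift's hidraw node. A typical rule looks like the following; the filename is arbitrary, and the 2833 vendor ID is the one commonly reported for Oculus VR hardware (verify yours with lsusb before relying on it):

```
# /etc/udev/rules.d/50-oculus.rules
# Grant all users read/write access to hidraw nodes whose USB vendor ID
# matches the (assumed) Oculus VR vendor ID.
KERNEL=="hidraw*", ATTRS{idVendor}=="2833", MODE="0666"
```

After adding the rule, reload udev or replug the device for it to take effect.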
The HMD
Access to the HMD follows the same patterns as the head tracker, but is even simpler. There's no intervening API other than whatever your OS provides for enumerating monitors. The file in which to start your investigation is OVR_<OS>_HMDDevice.cpp and the EnumerateDevices method of the class HMDDeviceFactory.
Interestingly, the mechanisms for identifying the HMD appear to be radically different between operating systems this time. While the HID APIs for gathering information differ per platform, in the end you're identifying the tracker by matching a product ID and vendor ID fetched from the platform-specific API. With the HMD, each OS uses different information, gathered from whatever API is available, to identify candidate monitors.
On OSX numeric identifiers culled from the functions CGDisplayVendorNumber and CGDisplayModelNumber are used to identify Rift displays. I assume these values are extracted somehow from the EDID information of the monitor.
On Windows platforms the SDK uses string matching against the DeviceID field contained in the Win32 DISPLAY_DEVICE structure for the given monitor. Again, this is presumably ultimately derived from the EDID information from the monitor.
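That kind of check reduces to a substring match on the DeviceID string. In this sketch the "OVR" marker is a hypothetical placeholder, not the actual identifier the SDK searches for:

```cpp
#include <string>

// Hypothetical version of the Windows-side check: decide whether an
// enumerated monitor is a Rift by substring-matching the DeviceID
// field of its DISPLAY_DEVICE structure. "OVR" is an assumed marker
// standing in for whatever string the SDK really looks for.
bool looksLikeRiftDeviceId(const std::string& deviceId) {
    return deviceId.find("OVR") != std::string::npos;
}
```

The fragility the surrounding text describes is visible even here: the match is only as reliable as the string the display driver chooses to report.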
On Linux platforms, somewhat alarmingly, the SDK doesn't seem to use the EDID information at all. Instead it queries the Xinerama extension library for all the screens, and then iterates over them looking for ones that have a resolution of 1280x800. This can be problematic, since Xinerama will ignore the screen if it's not currently enabled, and worse, it's reporting the current display resolution of the screen, rather than the native resolution, so if your Rift defaulted to 1280x720 for some reason, it won't be detected. Hopefully this will change in a future release of the SDK. Xinerama is largely being displaced by the RANDR extension which allows access to the EDID information and would likely improve the reliability of HMD detection on Linux environments.
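The resolution heuristic, and its pitfall, can be sketched in a few lines; ScreenInfo here is a stand-in for the per-screen geometry Xinerama actually returns, not a type from the SDK:

```cpp
// Stand-in for the per-screen geometry reported by Xinerama. Note that
// this is the screen's *current* mode, not its native mode.
struct ScreenInfo {
    int width;
    int height;
};

// Mimics the Linux heuristic described above: treat any screen whose
// current resolution is exactly 1280x800 as a Rift candidate. A Rift
// that happens to be running at 1280x720 is silently missed, which is
// exactly the pitfall described in the text.
bool isRiftCandidate(const ScreenInfo& s) {
    return s.width == 1280 && s.height == 800;
}
```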
Right now, the identification of the HMD seems kind of sketchy, but this isn't any fault of Oculus VR. Up until now there really hasn't been a compelling need for an OS level concept of a 'different kind of monitor'. As such there's no real middle ground established for being able to query monitors and target them for output for graphics rendering, while still ignoring them for the purposes of the conventional OS desktop metaphor. If VR and AR hardware achieves more mainstream acceptance, this will undoubtedly change, with EDID extensions arriving to allow OSes to identify hardware that should be excluded from desktop rendering, but still made available via accelerated rendering APIs such as OpenGL and DirectX. In the nearer term, we can only hope OS developers enable some kind of manual mechanism to accomplish this via their respective 'Display' control panels.