Friday, August 22, 2014

Advanced uses of Timewarp II - When you're running late

[This is post three of three on Timewarp, a new technology available on the Oculus Rift. This is a draft of work in progress of Chapter 5.7 from our upcoming book, "Oculus Rift in Action", Manning Press. By posting this draft on the blog, we're looking for feedback and comments: is this useful, and is it intelligible?]


5.7.2 When you're running late

Of course, when the flak really starts to fly, odds are that you won’t be rendering frames ahead of the clock—it’s a lot more likely that you’ll be scrambling to catch up.  Sometimes rendering a single frame takes longer than the number of milliseconds your target framerate allows.  But timewarp can be useful here too.

Say your engine realizes that it’s going to be running late.  Instead of continuing to render the current frame, you can send the previous frame to the Rift and let the Rift apply timewarp to the images generated a dozen milliseconds ago (Figure 5.12).  Sure, they won’t be quite right—but if it buys you enough time to get back on top of your rendering load, it’ll be worth it, and no human eye will catch it when you occasionally drop one frame out of 75.  Far more importantly, the image sent to the Rift will continue to respond to the user’s head motions with absolute fidelity; low latency means responsive software, even with the occasional lost frame.

Remember, timewarp can distort any frame, so long as it’s clear when that frame was originally generated so that the Rift knows how much distortion to apply.

Figure 5.12: If you’re squeezed for rendering time, you can occasionally save a few cycles by dropping a frame and re-rendering the previous frame through timewarp.
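To make that concrete, here is a minimal sketch of what the fallback path might look like with SDK-side distortion rendering in the 0.4 SDK.  The frameWillBeLate() and renderScene() functions are hypothetical placeholders for your own timing check and scene rendering, not SDK calls; the point is that the late path skips rendering entirely and resubmits the previous frame’s textures along with the poses they were rendered from, letting the SDK’s timewarp re-project them.

```cpp
#include <OVR_CAPI.h>

// Hypothetical placeholders for your engine's own logic (not SDK calls):
bool frameWillBeLate(const ovrFrameTiming & timing);   // one possible version is sketched below
void renderScene(const ovrPosef eyePoses[2], ovrTexture eyeTextures[2]);

// The textures and poses of the most recently rendered frame.
static ovrTexture g_eyeTextures[2];
static ovrPosef   g_eyePoses[2];

void drawFrame(ovrHmd hmd, unsigned int frameIndex,
               ovrVector3f hmdToEyeViewOffset[2]) {
    ovrFrameTiming timing = ovrHmd_BeginFrame(hmd, frameIndex);

    if (!frameWillBeLate(timing)) {
        // Normal path: fetch fresh eye poses and render the scene into the
        // eye textures as usual.
        ovrTrackingState trackingState;   // not used here
        ovrHmd_GetEyePoses(hmd, frameIndex, hmdToEyeViewOffset,
                           g_eyePoses, &trackingState);
        renderScene(g_eyePoses, g_eyeTextures);
    }
    // Late path: rendering was skipped, so g_eyeTextures still holds the
    // previous frame.  Because we also resubmit the poses that frame was
    // rendered from, the SDK's timewarp re-projects the old image to the
    // user's current head orientation.
    ovrHmd_EndFrame(hmd, g_eyePoses, g_eyeTextures);
}
```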

The assumption here is that your code is sufficiently instrumented and capable of self-analysis that you do more than just render a frame and hope it was fast enough.  Carefully instrumented timing code isn’t hard to add, especially with display-bound timing methods such as ovrHmd_GetFrameTiming, but it does mean more complexity in the rendering loop.  If you’re using a commercial graphics engine, it may already have this support baked in.  This is the sort of monitoring that any 3D app engine that handles large, complicated, variable-density scenes will hopefully be capable of performing.
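The check itself doesn’t have to be elaborate.  The ovrFrameTiming value returned by ovrHmd_BeginFrame (or fetched with ovrHmd_GetFrameTiming) tells you when the SDK expects rendering to be finished so that timewarp and distortion can run, so a lateness test can be as small as the sketch below.  The g_averageRenderSeconds value is an assumption standing in for your own instrumentation—it is not an SDK field.

```cpp
#include <OVR_CAPI.h>

// Maintained by your own instrumentation -- for example, a moving average of
// how long recent frames took to render.  This is an assumed value, not
// something the SDK provides.
static double g_averageRenderSeconds = 0.010;   // ~10 ms starting guess

// One way the lateness check might look: compare "now plus our expected
// render cost" against the moment the SDK expects rendering to be complete
// so that timewarp and distortion can run.
bool frameWillBeLate(const ovrFrameTiming & timing) {
    double now = ovr_GetTimeInSeconds();
    return (now + g_averageRenderSeconds) > timing.TimewarpPointSeconds;
}
```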

Dropping frames with timewarp is an advanced technique, and probably not worth investing engineering resources into early in a project.  This is something that you should only build when your scene has grown so complicated that you anticipate having spikes of rendering time.  But if that’s you, then timewarp will help.

1 comment:

  1. Fascinating stuff!

    I can imagine quite a few ways to improve timewarp. Maybe you could comment on why these aren't possible / haven't been done.

    - assuming I can accurately predict how long the rendering will take, would it not be better to wait at the beginning of the frame instead of at the end?
    - is it not possible to render the two eyes in parallel, say in parallel threads?
    - would it not be better to make rendering and timewarp asynchronous, again with one thread continuously rendering and timewarp picking up the most recently finished render?

    On an unrelated note, is it possible with the 0.4 SDK to provide input for timewarp from a custom head-tracking sensor?

