instantreality 1.0

Going virtual with OculusVR

Oculus, Virtual Reality, HMD, Tracking
Author(s): Tobias Alexander Franke
Date: 2013-06-25

Summary: This tutorial will show you how to set up tracking with the Oculus VR HMD in InstantReality and X3DOM.


The Oculus Rift is a low-cost and lightweight stereo HMD for VR applications. It is integrated into InstantIO and can be easily used with your X3D application.

After connecting the kit to your PC, first make sure that your Oculus Rift set is running properly by downloading the SDK and running one of the sample applications. If the application is responding to rotation, then you're good to go. Please also consider that the Oculus effectively represents just another screen connected to your PC. Applications which want to display a proper context must be run in fullscreen on this screen. Depending on your configuration, this screen might be either combined with your current monitor or it might be an extension.

The InstantIO node for an Oculus looks like this:

Code: OculusRift InstantIO node

<IOSensor DEF='ovr' type='OculusRift'>
   <field accessType='outputOnly'  name='Orientation'       type='SFRotation'/>
   <field accessType='outputOnly'  name='Acceleration'      type='SFVec3f'/>
   <field accessType='outputOnly'  name='PredictionDelta'   type='SFFloat'/>
   <field accessType='inputOnly'   name='EnablePrediction'  type='SFBool'/>
</IOSensor>

The Oculus node provides the Orientation of the HMD (which you can, for instance, route directly into a Viewpoint), the current 3D Acceleration, and a PredictionDelta value if prediction is turned on (the default). You can toggle prediction manually by sending a bool value to EnablePrediction.
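As a sketch of the latter, prediction could be toggled from within the scene, for example with a clickable BooleanToggle; the TouchSensor and node names below are purely illustrative, not part of the OculusRift node itself:

```xml
<!-- Sketch: toggle prediction per click; 'touch' and 'predToggle' are illustrative names -->
<TouchSensor DEF='touch'/>
<BooleanToggle DEF='predToggle' toggle='TRUE'/>

<!-- BooleanToggle negates its state on each TRUE event, so every click flips prediction -->
<ROUTE fromNode='touch'      fromField='isActive' toNode='predToggle' toField='set_boolean'/>
<ROUTE fromNode='predToggle' fromField='toggle'   toNode='ovr'        toField='EnablePrediction'/>
```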

Going virtual

A very simple way to get the orientation of the camera to work within a scene is to route the Orientation field of the Oculus to a Viewpoint.

Code: Routing the orientation of an Oculus to a Viewpoint

<IOSensor DEF='ovr' type='OculusRift'>
   <field accessType='outputOnly' name='Orientation' type='SFRotation'/>
</IOSensor>

<Viewpoint DEF='v' position='0 2 0'/>

<ROUTE fromNode='ovr' fromField='Orientation' toNode='v' toField='orientation'/>

In a second step, we need to render the scene once for each eye, with a distortion matched to the diopter of the lenses currently mounted in the HMD. The Oculus ships with three different lens types, each with a different distortion, so you need to know which type you are currently using. To read more about this, please consult the manual included in your box.

Since InstantReality has always been designed to target VR applications, both stereo rendering and distortion filters are readily available from X3D (see the ParallelStereoViewModifier and ImprovedDistortionDisplayFilter nodes), so you needn't lift a finger. Render-context-dependent changes to your X3D scene are handled inside an Engine block in your X3D application, in which we define the window of the context and how the scene will be rendered. Consider the following code:

Code: The distorted stereo view setup

    <Engine DEF='engine'>
        <RenderJob DEF='render'>
            <Window sample='4' size='1280 800'>
                <Viewarea upperRight='0.5 1'>
                    <ParallelStereoViewModifier      DEF='leftSVM' leftEye='TRUE'/>
                    <ImprovedDistortionDisplayFilter DEF='leftDIS' leftEye='TRUE'/>
                </Viewarea>
                <Viewarea lowerLeft='0.5 0'>
                    <ParallelStereoViewModifier      DEF='rightSVM' leftEye='FALSE'/>
                    <ImprovedDistortionDisplayFilter DEF='rightDIS' leftEye='FALSE'/>
                </Viewarea>
            </Window>
        </RenderJob>
    </Engine>

We define a Window with a size of 1280x800 pixels (the resolution of the current Oculus Rift Development Kit) and a sample count of 4 (MSAA antialiasing), which helps avoid the heavy aliasing that is otherwise quite noticeable inside the HMD. The window is separated into two Viewareas, defined by their position on the screen. Each Viewarea has a ParallelStereoViewModifier controlling two things: which eye this area serves, and the parallax distance eyeSeparation used to view the scene. You can use this parameter to get a better 3D impression (or switch it off entirely).
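For instance, the eye separation could be set manually on both modifiers; the value of 0.064 below is only an assumption (roughly an average interpupillary distance in meters), and setting it to 0 switches the parallax off:

```xml
<!-- Sketch: manual eyeSeparation; 0.064 m is an assumed average IPD -->
<ParallelStereoViewModifier DEF='leftSVM'  leftEye='TRUE'  eyeSeparation='0.064'/>
<ParallelStereoViewModifier DEF='rightSVM' leftEye='FALSE' eyeSeparation='0.064'/>
```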

The ImprovedDistortionDisplayFilter controls the distortion of the viewarea. To do this, the rendered image is used in a second render pass and distorted by a fragment shader. This shader is controlled by several parameters given to the filter: HResolution and VResolution are the width and height in pixels of the screen (a current Devkit Oculus has a 1280 by 800 resolution, but the parameter is per eye, which leaves us with 640 by 800 for each eye). VScreenSize and HScreenSize are the physical width and height of the screen in meters. The distortion is computed with a polynomial of four coefficients, which are set with the distortionK field. Finally, the LensSeparationDistance parameter is the separation of your virtual lenses or cameras; it controls how you perceive depth.
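Configured by hand, a single filter might look like the sketch below. All numeric values are assumptions modeled on typical Development Kit specifications, not values verified against your device; as described in the next paragraph, you would normally let the Oculus node supply them instead:

```xml
<!-- Sketch: manually configured left-eye filter; every number here is an
     assumed Development Kit value (640x800 per eye, screen size in meters) -->
<ImprovedDistortionDisplayFilter DEF='leftDIS' leftEye='TRUE'
    width='640' height='800'
    hScreenSize='0.14976' vScreenSize='0.0936'
    distortionK='1.0 0.22 0.24 0.0'
    lensSeparationDistance='0.0635'/>
```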

Instead of setting these parameters manually, the Oculus node provides them automatically for the connected device. We can therefore simply route the parameters directly to both distortion filters:

Code: Routing the Oculus parameters to the Viewpoint and filters

<ROUTE fromNode='ovr' fromField='Orientation'               toNode='v'                toField='orientation'/>
<ROUTE fromNode='ovr' fromField='FieldOfView'               toNode='v'                toField='fieldOfView'/>

<ROUTE fromNode='ovr' fromField='HResolution'               toNode='engine::leftDIS'  toField='width'/>
<ROUTE fromNode='ovr' fromField='HResolution'               toNode='engine::rightDIS' toField='width'/>
<ROUTE fromNode='ovr' fromField='VResolution'               toNode='engine::leftDIS'  toField='height'/>
<ROUTE fromNode='ovr' fromField='VResolution'               toNode='engine::rightDIS' toField='height'/>

<ROUTE fromNode='ovr' fromField='HScreenSize'               toNode='engine::leftDIS'  toField='hScreenSize'/>
<ROUTE fromNode='ovr' fromField='HScreenSize'               toNode='engine::rightDIS' toField='hScreenSize'/>
<ROUTE fromNode='ovr' fromField='VScreenSize'               toNode='engine::leftDIS'  toField='vScreenSize'/>
<ROUTE fromNode='ovr' fromField='VScreenSize'               toNode='engine::rightDIS' toField='vScreenSize'/>

<ROUTE fromNode='ovr' fromField='DistortionK'               toNode='engine::leftDIS'  toField='distortionK'/>
<ROUTE fromNode='ovr' fromField='DistortionK'               toNode='engine::rightDIS' toField='distortionK'/>

<ROUTE fromNode='ovr' fromField='LensSeparationDistance'    toNode='engine::leftDIS'  toField='lensSeparationDistance'/>
<ROUTE fromNode='ovr' fromField='LensSeparationDistance'    toNode='engine::rightDIS' toField='lensSeparationDistance'/>

<ROUTE fromNode='ovr' fromField='HScreenSize'               toNode='engine::leftSVM'  toField='hScreenSize'/>
<ROUTE fromNode='ovr' fromField='HScreenSize'               toNode='engine::rightSVM' toField='hScreenSize'/>

<ROUTE fromNode='ovr' fromField='VScreenSize'               toNode='engine::leftSVM'  toField='vScreenSize'/>
<ROUTE fromNode='ovr' fromField='VScreenSize'               toNode='engine::rightSVM' toField='vScreenSize'/>

<ROUTE fromNode='ovr' fromField='DistortionK'               toNode='engine::leftSVM'  toField='distortionK'/>
<ROUTE fromNode='ovr' fromField='DistortionK'               toNode='engine::rightSVM' toField='distortionK'/>

<ROUTE fromNode='ovr' fromField='LensSeparationDistance'    toNode='engine::leftSVM'  toField='lensSeparationDistance'/>
<ROUTE fromNode='ovr' fromField='LensSeparationDistance'    toNode='engine::rightSVM' toField='lensSeparationDistance'/>

<ROUTE fromNode='ovr' fromField='InterpupillaryDistance'    toNode='engine::leftSVM'  toField='eyeSeparation'/>
<ROUTE fromNode='ovr' fromField='InterpupillaryDistance'    toNode='engine::rightSVM' toField='eyeSeparation'/>

You may have noticed one more parameter being routed from the Oculus node: FieldOfView. To ensure that the distortion doesn't affect your perceived image, a correct field of view has to be set. This parameter also comes preconfigured from the Oculus, so we route it to the same Viewpoint we use for the orientation.

If you have come this far, you should see a working stereo rendering of your scene with the correct distortion applied. Now simply move the window to the Oculus screen, switch InstantPlayer to fullscreen, put on your Oculus and dive right in!


