Please note: AR/VR is currently supported only on Windows desktop when using the C++ API and the DirectX 11 driver.

Beta Release: The approach detailed below is required for adding VR to your application with Visualize HPS 2018 SP1. In future releases, we will provide a generic API to make the process much easier. For a reference implementation of VR in a Visualize application, refer to the OpenVR Sandbox, which is bundled with your copy of Visualize.

12.2 Virtual Reality


For detailed instructions on setting up AR/VR with Visualize, see the Getting Started section.

12.2.1 OpenVR

Visualize supports any VR headset compatible with OpenVR. In order to add VR to your application, the following steps are required:

1. Initialize OpenVR in Visualize

The following steps are required to initialize OpenVR:

#include <openvr.h>

//The head-mounted display
vr::IVRSystem * pHmd = nullptr;

HPS::World world(HOOPS_LICENSE);

//Step 1. Initialize OpenVR
vr::EVRInitError eError = vr::VRInitError_None;
::pHmd = vr::VR_Init(&eError, vr::VRApplication_Scene);
if (eError != vr::VRInitError_None)
{
    //failed to initialize OpenVR
}

//Step 2. Get the models used for rendering the VR devices (like controllers, tracking towers, etc.)
vr::IVRRenderModels * pRenderModels = (vr::IVRRenderModels *)vr::VR_GetGenericInterface(vr::IVRRenderModels_Version, &eError);
if (!pRenderModels)
{
    vr::VR_Shutdown();
    //failed to get models
}

//Step 3. Initialize the OpenVR Compositor
if (!vr::VRCompositor())
{
    vr::VR_Shutdown();
    //failed to initialize the compositor
}

//Step 4. Optionally, create a Win32 window which will show a preview of what the headset sees.
//This window will be divided in half, showing the renditions for the left and right eyes side by side.
//You can choose to re-use an already existing Win32 window instead of creating one.
//CreateWin32Window and WndProc are helpers from the OpenVR Sandbox.
HWND hwnd = CreateWin32Window(1280, 800, L"HPS OpenVR Sandbox", false, WndProc);

//Step 5. Ask OpenVR for the size of the window to render to, then create a Visualize window with these settings.
uint32_t render_width = 1024, render_height = 1024;
if (pHmd)
    pHmd->GetRecommendedRenderTargetSize(&render_width, &render_height);
::window_aspect = (float)render_width / render_height;
HPS::OffScreenWindowOptionsKit oswok;
oswok.SetDriver(driver_type); //driver_type should be HPS::Window::Driver::DirectX11, currently the only driver supported for VR
oswok.SetHardwareResident(true); //Use this setting to improve update times
oswok.SetFramebufferRetention(false); //Use this setting because the OpenVR device owns the textures we pass to it
oswok.SetAntiAliasCapable(true);
HPS::WindowKey window_key = HPS::Database::CreateOffScreenWindow(render_width, render_height, oswok);

//Step 6. Tell Visualize that we are rendering to an OpenVR headset to enable stereo rendering.
HPS_Driver_Set_OpenVR_Bit(window_key, (HPS::WindowHandle)hwnd);

2. Create an Update Handler

Once OpenVR is initialized and Visualize has started rendering in stereo mode, we need to set up an update handler.

Visualize needs to pass the images it renders in stereo mode to the VR headset at the end of each update. To do this, we will define a FinishPictureHandler: a class containing a callback named Handle, which is invoked automatically every time Visualize finishes an update.

Here is a sample implementation of the FinishPictureHandler:

#include "arvr.h"
#include <openvr.h>
//The Handle method in the FinishPictureHandler class is called every time Visualize
//completes an update.
//This is a minimal FinishPictureHandler class, necessary to pass the images from Visualize to the VR headset
class FinishPictureHandler : public HPS::DriverEventHandler
{
public:
//The pose of each device
vr::TrackedDevicePose_t tracked_device_pose[vr::k_unMaxTrackedDeviceCount];
//The Visualize MatrixKit associated with each device
HPS::MatrixKit tracked_device_transforms[vr::k_unMaxTrackedDeviceCount];
// these are set once
HPS::MatrixKit hmd_proj_left;
HPS::MatrixKit hmd_proj_right;
HPS::MatrixKit eye_to_head_left;
HPS::MatrixKit eye_to_head_right;
// these get updated every frame
HPS::MatrixKit proj_left;
HPS::MatrixKit proj_right;
HPS::MatrixKit view_left;
HPS::MatrixKit view_right;
float const z_near = 0.1f;
float const z_far = 100.0f;
FinishPictureHandler()
{ }
//This function is called every time Visualize completes an Update
void Handle(HPS::DriverEvent const * in_event)
{
HPS::FinishPictureEvent const * fpe = (HPS::FinishPictureEvent const *)in_event;
if (pHmd)
{
//Visualize currently only supports DirectX11 for VR
static const auto api = vr::TextureType_DirectX;
//Step 1. Gather the left and right eye textures Visualize produced as part of the update and pass them to OpenVR
vr::Texture_t leftEyeTexture = { (void*)fpe->GetSurface(0), api, vr::ColorSpace_Gamma };
vr::VRCompositor()->Submit(vr::Eye_Left, &leftEyeTexture);
vr::Texture_t rightEyeTexture = { (void*)fpe->GetSurface(1), api, vr::ColorSpace_Gamma };
vr::VRCompositor()->Submit(vr::Eye_Right, &rightEyeTexture);
//Step 2. Update the pose of every attached VR device
vr::VRCompositor()->WaitGetPoses(tracked_device_pose, vr::k_unMaxTrackedDeviceCount, NULL, 0);
for (int nDevice = 0; nDevice < vr::k_unMaxTrackedDeviceCount; ++nDevice)
{
if (tracked_device_pose[nDevice].bPoseIsValid)
tracked_device_transforms[nDevice] = ConvertOpenVRMatrixToMatrixKit(tracked_device_pose[nDevice].mDeviceToAbsoluteTracking);
}
//Step 3. During the first update, calculate the projection and eye-to-head transforms
static bool first_update = true;
if (first_update)
{
hmd_proj_left = ConvertOpenVRMatrixToMatrixKit(pHmd->GetProjectionMatrix(vr::Eye_Left, z_near, z_far));
hmd_proj_right = ConvertOpenVRMatrixToMatrixKit(pHmd->GetProjectionMatrix(vr::Eye_Right, z_near, z_far));
eye_to_head_left = ConvertOpenVRMatrixToMatrixKit(pHmd->GetEyeToHeadTransform(vr::Eye_Left));
eye_to_head_left.Invert();
eye_to_head_right = ConvertOpenVRMatrixToMatrixKit(pHmd->GetEyeToHeadTransform(vr::Eye_Right));
eye_to_head_right.Invert();
first_update = false;
}
//Step 4. Calculate a camera to pass to Visualize based on the position of the VR headset
if (tracked_device_pose[vr::k_unTrackedDeviceIndex_Hmd].bPoseIsValid)
{
tracked_device_transforms[vr::k_unTrackedDeviceIndex_Hmd].ShowInverse(view);
UpdateCameraParams(
hmd_proj_left,
hmd_proj_right,
view * eye_to_head_left,
view * eye_to_head_right,
::driver_type,
::window_aspect,
proj_left,
proj_right,
view_left,
view_right,
camera_kit);
fpe->SetStereoMatrix(HPS::DriverEvent::StereoMatrix::ProjectionLeft, proj_left);
fpe->SetStereoMatrix(HPS::DriverEvent::StereoMatrix::ProjectionRight, proj_right);
fpe->SetStereoMatrix(HPS::DriverEvent::StereoMatrix::ViewLeft, view_left);
fpe->SetStereoMatrix(HPS::DriverEvent::StereoMatrix::ViewRight, view_right);
}
}
}
};
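
ConvertOpenVRMatrixToMatrixKit is a helper provided by the sandbox's arvr.h. For illustration, a possible implementation for the 3x4 tracking matrices could look like the sketch below. It assumes the usual conventions: OpenVR device matrices are row-major with the translation in the last column, while Visualize matrices keep the translation in the last row, so the conversion amounts to a transpose. A similar overload would cover the 4x4 vr::HmdMatrix44_t returned by GetProjectionMatrix.

//A sketch of a conversion helper for OpenVR 3x4 tracking matrices
//(the sandbox's arvr.h contains the reference implementation)
HPS::MatrixKit ConvertOpenVRMatrixToMatrixKit(vr::HmdMatrix34_t const & in_matrix)
{
    //transpose the matrix: OpenVR stores the translation in the last column,
    //Visualize stores it in the last row
    HPS::FloatArray elements = {
        in_matrix.m[0][0], in_matrix.m[1][0], in_matrix.m[2][0], 0.0f,
        in_matrix.m[0][1], in_matrix.m[1][1], in_matrix.m[2][1], 0.0f,
        in_matrix.m[0][2], in_matrix.m[1][2], in_matrix.m[2][2], 0.0f,
        in_matrix.m[0][3], in_matrix.m[1][3], in_matrix.m[2][3], 1.0f };

    HPS::MatrixKit kit;
    kit.SetElements(elements);
    return kit;
}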

3. Set the FinishPictureHandler on the Window

The FinishPictureHandler will then need to be set as the handler on the window we created for VR:

//An instance of the FinishPictureHandler class you created needs to be set on the
//window you created for VR.
FinishPictureHandler finishPictureHandler;
window_key.SetDriverEventHandler(finishPictureHandler, HPS::Object::ClassID<HPS::FinishPictureEvent>());

4. Import Your Model into Visualize

At this point you are ready to import your model into Visualize. If you wish, you can create a Canvas / View / Model structure starting from the offscreen window key you created for VR. This is also a good point to add the render models of the VR devices to the scene graph, if you wish to do so.

You can find an example of how this can be done in the OpenVR Sandbox, specifically with the function SetupRenderModels().
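
For illustration, here is a minimal sketch of this step, assuming a Canvas / View / Model structure and a model stored in an HSF file (the file name model.hsf is a placeholder). The View created here is the view whose camera the render loop below updates.

//A minimal sketch of importing a model (the file name is a placeholder)
//Build a Canvas / View / Model structure on top of the offscreen VR window
HPS::Canvas canvas = HPS::Factory::CreateCanvas(window_key);
HPS::View view = HPS::Factory::CreateView();
HPS::Model model = HPS::Factory::CreateModel();
canvas.AttachViewAsLayout(view);
view.AttachModel(model);

//Import an HSF file into the model segment and wait for the import to finish
HPS::Stream::ImportOptionsKit iok;
iok.SetSegment(model.GetSegmentKey());
HPS::Stream::ImportNotifier notifier = HPS::Stream::File::Import("model.hsf", iok);
notifier.Wait();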

5. Start a Render Loop

Now you are ready to start a render loop. The update at the end of the render loop should be the only one in the VR portion of your application. The loop should continue until you decide your application should quit or otherwise exit VR mode.

//This is the main render loop for a VR application.
//It should keep running until you decide your application is ready to exit VR mode or quit altogether
while (vr_mode)
{
    //Optional: if you created a Win32 window to preview the headset view, you can listen to events sent to it here
    if (!ProcessWin32Events())
        break;

    //Optional: you can listen to events sent to the VR device by OpenVR here.
    //For example, you can handle devices becoming connected or disconnected, or input from controllers.
    //For a simple example of how to implement this, refer to the OpenVR Sandbox
    ProcessVREvents(pHmd);

    //Optional: you can ask OpenVR to collect the status of the connected controllers and perform actions
    //based on their state, like you would with an operator
    ProcessControllers();

    //Update the position of the connected devices. We start past index 0, because the 0th entry is the headset.
    for (uint32_t unTrackedDevice = vr::k_unTrackedDeviceIndex_Hmd + 1; unTrackedDevice < vr::k_unMaxTrackedDeviceCount; unTrackedDevice++)
    {
        //Optional: if you decided to display the models of the attached devices, you can update their modelling
        //matrices here to reflect their positions in space.
        //tracked_device_instances is an array of segments created during setup.
        HPS::SegmentKey & device_seg = tracked_device_instances[unTrackedDevice];
        if (finishPictureHandler.tracked_device_pose[unTrackedDevice].bPoseIsValid && device_seg.Type() != HPS::Type::None)
            device_seg.SetModellingMatrix(finishPictureHandler.tracked_device_transforms[unTrackedDevice]);
    }

    //Set the camera for the scene. This is the camera which was calculated by the FinishPictureHandler::Handle() function.
    //If you are not using the Canvas / View / Model structure, set this camera wherever you decided to set your main camera
    view.GetSegmentKey().SetCamera(camera_kit);

    //Issue an update.
    //NOTE: This should be the only update you issue in a VR application
    window_key.UpdateWithNotifier().Wait();
}

//Shutdown:
//When you are done with VR, follow this shutdown procedure:

//Step 1. Shut down OpenVR
vr::VR_Shutdown();
pHmd = nullptr;

//Step 2. Optional: clean up any objects you might have created here

//Step 3. Clean up the VR window
window_key.UnsetDriverEventHandler(HPS::Object::ClassID<HPS::FinishPictureEvent>());
window_key.Delete();
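
ProcessWin32Events, ProcessVREvents, and ProcessControllers in the loop above are helpers from the OpenVR Sandbox. For illustration, a minimal ProcessVREvents could look like the following sketch, which simply drains the OpenVR event queue and reacts to devices connecting and disconnecting (the sandbox contains the reference implementation).

//A minimal sketch of a VR event pump; the OpenVR Sandbox contains a fuller implementation
void ProcessVREvents(vr::IVRSystem * hmd)
{
    vr::VREvent_t event;
    while (hmd != nullptr && hmd->PollNextEvent(&event, sizeof(event)))
    {
        switch (event.eventType)
        {
            case vr::VREvent_TrackedDeviceActivated:
                //a device was connected: a good place to load and display its render model
                break;
            case vr::VREvent_TrackedDeviceDeactivated:
                //a device was disconnected: a good place to hide its render model
                break;
            default:
                break;
        }
    }
}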

Considerations

  • Performance is the main concern when using VR. For a smooth experience, 60 FPS should be considered the bare minimum; lower or jumpy frame rates can make the user feel sick. Because of this, the main render loop should be as bare-bones as possible. If you need to perform expensive calculations, it is best to perform them on a separate thread.
  • When in VR mode, you should expect the camera to be in constant movement, since it is tied to the position of the VR headset. As a consequence, some of the approaches used by highlighting and the default operators will not work in VR mode.

    For example:

    • Our default orbit operator works by modifying the camera position. Since the position of the camera depends on the VR headset, it is necessary to change the model's modelling matrix instead (see the sketch at the end of this section).
    • Because of the constant camera changes, Visualize will not cache the state of its rendering buffers between updates. This means that the overlay highlighting setting Drawing::Overlay::WithZValues, which relies on these cached buffers, cannot be used effectively: highlights with this overlay setting will be performed sub-optimally in a VR application.

  • Currently, the Hide operation, and more generally any Highlight operation with the InPlace Overlay setting, is very expensive performance-wise and should be avoided in VR mode. This is something we are addressing for Visualize 2018 SP2.
  • Some view-dependent geometry, like non-transformable text, patterned lines, text backgrounds, simple reflections/shadows, shadow maps, custom marker symbols, and leader lines, will not transform correctly in stereo mode. This creates artifacts where two instances of that geometry appear to be visible unless the user closes one eye. These limitations will be addressed in the next release.
  • You can download the HOOPS Demonstration Viewer for a demo of a more fully-fledged VR experience created with Visualize. You can find the VR mode button in the HDV Visual Effects tab.
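
As mentioned above, an orbit in VR mode is best implemented by rotating the model rather than moving the camera. Here is a minimal sketch, assuming model_segment is the segment containing your model and theta is the desired rotation in degrees (both are placeholders, not part of the sandbox):

//A sketch: orbit in VR by rotating the model instead of the camera
//model_segment and theta are placeholders for your own segment and rotation angle
HPS::MatrixKit modelling_matrix;
model_segment.ShowModellingMatrix(modelling_matrix);
modelling_matrix.Rotate(0.0f, theta, 0.0f); //concatenate a rotation of theta degrees around the Y axis
model_segment.SetModellingMatrix(modelling_matrix);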