Please note: AR/VR is currently supported only on Windows desktop, when using the C++ API and the DirectX 11 driver.

Beta Release: The approach detailed below is required for adding VR to your application with Visualize 3DF 23.10. In future releases we will provide a generic API to make the integration process easier. For a reference implementation of VR in a Visualize application, you can refer to the openvr_simple project in the samples_arvr_v141 solution, bundled with your copy of Visualize.

Virtual Reality


OpenVR

Visualize supports any VR headset compatible with OpenVR. To add VR to your application, you will need a VR-capable GPU: for current NVIDIA GPUs, that means a GTX 1060 or better; for AMD, an RX 480 or better.

To get started with OpenVR, follow these steps:

1. Initialize OpenVR in Visualize

First, you will need to set the stereo rendering debug bit to tell 3DF to render in stereo. To do this, we pass the SINGLE_PASS_STEREO bit to HC_Set_Driver_Options. The bit can be defined as:

#define Debug_SINGLE_PASS_STEREO 0x08000000

The following steps are required to initialize OpenVR:

#include <openvr.h>

#define Debug_SINGLE_PASS_STEREO 0x08000000

typedef void (HC_CDECL * CallbackFunc)(...);

// The head-mounted display
vr::IVRSystem * pHmd = nullptr;

HC_Define_System_Options("license = `" HOOPS_LICENSE "`");

// Step 1. Initialize OpenVR
vr::EVRInitError eError = vr::VRInitError_None;
pHmd = vr::VR_Init(&eError, vr::VRApplication_Scene);
if (eError != vr::VRInitError_None)
{
    // failed to initialize OpenVR
}

// Step 2. Get the models used for rendering the VR devices (controllers, tracking towers, etc.)
vr::IVRRenderModels * pRenderModels = (vr::IVRRenderModels *)vr::VR_GetGenericInterface(vr::IVRRenderModels_Version, &eError);
if (!pRenderModels)
{
    vr::VR_Shutdown();
    // failed to get models
}

// Step 3. Initialize the OpenVR compositor
if (!vr::VRCompositor())
{
    vr::VR_Shutdown();
    // failed to initialize the compositor
}

// Step 4. Optionally, create a Win32 window which will show a preview of what the headset sees.
// This window will be divided in half, showing the renditions for the left and right eyes side by side.
// You can choose to re-use an already existing Win32 window instead of creating one.
HWND hwnd = CreateWin32Window(1280, 800, L"HOOPS OpenVR Simple", false, WndProc);

// Step 5. Create Visualize objects
// Ask OpenVR for the recommended per-eye render target size.
uint32_t render_width = 0, render_height = 0;
pHmd->GetRecommendedRenderTargetSize(&render_width, &render_height);

// HOOPS offscreen image drivers must be bound to an 'image' geometry object,
// although for our purposes here this image will not be used for anything.
HC_Open_Segment("/nowhere");
    HC_KEY image_key = HC_Insert_Image(0, 0, 0, "rgba", render_width, render_height, nullptr);
HC_Close_Segment();

// Our "finish picture" callback will be called by the driver (renderer) thread at the end of
// each update, where we submit the rendered stereo images to the OpenVR compositor and set
// the stereo camera matrices reported by OpenVR.
HC_Define_Callback_Name("my_finish_picture_callback", (CallbackFunc)finish_picture_callback);

HBaseModel * pHModel = new HBaseModel();
pHModel->Init();
HBaseView * pHView = new HBaseView(pHModel, "", "dx11", "", (void*)image_key, nullptr, nullptr, nullptr, "/driver/dx11/window1");
pHView->Init();
pHView->SetRenderMode(HRenderGouraud);
pHView->SetTransparency("hsra = depth peeling, depth peeling options = layers = 1");

HC_Open_Segment_By_Key(pHView->GetViewKey());
{
    int const debug_bits = Debug_SINGLE_PASS_STEREO;
    // Pass the preview window HWND as the 'use window id2' parameter.
    // This is optional and can be null if no preview is desired.
    HC_Set_Driver_Options(H_FORMAT_TEXT(
        "gpu resident, discard framebuffer, isolated, use window id = image key = 0x%p, use window id2 = 0x%p, debug = 0x%08x",
        (void*)image_key, (void*)hwnd, debug_bits));
    HC_Set_Driver_Options("anti-alias = 4");
    // See step 2 for an explanation of the finish picture handler
    HC_Set_Callback("finish picture = my_finish_picture_callback");
    HC_Set_Window_Frame("off");
}
HC_Close_Segment();

2. Create an Update Handler

Once OpenVR is initialized and Visualize has started rendering in stereo mode, we will need to set up an update handler: at the end of each update, Visualize needs to pass the stereo images it renders to the VR headset. To do this, we will define a finish_picture_callback function and register it as a callback.

We register it by calling:

HC_Define_Callback_Name("my_finish_picture_callback", (CallbackFunc)finish_picture_callback);
HC_Set_Callback("finish picture = my_finish_picture_callback");

Here is a sample implementation of the finish_picture_callback function:

// Globals assumed to be defined at file scope: pHmd, tracked_device_pose,
// tracked_device_transforms, hmd_proj_left/right, eye_to_head_left/right,
// view_left/right, proj_left/right, window_aspect, camera, near_limit.
static void finish_picture_callback(HIC_Rendition const * nr, bool swap_buffers)
{
    HIC_Finish_Picture(nr, swap_buffers);
    if (pHmd)
    {
        // Submit the two halves of the rendered surface to the compositor, one per eye
        vr::Texture_t leftEyeTexture = { (void*)HIC_Driver_Get_Surface(nr, 0), vr::TextureType_DirectX, vr::ColorSpace_Gamma };
        vr::VRCompositor()->Submit(vr::Eye_Left, &leftEyeTexture);
        vr::Texture_t rightEyeTexture = { (void*)HIC_Driver_Get_Surface(nr, 1), vr::TextureType_DirectX, vr::ColorSpace_Gamma };
        vr::VRCompositor()->Submit(vr::Eye_Right, &rightEyeTexture);

        // Wait until the compositor is ready for a new frame and collect the latest device poses
        vr::VRCompositor()->WaitGetPoses(tracked_device_pose, vr::k_unMaxTrackedDeviceCount, nullptr, 0);
        for (uint32_t nDevice = 0; nDevice < vr::k_unMaxTrackedDeviceCount; ++nDevice)
        {
            if (tracked_device_pose[nDevice].bPoseIsValid)
                ConvertOpenVRMatrixToMatrixKit(tracked_device_pose[nDevice].mDeviceToAbsoluteTracking, tracked_device_transforms[nDevice]);
        }

        // The projection and eye-to-head matrices do not change from frame to frame,
        // so we only query them on the first update
        static bool first_update = true;
        if (first_update)
        {
            static float z_near = 0.1f;
            static float z_far = 100.0f;
            ConvertOpenVRMatrixToMatrixKit(pHmd->GetProjectionMatrix(vr::Eye_Left, z_near, z_far), hmd_proj_left);
            ConvertOpenVRMatrixToMatrixKit(pHmd->GetProjectionMatrix(vr::Eye_Right, z_near, z_far), hmd_proj_right);
            ConvertOpenVRMatrixToMatrixKit(pHmd->GetEyeToHeadTransform(vr::Eye_Left), eye_to_head_left);
            HIC_Compute_Matrix_Inverse(nr, eye_to_head_left, eye_to_head_left);
            ConvertOpenVRMatrixToMatrixKit(pHmd->GetEyeToHeadTransform(vr::Eye_Right), eye_to_head_right);
            HIC_Compute_Matrix_Inverse(nr, eye_to_head_right, eye_to_head_right);
            first_update = false;
        }

        if (tracked_device_pose[vr::k_unTrackedDeviceIndex_Hmd].bPoseIsValid)
        {
            // Per-eye view matrix: the inverse of the headset pose, composed with the head-to-eye transform
            float view[16];
            HIC_Compute_Matrix_Inverse(nr, tracked_device_transforms[vr::k_unTrackedDeviceIndex_Hmd], view);
            HIC_Compute_Matrix_Product(nr, view, eye_to_head_left, view_left);
            HIC_Compute_Matrix_Product(nr, view, eye_to_head_right, view_right);
            set_camera_parameters(
                nr,
                view_left,
                view_right,
                hmd_proj_left,
                hmd_proj_right,
                window_aspect,
                &camera,
                &near_limit);
            memcpy(proj_left, hmd_proj_left, sizeof(proj_left));
            memcpy(proj_right, hmd_proj_right, sizeof(proj_right));
            // Massage the raw matrices into the form the HOOPS driver expects internally
            convert_projection_matrix(proj_left);
            convert_projection_matrix(proj_right);
            convert_view_matrix(view_left);
            convert_view_matrix(view_right);
            HIC_Driver_Set_Stereo_Matrix(nr, HIC_Stereo_Matrix_VIEW_LEFT, view_left);
            HIC_Driver_Set_Stereo_Matrix(nr, HIC_Stereo_Matrix_VIEW_RIGHT, view_right);
            HIC_Driver_Set_Stereo_Matrix(nr, HIC_Stereo_Matrix_PROJECTION_LEFT, proj_left);
            HIC_Driver_Set_Stereo_Matrix(nr, HIC_Stereo_Matrix_PROJECTION_RIGHT, proj_right);
        }
    }
}
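
The sample above relies on a ConvertOpenVRMatrixToMatrixKit helper that is not shown in this guide (the openvr_simple project contains the real implementation, which also handles the 4x4 vr::HmdMatrix44_t projection matrices via an overload). OpenVR poses arrive as row-major vr::HmdMatrix34_t values, with the rotation in the left 3x3 block and the translation in the rightmost column, while HOOPS matrices are 16 floats in row-vector form with the translation in elements 12 through 14 (the same layout HC_Set_Modelling_Matrix expects), so the 3x4 conversion amounts to a transpose plus a fixed last column. As a self-contained sketch under those assumptions (the local Pose34 struct merely mirrors the `float m[3][4]` layout of vr::HmdMatrix34_t so the example compiles without openvr.h):

```cpp
// Mirrors the layout of vr::HmdMatrix34_t: row-major, rotation in the
// left 3x3 block, translation in the rightmost column.
struct Pose34 { float m[3][4]; };

// Convert an OpenVR 3x4 pose into a 16-float HOOPS-style matrix:
// row-vector convention, translation in elements 12..14.
// The conversion transposes the 3x4 block and appends a (0, 0, 0, 1) column.
void ConvertOpenVRMatrixToMatrixKit(Pose34 const & in, float out[16])
{
    for (int row = 0; row < 3; ++row)
    {
        for (int col = 0; col < 4; ++col)
            out[col * 4 + row] = in.m[row][col];
    }
    out[3] = out[7] = out[11] = 0.0f;
    out[15] = 1.0f;
}
```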

3. Import a Model into Visualize

At this point you are ready to import your model into Visualize. This is also a good point to insert the models of the VR devices into the scene graph, if you wish to do so.

You can find an example of how this can be done in the openvr_simple application.

4. Start a Render Loop

Now you are ready to start a render loop. The update at the end of the render loop should be the only update you issue in the VR portion of your application. This loop should continue until your application quits or otherwise exits its VR mode.

// This is the main render loop for a VR application.
// It should keep running until your application is ready to exit VR mode or quit altogether.
for (;;)
{
    // Optional: if you created a Win32 window to preview the headset view, you can process events sent to it here
    if (!ProcessWin32Events())
        break;

    // Optional: you can listen to events sent to the VR devices by OpenVR here.
    // For example, you can handle devices becoming connected or disconnected, or input from controllers.
    // For a simple example of how to implement this, refer to the OpenVR Sandbox.
    ProcessVREvents(pHmd, tracked_devices_include_library, tracked_devices);

    // Optional: you can ask OpenVR to collect the status of the connected controllers
    // and perform actions based on their state, like you would with an operator.

    // Update the positions of the connected devices. We start at offset 1, because the 0th entry is the headset.
    for (uint32_t unTrackedDevice = vr::k_unTrackedDeviceIndex_Hmd + 1; unTrackedDevice < vr::k_unMaxTrackedDeviceCount; unTrackedDevice++)
    {
        // Optional: if you decided to display the models of the attached devices, you can update
        // their modelling matrices here to reflect their positions in space
        auto device_seg = tracked_device_instances[unTrackedDevice];
        if (tracked_device_pose[unTrackedDevice].bPoseIsValid && device_seg != INVALID_KEY)
        {
            HC_Open_Segment_By_Key(device_seg);
                HC_Set_Modelling_Matrix(tracked_device_transforms[unTrackedDevice]);
            HC_Close_Segment();
        }
    }

    // Set the camera for the scene. This is the camera which was calculated by the finish_picture_callback
    HC_Open_Segment_By_Key(pHView->GetSceneKey());
        HC_Set_Camera(&camera.position, &camera.target, &camera.up_vector, camera.field_width, camera.field_height, camera.projection);
        HC_Set_Camera_Near_Limit(near_limit);
    HC_Close_Segment();

    // Issue an update.
    // NOTE: This should be the only update you issue in a VR application
    pHView->Update();
}

// Shutdown:
// When you are done with VR, follow this shutdown procedure:

// Step 1. Shut down OpenVR
vr::VR_Shutdown();
pHmd = nullptr;

// Step 2. Optional: clean up any objects you might have created here

// Step 3. Clean up 3DF objects
delete pHView;
delete pHModel;
delete hdb;

return 0;

Considerations