Please note: AR/VR is currently supported only on Windows desktop when using the C++ API and the DirectX 11 driver.
Beta Release: The approach detailed below is required for adding VR to your application with Visualize 3DF 23.10. In future releases, we will provide a generic API to make the integration process easier. For a reference implementation of VR in a Visualize application, refer to the openvr_simple project in the samples_arvr_v141 solution bundled with your copy of Visualize.
Virtual Reality
OpenVR
Visualize supports any VR headset compatible with OpenVR. To add VR to your application, you will need a VR-capable GPU: for current NVIDIA GPUs, this means a GTX 1060 or better; for AMD, an RX 480 or better.
To get started with OpenVR, follow these steps:
- On your local system, clone OpenVR from its GitHub repository (https://github.com/ValveSoftware/openvr).
- Set the OPENVR_SDK environment variable to the location of your OpenVR root folder.
- Ensure that the OpenVR binaries are in your PATH.
- Install the Steam application from https://store.steampowered.com.
- Once Steam has been installed, run the application. In the top menu, select Library, go to VR, select "Install SteamVR", and install it on your system.
- Run SteamVR. From within the SteamVR application, run the installer for your particular hardware (e.g., HTC Vive, Oculus Rift, etc.).
For Oculus Rift only:
- Allow developer apps to run on the Oculus by opening the Oculus app, choosing Settings -> General -> Unknown Sources, and toggling it to ON.
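Before initializing OpenVR in your application, it can also be useful to verify that the runtime is installed and a headset is attached. Below is a minimal sketch using OpenVR's helper functions (the CheckVRPrerequisites wrapper is illustrative, not part of the sample):

#include <openvr.h>
#include <cstdio>

// Verify the OpenVR runtime and headset are available before calling VR_Init.
static bool CheckVRPrerequisites()
{
    if (!vr::VR_IsRuntimeInstalled())
    {
        std::fprintf(stderr, "OpenVR runtime is not installed.\n");
        return false;
    }
    if (!vr::VR_IsHmdPresent())
    {
        std::fprintf(stderr, "No VR headset detected.\n");
        return false;
    }
    return true;
}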
1. Initialize OpenVR in Visualize
First, you will need to set the stereo rendering debug bit to tell 3DF to render in stereo. To do this, pass the SINGLE_PASS_STEREO bit to HC_Set_Driver_Options. The bit can be defined as:
#define Debug_SINGLE_PASS_STEREO 0x08000000
The following code is required to initialize OpenVR and configure the Visualize driver:
#include <openvr.h>

#define Debug_SINGLE_PASS_STEREO 0x08000000

typedef void (HC_CDECL * CallbackFunc)(...);

vr::IVRSystem * pHmd = nullptr;

HC_Define_System_Options("license = `" HOOPS_LICENSE "`");

// Connect to the OpenVR runtime.
vr::EVRInitError eError = vr::VRInitError_None;
::pHmd = vr::VR_Init(&eError, vr::VRApplication_Scene);
if (eError != vr::VRInitError_None)
{
    // VR_Init failed -- report the error and exit
}

// Acquire the render models interface, used to draw the tracked devices.
vr::IVRRenderModels * pRenderModels = (vr::IVRRenderModels *)vr::VR_GetGenericInterface(vr::IVRRenderModels_Version, &eError);
if (!pRenderModels)
{
    vr::VR_Shutdown();
    // report the error and exit
}

// The compositor receives the rendered eye textures and presents them on the headset.
if (!vr::VRCompositor())
{
    vr::VR_Shutdown();
    // report the error and exit
}
// Query the headset for its recommended per-eye render target size.
uint32_t render_width = 0;
uint32_t render_height = 0;
pHmd->GetRecommendedRenderTargetSize(&render_width, &render_height);

// Create a companion window on the desktop.
HWND hwnd = CreateWin32Window(1280, 800, L"HOOPS OpenVR Simple", false, WndProc);

// Insert the image that Visualize will render into.
HC_Open_Segment("/nowhere");
HC_KEY image_key = HC_Insert_Image(0, 0, 0, "rgba", render_width, render_height, nullptr);
HC_Close_Segment();
HC_Define_Callback_Name("my_finish_picture_callback", (CallbackFunc)finish_picture_callback);
// Create the view, rendering with the DX11 driver into the image defined above.
HBaseView * pHView = new HBaseView(pHModel, "", "dx11", "", (void*)image_key,
    nullptr, nullptr, nullptr, "/driver/dx11/window1");
pHView->SetTransparency("hsra = depth peeling, depth peeling options = layers = 1");

// Configure the driver segment, enabling single-pass stereo rendering.
HC_Open_Segment("/driver/dx11/window1");
{
    int const debug_bits = Debug_SINGLE_PASS_STEREO;
    HC_Set_Driver_Options(H_FORMAT_TEXT(
        "gpu resident, discard framebuffer, isolated, use window id = image key = 0x%p, use window id2 = 0x%p, debug = 0x%08x",
        (void*)image_key, (void*)hwnd, debug_bits));
    HC_Set_Driver_Options("anti-alias = 4");
    HC_Set_Callback("finish picture = my_finish_picture_callback");
    HC_Set_Window_Frame("off");
}
HC_Close_Segment();
2. Create an Update Handler
Once OpenVR is initialized and Visualize has started rendering in stereo mode, you will need to set up an update handler. Visualize needs to pass the images it renders in stereo mode to the VR headset at the end of each update. To do this, we define a finish_picture_callback function and register it as a callback.
We register it by calling:
HC_Define_Callback_Name("my_finish_picture_callback", (CallbackFunc)finish_picture_callback);
HC_Set_Callback("finish picture = my_finish_picture_callback");
Here is a sample implementation of the finish_picture_callback function. It relies on several globals and helper functions from the openvr_simple sample, such as tracked_device_pose, tracked_device_transforms, and ConvertOpenVRMatrixToMatrixKit:
static void finish_picture_callback(HIC_Rendition const * nr, bool swap_buffers)
{
    // Let Visualize finish its own end-of-update work first.
    HIC_Finish_Picture(nr, swap_buffers);

    if (pHmd)
    {
        // Hand the left and right eye render surfaces to the OpenVR compositor.
        vr::Texture_t leftEyeTexture = { (void*)HIC_Driver_Get_Surface(nr, 0), vr::TextureType_DirectX, vr::ColorSpace_Gamma };
        vr::VRCompositor()->Submit(vr::Eye_Left, &leftEyeTexture);
        vr::Texture_t rightEyeTexture = { (void*)HIC_Driver_Get_Surface(nr, 1), vr::TextureType_DirectX, vr::ColorSpace_Gamma };
        vr::VRCompositor()->Submit(vr::Eye_Right, &rightEyeTexture);

        // Block until the compositor delivers the tracked poses for the next frame.
        vr::VRCompositor()->WaitGetPoses(tracked_device_pose, vr::k_unMaxTrackedDeviceCount, nullptr, 0);
        for (uint32_t nDevice = 0; nDevice < vr::k_unMaxTrackedDeviceCount; ++nDevice)
        {
            if (tracked_device_pose[nDevice].bPoseIsValid)
                ConvertOpenVRMatrixToMatrixKit(tracked_device_pose[nDevice].mDeviceToAbsoluteTracking, tracked_device_transforms[nDevice]);
        }

        // The projection and eye-to-head matrices do not change between frames,
        // so query them only once.
        static bool first_update = true;
        if (first_update)
        {
            static float z_near = 0.1f;
            static float z_far = 100.0f;
            ConvertOpenVRMatrixToMatrixKit(pHmd->GetProjectionMatrix(vr::Eye_Left, z_near, z_far), hmd_proj_left);
            ConvertOpenVRMatrixToMatrixKit(pHmd->GetProjectionMatrix(vr::Eye_Right, z_near, z_far), hmd_proj_right);
            ConvertOpenVRMatrixToMatrixKit(pHmd->GetEyeToHeadTransform(vr::Eye_Left), eye_to_head_left);
            HIC_Compute_Matrix_Inverse(nr, eye_to_head_left, eye_to_head_left);
            ConvertOpenVRMatrixToMatrixKit(pHmd->GetEyeToHeadTransform(vr::Eye_Right), eye_to_head_right);
            HIC_Compute_Matrix_Inverse(nr, eye_to_head_right, eye_to_head_right);
            first_update = false;
        }

        if (tracked_device_pose[vr::k_unTrackedDeviceIndex_Hmd].bPoseIsValid)
        {
            // Build per-eye view matrices from the headset pose.
            float view[16];
            HIC_Compute_Matrix_Inverse(nr, tracked_device_transforms[vr::k_unTrackedDeviceIndex_Hmd], view);
            HIC_Compute_Matrix_Product(nr, view, eye_to_head_left, view_left);
            HIC_Compute_Matrix_Product(nr, view, eye_to_head_right, view_right);

            set_camera_parameters(
                nr,
                view_left,
                view_right,
                hmd_proj_left,
                hmd_proj_right,
                window_aspect,
                &camera,
                &near_limit);

            // Convert the matrices into the form the driver expects, then
            // set them as the stereo matrices for the next update.
            memcpy(proj_left, hmd_proj_left, sizeof(proj_left));
            memcpy(proj_right, hmd_proj_right, sizeof(proj_right));
            convert_projection_matrix(proj_left);
            convert_projection_matrix(proj_right);
            convert_view_matrix(view_left);
            convert_view_matrix(view_right);
            HIC_Driver_Set_Stereo_Matrix(nr, HIC_Stereo_Matrix_VIEW_LEFT, view_left);
            HIC_Driver_Set_Stereo_Matrix(nr, HIC_Stereo_Matrix_VIEW_RIGHT, view_right);
            HIC_Driver_Set_Stereo_Matrix(nr, HIC_Stereo_Matrix_PROJECTION_LEFT, proj_left);
            HIC_Driver_Set_Stereo_Matrix(nr, HIC_Stereo_Matrix_PROJECTION_RIGHT, proj_right);
        }
    }
}
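The ConvertOpenVRMatrixToMatrixKit helper used above converts OpenVR's HmdMatrix34_t and HmdMatrix44_t matrices into the 16-float layout that 3DF expects; its real implementation ships with openvr_simple. Below is a minimal sketch of what such a conversion might look like, assuming 3DF's row-vector matrix convention (translation in the bottom row):

// Sketch only -- see openvr_simple for the shipping implementation.
// OpenVR uses column-vector matrices (v' = M * v); 3DF uses row-vector
// matrices (v' = v * M), so the conversion is essentially a transpose.
static void ConvertOpenVRMatrixToMatrixKit(vr::HmdMatrix34_t const & in, float out[16])
{
    for (int row = 0; row < 3; ++row)
        for (int col = 0; col < 4; ++col)
            out[col * 4 + row] = in.m[row][col];
    // The fourth column of the transposed affine matrix is (0, 0, 0, 1).
    out[3] = out[7] = out[11] = 0.0f;
    out[15] = 1.0f;
}

static void ConvertOpenVRMatrixToMatrixKit(vr::HmdMatrix44_t const & in, float out[16])
{
    for (int row = 0; row < 4; ++row)
        for (int col = 0; col < 4; ++col)
            out[col * 4 + row] = in.m[row][col];
}

The set_camera_parameters, convert_projection_matrix, and convert_view_matrix functions are likewise helpers provided by the sample.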
3. Import a Model into Visualize
At this point, you are ready to import your model into Visualize. This is also a good point to add the models of the VR devices to the scene graph, if you wish to do so.
You can find an example of how this can be done in the openvr_simple application.
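As a minimal sketch of a model import, assuming a model in HOOPS metafile (.hmf) format (the file name is a placeholder, and the use of the model segment key here is illustrative):

// Read a HOOPS metafile into the model segment.
// "bolt.hmf" is a placeholder -- substitute your own model file.
HC_Open_Segment_By_Key(pHModel->GetModelKey());
    HC_Read_Metafile("bolt.hmf", ".", "");
HC_Close_Segment();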
4. Start a Render Loop
Now you are ready to start a render loop. The update at the end of the render loop should be the only update in the VR portion of your application. This render loop should continue until you decide your application should quit or otherwise leave VR mode.
for (;;)
{
    if (!ProcessWin32Events())
        break;

    // Handle device activation/deactivation and controller events.
    ProcessVREvents(pHmd, tracked_devices_include_library, tracked_devices);

    // Apply the latest tracked poses to the segments holding the device models.
    for (uint32_t unTrackedDevice = vr::k_unTrackedDeviceIndex_Hmd + 1; unTrackedDevice < vr::k_unMaxTrackedDeviceCount; unTrackedDevice++)
    {
        auto device_seg = tracked_device_instances[unTrackedDevice];
        if (tracked_device_pose[unTrackedDevice].bPoseIsValid && device_seg != INVALID_KEY)
        {
            HC_Open_Segment_By_Key(device_seg);
                HC_Set_Modelling_Matrix(tracked_device_transforms[unTrackedDevice]);
            HC_Close_Segment();
        }
    }

    HC_Open_Segment_By_Key(pHView->GetSceneKey());
        HC_Set_Camera_Near_Limit(near_limit);
    HC_Close_Segment();

    // The single update of the loop; finish_picture_callback runs during it.
    HC_Update_Display();
}

// Leaving VR mode: shut down OpenVR and clean up.
vr::VR_Shutdown();
pHmd = nullptr;

delete pHView;
delete pHModel;
delete hdb;
return 0;
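ProcessVREvents is a helper from the openvr_simple sample. As a simplified, hypothetical sketch, it drains OpenVR's event queue each frame and reacts to devices being connected or disconnected:

// Hypothetical, simplified sketch -- see openvr_simple for the real helper.
static void ProcessVREventsSketch(vr::IVRSystem * hmd)
{
    vr::VREvent_t event;
    while (hmd->PollNextEvent(&event, sizeof(event)))
    {
        switch (event.eventType)
        {
            case vr::VREvent_TrackedDeviceActivated:
                // A device (e.g., a controller) was connected: load its
                // render model and add a segment for it to the scene graph.
                break;
            case vr::VREvent_TrackedDeviceDeactivated:
                // A device was disconnected: hide or delete its segment.
                break;
            default:
                break;
        }
    }
}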
Considerations
- Performance is the main concern when using VR. For a smooth experience, 60 FPS should be considered the minimum; lower or jumpy frame rates can make the user feel sick. Because of this, the main render loop should be as bare-bones as possible. If you need to perform expensive calculations, it is best to perform them on a separate thread.
- When in VR mode, you should expect the camera to be in constant motion, since it is tied to the position of the VR headset. As a consequence, some of the approaches used in highlighting and in the default operators will not work in VR mode. For example:
  - The default orbit operator works by modifying the camera position. Since the position of the camera depends on the VR headset, it is necessary to change the model's modelling matrix instead (see the sketch after this list).
  - Because of the constant camera changes, Visualize will not cache the state of its rendering buffers between updates. This means that the heuristic setting quick moves = spriting, which relies on those cached buffers, cannot be used effectively; highlights that use quick moves = spriting will be performed sub-optimally in a VR application.
- Some view-dependent geometry, such as non-transformable text and patterned lines, will not transform correctly in stereo mode. This creates artifacts where two instances of the geometry appear to be visible unless the user closes one eye.
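As a minimal sketch of the orbit alternative described above (the OrbitModel helper and model_key are hypothetical, not part of the sample), a rotation can be appended to the model segment's modelling matrix while the camera remains under the headset's control:

// Hypothetical sketch: orbit by rotating the model rather than the camera.
// model_key is assumed to be the key of the segment containing the model.
static void OrbitModel(HC_KEY model_key, double degrees_x, double degrees_y)
{
    HC_Open_Segment_By_Key(model_key);
        // Appends a rotation to the segment's modelling matrix.
        HC_Rotate_Object(degrees_x, degrees_y, 0.0);
    HC_Close_Segment();
}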