Virtual Reality


This section provides an overview of how to use Visualize to render in a VR headset. Most of the source code presented below is available in the openvr_simple sandbox project, which ships with the Visualize package; the project is also accessible in the samples_v1xx solution located in the root directory of the Visualize package.

In addition, the full source code for the move, select, and scale operators is provided in the openvr_simple project. You are encouraged to tailor these operators to the specific requirements of your VR application, or to develop new operators using them as a starting point.


Please note: AR/VR is currently only supported on Windows Desktop when using the C++ API and the DirectX 11 or OpenGL 2 drivers. To use VR with DirectX 11 on Windows 7 machines, please make sure Windows Update KB2670838 is installed. Additionally, stereo rendering will not work on machines that do not support DirectX 11.1.


Visualize supports any VR headset compatible with OpenVR. To add VR to your application, you will need a VR-capable GPU: for current NVIDIA GPUs, this means a GTX 1060 or better; for AMD, an RX 480 or better.

To get started with OpenVR, follow these steps:

  1. On your local system, clone OpenVR from the GitHub repository:
    git clone
  2. Set the OPENVR_SDK environment variable to the location of your OpenVR root folder.
  3. Ensure that the OpenVR binaries are in your PATH.
  4. Install the Steam application from:
  5. Once Steam has been installed, run the application. In the top menu, select Library, go to VR and select "Install SteamVR" and install it on your system.
  6. Run the installer for your particular hardware.
  7. Establish your play space before launching the application. This is done from SteamVR if you are using a Vive, and from the Oculus app if you are using a Rift.
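Steps 1 through 3 can be sketched from a Windows command prompt as follows. The clone location C:\dev\openvr is a hypothetical path (adjust to your layout), and the repository URL is OpenVR's public GitHub location; note that setx writes the user-level environment, so open a new prompt afterwards for the changes to take effect.

```shell
:: Clone OpenVR (step 1). C:\dev\openvr is an assumed location.
git clone https://github.com/ValveSoftware/openvr.git C:\dev\openvr

:: Point OPENVR_SDK at the OpenVR root folder (step 2).
setx OPENVR_SDK C:\dev\openvr

:: Make sure the OpenVR binaries (openvr_api.dll) are on the PATH (step 3).
setx PATH "%PATH%;C:\dev\openvr\bin\win64"
```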

For Oculus Rift only:

  • Allow developer apps to run on the Oculus by opening the Oculus app and choosing Settings->General->Unknown Sources and toggling it to ON.

The VR API package

The VR API is packaged as source code within the samples_v1xx solution in the openvr_simple_v1xx project.

The VR API is currently made up of these files:

File name           Description
vr.h/cpp            Basic VR class which handles viewing a model in VR
vr_move.h/cpp       VR operator – rotates and translates the model when a button is held down
vr_selection.h/cpp  VR operator – selects and highlights an entity when a button is pressed
vr_scale.h/cpp      VR operator – scales the whole model based on the distance between the two controllers when a button is pressed on each controller

To use the VR API, simply add the files above to your project.

These files are located in hoops_3df/Dev_Tools/hoops_vr.

Note that while the code in the API is not platform specific, Virtual Reality is currently only supported on Windows.

Using the VR API

This section covers how to create a VR app and start a VR session.

How to create a VR application using the VR Class

The VR class handles initializing and shutting down VR, processing events coming from the headset, and keeping all tracked devices and controllers up to date.

It can be used both for simply viewing a model and for reacting to events and button presses.

The first step is to create a class that derives from the VR class.

The VR class has some pure virtual functions, so the derived class must override them:

class DemoVR : public VR
{
public:
	DemoVR(VROptions const & options)
		: VR(options)
		, selection_op(SelectionOperator(*this)) // These three operators are optional.
		, move_op(MoveOperator(*this))
		, scale_op(ScaleOperator(*this))
	{}

	void Setup() override { /* implementation goes here */ }
	void ProcessEvent(vr::VREvent_t const & in_event) override { /* implementation goes here */ }
	void ProcessInput() override { /* implementation goes here */ }
	void ProcessFrame() override { /* implementation goes here */ }

private:
	SelectionOperator selection_op;
	MoveOperator move_op;
	ScaleOperator scale_op;
};

When creating a VR object, these options should be passed to it:

class VROptions
{
public:
	intptr_t preview_window_handle; //handle to a window showing a preview of what the headset sees; set to 0 to disable the preview
	std::string vr_driver;          //the driver used for VR; DirectX11 and OpenGL2 are supported
	HBaseModel * vr_model;          //used to bring a pre-existing model into VR
	bool show_controllers;          //whether to show 3D models of the controllers
	bool show_tracking_towers;      //whether to show 3D models of the tracking towers
};

The vr_model option is used when you have a desktop application with an optional VR mode (as in the HOOPS Demo Viewer).

In such cases, you might want to preserve the current scene as it is and show it in VR. This can be done by passing your existing model as the vr_model option.

If vr_model is not initialized, a brand new HBaseModel will be created for VR, and will be disposed of once the VR session is over.

Starting a VR session

A VR session starts once the user calls VR::Start(), and ends when the user calls VR::Stop() or the VR object goes out of scope.

Starting a VR session means that the 3D scene will be rendered to the VR headset. Here's a basic example of starting and stopping a session:

VROptions default_options;
DemoVR vr_session(default_options);
vr_session.Start(); // VR Session starts here
vr_session.Stop(); // VR Session ends here

And an example of session lifetime with scoping:

{
	myVR scoped_session(default_options);
	scoped_session.Start(); // VR Session starts here
} // scoped VR Session ends here

The VR session will execute in a loop on a separate thread, and the calling thread will continue to the next instruction.

Initializing a VR session

Optionally, the user can decide to Initialize the VR session before Starting it (the session will be automatically initialized when Start is called if the user did not Initialize it beforehand).

The reason to Initialize a VR session before starting it is that after Initialize is called, every part of the VR session is valid, but rendering to the headset has not yet started.

This means that users can, for example, check how many controllers are connected or access the View to make changes to it before rendering starts.


myVR vr_session(default_options);
vr_session.GetVRView()->GetViewKey(); // WRONG! Session hasn't been initialized yet.

myVR other_vr_session(default_options);
other_vr_session.Start();
other_vr_session.GetVRView()->GetViewKey(); // OK, because the Start() function also initializes the session.

Interacting with VR

At this point, we've created and started a VR session. If we just wanted to view the model in VR, then nothing else is needed.

Most likely, however, we'll want to interact with the scene in VR, and therefore we'll need to implement the four functions required in all classes that derive from VR:


Setup()

Setup() is called only once, before the very first frame of a VR session begins.

This is intended to contain any initialization of VR-specific objects you're using in your application.


ProcessEvent() is called every time the VR headset reports a new event, potentially multiple times per frame.

You can check the type of event received, do something with events you're interested in, and ignore the rest.


ProcessInput() is called once per frame, after input from the controllers has been collected and the positions of the tracked devices in space have been updated.

You can use this function to make changes to the VR scene based on controller input.

ProcessFrame()

ProcessFrame() is called once per frame, right before the frame is presented to the headset. This is the place to perform per-frame operations that are not related to controller input.

Sample Implementation

Here is a basic example of how to implement a class derived from the VR class. In this example:

  • the derived class uses the move VR operator to move the model when the trigger is pressed
  • the frame rate is checked at every frame
  • VR events are checked to determine whether the controller we are using ever gets disconnected

(For a more detailed implementation of this example, please refer to the openvr_simple sandbox source code.)

Example code:

#include "vr.h"
#include "vr_move.h"

class myVR : public VR
{
public:
	myVR(VROptions const & in_options);
	virtual ~myVR();

	void Setup() override;
	void ProcessEvent(vr::VREvent_t const & in_event) override;
	void ProcessInput() override;
	void ProcessFrame() override;

private:
	MoveOperator move_op;
};

myVR::myVR(VROptions const & in_options)
	: VR(in_options)
	, move_op(MoveOperator(*this))
{}

void myVR::Setup()
{
	//check if controller_one is connected, and if so, use it with the move operator
	//as part of the operator Attach call, we also specify which button the operator will respond to
	if (controller_one.valid)
		move_op.Attach(controller_one.device_index, vr::EVRButtonId::k_EButton_SteamVR_Trigger);
}

void myVR::ProcessEvent(vr::VREvent_t const & in_event)
{
	//check for events that indicate that a tracked device was disconnected
	//if a device with the same index as the controller used for the move operator is disconnected
	//we will Detach the operator.
	//We could also check for devices being connected, and re-attach the operator in the case where
	//the controller is connected again.
	if (in_event.eventType == vr::VREvent_TrackedDeviceDeactivated)
	{
		if (in_event.trackedDeviceIndex == controller_one.device_index)
			move_op.Detach();
	}
}

void myVR::ProcessInput()
{
	//At this point all input from controllers has been collected.
	//We leave the move operator to handle this frame.
	//Note that HandleFrame checks that the connected device is still valid,
	//so nothing will happen if the controller associated with this operator
	//got disconnected.
	move_op.HandleFrame();
}

void myVR::ProcessFrame()
{
	//This would be a good time to check our frame rate
	//An improvement would be to only check at fixed intervals to provide more readable feedback
	printf("FPS: %d\n", (int)CurrentFrameRate());
}

int main()
{
	const char * filename = "my_file_path";
	std::string driver_string = "dx11";

	HC_Define_System_Options("license = `" HOOPS_LICENSE "`");
	HC_Define_System_Options("multi-threading = full");
	auto hdb = new HDB();

	HWND hwnd = CreateWin32Window(1280, 800, L"HOOPS OpenVR Simple", false, WndProc);

	VROptions vr_options;
	vr_options.show_controllers = true;
	vr_options.show_tracking_towers = false;
	vr_options.preview_window_handle = (intptr_t)hwnd;
	vr_options.vr_driver = driver_string;

	HBaseModel * model = new HBaseModel();
	vr_options.vr_model = model;

	if (filename)
	{
		//load the file into the model here

		//Start a VR session
		myVR vr_session(vr_options);
		if (!vr_session.Initialize())
		{
			printf("Failed to initialize a VR session\n");
			delete model;
			return 0;
		}

		//Move the model a bit off the ground, and towards the screen
		float min_x, max_x, min_z, max_z;
		vr_session.GetPlayArea(min_x, max_x, min_z, max_z);
		float play_area_size = max_z - min_z;

		HPoint center;
		float radius;
		HC_Show_Bounding_Sphere(&center, &radius);

		vr_session.GetVRView()->SetWindowColor(HPoint(100 / 255.0f, 149 / 255.0f, 237 / 255.0f), HPoint(200 / 255.0f, 200 / 255.0f, 200 / 255.0f));

		HC_Translate_Object(-center.x, -center.y, -center.z);
		HC_Scale_Object(1.0f / radius, 1.0f / radius, 1.0f / radius);
		HC_Translate_Object(0.0f, 0.5f, -play_area_size);

		vr_session.Start();

		while (vr_session.IsActive())
		{
			//While the VR session is running check for Win32 events.
			//Terminate the VR session when the ESC button is pressed.
			if (!ProcessWin32Events())
				vr_session.Stop();
		}
	}

	//Shutdown HOOPS
	delete model;
	delete hdb;
	return 0;
}

Tracked Devices, Controllers and Head Mounted Display

The base VR class keeps track of all tracked devices (i.e., the tracking towers, the controllers, the headset, etc…), and updates their properties each frame.

Users can access data on tracked devices through the member variable tracked_device_data, by using the device index associated with the tracked device they are interested in.

This is the data associated with each tracked device:

class TrackedDevice
{
public:
	HC_KEY instance;         //segment key containing the model for this device; can be invalid
	float pose[16];          //the current pose of this device
	float previous_pose[16]; //the pose this device had during the previous frame
};


float * controller_matrix = vr_session.tracked_device_data[device_index].pose;
float * previous_controller_matrix = vr_session.tracked_device_data[device_index].previous_pose;
float previous_controller_matrix_inverse[16];
HC_Compute_Matrix_Inverse(previous_controller_matrix, previous_controller_matrix_inverse);
//get the matrix to transform the old controller into the current one
float delta[16];
HC_Compute_Matrix_Product(previous_controller_matrix_inverse, controller_matrix, delta);

The code above shows how the move operator calculates a matrix which represents the difference in position for the device associated with it, from the previous frame to the current one.

  • Controllers are also available directly from the base VR class, through the member variables controller_one and controller_two.
  • Controllers can be valid or invalid. A controller is valid if it is connected.
  • Controllers have a variety of variables associated with them, describing their state. They also have a few helper functions which are designed to help users get information about the controller in an easy manner.

It is important to note that controllers are also part of the tracked devices. This means that the position in space of a controller can be determined by using its device index:

float * controller_pose = vr_session.tracked_device_data[vr_session.controller_one.device_index].pose;
The Head Mounted Display (hmd) is the principal means of accessing the OpenVR API directly. Users will need it almost any time they wish to call OpenVR themselves.

The hmd is accessible through the GetHeadMountedDisplay() function.

Using the sample operators

The sample operators all work similarly. They need to be initialized with a reference to the VR session, so that they can have access to the tracked device data. A device index that corresponds to a controller needs to be Attached to them, along with an enum describing which button will cause the operator to become active.

All of the operators have a HandleFrame() function which should be called once controller input has been collected. This function checks that the button specified in the Attach() function is active before proceeding.

They all have a Detach() function which causes the operator to become inactive. These are simply suggestions on how interaction with VR can be accomplished, but users are of course free to experiment with any paradigm that suits their applications.

OpenVR Simple Sandbox

The OpenVR Simple sandbox application uses the API discussed above and is a good starting point for developing a VR application with Visualize. The source code for this app is available, and provides a template for a custom VR class using all three operators.

Here are a few notes about the OpenVR Simple application:

  • It accepts two parameters: -filename and -driver. Valid values for the -driver parameter are directx11 and opengl2. If no value is provided for the -driver option, it defaults to dx11.
  • There are three operators: move, scale and selection.
    • Move is mapped to the trigger of the right controller
    • Select is mapped to the trigger of the left controller
    • Scale is enabled when both grips are pressed
  • The current FPS is shown in the win32 window

Other considerations

  • Performance is the main concern when using VR. For a smooth experience, 60 FPS should be considered the minimum requirement; lower or jumpy frame rates can make the user feel sick. Because of this, the main render loop should be as bare-bones as possible. If you need to perform expensive calculations, it is best to perform them on a separate thread.
  • When in VR mode you should expect that the camera is in constant movement, since it is tied to the position of the VR headset. As a consequence of this, some of the approaches used in highlighting and the default operators will not work in VR mode.

    For example:

    • The default orbit operator works by modifying the camera position. Since the position of the camera depends on the VR headset, it will be necessary to change the modelling matrix instead.
    • Because of the constant camera changes, Visualize will not cache the status of rendering buffers between updates. In turn this means that the heuristic setting of quick moves=spriting, which relies on these cached buffers, cannot be used effectively. Highlights with a heuristic setting of quick moves=spriting in a VR application will be performed sub-optimally.

  • Some view-dependent geometry, such as non-transformable text and patterned lines, will not transform correctly in stereo mode. This creates artifacts where two instances of the geometry appear to be visible unless the user closes one eye.
