
Please note, AR/VR is currently only supported on Windows Desktop when using the C++ API and the DirectX 11 or OpenGL 2 drivers.

Virtual Reality

This section provides an overview of how to use Visualize to render in a VR headset. Most of the source code presented below is available in the openvr_simple sandbox, which ships as part of the Visualize package; this project is also accessible in the samples_v1xx solution located in the root directory of the Visualize package.

In addition, the full source code for the move, select, and scale operators is provided in the openvr_simple project. You are encouraged to tailor these operators to meet the specific requirements of your VR applications, or to develop new operators using them as a starting point.

Prerequisites

OpenVR

Visualize supports any VR headset compatible with OpenVR. In order to add VR to your application, the following items are required:

For OpenVR, you'll need a VR-capable GPU. For current NVIDIA GPUs, this would be a GTX 1060 or better. For AMD, this would be an RX 480 or better.

To get started with OpenVR, follow these steps:

For Oculus Rift only:

The VR API

The VR API is packaged as source code within the samples_v1xx solution in the openvr_simple_v1xx project.

The VR API is currently made up of these files:

vr.h/cpp: Basic VR class which handles viewing a model in VR
vr_move.h/cpp: VR operator – rotates and translates the model when a button is held down
vr_selection.h/cpp: VR operator – selects and highlights an entity when a button is pressed
vr_scale.h/cpp: VR operator – scales the whole model based on the distance between the two controllers when a button is pressed on each controller

To use the VR API, simply add the files above to your project.

These files are located in hoops_3df/Dev_Tools/hoops_vr.

Note that while the code in the API is not platform specific, Virtual Reality is currently only supported on Windows.

How to create a VR application using the VR Class

The VR class handles initializing and shutting down VR, processing events coming from the headset, and keeping all tracked devices and controllers up to date.

It can be used both to simply view a model and to react to events and button presses.

The first step is to create a class that derives from VR.

The VR class has some pure virtual functions, so the derived class must override them:

class DemoVR : public VR
{
public:
    DemoVR(VROptions const & options)
        : VR(options)
        , selection_op(SelectionOperator(*this)) // These three operators are optional.
        , move_op(MoveOperator(*this))
        , scale_op(ScaleOperator(*this))
    {
        HC_Show_Time(&time);
    }
    void Setup() override
    {
        //implementation goes here
    }
    void ProcessEvent(vr::VREvent_t const & in_event) override
    {
        //implementation goes here
    }
    void ProcessInput() override
    {
        //implementation goes here
    }
    void ProcessFrame() override
    {
        //implementation goes here
    }
private:
    SelectionOperator selection_op;
    MoveOperator move_op;
    ScaleOperator scale_op;
    float time;
};

When creating a VR object, these options should be passed to it:

class VROptions
{
public:
    VROptions();
    intptr_t preview_window_handle; //handle to the window in which to show a preview of what the headset sees. Set to 0 to disable the preview.
    std::string vr_driver; //the driver used for VR. DirectX11 and OpenGL2 are supported.
    HBaseModel * vr_model; //used to share a pre-existing model with VR
    bool show_controllers; //whether to show 3D models of the controllers
    bool show_tracking_towers; //whether to show 3D models of the tracking towers
};

The vr_model option is used in cases where you have a desktop application with an optional VR mode (as can be seen in the HOOPS Demo Viewer).

In such cases, you might want to preserve the current view as it is and show it in VR. This can be done by passing the existing model as an option.

If vr_model is not set, a brand new HBaseModel will be created for VR and disposed of once the VR session is over.

Starting a VR session

A VR session starts once the user calls VR::Start(), and ends when the user calls VR::Stop() or the VR object goes out of scope.

Starting a VR session means that the 3D scene will be rendered to the VR headset. Here's a basic example of starting and stopping a session:

VROptions default_options;
DemoVR vr_session(default_options);
...
vr_session.Initialize();
...
vr_session.Start(); // VR session starts here
...
vr_session.Stop(); // VR session ends here

And an example of session lifetime with scoping:

{
    myVR scoped_session(default_options);
    scoped_session.Start(); // VR session starts here
} // scoped VR session ends here

The VR session will execute in a loop on a separate thread, and the calling thread will continue to the next instruction.

Initializing a VR session

Optionally, the user can Initialize the VR session before Starting it (the session will be initialized automatically when Start is called if the user did not Initialize it beforehand).

The reason to Initialize a VR session before starting it is that after Initialize returns, every part of the VR session is valid, but rendering to the headset has not yet started.

This means that users can, for example, check how many controllers are connected or access the View to make changes to it before rendering starts.

Example:

{
    myVR vr_session(default_options);
    vr_session.GetVRView()->GetViewKey(); // WRONG! Session hasn't been initialized yet.
    vr_session.Initialize();
    vr_session.GetVRView()->GetViewKey();
}
{
    myVR other_vr_session(default_options);
    other_vr_session.Start();
    other_vr_session.GetVRView()->GetViewKey(); // OK because the Start() function also initializes the session.
}

Interacting with VR

At this point, we've created and started a VR session. If we just wanted to view the model in VR, then nothing else is needed.

Most likely, however, we'll want to interact with the scene in VR, and therefore we'll need to implement the four functions required in all classes that derive from VR:

Setup() – Called only once, before the very first frame of a VR session begins. This function is intended to contain any initialization of VR-specific objects you're using in your application.

ProcessEvent() – Called every time the VR headset reports a new event, potentially multiple times per frame. You can check the type of event received, act on the events you're interested in, and ignore the rest.

ProcessInput() – Called once per frame, after the input from controllers has been collected and the positions of the tracked devices in space have been updated. You can use this function to make changes to the VR scene based on controller input.

ProcessFrame() – Called once per frame, right before the frame is presented to the headset. This is a place where you can perform per-frame operations that are not related to controller input.
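Since ProcessFrame() runs once per frame (typically 90 times per second on current headsets), printing diagnostics there can flood the console. A plain C++ sketch of a fixed-interval gate (a hypothetical helper, not part of the VR API) that a ProcessFrame() implementation could use to throttle its output:

```cpp
// Hypothetical helper (not part of the VR API): Ready() returns true only
// when `interval` seconds have elapsed since the last time it returned true.
// In ProcessFrame() you would pass it the current time (e.g. from HC_Show_Time).
class IntervalGate
{
public:
    explicit IntervalGate(double interval) : interval(interval), last(-1.0) {}

    bool Ready(double now)
    {
        if (last < 0.0 || now - last >= interval)
        {
            last = now;
            return true;    // enough time has passed; report this frame
        }
        return false;       // still inside the interval; stay quiet
    }

private:
    double interval;  // seconds between reports
    double last;      // time of the last report (-1 until the first call)
};
```

Guarding a per-frame printf with gate.Ready(time) makes the readout update, say, once per second instead of every frame.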

Sample Implementation

Here is a basic example of how to implement a class derived from the VR class. In this example, the move operator is attached to the first controller during Setup(), detached if that controller disconnects, and given a chance to act once per frame.

(For a more detailed implementation of this example, please refer to the openvr_simple sandbox source code.)

Example code:

#include "vr.h"
#include "vr_move.h"

class myVR : public VR
{
public:
    myVR(VROptions const & in_options);
    virtual ~myVR();
    void Setup() override;
    void ProcessEvent(vr::VREvent_t const & in_event) override;
    void ProcessInput() override;
    void ProcessFrame() override;
private:
    MoveOperator move_op;
};
myVR::myVR(VROptions const & in_options)
    : VR(in_options)
    , move_op(MoveOperator(*this))
{}

myVR::~myVR()
{
    move_op.Detach();
}
void myVR::Setup()
{
    //check if controller_one is connected, and if so, use it with the move operator
    //as part of the operator Attach call, we also specify which button the operator will respond to
    if (controller_one.valid)
        move_op.Attach(controller_one.device_index, vr::EVRButtonId::k_EButton_SteamVR_Trigger);
}
void myVR::ProcessEvent(vr::VREvent_t const & in_event)
{
    //check for events that indicate that a tracked device was disconnected.
    //If a device with the same index as the controller used for the move operator is disconnected,
    //we will Detach the operator.
    //We could also check for devices being connected, and re-attach the operator in the case where
    //the controller is connected again.
    if (in_event.eventType == vr::VREvent_TrackedDeviceDeactivated)
    {
        if (in_event.trackedDeviceIndex == controller_one.device_index)
            move_op.Detach();
    }
}
void myVR::ProcessInput()
{
    //At this point all input from controllers has been collected.
    //We let the move operator handle this frame.
    //Note that HandleFrame checks that the associated device is still valid,
    //so nothing will happen if the controller associated with this operator
    //got disconnected.
    move_op.HandleFrame();
}
void myVR::ProcessFrame()
{
    //This would be a good time to check our frame rate.
    //An improvement would be to only check at fixed intervals to provide more readable feedback.
    printf("FPS: %d\n", (int)CurrentFrameRate());
}
int main()
{
    const char * filename = "my_file_path";
    std::string driver_string = "dx11";
    HC_Define_System_Options("license = `" HOOPS_LICENSE "`");
    HC_Define_System_Options("multi-threading = full");
    auto hdb = new HDB();
    hdb->Init();
    HWND hwnd = CreateWin32Window(1280, 800, L"HOOPS OpenVR Simple", false, WndProc);
    VROptions vr_options;
    vr_options.show_controllers = true;
    vr_options.show_tracking_towers = false;
    vr_options.preview_window_handle = (intptr_t)hwnd;
    vr_options.vr_driver = driver_string;
    HBaseModel * model = new HBaseModel();
    model->Init();
    vr_options.vr_model = model;
    if (filename)
        model->Read(filename);
    model->SetStaticModel(true);
    //Start a VR session
    myVR vr_session(vr_options);
    if (!vr_session.Initialize())
    {
        printf("Failed to initialize a VR session\n");
        delete model;
        return 0;
    }
    //Move the model a bit off the ground, and towards the screen
    float min_x, max_x, min_z, max_z;
    vr_session.GetPlayArea(min_x, max_x, min_z, max_z);
    float play_area_size = max_z - min_z;
    HC_Open_Segment_By_Key(model->GetModelKey());
    HPoint center;
    float radius;
    HC_Show_Bounding_Sphere(&center, &radius);
    HC_Close_Segment();
    vr_session.GetVRView()->SetWindowColor(HPoint(100 / 255.0f, 149 / 255.0f, 237 / 255.0f), HPoint(200 / 255.0f, 200 / 255.0f, 200 / 255.0f));
    HC_Open_Segment_By_Key(vr_session.GetVRView()->GetOverwriteKey());
    {
        HC_Translate_Object(-center.x, -center.y, -center.z);
        HC_Scale_Object(1.0f / radius, 1.0f / radius, 1.0f / radius);
        HC_Translate_Object(0.0f, 0.5f, -play_area_size);
    }
    HC_Close_Segment();
    vr_session.Start();
    while (vr_session.IsActive())
    {
        //While the VR session is running, check for Win32 events.
        //Terminate the VR session when the ESC button is pressed.
        if (!ProcessWin32Events())
        {
            vr_session.Cleanup();
            vr_session.Stop();
        }
    }
    //Shut down HOOPS
    delete model;
    delete hdb;
    return 0;
}

Tracked Devices, Controllers and Head Mounted Display

The base VR class keeps track of all tracked devices (i.e., the tracking towers, the controllers, the headset, etc.), and updates their properties each frame.

Users can access data on tracked devices through the member variable tracked_device_data, by using the device index associated with the tracked device they are interested in.

This is the data associated with each tracked device:

class TrackedDevice
{
public:
    TrackedDevice();
    HC_KEY instance; //segment key containing the model for this device. Can be invalid
    float pose[16]; //the current pose of this device
    float previous_pose[16]; //the pose this device had during the previous frame
};

Example:

float * controller_matrix = vr_session.tracked_device_data[device_index].pose;
float * previous_controller_matrix = vr_session.tracked_device_data[device_index].previous_pose;
float previous_controller_matrix_inverse[16];
HC_Compute_Matrix_Inverse(previous_controller_matrix, previous_controller_matrix_inverse);
//get the matrix to transform the old controller into the current one
float delta[16];
HC_Compute_Matrix_Product(previous_controller_matrix_inverse, controller_matrix, delta);

The move operator calculates a matrix which represents the difference in position for the device associated with it, from the previous frame to the current one, as shown in the example above.

It is important to note that controllers are also part of the tracked devices. This means that the position in space of a controller can be determined by using its device index:

tracked_device_data[controller_one.device_index].pose;
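Assuming the usual HOOPS layout of the pose (16 floats, row-major, translation in elements 12-14), the controller's position in space is simply the translation part of that matrix. A small hypothetical helper, not part of the VR API:

```cpp
// Hypothetical helper (not part of the VR API): extracts the translation
// part of a 4x4 pose stored as 16 floats in row-major order, where the
// translation occupies elements 12, 13 and 14.
struct Position { float x, y, z; };

static Position PositionFromPose(float const pose[16])
{
    Position p;
    p.x = pose[12];
    p.y = pose[13];
    p.z = pose[14];
    return p;
}
```

You would pass it the pose shown above, e.g. the controller's entry in tracked_device_data.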

The Head Mounted Display (hmd) is the principal means of accessing the OpenVR API directly. Users will need it almost any time they wish to call OpenVR themselves.

The hmd is accessible through the GetHeadMountedDisplay() function.

Using the Sample Operators

The sample operators all work similarly. They need to be initialized with a reference to the VR session so that they can access the tracked device data. A device index that corresponds to a controller needs to be attached to them via Attach(), along with an enum describing which button will cause the operator to become active.

All of the operators have a HandleFrame() function which should be called once controller input has been collected. This function checks that the button specified in the Attach() function is active before proceeding.

They all have a Detach() function which causes the operator to become inactive. These are simply suggestions on how interaction with VR can be accomplished, but users are of course free to experiment with any paradigm that suits their applications.
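The Attach() / HandleFrame() / Detach() lifecycle described above can be sketched as a plain C++ state machine. The class below is a hypothetical stand-in, not one of the actual operator classes: it only counts the frames it handles where a real operator would move, select, or scale.

```cpp
#include <cstdint>

// Hypothetical sketch of the operator lifecycle (not the actual VR API):
// Attach() binds a controller and a button, HandleFrame() acts only while
// attached and the bound button is pressed, Detach() deactivates it.
class SketchOperator
{
public:
    SketchOperator() : attached(false), device_index(0), button(0), frames_handled(0) {}

    void Attach(uint32_t in_device_index, int in_button)
    {
        device_index = in_device_index;
        button = in_button;
        attached = true;
    }

    void Detach() { attached = false; }

    // `button_down` stands in for the per-frame controller input the real
    // operators read from the VR session's tracked device data.
    void HandleFrame(bool button_down)
    {
        if (!attached || !button_down)
            return;              // inactive or idle operators ignore the frame
        ++frames_handled;        // a real operator would move/select/scale here
    }

    int FramesHandled() const { return frames_handled; }

private:
    bool attached;
    uint32_t device_index;
    int button;
    int frames_handled;
};
```

The real operators follow the same shape, except that HandleFrame() reads the button state itself and performs the actual transformation.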

OpenVR Simple Sandbox

The OpenVR Simple sandbox application uses the API discussed above and is a good starting point for developing a VR application with Visualize. The source code for this app is available, and provides a template for a custom VR class using all three operators.

Here are a few notes about the OpenVR Simple application:

Considerations