Augmented Reality

Please note that HoloLens AR is currently supported only on Windows desktop, using the C++ API and the DirectX 11 driver.

For detailed instructions on setting up AR/VR with Visualize, please see the Getting Started section.

Visualize currently supports AR in both HoloLens native and HoloLens remoting applications. A HoloLens native app is built for the HoloLens architecture and deployed to the device. A HoloLens remoting application uses a remotely connected computer to render the image and stream it over a Wi-Fi network to the HoloLens. While the two types of application are structured similarly, we will address them separately.

HoloLens Native

In order to add native HoloLens support to your application, the following items are required:

  • A HoloLens headset

  • A Windows desktop machine; HoloLens is currently supported only when using the C++ API and the DirectX 11 driver

For a reference implementation of native HoloLens support in a Visualize application, refer to the HPS Holographic Sandbox, bundled with your copy of Visualize.

1. Set Up PC and HoloLens

You must enable Developer mode on both the PC and HoloLens, and pair the devices, to allow deployment from Visual Studio on the PC to the HoloLens device. For details, see Microsoft’s documentation here: https://docs.microsoft.com/en-us/windows/mixed-reality/using-visual-studio

2. Initialize the Application in Visualize

The essential steps are to implement the Windows::ApplicationModel::Core::IFrameworkView and Windows::ApplicationModel::Core::IFrameworkViewSource interfaces, and then call CoreApplication::Run(yourIFrameworkViewSource).

The IFrameworkViewSource implementation only needs to create a new IFrameworkView object. See the AppViewSource class in AppView.h and AppView.cpp for an example.

The IFrameworkView implementation should handle the various events that can come from the HoloLens: ViewActivated, Suspending, Resuming, WindowClosed, etc. See the AppView class in AppView.h and AppView.cpp for an example.

We recommend keeping the HPS-related activity in a separate class; in the sample, we use HolographicSandboxMain.
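In outline, the skeleton looks like the following C++/CX sketch, modeled on the AppView and AppViewSource classes in the sample (method bodies are elided, and exact signatures may differ slightly from the shipped sample):

```cpp
using namespace Windows::ApplicationModel::Core;

// Implements IFrameworkView: the HoloLens app's lifecycle callbacks.
ref class AppView sealed : public IFrameworkView
{
public:
    // Called first; subscribe to lifecycle events here.
    virtual void Initialize(CoreApplicationView^ applicationView)
    {
        applicationView->Activated +=
            ref new Windows::Foundation::TypedEventHandler<CoreApplicationView^,
                Windows::ApplicationModel::Activation::IActivatedEventArgs^>(
                    this, &AppView::OnViewActivated);
    }
    virtual void SetWindow(Windows::UI::Core::CoreWindow^ window) { /* hook WindowClosed, etc. */ }
    virtual void Load(Platform::String^ entryPoint) {}
    virtual void Run() { /* main render loop; see step 4 */ }
    virtual void Uninitialize() {}

private:
    void OnViewActivated(CoreApplicationView^ sender,
        Windows::ApplicationModel::Activation::IActivatedEventArgs^ args)
    {
        sender->CoreWindow->Activate();   // make the window visible
    }
};

// Implements IFrameworkViewSource: its only job is to create the view.
ref class AppViewSource sealed : IFrameworkViewSource
{
public:
    virtual IFrameworkView^ CreateView() { return ref new AppView(); }
};

[Platform::MTAThread]
int main(Platform::Array<Platform::String^>^)
{
    CoreApplication::Run(ref new AppViewSource());
    return 0;
}
```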

3. Initialize HoloLens in Visualize

In the IFrameworkView::Initialize method, set up HPS to handle the HoloLens device. This includes subscribing an InitPicture event handler that maintains the HoloLens context.
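A minimal sketch of such a handler follows. The class and event names here (InitPictureHandler, HPS::InitPictureEvent) are assumptions inferred from the description above; consult HolographicSandboxMain in the sample for the exact types and subscription code.

```cpp
// Sketch only: handler/event names are assumptions; see the sample for the
// exact types used by the Holographic Sandbox.
class InitPictureHandler : public HPS::EventHandler
{
public:
    HandleResult Handle(HPS::Event const * in_event) override
    {
        // Runs each time HPS begins a picture: re-bind the current holographic
        // frame's camera resources so HPS renders into the HoloLens back buffer.
        return HandleResult::Handled;
    }
};

// In IFrameworkView::Initialize, after creating the HPS::World:
InitPictureHandler * handler = new InitPictureHandler();
HPS::Database::GetEventDispatcher().Subscribe(
    *handler, HPS::Object::ClassID<HPS::InitPictureEvent>());
```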

4. Start a Render Loop

In the IFrameworkView::Run method, we process events coming in from the HoloLens, update the scene, and render a new frame. We do this by requesting a new frame from the HoloLens, rendering into that frame, and then presenting the frame back to the device.
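The loop described above might look like the following C++/CX sketch. RenderWithHPS is a hypothetical stand-in for your HPS update/render call, and m_windowClosed and m_holographicSpace are members set up earlier, as in the sample:

```cpp
using namespace Windows::Graphics::Holographic;
using namespace Windows::UI::Core;

void AppView::Run()
{
    while (!m_windowClosed)
    {
        // 1. Drain pending input/lifecycle events from the HoloLens.
        CoreWindow::GetForCurrentThread()->Dispatcher->ProcessEvents(
            CoreProcessEventsOption::ProcessAllIfPresent);

        // 2. Request a new frame; its prediction holds the camera poses.
        HolographicFrame^ frame = m_holographicSpace->CreateNextFrame();

        // 3. Update the scene and render into the frame's back buffer via HPS.
        //    (RenderWithHPS is hypothetical; see HolographicSandboxMain.)
        RenderWithHPS(frame);

        // 4. Present the finished frame back to the device.
        frame->PresentUsingCurrentPrediction();
    }
}
```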

HoloLens Remoting

In order to add remote HoloLens support to your application, the following items are required:

  • A HoloLens headset

  • A Windows desktop machine; HoloLens is currently supported only when using the C++ API and the DirectX 11 driver

For a reference implementation of HoloLens remoting in a Visualize application, refer to the HPS Holographic Remoting Sandbox, bundled with your copy of Visualize.

1. Set Up PC and HoloLens

Connect the PC and HoloLens to the same network. Open the Holographic Remoting application on the HoloLens. This should display an IP address, which you will need to provide to your code; the remoting sample app takes it as a command-line argument.

2. Initialize the Application in Visualize

The fundamental steps of HoloLens remoting are similar to HoloLens native development. You must create an HPS offscreen window, connect to the device, and then start your update/render loop. See the main function in main.cpp for details. The difference is that rather than implementing the IFrameworkView interface, you initialize a HolographicStreamerHelpers object and attach handlers to its connection events: OnConnected, OnDisconnected, etc.
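The event hookup might look like this C++/CX sketch (the ConnectedEvent and DisconnectedEvent delegate types come from the Holographic Remoting headers; d3dDevice is assumed to be your DirectX 11 device):

```cpp
using namespace Microsoft::Holographic;

m_streamerHelpers = ref new HolographicStreamerHelpers();
m_streamerHelpers->CreateStreamer(d3dDevice);   // your DirectX 11 device

m_streamerHelpers->OnConnected += ref new ConnectedEvent(
    []() { /* connection established; safe to start rendering */ });

m_streamerHelpers->OnDisconnected += ref new DisconnectedEvent(
    [](HolographicStreamerConnectionFailureReason reason)
    { /* tear down, or attempt to reconnect */ });
```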

3. Connect to HoloLens From PC

The HolographicStreamerHelpers class provides most of the interaction with the HoloLens. In particular, it has a Connect function that you must call in order to start receiving poses and sending frames to the device. You pass it the IP address of the HoloLens headset and a port number (8001 by default). You can also get the HolographicSpace from the HolographicStreamerHelpers object; this is how the render loop receives poses from the device.
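For example (C++/CX sketch; the IP address shown is a placeholder, and in the sample it arrives on the command line):

```cpp
// ipAddress comes from the Holographic Remoting player on the headset.
Platform::String^ ipAddress = ref new Platform::String(L"192.168.1.7");  // example only

try
{
    m_streamerHelpers->Connect(ipAddress->Data(), 8001);   // 8001 is the default port
}
catch (Platform::Exception^ ex)
{
    // Connection failed (wrong address, player not running, firewall, ...).
}

// The HolographicSpace is how the render loop receives poses from the device.
m_holographicSpace = m_streamerHelpers->HolographicSpace;
```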

4. Render Loop

We then start the main loop. We request a prediction of the camera poses from the device, set those cameras in HPS, update the HPS window, and Present the result to the HoloLens. See the sample's main loop, in particular the Update and Render functions.
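One iteration of that loop might be sketched as follows. SetHPSCameraFromPose is a hypothetical stand-in for the camera setup done in the sample's Update function, and m_referenceFrame is assumed to be a stationary spatial frame of reference created at startup:

```cpp
using namespace Windows::Graphics::Holographic;

HolographicFrame^ frame = m_holographicSpace->CreateNextFrame();
HolographicFramePrediction^ prediction = frame->CurrentPrediction;

for (HolographicCameraPose^ pose : prediction->CameraPoses)
{
    // Left/right view matrices, relative to your stationary reference frame.
    auto viewTransform = pose->TryGetViewTransform(
        m_referenceFrame->CoordinateSystem);
    HolographicStereoTransform projection = pose->ProjectionTransform;

    // Hypothetical: push these matrices into the HPS stereo camera.
    SetHPSCameraFromPose(viewTransform, projection);
}

// Update the HPS off-screen window (Update/Render in the sample), then:
frame->PresentUsingCurrentPrediction();
```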

Considerations

  • Performance is the main concern when using AR. For a smooth experience, 60 FPS should be the minimum requirement. Lower or jumpy frame rates can make the user feel sick. Because of this, the main render loop should be as bare-bones as possible; if you need to perform expensive calculations, run them on a separate thread.

  • When in AR mode, expect the camera to be in constant movement, since it is tied to the position of the AR headset. As a consequence, some of the approaches used in highlighting and in the default operators will not work in AR mode. For example:

    • Visualize’s default orbit operator works by modifying the camera position. Since the position of the camera depends on the AR headset, it is necessary to change the model’s modelling matrix instead.

    • Because of the constant camera changes, Visualize will not cache the status of rendering buffers between updates. This means that the overlay highlighting setting HPS::Drawing::Overlay::WithZValues, which relies on these cached buffers, cannot be used effectively; highlights with this setting will render sub-optimally in an AR application.

  • Currently, the Hide operation, and more generally any Highlight operation with the InPlace overlay setting, is very expensive performance-wise and should be avoided in AR mode. This is something we are currently addressing for Visualize 2018 SP2.

  • Some view-dependent geometry, such as non-transformable text and patterned lines, will not transform correctly in stereo mode. This creates an artifact in which two instances of the geometry appear to be visible unless the user closes one eye.

  • UI design in the HoloLens is an active area of interest. For Microsoft’s best practices, see here: https://docs.microsoft.com/en-us/windows/mixed-reality/design

  • For developing UI with the HoloLens, you can directly query the HoloLens API. See examples of gesture detection, speech recognition, and more here: https://github.com/Microsoft/MixedRealityCompanionKit

  • (Remoting) Low network latency is critical to a well-behaved remoting application. It is recommended to use the HolographicStreamerHelpers->SetMaxBitrate() function to set an appropriate bitrate, balancing image quality against latency. We recommend a value of around 4k.

  • (Remoting) Since the PC’s graphics card is used for rendering, it is recommended that you do not run other graphics-intensive processes on the same PC as the remoting application.
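To illustrate the orbit workaround mentioned above: instead of orbiting the camera (which the headset owns), accumulate the rotation into the model's modelling matrix. The sketch below is plain, self-contained C++ with a minimal 4x4 matrix type standing in for the HPS matrix types; in a real application you would concatenate the rotation into the segment's modelling matrix via the HPS API.

```cpp
#include <array>
#include <cmath>

// Minimal stand-in for a 4x4 row-major modelling matrix.
using Mat4 = std::array<float, 16>;

constexpr Mat4 identity()
{
    return { 1,0,0,0,  0,1,0,0,  0,0,1,0,  0,0,0,1 };
}

// Rotation of `degrees` about the Y axis, as a modelling matrix.
Mat4 rotationY(float degrees)
{
    float r = degrees * 3.14159265358979f / 180.0f;
    float c = std::cos(r), s = std::sin(r);
    Mat4 m = identity();
    m[0] = c;  m[2]  = s;
    m[8] = -s; m[10] = c;
    return m;
}

Mat4 multiply(Mat4 const & a, Mat4 const & b)
{
    Mat4 out{};   // zero-initialized
    for (int row = 0; row < 4; ++row)
        for (int col = 0; col < 4; ++col)
            for (int k = 0; k < 4; ++k)
                out[row * 4 + col] += a[row * 4 + k] * b[k * 4 + col];
    return out;
}

// AR-friendly "orbit": leave the camera alone (the headset controls it) and
// concatenate the rotation into the model's modelling matrix instead.
Mat4 orbitModel(Mat4 const & modellingMatrix, float degrees)
{
    return multiply(rotationY(degrees), modellingMatrix);
}
```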