Please note, HoloLens AR is currently only supported on Windows Desktop when using the C++ API and the DirectX 11 driver.

Augmented Reality


For detailed instructions on setting up AR/VR with Visualize, please see the Getting Started section.

Visualize currently supports AR in both HoloLens native and HoloLens remoting applications. A HoloLens native app is built for the HoloLens architecture and deployed to the device. A HoloLens remoting application uses a remotely connected computer to render the image and sends it over a WiFi network to the HoloLens. While there are many similarities in how these applications are structured, we will address them separately.

HoloLens Native

In order to add native HoloLens support to your application, the following items are required:

  • A HoloLens headset
  • HoloLens development with Visualize is currently only supported on Windows Desktop, using the C++ API and the DirectX 11 driver.

For a reference implementation of HoloLens native in a Visualize application, you can refer to the HPS Holographic Sandbox, bundled with your copy of Visualize.

1. Set Up PC and HoloLens

You must enable Developer mode on both the PC and HoloLens, and pair the devices, to allow deployment from Visual Studio on the PC to the HoloLens device. For details, see Microsoft's documentation here: https://docs.microsoft.com/en-us/windows/mixed-reality/using-visual-studio

2. Initializing Application in Visualize

The fundamental steps are to implement the Windows::ApplicationModel::Core::IFrameworkView and Windows::ApplicationModel::Core::IFrameworkViewSource classes, and call CoreApplication::Run(yourIFrameworkViewSource).

The IFrameworkViewSource class only needs to create a new IFrameworkView object. See the AppViewSource class in AppView.h and AppView.cpp for an example.

The IFrameworkView class should handle the various events that can come from the HoloLens: ViewActivated, Suspending, Resuming, WindowClosed, etc. See the AppView class in AppView.h and AppView.cpp for an example.

We recommend separating the HPS-related activity into a separate class - in the sample, we use HolographicSandboxMain.
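The relationship between these classes can be sketched in plain C++. This is a structural sketch only: the WinRT interfaces are mocked as minimal abstract classes, and the AppView, AppViewSource, and HolographicSandboxMain names simply mirror the sample's; none of this is the actual WinRT code.

```cpp
#include <cassert>
#include <memory>
#include <string>
#include <vector>

// Stand-ins for the WinRT IFrameworkView / IFrameworkViewSource interfaces.
struct IFrameworkView {
    virtual ~IFrameworkView() = default;
    virtual void Initialize() = 0; // set up HPS and the HoloLens context
    virtual void Run() = 0;        // the render loop lives here
};
struct IFrameworkViewSource {
    virtual ~IFrameworkViewSource() = default;
    virtual std::unique_ptr<IFrameworkView> CreateView() = 0;
};

// All HPS-related work is kept in its own class, as the sample does
// with HolographicSandboxMain (methods mocked to record calls).
struct HolographicSandboxMain {
    void Update() { calls.push_back("Update"); }
    void Render() { calls.push_back("Render"); }
    std::vector<std::string> calls;
};

struct AppView : IFrameworkView {
    HolographicSandboxMain main_;
    void Initialize() override { /* create offscreen window, handlers, ... */ }
    void Run() override {
        // One iteration stands in for the real while(!m_windowClosed) loop.
        main_.Update();
        main_.Render();
    }
};

// The view source's only job is to create a new view.
struct AppViewSource : IFrameworkViewSource {
    std::unique_ptr<IFrameworkView> CreateView() override {
        return std::make_unique<AppView>();
    }
};

// CoreApplication::Run boils down to: make a view, initialize it, run it.
void RunApplication(IFrameworkViewSource & source) {
    auto view = source.CreateView();
    view->Initialize();
    view->Run();
}
```

The real AppView additionally subscribes to the lifecycle events listed above (ViewActivated, Suspending, and so on) before entering its loop.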

3. Initializing HoloLens in Visualize

In the IFrameworkView::Initialize method, you can set up HPS to handle the HoloLens device. This includes installing an InitPictureHandler to maintain the HoloLens context.

HolographicSandboxMain::HolographicSandboxMain()
{
    // Step 1: Set up an offscreen window for HoloLens
    HPS::OffScreenWindowOptionsKit oswok;
    oswok.SetHardwareResident(true);
    oswok.SetAntiAliasCapable(false);
    oswok.SetOpacity(0.0f);
    m_window_key = HPS::Database::CreateOffScreenWindow(1268, 720, oswok);
    m_window_aspect = 1268.0f / 720;

    // Step 2: Set the HPS driver bit to render to HoloLens
    HPS_Driver_Set_HoloLens_Bit(m_window_key, 0);

    // Step 3: Create an InitPictureHandler (see below)
    m_window_key.SetDriverEventHandler(m_init_picture_handler, HPS::Object::ClassID<HPS::InitPictureEvent>());
    m_window_key.UpdateWithNotifier().Wait();
    assert(m_init_picture_handler.dx11_device.Get());

    // Step 4: Create the HPS canvas, view, and model
    auto portfolio = HPS::Database::CreatePortfolio();
    m_canvas = HPS::Factory::CreateCanvas(m_window_key, portfolio);
    m_canvas.AttachViewAsLayout(m_view);
    m_view.GetSegmentKey().GetDrawingAttributeControl().SetWorldHandedness(HPS::Drawing::Handedness::Right);
    m_view.GetSegmentKey().GetMaterialMappingControl().SetWindowColor(HPS::RGBAColor(0.0f, 0.0f, 0.0f, 0.0f));
    m_view.GetSegmentKey().GetCullingControl().SetFrustum(true).SetExtent(0);
    auto model = HPS::Factory::CreateModel();
    m_view.AttachModel(model);

    // Your initialization code here
}
// The InitPictureHandler class is an HPS DriverEventHandler with just one method - Handle
class InitPictureHandler : public HPS::DriverEventHandler
{
public:
    Microsoft::WRL::ComPtr<ID3D11Device> dx11_device;
    ID3D11Texture2D * target = nullptr;
    HPS::MatrixKit proj_left;
    HPS::MatrixKit proj_right;
    HPS::MatrixKit view_left;
    HPS::MatrixKit view_right;

    void Handle(HPS::DriverEvent const * in_event);
};

// This method populates the projection and view matrices for each eye
void InitPictureHandler::Handle(HPS::DriverEvent const * in_event)
{
    auto e = (HPS::InitPictureEvent const *)in_event;
    static bool first_update = true;
    if (first_update)
    {
        // On the first update, grab the D3D11 device HPS created
        auto ctx = (ID3D11DeviceContext*)e->GetGraphicsContext();
        ctx->GetDevice(&dx11_device);
        first_update = false;
    }
    else
    {
        e->SetRenderSurface((HPS::OpaqueHandle)target);
        e->SetStereoMatrix(HPS::DriverEvent::StereoMatrix::ProjectionLeft, proj_left);
        e->SetStereoMatrix(HPS::DriverEvent::StereoMatrix::ProjectionRight, proj_right);
        e->SetStereoMatrix(HPS::DriverEvent::StereoMatrix::ViewLeft, view_left);
        e->SetStereoMatrix(HPS::DriverEvent::StereoMatrix::ViewRight, view_right);
    }
}

4. Start a Render Loop

In the IFrameworkView::Run method, we process events coming in from the HoloLens, update the scene, and render a new frame. We do this by requesting a new frame from the HoloLens, rendering into that frame, and then presenting the frame back to the device.

// m_main is of class HolographicSandboxMain
void AppView::Run()
{
    while (!m_windowClosed)
    {
        if (m_windowVisible && (m_holographicSpace != nullptr))
        {
            CoreWindow::GetForCurrentThread()->Dispatcher->ProcessEvents(CoreProcessEventsOption::ProcessAllIfPresent);
            HolographicFrame^ holographicFrame = m_main->Update();
            if (m_main->Render(holographicFrame))
            {
                // The holographic frame has an API that presents the swap chain
                // for each holographic camera.
                HolographicFramePresentResult present_result = holographicFrame->PresentUsingCurrentPrediction(HolographicFramePresentWaitBehavior::DoNotWaitForFrameToFinish);
            }
        }
        else
        {
            CoreWindow::GetForCurrentThread()->Dispatcher->ProcessEvents(CoreProcessEventsOption::ProcessOneAndAllPending);
        }
    }
}
// Updates the application state once per frame.
HolographicFrame^ HolographicSandboxMain::Update()
{
    // The HolographicFrame has information that the app needs in order
    // to update and render the current frame. The app begins each new
    // frame by calling CreateNextFrame.
    HolographicFrame^ holographicFrame = m_holographicSpace->CreateNextFrame();

    // Get a prediction of where the holographic cameras will be when this
    // frame is presented.
    HolographicFramePrediction^ prediction = holographicFrame->CurrentPrediction;

    // Next, get a coordinate system from the attached frame of reference that
    // is associated with the current frame. Later, this coordinate system is
    // used for creating the stereo view matrices when rendering the sample content.
    SpatialCoordinateSystem^ currentCoordinateSystem = m_referenceFrame->CoordinateSystem;

    // Complete the frame update by using information about our content
    // positioning to set the focus point.
    for (auto cameraPose : prediction->CameraPoses)
    {
        // The HolographicCameraRenderingParameters class provides access to set
        // the image stabilization parameters.
        HolographicCameraRenderingParameters^ renderingParameters = holographicFrame->GetRenderingParameters(cameraPose);

        // SetFocusPoint informs the system about a specific point in your scene to
        // prioritize for image stabilization. The focus point is set independently
        // for each holographic camera. You should set the focus point near the
        // content that the user is looking at. In this example, we put the focus
        // point at the center of the sample hologram, since that is the only
        // hologram available for the user to focus on. You can also set the
        // relative velocity and facing of that content; the sample hologram is
        // at a fixed point, so we only need to indicate its position.
        renderingParameters->SetFocusPoint(currentCoordinateSystem, m_focus_point);
    }

    // The holographic frame will be used to get up-to-date view and projection
    // matrices and to present the swap chain.
    return holographicFrame;
}
// Renders the current frame to each holographic camera, according to the
// current application and spatial positioning state. Returns true if the
// frame was rendered to at least one camera.
bool HolographicSandboxMain::Render(Windows::Graphics::Holographic::HolographicFrame^ holographicFrame)
{
    // Up-to-date frame predictions enhance the effectiveness of image
    // stabilization and allow more accurate positioning of holograms.
    holographicFrame->UpdateCurrentPrediction();
    auto prediction = holographicFrame->CurrentPrediction;
    bool atLeastOneCameraRendered = false;
    for (auto cameraPose : prediction->CameraPoses)
    {
        // The view and projection matrices for each holographic camera will
        // change every frame.
        auto & cameraProjectionTransform = cameraPose->ProjectionTransform;

        // If TryGetViewTransform returns a null pointer, the pose and coordinate
        // system cannot be understood relative to one another; content cannot be
        // rendered in this coordinate system for the duration of the current frame.
        // This usually means that positional tracking is not active for the current
        // frame, in which case it is possible to use a
        // SpatialLocatorAttachedFrameOfReference to render content that is not
        // world-locked instead.
        Platform::IBox<HolographicStereoTransform>^ viewTransformContainer = cameraPose->TryGetViewTransform(m_referenceFrame->CoordinateSystem);
        if (viewTransformContainer)
        {
            auto & viewCoordinateSystemTransform = viewTransformContainer->Value;
            UpdateCameraParams(
                (float const *)&cameraProjectionTransform.Left,
                (float const *)&cameraProjectionTransform.Right,
                (float const *)&viewCoordinateSystemTransform.Left,
                (float const *)&viewCoordinateSystemTransform.Right,
                m_window_aspect,
                m_init_picture_handler.proj_left,
                m_init_picture_handler.proj_right,
                m_init_picture_handler.view_left,
                m_init_picture_handler.view_right,
                m_camera_kit);
        }
        HolographicCameraRenderingParameters^ camera_parameters = holographicFrame->GetRenderingParameters(cameraPose);
        ComPtr<ID3D11Resource> resource;
        ThrowIfFailed(GetDXGIInterfaceFromObject(camera_parameters->Direct3D11BackBuffer, IID_PPV_ARGS(&resource)));
        ComPtr<ID3D11Texture2D> camera_backbuffer;
        ThrowIfFailed(resource.As(&camera_backbuffer));
        m_init_picture_handler.target = camera_backbuffer.Get();
        m_window_key.UpdateWithNotifier().Wait();
        m_init_picture_handler.target = nullptr;

        // The HoloLens docs say to use the previous frame's camera for culling.
        m_view.GetSegmentKey().SetCamera(m_camera_kit);
        atLeastOneCameraRendered = true;
    }
    return atLeastOneCameraRendered;
}

HoloLens Remoting

In order to add remote HoloLens support to your application, the following items are required:

  • A HoloLens headset
  • HoloLens development with Visualize is currently only supported on Windows Desktop, using the C++ API and the DirectX 11 driver.

For a reference implementation of HoloLens remote in a Visualize application, you can refer to the HPS Holographic Remoting Sandbox, bundled with your copy of Visualize.

1. Set Up PC and HoloLens

Connect the PC and HoloLens to the same network. Open the Holographic Remoting application on the HoloLens. This will display an IP address, which you will need to provide to your code; the Remoting sample app passes it as a command line argument.

2. Initializing Application in Visualize

The fundamental steps of HoloLens Remoting are similar to HoloLens native development. You must create an HPS offscreen window, connect to the device, and then start your update/render loop. See the main function in main.cpp for details. The difference is that instead of implementing the IFrameworkView class, you must initialize a HolographicStreamerHelpers object and subscribe handlers to its events: OnConnected, OnDisconnected, etc.

#include <d3d11_1.h>
#include <dxgi1_3.h>
#include <assert.h>
#include <HolographicStreamerHelpers.h>
#include <windows.graphics.directx.direct3d11.interop.h>
#pragma comment(lib, "d3d11.lib")
#include "hps.h"
#include "sprk.h"
#include "hoops_license.h"
#include "../vr_shared/arvr.h"
using namespace Microsoft::Holographic;
using namespace Microsoft::WRL;
using namespace Microsoft::WRL::Wrappers;
using namespace Windows::Graphics::Holographic;
using namespace Windows::Graphics::DirectX::Direct3D11;
using namespace Windows::Perception::Spatial;
static const int VIDEO_FRAME_WIDTH = 1280;
static const int VIDEO_FRAME_HEIGHT = 720;
static const float window_aspect = (float)VIDEO_FRAME_WIDTH / VIDEO_FRAME_HEIGHT;
static HPS::Canvas canvas;
static HPS::CameraKit camera_kit;
static HPS::OffScreenWindowKey window_key;
static HPS::View view;
static HPS::SegmentKey help_seg;
static Platform::String^ ip_address = L"127.0.0.1";
static Windows::Graphics::Holographic::HolographicSpace^ holographic_space;
static HolographicStreamerHelpers^ streamer_helpers;
static Windows::Perception::Spatial::SpatialStationaryFrameOfReference^ reference_frame;
static bool connected = false;
// The InitPictureHandler class is an HPS DriverEventHandler with just one
// method - Handle. It populates the projection and view matrices for each eye.
class InitPictureHandler : public HPS::DriverEventHandler
{
public:
    ComPtr<ID3D11Device> dx11_device;
    ID3D11Texture2D * target = nullptr;
    HPS::MatrixKit proj_left;
    HPS::MatrixKit proj_right;
    HPS::MatrixKit view_left;
    HPS::MatrixKit view_right;

    void Handle(HPS::DriverEvent const * in_event)
    {
        auto e = (HPS::InitPictureEvent const *)in_event;
        static bool first_update = true;
        if (first_update)
        {
            // On the first update, grab the D3D11 device HPS created
            auto ctx = (ID3D11DeviceContext*)e->GetGraphicsContext();
            ctx->GetDevice(&dx11_device);
            first_update = false;
        }
        else
        {
            e->SetRenderSurface((HPS::OpaqueHandle)target);
            e->SetStereoMatrix(HPS::DriverEvent::StereoMatrix::ProjectionLeft, proj_left);
            e->SetStereoMatrix(HPS::DriverEvent::StereoMatrix::ProjectionRight, proj_right);
            e->SetStereoMatrix(HPS::DriverEvent::StereoMatrix::ViewLeft, view_left);
            e->SetStereoMatrix(HPS::DriverEvent::StereoMatrix::ViewRight, view_right);
        }
    }
};
static void Connect()
{
    if (connected)
    {
        streamer_helpers->Disconnect();
        connected = false;
    }
    else
    {
        try
        {
            streamer_helpers->Connect(ip_address->Data(), 8001);
        }
        catch (Platform::Exception^ ex)
        {
            DebugLog(L"Connect failed with hr = 0x%08X", ex->HResult);
        }
    }
}
int main(Platform::Array<Platform::String^>^ args)
{
    RoInitializeWrapper roinit(RO_INIT_MULTITHREADED);
    ThrowIfFailed(HRESULT(roinit));
    Platform::String^ filename;
    float scale = 1.0f;
    for (uint32_t i = 1; i < args->Length; ++i)
    {
        if (args[i] == "-ip" && i + 1 < args->Length)
            ip_address = args[++i]; // Get the IP address of the device
    }
    auto world = new HPS::World(HOOPS_LICENSE);
    HWND hwnd = CreateWin32Window(VIDEO_FRAME_WIDTH, VIDEO_FRAME_HEIGHT, L"HPS Holographic Remoting Sandbox", false, wndproc);
    HPS::OffScreenWindowOptionsKit oswok;
    oswok.SetHardwareResident(true);
    oswok.SetAntiAliasCapable(true);
    oswok.SetOpacity(0.0f);
    window_key = HPS::Database::CreateOffScreenWindow(VIDEO_FRAME_WIDTH, VIDEO_FRAME_HEIGHT, oswok);
    HPS_Driver_Set_HoloLens_Bit(window_key, (HPS::WindowHandle)hwnd);
    auto portfolio = HPS::Database::CreatePortfolio();
    canvas = HPS::Factory::CreateCanvas(window_key, portfolio);
    InitPictureHandler init_picture_handler;
    window_key.SetDriverEventHandler(init_picture_handler, HPS::Object::ClassID<HPS::InitPictureEvent>());
    window_key.UpdateWithNotifier().Wait();
    assert(init_picture_handler.dx11_device.Get());
    view.GetSegmentKey().GetDrawingAttributeControl().SetWorldHandedness(HPS::Drawing::Handedness::Right);
    view.GetSegmentKey().GetMaterialMappingControl().SetWindowColor(HPS::RGBAColor(0.0f, 0.0f, 0.0f, 0.0f));
    view.GetSegmentKey().GetCullingControl().SetFrustum(true).SetExtent(0);

    // This segment will be displayed on the PC
    help_seg = view.GetSegmentKey().Subsegment("help_text");
    help_seg.InsertText(HPS::Point(0, 0, 0), "Press space bar to connect");
    help_seg.GetVisibilityControl().SetText(true);
    help_seg.GetMaterialMappingControl().SetTextColor(HPS::RGBAColor::White());
    canvas.AttachViewAsLayout(view);
    auto model = HPS::Factory::CreateModel();
    view.AttachModel(model);
    Windows::Foundation::Numerics::float3 focus_point;
    streamer_helpers = ref new HolographicStreamerHelpers();
    streamer_helpers->CreateStreamer(init_picture_handler.dx11_device.Get());
    streamer_helpers->SetVideoFrameSize(VIDEO_FRAME_WIDTH, VIDEO_FRAME_HEIGHT);
    streamer_helpers->SetMaxBitrate(4 * 1024);
    streamer_helpers->SetEnableAudio(false);
    holographic_space = streamer_helpers->HolographicSpace;
    {
        // Acquire the DXGI interface for the Direct3D device.
        ComPtr<IDXGIDevice3> dxgi_device;
        ThrowIfFailed(init_picture_handler.dx11_device.As<IDXGIDevice3>(&dxgi_device));

        // Wrap the native device using a WinRT interop object.
        auto interop_device = CreateDirect3DDevice(dxgi_device.Get());
        holographic_space->SetDirect3D11Device(interop_device);
    }
    reference_frame = SpatialLocator::GetDefault()->CreateStationaryFrameOfReferenceAtCurrentLocation();
    streamer_helpers->OnConnected += ref new ConnectedEvent(
        []()
        {
            DebugLog(L"Connected");
            connected = true;
        });

    // Handle disconnection
    Platform::WeakReference streamerHelpersWeakRef = Platform::WeakReference(streamer_helpers);
    streamer_helpers->OnDisconnected += ref new DisconnectedEvent(
        [streamerHelpersWeakRef](_In_ HolographicStreamerConnectionFailureReason failureReason)
        {
            DebugLog(L"Disconnected with reason %d", failureReason);
            connected = false;

            // Reconnect if this is a transient failure.
            if (failureReason == HolographicStreamerConnectionFailureReason::Unreachable ||
                failureReason == HolographicStreamerConnectionFailureReason::ConnectionLost)
            {
                DebugLog(L"Reconnecting...");
                try
                {
                    auto helpersResolved = streamerHelpersWeakRef.Resolve<HolographicStreamerHelpers>();
                    if (helpersResolved)
                        helpersResolved->Connect(ip_address->Data(), 8001);
                    else
                        DebugLog(L"Failed to reconnect because a disconnect has already occurred.\n");
                }
                catch (Platform::Exception^ ex)
                {
                    DebugLog(L"Connect failed with hr = 0x%08X", ex->HResult);
                }
            }
            else
            {
                DebugLog(L"Disconnected with unrecoverable error, not attempting to reconnect.");
            }
        });
    streamer_helpers->OnSendFrame += ref new SendFrameEvent(
        [](_In_ const ComPtr<ID3D11Texture2D>& spTexture, _In_ FrameMetadata metadata)
        {
        });

    // Main Update/Render loop
    // See hololens_remote_2.cpp for an example
    .....
}

3. Connect to HoloLens from PC

The HolographicStreamerHelpers class contains most of the ways of interacting with the HoloLens. In particular, it has a Connect function that you must call in order to start receiving poses and sending frames to the device. You pass this the IP address of the HoloLens headset, and a port number - by default 8001. We can also get the HolographicSpace from the HolographicStreamerHelpers object, which will be how our render loop receives poses from the device.

// Includes, globals, and the InitPictureHandler class are identical to the previous listing.
static void Connect()
{
    if (connected)
    {
        streamer_helpers->Disconnect();
        connected = false;
    }
    else
    {
        try
        {
            streamer_helpers->Connect(ip_address->Data(), 8001);
        }
        catch (Platform::Exception^ ex)
        {
            DebugLog(L"Connect failed with hr = 0x%08X", ex->HResult);
        }
    }
}
int main(Platform::Array<Platform::String^>^ args)
{
    // Window, HPS, and event handler setup are identical to the previous listing.
    .....
    streamer_helpers = ref new HolographicStreamerHelpers();
    streamer_helpers->CreateStreamer(init_picture_handler.dx11_device.Get());
    streamer_helpers->SetVideoFrameSize(VIDEO_FRAME_WIDTH, VIDEO_FRAME_HEIGHT);
    streamer_helpers->SetMaxBitrate(4 * 1024);
    streamer_helpers->SetEnableAudio(false);
    holographic_space = streamer_helpers->HolographicSpace;
    .....
}

4. Render Loop

We then start the main loop. We request from the device a prediction of where the cameras will be, then set those cameras in HPS. We then update the HPS window and Present the result to the HoloLens. See the main loop below, in particular the Update and Render functions.

// Includes, globals, and the InitPictureHandler class are identical to the previous listings.
static HolographicFrame^ Update(Windows::Foundation::Numerics::float3 const & focus_point)
{
    if (!holographic_space)
        return nullptr;

    // The HolographicFrame has information that the app needs in order
    // to update and render the current frame. The app begins each new
    // frame by calling CreateNextFrame.
    HolographicFrame^ holographic_frame = holographic_space->CreateNextFrame();

    // Get a prediction of where the holographic cameras will be when this
    // frame is presented.
    HolographicFramePrediction^ prediction = holographic_frame->CurrentPrediction;
    auto current_coordinate_system = reference_frame->CoordinateSystem;
    for (auto pose : prediction->CameraPoses)
    {
        HolographicCameraRenderingParameters^ camera_parameters = holographic_frame->GetRenderingParameters(pose);
        camera_parameters->SetFocusPoint(current_coordinate_system, focus_point);
    }
    return holographic_frame;
}
static bool Render(HolographicFrame^ holographic_frame, InitPictureHandler & init_picture_handler)
{
    holographic_frame->UpdateCurrentPrediction();
    HolographicFramePrediction^ prediction = holographic_frame->CurrentPrediction;
    bool atLeastOneCameraRendered = false;
    for (auto pose : prediction->CameraPoses)
    {
        // The projection transform for each frame is provided by the HolographicCameraPose.
        auto & cameraProjectionTransform = pose->ProjectionTransform;

        // If TryGetViewTransform returns a null pointer, the pose and coordinate
        // system cannot be understood relative to one another; content cannot be
        // rendered in this coordinate system for the duration of the current frame.
        // This usually means that positional tracking is not active for the current
        // frame, in which case it is possible to use a
        // SpatialLocatorAttachedFrameOfReference to render content that is not
        // world-locked instead.
        Platform::IBox<HolographicStereoTransform>^ viewTransformContainer = pose->TryGetViewTransform(reference_frame->CoordinateSystem);
        if (viewTransformContainer)
        {
            auto & viewCoordinateSystemTransform = viewTransformContainer->Value;
            UpdateCameraParams(
                (float const *)&cameraProjectionTransform.Left,
                (float const *)&cameraProjectionTransform.Right,
                (float const *)&viewCoordinateSystemTransform.Left,
                (float const *)&viewCoordinateSystemTransform.Right,
                ::window_aspect,
                init_picture_handler.proj_left,
                init_picture_handler.proj_right,
                init_picture_handler.view_left,
                init_picture_handler.view_right,
                ::camera_kit);
        }
        HolographicCameraRenderingParameters^ camera_parameters = holographic_frame->GetRenderingParameters(pose);
        ComPtr<ID3D11Resource> resource;
        ThrowIfFailed(GetDXGIInterfaceFromObject(camera_parameters->Direct3D11BackBuffer, IID_PPV_ARGS(&resource)));
        ComPtr<ID3D11Texture2D> camera_backbuffer;
        ThrowIfFailed(resource.As(&camera_backbuffer));
        init_picture_handler.target = camera_backbuffer.Get();
        window_key.UpdateWithNotifier().Wait();
        init_picture_handler.target = nullptr;

        // The HoloLens docs say to use the previous frame's camera for culling.
        view.GetSegmentKey().SetCamera(camera_kit);
        atLeastOneCameraRendered = true;
    }
    return atLeastOneCameraRendered;
}
int main(Platform::Array<Platform::String^>^ args)
{
    // Initialization of HoloLens. See hololens_remote_1.cpp
    .....

    // Main Update/Render loop
    for (;;)
    {
        if (!ProcessWin32Events())
            break;
        if (connected)
        {
            // Get the current prediction of camera location/rotation
            HolographicFrame^ holographic_frame = Update(focus_point);
            if (holographic_frame && Render(holographic_frame, init_picture_handler))
            {
                auto present_result = holographic_frame->PresentUsingCurrentPrediction(HolographicFramePresentWaitBehavior::DoNotWaitForFrameToFinish);
                if (present_result == HolographicFramePresentResult::DeviceRemoved)
                {
                    DebugLog(L"Device removed\n");
                }
            }
        }
    }

    // Cleanup
    init_picture_handler.dx11_device = nullptr;
    holographic_space = nullptr;
    streamer_helpers->Disconnect();
    delete streamer_helpers;
    streamer_helpers = nullptr;
    window_key.UnsetDriverEventHandler(HPS::Object::ClassID<HPS::InitPictureEvent>());
    window_key.Delete();
    delete world;
    return 0;
}

Considerations

  • Performance is the main concern when using AR. For a smooth experience, 60 FPS should be considered the minimum; lower or jumpy frame rates can make the user feel sick. Because of this, the main render loop should be as lean as possible. If you need to perform expensive calculations, it is best to perform them on a separate thread.
  • When in AR mode, you should expect the camera to be in constant motion, since it is tied to the position of the AR headset. As a consequence, some of the approaches used in highlighting and in the default operators will not work in AR mode.

    For example:

    • Visualize's default orbit operator works by modifying the camera position. Since the position of the camera depends on the AR headset, it will be necessary to change the model's modelling matrix instead.
    • Because of the constant camera changes, Visualize will not cache the state of rendering buffers between updates. This means that the Drawing::Overlay::WithZValues overlay highlighting setting, which relies on those cached buffers, cannot be used effectively; highlights with that overlay setting will render sub-optimally in an AR application.

  • Currently the Hide operation and, more generally, Highlight operations with the InPlace Overlay setting are very expensive performance-wise and should be avoided in AR mode. This is something we are currently addressing for Visualize 2018 SP2.
  • Some view-dependent geometry, like non-transformable text and patterned lines, will not transform correctly in stereo mode. This creates artifacts where it looks like two instances of the geometry are visible, unless the user closes one eye.
  • UI design in the HoloLens is an active area of interest. For Microsoft's best practices, see here: https://docs.microsoft.com/en-us/windows/mixed-reality/design
  • For developing UI with the HoloLens, you can directly query the HoloLens API. See examples of gesture detection, speech recognition, and more here: https://github.com/Microsoft/MixedRealityCompanionKit
  • (Remoting) Managing network latency is critical to a well-behaved remoting application. We recommend using the HolographicStreamerHelpers->SetMaxBitrate() function to set an appropriate bitrate, balancing image quality against latency. We recommend a value of around 4k.
  • (Remoting) Since the PC's graphics card is used for rendering, we recommend that you do not run other graphics-intensive processes on the PC at the same time as the remoting application.
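The first consideration above, moving expensive work off the render loop, can be sketched in standard C++. The BackgroundWork name and its interface are hypothetical, not part of the HPS or HoloLens APIs: a worker thread performs the expensive computation while the render loop only polls an atomic flag, so frames keep presenting at full rate.

```cpp
#include <atomic>
#include <cassert>
#include <thread>

// Hypothetical example: an expensive computation runs on a worker thread,
// and the render loop only polls a flag and picks up the finished result.
struct BackgroundWork {
    std::atomic<bool> ready{false};
    long long result = 0;

    void Start() {
        worker_ = std::thread([this] {
            long long sum = 0;
            for (int i = 0; i < 1000000; ++i)  // stand-in for the real work
                sum += i;
            result = sum;
            // Release-store so the render thread's acquire-load sees 'result'.
            ready.store(true, std::memory_order_release);
        });
    }

    // Called once per frame from the render loop; never blocks.
    bool TryConsume(long long & out) {
        if (!ready.load(std::memory_order_acquire))
            return false;
        out = result;
        return true;
    }

    ~BackgroundWork() {
        if (worker_.joinable())
            worker_.join();
    }

private:
    std::thread worker_;
};
```

The render loop would call TryConsume once per frame and apply the result (for example, updating a segment) only when it is available, keeping each frame's cost bounded.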