Augmented Reality
Please note, HoloLens AR is currently only supported on Windows Desktop when using the C++ API and the DirectX 11 driver.
For detailed instructions on setting up AR/VR with Visualize, please see the Getting Started section.
Visualize currently supports AR in both HoloLens native and HoloLens remoting applications. A HoloLens native app is built for the HoloLens architecture and deployed to the device. A HoloLens remoting application renders the image on a remotely connected computer and streams it over a WiFi network to the HoloLens. While these applications are structured similarly in many ways, we will address them separately.
HoloLens Native
In order to add native HoloLens support to your application, the following items are required:
- A HoloLens headset
- Windows Desktop with the C++ API and the DirectX 11 driver (HoloLens is currently only supported in this configuration)
For a reference implementation of native HoloLens support in a Visualize application, you can refer to the HPS Holographic Sandbox, bundled with your copy of Visualize.
1. Set Up PC and HoloLens
You must enable Developer mode on both the PC and HoloLens, and pair the devices, to allow deployment from Visual Studio on the PC to the HoloLens device. For details, see Microsoft's documentation here: https://docs.microsoft.com/en-us/windows/mixed-reality/using-visual-studio
2. Initializing Application in Visualize
The essential steps are to overload the Windows::ApplicationModel::Core::IFrameworkView and Windows::ApplicationModel::Core::IFrameworkViewSource classes, and to call CoreApplication::Run(yourIFrameworkViewSource).
The IFrameworkViewSource class only needs to create a new IFrameworkView object. See the AppViewSource class in AppView.h and AppView.cpp for an example.
The IFrameworkView class should overload the handlers for the various events that can come from the HoloLens: ViewActivated, Suspending, Resuming, WindowClosed, etc. See the AppView class in AppView.h and AppView.cpp for an example.
We recommend separating the HPS-related activity into its own class - in the sample, we use HolographicSandboxMain. A minimal skeleton of the view classes follows.
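Below is a minimal sketch of these classes, assuming the standard UWP C++/CX view-provider interfaces. The method bodies are elided and the entry point follows the usual holographic app pattern; treat this as an outline of AppView.h/AppView.cpp rather than a copy of the sample.
ref class AppView sealed : public Windows::ApplicationModel::Core::IFrameworkView
{
public:
    // Lifecycle and window hooks the HoloLens will drive (bodies elided here)
    virtual void Initialize(Windows::ApplicationModel::Core::CoreApplicationView^ applicationView);
    virtual void SetWindow(Windows::UI::Core::CoreWindow^ window);
    virtual void Load(Platform::String^ entryPoint);
    virtual void Run();
    virtual void Uninitialize();
};

ref class AppViewSource sealed : Windows::ApplicationModel::Core::IFrameworkViewSource
{
public:
    // The view source only needs to create the IFrameworkView
    virtual Windows::ApplicationModel::Core::IFrameworkView^ CreateView()
    {
        return ref new AppView();
    }
};

[Platform::MTAThread]
int main(Platform::Array<Platform::String^>^)
{
    auto appViewSource = ref new AppViewSource();
    Windows::ApplicationModel::Core::CoreApplication::Run(appViewSource);
    return 0;
}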
3. Initializing HoloLens in Visualize
In the IFrameworkView::Initialize method, you can set up HPS to handle the HoloLens device. This includes an InitPictureEvent handler to maintain the HoloLens context.
HolographicSandboxMain::HolographicSandboxMain()
{
    m_window_aspect = 1268.0f / 720;
    HPS_Driver_Set_HoloLens_Bit(m_window_key, 0);
    m_window_key.SetDriverEventHandler(m_init_picture_handler, HPS::Object::ClassID<HPS::InitPictureEvent>());
    m_window_key.UpdateWithNotifier().Wait();
    assert(m_init_picture_handler.dx11_device.Get());

    m_canvas.AttachViewAsLayout(m_view);
    m_view.GetSegmentKey().GetMaterialMappingControl().SetWindowColor(HPS::RGBAColor(0.0f, 0.0f, 0.0f, 0.0f));
    m_view.GetSegmentKey().GetCullingControl().SetFrustum(true).SetExtent(0);
    m_view.AttachModel(model);
}
// The handler's class declaration, stereo-matrix members, and Handle signature
// were missing from this excerpt; they are reconstructed below and should be
// checked against the Holographic Sandbox source.
class InitPictureHandler : public HPS::DriverEventHandler
{
public:
    Microsoft::WRL::ComPtr<ID3D11Device> dx11_device;
    ID3D11Texture2D * target = nullptr;
    // Per-eye matrices, filled in each frame by UpdateCameraParams (type assumed here)
    HPS::MatrixKit proj_left, proj_right, view_left, view_right;

    void Handle(HPS::DriverEvent const * in_event) override
    {
        auto e = (HPS::InitPictureEvent *)in_event;
        static bool first_update = true;
        if (first_update)
        {
            // First update: grab the D3D11 device HPS created for this window
            auto ctx = (ID3D11DeviceContext*)e->GetGraphicsContext();
            ctx->GetDevice(&dx11_device);
            first_update = false;
        }
        else
        {
            // Subsequent updates: render into the HoloLens back buffer using
            // the stereo matrices for the current camera pose
            e->SetRenderSurface((HPS::OpaqueHandle)target);
            e->SetStereoMatrix(HPS::DriverEvent::StereoMatrix::ProjectionLeft, proj_left);
            e->SetStereoMatrix(HPS::DriverEvent::StereoMatrix::ProjectionRight, proj_right);
            e->SetStereoMatrix(HPS::DriverEvent::StereoMatrix::ViewLeft, view_left);
            e->SetStereoMatrix(HPS::DriverEvent::StereoMatrix::ViewRight, view_right);
        }
    }
};
4. Start a Render Loop
In the IFrameworkView::Run method, we process events coming in from the HoloLens, update the scene, and render a new frame. We do this by requesting a new frame from the HoloLens, rendering into that frame, and then presenting the frame back to the device.
void AppView::Run()
{
    while (!m_windowClosed)
    {
        if (m_windowVisible && (m_holographicSpace != nullptr))
        {
            CoreWindow::GetForCurrentThread()->Dispatcher->ProcessEvents(CoreProcessEventsOption::ProcessAllIfPresent);
            HolographicFrame^ holographicFrame = m_main->Update();
            if (m_main->Render(holographicFrame))
            {
                HolographicFramePresentResult present_result = holographicFrame->PresentUsingCurrentPrediction(HolographicFramePresentWaitBehavior::DoNotWaitForFrameToFinish);
            }
        }
        else
        {
            CoreWindow::GetForCurrentThread()->Dispatcher->ProcessEvents(CoreProcessEventsOption::ProcessOneAndAllPending);
        }
    }
}
HolographicFrame^ HolographicSandboxMain::Update()
{
    HolographicFrame^ holographicFrame = m_holographicSpace->CreateNextFrame();
    HolographicFramePrediction^ prediction = holographicFrame->CurrentPrediction;
    SpatialCoordinateSystem^ currentCoordinateSystem = m_referenceFrame->CoordinateSystem;
    for (auto cameraPose : prediction->CameraPoses)
    {
        HolographicCameraRenderingParameters^ renderingParameters = holographicFrame->GetRenderingParameters(cameraPose);
        renderingParameters->SetFocusPoint(currentCoordinateSystem, m_focus_point);
    }
    return holographicFrame;
}
bool HolographicSandboxMain::Render(Windows::Graphics::Holographic::HolographicFrame^ holographicFrame)
{
    holographicFrame->UpdateCurrentPrediction();
    auto prediction = holographicFrame->CurrentPrediction;
    bool atLeastOneCameraRendered = false;
    for (auto cameraPose : prediction->CameraPoses)
    {
        auto cameraProjectionTransform = cameraPose->ProjectionTransform;
        Platform::IBox<HolographicStereoTransform>^ viewTransformContainer = cameraPose->TryGetViewTransform(m_referenceFrame->CoordinateSystem);
        if (viewTransformContainer)
        {
            auto viewCoordinateSystemTransform = viewTransformContainer->Value;
            UpdateCameraParams(
                (float const *)&cameraProjectionTransform.Left,
                (float const *)&cameraProjectionTransform.Right,
                (float const *)&viewCoordinateSystemTransform.Left,
                (float const *)&viewCoordinateSystemTransform.Right,
                m_window_aspect,
                m_init_picture_handler.proj_left,
                m_init_picture_handler.proj_right,
                m_init_picture_handler.view_left,
                m_init_picture_handler.view_right,
                m_camera_kit);
        }
        HolographicCameraRenderingParameters^ camera_parameters = holographicFrame->GetRenderingParameters(cameraPose);
        ComPtr<ID3D11Resource> resource;
        ThrowIfFailed(GetDXGIInterfaceFromObject(camera_parameters->Direct3D11BackBuffer, IID_PPV_ARGS(&resource)));
        ComPtr<ID3D11Texture2D> camera_backbuffer;
        ThrowIfFailed(resource.As(&camera_backbuffer));
        m_init_picture_handler.target = camera_backbuffer.Get();
        m_window_key.UpdateWithNotifier().Wait();
        m_init_picture_handler.target = nullptr;
        m_view.GetSegmentKey().SetCamera(m_camera_kit);
        atLeastOneCameraRendered = true;
    }
    return atLeastOneCameraRendered;
}
HoloLens Remoting
In order to add remote HoloLens support to your application, the following items are required:
- A HoloLens headset
- Windows Desktop with the C++ API and the DirectX 11 driver (HoloLens is currently only supported in this configuration)
For a reference implementation of HoloLens remoting in a Visualize application, you can refer to the HPS Holographic Remoting Sandbox, bundled with your copy of Visualize.
1. Set Up PC and HoloLens
Connect the PC and HoloLens to the same network. Open the Holographic Remoting application on the HoloLens; it will display an IP address, which you must provide to your code. The Remoting sample app accepts it as a command line argument (-ip).
2. Initializing Application in Visualize
The fundamental steps of HoloLens remoting are similar to HoloLens native development: you must create an HPS offscreen window, connect to the device, and then start your update/render loop. See the main function in main.cpp for details. The difference is that instead of overloading functions in the IFrameworkView class, you initialize a HolographicStreamerHelpers object and assign handlers to its events - OnConnected, OnDisconnected, and so on.
#include <d3d11_1.h>
#include <dxgi1_3.h>
#include <assert.h>
#include <HolographicStreamerHelpers.h>
#include <windows.graphics.directx.direct3d11.interop.h>
#pragma comment(lib, "d3d11.lib")
#include "hps.h"
#include "sprk.h"
#include "hoops_license.h"
#include "../vr_shared/arvr.h"
using namespace Microsoft::Holographic;
using namespace Microsoft::WRL;
using namespace Microsoft::WRL::Wrappers;
using namespace Windows::Graphics::Holographic;
using namespace Windows::Graphics::DirectX::Direct3D11;
using namespace Windows::Perception::Spatial;
static const int VIDEO_FRAME_WIDTH = 1280;
static const int VIDEO_FRAME_HEIGHT = 720;
static const float window_aspect = (float)VIDEO_FRAME_WIDTH / VIDEO_FRAME_HEIGHT;
static Platform::String^ ip_address = L"127.0.0.1";
static Windows::Graphics::Holographic::HolographicSpace^ holographic_space;
static HolographicStreamerHelpers^ streamer_helpers;
static Windows::Perception::Spatial::SpatialStationaryFrameOfReference^ reference_frame;
static bool connected = false;
// As in the native sample, the handler's class declaration, stereo-matrix
// members, and Handle signature were missing from this excerpt; they are
// reconstructed here and should be checked against the sample source.
class InitPictureHandler : public HPS::DriverEventHandler
{
public:
    ComPtr<ID3D11Device> dx11_device;
    ID3D11Texture2D * target = nullptr;
    // Per-eye matrices, filled in each frame by UpdateCameraParams (type assumed here)
    HPS::MatrixKit proj_left, proj_right, view_left, view_right;

    void Handle(HPS::DriverEvent const * in_event) override
    {
        auto e = (HPS::InitPictureEvent *)in_event;
        static bool first_update = true;
        if (first_update)
        {
            // First update: grab the D3D11 device HPS created for this window
            auto ctx = (ID3D11DeviceContext*)e->GetGraphicsContext();
            ctx->GetDevice(&dx11_device);
            first_update = false;
        }
        else
        {
            // Subsequent updates: render into the supplied back buffer using
            // the stereo matrices for the current camera pose
            e->SetRenderSurface((HPS::OpaqueHandle)target);
            e->SetStereoMatrix(HPS::DriverEvent::StereoMatrix::ProjectionLeft, proj_left);
            e->SetStereoMatrix(HPS::DriverEvent::StereoMatrix::ProjectionRight, proj_right);
            e->SetStereoMatrix(HPS::DriverEvent::StereoMatrix::ViewLeft, view_left);
            e->SetStereoMatrix(HPS::DriverEvent::StereoMatrix::ViewRight, view_right);
        }
    }
};
static void Connect()
{
    if (connected)
    {
        streamer_helpers->Disconnect();
        connected = false;
    }
    else
    {
        try
        {
            streamer_helpers->Connect(ip_address->Data(), 8001);
        }
        catch (Platform::Exception^ ex)
        {
            DebugLog(L"Connect failed with hr = 0x%08X", ex->HResult);
        }
    }
}
int main(Platform::Array<Platform::String^>^ args)
{
    RoInitializeWrapper roinit(RO_INIT_MULTITHREADED);
    ThrowIfFailed(HRESULT(roinit));

    Platform::String^ filename;
    float scale = 1.0f;
    for (uint32_t i = 1; i < args->Length; ++i)
    {
        if (args[i] == "-ip" && i + 1 < args->Length)
            ip_address = args[++i];
    }

    HWND hwnd = CreateWin32Window(VIDEO_FRAME_WIDTH, VIDEO_FRAME_HEIGHT, L"HPS Holographic Remoting Sandbox", false, wndproc);
    HPS_Driver_Set_HoloLens_Bit(window_key, (HPS::WindowHandle)hwnd);

    InitPictureHandler init_picture_handler;
    // Attach the handler and perform a first update so it can capture the D3D11
    // device (these two calls mirror the native sample; they were missing from
    // this excerpt)
    window_key.SetDriverEventHandler(init_picture_handler, HPS::Object::ClassID<HPS::InitPictureEvent>());
    window_key.UpdateWithNotifier().Wait();
    assert(init_picture_handler.dx11_device.Get());

    view.GetSegmentKey().GetMaterialMappingControl().SetWindowColor(HPS::RGBAColor(0.0f, 0.0f, 0.0f, 0.0f));
    view.GetSegmentKey().GetCullingControl().SetFrustum(true).SetExtent(0);
    help_seg = view.GetSegmentKey().Subsegment("help_text");
    help_seg.InsertText(HPS::Point(0, 0, 0), "Press space bar to connect");
    view.AttachModel(model);

    Windows::Foundation::Numerics::float3 focus_point;
    streamer_helpers = ref new HolographicStreamerHelpers();
    streamer_helpers->CreateStreamer(init_picture_handler.dx11_device.Get());
    streamer_helpers->SetVideoFrameSize(VIDEO_FRAME_WIDTH, VIDEO_FRAME_HEIGHT);
    streamer_helpers->SetMaxBitrate(4 * 1024);
    streamer_helpers->SetEnableAudio(false);
    holographic_space = streamer_helpers->HolographicSpace;

    {
        ComPtr<IDXGIDevice3> dxgi_device;
        ThrowIfFailed(init_picture_handler.dx11_device.As<IDXGIDevice3>(&dxgi_device));
        auto interop_device = CreateDirect3DDevice(dxgi_device.Get());
        holographic_space->SetDirect3D11Device(interop_device);
    }

    reference_frame = SpatialLocator::GetDefault()->CreateStationaryFrameOfReferenceAtCurrentLocation();

    streamer_helpers->OnConnected += ref new ConnectedEvent(
        []()
        {
            DebugLog(L"Connected");
            connected = true;
        });

    Platform::WeakReference streamerHelpersWeakRef = Platform::WeakReference(streamer_helpers);
    streamer_helpers->OnDisconnected += ref new DisconnectedEvent(
        [streamerHelpersWeakRef](_In_ HolographicStreamerConnectionFailureReason failureReason)
        {
            DebugLog(L"Disconnected with reason %d", failureReason);
            connected = false;
            if (failureReason == HolographicStreamerConnectionFailureReason::Unreachable ||
                failureReason == HolographicStreamerConnectionFailureReason::ConnectionLost)
            {
                DebugLog(L"Reconnecting...");
                try
                {
                    auto helpersResolved = streamerHelpersWeakRef.Resolve<HolographicStreamerHelpers>();
                    if (helpersResolved)
                        helpersResolved->Connect(ip_address->Data(), 8001);
                    else
                        DebugLog(L"Failed to reconnect because a disconnect has already occurred.\n");
                }
                catch (Platform::Exception^ ex)
                {
                    DebugLog(L"Connect failed with hr = 0x%08X", ex->HResult);
                }
            }
            else
            {
                DebugLog(L"Disconnected with unrecoverable error, not attempting to reconnect.");
            }
        });

    streamer_helpers->OnSendFrame += ref new SendFrameEvent(
        [](_In_ const ComPtr<ID3D11Texture2D>& spTexture, _In_ FrameMetadata metadata)
        {
        });
    .....
}
3. Connect to HoloLens from PC
The HolographicStreamerHelpers class provides most of the interface for interacting with the HoloLens. In particular, it has a Connect function that you must call in order to start receiving poses and sending frames to the device. Pass it the IP address of the HoloLens headset and a port number (8001 by default). We also get the HolographicSpace from the HolographicStreamerHelpers object; this is how our render loop receives poses from the device.
The relevant parts of the listing in step 2 are the Connect function and the line in main that retrieves the HolographicSpace:
static void Connect()
{
    if (connected)
    {
        streamer_helpers->Disconnect();
        connected = false;
    }
    else
    {
        try
        {
            streamer_helpers->Connect(ip_address->Data(), 8001);
        }
        catch (Platform::Exception^ ex)
        {
            DebugLog(L"Connect failed with hr = 0x%08X", ex->HResult);
        }
    }
}

holographic_space = streamer_helpers->HolographicSpace;
4. Render Loop
We then start the main loop: we request a prediction from the device of where the cameras will be, set those cameras in HPS, update the HPS window, and present the result to the HoloLens. See the main loop below (the includes and globals are the same as in the step 2 listing); in particular, see the Update and Render functions.
static HolographicFrame^ Update(Windows::Foundation::Numerics::float3 const & focus_point)
{
    if (!holographic_space)
        return nullptr;
    HolographicFrame^ holographic_frame = holographic_space->CreateNextFrame();
    HolographicFramePrediction^ prediction = holographic_frame->CurrentPrediction;
    auto current_coordinate_system = reference_frame->CoordinateSystem;
    for (auto pose : prediction->CameraPoses)
    {
        HolographicCameraRenderingParameters^ camera_parameters = holographic_frame->GetRenderingParameters(pose);
        camera_parameters->SetFocusPoint(current_coordinate_system, focus_point);
    }
    return holographic_frame;
}
static bool Render(HolographicFrame^ holographic_frame, InitPictureHandler & init_picture_handler)
{
    holographic_frame->UpdateCurrentPrediction();
    HolographicFramePrediction^ prediction = holographic_frame->CurrentPrediction;
    bool atLeastOneCameraRendered = false;
    for (auto pose : prediction->CameraPoses)
    {
        auto cameraProjectionTransform = pose->ProjectionTransform;
        Platform::IBox<HolographicStereoTransform>^ viewTransformContainer = pose->TryGetViewTransform(reference_frame->CoordinateSystem);
        if (viewTransformContainer)
        {
            auto viewCoordinateSystemTransform = viewTransformContainer->Value;
            UpdateCameraParams(
                (float const *)&cameraProjectionTransform.Left,
                (float const *)&cameraProjectionTransform.Right,
                (float const *)&viewCoordinateSystemTransform.Left,
                (float const *)&viewCoordinateSystemTransform.Right,
                ::window_aspect,
                init_picture_handler.proj_left,
                init_picture_handler.proj_right,
                init_picture_handler.view_left,
                init_picture_handler.view_right,
                ::camera_kit);
        }
        HolographicCameraRenderingParameters^ camera_parameters = holographic_frame->GetRenderingParameters(pose);
        ComPtr<ID3D11Resource> resource;
        ThrowIfFailed(GetDXGIInterfaceFromObject(camera_parameters->Direct3D11BackBuffer, IID_PPV_ARGS(&resource)));
        ComPtr<ID3D11Texture2D> camera_backbuffer;
        ThrowIfFailed(resource.As(&camera_backbuffer));
        init_picture_handler.target = camera_backbuffer.Get();
        // Update the HPS window so the scene renders into the camera back
        // buffer, then apply the new camera (these calls mirror the native
        // sample; they were missing from this excerpt)
        ::window_key.UpdateWithNotifier().Wait();
        init_picture_handler.target = nullptr;
        ::view.GetSegmentKey().SetCamera(::camera_kit);
        atLeastOneCameraRendered = true;
    }
    return atLeastOneCameraRendered;
}
int main(Platform::Array<Platform::String^>^ args)
{
    .....
    for (;;)
    {
        if (!ProcessWin32Events())
            break;
        if (connected)
        {
            HolographicFrame^ holographic_frame = Update(focus_point);
            if (holographic_frame && Render(holographic_frame, init_picture_handler))
            {
                auto present_result = holographic_frame->PresentUsingCurrentPrediction(HolographicFramePresentWaitBehavior::DoNotWaitForFrameToFinish);
                if (present_result == HolographicFramePresentResult::DeviceRemoved)
                {
                    DebugLog(L"Device removed\n");
                }
            }
        }
        else
        {
        }
    }

    init_picture_handler.dx11_device = nullptr;
    holographic_space = nullptr;
    streamer_helpers->Disconnect();
    delete streamer_helpers;
    streamer_helpers = nullptr;
    delete world;
    return 0;
}
Considerations
- Performance is the main concern when using AR. For a smooth experience, 60 FPS should be treated as the minimum. Lower or jumpy frame rates can make the user feel sick. Because of this, the main render loop should be as bare-bones as possible; if you need to perform expensive calculations, it is best to perform them on a separate thread (see the first sketch after this list).
- When in AR mode, you should expect the camera to be in constant movement, since it is tied to the position of the AR headset. As a consequence, some of the approaches used in highlighting and the default operators will not work in AR mode. For example:
  - Visualize's default orbit operator works by modifying the camera position. Since the position of the camera depends on the AR headset, it is necessary to change the model's modelling matrix instead (see the second sketch after this list).
  - Because of the constant camera changes, Visualize will not cache the state of rendering buffers between updates. This means the Drawing::Overlay::WithZValues highlighting setting, which relies on those cached buffers, cannot be used effectively: such highlights will render sub-optimally in an AR application.
- Currently the Hide operation, and more generally Highlight operations with the InPlace Overlay setting, are very expensive performance-wise and should be avoided in AR mode. This is something we are currently addressing for Visualize 2018 SP2.
- Some view-dependent geometry, like non-transformable text and patterned lines, will not transform correctly in stereo mode. This creates artifacts where two instances of the geometry appear to be visible unless the user closes one eye.
- UI design on the HoloLens is an active area of interest. For Microsoft's best practices, see: https://docs.microsoft.com/en-us/windows/mixed-reality/design
- For developing UI with the HoloLens, you can directly query the HoloLens API. See examples of gesture detection, speech recognition, and more here: https://github.com/Microsoft/MixedRealityCompanionKit
- (Remoting) Low network latency is critical to a well-behaved remoting application. We recommend using HolographicStreamerHelpers::SetMaxBitrate() to set an appropriate bitrate, balancing image quality against latency. We recommend a value of around 4k (the sample calls SetMaxBitrate(4 * 1024)).
- (Remoting) Since the PC's graphics card is used for rendering, avoid running other graphics-intensive processes on the same PC as the remoting application.
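The first sketch, referenced above, shows one way to keep the render loop bare-bones by moving an expensive calculation onto a worker thread. It is not taken from the sandbox samples: StartExpensiveWork, PollExpensiveWork, and the use of std::async are illustrative assumptions; only the HPS segment call is intended to match the public API.
#include <chrono>
#include <future>

// Hypothetical helpers (not part of the samples): run the heavy work
// asynchronously and poll for the result once per frame.
static std::future<HPS::MatrixKit> pending_result;

static void StartExpensiveWork()
{
    pending_result = std::async(std::launch::async, []()
    {
        HPS::MatrixKit result;
        // ... long-running computation here ...
        return result;
    });
}

// Call once per frame from the render loop; applies the result when ready
// without ever blocking the loop.
static void PollExpensiveWork(HPS::SegmentKey segment)
{
    if (pending_result.valid() &&
        pending_result.wait_for(std::chrono::seconds(0)) == std::future_status::ready)
    {
        segment.SetModellingMatrix(pending_result.get());
    }
}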
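The second sketch illustrates the orbit note above: rather than moving the camera, which the headset controls, rotate the modelling matrix on the model's segment. The OrbitModel helper and the fixed rotation axes are assumptions for illustration; a real operator would derive the angles from user input.
// Orbit by rotating the model's segment instead of the camera.
// Angles are in degrees about the x, y, and z axes.
static void OrbitModel(HPS::Model & model, float x_degrees, float y_degrees)
{
    model.GetSegmentKey().GetModellingMatrixControl().Rotate(x_degrees, y_degrees, 0.0f);
}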