Augmented Reality

Please note, AR/VR is currently only supported on Windows Desktop when using the C++ API and the DirectX 11 driver.

Beta Release: The approaches detailed below are required for adding HoloLens support to your application with Visualize 3DF. In future releases, we will provide a generic API to make the integration process easier.


Introduction

Visualize currently supports AR in both HoloLens native and HoloLens remoting applications. A HoloLens native app is built for the HoloLens architecture and deployed to the device. A HoloLens remoting application uses a remotely connected computer to render the image and sends it over a Wi-Fi network to the HoloLens. While there are many similarities in how these applications are structured, we will address them separately.

Our AR package contains two sample applications that demonstrate how to use Visualize 3DF for AR:

  • HoloLens Native
  • HoloLens Remoting

These sandboxes make it possible to view HSF files in the AR headset.


HoloLens Native

This sample project, named hololens_simple, is located in demo/uwp and can be accessed via the samples_arvr_v141 solution in the root folder of the Visualize package. In contrast to the HoloLens remoting sample, this sandbox creates an application to be deployed directly onto the HoloLens hardware.

Native HoloLens applications are compatible with the DX11 graphics driver only. They also require the HOOPS UWP libraries and the ability to build UWP apps (Visual Studio should prompt you to install these requirements if they aren't already present).

For compatibility with the 32-bit HoloLens hardware, the sample is configured to generate 32-bit binaries.

Adding HoloLens Native to Your Application

In order to add native HoloLens support to your application, the following items are required:

  • A HoloLens headset
  • HoloLens support currently requires Windows Desktop, the C++ API, and the DirectX 11 driver.

When deploying your sample application to the HoloLens, your package must include the HSF file of the model you want to view. To include a file in your application, in Visual Studio, right-click the "Content" filter, select "Add" -> "Existing Item", and select your HSF file. After the file has been added to the "Content" filter, click the file name, and in the "Content" field of the Properties pane, select "Yes".

To load the file in your application, in hololens_simpleMain.cpp, change the "filename" variable to the name of your HSF file.
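Before calling Read, it can help to confirm that the packaged HSF file actually made it into the deployed package, so a missing file produces a clear error rather than a silent empty scene. A minimal sketch in standard C++ (model_file_exists is a hypothetical helper, not part of the sample):

```cpp
#include <fstream>
#include <string>

// Returns true if the packaged model file can be opened for reading.
// The path here is hypothetical; in the sample it would be built from
// package->InstalledLocation->Path + "\\your_model.hsf".
static bool model_file_exists(std::string const & filename)
{
    std::ifstream f(filename, std::ios::binary);
    return f.good();
}
```

If the check fails, you can report the bad path before handing it to HBaseModel::Read.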

Here are the basic steps for setting up your application:

1. Set Up PC and HoloLens

You must enable Developer mode on both the PC and HoloLens, and pair the devices, to allow deployment from Visual Studio on the PC to the HoloLens device. For details, see Microsoft's documentation here: https://docs.microsoft.com/en-us/windows/mixed-reality/using-visual-studio

2. Initialize Application in Visualize

The essential steps are to implement the Windows::ApplicationModel::Core::IFrameworkView and Windows::ApplicationModel::Core::IFrameworkViewSource interfaces, and call CoreApplication::Run(yourIFrameworkViewSource).

The IFrameworkViewSource class only needs to create a new IFrameworkView object. See the AppViewSource class in AppView.h and AppView.cpp for an example.

The IFrameworkView class should handle the various events that can come from the HoloLens: ViewActivated, Suspending, Resuming, WindowClosed, etc. See the AppView class in AppView.h and AppView.cpp for an example.

We recommend isolating the 3DF-related activity in its own class - in the sample, we use hololens_simpleMain.

3. Initialize HoloLens in Visualize

Application initialization is handled by the IFrameworkView::Initialize method. We use it to create an instance of hololens_simpleMain. The constructor of this class is where we set up 3DF. We initialize the model and view, set rendering options on our segments, and load a model. We set up 3DF to render to an image of the same dimensions as the HoloLens display - we will later present this image to the HoloLens.

In addition, we need to tell 3DF that we are rendering in stereo and targeting the HoloLens architecture. To do this, the following Debug bits need to be set via HC_Set_Driver_Options:

#define Debug_SINGLE_PASS_STEREO 0x08000000
#define Debug_TARGET_HOLOLENS 0x10000000

Here are the steps for initialization:

hololens_simpleMain::hololens_simpleMain()
{
    // 1. Define 3DF variables
    HC_Define_Callback_Name("holo_init_picture", (CallbackFunc)init_picture);
    ResizeImage(1268, 720);
    m_pHModel = new HBaseModel();
    m_pHModel->Init();
    m_pHView = new HBaseView(m_pHModel, "", "dx11", "", (void*)image_key, nullptr, nullptr, nullptr, "/driver/dx11/hololens1");
    m_pHView->Init();
    m_pHView->SetRenderMode(HRenderGouraud);

    // 2. Set up rendering options
    HC_Open_Segment_By_Key(m_pHView->GetViewKey());
    {
        HC_Set_Driver_Options(H_FORMAT_TEXT(
            "gpu resident, discard framebuffer, isolated, use window id = image key = 0x%p, debug = 0x%08x",
            (void*)image_key, Debug_TARGET_HOLOLENS | Debug_SINGLE_PASS_STEREO));
        HC_Set_Callback("init picture = holo_init_picture");
        HC_Set_Driver_Options("window opacity = 0.0");
        HC_Set_Window_Frame("off");
        HC_Set_Color("window = black");
    }
    HC_Close_Segment();

    HC_Open_Segment_By_Key(m_pHView->GetSceneKey());
        HC_Set_Heuristics("culling = (view frustum, no maximum extent)");
    HC_Close_Segment();

    m_pHView->SetHandedness(HandednessRight);
    m_pHView->SetDisplayListType(DisplayListSegment);
    m_pHView->SetDisplayListMode(true);
    m_pHView->SetTransparency("hsra = depth peeling, depth peeling options = layers = 1");
    m_pHView->Update();

    // 3. Load model
    Windows::ApplicationModel::Package^ package = Windows::ApplicationModel::Package::Current;
    auto filename = package->InstalledLocation->Path + "\\bnc.hsf";
    m_pHModel->Read(filename->Data(), m_pHView);
}

4. Start a Render Loop

In the IFrameworkView::Run method, we process events coming in from the HoloLens, update the scene, and render a new frame. We do this by requesting a new frame from the HoloLens, rendering into that frame, and then presenting the frame back to the device.

void AppView::Run()
{
    while (!m_windowClosed)
    {
        if (m_windowVisible && (m_holographicSpace != nullptr))
        {
            CoreWindow::GetForCurrentThread()->Dispatcher->ProcessEvents(CoreProcessEventsOption::ProcessAllIfPresent);
            HolographicFrame^ holographicFrame = m_main->Update();
            // See below for Update code
            if (m_main->Render(holographicFrame))
            {
                // See below for Render code
                HolographicFramePresentResult present_result = holographicFrame->PresentUsingCurrentPrediction(HolographicFramePresentWaitBehavior::DoNotWaitForFrameToFinish);
            }
        }
        else
        {
            CoreWindow::GetForCurrentThread()->Dispatcher->ProcessEvents(CoreProcessEventsOption::ProcessOneAndAllPending);
        }
    }
}

// Updates the application state once per frame.
HolographicFrame^ hololens_simpleMain::Update()
{
    // The HolographicFrame has information that the app needs in order
    // to update and render the current frame.
    HolographicFrame^ holographicFrame = m_holographicSpace->CreateNextFrame();

    // Get a prediction and coordinate system from the frame.
    HolographicFramePrediction^ prediction = holographicFrame->CurrentPrediction;
    SpatialCoordinateSystem^ currentCoordinateSystem = m_referenceFrame->CoordinateSystem;

    // We complete the frame update by using information about our content positioning
    // to set the focus point.
    for (auto cameraPose : prediction->CameraPoses)
    {
        HolographicCameraRenderingParameters^ renderingParameters = holographicFrame->GetRenderingParameters(cameraPose);
        renderingParameters->SetFocusPoint(currentCoordinateSystem, m_focus_point);
    }
    return holographicFrame;
}

// Renders the current frame to each holographic camera, according to the
// current application and spatial positioning state. Returns true if the
// frame was rendered to at least one camera.
bool hololens_simpleMain::Render(Windows::Graphics::Holographic::HolographicFrame^ holographicFrame)
{
    // Up-to-date frame predictions enhance the effectiveness of image stabilization and
    // allow more accurate positioning of holograms.
    holographicFrame->UpdateCurrentPrediction();
    HolographicFramePrediction^ prediction = holographicFrame->CurrentPrediction;
    bool atLeastOneCameraRendered = false;
    for (auto cameraPose : prediction->CameraPoses)
    {
        // The view and projection matrices for each holographic camera will change
        // every frame. This function refreshes the data in the constant buffer for
        // the holographic camera indicated by cameraPose.
        auto & cameraProjectionTransform = cameraPose->ProjectionTransform;
        memcpy(proj_left, (float const *)&cameraProjectionTransform.Left, 16 * sizeof(float));
        memcpy(proj_right, (float const *)&cameraProjectionTransform.Right, 16 * sizeof(float));
        Platform::IBox<HolographicStereoTransform>^ viewTransformContainer = cameraPose->TryGetViewTransform(m_referenceFrame->CoordinateSystem);
        if (viewTransformContainer)
        {
            auto & viewCoordinateSystemTransform = viewTransformContainer->Value;
            memcpy(view_left, (float const *)&viewCoordinateSystemTransform.Left, 16 * sizeof(float));
            memcpy(view_right, (float const *)&viewCoordinateSystemTransform.Right, 16 * sizeof(float));
        }

        HCamera camera;
        float near_limit = 0.0f;
        set_camera_parameters(
            view_left,
            view_right,
            proj_left,
            proj_right,
            (float)m_image_width / m_image_height,
            &camera,
            &near_limit);
        convert_projection_matrix(proj_left);
        convert_projection_matrix(proj_right);
        convert_view_matrix(view_left);
        convert_view_matrix(view_right);

        HolographicCameraRenderingParameters^ camera_parameters = holographicFrame->GetRenderingParameters(cameraPose);
        ComPtr<ID3D11Resource> resource;
        ThrowIfFailed(GetDXGIInterfaceFromObject(camera_parameters->Direct3D11BackBuffer, IID_PPV_ARGS(&resource)));
        ComPtr<ID3D11Texture2D> camera_backbuffer;
        ThrowIfFailed(resource.As(&camera_backbuffer));
        ::target = camera_backbuffer.Get();
        //HC_Control_Update_By_Key(m_pHView->GetViewKey(), "refresh");
        m_pHView->Update();

        // Use last frame's camera for culling.
        HC_Open_Segment_By_Key(m_pHView->GetSceneKey());
        HC_Set_Camera(&camera.position, &camera.target, &camera.up_vector, camera.field_width, camera.field_height, camera.projection);
        HC_Set_Camera_Near_Limit(near_limit);
        HC_Close_Segment();
        ::target = nullptr;
        atLeastOneCameraRendered = true;
    }
    return atLeastOneCameraRendered;
}
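The convert_projection_matrix and convert_view_matrix helpers above are defined in the sample; they massage the raw matrices from the HoloLens into the layout the HOOPS driver expects. As a purely illustrative example of this kind of conversion (the actual transform is defined by the sample code and may differ), here is a 4x4 transpose that switches a flat float[16] between row-major and column-major order:

```cpp
#include <algorithm>

// Transpose a 4x4 matrix stored as a flat float[16], switching between
// row-major and column-major layouts in place. This is an illustrative
// stand-in; the real convert_*_matrix helpers live in the sample code.
static void transpose4x4(float m[16])
{
    for (int r = 0; r < 4; ++r)
        for (int c = r + 1; c < 4; ++c)
            std::swap(m[r * 4 + c], m[c * 4 + r]);
}
```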

HoloLens Remoting

This sample project is named holographic_remoting_simple, is located in demo/win32, and can be accessed via the samples_arvr_v141 solution in the root folder of the Visualize package. The sample demonstrates how 3DF can be used to develop a remote HoloLens application on Windows 10 machines using the DX11 driver. The application streams image data to the HoloLens headset and processes data transmitted back from the headset.

To build and deploy the remoting application, your machine must have the Windows 10 Anniversary Update installed.

You'll need a VR-capable GPU for HoloLens Remoting. For current NVIDIA GPUs this would be a GTX 1060 or better. For AMD, this would be an RX 480 or better.

To enable streaming from the application on your PC to your HoloLens, you'll need to install the Holographic Remoting Player on your HoloLens. To get the Windows Holographic Remoting Player, visit the Windows app store from your HoloLens, search for Remoting, and download the app.

Adding HoloLens Remoting to Your Application

In order to add remote HoloLens support to your application, the following items are required:

  • A HoloLens headset
  • HoloLens support currently requires Windows Desktop, the C++ API, and the DirectX 11 driver.

Here are the steps for setting up your application:

1. Set Up PC and HoloLens

Connect the PC and HoloLens to the same network. Open the Holographic Remoting application on the HoloLens. This should display an IP address, which you will need to provide to your code; the holographic_remoting_simple app takes it as a command line argument.
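Because the IP address arrives as a plain command-line string, a quick sanity check before handing it to Connect can turn a confusing connection failure into a clear error message. A minimal sketch in standard C++ (looks_like_ipv4 is a hypothetical helper; the sample passes the argument straight through):

```cpp
#include <cstdlib>

// Very small dotted-quad IPv4 check for a command-line argument.
// Hypothetical helper; the sample passes the string directly to Connect().
static bool looks_like_ipv4(char const * s)
{
    int octets = 0;
    char const * p = s;
    while (*p)
    {
        char * end = nullptr;
        long v = std::strtol(p, &end, 10);
        if (end == p || v < 0 || v > 255)
            return false;          // not a number, or out of octet range
        ++octets;
        if (*end == '\0')
            break;                 // end of string after an octet
        if (*end != '.')
            return false;          // octets must be separated by dots
        p = end + 1;
        if (*p == '\0')
            return false;          // reject a trailing dot
    }
    return octets == 4;
}
```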

2. Initialize the Application in Visualize

The fundamental steps for HoloLens Remoting are similar to HoloLens native development. You must create a 3DF offscreen window, connect to the device, and then start your update/render loop. See the main function in main.cpp for details. The difference is that instead of overriding functions in the IFrameworkView class, you initialize a HolographicStreamerHelpers object and assign event handlers to its members: OnConnected, OnDisconnected, etc.

In addition, we need to tell 3DF that we are rendering in stereo and targeting the HoloLens architecture. To do this, the following Debug bits will need to be set in HC_Set_Driver_Options:

#define Debug_SINGLE_PASS_STEREO 0x08000000
#define Debug_TARGET_HOLOLENS 0x10000000

typedef void (HC_CDECL * CallbackFunc)(...);

static void Connect()
{
    if (connected)
    {
        streamer_helpers->Disconnect();
        connected = false;
    }
    else
    {
        try
        {
            streamer_helpers->Connect(ip_address->Data(), 8001);
        }
        catch (Platform::Exception^ ex)
        {
            DebugLog(L"Connect failed with hr = 0x%08X", ex->HResult);
        }
    }
}

int main(Platform::Array<Platform::String^>^ args)
{
    // HOOPS offscreen image drivers must be bound to an 'image' geometry object,
    // although for our purposes here this image will not be used for anything.
    HC_Open_Segment("/nowhere");
    HC_KEY image_key = HC_Insert_Image(0, 0, 0, "rgba", VIDEO_FRAME_WIDTH, VIDEO_FRAME_HEIGHT, nullptr);
    HC_Close_Segment();

    HWND hwnd = CreateWin32Window(VIDEO_FRAME_WIDTH, VIDEO_FRAME_HEIGHT, L"HOOPS Holographic Remoting Simple", false, wndproc);

    // Our "init_picture" callback will be called by the driver (renderer) thread at the beginning of each update,
    // where we will set the camera backbuffer and stereo matrices given to us by the holographic remoting framework.
    HC_Define_Callback_Name("holo_init_picture", (CallbackFunc)init_picture);

    pHModel = new HBaseModel();
    pHModel->Init();
    pHView = new HBaseView(pHModel, "", "dx11", "", (void*)image_key, nullptr, nullptr, nullptr, "/driver/dx11/hololens1");
    pHView->Init();

    HC_Open_Segment_By_Key(pHView->GetViewKey());
    {
        int const debug_bits = Debug_TARGET_HOLOLENS | Debug_SINGLE_PASS_STEREO;
        // Pass the preview window HWND as the 'use window id2' parameter; this is optional and can be null if no preview is desired.
        HC_Set_Driver_Options(H_FORMAT_TEXT(
            "gpu resident, discard framebuffer, isolated, use window id = image key = 0x%p, use window id2 = 0x%p, debug = 0x%08x",
            (void*)image_key, (void*)hwnd, debug_bits));
        HC_Set_Driver_Options("anti-alias = 4");
        HC_Set_Callback("init picture = holo_init_picture");
        HC_Set_Driver_Options("window opacity = 0.0");
        HC_Set_Window_Frame("off");
        HC_Set_Color("window = black");
    }
    HC_Close_Segment();

    HC_Open_Segment_By_Key(pHView->GetSceneKey());
    {
        HC_Set_Heuristics("culling = (view frustum, no maximum extent)");
        help_seg = HC_Open_Segment("help_text");
        {
            HC_Insert_Text(0, 0, 0, "Press space bar to connect");
            HC_Set_Text_Font("size = 0.05 sru");
            HC_Set_Color("text = white");
            HC_Set_Visibility("text = on");
        }
        HC_Close_Segment();
    }
    HC_Close_Segment();

    pHView->SetHandedness(HandednessRight);
    pHView->SetDisplayListType(DisplayListSegment);
    pHView->SetDisplayListMode(true);
    pHView->Update();
    pHModel->SetStaticModel(true);

    streamer_helpers = ref new HolographicStreamerHelpers();
    streamer_helpers->CreateStreamer(dx11_device.Get());
    streamer_helpers->SetVideoFrameSize(VIDEO_FRAME_WIDTH, VIDEO_FRAME_HEIGHT);
    streamer_helpers->SetEnableAudio(false);

    // Sets the stream rate for the connection. A higher stream rate, even with high-end routers,
    // requires hardware encoding from the source PC. This is often a slow process.
    // Microsoft recommends a 4k bitrate as a balance of speed and image quality.
    streamer_helpers->SetMaxBitrate(4 * 1024);
    holographic_space = streamer_helpers->HolographicSpace;

    {
        // Acquire the DXGI interface for the Direct3D device.
        ComPtr<IDXGIDevice3> dxgi_device;
        ThrowIfFailed(dx11_device.As<IDXGIDevice3>(&dxgi_device));
        // Wrap the native device using a WinRT interop object.
        auto interop_device = CreateDirect3DDevice(dxgi_device.Get());
        holographic_space->SetDirect3D11Device(interop_device);
    }

    reference_frame = SpatialLocator::GetDefault()->CreateStationaryFrameOfReferenceAtCurrentLocation();

    streamer_helpers->OnConnected += ref new ConnectedEvent(
        []()
        {
            DebugLog(L"Connected");
            HC_Open_Segment_By_Key(help_seg);
            HC_Set_Visibility("off");
            HC_Close_Segment();
            connected = true;
        });

    Platform::WeakReference streamerHelpersWeakRef = Platform::WeakReference(streamer_helpers);
    streamer_helpers->OnDisconnected += ref new DisconnectedEvent(
        [streamerHelpersWeakRef](_In_ HolographicStreamerConnectionFailureReason failureReason)
        {
            DebugLog(L"Disconnected with reason %d", failureReason);
            connected = false;

            // Reconnect if this is a transient failure.
            if (failureReason == HolographicStreamerConnectionFailureReason::Unreachable ||
                failureReason == HolographicStreamerConnectionFailureReason::ConnectionLost)
            {
                DebugLog(L"Reconnecting...");
                try
                {
                    auto helpersResolved = streamerHelpersWeakRef.Resolve<HolographicStreamerHelpers>();
                    if (helpersResolved)
                        helpersResolved->Connect(ip_address->Data(), 8001);
                    else
                        DebugLog(L"Failed to reconnect because a disconnect has already occurred.\n");
                }
                catch (Platform::Exception^ ex)
                {
                    DebugLog(L"Connect failed with hr = 0x%08X", ex->HResult);
                }
            }
            else
            {
                DebugLog(L"Disconnected with unrecoverable error, not attempting to reconnect.");
            }
        });

    streamer_helpers->OnSendFrame += ref new SendFrameEvent(
        [](_In_ const ComPtr<ID3D11Texture2D>& spTexture, _In_ FrameMetadata metadata)
        {
        });

    // Main Update/Render loop
    // See hololensStream2.cpp for an example
    .....
}

3. Connect to HoloLens from PC

The HolographicStreamerHelpers class provides most of the interface for interacting with the HoloLens. In particular, it has a Connect() function that you must call in order to start receiving poses and sending frames to the device. You pass it the IP address of the HoloLens headset and a port number (8001 by default). We can also get the HolographicSpace from the HolographicStreamerHelpers object, which is how our render loop receives poses from the device.

(See the code snippet in the previous section for an example of the Connect() function.)

4. Start the Render Loop

We then start the main loop. We request a prediction of where the cameras will be from the device, then set those cameras in 3DF. We then update the 3DF window and present the result to the HoloLens. See the main loop below for an example, in particular the Update and Render functions.

static HolographicFrame^ Update(Windows::Foundation::Numerics::float3 const & focus_point)
{
    if (!holographic_space)
        return nullptr;

    // The HolographicFrame has information that the app needs in order
    // to update and render the current frame. The app begins each new
    // frame by calling CreateNextFrame.
    HolographicFrame^ holographic_frame = holographic_space->CreateNextFrame();

    // Get a prediction of where holographic cameras will be when this frame
    // is presented.
    HolographicFramePrediction^ prediction = holographic_frame->CurrentPrediction;
    auto current_coordinate_system = reference_frame->CoordinateSystem;
    for (auto pose : prediction->CameraPoses)
    {
        HolographicCameraRenderingParameters^ camera_parameters = holographic_frame->GetRenderingParameters(pose);
        camera_parameters->SetFocusPoint(current_coordinate_system, focus_point);
    }
    return holographic_frame;
}

static bool Render(HolographicFrame^ holographic_frame)
{
    // Time has passed since we got the last prediction, so update it.
    // Up-to-date frame predictions enhance the effectiveness of image stabilization and
    // allow more accurate positioning of holograms.
    holographic_frame->UpdateCurrentPrediction();

    // Get a prediction of where holographic cameras will be when this frame
    // is presented.
    HolographicFramePrediction^ prediction = holographic_frame->CurrentPrediction;
    bool atLeastOneCameraRendered = false;

    // Note: We expect only a single pose in the camera. Here we set the
    // matrices by extracting information from the camera pose and converting it from
    // the form the HoloLens gives us to what 3DF expects.
    for (auto pose : prediction->CameraPoses)
    {
        auto & cameraProjectionTransform = pose->ProjectionTransform;
        memcpy(proj_left, (float const *)&cameraProjectionTransform.Left, 16 * sizeof(float));
        memcpy(proj_right, (float const *)&cameraProjectionTransform.Right, 16 * sizeof(float));
        Platform::IBox<HolographicStereoTransform>^ viewTransformContainer = pose->TryGetViewTransform(reference_frame->CoordinateSystem);
        if (viewTransformContainer)
        {
            auto & viewCoordinateSystemTransform = viewTransformContainer->Value;
            memcpy(view_left, (float const *)&viewCoordinateSystemTransform.Left, 16 * sizeof(float));
            memcpy(view_right, (float const *)&viewCoordinateSystemTransform.Right, 16 * sizeof(float));
        }

        HCamera camera;
        float near_limit = 0.0f;
        // Extract camera parameters from the stereo matrices so HOOPS will have an accurate camera for what the HoloLens is seeing.
        set_camera_parameters(view_left, view_right, proj_left, proj_right, window_aspect, &camera, &near_limit);
        // Massage the raw matrices into the form the HOOPS driver expects internally.
        convert_projection_matrix(proj_left);
        convert_projection_matrix(proj_right);
        convert_view_matrix(view_left);
        convert_view_matrix(view_right);

        // The HolographicCameraRenderingParameters class provides access to set
        // the image stabilization parameters.
        HolographicCameraRenderingParameters^ camera_parameters = holographic_frame->GetRenderingParameters(pose);
        ComPtr<ID3D11Resource> resource;
        ThrowIfFailed(GetDXGIInterfaceFromObject(camera_parameters->Direct3D11BackBuffer, IID_PPV_ARGS(&resource)));
        ComPtr<ID3D11Texture2D> camera_backbuffer;
        ThrowIfFailed(resource.As(&camera_backbuffer));
        ::target = camera_backbuffer.Get();
        pHView->Update();
        ::target = nullptr;

        // HoloLens docs say to use the previous frame's camera for culling.
        HC_Open_Segment_By_Key(pHView->GetSceneKey());
        HC_Set_Camera(&camera.position, &camera.target, &camera.up_vector, camera.field_width, camera.field_height, camera.projection);
        HC_Set_Camera_Near_Limit(near_limit);
        HC_Close_Segment();
        atLeastOneCameraRendered = true;
    }
    return atLeastOneCameraRendered;
}

int main(Platform::Array<Platform::String^>^ args)
{
    // Initialization of HoloLens. See hololens_remote_1.cpp
    .....

    // Main Update/Render loop
    for (;;)
    {
        if (!ProcessWin32Events())
            break;
        if (connected)
        {
            // Update will set the focus point in the HoloLens and set up a frame to render into.
            HolographicFrame^ holographic_frame = Update(focus_point);
            // Render will take HoloLens position data and create matrices in 3DF to draw geometry from
            // this perspective.
            if (holographic_frame && Render(holographic_frame))
            {
                // Finally, Present will draw this buffer to the HoloLens.
                auto present_result = holographic_frame->PresentUsingCurrentPrediction(HolographicFramePresentWaitBehavior::DoNotWaitForFrameToFinish);
                if (present_result == HolographicFramePresentResult::DeviceRemoved)
                    DebugLog(L"Device removed\n");
            }
        }
        else
        {
            HC_Control_Update_By_Key(pHView->GetViewKey(), "refresh");
            pHView->Update();
        }
    }

    dx11_device = nullptr;
    holographic_space = nullptr;
    streamer_helpers->Disconnect();
    delete streamer_helpers;
    streamer_helpers = nullptr;
    delete pHView;
    delete pHModel;
    delete hdb;
    return 0;
}

Other Considerations

  • Performance is the main concern when using AR. For a smooth experience, 60 FPS should be considered the minimum. Lower or jumpy frame rates can make the user feel sick. Because of this, the main render loop should be as bare-bones as possible; if you need to perform expensive calculations, perform them on a separate thread.
  • When in AR mode you should expect that the camera is in constant movement, since it is tied to the position of the AR headset. As a consequence of this, some of the approaches used in highlighting and the default operators will not work in AR mode.

    For example:

    • The default orbit operator works by modifying the camera position. Since the position of the camera depends on the AR headset, it is necessary to modify the model's modelling matrix instead.
    • Because of the constant camera changes, Visualize will not cache the status of rendering buffers between updates. In turn this means that the heuristic setting of quick moves=spriting, which relies on these cached buffers, cannot be used effectively. Highlights with a heuristic setting of quick moves=spriting in an AR application will be performed sub-optimally.

  • Some view-dependent geometry, such as non-transformable text and patterned lines, will not transform correctly in stereo mode. This creates artifacts where two instances of the geometry appear visible unless the user closes one eye.
  • UI design in the HoloLens is an active area of interest. For Microsoft's best practices, see here: https://docs.microsoft.com/en-us/windows/mixed-reality/design
  • For developing UI with the HoloLens, you can directly query the HoloLens API. See examples of gesture detection, speech recognition, and more here: https://github.com/Microsoft/MixedRealityCompanionKit
  • (Remoting) Low network latency is essential to a well-behaved remoting application. It is recommended to use the HolographicStreamerHelpers->SetMaxBitrate() function to set an appropriate bitrate, balancing image quality against latency. We recommend a value of around 4k.
  • (Remoting) Since the PC's graphics card is used for rendering, it is recommended that you do not simultaneously run graphics intensive processes on the same PC as the remoting application.
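The first bullet above recommends moving expensive calculations off the render loop. A minimal sketch of that pattern in standard C++, independent of the HoloLens API (the summation workload is a placeholder for whatever your application computes):

```cpp
#include <chrono>
#include <future>
#include <numeric>
#include <vector>

// Kick off an expensive computation on another thread so the render loop
// can keep presenting frames without stalling.
static std::future<long long> start_expensive_work(std::vector<int> data)
{
    return std::async(std::launch::async, [data = std::move(data)]() {
        return std::accumulate(data.begin(), data.end(), 0LL);
    });
}

// Poll the future once per frame; consume the result only when it is ready.
static bool try_consume(std::future<long long> & f, long long & out)
{
    if (!f.valid())
        return false; // result already consumed
    if (f.wait_for(std::chrono::seconds(0)) != std::future_status::ready)
        return false; // still computing; the render loop keeps going
    out = f.get();
    return true;
}
```

The render loop calls try_consume each frame and carries on presenting until the result arrives, rather than blocking on the computation.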
