Note that the package hierarchy in the previous section includes <gui>_sandbox. Each platform includes a "sandbox", which is a sample application intended to be used as a starting point for prototyping, testing, and learning the API. Each sandbox contains a viewport and a set of manipulation operators, such as rotation, pan, and zoom. The sandboxes are precompiled and ready to use from your HOOPS Visualize package.
| Project name | Supported OS | Description |
|---|---|---|
| mfc_sandbox | Windows | MFC GUI framework for C++ |
| wpf_sandbox | Windows | WPF GUI framework for C# |
| cocoa_sandbox | macOS | Cocoa GUI framework for C++ |
| qt_sandbox | Windows, macOS, Linux | Qt GUI framework for C++ and Qt Creator |
| ios_sandbox | iOS | Cocoa Touch framework for Objective-C/C++ |
| ios_arkit_sandbox | iOS | ARKit Augmented Reality Cocoa Touch framework for Objective-C/C++ |
| android_sandbox | Android | Android SDK framework for Java/C++ and Android Studio |
| arcore_sandbox | Android | ARCore Augmented Reality Android SDK framework for Java/C++ and Android Studio |
| holographic_sandbox | Windows | HoloLens SDK framework for native Augmented Reality HoloLens applications |
| holographic_remoting_sandbox | Windows | HoloLens SDK framework for Augmented Reality HoloLens Remoting applications |
| openvr_sandbox | Windows | OpenVR SDK framework for Virtual Reality applications |
| qt_quick_sandbox | Windows, Linux, mobile | Cross-platform sandbox for desktop and mobile platforms based on Qt Quick; requires Qt Creator |
After starting the sandbox of your choice, open a file using the main menu. Supported file formats are listed here. Once the file is loaded, you'll see the model as well as a model browser and segment browser. Also note the various operators, such as orbit, pan, and zoom, which you can use to manipulate your view of the model.
The segment browser is a Microsoft tree control that is included in both the WPF and MFC sandboxes. It allows you to inspect the segment structure of the loaded model. The SceneTree and SceneTreeItem classes facilitate populating the tree view; their derived classes are located in CHPSSegmentBrowserPane.cpp (for MFC) or SegmentBrowser.cs (for WPF).
The model browser is a Microsoft tree control that is included in both the WPF and MFC sandboxes. It is provided in source code form as a reference, to facilitate adding a custom model browser to your application.
The model browser is a GUI control that allows you to inspect the Component hierarchy of models loaded with either the Exchange or Parasolid interface. Right-clicking on an item in the tree-view brings up a context menu which allows you to show, hide, or isolate an item in the viewport. Selecting an item in the viewport will highlight its corresponding item in the tree-view, and vice versa.
ComponentTree and ComponentTreeItem facilitate populating the tree control, and they have derived classes located in CHPSModelBrowserPane.cpp (for MFC), or ModelBrowser.cs (for WPF). Note that the model browser is only applicable when a Component hierarchy is present, such as when a model is loaded using one of the Component integrations.
Our AR/VR package contains five sandboxes, which make it possible to view HSF files in a virtual reality headset or on an augmented reality device.
This sandbox project is named holographic_remoting_sandbox in the Visualize sample solution. It can be used to develop 64-bit and 32-bit apps for Windows 10 machines and devices using the DX11 driver. The application streams image data to the HoloLens headset and processes data transmitted back from the headset.
To build and deploy the remoting application, your machine must have the Windows 10 Anniversary Update installed.
You'll need a VR-capable GPU for HoloLens Remoting. For current NVIDIA GPUs, this would be a GTX 1060 or better. For AMD, this would be an RX 480 or better.
To enable streaming from the application on your PC to your HoloLens, you'll need to install the Holographic Remoting Player on your HoloLens. To get the Windows Holographic Remoting Player, visit the Windows app store from your HoloLens, search for Remoting, and download the app.
This sandbox project is named holographic_sandbox in the Visualize sample solution. In contrast to the HoloLens remoting sandbox, this sandbox creates an application to be deployed directly onto the HoloLens hardware.
The HoloLens Native sandbox app is compatible only with the DX11 graphics driver. It also requires the HOOPS UWP libraries and the ability to build UWP apps (Visual Studio should prompt you to install these requirements if they aren't already present).
For compatibility with the 32-bit HoloLens hardware, this sandbox is configured to generate 32-bit binaries.
When deploying your application to the HoloLens, your package must include the HSF file of the model you want to view. To include a file in your application, in Visual Studio right-click the "Content" filter and select "Add" -> "Existing Item" and select your HSF file. After the file has been added to the "Content" filter, click the file name and in the "Content" field of the Properties pane, select "True".
To load the file in your application, in HolographicSandboxMain.cpp, change the "filename" variable to the name of your HSF file.
This sandbox project is named openvr_sandbox in the Visualize sample solution and supports any VR headset compatible with OpenVR.
For OpenVR, you'll need a VR-capable GPU. For current NVIDIA GPUs, this would be a GTX 1060 or better. For AMD, this would be an RX 480 or better.
To get started with OpenVR, follow these steps:
On your local system, clone OpenVR from the GitHub repository:
git clone https://github.com/ValveSoftware/openvr
- Set the OPENVR_SDK environment variable to the location of your OpenVR root folder.
- Ensure that the OpenVR binaries are in your PATH.
- Install the Steam application from: https://store.steampowered.com
- Once Steam is installed, run it. In the top menu, select Library -> VR, then select "Install SteamVR" to install it on your system.
- Run SteamVR. From within the SteamVR application, run the installer for your particular hardware (e.g., HTC Vive, Oculus Rift, etc.).
For Oculus Rift only:
- Allow developer apps to run on the Oculus by opening the Oculus app and choosing Settings->General->Unknown Sources and toggling it to ON.
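The environment setup steps above can be sketched in shell form as follows (bash syntax; the clone location `$HOME/openvr` is just an example, and the `win64` binary directory is the one relevant to the Windows-only openvr_sandbox):

```shell
# 1. Clone OpenVR (run once; commented out here so the rest is repeatable):
#    git clone https://github.com/ValveSoftware/openvr "$HOME/openvr"

# 2. Point OPENVR_SDK at the root of the clone:
export OPENVR_SDK="$HOME/openvr"

# 3. Make sure the OpenVR binaries are on PATH (win64 shown):
export PATH="$OPENVR_SDK/bin/win64:$PATH"

echo "$OPENVR_SDK"
```

On Windows you would typically set these through the System environment-variable dialog instead so they persist across sessions.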
This sandbox project is named arcore_sandbox and can be found in the Visualize samples directory.
To see if your device supports ARCore, please refer to this page.
From a build perspective, this app behaves exactly like the android_sandbox. No ARCore dependencies need to be installed manually, since they are pulled in automatically as a Gradle dependency.
The app depends on having ARCore installed as well as the proper camera permissions. The initial screen of the app is a file list view. Selecting a file will launch the rendering activity.
Once the rendering activity is running, it will load the file you selected and display the camera contents as the file loads. When the file load is done, you should start to see points streaming in which identify points-of-interest for ARCore. If pointing the device at a planar surface, planes will be inserted which identify those surfaces.
You may interact with the app in the following ways:
- Pressing the button in the top-right corner toggles the visibility of the ARCore debug display geometry (point cloud, detected planes)
- Tapping the screen once will attempt to anchor an instance of the loaded model to a plane existing at the selected point. If ARCore has not detected a plane which covers this point, no model will be inserted. If a plane is found, the newly inserted model instance will become the "active" anchor. An active anchor should have a blue ring under it.
- If you have an active anchor, tapping the screen twice will attempt to translate the anchor (and the attached model) to a new location.
- If you have an active anchor, pinching the screen will scale your active anchor (and the attached model).
- If you have an active anchor, touching one finger down and moving it horizontally will rotate your active anchor about its local Y-axis.
This sandbox project is named ios_arkit_sandbox and is located in the Visualize samples directory.
ARKit requires iOS 11.0 or later, and a device with an A9 processor or later. This sandbox is set up to build against iOS 12, which allows you to use ARKit 2 APIs.
To get up and running with the ARKit sandbox, all you need to do is launch the corresponding Xcode project, attach an ARKit-capable iOS device running iOS 12 or later, and build and deploy the app to that device.
It is not possible to debug AR apps using the simulator, so you need to have an ARKit-capable device on hand to perform any testing.
An app using ARKit will require permission to access the camera, though the sandbox should prompt for this automatically if necessary.
If ARKit will be required for your app, it should be added as a required device capability to the Info.plist file (see the Info.plist file in the examples/ios_arkit_sandbox directory for an example).
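For reference, the relevant Info.plist entry uses Apple's standard `UIRequiredDeviceCapabilities` key with the `arkit` capability string:

```xml
<key>UIRequiredDeviceCapabilities</key>
<array>
    <string>arkit</string>
</array>
```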
The ARKit-specific files in this sandbox are:
This is just a minimal wrapper around a UIView which Visualize can use to render into via OpenGL ES.
This is a view controller associated with the ARView which handles the standard UIView life cycle, sets up gesture recognizers, and acts as the delegate for ARKit callbacks.
This is where a majority of the ARKit-Visualize interaction occurs. It handles the usual bind()/release() logic and file importing as in the standard iOS Sandbox, but augments this with the ability to start and stop an AR render loop for Visualize.
This contains code used to pass the camera images to Visualize to draw as the window background.
These are utility functions used to convert view and projection matrices into a Visualize camera.
Generally these files (or at least the logic they contain) will be necessary to add ARKit functionality to your own app. Most of the customization that you will likely do will be in the ARSurface file and the ARViewController file, as these are the points where ARKit and Visualize interact.
You pick a file from the file list, and the app transitions to AR mode, allowing you to place the selected file on planes that ARKit detects. The plane and point geometry that ARKit uses to detect planes is visible by default; you can toggle its visibility with the Plane button in the upper-right corner.
You can single tap on a plane to place a model anchor. It will have a red circle of geometry underneath it to signify it as an "active" anchor. Active anchors can be rotated by panning left and right on the screen, scaled by pinching, and placed in a new location by double tapping.
If you place a new anchor by single tapping, the old anchor will become inactive, and the newly placed anchor will become active.
The Code1-Code4 buttons are conveniences to allow you to add your own code to run when they are tapped. The sandbox also caps the number of anchors it allows, though this can be changed in the source.