While we were working on Fragments and Young Conker, two of the first games for HoloLens, we found that when we were doing procedural placement of holograms in the physical world, we needed a higher level of understanding about the user's environment. Each game had its own specific placement needs: in Fragments, for example, we wanted to be able to distinguish between different surfaces, such as the floor or a table, to place clues in relevant locations. We also wanted to be able to identify surfaces that life-size holographic characters could sit on, such as a couch or a chair. In Young Conker, we wanted Conker and his opponents to be able to use raised surfaces in a player's room as platforms.

Asobo Studios, our development partner for these games, faced this problem head-on and created a technology that extends the spatial mapping capabilities of HoloLens. Using this, we could analyze a player's room and identify surfaces such as walls, tables, chairs, and floors. It also gave us the ability to optimize against a set of constraints to determine the best placement for holographic objects. We took Asobo's original code and created a library that encapsulates this technology. Microsoft and Asobo have now open-sourced this code and made it available in MixedRealityToolkit for you to use in your own projects. All the source code is included, allowing you to customize it to your needs and share your improvements with the community. The code for the C++ solver has been wrapped into a UWP DLL and exposed to Unity with a drop-in prefab contained within MixedRealityToolkit.

There are many useful queries included in the Unity sample that will allow you to find empty spaces on walls, place objects on the ceiling or on large spaces on the floor, identify places for characters to sit, and run a myriad of other spatial understanding queries. While the spatial mapping solution provided by HoloLens is designed to be generic enough to meet the needs of the entire gamut of problem spaces, the spatial understanding module was built to support the needs of two specific games. As such, its solution is structured around a specific process and set of assumptions:

- Fixed-size playspace: the user specifies the maximum playspace size in the init call.
- One-time scan process: the process requires a discrete scanning phase in which the user walks around, defining the playspace. Query functions will not function until after the scan has been finalized.
- User-driven playspace "painting": during the scanning phase, the user moves and looks around the playspace, effectively painting the areas that should be included. The generated mesh is important for providing user feedback during this phase.
- Indoor home or office setup: the query functions are designed around flat surfaces and walls at right angles. However, during the scanning phase, a principal-axis analysis is completed to optimize the mesh tessellation along the major and minor axes.

When you load the spatial understanding module, the first thing you'll do is scan your space, so that all the usable surfaces, such as the floor, ceiling, and walls, are identified and labeled. During the scanning process, you look around your room and "paint" the areas that should be included in the scan. The mesh seen during this phase is an important piece of visual feedback that lets users know what parts of the room are being scanned. The DLL for the spatial understanding module internally stores the playspace as a grid of 8 cm voxel cubes. During the initial part of scanning, a principal component analysis is completed to determine the axes of the room. Internally, it stores its voxel space aligned to these axes. A mesh is generated approximately every second by extracting the isosurface from the voxel volume.

*Spatial mapping mesh in white and understanding playspace mesh in green*

The included SpatialUnderstanding.cs file manages the scanning phase process. It calls the following functions:

- `SpatialUnderstanding_Init`: called once at the start.
- `GeneratePlayspace_InitScan`: indicates that the scan phase should begin.
- `GeneratePlayspace_UpdateScan_DynamicScan`: called each frame to update the scanning process.