Designing the New Mixed Reality Annotations
Published Apr 13 2023


[Image: MR Annotations]

 

Improving reliability and introducing direct inking to Dynamics 365 Guides


Drawing has been a universal form of communication and expression for millennia. From cave paintings to modern-day instruction manuals, the act of putting ink to a surface is a powerful tool for conveying ideas and information simply. Mixed reality technology takes this instinctual medium into uncharted territory by enabling users to annotate anywhere in their physical space. With Dynamics 365 Guides 8.0, we brought the power of ink to augmented reality, offering a unique way for remote participants to transfer their expertise by inking directly within a HoloLens operator's view.

Historically, annotating in D365 Guides has been restricted to placing ink on the HoloLens-generated mesh, a virtual representation of the complex environment perceived by the HoloLens. Placing ink on the mesh works best in scenarios with simple, flat surfaces. For example: highlighting a tape measure on a workbench. However, our customers use Guides in significantly more complex environments, ranging from fast-moving automotive lines to high-precision chip manufacturing plants. We wanted to build something reliable, accurate, and special for our users – something that felt familiar yet powerful. Here’s how we did it:


Good to great

With Guides 8.0, our goal was to port core functionality from Remote Assist to Guides, enabling customers to use mixed reality calling while working with their existing guides. That release was about parity; going forward, we wanted to take good to great. We began with an in-depth review, analyzing internal data, interviewing customers, and testing extensively in our facilities, which surfaced three main drawbacks. We will explore these in the context of a common use case for annotations: highlighting a specific area within the HoloLens operator's view, such as a machine, a radiator, or the shock absorbers on an ATV.


1. Remote ink splattering

By this, we mean that the HoloLens operator had to stand at the exact point of view of the annotator to see annotations accurately. Even small orientation changes could make the annotations appear splattered along the Z-axis, and the effect was exacerbated when remote experts drew over complex surface structures that generated incomplete mesh data. This splattering of remote ink confuses HoloLens operators and erodes their trust in annotations as a tool.
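
To picture the failure mode, suppose each 2D sample of a remote stroke is projected onto the mesh independently, so every point inherits whatever depth the mesh reports at that spot. The following Python sketch is purely illustrative (it is not Guides code, and the noise model is our assumption): it simulates a straight stroke over an incomplete mesh and shows how the hidden depth scatter becomes visible splatter once the viewpoint rotates.

```python
import numpy as np

rng = np.random.default_rng(7)

# A remote expert draws a straight stroke across their 2D view of the scene.
stroke_2d = np.linspace([0.0, 0.0], [0.4, 0.0], num=20)  # metres, view space

# Suppose each 2D sample is projected onto the mesh independently, so every
# point inherits whatever depth the mesh reports there. Over a complex or
# incomplete mesh those depths can vary a lot; model that as noise.
mesh_depth = 2.0 + rng.normal(scale=0.15, size=len(stroke_2d))
stroke_3d = np.column_stack([stroke_2d, mesh_depth])

# Viewed head-on, the stroke still looks straight: the scatter hides in Z.
print("depth (Z) spread in metres:", np.ptp(stroke_3d[:, 2]))

# Rotate the viewpoint 45 degrees about the vertical axis and the Z scatter
# leaks into the horizontal axis: the ink now looks splattered.
theta = np.deg2rad(45)
rot_y = np.array([[np.cos(theta), 0.0, np.sin(theta)],
                  [0.0, 1.0, 0.0],
                  [-np.sin(theta), 0.0, np.cos(theta)]])
side_view = stroke_3d @ rot_y.T
print("apparent horizontal scatter:", np.ptp(side_view[:, 0]))
```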


2. Multi-stroke shapes 

Remote experts will often draw multi-stroke shapes on a 2D cross-section of the HoloLens operator's view, where they draw one line, lift the pen, and draw again to create a shape. Since they are seeing a 2D view, they expect the HoloLens operator to see the shape the same way.

 

However, the mesh reading might be incomplete or change between strokes, so HoloLens operators often see a disjointed shape. The result is an extremely confusing experience, as operators are left to piece together what the remote expert meant to convey.

 

3. Inking within arm's reach  
On HoloLens, you would traditionally ink your world by pointing at a surface in the distance. You would select the annotations button, then raise and point your hand, sending a ray from your fingertip to the mesh (usually a surface). You could then annotate onto that mesh, drawing the ink with your fingertip. However, this only allowed you to ink on a surface some distance away, limiting your ability to annotate anything in your immediate vicinity. Artificially restricting where operators can and cannot annotate limits our users, and we know they deserve better.
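
As a concrete illustration of the far-ray mechanic, here is a minimal, self-contained Python sketch: cast a ray from the fingertip and place ink at the nearest mesh intersection. The function names and the triangle-list mesh representation are our assumptions for the example, not the HoloLens or Guides API.

```python
import numpy as np

def ray_triangle(origin, direction, v0, v1, v2, eps=1e-9):
    """Moller-Trumbore ray/triangle intersection; returns hit distance or None."""
    e1, e2 = v1 - v0, v2 - v0
    p = np.cross(direction, e2)
    det = e1.dot(p)
    if abs(det) < eps:          # ray parallel to the triangle plane
        return None
    inv = 1.0 / det
    s = origin - v0
    u = s.dot(p) * inv
    if u < 0 or u > 1:
        return None
    q = np.cross(s, e1)
    v = direction.dot(q) * inv
    if v < 0 or u + v > 1:
        return None
    t = e2.dot(q) * inv
    return t if t > eps else None

def annotate_at_ray(fingertip, pointing_dir, mesh_triangles):
    """Cast a ray from the fingertip and return the nearest mesh hit,
    i.e. the 3D point where the ink would be placed."""
    direction = pointing_dir / np.linalg.norm(pointing_dir)
    best = None
    for v0, v1, v2 in mesh_triangles:
        t = ray_triangle(fingertip, direction, v0, v1, v2)
        if t is not None and (best is None or t < best):
            best = t
    return None if best is None else fingertip + best * direction

# Example: one floor-like triangle two metres in front of the user.
tri = [(np.array([-1.0, 0.0, 2.0]),
        np.array([1.0, 0.0, 2.0]),
        np.array([0.0, 1.0, 2.0]))]
hit = annotate_at_ray(np.array([0.0, 0.2, 0.0]), np.array([0.0, 0.0, 1.0]), tri)
print(hit)  # ink lands on the surface roughly 2 m away
```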

 

Seeing is believing  
At MR Apps, we know that in mixed reality, seeing is believing. The same philosophy extends to feature development. Before writing a single line of code, we spent many hours hovering around the ATV, air-gesturing imaginary annotations to visualize how we wanted annotations to appear for HoloLens operators.

[Image: Testing and prototyping in Vancouver]

 

While doing this, we explored numerous approaches for improving annotations, everything from smoothing the HoloLens-generated mesh to rendering virtual windows from the remote expert's point of view. Through physical prototyping, we were able to stress-test our ideas, validate them against edge cases, and reach viable solutions significantly faster. Ultimately, we came up with two novel approaches for improving annotations:

  1. Intelligent Plane Annotations:
    The core issue we uncovered during our research was that remote experts draw in 2D while HoloLens operators perceive in 3D, often causing a discrepancy between what the remote expert intended and what the HoloLens operator perceived. We chose to avoid this discrepancy entirely by drawing each remote expert stroke on a single plane. We decide the plane for each annotation intelligently at runtime, through a statistical analysis of the mesh in the region where the ink was placed (see the plane-selection sketch after this list). The plane itself is invisible to HoloLens operators; it simply acts as a canvas for remote expert ink.

     

    Using a plane also helps solve the second problem we identified: multi-stroke shapes. We apply a caching mechanism that determines when the plane should be recalculated, based on temporal and spatial proximity to the previous ink stroke; strokes that begin soon after, and close to, the previous one land on the same plane. This approach solves two problems simultaneously and significantly improves the reliability of remote expert inking.
  2. Direct inking:
    Letting HoloLens operators ink within their immediate vicinity required rethinking how they place ink in the world. We wanted direct inking to be as easy as picking up a pen and drawing. Direct inking involves two core components: engaging direct inking mode and the inking interaction itself.

    Engaging direct inking:
    During our research phase, we explored numerous concepts for engaging direct inking mode. Some were as simple as adding an additional button to the existing Guides UI, while others were futuristic concepts, such as turning your wrist to reveal an annotations bar. Ultimately, the team introduced a new annotations bar below the existing call window, with a radio button to select a specific annotation mode. This offers a simple way to enable direct inking without redefining interaction paradigms.

    Interaction:
    We brainstormed several ideas for how operators could place ink in their world but kept returning to a well-established paradigm: pinch and move. HoloLens operators see a paint blob on their index finger, pinch to place ink, and release their fingers when finished painting. This model was familiar to existing MR Apps customers across our apps and has many parallels in the physical world, like pinching a pen before starting to write (see the pinch sketch after this list).
    [Image: Annotate from fingertips]
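
To make the plane idea concrete, here is a minimal Python sketch of one way plane selection and caching could work: fit a least-squares plane to the mesh vertices near the stroke via PCA, flatten the ink onto it, and reuse the cached plane when the next stroke starts soon enough and close enough to the last one. This is our illustration under stated assumptions, not the Guides implementation; the names and the 5-second / 0.3-metre thresholds are hypothetical.

```python
import time
import numpy as np

def fit_plane(points):
    """Least-squares plane through a point cloud via PCA: returns
    (centroid, unit normal). The normal is the direction of least variance."""
    centroid = points.mean(axis=0)
    # SVD of the centred cloud; the last right-singular vector spans the
    # direction of smallest variance, i.e. the plane normal.
    _, _, vt = np.linalg.svd(points - centroid)
    return centroid, vt[-1]

def project_to_plane(stroke, centroid, normal):
    """Flatten 3D ink samples onto the plane so the whole stroke is coplanar."""
    offsets = (stroke - centroid) @ normal
    return stroke - np.outer(offsets, normal)

class PlaneCache:
    """Reuse one plane across strokes that are close in time and space, so a
    multi-stroke shape stays coplanar even if the mesh shifts between strokes."""

    def __init__(self, max_age_s=5.0, max_dist_m=0.3):
        self.max_age_s, self.max_dist_m = max_age_s, max_dist_m
        self.plane, self.last_point, self.last_time = None, None, None

    def plane_for_stroke(self, stroke, nearby_mesh_points):
        now = time.monotonic()
        fresh = (
            self.plane is not None
            and now - self.last_time < self.max_age_s
            and np.linalg.norm(stroke[0] - self.last_point) < self.max_dist_m
        )
        if not fresh:  # recompute from the mesh region under the new ink
            self.plane = fit_plane(nearby_mesh_points)
        self.last_point, self.last_time = stroke[-1], now
        return self.plane

# Usage: each remote-expert stroke lands on one invisible canvas plane.
cache = PlaneCache()
mesh_region = np.random.default_rng(0).normal(size=(200, 3)) * [1, 1, 0.02] + [0, 0, 2]
stroke = np.array([[0.0, 0.0, 1.9], [0.1, 0.05, 2.1], [0.2, 0.1, 2.0]])
centroid, normal = cache.plane_for_stroke(stroke, mesh_region)
print(project_to_plane(stroke, centroid, normal))  # coplanar ink samples
```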
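Likewise, here is a hypothetical sketch of the pinch-and-move interaction, reduced to its essence: a pinch (thumb and index tips within a small threshold) starts a stroke, fingertip motion appends ink samples, and releasing the pinch commits the stroke. The class, the 2 cm threshold, and the frame-feeding API are illustrative assumptions, not the actual Guides interface; a production version would also want hysteresis and smoothing to avoid jittery pinch detection.

```python
import numpy as np

PINCH_THRESHOLD_M = 0.02  # thumb and index tips closer than ~2 cm = pinch

class DirectInker:
    """Pinch-and-move inking: pinching starts a stroke, moving the fingertip
    appends samples, releasing the pinch ends the stroke."""

    def __init__(self):
        self.strokes, self.current = [], None

    def on_hand_frame(self, thumb_tip, index_tip):
        """Feed one hand-tracking frame (3D tip positions in metres)."""
        pinching = np.linalg.norm(thumb_tip - index_tip) < PINCH_THRESHOLD_M
        if pinching:
            if self.current is None:        # pinch began: start a new stroke
                self.current = []
            self.current.append(index_tip)  # ink follows the index fingertip
        elif self.current is not None:      # pinch released: commit the stroke
            self.strokes.append(np.array(self.current))
            self.current = None

# Usage: simulate a pinch, a short drag, then a release.
inker = DirectInker()
inker.on_hand_frame(np.array([0.0, 0.0, 0.5]), np.array([0.01, 0.0, 0.5]))
inker.on_hand_frame(np.array([0.05, 0.0, 0.5]), np.array([0.06, 0.0, 0.5]))
inker.on_hand_frame(np.array([0.1, 0.0, 0.5]), np.array([0.2, 0.0, 0.5]))
print(len(inker.strokes), len(inker.strokes[0]))  # 1 stroke, 2 samples
```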

A leap forward
After multiple design explorations, daily prototyping, and weighing a multitude of perspectives, we are excited to show you the revamped annotations experience in Dynamics 365 Guides.

Our introduction of intelligent plane selection and plane caching led to significant gains in reliability for our remote experts. Remote experts can now draw on highly complex surfaces, such as engine parts and automobile skeletons, confident that any ink they place will appear as they intended. Take a look at our tests on a disassembled ATV.

 

[Image: Annotation ink on ATV, view 1]

[Image: Annotation ink on ATV, view 2]

 

Furthermore, with direct inking we have introduced an entirely novel interaction paradigm, empowering our HoloLens operators to share their thoughts easily, whether that be suggestions for tire placement or reconfiguring fuel tanks.

What next?  
At MR Apps, we're always focused on pushing the boundaries of mixed reality, and we can't wait to see how our customers will use the improved annotations to strengthen and create new workflows. In the meantime, we are constantly updating the D365 Guides application with stability improvements and new features – so stay tuned to the Mixed Reality blog for the latest and greatest.

 
