We’ve heard it pretty often already: Augmented Reality (AR) was the rising star of 2016 and will probably gain even more relevance in 2017. In the near future, our everyday applications will move away from the flat screen and occupy the space around us. For this reason, working on the user experience of these applications has been particularly exciting.

This past November, I was lucky enough to collaborate with the JPL Ops Lab to design a file visualization feature for their AR application ProtoSpace. ProtoSpace is used by mechanical engineers at NASA to view realistic models of the rovers they are currently building, through the HoloLens headset. As of now, engineers have to go back to their CAD files to view the different parts within a model. My task was to think of a way for engineers to navigate this file hierarchy in an AR environment without having to return to their computers. While designing a solution, I found that I had to approach the problem from a different angle. Since there are no established “UX best practices” in AR yet, I’d like to share my own personal approach to UX in AR:
Think about how we interact with physical objects within our environment and use that as inspiration.
How do we notice a sign when we walk down the street? How does a particular object catch our attention? It’s useful to think about these elements and to learn from real-life situations when designing an interface in AR. Since our design solution will be integrated within our space, we want it to feel as natural as possible.

For example, while working on the ProtoSpace project, I encountered an interesting problem: I had to figure out how NASA engineers could comfortably search for a particular file (like a specific rover part) through the HoloLens when the file hierarchy was extremely large. I was then given great advice on how to approach this problem: let’s say you’re at a subway station and you see that the approaching train is packed. How were you able to tell that it was packed? This prompted me to think about how we instinctively know that we are dealing with a large number of something, whether it’s people or objects. Within a confined environment, do we notice the lack of space first? Or do we see a large number of one thing and only then notice the lack of space? Or, most likely, both at the same time?

As an extension of the train example, these questions got me thinking about how I would search for a particular person within a crowd once I realized that there were too many people to conduct a “linear search” (picture the nightmare scenario of going to a concert with over 500 people and having to look through every single one of them in order to find your friend). To make things easier, I could pick out a set of characteristics that I know about this person to narrow down the search: I know what my friend looks like and maybe what they’re wearing. Now that I have thought about a real-life version of my design problem and the logical steps I would personally take to solve it, I can think about how this solution could be visualized through design. Pretty simple, right?
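The “narrow down by known characteristics” idea above maps neatly onto filtering a collection by attributes instead of scanning every item one by one. Here is a minimal sketch of that pattern; the `Part` type, the attribute names, and the sample parts list are all hypothetical stand-ins for a large CAD hierarchy, not anything from ProtoSpace itself:

```python
from dataclasses import dataclass

@dataclass
class Part:
    name: str
    category: str
    material: str

# Hypothetical parts standing in for a very large file hierarchy.
parts = [
    Part("wheel_hub", "mobility", "titanium"),
    Part("antenna_mast", "comms", "aluminum"),
    Part("wheel_rim", "mobility", "aluminum"),
]

def narrow(items, **known):
    """Keep only items matching every attribute we already know,
    shrinking the set we'd otherwise have to scan linearly."""
    return [p for p in items
            if all(getattr(p, key) == value for key, value in known.items())]

# Like recognizing your friend's jacket in a crowd: each known
# attribute cuts the candidate pool down further.
candidates = narrow(parts, category="mobility", material="aluminum")
print([p.name for p in candidates])  # ['wheel_rim']
```

The design takeaway is the same as the crowd example: each piece of knowledge the user already has should become a filter in the interface, so they never face the full set at once.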
Don’t try to apply 2D UX best practices to 3D.
This one is a bit of a given. Looking at a flat screen is a completely different experience from interacting with a virtual three-dimensional object right in the middle of your office or living room. So why would you use the same design principles for both?
The most effective initial prototypes are always physical.
This goes hand in hand with my previous point. If you are going to create something that will be seen within a physical space, then it makes more sense to prototype accordingly. It’s a bad idea to present your initial designs as a typical Sketch wireframe or Axure clickable prototype: not only will you get inaccurate user-testing results, but you also won’t get a good grasp of how your design will work and feel. A better low-fidelity alternative is to build a cardboard model of your designs and take pictures of the different stages of user interaction. For extra points, you could even create an interactive prototype that can be tested using Google Cardboard.
UI Elements don’t necessarily have to be in 3D.
Not everything in your AR application has to have a 3D shape. For example, when you’re driving on a highway, the signs are most likely flat rectangles or squares; you rarely encounter cube-shaped signs. It’s totally fine to display some elements in 2D.
I hope these approaches are useful for those of you brave souls out there who want to venture into the world of AR. If you have experience designing for AR, I’d love to hear about your own approach in the comment section below!