
Design Lessons Learned From Mobile AR Experiments
For the last couple of months I have been working on a variety of small AR experiments and side projects. With the release of both ARCore and ARKit, it’s been fun exploring and learning what these platforms are capable of. Below I’ve listed some lessons learned along the way, from both a visual design and a user experience perspective.
Provide visual feedback with AR interactions
Augmented objects in the world aren’t always visible on the screen. In the Breadcrumbs app, tulips are placed every couple of feet as the user walks. Because the tulips are behind the user, there is no way to see this interaction happening until the user looks behind them. To solve this problem, I added a 2D tulip icon that pulses to white every time a 3D tulip is placed in the world.
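Roughly, the idea looks something like this simplified sketch in Swift with ARKit/SceneKit (the tulipIcon outlet and the placeTulip/makeTulipNode helpers are placeholders, not the actual Breadcrumbs code):

```swift
import UIKit
import SceneKit
import ARKit

class BreadcrumbsViewController: UIViewController {
    @IBOutlet var sceneView: ARSCNView!
    @IBOutlet var tulipIcon: UIImageView!   // persistent 2D HUD icon (placeholder)

    // Called whenever a new 3D tulip is dropped behind the user.
    func placeTulip(at transform: simd_float4x4) {
        let tulipNode = makeTulipNode()
        tulipNode.simdTransform = transform
        sceneView.scene.rootNode.addChildNode(tulipNode)
        pulseIcon()   // on-screen feedback for an off-screen event
    }

    // Flash the 2D icon to white and scale it up briefly so the user knows a tulip was placed.
    private func pulseIcon() {
        let restingColor = tulipIcon.tintColor
        tulipIcon.tintColor = .white
        UIView.animate(withDuration: 0.15, animations: {
            self.tulipIcon.transform = CGAffineTransform(scaleX: 1.3, y: 1.3)
        }, completion: { _ in
            UIView.animate(withDuration: 0.35) {
                self.tulipIcon.transform = .identity
            }
            self.tulipIcon.tintColor = restingColor
        })
    }

    // Placeholder geometry; the real app loads a tulip model instead.
    private func makeTulipNode() -> SCNNode {
        return SCNNode(geometry: SCNSphere(radius: 0.03))
    }
}
```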


Consider motion and proximity based interactions
It’s easy to track your device’s location and movement, so use this information to enhance the user experience. Create moments of delight in your AR app with movement-based interactions and the proximity between the user and the augmentation. For the Cuties app, I added a small interaction based on the distance between the device and the characters. In their idle animation the Cuties occasionally blink, but when the user gets close to one, a wink animation triggers.
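In ARKit/SceneKit terms, the proximity check is just a per-frame distance test between the camera and each character. Here’s a simplified sketch (the node list, threshold, and playWink helper are placeholders, not the actual Cuties code):

```swift
import UIKit
import ARKit
import SceneKit

class CutiesViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!
    var cutieNodes: [SCNNode] = []          // the characters placed in the scene
    let winkDistance: Float = 0.4           // meters; tuned by feel

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.delegate = self           // receive per-frame render callbacks
    }

    // Called every frame: check how close the device is to each Cutie.
    func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
        guard let cameraPosition = sceneView.pointOfView?.simdWorldPosition else { return }
        for cutie in cutieNodes {
            let distance = simd_distance(cameraPosition, cutie.simdWorldPosition)
            if distance < winkDistance {
                playWink(on: cutie)
            }
        }
    }

    // Placeholder: the real app plays the wink animation once and debounces repeat triggers.
    func playWink(on node: SCNNode) {
    }
}
```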

Understand how animation affects content
The first iteration of Cuties included a small bouncing animation when two Cuties were placed next to each other. But I soon realized that any movement in a direction parallel to the plane, or to oriented feature points, makes the augmentation look unstable. I iterated on the animation, replacing the bounce with a subtle change in scale, which still lets the Cuties express the same happy emotion.
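To illustrate the difference, here’s a simplified SceneKit sketch of the two approaches, a translation-based bounce versus a scale pulse (the values are placeholders, not the actual Cuties animation):

```swift
import SceneKit

// The original bounce translated the node; the revision only changes its scale.
func happyBounce() -> SCNAction {
    let up = SCNAction.moveBy(x: 0, y: 0.02, z: 0, duration: 0.15)
    return SCNAction.sequence([up, up.reversed()])
}

func happyPulse() -> SCNAction {
    let grow = SCNAction.scale(to: 1.1, duration: 0.15)
    let shrink = SCNAction.scale(to: 1.0, duration: 0.15)
    return SCNAction.sequence([grow, shrink])
}

// Usage: cutie.runAction(happyPulse()) instead of cutie.runAction(happyBounce())
```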


Incorporate more UI into the environment
I became interested in the challenge of incorporating UI into the environment rather than onto the screen. For the flower’s speech bubble, I placed the UI above the flower’s head and made the bubbles rotate so that his dialogue is always readable, no matter where the user is positioned when viewing the flower. Another benefit of not placing this UI on screen is that it doesn’t block the augmentation or cover up the rest of the camera feed for an extended period of time.
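A billboard constraint is one straightforward way to keep world-space UI facing the viewer. Here’s a simplified SceneKit sketch of the idea (the sizes, offsets, and bubble texture are placeholders, not the actual assets):

```swift
import UIKit
import SceneKit

// Attach a speech bubble above a character and keep it turned toward the viewer.
func attachSpeechBubble(to flowerNode: SCNNode, bubbleImage: UIImage) {
    let plane = SCNPlane(width: 0.12, height: 0.06)
    plane.firstMaterial?.diffuse.contents = bubbleImage   // pre-rendered bubble texture
    plane.firstMaterial?.isDoubleSided = true

    let bubbleNode = SCNNode(geometry: plane)
    bubbleNode.position = SCNVector3(x: 0, y: 0.15, z: 0) // float above the flower's head

    // Rotate around Y only, so the bubble faces the viewer without tilting.
    let billboard = SCNBillboardConstraint()
    billboard.freeAxes = .Y
    bubbleNode.constraints = [billboard]

    flowerNode.addChildNode(bubbleNode)
}
```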

Provide visual cues for 3D augmentations
When placing 3D objects in the real world, make sure you consider and design for visual cues like color, size, and shadows. In the Breadcrumbs app, the 3D tulips are placed in a path that traces the user’s steps. The first iterations of Breadcrumbs used a single flower color, oriented the tulips parallel to the ground, and displayed various sizes in different sequences. Across all of these trials, I found the tulips visually unreadable: it was hard to distinguish depth, especially when the tulips formed a straight line. By keeping the flowers a consistent size, displaying a sequence of colors, and adding a shadow, it’s a lot easier to understand their location and placement in the real world.
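As a rough illustration, here’s how those cues might be wired up in SceneKit: a fixed size, a repeating color sequence, and a shadow-casting light (the geometry, palette, and angles are placeholder values, not the actual Breadcrumbs setup):

```swift
import UIKit
import SceneKit

// A repeating color sequence, a fixed size, and a shadow-casting light.
let tulipColors: [UIColor] = [.red, .yellow, .purple]   // placeholder palette
var tulipCount = 0

func makeTulip() -> SCNNode {
    // Stand-in geometry for the tulip model.
    let geometry = SCNCone(topRadius: 0, bottomRadius: 0.02, height: 0.05)
    geometry.firstMaterial?.diffuse.contents = tulipColors[tulipCount % tulipColors.count]
    tulipCount += 1

    let node = SCNNode(geometry: geometry)
    node.scale = SCNVector3(x: 1, y: 1, z: 1)   // every tulip stays the same size
    node.castsShadow = true                     // needs a shadow-casting light in the scene
    return node
}

func makeShadowLight() -> SCNNode {
    let light = SCNLight()
    light.type = .directional
    light.castsShadow = true                    // ground shadows help anchor objects to the plane
    let lightNode = SCNNode()
    lightNode.light = light
    lightNode.eulerAngles = SCNVector3(x: -Float.pi / 3, y: 0, z: 0)   // angle down toward the floor
    return lightNode
}
```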


Personalize your AR project with customization
When building an AR app, it’s important to keep in mind that features like plane visualizations and markers are required visual elements. These can be modified to make your app unique and enhance the user experience. In ARCore, planes are visualized with a default texture and randomly chosen colors. For Cuties, I created a transparent heart pattern for the plane visualization. The new design is subtle, doesn’t distract the user, and helps maintain the overall love theme.
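Cuties customizes ARCore’s plane visualization, but the same idea carries over to other toolkits. Here’s a simplified sketch of drawing detected planes with a custom texture in ARKit/SceneKit, just to show the shape of the approach (the heart_pattern asset name is a placeholder):

```swift
import UIKit
import ARKit
import SceneKit

// Draw each newly detected plane with a custom, semi-transparent texture.
// A real app would also resize the plane in renderer(_:didUpdate:for:).
class HeartPlaneDelegate: NSObject, ARSCNViewDelegate {
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let planeAnchor = anchor as? ARPlaneAnchor else { return }

        let plane = SCNPlane(width: CGFloat(planeAnchor.extent.x),
                             height: CGFloat(planeAnchor.extent.z))
        plane.firstMaterial?.diffuse.contents = UIImage(named: "heart_pattern")  // placeholder asset
        plane.firstMaterial?.transparency = 0.5     // keep the visualization subtle

        let planeNode = SCNNode(geometry: plane)
        planeNode.eulerAngles.x = -Float.pi / 2     // lay the plane flat on the detected surface
        planeNode.position = SCNVector3(x: planeAnchor.center.x, y: 0, z: planeAnchor.center.z)
        node.addChildNode(planeNode)
    }
}
```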
For marker-based AR applications, it’s easy to upload or customize your own designs, which can help brand your application. You also have the potential to incorporate real-world settings like murals or integrate markers into everyday objects like posters.


Tell a story with your AR work
In mobile AR, the technology has a limited understanding of the environment. As a creator, you have to recognize these limitations and work with them. So when capturing and sharing content, consider the story you are trying to tell. Depending on your application, it is important to select an open space or a location with enough light to properly detect planes. With the way mobile AR rendering currently works, there is no way to occlude content, so augmentations appear to pass straight through walls, objects, and even people.
Selecting the wrong environment for your captures can really break the experience and the visualization. Oftentimes I’ve had to capture the same experience multiple times, making sure I position myself and the device at the best viewing angle. Despite the limitations of the technology, the goal is to present the ideal implementation of the content and the story you are trying to tell.


With new features and new AR platforms being released, I’m excited to continue exploring this medium and designing for the future. I’m also interested in designing and prototyping what best practices and good user experience for augmented reality look like. To view more of my AR experiments and design projects, you can find me on Twitter @vishnuganti.