Prototyping AR 2

Week 13 : July 23rd - July 29th

Frames are digital boundaries mapped to the physical world, used to contain and display 2D or 3D content. They are surface-aware and automatically scale and place themselves. Frames can be moved, resized, and interacted with.
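The auto-scaling behaviour described above can be sketched as a simple fit calculation. This is a minimal illustration, assuming the surface's usable area is known; the function and parameter names are hypothetical:

```python
def fit_frame_to_surface(content_w, content_h, surface_w, surface_h, margin=0.1):
    """Scale a frame to fit a detected surface while preserving the
    content's aspect ratio, leaving a fractional margin at the edges.

    All arguments share the same unit (e.g. metres); names are illustrative.
    """
    avail_w = surface_w * (1 - margin)
    avail_h = surface_h * (1 - margin)
    # Use the tighter of the two constraints so the frame never overflows.
    scale = min(avail_w / content_w, avail_h / content_h)
    return content_w * scale, content_h * scale
```

For example, 16:9 content mapped to a 2 m × 2 m wall section would fill the full width and take up 1.125 m of height.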

Handles are the visual interfaces that control each frame. Every frame has its own handle, which stores the frame's information. Handles hold UI options and additional frame controls.


World Content: Low-density content which is automatically generated and placed, and visible to anyone.

Personal Content: Private content created and positioned by the user, which can be shared with others.

Mobile Integration

Follow-up videos were created soon after the feedback sessions to explore a common theme of interest across user testing to date: Mobile Integration.

Many participants imagined the frames and handles paradigm working seamlessly with mobile devices, allowing them to transfer content from their mobile to a frame in their environment. Sketching and videos were used to explore different aspects of handle and content creation in AR and mobile, leading to another shift in the AR platform functionality.

Old Content Creation (AR): New content was created entirely in AR by accessing digital content through AR apps or browsers and creating a frame and handle from the content.

New Content Creation (Mobile): New content is now created entirely through mobile. Mobile devices linked to the AR glasses will have a “cast” feature which loads AR-friendly content into a handle that can then be placed by the user. This functionality is similar to a Chromecast.
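As a rough sketch of what the cast hand-off might involve, the payload below shows one possible message a phone could send to the linked glasses. The field names and structure are assumptions for illustration, not a real protocol:

```python
import json

def make_cast_message(content_url, content_type, title):
    """Build a hypothetical 'cast' payload a phone might send to linked
    AR glasses. On receipt, the glasses would wrap the content in a new
    handle that waits for the user to place it. Field names are assumed
    for illustration; this is not a real protocol."""
    return json.dumps({
        "action": "cast",
        "content": {"url": content_url, "type": content_type, "title": title},
        "state": "awaiting_placement",
    })
```

The `awaiting_placement` state captures the flow described above: content arrives from mobile first, and placement happens afterwards in AR.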

Mobile screen and flow mockups were made to assist in the design of a mobile-integrated AR platform. Mobile is much better than AR at navigating through content quickly and finding what you're looking for. Using the strength of mobile to find content and the strength of AR to display it provides a superior user experience to a pure AR system.

A “recently used” technique could give users a fully AR way of recreating frames that were previously created through mobile, making the placement of commonly used content more seamless.

Unity Prototyping

Functional prototypes were developed in Unity to present variations of key platform interactions to participants. Gestural interactions were tested in Unity with a Leap Motion to gather insights from participants, aiding in defining intuitive and functional gestures.

Participants were asked to test a variety of pre-determined gestures and their effects on digital content within Unity. Combining the Unity prototypes with video prototypes allowed participants to understand the purpose of the gestures, but Unity was more applicable to testing the comfort and ease of use of individual gestures than the entire experience.
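A gesture such as a pinch can be classified from fingertip positions like those a Leap Motion-style tracker reports. The sketch below is a minimal, hypothetical version with an assumed distance threshold, not the project's actual implementation:

```python
import math

# Assumed pinch threshold in metres, the kind of value that would be
# tuned during user testing.
PINCH_THRESHOLD = 0.03

def is_pinching(thumb_tip, index_tip, threshold=PINCH_THRESHOLD):
    """Classify a pinch from thumb and index fingertip positions,
    given as (x, y, z) tuples in metres, as a hand tracker might
    report them."""
    return math.dist(thumb_tip, index_tip) < threshold
```

Per-frame classification like this is also what makes accidental activation a design concern: a threshold that is too generous triggers on relaxed hands.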

Projector Prototyping

Prototypes that let participants experience how the AR platform would look and feel encouraged them to expand on the interactions. These tests provided subjective feedback and opinions that offered insight into the further development of the platform.

Unity is a powerful tool for creating AR / VR content and evaluating the usability and ergonomics of gestures. However, the user experience needed to be tested in a more realistic environment beyond the screen. Experience prototyping was carried out to gauge the overall user experience and needs more accurately, beyond functional requirements.

Experience Prototyping Reduced

Using a mobile prototype made in Adobe XD, participants were asked to create a new handle and then place it on a wall. A projector was used to visualise the frame. Various gestures were experimented with to move, resize, and control the content within the frame, and participant feedback and gesture recommendations were recorded. This approach also allowed participants to experiment with their own choice of gestures.

Experience prototypes allowed participants to imagine more clearly how the AR platform would fit into their lives, offering insight into how they would use it personally. The findings from these tests ensured a human-centered approach to the design of interactions and display modes for the AR platform.

Some of the testing can be seen in the video below.

Paper Prototyping

Quick, iterative paper mockups of handles and handle UIs were made to represent user needs throughout testing. The functionality of these paper prototypes was defined by what users wanted to be able to control through gestural interactions. This quick technique provided a good sense of the expectations of users while role playing with the prototypes.

Key Findings

This series of experience prototyping provided significant insight into how users expected and wanted the platform to function.

Feedback: Participants wanted increased system feedback when interacting with content. This feedback could appear around the hands or around the frame during gestures.

Gestural Control: After placement, participants wanted to limit the amount of required mobile interaction. When playing media content, for example, it should be possible to pause, play, and fast-forward through video content without picking the phone back up.

Deeper World Content Interaction: Combining world content with personal information to enrich the world content experience. Contextual reminders such as “buy bread” could pop up automatically while passing a shop.

Content Sharing: Media content should be shared automatically when with others, creating a more social environment and platform.

Frames Without Surfaces: Frames which aren’t linked to a surface should orbit the user for easier placement.

Frames With Surfaces: The frame will interpret the surface it is being mapped to and scale itself automatically at first.
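The two placement behaviours above could be sketched as a single decision: orbit the user when no surface is detected, otherwise snap to the surface. Function names and the orbit radius are illustrative assumptions, not part of the platform spec:

```python
def place_frame(surface_normal=None, user_forward=(0.0, 0.0, 1.0), orbit_radius=0.8):
    """Choose a frame's initial pose: orbit the user when no surface is
    detected, otherwise snap flat against the surface. Names and the
    orbit radius are illustrative assumptions."""
    if surface_normal is None:
        # No surface: hold the frame at a fixed radius along the user's
        # gaze so it follows ("orbits") them until placed.
        x, y, z = user_forward
        return {"mode": "orbit",
                "position": (x * orbit_radius, y * orbit_radius, z * orbit_radius)}
    # Surface detected: orient the frame to face out from the surface;
    # auto-scaling against the surface would happen in a separate step.
    return {"mode": "surface", "facing": surface_normal}
```

A frame created with no surface in view starts in orbit mode directly ahead of the user, then switches to surface mode once a wall or table is targeted.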

Defining Gestures

Based on the interactive prototyping techniques above, key gestural control requirements were identified. Comfortable, intuitive and quick gestures were developed to control the AR platform. Gestures needed to have a consistent feel, while being difficult to do accidentally.

The potential of both one- and two-handed gestures was explored; however, based on user testing and feedback, as well as general ergonomics and effort, entirely single-handed gestures were chosen.