Final Video & Feedback

Week 16 : August 13th - August 24th

Final Video Prototype

Over the past two weeks, I've been preparing the final video prototype. The video follows a simple story across each use case, showing a relatable, understandable benefit to using AR and the potential of the Frames & Handles paradigm.

All interactions with digital content were mimed for the film, and the effects were added in post-production by rotoscoping. Rotoscoping involves creating a layer on top of the video and cutting out, frame by frame, any content that should occlude the digital elements. It can be a very time-consuming process, but once complete it allowed me to animate the digital content easily. The final output is clearly a concept video, but the apparent interaction with digital content gives it a level of realism.

Final Unity Prototype

To provide users with an experience beyond just watching the video, a basic Unity prototype was created to let them try out a few gestures. Using the Leap Motion and Unity, three different gestures were built into the program:

  1. Moving a frame
  2. Resizing a frame
  3. Creating a handle

The prototype struggles to track certain gestures due to the limitations of the Leap Motion, and incorrect gestures are occasionally activated by accident. Despite these limitations, the prototype gave users a good feel for what it would be like to place a frame in an environment.
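To make the gesture logic concrete, the sketch below shows one simple way hand-tracking frames could be mapped to the prototype's three gestures. This is an illustrative Python model only, not the actual Unity / Leap Motion code; the `HandFrame` fields, thresholds, and classification rules are all assumptions. It also hints at why threshold-based classification can misfire, which matches the accidental activations seen in testing.

```python
# Hypothetical sketch: mapping simplified hand-tracking frames to the
# prototype's three gestures. All field names and thresholds here are
# illustrative assumptions, not the real Unity / Leap Motion implementation.

from dataclasses import dataclass


@dataclass
class HandFrame:
    """One frame of simplified hand-tracking data."""
    pinch_strength: float  # 0.0 (open hand) .. 1.0 (full pinch)
    hands_visible: int     # number of tracked hands
    palm_speed: float      # palm velocity magnitude (mm/s)


PINCH_THRESHOLD = 0.8  # assumed: above this, the pinch counts as a "grab"
MOVE_SPEED = 50.0      # assumed: palm must be moving to count as a drag


def classify_gesture(frame: HandFrame) -> str:
    """Map a tracking frame to one of the prototype's gestures."""
    if frame.hands_visible == 2 and frame.pinch_strength > PINCH_THRESHOLD:
        return "resize_frame"   # two pinching hands stretch the frame
    if frame.pinch_strength > PINCH_THRESHOLD and frame.palm_speed > MOVE_SPEED:
        return "move_frame"     # one pinching hand drags the frame
    if frame.pinch_strength > PINCH_THRESHOLD:
        return "create_handle"  # a stationary pinch opens a handle
    return "none"


print(classify_gesture(HandFrame(0.9, 2, 10.0)))   # resize_frame
print(classify_gesture(HandFrame(0.9, 1, 120.0)))  # move_frame
```

Because the rules are simple thresholds on noisy tracking data, a jittery palm reading or a momentarily lost hand can flip the classification, which is one plausible explanation for the accidental gesture activations observed.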

User Feedback

The final video prototype, visuals, and interactive Unity prototype were brought to users at the conclusion of the project to evaluate the effectiveness of the concept and the level of user understanding.

User Understanding

Based on the research, AR can be a difficult technology for new users to grasp initially. Understanding how it could work for people in their everyday lives or identifying key uses can be an even more difficult task.

The Frames & Handles paradigm gave AR a level of clarity and structure, aiding users' understanding of AR as a technology. The prototype video presented use cases which were relatable and simple to grasp. Users could see what AR could do for them, and different methods of use, without an extensive explanation of the technology.

 

Gestural Testing

An updated Unity prototype was created which combined three gestures (move frame, resize frame, open handle UI). Using a Leap Motion, users were asked to use the gestures to place a frame on the wall of a virtual room and size it accordingly.

Limitations of the Leap Motion tracking and implementation meant some gestures triggered accidentally; however, user feedback was positive overall. Users found the gestures natural when controlling the frame, and when tracked correctly they felt good to use. The prototype provided a good sense of the interactions within the system, clearly aiding user comprehension when used alongside the video prototype.

 

Use Speculation

The project was successful in eliciting a speculative response from users, who imagined how they would use this form of AR in their own lives. Speculation ranged from building upon the use cases presented in the video to proposing completely new ways in which this AR paradigm could be used.

The majority of use speculation involved adding a level of personalisation to the use cases. Users could now see how AR could be implemented to solve their own problems, such as increasing text size so they could read without glasses, or watching their own media when their partner wants to watch something they don't like.

New use cases were also discussed, such as using AR as a productivity tool or a creative digital tool. These ideas involved sharing creative content with others in a team, and creating a digital whiteboard which could easily be brought from place to place.

Future Speculation

The examples above highlight relatively grounded and relatable use cases for AR and the Frames & Handles paradigm. The majority would use readily available content such as videos, text and API content; however, it is also important to speculate on how this AR paradigm could develop in the future.

 

AR Media Content

Using frames and handles, media content can be cast from a mobile device and placed on a surface for viewing. Development in media types and environment control could enhance AR specific viewing experiences.

Immersive media content which maps to the walls, ceiling and floor could create a unique viewing experience closer to VR immersion but with the benefit of still being in the physical world and sharing the experience with others.

3D content would provide an interactive experience, perhaps allowing holographic media content to respond to and interact with the environment it’s being viewed in. Imagine a scripted “escape room” experience in your own home, powered by AR and 3D content.

Regular 2D video content could also be extended through manipulating the environment it’s being watched in. It may be possible to darken the room to create a cinematic experience while watching a film on a large AR frame.

Smart / Adaptive Content

In situations where frames are created by parsing online content such as recipes or instructions, there is scope for the AR platform to be "smarter" and generate adaptive content.

For example, before cooking a user could point at the food they have out on the counter and generate recipes based on the ingredients available to them. These recipes could also provide additional information, such as videos demonstrating specific methods, or alternative ingredients if the user wants a healthier option. The content could adapt to the needs of the user without requiring much input.
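As a hedged sketch of this idea, the snippet below ranks a few hard-coded recipes by the fraction of their ingredients the user already has on the counter. The recipe data, matching rule, and 50% cut-off are illustrative assumptions for this post, not part of the project's implementation.

```python
# Illustrative sketch of the "adaptive recipe" idea: rank recipes by how
# many of their ingredients the user already has available. The recipes
# and the 50% availability cut-off are assumptions for illustration only.

RECIPES = {
    "tomato pasta": {"pasta", "tomato", "garlic", "olive oil"},
    "omelette": {"egg", "butter", "cheese"},
    "stir fry": {"rice", "pepper", "soy sauce", "garlic"},
}


def suggest_recipes(available: set) -> list:
    """Return (recipe, score) pairs ordered by ingredient availability."""
    scored = [
        (name, len(ingredients & available) / len(ingredients))
        for name, ingredients in RECIPES.items()
    ]
    # Only suggest recipes where at least half the ingredients are on hand.
    return sorted(
        [(name, score) for name, score in scored if score >= 0.5],
        key=lambda pair: pair[1],
        reverse=True,
    )


on_counter = {"egg", "cheese", "tomato", "garlic", "pasta"}
for name, score in suggest_recipes(on_counter):
    print(f"{name}: {score:.0%} of ingredients available")
```

A real system would obviously need computer vision to recognise the food and a far richer recipe source, but the ranking step itself can stay this simple.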

 

Location Aware AR Content

Frames provide a location and boundary for AR content to be displayed, however this content isn’t limited to a 2D plane. The AR Cloud and location awareness of the system could allow the AR platform to generate contextual information automatically based on frames which are currently active. This content would be an extension of the frames.

If a frame displaying a recipe was open while the user was cooking, actions related to cooking could be enhanced throughout the process. Examples include automatically displaying the current and required millilitres while filling a jug, or highlighting how many grams of an ingredient have been added.

Expert Validation

Subject matter experts were interviewed upon completion of the project to gather criticism, insight and validation of the process, outputs and decisions made throughout.

Virtual Reality Ireland - Camille Donegan & Terry Madigan

Virtual Reality Ireland produce AR, VR and 360° video experiences, following a human-centred approach to ensure they offer their clients the cutting edge of technology and user experience in their products.

This project was presented to Camille Donegan (General Manager & Producer) and Terry Madigan (CEO) of Virtual Reality Ireland. Overall, they believed the future of AR should be calm, and that this project is in line with how they believe AR should progress. They saw potential in IoT and connected devices linking to an AR system as a way to develop interactions further while reducing the need for a mobile device: a smart fridge aware of the food you have could generate recipes, which could in turn be presented in AR.

They liked the visual feedback in the platform, but believed that optional, subtle audio cues would further enhance user comprehension. They likened developing a gestural language for AR to sign language, but without a steep learning curve, and liked the simplicity and intent of the gestures chosen for the platform.

Contextually generated world content was a topic Terry was very interested in. He discussed the concept of VPS (Visual Positioning System) and its potential to place this form of world content in the correct location efficiently. He also had a keen interest in the future of control and input modalities, mentioning a PAN (Personal Area Network): a network of wearables and devices allowing the user to control the digital content around them.

Camille & Terry were extremely positive about the project scope, proposal and outputs. They aim to extend their involvement with academic research and education to bridge the gap between industry and academia in Augmented Reality.

Fjord Extended Reality Team

The Fjord Extended Reality team work on a variety of AR/VR projects, concepts and prototypes for international clients.

An overview of the project was discussed with members of the team to gauge their opinions of the aims and outputs. The team was familiar with early iterations and video prototypes and was very positive about the project's intention. They see the importance of looking to the future of what a technology could be, and would like to engage in more speculative projects themselves.

The meeting with the team was too brief to have an in-depth conversation about the project. Final project submissions were emailed to the team after the meeting to gather deeper comments and feedback.

Extended Video

To provide more context on the project and its process, I also decided to make an extended version of the final video which could stand alone.