Experiential Design /Task 1: Trending Experience

Start from 24.9.2024

24.9.2024 - 15.10.2024 / Week 1 - Week 4
Kong Cai Yi / 0363862 
Experiential Design  / Bachelor of Design (Hons) in Creative Media 
Task 1: Trending Experience 




INDEX
1. Lectures
2. Instructions
3. Feedback
4. Reflections

LECTURES

Week 1/ Introduction to Module
During the first week, Mr. Razif introduced us to the subject outline and expectations. We were shown examples of past student projects to help us understand AR and spark inspiration. He also covered the different types of AR experiences and development technologies, guiding us through the process of quickly designing an AR experience. We gained a clear understanding of the differences between AR, VR, and MR, learned how to identify AR and MR, designed our own AR experience, and even created a simple AR app during the lecture.

  • Augmented reality (AR): a view of the real world—physical world—with an overlay of digital elements.
  • Mixed Reality (MR): a view of the real world—physical world—with an overlay of digital elements where physical and digital elements can interact.
  • Virtual reality (VR): a fully-immersive digital environment.
  • Extended reality (XR): an umbrella term that covers all these different technologies, including AR, MR, and VR. 
Fig 1.0 Difference between AR, MR and VR (24.09.2024 - Week 1)


For our in-class Activity 1, we were tasked with launching google.com on our phone's browser, searching for terms like "dinosaur," "dog," "cat," or "tiger," and using the "View in AR" feature to see them in life size.
Fig 1.1 Activity 1 (24.09.2024 - Week 1)

Fig 1.2 Week 1 Lecture (24.09.2024 - Week 1)


Task 2 Instructions:
Fig 1.3 Task 2 instructions (26.09.2024 - Week 1)

Gym

  • Scenario: Many people go to the gym but don't know how to properly use the workout machines. The AR experience could allow users to scan each machine with their phone so that a virtual trainer or instructions appear, demonstrating the correct usage of the equipment, including proper form and exercises for different muscle groups.
  • Extended Visualization: This AR feature would project interactive 3D models of how to use the machines, animated in real-time, alongside tips on reps, sets, and which muscle areas are targeted. Additionally, it could offer workout routines based on the machines available.
  • User Feeling: The goal is for the user to feel confident and empowered while using the equipment. The AR guidance should make the user feel secure, providing them with instant, practical knowledge to enhance their workout experience and avoid injury.
Fig 1.4 Gym room (26.09.2024 - Week 1)

Fashion Shop

  • Scenario: In a fashion store, finding the price and fitting of an item can be inconvenient, especially when trying to locate tags or sizes. The AR experience could allow customers to scan any clothing item with their phone, instantly displaying the price, size availability, and additional details like material, washing instructions, and style recommendations.
  • Extended Visualization: Along with price and details, the AR feature could measure the user’s bust, waist, and hip measurements on the spot. This would help the user find the right size without needing to try on multiple options. The AR visualization could also offer a virtual fitting room where users can see themselves wearing the clothes in real-time, using their body measurements for an accurate preview.
  • User Feeling: The user will feel more informed and efficient, as they can instantly access prices and sizing, and avoid the hassle of trying on clothes in a fitting room. The AR feature will save time and offer a more personalized shopping experience, enhancing both convenience and satisfaction.
Fig 1.5 Fashion Shop (26.09.2024 - Week 1)

Week 2/ User Journey Maps
This week, in Mr. Razif's lecture, we were introduced to professional abbreviations like BX (Brand Experience), SD (Service Design), and IA (Information Architecture). We also learned about the distinction between UX and XD. Additionally, we explored user mapping and empathy mapping, which serve two key purposes: creating a shared understanding and aiding decision-making. Before class ended, we were instructed to download the Unity app, sign up, and select a license by watching the tutorial video.

Fig 1.6 Week 2 Lecture (01.10.2024 - Week 2)

In class, Mr. Razif divided us into 4 groups of around 7 people each. Our task was to choose a place, create a journey map showing the user experience, pain points, and solutions, and present it before the class ended. We decided to focus on a theme park. We chose Figma as our tool and found a great template to help us organize and input our ideas.

My group decided to map a visitor's journey through a theme park. We visualized the scenario in the order below:
  • Parking Car
  • Queue for ticket
  • Searching for ride
  • Queue for rides
  • Riding
  • Toilet time
  • Eatery
  • Merchandise Store
  • Park Exit
MIRO LINK
Fig 1.7 Theme park user journey map (01.10.2024 - Week 2)

Fig 1.8 Group Presentation (01.10.2024 - Week 2)

Fig 1.9 Theme park user journey map (01.10.2024 - Week 2)


Week 3/ Introduction to Unity 
This week, we installed Unity on our laptops as the primary tool for developing our AR app. Before the class began, we had to follow the MyTimes tutorial, which covers the basic techniques for creating an AR experience: scanning an image that triggers the display of a cube and a video plane. I used a selfie as the image marker and an online video. By carefully following the tutorial step by step, I made sure that everything worked as expected when I tested the AR project.

Vuforia Engine
  • Register an account
  • Choose version 10.14 and download
  • Add Vuforia Engine to a Unity project / upgrade to the latest version
  • Once downloaded, double-click the package, then import it in Unity

Fig 1.9 Download Vuforia Engine (07.10.2024 - Week 3)


After downloading Vuforia, we can obtain a license key that gives us access to libraries such as the target dataset.
 
Fig 1.10 Add License Key (07.10.2024 - Week 3)

Fig 1.11 Import Vuforia Engine (07.10.2024 - Week 3)

First, we set up the image target and modeled the cube. For fun, I even used my own self-portrait as the image target! 😄
Fig 1.12 Self Portrait  (07.10.2024 - Week 3)

Fig 1.13 Add Image Target (07.10.2024 - Week 3)

After setting the ImageTarget, we can easily add any object as a child under it. For our first hands-on AR experience, we tried modeling a cube block by using image recognition to identify the target image.
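Under the hood, Vuforia fires a tracking event whenever the target image is found or lost, and child content is shown or hidden accordingly. A minimal sketch of that mechanism, assuming a class name `TargetContentToggle` and a `content` field of my own (in practice Vuforia's built-in DefaultObserverEventHandler already does this for you):

```csharp
using UnityEngine;
using Vuforia;

// Attach to the ImageTarget: toggles the child object (e.g. the cube)
// whenever Vuforia gains or loses tracking of the target image.
// "content" is a hypothetical field for illustration.
public class TargetContentToggle : MonoBehaviour
{
    [SerializeField] private GameObject content; // the child cube

    void Start()
    {
        var observer = GetComponent<ObserverBehaviour>();
        observer.OnTargetStatusChanged += OnStatusChanged;
    }

    private void OnStatusChanged(ObserverBehaviour behaviour, TargetStatus status)
    {
        // Show the content only while the target is tracked.
        bool tracked = status.Status == Status.TRACKED ||
                       status.Status == Status.EXTENDED_TRACKED;
        content.SetActive(tracked);
    }
}
```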

Fig 1.14 AR Experience 1 (07.10.2024 - Week 3)

Fig 1.15 AR Experience 1- Add Video (07.10.2024 - Week 3)

User Controls and UI
In this week's lecture, we were instructed to create different scenes, such as a Menu, an AR scene, and a Credits scene, and link them together using scripts. It was a very interesting and fun experience.

Fig 1.16 AR Experience 1 Progress (08.10.2024 - Week 3)


Mr. Razif introduced us to two ways of connecting scenes through scripts. The first method involves copy-pasting the same script for every button, then modifying the script to change the button names and their respective actions. The second, more efficient method is to use the "gotoScene(string)" function, where you can call different scenes by simply passing the scene name as a string parameter. This allows for cleaner, more organized code, as it avoids repetition and makes it easier to manage scene transitions in larger projects.

Fig 1.17 Use gotoScene(String) Format (08.10.2024 - Week 3)

The "gotoScene(string)" method is particularly useful when dealing with multiple scenes because it reduces the risk of errors and makes future updates simpler, as I don’t need to edit every individual button script.
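The shared script is only a few lines. A sketch of how I understood it (the `gotoScene` name comes from the lecture; the class name `SceneLoader` is my own):

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// One script shared by every button: each button's OnClick() event
// calls gotoScene with the name of the scene it should open.
public class SceneLoader : MonoBehaviour
{
    public void gotoScene(string sceneName)
    {
        // The scene must be added to File > Build Settings first.
        SceneManager.LoadScene(sceneName);
    }
}
```

Attach it to one object, drag that object into each Button's OnClick() slot, pick gotoScene(string), and type "Menu", "AR", or "Credits" in the string field.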
Fig 1.18 Add Scripts to different scene (08.10.2024 - Week 3)

Fig 1.19 Final Outcome (08.10.2024 - Week 3)


Week 4/ Markerless AR Experience

This week, we explored Markerless AR, which allows users to scan the ground and place objects directly onto surfaces without the need for image markers. After the lecture, Mr. Razif reminded us about the Task 1 AR experience, emphasizing that we need to propose at least three ideas to create a seamless user experience and submit them by the end of this weekend.
  • Plane Finder: allows the user to detect the ground
  • Ground Plane Stage: allows us to place items on the ground
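In script form, placement boils down to forwarding the Plane Finder's hit-test event to a ContentPositioningBehaviour, which anchors the stage at the tapped point. A minimal sketch, assuming my own class name `TapToPlace` (the default Plane Finder prefab wires this up for you in the Inspector):

```csharp
using UnityEngine;
using Vuforia;

// Hooked to the Plane Finder's "On Interactive Hit Test" event:
// when the user taps a detected ground plane, the Ground Plane Stage
// (with the cube as its child) is anchored at the tapped point.
public class TapToPlace : MonoBehaviour
{
    [SerializeField] private ContentPositioningBehaviour contentPositioning;

    public void PlaceOnHit(HitTestResult result)
    {
        contentPositioning.PositionContentAtPlaneAnchor(result);
    }
}
```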
Fig 1.20 Add Plane Finder and Ground Plane Stage (17.10.2024 - Week 4)

Fig 1.21 Add cube and represent ground plane  (17.10.2024 - Week 4)

Fig 1.22 Test with ground plane simulator  (17.10.2024 - Week 4)

Check for
  • Resolution and Presentation > Render Over Native UI (tick)
  • Metal API Validation: disable
  • Camera Usage Description: AR Camera
  • Target minimum iOS Version: 17
Fig 1.23 Player Setting (17.10.2024 - Week 4)

Fig 1.24 Connect Xcode with my phone device (17.10.2024 - Week 4)


Fig 1.25 Connected! (17.10.2024 - Week 4)


INSTRUCTIONS



Task 1/ Trending Experience

Instructions:
  • Explore the current, popular trends in the market to gain a better understanding of the technologies and the knowledge needed to create content for them.
  • Conduct research and experiments to find out the features and limitations that will later allow us to decide which technologies to proceed with in the final project.
  • Complete all exercises to demonstrate understanding of the development platform fundamentals.
  • Submit the link to your blog post, and make sure all the exercises are updated on your blog.


AR Research

Before proposing my idea, I did some research online to gain a better understanding and gather more inspiration.

14 examples of augmented reality brand experiences
If augmented reality headsets and eyewear become more commonplace, brands and entertainment companies may take the opportunity to become more ambitious with AR, leaning into the potential of experiences that can overlay interactive elements onto reality.



Fig 2.1 AR Research (13.10.2024 - Week 4)


The Usability of AR

Most users are still unfamiliar with AR technology and, unless they have extensive gaming experience, they may be challenged by the icons and patterns used by AR features. If your mobile app uses AR, make users aware of this feature and help them easily locate those items in your app for which it is available.

Help people interact seamlessly with your virtual objects by using clear signifiers, text labels for icons, and step-by-step instructions that are easy to see. Allow them to easily modify the AR object within the AR scene, to prevent the need for recalibration, but ensure that UI icons leave enough room for the AR scene.

Even though most users find delight in augmented reality, do not implement AR for the sake of AR, but, rather, make sure that it adds value to the overall user experience.


Fig 2.2 AR Research (13.10.2024 - Week 4)



Qualities of great AR
Great AR guides people to the right environment, takes advantage of screen space, designs for constant movement, considers ergonomics and a limited field of view, uses depth cues, and limits the duration of the experience to keep people from getting fatigued.


Fig 2.3 AR Research (13.10.2024 - Week 4)


Marker-based vs markerless augmented reality

Marker-based AR

Pros
  • If the marker image is prepared correctly, marker-based AR provides a quality experience and very stable tracking; the AR content doesn’t shake.
  • Easy to use; detailed instructions are not required for first-time users.

Cons
  • When the mobile camera is moved away from the marker, the AR experience disappears and the trigger image has to be scanned again. Extended tracking is possible, but in most cases it makes things worse.
  • Scanning will not work if the marker reflects light in certain situations (this can be challenging with large-format outdoor banners in ever-changing weather conditions).
  • The marker needs strong borders and contrast between black and white to keep tracking stable; smooth color transitions make recognition impossible.
Markerless AR

Pros
  • Once the content is placed in a room, it is more flexible than marker-based alternatives.

Cons
  • The augmented reality content may not make sense in a certain context.
  • For a better experience, the surface needs enough texture for computer vision to recognize it.


Location-based AR


Fig 2.4 Google Map AR Example (13.10.2024 - Week 4)


Pros
  • Allows for geographical targeting in tourist hotspots without requiring expensive outdoor banners
  • Allows for practical applications in terms of directions
Cons
  • It is challenging to get precise geo-located experiences on mobile devices because of the limited accuracy of phone sensors.



Lecture Video

In this video, the speaker introduced affordances in AR (Augmented Reality), which refer to the perceived and actual properties of an object or environment that allow users to interact with it. In AR, affordances guide users on how to interact with virtual elements overlaid on the real world. For example, a 3D button may afford pressing, or a virtual object may afford rotating, swiping, or pulling. Good affordances in AR enhance usability and ensure intuitive interactions, making the experience seamless for users by aligning with real-world behaviors.

Fig 2.5 Lecture Video (13.10.2024 - Week 4)




Task 1 Ideation


I have documented my four ideas in a Google document, each followed by a visual image. For each idea, I included the problem, solution, target audience, process, as well as the pros and cons.

Fig 3.1 Task 1 Ideation Document (16.10.2024 - Week 4)


FEEDBACK

Week 4/Consultation
  • Idea 1 AR in Fashion Store: The user experience can be enhanced by adding a feature that outlines the user's body, allowing them to see how clothes would fit their size. For my future progress, I could also create a virtual fashion store that can be accessed from home, so users wouldn't need to visit a physical store.
  • Idea 2 AR in Instax Film: When users scan the Instax film and a video pops up, this could be done online using AI. However, the user experience needs to be more engaging than just showing event details and dates. Think of ways to make the interaction more meaningful and immersive for the user. 
  • Idea 3 AR in Piano: Mr. Razif mentioned that this idea might be a bit inconvenient and unrealistic because people would have to keep opening their phone camera to scan notes while practicing. This could be a drawback, making it less practical for users during piano practice. 
Consultation with Mr. Razif late at night, 10:30pm, haha! (17.10.2024 - Week 4)



REFLECTIONS

I've been exploring the concept of AR while comparing it with MR and VR. Understanding these differences has been crucial to developing immersive AR experiences. In Week 2's class, we also created a user journey map to empathize with users' needs and emotions, which is very important for creating effective AR interactions.

Initially, I struggled with Unity, but rewatching Mr. Razif’s lectures helped me grasp it better. The feeling of successfully launching my first AR marker was incredibly motivating. I didn't anticipate the number of tools I’d need to download, like Vuforia and Visual Studio Code, which filled up my laptop's storage! I’ve since cleaned up space to continue progressing haha!

The class exercise in Week 1 sparked my interest in creating an AR experience for fashion stores. I’ve included this concept in my documentation, where users can scan garments in-store to receive information like sizes, price and product names in real-time. This idea excites me, and I’m determined to turn it into something innovative and engaging.

