EXPERIENTIAL DESIGN TASK 1: TRENDING EXPERIENCE

23.09.2024 - 14.10.2024 / Week 1 - Week 4

Emily Goh Jin Yee / 0357722 / Bachelor of Design (Honours) in Creative Media 

Experiential Design/ MMD60204 / Section 01

Task 1: Trending Experience


TABLE OF CONTENTS

1. LECTURES

2. INSTRUCTIONS

3. TUTORIAL

4. TASKS

5. FEEDBACK 

6. REFLECTION


LECTURES

Week 1 - LECTURE


Week 2 - LECTURE


Week 2 - TUTORIAL


Week 3 - TUTORIAL


Week 4 - TUTORIAL





INSTRUCTIONS





TUTORIAL

WEEKLY EXERCISES 


WEEK 1: TRYING OUT AN AR EXPERIENCE
In class, we used our phones to try out an AR experience by viewing 3D animal models from Google. We also explored a project by Fashion Design students who created 3D models for viewing in AR, allowing us to see how their designs look in augmented reality.

Fig W1.1 trying out an AR experience, Week 1 (23/09/2024)



CREATING MARKER-BASED EXPERIENCE USING UNITY & VUFORIA

Unity: A platform for creating and managing complete AR experiences, encompassing 3D models, user interfaces, and interactions.

Vuforia: A plugin that enhances Unity by providing image recognition and tracking features essential for marker-based augmented reality.



1. SET UP UNITY AND VUFORIA 
Unity and Vuforia were installed. I followed a tutorial video on YouTube to set up the project: first I deleted the Main Camera in the Hierarchy panel and added an AR Camera, then I imported the Vuforia package into Unity.


2. CREATE MARKER

 
Fig W1.2 process, Week 1 (29/09/2024)

In Vuforia's Target Manager, I added an image target by uploading the JPEG file. Then I created a license and copied the app license key from Vuforia into Unity.

Fig W1.3 target manager in Vuforia, Week 1 (29/09/2024)

Fig W1.4 importing image into Unity, Week 1 (29/09/2024)

Fig W1.5 imported image, Week 1 (29/09/2024)


3. ATTACH AN OBJECT TO THE MARKER

A cube (Hierarchy > 3D Object > Cube) was added and positioned at the center of the image target. The cube was slightly too big, but I only resized it in Week 2.

Fig W1.6 placing cube above image, Week 1 (29/09/2024)

After that, I tested the AR object (the cube) in Unity using the laptop's camera.

Fig W1.7 trying out the marker-based AR on camera, Week 1 (29/09/2024)

Fig W1.8 trying out the marker-based AR on camera(video), Week 1 (29/09/2024)



WEEK 2 CLASS EXERCISE: JOURNEY MAPPING

During Week 2, we were divided into groups of six to collaborate on creating a journey map by selecting a location of our choice. Using Miro, we worked together to document the key steps of the journey, focusing on identifying Gain Points, Pain Points, and proposing Solutions for each stage of the user's experience. After that, we presented the map to the class. This exercise helped us analyze and improve the customer journey by understanding both the positive aspects and areas that need enhancement.
Fig W2.1 Genting Skyworlds Theme Park current user journey map, Week 2 (30/09/2024)

Fig W2.2 Genting Skyworlds Theme Park future user journey map, Week 2 (30/09/2024)

This week, we were taught how to create marker-based AR, which I had already done the previous day because I thought it had to be submitted then. So, I just resized the cube to make it smaller.



WEEK 3 TUTORIAL: USER CONTROLS AND UI (BUTTONS)

We learned how to create buttons and how to animate the cube. After installing the required build support, I switched the build platform to iOS.

Fig W3.1 changing platform to iOS, Week 3 (07/10/2024)
Fig W3.2 process, Week 3 (07/10/2024)
Fig W3.3 changing colour and font size for the button, Week 3 (07/10/2024)



Creating Script
I created a script and named it SceneManagerr. There were two ways to write the code: one is gotoScene(), which calls LoadScene("x") with a fixed scene name, and the other is gotoCustomScene(string sceneName), which can load any scene whose name is passed in later.
Fig W3.4 script: gotoScene() Loadscene("x"), Week 3 (07/10/2024)

Fig W3.5 script: gotoCustomScene(string sceneName), Week 3 (07/10/2024)
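A rough sketch of what the finished script looked like (my reconstruction using Unity's SceneManagement API; the hard-coded "Menu" scene name is just an example):

using UnityEngine;
using UnityEngine.SceneManagement;

// Attached to an empty GameObject; the UI buttons' On Click events call these methods.
public class SceneManagerr : MonoBehaviour
{
    // First way: the scene to load is fixed inside the method.
    public void gotoScene()
    {
        SceneManager.LoadScene("Menu");
    }

    // Second way: the scene name is typed into the button's On Click field,
    // so the same method can load any scene written later on.
    public void gotoCustomScene(string sceneName)
    {
        SceneManager.LoadScene(sceneName);
    }
}

Each button then only needs the SceneManagerr object dragged into its On Click slot and the relevant method selected (Fig W3.6).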

Fig W3.6 adding gotoscene, Week 3 (07/10/2024)



Create Animation
Go to Window > Animation > Animation, then dock the panel at the bottom. Select the child object of the image target (the cube) and create a new animation. Click the red button to start recording, move the cube, then copy the keyframe and paste it at the 1:00 mark.
Fig W3.7 adjusting the cube in animation, Week 3 (07/10/2024)

Fig W3.8 types of animation, Week 3 (07/10/2024)

Fig W3.9 drag cube to 'on click', Week 3 (07/10/2024)

Fig W3.10 outcome of animated cube with buttons, Week 3 (07/10/2024)
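Instead of dragging the cube into the button's On Click slot, the same trigger could also be written as a small script; this is only a sketch, and the "CubeMove" state name is a placeholder rather than what the tutorial actually used:

using UnityEngine;

// Plays the cube's recorded animation when a UI button calls PlayCubeAnimation().
public class CubeAnimationTrigger : MonoBehaviour
{
    [SerializeField] private Animator cubeAnimator; // drag the cube's Animator in here

    public void PlayCubeAnimation()
    {
        // "CubeMove" is a placeholder name for the recorded animation state.
        cubeAnimator.Play("CubeMove");
    }
}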



Screen Navigation
I added new scenes named 'Menu' and 'Credits'. The buttons take users to the different scenes.
Fig W3.11 creating new scene, Week 3 (07/10/2024)

Fig W3.12 scenes, Week 3 (07/10/2024)

Fig W3.13 menu page and credits page, Week 3 (07/10/2024)



Video as Virtual Content

Create a Quad via Hierarchy > 3D Object > Quad. In the Inspector panel, click Add Component and search for Video Player. Drag the video provided in the Drive into the Video Clip field.

Fig W3.14 Inspector Panel, Week 3 (07/10/2024)

Select the image target and, in its Inspector panel, drag the quad into the event slots under On Target Found and On Target Lost so they call the Video Player's Play and Pause. When the image target is detected the video plays automatically; when it is lost, the video pauses.
Fig W3.15 events when target is found section, Week 3 (07/10/2024)

Fig W3.16 outcome of video on image target, Week 3 (07/10/2024)
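Written out as code, what those two event slots do is roughly the following (a sketch assuming the quad's Video Player component is referenced; the method names are mine, chosen to mirror the event names):

using UnityEngine;
using UnityEngine.Video;

// The image target's On Target Found / On Target Lost events call these methods,
// so the video on the quad plays while the marker is visible and pauses when it is lost.
public class TargetVideoController : MonoBehaviour
{
    [SerializeField] private VideoPlayer videoPlayer; // the quad's Video Player component

    public void OnTargetFound()
    {
        videoPlayer.Play();
    }

    public void OnTargetLost()
    {
        videoPlayer.Pause();
    }
}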



WEEK 4 TUTORIAL: CREATING MARKER-LESS AR

This week, we learned how to create marker-less AR, which requires exporting the Unity project to our phones.

Fig W4.1 process, Week 4 (14/10/2024)

The ground plane image stands in for the ground in real life: we can test whether the AR works on this image, and if it does, it will most likely work in the real world too.

Fig W4.2 given image- ground plane, Week 4 (14/10/2024)

Fig W4.3 testing AR on ground plane image, Week 4 (14/10/2024)



Exporting AR Experience to Mobile Device

Download the virtual server files from Google Drive.
Fig W4.4 Google Drive Virtual server files, Week 4 (14/10/2024)

After downloading the files, I followed the steps to install the macOS ISO into the virtual server.
Fig W4.5 process, Week 4 (14/10/2024)

I tried to create a new virtual machine running Sonoma; however, it didn't work, as it got stuck on a page asking me to connect a Bluetooth keyboard, which I didn't have. Therefore, I had to install Sequoia and try again.



*Currently still facing problems with the installation.*
Update: after many attempts and several hours of reinstalling, it still wasn't working, so I decided to use an Android device instead.




WEEK 5 TUTORIAL

Sphere: disable the cube under the image target and make the sphere move up and down when tapped. Remove the 'Toggle' button.
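A minimal sketch of how the tap interaction could be scripted (my reconstruction, not the exact class from the tutorial; it assumes the sphere keeps its default collider):

using UnityEngine;

// Tapping (or clicking) the sphere toggles it between its starting position and a raised one.
public class SphereTapMover : MonoBehaviour
{
    [SerializeField] private float raiseHeight = 0.5f; // how far the sphere moves up

    private Vector3 downPosition;
    private bool isUp;

    private void Start()
    {
        downPosition = transform.localPosition;
    }

    private void Update()
    {
        // GetMouseButtonDown(0) also fires for a screen tap on mobile.
        if (Input.GetMouseButtonDown(0))
        {
            Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
            if (Physics.Raycast(ray, out RaycastHit hit) && hit.transform == transform)
            {
                isUp = !isUp;
                transform.localPosition = downPosition + (isUp ? Vector3.up * raiseHeight : Vector3.zero);
            }
        }
    }
}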




WEEK 6 TUTORIAL






WEEK 9 TUTORIAL

Camera Point to Object: when the crosshair is aimed at the shape, an effect occurs (e.g. a colour change on hover).
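A rough sketch of how that hover effect could be put together (an assumption on my part: a ray is cast from the centre of the screen every frame and whatever it hits gets tinted):

using UnityEngine;

// Casts a ray from the middle of the viewport (where the crosshair sits) each frame.
// The object under the crosshair is tinted, and its original colour is restored
// once the crosshair moves off it.
public class CrosshairHighlighter : MonoBehaviour
{
    [SerializeField] private Color highlightColor = Color.yellow;

    private Renderer lastHit;
    private Color originalColor;

    private void Update()
    {
        Ray ray = Camera.main.ViewportPointToRay(new Vector3(0.5f, 0.5f, 0f));

        Renderer current = null;
        if (Physics.Raycast(ray, out RaycastHit hit))
        {
            current = hit.collider.GetComponent<Renderer>();
        }

        if (current != lastHit)
        {
            if (lastHit != null) lastHit.material.color = originalColor; // restore the previous object
            if (current != null)
            {
                originalColor = current.material.color;
                current.material.color = highlightColor;
            }
            lastHit = current;
        }
    }
}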





WEEK 10 TUTORIAL

We created an avatar and learned how to animate it in Unity.






TASKS

TASK 1: TRENDING EXPERIENCE


WEEK 1: INTRODUCTION TO EXPERIENTIAL DESIGN

In Week 1, we reviewed the module details, and Mr. Razif showed us an example of past students' work to clarify expectations. He introduced exercises to explore current market trends, helping us understand new technologies and how to create content for them. Through research and experimentation, we will learn their features and limitations, guiding our decisions for the final project.

Before exploring the apps, I used the TDS XD Assistant to check out the differences between the types of AR:

Fig T1.1 Differences of the types of AR


WEEK 1 EXERCISE

Imagine a scenario in one of the following places: what would the AR experience be, and what extended visualization would be useful? What do you want the user to feel? (e.g. Kitchen, Shopping Mall, Gym):

1. AR Shopping Mall Experience: Personalized Fashion Adventure

I imagined an AR scenario in a shopping mall where users embark on a personalized fashion adventure. Upon entering the mall and launching the AR app, users create a profile that includes their style preferences, favorite stores, and sizes. This information allows the app to suggest outfits tailored specifically to their tastes.

As users navigate the mall, the app utilizes GPS to detect their location and displays a map highlighting nearby stores with outfit suggestions. Users can point their phones at clothing racks, and the AR app overlays virtual models wearing different outfits in various colors and styles, enabling them to visualize how the clothes would look on them.

Additionally, users can scan their bodies to create a personalized avatar, allowing them to see how different outfits fit in real time. When they select an outfit, the app provides instant feedback, saying things like, “This bold color would look great on you!” or “Try these accessories to complete the look.”

If users fall in love with an outfit, they can purchase it directly through the app, with options for in-store pickup or delivery, making the shopping experience seamless and enjoyable.

In conclusion, this AR experience empowers users to make informed fashion choices while feeling excited and connected as they explore the mall.


2. Kitchen AR Experience: Smart Cooking Assistant

I imagined an AR scenario in a kitchen where users prepare meals with the help of a smart cooking assistant. As users start cooking, they open the AR app and scan their ingredients using their phone’s camera. The app recognizes items like vegetables, meats, and spices, suggesting recipes based on what they have available.

Once a recipe is selected, users can place their phones on a stand or countertop, allowing them to view the screen while cooking. The app overlays step-by-step cooking instructions directly onto the user’s kitchen space, showing a virtual chef demonstrating how to chop vegetables, making it easier to follow along.

As users cook, the app tracks their progress and provides tips, such as “Reduce the heat to avoid burning” or “Add a pinch of salt for flavor.” They can also interact with the app by tapping on the screen to receive variations to the recipe based on dietary preferences, like “Try using quinoa instead of rice for a healthier option.”

In conclusion, this AR cooking assistant makes meal preparation more accessible and enjoyable, empowering users to create delicious dishes while building their cooking skills.




FINAL AR IDEAS PROPOSAL

Fig T1.2 Final AR Ideas Proposal




FEEDBACK


WEEK 1

General Feedback 

  • no feedback

WEEK 2

General Feedback

  • no feedback

Specific Feedback 

  • download the macOS virtual server on Windows

WEEK 3

General Feedback

  • download the software before class

Specific Feedback 

  • -

WEEK 4

General Feedback

  • try out the AR apps

Specific Feedback 

  • download the macOS virtual server on Windows

WEEK 5

General Feedback

  • no feedback

Specific Feedback 

  • if Sonoma doesn't work, try Sequoia


REFLECTION

Experience

This module has given me insights into AR and interactive UIs. Although it was just the first task of the module, I found myself struggling to enjoy what I was doing, and I had difficulty keeping up with the process and steps during our tutor's lessons in class. Fortunately, recorded lectures and tutorial videos were provided on MyTimes, which let those of us who couldn't follow the steps in class watch, pause, and repeat the videos, and I was one of them. There were so many steps when creating the ARs, and I don't think I could remember the code in the scripts without following the tutorial. Since I'm a Windows and iPhone user, exporting the project to my phone was more complicated: I had to download the virtual server, which was very time-consuming, and I experienced several crashes when trying to open it. As for the AR ideas in the proposal, I still don't quite understand what to do. It feels like I won't be able to pull the ideas off, haha.

Observation

Throughout the weeks, I learned about marker-based and markerless augmented reality (AR) during the tutorials. I found that marker-based AR offered a structured interaction with digital content through specific markers, while markerless AR provided more flexibility in overlaying content in real-world environments. When creating buttons and navigation for different pages, I focused on intuitive design and user-friendly interfaces, ensuring that buttons were easily identifiable and transitions were seamless. This experience deepened my understanding of AR technology and the importance of user experience in developing engaging applications.

Findings

The module highlighted the practical applications of augmented reality (AR) in everyday life, showcasing its potential to enhance experiences in various contexts. From interactive learning tools for children to innovative shopping solutions that provide real-time information, AR can significantly improve how we interact with our environment. It can simplify complex tasks, like cooking or pet care, by offering visual guidance and step-by-step instructions. Overall, the insights gained from this module emphasized AR's ability to blend digital content with the physical world, making daily activities more engaging and efficient.


