EXPERIENTIAL DESIGN FINAL COMPILATION & REFLECTION

18.11.2024 - 02.12.2024 / Week 9 - Week 11

Emily Goh Jin Yee / 0357722 / Bachelor of Design (Honours) in Creative Media 

Experiential Design/ MMD60204 / Section 01

FINAL COMPILATION & REFLECTION


TABLE OF CONTENTS

1. LECTURES

2. INSTRUCTIONS

3. TASKS

4. FEEDBACK

5. REFLECTION


LECTURES


All lectures were completed in Task 1.


INSTRUCTIONS



TASKS

FINAL COMPILATION & REFLECTION

In the prototyping phase, our goal is to emphasize the main features of the AR application. At this stage, it's not required to include every asset or the full application flow. The focus should be on presenting the core functionalities that best represent the app's primary purpose and the user experience it aims to deliver.


HIGHLIGHTS OF PREVIOUS TASK:

FINAL PROPOSAL DOCUMENT

Fig T2.8 Final Proposal Document
*misspelled 'Solway' font, please ignore





PROCESS

Based on the feedback suggesting a story where the animation changes as users turn the carton, I've made some updates to my proposal to incorporate this dynamic element.

To begin the task, I’ve started by creating a frame-by-frame animation using Procreate. This animation will be triggered when the image target is detected. In the current version, Oatie appears and greets the users in a short sequence. While it’s still a work in progress and doesn't yet include sound, this initial animation sets the tone for the interactive experience. As I continue refining it, I plan to add more frames and transitions to ensure that the animation evolves as the user interacts with the carton, creating a more engaging and immersive experience.

Fig T3.1 Animation Sketch on Procreate


When the image target (front of the carton) is detected, the video appears:
Fig T3.2 Animation Outcome for the welcome page



UNITY PROCESS

Setting up Unity took me quite a while, as I had forgotten how to do it. After rewatching the recordings and with my classmate's guidance, I was able to complete the setup successfully.


1. After creating a new project in Unity, import the Vuforia file into Unity's 'Resources'.
2. Add the License Key.
3. Import the package into Assets.
4. Window > Package Manager
5. In Build Settings, select the desired platform and click 'Add Open Scenes'.
6. Back in Vuforia, in the Target Manager, generate a new database and give it a name.
7. In the database, add the image target.









To kick off the project, I focused on the landing page, which plays a vital role in the AR app's user experience. I began by creating a scene named "Landing Page" and set up a canvas within it. On this canvas, I added the brand logo to establish the app's identity, along with a button that directs users to the scanning page. This simple yet functional design aims to provide an easy entry point into the app, ensuring users can quickly navigate to the next step in their interactive journey.

Fig T3.3 landing page process

Fig T3.4 button colour

In the Inspector panel, I selected the 'MySceneManager.cs' script, dragged the Canvas into the On Click () slot (set to Runtime Only), and entered the name of the scene users will be directed to after clicking the button.
Fig T3.5 inspector panel-on click
(button that directs users to the scanning page)

Fig T3.6 MySceneManager Script
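A minimal version of this scene-switching script might look like the sketch below; the actual MySceneManager.cs may differ, but the idea is a public method the button's On Click () event can call with the target scene's name:

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Attached to the Canvas; the button's On Click () event calls LoadScene,
// with the destination scene name typed into the Inspector field.
// The destination scene must be added in Build Settings ('Add Open Scenes').
public class MySceneManager : MonoBehaviour
{
    public void LoadScene(string sceneName)
    {
        SceneManager.LoadScene(sceneName);
    }
}
```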




Scanning Page Scene:

Hierarchy Panel > Vuforia Engine > AR Camera
After adding the AR Camera, delete the Main Camera.

Hierarchy Panel > Vuforia Engine > Image Target
Inspector Panel > Type: From Database > select the database and image target

Fig T3.7 image target

Fig T3.8 settings

Hierarchy Panel > Video Player
  • Drag the video from Files into Video Clip under the Video Player section.
  • Drag the AR Camera into the Camera field.
  • Aspect Ratio: Fit Vertically/Stretched
  • Add the new script 'PlayVideoOnDetection'.

Fig T3.9 Video Player process

Fig T3.10 Video Player Inspector


Creating the script turned out to be a challenging and time-consuming process, as it required more than 10 attempts. The TDS ChatGPT frequently provided outdated or incorrect scripts, which made troubleshooting particularly frustrating. I had to spend significant time navigating through Unity to identify and resolve various problems and errors in the script.

Eventually, I managed to get a version of the script that was error-free. However, when testing the AR camera, the video started playing automatically without requiring the image target to be scanned. To address this issue, I disabled the "Play on Awake" function, but the video still kept turning on by itself. After further trial and error, I turned to ChatGPT again for assistance, and finally, I was able to obtain a working script that resolved the issue. Here is the finalized script I used:

Fig T3.11 PlayVideoOnDetection Script
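For reference, a script with this behaviour could be sketched roughly as below. This is an assumption of how the final version works, not the exact script: it disables Play On Awake in code as a safeguard, and exposes two public methods that can be hooked up to the Image Target's DefaultObserverEventHandler (On Target Found / On Target Lost events) in the Inspector:

```csharp
using UnityEngine;
using UnityEngine.Video;

// Plays the video only while the image target is detected.
// Wire OnTargetFound/OnTargetLost to the Image Target's
// DefaultObserverEventHandler events in the Inspector.
public class PlayVideoOnDetection : MonoBehaviour
{
    public VideoPlayer videoPlayer;

    void Start()
    {
        // Guard against the video auto-playing before the target is seen.
        videoPlayer.playOnAwake = false;
        videoPlayer.Stop();
    }

    public void OnTargetFound()
    {
        videoPlayer.Play();
    }

    public void OnTargetLost()
    {
        videoPlayer.Stop();
    }
}
```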



Next, I attempted to navigate from the landing page’s button to the scanning page, but initially, it wasn’t working. After some investigation, I realized that the issue was due to the MySceneManager script not being assigned to the inspector panel of the image target. Once I added the script to the correct location in the inspector, the navigation functioned properly, and the transition between the pages worked as expected!




OUTCOME OF THE CURRENT PROGRESS

Fig T3.12 Final Outcome of project for Task 3


Throughout the process, I encountered numerous challenges, which resulted in the prototype having fewer features and interactive elements than initially planned. However, this experience has helped shape a clearer direction for the final project. While the current prototype may seem quite basic, it serves as a foundational step, and I am committed to refining and expanding it in the final version to fully realize the project’s potential.



Continuation:
To continue the task, I drew the animation for the Ingredients page in Procreate. The animation is simple and short, showing the main ingredients of the oat milk.


Illustrations and animation video from Procreate






To each video, I added an Audio Source component so the music plays while the video is playing, since I couldn't upload the video with its background music embedded.



I wrote the descriptions and added two buttons: one to go back to the home page and another to navigate to the values page. For both buttons on the Ingredient Page, I applied animation by changing their colors to grab the users' attention. Initially, the buttons and page description were displayed immediately after scanning the image target, along with the video. To improve the flow, I created a script to ensure that the buttons and description only appear after the video has ended (on the last frame).
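The reveal-after-video behaviour described above could be sketched like this (an assumed implementation; 'uiGroup' is a hypothetical parent GameObject holding the two buttons and the description text). It uses the VideoPlayer's loopPointReached event, which fires when the clip reaches its last frame:

```csharp
using UnityEngine;
using UnityEngine.Video;

// Hides the buttons and description at start and reveals them
// only once the video has finished playing.
public class ShowUIAfterVideo : MonoBehaviour
{
    public VideoPlayer videoPlayer;
    public GameObject uiGroup; // parent of the buttons + description

    void Start()
    {
        uiGroup.SetActive(false);
        videoPlayer.loopPointReached += OnVideoEnded;
    }

    void OnVideoEnded(VideoPlayer vp)
    {
        uiGroup.SetActive(true);
    }
}
```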







The "Our Values" page highlights key principles that showcase the brand's commitment to the health and well-being of its consumers. It includes points such as "No Added Sugar," "Plant-Based," and other values that emphasize the brand's dedication to providing wholesome and mindful products.

Lastly, I encountered an issue where I was unable to export the project successfully. As I mentioned in the first task, I had trouble installing the virtual server on my laptop. For this task, I attempted to test on three different Android devices, but none of them worked. I even reached out to a friend for assistance, but we couldn’t pinpoint the problem. Fortunately, Mr. Razif was understanding and allowed me to showcase my Unity project (AR app) outcome through an alternative method.

I tried screen recording on my laptop, but it was too laggy and didn't work well. So, I decided to use my iPad to record my laptop's screen while in the Unity Editor. It took several attempts since the recording was still a bit choppy, but the current attempt turned out to be the best one.


Final AR App Walkthrough

Final AR App Walkthrough Presentation







FEEDBACK


WEEK 9

General Feedback

  • -

Specific Feedback 

    • can't just have buttons and information
    • use front and back of carton
    • when flip to different side, mascot moves in
    • video form or animate in unity
    • create story for product or select product with story
    • cut down info


REFLECTION

Experience

Developing the prototype was a challenging and time-consuming process, even though the outcome seems simple. After completing all the tasks, I realized that working with AR and Unity isn't my strongest suit, despite watching numerous tutorials. This module demanded a lot of attention to detail, constant refinement of scripts, and managing various elements, which made it quite overwhelming. I was also juggling other modules, and my poor time management only added to the difficulty.

I struggled to keep up with the workload and, unfortunately, couldn't complete everything within the given timeframe. While I know there's plenty of room for improvement, I can honestly say I gave it my best, considering the limited time I had. Thankfully, Mr. Razif kindly allowed me to submit my work after the deadline, which gave me some breathing room.

The process of completing this task was hectic, filled with problems and the need to constantly learn new skills. The biggest challenge was not being able to export my completed project to a phone. From the first task, I had already faced difficulties with installing the virtual server, and this time, even after trying three different Android devices, I still couldn't get it to work. Mr. Razif suggested showcasing the Unity Editor instead, so I decided to record the functionality using my iPad. After several attempts, I managed to create a recording that was as smooth as possible, and while it wasn't perfect, it was the best I could do under the circumstances.

Although the results could be better, I've come to appreciate the process more than the outcome. Every step was a learning experience, and I'm proud of the effort I put into overcoming the challenges along the way.

Observation

Through this process, I observed that creating an AR prototype requires a strong grasp of scripting and attention to detail, as even minor issues can disrupt the entire functionality. It became clear that the software's technical limitations, such as difficulty exporting projects to mobile devices, can significantly hinder the workflow. I also noticed that troubleshooting requires patience and creative problem-solving, as well as adaptability when things don't go as planned. Recording the Unity Editor as a workaround taught me to think outside the box when faced with unexpected challenges.

Findings

Throughout the development process, I found that AR projects require significant attention to detail, as even small errors in scripts or settings can cause major functionality issues. Exporting AR applications to mobile devices proved to be particularly challenging, highlighting the need for technical expertise and troubleshooting skills. Time management emerged as a critical factor, as balancing this module with other coursework was difficult, and better planning could have eased the workload. Additionally, I learned the importance of adaptability and creative problem-solving, as I had to find alternative ways, such as recording the Unity Editor, to present my work when technical barriers arose. Finally, the experience reinforced the value of user-centric design, where features like animations, button functionality, and seamless navigation are essential to improving the overall usability of an AR application.


