Experiential Design Task 3: MVP Prototype
31.05.2025 - 06.07.2025 (Week 6 - Week 11)
Kiew Ting Yi (Nicole) / 0361143 / Bachelor of Design (Honours) in Creative Media
Experiential Design
Task 3: MVP Prototype
Table of Contents:
1. Class Activity & Proposal Recap
2. Instructions
3. Task 3: Experience Design Prototype
- Progress for Final Outcome
4. Reflection
CLASS ACTIVITY & PROPOSAL RECAP
TASK 2 EXPERIENCE DESIGN PROJECT PROPOSAL
INSTRUCTIONS
According to the Module Information Booklet (MIB), we are to:
- Develop a functioning prototype of your AR experience
- Make a presentation/demo explaining your concept, features, and user interaction
- Develop a functional prototype that showcases key interactions and experience flow
- Implement the core AR functionality using Unity + Vuforia
- Demonstrate user interaction with:
  - A recognisable trigger (e.g., image tracking or ground placement)
  - A clear system response (e.g., visuals, audio, feedback)
- Present a working user journey (start, interaction, output, exit/reset)
- Ensure your prototype reflects your selected idea from Task 2
- Show evidence of testing and iteration
TASK 3 MVP PROTOTYPE
For Task 3, I developed StillMode, one of the main features of my AR prototype FamiliAR. The goal was to let users scan a familiar object and begin a guided breathing experience using AR elements.
Setting Up the Project
First, I created a new Unity project and installed the Vuforia Engine package through the Unity Package Manager. I enabled Vuforia under XR Plug-in Management and added my license key through the Vuforia Configuration. Then, I created an ARCamera and an Image Target in the scene. I uploaded a custom image to the Vuforia Target Manager and used that as my trackable target. I used my friend's artwork because it reminded me to take life less "seariously."
UI and Onboarding Flow
I designed the UI in Figma and exported the assets as PNGs. I built an onboarding screen in Unity using a Canvas with RawImages, including a Start button (to be changed to "StillMode" in the future). When the Start button is pressed, the onboarding UI hides and a scan prompt group appears (consisting of a scan vector and a scan guide). These help direct the user to scan their object.
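Here's a simplified sketch of that wiring (the field names are illustrative, not my exact hierarchy):

```csharp
using UnityEngine;

// Sketch of the Start-button wiring. "onboardingGroup" and
// "scanPromptGroup" are illustrative names for the UI containers.
public class OnboardingFlow : MonoBehaviour
{
    [SerializeField] GameObject onboardingGroup; // Canvas group holding the Start button
    [SerializeField] GameObject scanPromptGroup; // scan vector + scan guide

    // Assigned to the Start button's OnClick event in the Inspector.
    public void OnStartPressed()
    {
        onboardingGroup.SetActive(false); // hide the onboarding UI
        scanPromptGroup.SetActive(true);  // show the scan prompt
    }
}
```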
Image Tracking & Trigger Flow
Once the image is detected, the scan prompt disappears and a
“Familiar Found” image appears briefly. Then the breathing orb
activates and the breathing cycle begins. This is controlled by a script attached to the orb object, which manages three states: Inhale, Hold, and Exhale. I also added a counter using TextMeshPro to display the seconds remaining in each phase.
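Vuforia's Default Observer Event Handler exposes On Target Found and On Target Lost events in the Inspector, so the trigger flow can be wired to them roughly like this (a sketch with illustrative names and timing, not my exact code):

```csharp
using System.Collections;
using UnityEngine;

// Sketch of the trigger flow. These methods are meant to be assigned to
// the Default Observer Event Handler's On Target Found / On Target Lost
// events in the Inspector; the names and the 1.5 s delay are illustrative.
public class StillModeTrigger : MonoBehaviour
{
    [SerializeField] GameObject scanPromptGroup;
    [SerializeField] GameObject familiarFoundImage;
    [SerializeField] GameObject breathingOrb;

    public void HandleTargetFound()
    {
        scanPromptGroup.SetActive(false);
        StartCoroutine(ShowFoundThenStartOrb());
    }

    public void HandleTargetLost()
    {
        // Only bring the prompt back if the session hasn't started yet,
        // so a brief tracking loss doesn't interrupt the breathing cycle.
        if (!breathingOrb.activeSelf)
            scanPromptGroup.SetActive(true);
    }

    IEnumerator ShowFoundThenStartOrb()
    {
        familiarFoundImage.SetActive(true);
        yield return new WaitForSeconds(1.5f); // brief confirmation flash
        familiarFoundImage.SetActive(false);
        breathingOrb.SetActive(true);          // the orb script takes over
    }
}
```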
Visual and Audio Elements
Each breathing phase has its own visual prompt ("Breathe", "Hold", "Release") synced with background music. The music plays during Inhale and Exhale and pauses during Hold. I made sure the orb scaled over time using Vector3.Lerp and that each phase transitioned based on timers. There's also a mute button for users who don't want to hear the meditation track.
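A simplified sketch of that orb script, assuming 4-second phases and illustrative field names:

```csharp
using System.Collections;
using TMPro;
using UnityEngine;

// Sketch of the orb script: Vector3.Lerp scales the orb over each timed
// phase, a TextMeshPro label counts down the seconds, and the music
// pauses during Hold. The 4 s durations are illustrative.
public class BreathingOrb : MonoBehaviour
{
    [SerializeField] TMP_Text phaseLabel;   // "Breathe" / "Hold" / "Release"
    [SerializeField] TMP_Text counterLabel; // seconds left in the phase
    [SerializeField] AudioSource music;
    [SerializeField] Vector3 smallScale = Vector3.one * 0.5f;
    [SerializeField] Vector3 bigScale = Vector3.one * 1.5f;

    void OnEnable()
    {
        music.Play(); // start the track once; phases just pause/unpause it
        StartCoroutine(BreathingLoop());
    }

    IEnumerator BreathingLoop()
    {
        while (true)
        {
            yield return Phase("Breathe", smallScale, bigScale, 4f, playMusic: true);
            yield return Phase("Hold", bigScale, bigScale, 4f, playMusic: false);
            yield return Phase("Release", bigScale, smallScale, 4f, playMusic: true);
        }
    }

    IEnumerator Phase(string label, Vector3 from, Vector3 to, float duration, bool playMusic)
    {
        phaseLabel.text = label;
        if (playMusic) music.UnPause(); else music.Pause();

        for (float t = 0f; t < duration; t += Time.deltaTime)
        {
            // Lerp the orb's scale across the phase and update the countdown.
            transform.localScale = Vector3.Lerp(from, to, t / duration);
            counterLabel.text = Mathf.CeilToInt(duration - t).ToString();
            yield return null;
        }
        transform.localScale = to;
    }
}
```

Running the cycle as a coroutine also means exiting StillMode only needs to deactivate the orb object; Unity stops the loop automatically.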
Reset & Exit Logic
I added an Exit/Back button that lets users return to the
onboarding state. Pressing Exit resets everything: the UI, the orb,
the audio, and the tracking state. I created a full loop so users
can start over without restarting the app. I also included a speaker toggle button that turns off the audio
output without interrupting the timing of the breath cycle.
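A rough sketch of what the reset does (again with illustrative names):

```csharp
using UnityEngine;

// Sketch of the Exit/Back reset: deactivate the session objects, stop
// the audio, and return to onboarding so the loop can start over
// without relaunching the app. Field names are illustrative.
public class ExitReset : MonoBehaviour
{
    [SerializeField] GameObject onboardingGroup;
    [SerializeField] GameObject scanPromptGroup;
    [SerializeField] GameObject breathingOrb;
    [SerializeField] AudioSource music;

    // Assigned to the Exit/Back button's OnClick event.
    public void OnExitPressed()
    {
        breathingOrb.SetActive(false);   // deactivating it also stops its coroutine
        music.Stop();                    // rewind the track for the next run
        scanPromptGroup.SetActive(false);
        onboardingGroup.SetActive(true); // back to the start screen
    }
}
```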
Challenges I Faced
There were many problems I encountered:
- The breathing animations were out of sync or not looping correctly
- The scan prompt group kept reappearing due to improper observer state handling
- Buttons stopped working when placed in the wrong canvas layer
- Unity didn't recognise image tracking loss properly at first
- The music kept restarting with every cycle instead of continuing smoothly
- Unity wouldn't export to iOS until I raised the minimum iOS version to 15.0
- Having to redo work because I accidentally deleted some packages
Fixing these issues involved:
- Using the Vuforia Default Observer Event Handler for event-based control
- Deleting and recreating the Library/ folder to fix build errors
- Cleaning Scripting Define Symbols
- Waiting for Unity to finish compiling before building
- Setting AudioSource.volume = 0 instead of stopping the background music (see the sketch after this list)
- Moving UI elements to separate canvases for layering and click detection
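The volume fix is worth a quick illustration: setting AudioSource.volume to 0 silences the output while the clip keeps playing, so the breath timing never drifts. A minimal sketch with illustrative names:

```csharp
using UnityEngine;

// Sketch of the speaker toggle: zero the volume instead of calling
// Stop() or Pause(), so the clip keeps advancing and the breath cycle's
// timing is untouched. Names are illustrative.
public class SpeakerToggle : MonoBehaviour
{
    [SerializeField] AudioSource music;

    bool muted;

    // Assigned to the speaker button's OnClick event.
    public void Toggle()
    {
        muted = !muted;
        music.volume = muted ? 0f : 1f; // playback position keeps advancing
    }
}
```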
Final Outcome
In the final version, StillMode includes:
- Onboarding UI with start and pressed states
- Scan guide and vector animations
- Familiar object detection with image-based feedback
- A glowing orb with smooth inhale, hold, and exhale transitions
- Synchronized audio and visual cues
- Breath counter using TextMeshPro
- Fully functional Exit and Speaker buttons
- Complete restart loop after each cycle
Testing on iPhone
After resolving multiple build issues, I was finally able to export
the prototype to my iPhone using Xcode. I had to ensure:
- Minimum iOS version was set to 15.0
- NSCameraUsageDescription was added to the Info.plist
- iOS Build Support was installed through Unity Hub
- The build was signed correctly under my Apple ID
Once launched, the app worked as expected on the device.
Summary
StillMode is now fully functional and serves as a clean, testable
prototype for guided AR breathing. It helped me learn how to manage
Unity UI, Vuforia tracking, audio timing, button logic, and build
settings all at once.
This gives me a solid foundation to continue developing the next
feature: WalkMode.
PROTOTYPE VIDEO
Final Alpha Prototype on iPhone 11
VIDEO PRESENTATION
REFLECTIONS
Looking back, I felt overwhelmed at many points during the process, especially when things kept breaking and I didn't know why. There were moments I genuinely thought I wouldn't be able to finish the prototype. I was frustrated when the buttons wouldn't respond, when pressing Back wouldn't loop cleanly back to the start of the cycle, when tracking failed for no apparent reason, when my file crashed because I removed some packages, and when Unity or Xcode kept throwing silent errors. At the same time, I felt determined to figure things out, and every small breakthrough gave me a sense of progress. Getting the orb to sync with the audio, seeing the app actually run on my phone, and watching it reset smoothly made me feel proud. It wasn't perfect, but I learned so much through the struggles, and now I feel more confident with Unity and with building interactions that feel complete. I'm excited to keep building it for my final submission and can't wait to see my prototype become a usable MVP.