21.04.2025 - 12.05.2025
Michael Chan Henn Loong / 0363611
Experiential Design / BA of Design (HONS) in Creative Media / Taylor's
University
Task 1: Trending Experience
INSTRUCTIONS
<iframe
src="https://drive.google.com/file/d/1NlmiGLTVEt-aZ1HcWis5wnyquBrK9mCv/preview"
width="640" height="480" allow="autoplay"></iframe>
Weekly Reflections on Class Activities & Exercises
We were briefed on the MIB in Week 1 and were tasked to decide that same day whether we would be doing our tasks individually or in pairs with a partner of our choice. Ian and I decided to work together throughout the whole semester for the Experiential Design module to make things easier for the both of us.
I was absent in Week 2, so there's not much to share there.
In Week 3, Mr. Razif gave us an overview of what AR, MR and VR consist of, and shared some short videos so that we could better understand what is happening in these fields.
Fig 1.0, T-Rex 3D Simulation in class
Fig 1.2, T-Rex 3D Simulation in class
After watching the videos, Mr. Razif told us to search for AR animals like dinosaurs and tigers to check whether our phones support AR, so that we could confirm early on that our devices are capable of handling whatever we will be doing in future classes and tasks.
After that we started setting up Unity by following the instructions Mr. Razif provided in class. First we created a 3D project and selected its save location; once inside Unity, we were asked to switch over to the platform we would be building for, so I switched to iOS since I'm using an iPhone (a rough editor-script equivalent of this step is sketched after the figures below).
Fig 1.3, Platform Switching
Fig 1.4, Installing Module
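For reference, the same platform switch we did through the Build Settings window can also be triggered from an editor script. This is only a minimal sketch using Unity's standard EditorUserBuildSettings API; the class name and menu path are made up for illustration.

```csharp
#if UNITY_EDITOR
using UnityEditor;
using UnityEngine;

// Editor-only sketch: does the same thing as
// File > Build Settings > iOS > Switch Platform.
public static class PlatformSwitcher
{
    [MenuItem("Tools/Switch to iOS")] // hypothetical menu path, purely for illustration
    public static void SwitchToiOS()
    {
        // Returns true if the active build target was changed successfully.
        bool switched = EditorUserBuildSettings.SwitchActiveBuildTarget(
            BuildTargetGroup.iOS, BuildTarget.iOS);
        Debug.Log(switched ? "Switched to iOS." : "Switch failed (is the iOS module installed?)");
    }
}
#endif
```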
After setting up all the modules and the platform, we were asked to create a Vuforia account so that we could download the Vuforia Engine package for Unity.
Fig 1.5, Vuforia Engine Install
After downloading the Vuforia Engine, we pulled it into our Unity project and imported the Unity package.
After importing the Unity package, we added a Vuforia AR Camera to the GameObject hierarchy, and we were told that the settings on the Inspector side control how the AR Camera behaves. After getting familiar with the hierarchy, we moved on to the Vuforia Engine configuration and went back to the Vuforia portal to create our license key.
Fig 1.7, Vuforia License key
Fig 1.8, Unity Insert License Key
We also defined which camera we are using; this double check is needed when there is more than one camera on the laptop, to make sure everything goes smoothly later in the process as well as in all our future projects. A quick way to list the cameras Unity can detect is sketched below.
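This is a minimal sketch using Unity's built-in WebCamTexture API (the class name is just a placeholder); it simply logs every webcam the machine exposes so you can confirm which device will be grabbed.

```csharp
using UnityEngine;

// Logs every camera Unity can detect, handy when a laptop has more than
// one webcam and you want to confirm which device will actually be used.
public class CameraLister : MonoBehaviour
{
    void Start()
    {
        foreach (WebCamDevice device in WebCamTexture.devices)
            Debug.Log($"Camera found: {device.name} (front facing: {device.isFrontFacing})");
    }
}
```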
After checking the camera, we created an Image Target and added a target from a database. To do that we went back to Vuforia's Target Manager and generated a database of type Device. We used an image target for this exercise, and I used an astronaut image for the practice.
Fig 1.9, Astronaut Target Manager
The image is now active. Ideally we would aim for a five-star rating since those are the easiest to scan; anything above three stars is fine, while anything below three would be hard to scan. In future projects we might end up tweaking things thinking our code doesn't work, when really the image just can't be scanned. One way to rule that out is to log the target's tracking status, as sketched below.
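This is a minimal debugging sketch, assuming Vuforia Engine 10's ObserverBehaviour API (the class name is a placeholder); attached to the Image Target, it logs every tracking status change so you can tell whether the image is being detected at all.

```csharp
using UnityEngine;
using Vuforia;

// Logs every tracking status change of the Image Target this script is
// attached to, so "code bug" vs. "unscannable image" is easy to tell apart.
public class TargetStatusLogger : MonoBehaviour
{
    ObserverBehaviour observer;

    void Awake()
    {
        observer = GetComponent<ObserverBehaviour>();
        if (observer != null)
            observer.OnTargetStatusChanged += OnStatusChanged;
    }

    void OnDestroy()
    {
        if (observer != null)
            observer.OnTargetStatusChanged -= OnStatusChanged;
    }

    void OnStatusChanged(ObserverBehaviour behaviour, TargetStatus status)
    {
        Debug.Log($"{behaviour.TargetName}: {status.Status} / {status.StatusInfo}");
    }
}
```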
After that we downloaded the database, making sure it was for the Unity Editor and not Android Studio, Xcode or Visual Studio, since we will be doing all our projects and exercises in the Unity Editor.
Fig 2.0, Download Database for Unity Editor
After the download, we just imported the database into the AR project and added the target. Once the target was added, we created a cube and dragged it onto the Image Target, because whatever we want to spawn when scanning the image target needs to be a "child" of the Image Target. The same parenting can also be done from a script, as sketched after the figure below.
Fig 2.1, Spawn = Child of Image Target
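Dragging the cube in the Hierarchy is the usual way, but here is a minimal sketch of the same parent-child setup done in code with plain Unity APIs (the script name, offset and scale are placeholder values). Attach it to the Image Target so the spawned cube inherits the target's pose.

```csharp
using UnityEngine;

// Spawns a cube as a child of the Image Target so it inherits the
// target's pose and follows it whenever the image is tracked.
public class SpawnCubeOnTarget : MonoBehaviour
{
    void Start()
    {
        GameObject cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
        cube.transform.SetParent(transform, false);                // child of the Image Target
        cube.transform.localPosition = new Vector3(0f, 0.05f, 0f); // placeholder: float slightly above the target
        cube.transform.localScale = Vector3.one * 0.1f;            // placeholder: size relative to the target
    }
}
```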
After the class, Ian and I went for Mr. Razif's consultation and got some feedback on our ideas; all the feedback is written down in the Feedback section.
Research & Exploration
I did some research and dug up five examples throughout the process. I'm a gamer myself, so when I touched on topics like AR, the first things that came to mind were Pokémon GO and Monster Hunter Now, both games developed with Niantic's help. To me, Pokémon GO is still one of the most iconic examples of AR integration: users simply use their devices to find and catch Pokémon overlaid on real-world locations using GPS and the camera's AR features. This game proved to the world that AR can bring the experience to us players in the real world, not just the digital world.
As for Monster Hunter Now, it launched in 2023; I think it's Niantic's latest AR game, made in collaboration with Capcom. Its mechanics are different from Pokémon GO's: Pokémon GO is built around catching, while Monster Hunter Now turns real-world locations into combat zones where players hunt down monsters, as the name suggests. The whole game is controlled with just swipes and taps, which also makes it easier for mobile users.
Fig 2.3, Monster Hunter Now AR
I've also noticed that AR is changing how people shop nowadays, especially in fashion and accessories. I think AR try-ons are a brilliant idea: they help users see how products would look on them before purchasing, which enhances the brand's service, reduces return rates, and makes the experience more fun and interactive.
ZARA has actually run an in-store AR campaign before, replacing the mannequins with AR-activated models. Pointing at specific markers makes a virtual model appear wearing a set of clothes and posing naturally, which lets users check out the outfit and even purchase through the AR interface. I think this idea suits outlets and stores going for minimalist designs, as well as pop-up stores.
Fig 2.4, Zara AR
The other one I know about is GUCCI: using their app, we can try on virtual sneakers through the camera. The shoes anchor to our feet and respond to our movements. Gucci also launched virtual-only shoes (e.g., the Gucci Virtual 25), usable only in digital spaces for photography, social media and other avatar usage. This really shows me how AR has opened a new approach for the fashion market.
Fig 2.5, Gucci AR
Last but not least is Adobe Aero. Adobe Aero is an AR design tool for designers who are not programmers, allowing artists and UI/UX designers to create fun, creative, interactive AR experiences. It supports 2D/3D assets from Photoshop, Illustrator, and third-party tools like Blender.
There is a lot to play with in Aero, but so far the key features I've found are drag-and-drop tools for adding visuals, triggers and animation, and scene anchoring using surfaces, images or geolocation. The most important feature, though, is interactivity without any coding: we can create tap, hover, or proximity-based actions using a visual behavior builder.
I think we could really use it for art exhibitions, AR storytelling, or even portfolio displays where our projects float or respond to the user's interaction. Adobe Aero empowers creatives to prototype in space, not just on screens, bridging the gap between static visuals and immersive experiences.
Sadly, Adobe Aero currently only has a beta version for desktop, which I haven't actually tried; I've only played around a little with the mobile version.
Fig 2.7, Aero Mobile & Desktop
Proposal for 3 AR Project Ideas
After confirming with Ian that we would be working together, we started brainstorming so that we could share our ideas with Mr. Razif in the next class. Ian invited me to a FigJam board where we could keep all our ideas and future work in one place. In the early stage of the task, I focused solely on generating broad, creative AR concepts inspired by real-world environments, everyday interests, and educational use cases. These were captured in five quick ideas:
Idea 5: Music Player + Language Learning App
Idea 6: AR Solar System
Idea 7: Ghost Skater
Idea 8: Invisible Wall
Idea 9: Cool Mural
Fig 2.8, Brainstorming ideas
After creating these five ideas we were supposed to ask for feedback, but since I was absent in Week 2, I started refining my ideas on my own based on feasibility, personal interest, and experiential potential. I filtered the list down to three polished concepts:
1. Music + Language App (Augmented Karaoke)
Problem Statement:
Many language learning apps are static, unengaging, and disconnected from culture. Music, while powerful for learning, is underutilized in interactive ways.
Proposed Solution:
Create an AR karaoke-style experience where users scan album covers, see animated lyrics in two languages, and interact with words for translation and pronunciation. Think Duolingo meets Shazam meets karaoke.
2. Ghost Skater — AR Skate Spot Visualizer
Problem Statement:
Skaters often struggle to visualize new tricks or setups. There's no intuitive way to plan or simulate tricks at physical locations.
Proposed Solution:
Let users scan a real skate spot and view ghost skaters performing AR tricks—like kickflips, grinds, and combos. They can record their own trick paths or compete in ghost races with friends.
3. Lofi Phase Mural — Mood-Based Animated AR Wall
Problem Statement:
Urban murals are static and don't adapt to mood, time, or weather. They’re limited in how users can engage with them emotionally.
Proposed Solution:
Develop a mood-reactive mural that changes based on the time of day (e.g., sunrise, rain, sunset) using AR. Users can listen to chill lo-fi beats while the mural shifts between scenes like clouds drifting, stars fading in, or leaves falling.
Fig 2.9, Improved idea.
After detailing these ideas, we went for feedback in Week 3; all the feedback is recorded below in the Feedback section.
After the feedback session, I made some refinements to my ideas:
1. Music + Language App (Augmented Lyrics AR)
Problem:
Traditional music apps like Spotify are not designed for language learning. Lyrics are static and often only available in one language. Users can't interact or learn pronunciation or meaning easily.
Improved Solution:
Create an interactive AR music language app where users scan an album cover (or visual marker), and bilingual lyrics appear dynamically in AR space — along with pronunciation buttons, translation toggles, and animated word effects.
Instead of competing with Spotify, this app becomes a learning companion:
- Users import Spotify tracks or link their Spotify playlist
- The app overlays interactive educational content on top (not to replace Spotify, but to enhance it)
What makes it unique:
- Focused on language acquisition through music, not just music listening
- AR lyrics that float around the cover, with clickable words for:
  - Translating
  - Hearing pronunciation
  - Pinning difficult words
- Album covers come alive with subtle animated elements (e.g., pulses, rhythm waves)
How to make it immersive:
- Support voice pronunciation recording for feedback
- Animated lyrics respond to the music's tempo
- AR stickers/reactions for sharing lyrics on social media
Fig 3.0, Music Language App
2. Ghost Skater — AR Skate Spot Visualizer (Lite Version)
Problem:
Full-scene scanning and real-time animated racing modes are too complex for the current scope. Animations are a workload challenge, especially in dynamic environments.
Improved Solution:
Scale the concept down to site-specific AR trick visualizations. Instead of full ghost races, users scan a known or preset marker (e.g., a real-world rail or bench), and the app shows a single trick animation (like a ghost skater doing a flip or grind).
This version is more practical and modular:
What makes it unique:
- Focused on trick learning and visualization, not gaming
- User-generated trick uploads: you can upload your own trick and share the path
- Includes slow-mo AR playback
How to make it immersive:
- Use AR overlays like trails, sparks, or motion ghosts
- Add spatial audio for the board's sound on different surfaces
- Optional: allow users to draw paths for the ghost to follow
3. Lofi Phase Mural — AR Mood Wall (User-Contributed)
Problem:
Creating fully animated murals that adapt to user art is challenging, and syncing animations to the original artwork can come across as abstract or unclear.
Improved Solution:
Instead of making murals user-generated from scratch, build the experience around preset mural templates that respond to environmental mood (time, weather, music). Allow users to contribute small elements (e.g., stickers, mood notes, or filter overlays) as part of a living wall.
Think of it like an ambient AR wall + community moodboard.
What makes it unique:
- The mural shifts mood based on time of day and ambient sound levels
- Users contribute lightweight edits (stickers, mood tags) instead of full artwork
- Viewers can see different versions based on conditions, like a lo-fi music visualizer
How to make it immersive:
- Layer soft audio-reactive animation loops (e.g., clouds drift faster with upbeat music)
- Include an ambient lo-fi music player tied to the mural's look
- Users can scan and save a moment in the mural as a "vibe snapshot"
Fig 3.2, Lofi Phase Mural
Feedback
Week 2:
Absent
Week 3:
Idea 1:
- Why this AR project? How does it make itself unique compared to other music apps like Spotify?
- How can the app be made more immersive and unique?
- Using applications like Spotify would simply be much more convenient.
Idea 2:
- It would be a challenge to make the animations for the ghost skater.
- Maybe make a ghost skater that performs moves at specific spots rather than building a ghost skater racing mode.
- It would be difficult to scan a whole location for the ghost skater to spawn in, and it would be a huge workload.
- If we still plan to do this, we might need to pick up animation skills along the way and make the animations ourselves, or find them online; it's much more doable nowadays with all the AI tools compared to before.
- We could build surroundings for the ghost skater so that it looks more immersive, rather than just having a ghost skater performing its moves.
Idea 3:
- Animation again: it would be hard to do the animations for the murals.
- Even if we manage the mural animations, how would we make them relate to the artwork the users create?
- How do we make it obvious to people that the users got their inspiration from the mural animation?
Reflection
Looking back on this task, I've learned quite a bit. I didn't expect to learn anything beyond Augmented Reality itself, but I've also learnt how to take ideas from something casual and fun and turn them into something more meaningful and workable. At the very start I was just exploring what's trending in AR: I first thought of games like Pokémon GO and Monster Hunter Now, and also found real-world examples like the virtual try-ons from Zara and Gucci. I also explored Adobe Aero, which I didn't know existed before, and I was really inspired by how accessible it made AR design feel. It didn't seem as out of reach as I once thought.
As I dove deeper into these experiences, I started brainstorming my own AR project ideas. At first I wasn't being careful at all, just throwing out different concepts like solar systems, invisible walls, or mood murals; I was simply having fun with whatever I could think of. But when I got feedback from Mr. Razif, I realized that some of my ideas were either too broad, too ambitious, or lacked a clear purpose. That helped me pause and reflect on what I was actually trying to achieve.
So I refined everything and focused on three main ideas that I'm really interested in:
- A music-based AR language learning app,
- An AR skate spot visualizer called Ghost Skater, and
- A mood-based animated mural experience.
These ideas came from things I personally enjoy, namely music, skating, and visual art, reshaped into something more to my liking that could hopefully also show people a new side of them. I thought about user flows, what would make each experience feel unique, and what AR could do that other mediums can't.
One of the biggest lessons for me was understanding that just because AR is exciting doesn't mean it works for everything. It needs context. It needs clarity. And it needs to actually enhance the experience, not distract from it. I also learned that limitations (like animation complexity or scanning environments) aren't dead ends; they're just challenges to design around creatively.