PROJECT: GLÜHEN

ROLE: UX/UI/MR/VISUALS

DURATION: 3 MONTHS

.01

objective

With the world heading toward mixed reality (MR), how could we simplify interaction with smart lighting systems and use MR to create an intuitive, user-friendly experience?

Watch Cyberpunk: Edgerunners to get a feel for a futuristic world where physical screens are obsolete. How can we approach and design that kind of technology for human interaction?

.02

pain points

Here's what's annoying: hosting a party and staring at your phone to adjust the smart lighting. I want to dim the lights and set the mood, but instead I'm stuck on the phone screen for about 15 minutes figuring out which physical light matches which one in the app. Yeah, the lights were listed, but there's so much text information that it gets overwhelming.

my insights

• apps pull attention away from the moment and require too much focus on the interface
• it's not always clear which light corresponds to which control in the app

To validate my personal insights, I interviewed friends, family, and random

users with various smart lighting setups. They echoed similar pain points,

such as unclear control mapping and overwhelming interfaces. Check it out on

the affinity map below.

user personas

After listening to their concerns, I developed user personas to better

understand and empathize with the needs of different users. These personas

represent the unique challenges and expectations that guide the design

process.

TAP TO VIEW

affinity map / user personas

.03

problem /

solution statement

problem statement

Smart lighting apps are often cluttered and confusing, with text-heavy

interfaces and unclear light mapping. Users struggle to identify which

light corresponds to which control, especially in larger setups where

the lack of a visual layout makes navigation difficult.

solution statement

Mixed reality solves the mapping problem by visually representing lights in their physical locations. As we move toward a future dominated by VR and AR technologies, our reliance on phones to control smart devices will diminish. This shift presents an opportunity to design a seamless, intuitive lighting control system that lets users interact with their environment directly through gestures or visual interfaces, eliminating confusion and simplifying interaction.

To bring this solution to life, I visualized how an MR-based experience would guide users step by step, from pairing devices to selecting and controlling lights, all in an intuitive, immersive way. By combining real-world spatial mapping with gesture-based controls, the experience eliminates guesswork and streamlines interaction. Tap below to see the breakdown of the user journey and how MR enhances usability.

TAP TO VIEW

visualization

.04

interactive prototype

In this phase, I brought the concept to life using a prototype built

with Bezi and the Oculus platform. This interactive experience

addresses the core problem of smart lighting apps: unclear mapping and

frustrating interfaces. By visually representing the lights in their

real-world positions, the prototype helps users easily identify and

interact with the correct light, eliminating guesswork and reducing

cognitive load. Users are guided through the core functionality of the

system, including login, activating light control, selecting a light,

and toggling it on or off using an intuitive hand-based interface.
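The guided flow above can be thought of as a simple state machine. The sketch below is my own illustration of that logic; the state names and the `Light` type are hypothetical, since the actual prototype was built visually in Bezi rather than in code.

```typescript
// A minimal sketch of the prototype's interaction flow as a state machine.
// States and types are illustrative assumptions, not the Bezi prototype's API.

type State = "login" | "idle" | "controlActive" | "lightSelected";

interface Light {
  id: string;
  name: string; // e.g. "kitchen pendant"
  on: boolean;
}

class LightControlFlow {
  state: State = "login";
  lights: Light[] = [];
  selected: Light | null = null;

  // Step 1: user logs in; paired lights are mapped to their room positions.
  login(pairedLights: Light[]): void {
    this.lights = pairedLights;
    this.state = "idle";
  }

  // Step 2: a hand gesture opens the control menu.
  activateControl(): void {
    if (this.state === "idle") this.state = "controlActive";
  }

  // Step 3: user taps a light at its real-world position.
  selectLight(id: string): void {
    if (this.state !== "controlActive") return;
    this.selected = this.lights.find((l) => l.id === id) ?? null;
    if (this.selected) this.state = "lightSelected";
  }

  // Step 4: toggle the selected light on or off; returns the new on/off state.
  toggle(): boolean | null {
    if (this.state !== "lightSelected" || !this.selected) return null;
    this.selected.on = !this.selected.on;
    return this.selected.on;
  }
}
```

Modeling the flow this way makes the core benefit explicit: selection happens by pointing at the light itself, so the "which light is which" lookup step disappears entirely.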

The prototype also emphasizes the benefits of body-based UI. Users

access a control menu through gestures, seamlessly engaging with

interactable lights mapped to their physical environment. This dynamic

interface streamlines the lighting control experience, making it more

immersive, efficient, and engaging.