glühen

PROJECT: GLÜHEN

ROLE: UX/UI/MR/VISUALS

DURATION: 3 MONTHS


.01

Overview

Glühen is a speculative UX case study that explores how mixed reality (MR) can improve smart lighting control. Instead of relying on phones or cluttered apps, the concept reimagines lighting interfaces as intuitive, gesture-based experiences users can interact with directly through an MR headset. This project began as an experimental prompt to push the boundaries of early-stage interaction design in immersive spaces.

Objective

With the world heading toward mixed reality, could we make controlling lights as seamless as flipping a switch—without touching anything at all? Glühen explores how to reduce friction in smart lighting through spatial design and natural gestures, offering an alternative to today's text-heavy apps.


.02

Key Insights

Here's what's annoying… hosting a party and staring at your phone to adjust the smart lighting. I want to dim the lights and set the mood, but I'm staring at the phone screen for about 15 minutes, trying to work out which control maps to which light.

• Smart lighting apps often pull focus from the moment, requiring users to look down and fumble through cluttered screens.

• Users struggle to identify which control maps to which light, especially in larger rooms or shared environments.

• MR enables direct manipulation of lights in their actual spatial context, reducing cognitive load and guesswork.


.03

Design Approach

• Used hand tracking to simulate an MR interface that appears over the user’s non-dominant hand, providing quick access to controls.

• Developed user flows for selecting lights, adjusting brightness and color, and saving custom scenes.

• Prioritized simplicity and intuitive gestures—tap, pinch, and hover—over menus or settings trees (sketched below).
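To make the gesture vocabulary concrete, here's a minimal TypeScript sketch of how tap, hover, and pinch-and-slide events from a hand-tracking layer could be routed to lighting actions. The event shapes, light ids, and handleGesture function are illustrative assumptions for this case study, not Bezi's or any headset SDK's actual API.

```ts
// Hypothetical gesture events a hand-tracking layer might emit.
// Names and shapes are assumptions for illustration, not a real SDK's API.
type Gesture =
  | { kind: "hover"; lightId: string }                      // point at a light to highlight it
  | { kind: "tap"; lightId: string }                        // toggle a light on/off
  | { kind: "pinchSlide"; lightId: string; delta: number }; // pinch and drag to dim

interface LightState {
  on: boolean;
  brightness: number; // 0..1
}

// In-memory stand-in for the room's lights, keyed by a spatial anchor id.
const lights = new Map<string, LightState>([
  ["ceiling", { on: true, brightness: 0.8 }],
  ["floor-lamp", { on: false, brightness: 0.4 }],
]);

function handleGesture(g: Gesture): void {
  const light = lights.get(g.lightId);
  if (!light) return;

  switch (g.kind) {
    case "hover":
      // In the concept this would highlight the light's anchor in MR.
      console.log(`highlight ${g.lightId}`);
      break;
    case "tap":
      light.on = !light.on;
      break;
    case "pinchSlide":
      // Clamp so a long slide can't push brightness out of range.
      light.brightness = Math.min(1, Math.max(0, light.brightness + g.delta));
      break;
  }
}

// Example: tap the floor lamp on, then dim the ceiling light slightly.
handleGesture({ kind: "tap", lightId: "floor-lamp" });
handleGesture({ kind: "pinchSlide", lightId: "ceiling", delta: -0.2 });
```

Keeping the whole vocabulary to three event types is the point: every control the concept needs fits in one small dispatcher instead of a menu tree.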

.04

Prototype

The high-fidelity prototype was built using Bezi and visualized on an Oculus headset. It walks through:

• Entering the MR environment

• Selecting a light using a visual anchor

• Adjusting brightness with a pinch-and-slide gesture (see the sketch after this list)

• Activating a saved scene from a control dock on the hand
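As a rough illustration of the pinch-and-slide step above, the sketch below maps how far the pinched hand travels to a clamped brightness value. The gain and deadzone constants are assumptions chosen for the example, not values taken from the prototype.

```ts
// Illustrative pinch-and-slide mapping: hand travel (in meters) to brightness.
// GAIN and DEADZONE_M are assumed values for this sketch.
const GAIN = 2.0;        // full 0..1 sweep over roughly half a meter of travel
const DEADZONE_M = 0.01; // ignore jitter below 1 cm so the light doesn't flicker

function slideToBrightness(startBrightness: number, slideMeters: number): number {
  const travel = Math.abs(slideMeters) < DEADZONE_M ? 0 : slideMeters;
  const next = startBrightness + travel * GAIN;
  return Math.min(1, Math.max(0, next)); // clamp to the valid 0..1 range
}

// Example: starting at 60% brightness, sliding the pinched hand up 10 cm.
slideToBrightness(0.6, 0.1); // → 0.8
```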



The prototype also emphasizes the benefits of body-based UI. Users access a control menu through gestures, seamlessly engaging with interactable lights mapped to their physical environment. This dynamic interface streamlines the lighting control experience, making it more immersive, efficient, and engaging.
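To suggest how the spatially mapped lights and the saved scenes on the hand dock might fit together, here's a small TypeScript sketch of one possible data model and scene activation. The interfaces, ids, and applyScene helper are hypothetical, invented for this illustration rather than taken from the Bezi build.

```ts
// A possible shape for a saved scene: per-light targets keyed by the same
// anchor ids used to place lights in the room. All names are hypothetical.
interface SceneTarget {
  brightness: number; // 0..1
  color: string;      // hex color, kept simple for the sketch
}

interface Scene {
  name: string;
  targets: Record<string, SceneTarget>;
}

interface AnchoredLight {
  id: string;
  position: [number, number, number]; // where the light sits in the room
  brightness: number;
  color: string;
}

const livingRoom: AnchoredLight[] = [
  { id: "ceiling", position: [0, 2.4, 0], brightness: 0.9, color: "#ffffff" },
  { id: "floor-lamp", position: [1.2, 0.4, -0.8], brightness: 0.5, color: "#ffd9a0" },
];

const movieNight: Scene = {
  name: "Movie night",
  targets: {
    ceiling: { brightness: 0.1, color: "#334466" },
    "floor-lamp": { brightness: 0.3, color: "#ffb347" },
  },
};

// Tapping a scene on the hand dock would apply its targets to matching lights.
function applyScene(scene: Scene, room: AnchoredLight[]): void {
  for (const light of room) {
    const target = scene.targets[light.id];
    if (!target) continue; // lights not named in the scene are left untouched
    light.brightness = target.brightness;
    light.color = target.color;
  }
}

applyScene(movieNight, livingRoom);
```

Keying scenes by the same anchor ids used to place lights in the room is what lets a scene address "the lamp over there" rather than an abstract entry in a list.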

.05

Reflection

Unlike client work, Glühen wasn’t created to ship—it was an exercise in imagining what a better future could feel like. The project helped me sharpen my early-stage UX skills, from prototyping in immersive tools to articulating abstract interaction patterns. While there were no formal results, the goal was exploration. I left the project with a stronger sense of how to simplify complex systems through spatial thinking.
