
Transforming Vehicles into Social Agents for Future Mobility
Echo
2025
Collaboration with Cupra
Individual Work
The project explores the evolving role of autonomous vehicles as social participants in future mobility, built on multimodal interaction and shared emotional memory
Future Mobility
User Experience Design
Multimodal Interaction
Background
Roads are inherently social spaces, yet autonomous systems lack real-time emotional expression between vehicles. The project transforms a cold means of transport into a social agent that triggers full-sensory resonance

Emotional Agency
Instantly triggering fluctuations in user emotion
All-Sensory Immersion
Multimodal integration for full-sensory participation
Sophisticated Tensions
Bold, high-challenge impact that stands out
Concept

Dogs leave behind various traces to store memories and social histories
Leaving Memories

Dynamic exterior signals provide immediate non-verbal social expression
Expressing Self

Dogs interact with their environment and others through powerful peripheral sensing capabilities
Perceiving World
Emulating canine social rituals, I translate multi-sensory perception and expression into a multi-modal environmental map for vehicular communication
Drivers can use multi-sensor interior interactions to record and express attitudes toward other vehicles, forming a dynamic social memory
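As one way to picture such a "dynamic social memory", the sketch below logs driver attitudes toward encountered vehicles so that repeated peers accumulate a relationship history. Every field name and the interaction scheme are hypothetical, introduced only for illustration.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class SocialMemoryEntry:
    peer_id: str                 # the other vehicle encountered
    attitude: str                # e.g. "grateful", "annoyed" (assumed vocabulary)
    intensity: float             # 0..1, how strongly the driver expressed it
    timestamp: datetime = field(default_factory=datetime.now)

class SocialMemory:
    """Accumulates encounters so repeated peers build up a relationship history."""
    def __init__(self):
        self.entries: list[SocialMemoryEntry] = []

    def record(self, peer_id: str, attitude: str, intensity: float):
        self.entries.append(SocialMemoryEntry(peer_id, attitude, intensity))

    def history(self, peer_id: str) -> list[SocialMemoryEntry]:
        return [e for e in self.entries if e.peer_id == peer_id]

memory = SocialMemory()
memory.record("cupra-042", "grateful", 0.8)   # driver thanks a peer that gave way
print(len(memory.history("cupra-042")))
```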
System

By integrating perception, emotional mapping, and materialized recording, this project establishes a multi-sensory framework that translates real-time emotions into perceptible behaviors and enduring vehicular social relationships
Input
Echo Field
A peripheral sensing layer that detects surrounding CUPRA peers and translates social recognition into adaptive haptic feedback across two encounter modes, Stable Encounter and Intense Encounter (a minimal sensing sketch follows below)
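To make the sensing-to-haptics step concrete, here is a minimal sketch of how peer detection might be mapped to a haptic intensity. The PeerSignal fields, thresholds, and scaling are illustrative assumptions; the project does not specify them.

```python
from dataclasses import dataclass

# Hypothetical V2V reading for a nearby CUPRA peer (all fields are assumptions).
@dataclass
class PeerSignal:
    peer_id: str
    distance_m: float        # current separation
    closing_speed_ms: float  # positive = approaching

def classify_encounter(peer: PeerSignal) -> str:
    """Label the encounter the way Echo Field distinguishes them."""
    # Illustrative thresholds: fast, close approaches read as "intense".
    if peer.distance_m < 10 or peer.closing_speed_ms > 5:
        return "intense"
    return "stable"

def haptic_intensity(peer: PeerSignal) -> float:
    """Translate social recognition into a 0..1 haptic feedback level."""
    base = 0.2 if classify_encounter(peer) == "stable" else 0.7
    # Nearer peers feel stronger; clamp to the actuator's range.
    proximity = max(0.0, 1.0 - peer.distance_m / 50.0)
    return min(1.0, base + 0.3 * proximity)

if __name__ == "__main__":
    peer = PeerSignal(peer_id="cupra-042", distance_m=8.0, closing_speed_ms=6.0)
    print(classify_encounter(peer), round(haptic_intensity(peer), 2))
```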
Expression



Echo Field and Echo Skin together form an evolving closed loop of vehicular social rituals
Upon receiving the signal, the interior lighting of the other vehicle changes accordingly
Translate
Emotion to Geometry Algorithm
Emotional signals are parameterized by sign, layers, frequency, and amplitude, and the algorithm maps them to surface behaviors such as Ripple and Spark (see the sketch after this block)
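One possible reading of that mapping is sketched below. The parameter names come from the diagram above, but the specific formulas are illustrative assumptions rather than the project's actual algorithm.

```python
from dataclasses import dataclass

@dataclass
class EmotionSignal:
    sign: int        # +1 friendly, -1 hostile (assumed encoding)
    layers: int      # number of archived interaction layers
    frequency: float # emotional "tempo" in Hz
    amplitude: float # emotional intensity, 0..1

@dataclass
class SkinGeometry:
    mode: str          # "ripple" or "spark"
    height_mm: float   # peak surface deformation
    spread_rate: float # how fast the pattern travels across the skin

def emotion_to_geometry(sig: EmotionSignal) -> SkinGeometry:
    """Illustrative emotion-to-geometry mapping for Echo Skin."""
    # Gentle, positive signals diffuse as ripples; sharp, negative ones spike as sparks.
    mode = "ripple" if sig.sign > 0 else "spark"
    height = 5.0 + 20.0 * sig.amplitude                 # stronger emotion, taller deformation
    spread = sig.frequency * (1.0 + 0.1 * sig.layers)   # older relationships echo faster
    return SkinGeometry(mode=mode, height_mm=height, spread_rate=spread)

if __name__ == "__main__":
    print(emotion_to_geometry(EmotionSignal(sign=+1, layers=3, frequency=0.8, amplitude=0.6)))
```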


Output
Echo Skin
A multi-modal expression surface that morphs the vehicle's geometry and lighting to broadcast real-time emotions and archive layered social interactions
Prototyping
Prototyping employs a hybrid digital-physical methodology to manifest the Echo Skin interaction concept

Digital Visualization
Through TouchDesigner, complex surface behaviors are simulated to visualize the system's intricate biological responses, including ripple diffusion, spike growth, and intensity-driven deformations
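TouchDesigner's scripting layer is Python, so a ripple of the kind described can be approximated with a damped wave-equation update on a height field. The sketch below uses plain NumPy and is only an approximation of what the TouchDesigner network does, not the project's actual setup.

```python
import numpy as np

def ripple_step(height: np.ndarray, velocity: np.ndarray,
                speed: float = 0.25, damping: float = 0.98):
    """Advance a damped wave-equation height field by one frame."""
    # Discrete Laplacian: how much each cell differs from its neighbours.
    lap = (np.roll(height, 1, 0) + np.roll(height, -1, 0) +
           np.roll(height, 1, 1) + np.roll(height, -1, 1) - 4 * height)
    velocity += speed * lap     # neighbours pull the surface toward them
    velocity *= damping         # energy slowly dissipates
    height += velocity
    return height, velocity

# Drop a single "emotional impulse" in the centre and let it ripple outward.
h = np.zeros((64, 64)); v = np.zeros_like(h)
h[32, 32] = 1.0
for _ in range(120):
    h, v = ripple_step(h, v)
```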



Physical Manifestation
The 3D-printed model incorporates pressure sensors and an LED lighting system, with modeling clay integrated to enhance the visual experience. Users trigger real-time dynamic feedback through touch
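As a rough indication of how the touch-to-light loop could be wired, the sketch below assumes a Raspberry Pi reading a force-sensitive resistor through an MCP3008 ADC (via gpiozero) and driving a NeoPixel ring; the actual prototype's electronics are not detailed here.

```python
import time
import board                      # Adafruit Blinka board pins
import neopixel                   # addressable LED driver
from gpiozero import MCP3008      # 8-channel SPI analog-to-digital converter

pressure = MCP3008(channel=0)     # force-sensitive resistor on ADC channel 0
pixels = neopixel.NeoPixel(board.D18, 12, auto_write=False)

while True:
    level = pressure.value                    # 0.0 (no touch) .. 1.0 (firm press)
    # Harder presses shift the ring from calm blue toward warm amber.
    r, g, b = int(255 * level), int(120 * level), int(255 * (1 - level))
    pixels.fill((r, g, b))
    pixels.show()
    time.sleep(0.03)                          # ~30 fps update
```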



Reflection

I reflected on the system's evolution from a technical 'sensory loop' to a genuine social medium, identifying limitations in interaction complexity and memory logic while exploring the delicate balance between automated sensing and driver agency

How can the system evolve from simple one-way or back-and-forth responses toward richer, continuous, and multi-party interactive patterns?

What specific temporal and triggering mechanisms should define which interaction moments are significant enough to be archived as "social memory"?

How can the integration of physical properties—such as temperature and softness—further align the "vehicle skin" metaphor with the nuances of real human expression?