Neuro Palette Residency
Wearable brain-computer interface (BCI) headsets translate alpha- and gamma-band EEG activity into kinetic light compositions, letting artists externalize empathy-driven narratives for international museum audiences.
Regions
Berlin · Kyoto · Montréal
Signals
EEG α/γ blending
Timeline
18-month residency
Narrative Overview
The residency paired neurodivergent and neurotypical artists with neuroscientists to co-create responsive installations. Each participant underwent calibration sessions to map neural states to color gradients, motion vectors, and sound modulation. The project centered on emotional translation, letting visitors experience empathy as a living spectrum.
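As an illustrative sketch of what such a calibration mapping could look like, the function below interpolates between two anchor colors based on relative gamma dominance. The band values, anchor colors, and function name are assumptions for illustration, not the residency's actual pipeline:

```python
def blend_color(alpha_power, gamma_power,
                calm=(40, 90, 200), energetic=(255, 170, 40)):
    """Map relative gamma dominance to a color on a calm-to-energetic gradient.

    alpha_power / gamma_power: normalized EEG band powers (>= 0).
    Returns an (R, G, B) tuple interpolated between the two anchor colors.
    """
    total = alpha_power + gamma_power
    t = gamma_power / total if total > 0 else 0.5  # gamma dominance in [0, 1]
    return tuple(round(c + t * (e - c)) for c, e in zip(calm, energetic))

# Alpha-dominant states lean toward the calm blue anchor;
# gamma dominance shifts the output toward the warm amber anchor.
print(blend_color(0.9, 0.1))
print(blend_color(0.1, 0.9))
```

A real system would first estimate band powers from raw EEG (e.g. via a power spectral density over the 8–12 Hz and 30–80 Hz ranges) before a mapping step like this.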
Key Components
- Adaptive EEG headsets with thermal comfort layers for long sessions.
- Neural signal orchestration engine that translates brain activity into light, texture, and spatial audio.
- Inclusive onboarding with guided meditations and multilingual consent experiences.
- Traveling exhibit architecture featuring translucent projection fabrics and modular scaffolding.
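A minimal sketch of one orchestration step, assuming band powers arrive as a stream and outputs should change smoothly rather than flicker with raw signal noise. The class, parameter names, and output channels here are hypothetical, since the source does not detail the real engine:

```python
class Orchestrator:
    """Smooth incoming EEG band powers and fan them out to output channels.

    An exponential moving average (EMA) keeps light and audio parameters
    changing gradually instead of jumping with each noisy sample.
    """

    def __init__(self, smoothing=0.2):
        self.smoothing = smoothing  # EMA weight given to each new sample
        self.level = 0.5            # smoothed gamma dominance, starts neutral

    def step(self, alpha_power, gamma_power):
        total = alpha_power + gamma_power
        raw = gamma_power / total if total > 0 else 0.5
        self.level += self.smoothing * (raw - self.level)  # EMA update
        return {
            "light_intensity": self.level,        # brighter as gamma dominates
            "audio_pan": 2 * self.level - 1,      # -1 (calm/left) .. +1 (active/right)
            "texture_density": round(10 * self.level),
        }
```

In a live installation, `step` would be called on each windowed EEG frame and the returned dictionary routed to the lighting, projection, and spatial-audio subsystems.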
Impact Metrics
- Engagement: 94% of visitors reported meaningful emotional resonance.
- Co-Creation: 36 artist-neuroscientist pairings across three continents.
- Accessibility: 18 multisensory accommodations, including tactile feedback panels.
Ethics & Safeguards
BCICaseLab coordinated a neural data trust that ensured artists retained full rights to their signal mappings. Consent protocols included reversible participation, anonymized pattern storage, and an independent advisory circle representing disability justice perspectives.
- Governance: Quarterly audits verified ethical compliance checkpoints.
- Well-Being: Live biometrics ensured participants could pause whenever sensory load increased.
- Cultural Stewardship: Residency outputs were co-authored with local curators to honor community narratives.
Future Directions
The residency is now evolving into an open-source toolkit that supports virtual exchanges. Partner museums work with digital commons platforms to invite global visitors to co-create soundscapes via remote neural inputs, extending collaborative storytelling beyond physical venues.