We design human-centered technology that helps veterans and trauma survivors regulate emotion, reconnect with their bodies, and reclaim meaning when words aren’t enough.
Resonance Research Institute exists to push beyond the limits of traditional talk therapy. We combine clinical social work, neuroscience, and machine learning to build systems that meet trauma where it actually lives—in the body, the nervous system, and the unspoken emotional field.
Our work focuses on veterans and trauma survivors who often can’t—or won’t—tell their story out loud. By translating brainwaves into color and designing music that responds in real time, we create new pathways for regulation, expression, and recovery.
Our work sits at the intersection of clinical practice, neuroscience, and audio engineering. We prototype, test, and refine technologies that can live in real therapy rooms—not just in research papers.
We translate real-time EEG data into intuitive chromatic feedback that helps clients see and understand their internal state. This makes emotion regulation more concrete, especially for those who struggle to “feel their feelings” on command.
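One way to picture the idea: relative power in an EEG frequency band can be mapped to a color dimension such as hue. The sketch below is purely illustrative — the band edges, sampling rate, and linear hue mapping are assumptions for demonstration, not the Institute's actual pipeline.

```python
import numpy as np

def relative_band_power(window, fs, band=(8.0, 12.0)):
    """Fraction of spectral power in `band` (default: alpha) for one EEG channel."""
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    power = np.abs(np.fft.rfft(window)) ** 2
    in_band = (freqs >= band[0]) & (freqs < band[1])
    total = power[1:].sum()  # skip the DC component
    return power[in_band].sum() / total if total > 0 else 0.0

def power_to_hue(rel_power):
    """Linearly map relative band power [0, 1] to a hue angle [0, 120] degrees."""
    return 120.0 * min(max(rel_power, 0.0), 1.0)

# Example: one second of synthetic alpha-dominant (10 Hz) signal at 256 Hz
rng = np.random.default_rng(0)
fs = 256
t = np.arange(fs) / fs
window = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(fs)
hue = power_to_hue(relative_band_power(window, fs))  # strong alpha -> high hue
```

In a real system the window would slide continuously over live data and the color mapping would be tuned clinically; the point here is only that a nervous-system signal can become a visual signal in a few lines.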
Our adaptive music engine analyzes both the music and the listener’s physiology, then adjusts tempo, harmony, and intensity to support grounding, stabilization, and post-traumatic growth across a session.
We build therapist-in-the-loop AI systems—briefing tools, pattern detectors, and decision supports—that respect client dignity, protect privacy, and center the clinician as the ultimate decision-maker.
Our founder is a combat veteran and trauma therapist. The Institute’s research is shaped by years of sitting with veterans who can brief a mission in perfect detail but freeze the moment they’re asked, “How are you sleeping? How are you really?”
Many of them don’t want to tell their story again. Some can’t. Our tools are designed for these moments: when language fails, when hyperarousal won’t drop, when anhedonia steals joy from the things that once kept them alive. We build technologies that honor their experience and help them move toward safety, regulation, and meaning—without turning them into data points.
Our systems aim to create a responsive feedback loop: EEG captures what the nervous system is doing, AI models interpret the signal, and adaptive audio responds—while the therapist guides the process and makes meaning with the client.
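The loop above can be sketched in miniature. Everything here is hypothetical — the class, the stand-in "model," and the adaptation thresholds are illustrative assumptions, not a real API — but it shows the shape of the design: signal in, interpretation, audio response, with the clinician holding an override.

```python
from dataclasses import dataclass

@dataclass
class AudioParams:
    tempo_bpm: float
    intensity: float  # 0.0 (sparse) to 1.0 (full arrangement)

def estimate_arousal(eeg_features):
    """Stand-in for a trained model: here, just a clipped weighted sum."""
    score = 0.6 * eeg_features["beta"] + 0.4 * eeg_features["heart_rate_norm"]
    return min(max(score, 0.0), 1.0)

def adapt_audio(params, arousal, therapist_hold=False):
    """Nudge the music toward grounding when arousal is high,
    unless the therapist has paused adaptation."""
    if therapist_hold:
        return params  # clinician overrides: keep the music steady
    if arousal > 0.7:
        return AudioParams(tempo_bpm=max(params.tempo_bpm - 4, 60),
                           intensity=max(params.intensity - 0.1, 0.2))
    return params

# One tick of the loop: high-arousal features slow and soften the music
params = AudioParams(tempo_bpm=96, intensity=0.8)
params = adapt_audio(params, estimate_arousal({"beta": 0.9, "heart_rate_norm": 0.8}))
```

The `therapist_hold` flag is the key design choice: adaptation is a suggestion the clinician can suspend at any moment, which keeps the human in the loop rather than the algorithm.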
A real-time system where a listener’s neurophysiology guides how the music evolves, supporting trauma stabilization and emotional processing.
A therapist-support platform that helps clinicians brief sessions, detect patterns, and generate ethically aligned notes while preserving the “golden thread” of care.
A research protocol integrating trauma science, EEG, and adaptive music for veterans, forming the backbone of our long-term clinical trials.
A resilience initiative focused on veterans and first responders, exploring how music, community, and intelligent tools can reduce risk and support post-traumatic growth.
Early-stage work on a de-identified EEG dataset tailored to trauma and regulation, designed to improve how AI understands and supports nervous-system states in real contexts.
A stack of tools that connect EEG input with chromatic visualizations, music playback, and logging—turning therapy into a structured, data-informed collaboration.
We partner with clinicians, researchers, engineers, musicians, veterans’ organizations, and funders who believe that healing is both an art and a science. If you’d like to collaborate, pilot our tools, or support the work, we’d love to hear from you.