Resonance Research Institute · AL

Where trauma science, AI, EEG, and music converge.

We design human-centered technology that helps veterans and trauma survivors regulate emotion, reconnect with their bodies, and reclaim meaning when words aren’t enough.

Veteran-led · Clinically grounded · AI-driven
EEG · Music Neuroscience · Therapist-in-the-loop AI
Live EEG · Alpha, Beta, Gamma
Adaptive music engine: Online
EEG → Color
Real-time chromatic visualizations of emotional state for grounding and insight.
Music → Nervous System
Emotion-aware audio designed to support stabilization and recovery.
Clinician → Conductor
Therapist-in-the-loop AI that augments—not replaces—human judgment.
Our Mission

Engineering tools that save lives, not replace people.

Resonance Research Institute exists to push beyond the limits of traditional talk therapy. We combine clinical social work, neuroscience, and machine learning to build systems that meet trauma where it actually lives—in the body, the nervous system, and the unspoken emotional field.

Our work focuses on veterans and trauma survivors who often can’t—or won’t—tell their story out loud. By translating brainwaves into color and designing music that responds in real time, we create new pathways for regulation, expression, and recovery.

What if the brain could speak in color—and music could answer back?

That question drives our research. From EEG-based emotional biometrics to adaptive music engines and clinician-guided AI tools, every project at the Institute is built around a single goal: to bridge human experience and intelligent technology in service of healing.
What We Do

Core areas of impact

Our work sits at the intersection of clinical practice, neuroscience, and audio engineering. We prototype, test, and refine technologies that can live in real therapy rooms—not just in lab papers.

EEG-Driven Emotional Regulation

We translate real-time EEG data into intuitive chromatic feedback that helps clients see and understand their internal state. This makes emotion regulation more concrete, especially for those who struggle to “feel their feelings” on command.
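As an illustration of the kind of mapping involved, here is a minimal Python sketch that turns relative alpha/beta/gamma band powers into a single color: calmer, alpha-dominant states shift toward blue, higher-arousal states toward red. The mapping, thresholds, and function name are assumptions for illustration, not the Institute's actual model.

```python
import colorsys

def band_powers_to_rgb(alpha: float, beta: float, gamma: float) -> tuple[int, int, int]:
    """Map relative EEG band powers to an RGB color.

    Illustrative mapping only: alpha-dominant (calmer) states shift
    toward blue, beta/gamma-dominant (higher-arousal) states toward red.
    """
    total = alpha + beta + gamma
    if total <= 0:
        return (128, 128, 128)  # neutral gray when there is no signal
    arousal = (beta + gamma) / total           # 0 = calm, 1 = highly aroused
    hue = (1.0 - arousal) * (2.0 / 3.0)        # 2/3 (blue) down to 0 (red)
    r, g, b = colorsys.hsv_to_rgb(hue, 0.8, 0.9)
    return (round(r * 255), round(g * 255), round(b * 255))
```

In a live system a mapping like this would run on short sliding windows of band power, so the on-screen color drifts smoothly with the client's state rather than flickering frame to frame.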

AI-Enhanced Music Therapy

Our adaptive music engine analyzes both the music and the listener’s physiology, then adjusts tempo, harmony, and intensity to support grounding, stabilization, and post-traumatic growth across a session.

Clinical AI Research & Tooling

We build therapist-in-the-loop AI systems—briefing tools, pattern detectors, and decision supports—that respect client dignity, protect privacy, and center the clinician as the ultimate decision-maker.

Veteran-Focused Innovation

Born from combat trauma. Built for those who carry it.

Our founder is a combat veteran and trauma therapist. The Institute’s research is shaped by years of sitting with veterans who can brief a mission in perfect detail but freeze the moment they’re asked, “How are you sleeping? How are you really?”

Many of them don’t want to tell their story again. Some can’t. Our tools are designed for these moments: when language fails, when hyperarousal won’t drop, when anhedonia steals joy from the things that once kept them alive. We build technologies that honor their experience and help them move toward safety, regulation, and meaning—without turning them into data points.

The Technology

A closed loop between brain, sound, and clinician.

Our systems aim to create a responsive feedback loop: EEG captures what the nervous system is doing, AI models interpret the signal, and adaptive audio responds—while the therapist guides the process and makes meaning with the client.

EEG & Emotional Biometrics

Consumer-grade EEG devices capture live brainwave patterns. Our models focus on frontal alpha asymmetry, arousal patterns, and trauma-relevant signatures to estimate emotional state in real time.
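Frontal alpha asymmetry is conventionally scored as the log alpha power at a right frontal electrode (e.g. F4) minus the log alpha power at its left counterpart (F3). A minimal sketch of that computation, with simple exponential smoothing for a live stream, follows; it assumes band powers are already extracted from the raw signal, and the class name and smoothing constant are illustrative, not the Institute's pipeline.

```python
import math

def frontal_alpha_asymmetry(alpha_f3: float, alpha_f4: float) -> float:
    """FAA score: ln(right alpha power) - ln(left alpha power).

    Inputs are alpha-band (roughly 8-13 Hz) powers at electrodes F3
    (left) and F4 (right). Positive scores indicate relatively greater
    right-hemisphere alpha.
    """
    if alpha_f3 <= 0 or alpha_f4 <= 0:
        raise ValueError("band powers must be positive")
    return math.log(alpha_f4) - math.log(alpha_f3)

class AsymmetryTracker:
    """Exponentially smoothed FAA over a stream of band-power frames."""

    def __init__(self, smoothing: float = 0.9):
        self.smoothing = smoothing
        self.value = None

    def update(self, alpha_f3: float, alpha_f4: float) -> float:
        score = frontal_alpha_asymmetry(alpha_f3, alpha_f4)
        if self.value is None:
            self.value = score
        else:
            self.value = self.smoothing * self.value + (1 - self.smoothing) * score
        return self.value
```

Smoothing matters in practice: raw per-window scores from consumer headsets are noisy, so a real-time estimate is usually a running average rather than a single frame.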
Adaptive Music Engine

The engine analyzes audio features and the client’s physiological response to guide musical structure—intensity, harmonic tension, and rhythmic density—toward stabilization, grounding, or expansion.
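A single step of such a control loop might look like the following sketch: estimate the listener's arousal, derive target parameters from the session goal, and nudge the current musical state toward them. The parameter names, target formulas, and "stabilize"/"expand" goals are illustrative assumptions, not the engine's actual design.

```python
from dataclasses import dataclass

@dataclass
class MusicState:
    tempo_bpm: float
    intensity: float  # 0..1

def adapt(state: MusicState, arousal: float, goal: str = "stabilize",
          rate: float = 0.1) -> MusicState:
    """One control-loop step: move musical parameters toward targets
    implied by the listener's estimated arousal (0..1).

    Illustrative sketch only. `rate` keeps changes gradual so the
    music shifts perceptibly but never abruptly.
    """
    if goal == "stabilize":
        # Entrain toward calm: the more aroused the listener, the
        # lower and slower the target.
        target_intensity = max(0.1, 0.6 - 0.5 * arousal)
        target_tempo = 60.0 + 20.0 * (1.0 - arousal)
    else:  # "expand": gently invite more activation
        target_intensity = min(0.9, 0.4 + 0.4 * arousal)
        target_tempo = 80.0 + 30.0 * arousal
    return MusicState(
        tempo_bpm=state.tempo_bpm + rate * (target_tempo - state.tempo_bpm),
        intensity=state.intensity + rate * (target_intensity - state.intensity),
    )
```

The design choice worth noting is the incremental `rate`: rather than jumping to the target, each step closes a fraction of the gap, which is what lets the therapist interrupt or redirect the trajectory at any moment.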
Therapist-in-the-Loop AI

We design tools that surface patterns and possibilities, not prescriptions. Clinicians retain control over interventions, timing, and interpretation. The AI’s job is to assist, not to replace the human relationship.
Active Initiatives

Projects across the Resonance ecosystem

Active R&D

Adaptive Music Engine

A real-time system where a listener’s neurophysiology guides how the music evolves, supporting trauma stabilization and emotional processing.

Clinical Tooling

TheraNotes AI

A therapist-support platform that helps clinicians prepare session briefings, detect patterns, and generate ethically aligned notes while preserving the “golden thread” of care.

Framework

23Protocol

A research protocol integrating trauma science, EEG, and adaptive music for veterans, forming the backbone of our long-term clinical trials.

Resilience

23Strong

A resilience initiative focused on veterans and first responders, exploring how music, community, and intelligent tools can reduce risk and support post-traumatic growth.

Dataset

PTSD EEG Dataset Initiative

Early-stage work on a de-identified EEG dataset tailored to trauma and regulation, designed to improve how AI understands and supports nervous-system states in real clinical contexts.

Bridge

EEG-Music Bridge Suite

A stack of tools that connect EEG input with chromatic visualizations, music playback, and logging—turning therapy into a structured, data-informed collaboration.

Join the Mission

Let’s build the next generation of trauma-informed technology.

We partner with clinicians, researchers, engineers, musicians, veterans’ organizations, and funders who believe that healing is both an art and a science. If you’d like to collaborate, pilot our tools, or support the work, we’d love to hear from you.