Resonance Research Institute publishes across academic channels, practitioner-friendly formats, and public writing. This page will grow as studies are completed, tools mature, and collaborations expand.
Describes the theoretical framework and initial design of a closed-loop system in which EEG-derived emotional signals guide changes in musical structure during trauma therapy.
Articulates an ethical and practical framework for incorporating AI tools into trauma-focused psychotherapy while preserving client dignity and clinician authority.
Documents the process of adapting emotion-recognition models and datasets to better represent trauma states such as shutdown, hyperarousal, and fragile regulation.
Not all important work fits neatly into journals. We use essay-style writing to document the human side of this research and to invite clinicians, veterans, and technologists into the conversation.
A narrative overview of how clinical social work, AI research, and music-based practice converged into the current line of research at Resonance.
A deep dive into the design of the adaptive music engine, including early sketches, constraints, and surprising findings from prototyping.
As invitations arise, this section will track conference talks, workshops for clinicians, and guest lectures on trauma-informed AI and EEG-based music interventions.
An accessible overview of how brainwaves, music, and AI can be combined to support people who find traditional talk therapy overwhelming or inaccessible.
A skills-based workshop for mental health professionals on using AI-assisted tools like TheraNotes in ways that respect ethics, boundaries, and clinician judgment.
We welcome collaboration with clinicians, researchers, and institutions interested in trauma, EEG, music, or human-centered AI. Preprints and technical docs can be shared upon request.
Request drafts or data