Development and Analysis of a Real-Time EEG-Driven Music Synthesis System
- Type: Bachelor's/Master's Thesis
- Supervisor:
I am looking for a motivated student (Bachelor's or Master's level) to develop and evaluate a novel system that generates music (and possibly accompanying visuals) from real-time EEG (brain-activity) data. The aim is an interactive experience in which users influence music synthesis through their brain activity, fostering a deep connection between body, mind, and sound. The resulting experience will be evaluated as an intervention to alleviate stress, practice mindfulness, or otherwise improve well-being or productivity (e.g., as a break taken during work).
Key Objectives:
• Develop a real-time pipeline for EEG data acquisition (OpenBCI or g.tec Unicorn), integrating Lab Streaming Layer (LSL) for data processing.
• Implement a mapping strategy to translate EEG features (e.g., Alpha and Beta power) into MIDI signals for music synthesis.
• Design a prototype demonstrator, which will be exhibited at the Unifest on the 27th of June 2025.
• Conduct a pilot study to evaluate the system’s usability and the user experience of brain-driven music creation (this could be the Unifest exhibition itself).
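To give a feel for the mapping objective above, here is a minimal sketch of how Alpha and Beta band power could be extracted from an EEG segment and translated into a MIDI control-change value. This is an illustrative assumption, not a prescribed design: in the real system the samples would arrive through LSL (e.g., via pylsl) rather than being synthesized, the spectra would be estimated more robustly (e.g., Welch's method in MNE-Python), and the CC value would be sent to a synthesizer with a MIDI library such as mido.

```python
import numpy as np

def band_power(signal, fs, lo, hi):
    """Mean power of `signal` in the [lo, hi] Hz band via a plain FFT.

    Illustrative only: a production pipeline would detrend, window,
    and average over epochs (e.g., Welch's method).
    """
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].mean()

def alpha_to_midi_cc(alpha, beta, cc_min=0, cc_max=127):
    """Map the relative Alpha share, Alpha / (Alpha + Beta), onto a
    MIDI control-change value in [cc_min, cc_max]."""
    ratio = alpha / (alpha + beta + 1e-12)
    return int(round(cc_min + ratio * (cc_max - cc_min)))

# Synthetic one-second "EEG" trace standing in for a live LSL chunk:
# a strong 10 Hz (Alpha) component plus a weaker 20 Hz (Beta) one.
fs = 250  # Hz; a typical sampling rate for OpenBCI boards
t = np.arange(fs) / fs
eeg = 3.0 * np.sin(2 * np.pi * 10 * t) + 1.0 * np.sin(2 * np.pi * 20 * t)

alpha = band_power(eeg, fs, 8, 12)    # Alpha band: 8-12 Hz
beta = band_power(eeg, fs, 13, 30)    # Beta band: 13-30 Hz
cc = alpha_to_midi_cc(alpha, beta)
print(cc)  # Alpha-dominant input -> CC value near the top of the range
```

The resulting CC value could, for instance, drive a filter cutoff or reverb amount in the synthesizer, so that a more relaxed (Alpha-dominant) state audibly softens the music.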
Requirements:
• Interest in Brain-Computer Interfaces (BCI), music technology, and human-computer interaction.
• Experience in Python and signal processing (MNE-Python, BrainFlow, or similar).
• Knowledge of MIDI synthesis and audio programming (e.g., Max/MSP, Pure Data, SuperCollider, or Python-based MIDI frameworks).
• Ability to conduct user studies and analyze qualitative and quantitative data.
The lack of any of these skills is not a problem if you are willing and confident in your ability to learn them independently.
I am also ready to offer close supervision, especially regarding EEG, music synthesis, and the pilot study.
Interested?
If you are passionate about the intersection of music, brain activity, and interactive experiences, we encourage you to apply!
Please send a recent transcript of records and a brief motivation letter outlining your relevant skills and interests.
fabio.stano∂kit.edu