
Research

Brain-Computer Music Interfacing for Embodied Music Interaction applications

Prepared in relation to a doctoral dissertation in Systematic Musicology at the Jāzeps Vītols Latvian Academy of Music

Overview

 

This research aims to develop tools that harness the human electroencephalogram (EEG) signal for real-time music interaction. Over the last few years I have worked with neuroscientists and visual artists to develop a Brain-Computer Music Interface (BCMI) system that enables a single musician to control musical and visual media using their EEG signals in a live performance context.

BCMI systems consist of EEG hardware and computers running software that receives, filters and decodes the user's brainwaves, then maps them to manipulate musical or visual parameters in real time. The videos listed in the media content below document the process of investigating, testing and demonstrating various solutions. The BCMI system developed within the framework of this research project was based on decoding the expressive intentions of a performer in two contrasting states, high arousal and low arousal, by characterising spectral power during emotionally expressive music performance relative to emotionally neutral music performance.
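To make the receive-filter-decode-map chain concrete, the sketch below shows one way such a pipeline could be wired together in Python: EEG samples are read from a Lab Streaming Layer stream, spectral power in two bands is compared against thresholds taken from a neutral-performance baseline, and the resulting high/low arousal state is sent onward as an OSC message. The sampling rate, band choices, thresholds and OSC address are illustrative assumptions, not the exact parameters used in this project.

```python
# A minimal sketch of the receive -> filter -> decode -> map chain described above.
# Band choices, thresholds and the OSC address are illustrative assumptions.
import numpy as np
from pylsl import StreamInlet, resolve_stream          # EEG input via Lab Streaming Layer
from pythonosc.udp_client import SimpleUDPClient       # output to a music/visual environment

FS = 250          # sampling rate in Hz (assumed)
WINDOW = 2 * FS   # 2-second analysis window

def band_power(window, fs, lo, hi):
    """Mean spectral power across channels in the band [lo, hi] Hz."""
    spectrum = np.abs(np.fft.rfft(window, axis=0)) ** 2
    freqs = np.fft.rfftfreq(window.shape[0], d=1.0 / fs)
    mask = (freqs >= lo) & (freqs <= hi)
    return spectrum[mask].mean()

def classify_arousal(window, baseline_alpha, baseline_beta):
    """Compare current band power to a neutral-performance baseline."""
    alpha = band_power(window, FS, 8, 12)
    beta = band_power(window, FS, 13, 30)
    # Higher beta and lower alpha than baseline -> high arousal (illustrative rule).
    if beta > baseline_beta and alpha < baseline_alpha:
        return "high"
    return "low"

inlet = StreamInlet(resolve_stream("type", "EEG")[0])
osc = SimpleUDPClient("127.0.0.1", 9000)   # e.g. a DAW or visual patch listening on this port
buffer = []

# baseline_* would be measured during an emotionally neutral performance
baseline_alpha, baseline_beta = 1.0, 1.0

while True:
    sample, _ = inlet.pull_sample()
    buffer.append(sample)
    if len(buffer) >= WINDOW:
        window = np.array(buffer[-WINDOW:])
        state = classify_arousal(window, baseline_alpha, baseline_beta)
        # Map the decoded state to musical/visual parameters downstream.
        osc.send_message("/bcmi/arousal", 1 if state == "high" else 0)
        buffer = buffer[-WINDOW // 2 :]   # keep 50 % overlap between windows
```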

The next step will be to extend these tools to multiple users, so that inter-brain dynamics during co-creative tasks can be used to manipulate immersive multimedia. In other words, shared brain activity can play a role in the creation or experience of art. Taking this further, developing these tools can contribute to the wider field of BCI and Human-Machine Interaction, working towards direct mind-to-mind communication networks.

Media content

Videos prepared in relation to this work have been uploaded as visual aids to certain steps in the BCMI design process and can be accessed in order via the two playlists below.

The BCMI system maps EEG patterns related to expressive intent during music performance on the arousal dimension to multiple audio/visual outputs.

 

Created in collaboration with research teams led by Valdis Bernhofs from Jāzeps Vītols Latvian Academy of Music, Inga Griškova-Bulanova from Vilnius University, Yuan-Pin Lin from National Sun Yat-sen University, and data visualisation expert Mārtiņš Dāboliņš.

Playlist A: Videos 1 - 11

BCMI design explorations

1. Controlling electric guitar effects with the EEG using facial muscles (01:11)

2. Visualisation and musification of local synchronisation events v.1 (01:01)

3. Visualisation and musification of local synchronisation events v.2 (02:19)

4. SSVEP-based multiplayer music loopstation - coming in together on cue (00:50)

Playlist B: Videos 12 - 17

BCMI system demonstrations

12. BCMI demo: Performance on drum kit (01:14)

13. BCMI demo: Performance on electric guitar (01:09)

14. BCMI demo: Performance on synthesiser (01:16)

15. BCMI demo: Control monitor for drums (01:08)

Video 1: Controlling electric guitar effects with the EEG using facial muscles

Exploring the use of facial muscles to momentarily add reverb to or distort the output of an electric guitar, using two frontal electrodes (see the sketch after this description).

Relaxed face: Reverb added to guitar output.

Clenched jaw: Surge of reverb and distortion, followed by distortion.
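A clenched jaw produces broadband muscle (EMG) activity that dominates frontal EEG channels, which is what makes this control strategy possible. The sketch below illustrates one way such a detector could work; the 20-100 Hz band, the amplitude threshold and the OSC messages standing in for the guitar effects are assumptions for illustration only.

```python
# A minimal sketch of the jaw-clench detector behind Video 1: EMG activity from a
# clenched jaw shows up as broadband high-frequency power on frontal electrodes.
# The 20-100 Hz band, threshold and OSC addresses are illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt
from pythonosc.udp_client import SimpleUDPClient

FS = 250
b, a = butter(4, [20, 100], btype="bandpass", fs=FS)   # isolate the muscle-artefact band
osc = SimpleUDPClient("127.0.0.1", 9000)

def jaw_clenched(frontal_window, threshold=40.0):
    """True if RMS amplitude in the EMG band exceeds the threshold (in microvolts)."""
    emg = filtfilt(b, a, frontal_window, axis=0)        # shape: (samples, 2 channels)
    rms = np.sqrt(np.mean(emg ** 2))
    return rms > threshold

def update_guitar_effects(frontal_window):
    if jaw_clenched(frontal_window):
        osc.send_message("/guitar/distortion", 1.0)     # clenched jaw: distortion on
    else:
        osc.send_message("/guitar/reverb", 1.0)         # relaxed face: reverb on
```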

Video 2: Visualisation and musification of local synchronisation events v.1

Exploring visualisation of local synchronisation events as colour coded lines and dots representing 8 electrode locations, and musification by assigning them to MIDI note events (see the sketch after this description).

Visual: Dots and lines appear at local synchronisation events between 8 electrodes. Blue: theta, green: alpha, red: beta.

Music: MIDI note events mapped to each synchronisation event played on a virtual marimba instrument.
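One way to turn pairwise synchronisation between electrodes into note events is sketched below, using the phase-locking value (PLV) as the synchrony measure and the mido library for MIDI output. The PLV choice, the threshold and the band-to-pitch mapping are illustrative assumptions; the measure actually used in these videos may differ.

```python
# A minimal sketch of detecting a "local synchronisation event" between two
# electrodes and turning it into a MIDI note, as in Videos 2 and 3.
import numpy as np
from scipy.signal import hilbert
import mido

NOTE_FOR_BAND = {"theta": 48, "alpha": 60, "beta": 72}   # assumed band-to-pitch mapping
out = mido.open_output()                                  # default MIDI output port

def plv(x, y):
    """Phase-locking value between two band-filtered signals of equal length."""
    phase_diff = np.angle(hilbert(x)) - np.angle(hilbert(y))
    return np.abs(np.mean(np.exp(1j * phase_diff)))

def emit_sync_events(band_name, filtered_window, threshold=0.8):
    """Send one note per electrode pair whose PLV exceeds the threshold."""
    n_channels = filtered_window.shape[1]                 # e.g. 8 electrodes
    for i in range(n_channels):
        for j in range(i + 1, n_channels):
            if plv(filtered_window[:, i], filtered_window[:, j]) > threshold:
                out.send(mido.Message("note_on",
                                      note=NOTE_FOR_BAND[band_name] + i,
                                      velocity=80))
```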

Video 3: Visualisation and musification of local synchronisation events v.2

Exploring visualisation of local synchronisation events as a colour coded cascade of horizontal bars representing 8 electrode locations, and musification by assigning them to MIDI note events and applying arpeggiation.

Visual: Rectangular bars appear at local synchronisation events between 8 electrodes. Red: delta, orange: theta, green: alpha, blue: beta, purple: gamma.

Music: MIDI note events mapped to each synchronisation event played on a virtual electric piano instrument.

Video 4: SSVEP-based multiplayer music loopstation - coming in together on cue

Exploring the SSVEP method for a 4-player musical loop station. The task is to come in together with the same selection on cue. Recorded by researchers at National Sun Yat-sen University, Taiwan (a sketch of the SSVEP detection step follows the selections list below).

Selections for 4 instruments:

Sun: Lively, dense rhythm

Moon: Moderate, balanced rhythm

Rain: Moody, sparse rhythm
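In an SSVEP (steady-state visual evoked potential) paradigm, each on-screen target flickers at its own frequency, and the target a player attends to can be identified from spectral power at those frequencies in occipital EEG. The sketch below shows a minimal version of that detection step; the flicker frequencies and target names are assumptions for illustration.

```python
# A minimal sketch of the SSVEP selection step used in the loop-station videos:
# the attended target is inferred from spectral power at the candidate flicker
# frequencies (and their second harmonics) in occipital EEG.
import numpy as np

FS = 250
TARGET_FREQS = {"sun": 8.0, "moon": 10.0, "rain": 12.0}   # Hz, assumed flicker rates

def detect_target(occipital_window):
    """Return the target whose flicker frequency carries the most spectral power."""
    spectrum = np.abs(np.fft.rfft(occipital_window, axis=0)).mean(axis=1)  # average channels
    freqs = np.fft.rfftfreq(occipital_window.shape[0], d=1.0 / FS)

    def power_at(f):
        return spectrum[np.argmin(np.abs(freqs - f))]

    scores = {name: power_at(f) + power_at(2 * f) for name, f in TARGET_FREQS.items()}
    return max(scores, key=scores.get)
```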

Video 5: SSVEP-based multiplayer music loopstation - taking turns to come in

Exploring the SSVEP method for a 4-player musical loop station. The task is to take turns to come in with the same selection on cue. Recorded by researchers at National Sun Yat-sen University, Taiwan.

Selections for 4 instruments:

Sun: Lively, dense rhythm

Moon: Moderate, balanced rhythm

Rain: Moody, sparse rhythm

Video 6: SSVEP-based multiplayer music loopstation - free improvisation

Exploring the SSVEP method for a 4-player musical loop station. The task is to improvise and interact with the selections made by others. Recorded by researchers at National Sun Yat-sen University, Taiwan.

Selections for 4 instruments:

Sun: Lively, dense rhythm

Moon: Moderate, balanced rhythm

Rain: Moody, sparse rhythm

Video 7: Coloured sparks as BCMI visual feedback

Visualisation of BCMI control feedback in the form of a particle system emitting colour coded sparks. The y-position of the coloured sparks in relation to the horizontal midline represents the user's expressive intentions on the arousal dimension.

Low arousal: Sparks with cool hues emitted below the midline, reverb added to MIDI piano output actively played by a subject.

High arousal: Sparks with warm hues emitted above the midline, EQ high pass filter added to MIDI piano output actively played by a subject.

 

Video 8: Functions for tuning BCMI to individuals and measuring accuracy

Functions for visualising levels of detected expressive conditions in streaming EEG data, filtering out noisy electrodes, adjusting detection thresholds for individuals, and recording accuracy over time (see the sketch after this description).

Low arousal: Blue ball descends below the horizontal midline.

High arousal: Blue ball ascends above the horizontal midline.
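The sketch below illustrates what such tuning functions could look like: a detection threshold derived from a short neutral-baseline recording, a simple variance check for flagging noisy electrodes, and a running accuracy log. The mean-plus-two-standard-deviations rule and the variance cut-off are illustrative assumptions rather than the project's actual calibration procedure.

```python
# A minimal sketch of per-user tuning: threshold calibration, noisy-electrode
# flagging and accuracy logging. Numeric limits are illustrative assumptions.
import time
import numpy as np

def tune_threshold(calibration_scores):
    """Set the detection threshold from scores recorded during a neutral baseline."""
    scores = np.asarray(calibration_scores)
    return scores.mean() + 2 * scores.std()

def noisy_channels(window, max_std_uv=100.0):
    """Indices of electrodes whose amplitude variance suggests a bad contact."""
    return [i for i, s in enumerate(window.std(axis=0)) if s > max_std_uv]

accuracy_log = []   # (timestamp, detected_state, intended_state)

def record_trial(detected, intended):
    accuracy_log.append((time.time(), detected, intended))

def running_accuracy():
    """Fraction of logged trials where the detected state matched the intended one."""
    if not accuracy_log:
        return 0.0
    hits = sum(1 for _, d, i in accuracy_log if d == i)
    return hits / len(accuracy_log)
```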

Video 9: Pianist improvising while controlling BCMI outputs

Example of the process of testing the accuracy of a BCMI system with expert musicians playing improvised music on different instruments.

Low arousal: Sparks with cool hues projected onto a white mask, increased reverb in the MIDI piano output.

High arousal: Sparks with warm hues projected onto a white mask, EQ high pass filter added to MIDI piano output.

 

Video 10: Saxophonist improvising while controlling BCMI outputs

Example of the process of testing the accuracy of a BCMI system with expert musicians playing improvised music on different instruments.

Low arousal: Sparks with cool hues projected onto a white mask, increased reverb in the saxophone output.

High arousal: Sparks with warm hues projected onto a white mask, no mapped audio parameter.

Video 11: Timelapse of a multi-modal BCMI system setup on a concert stage

Setup of projection surface, musical instruments, projector, motion tracking camera, DMX lights. Calibration of head-tracked visual projection, BCMI control feedback tuning, BCMI output mapping. Recording of audio/visual BCMI outputs and EEG data.

In all recordings, expressive intent on the arousal dimension (high versus low) detected in the EEG was mapped to control the hue of stage lighting (warm versus cool) as well as various MIDI effects such as distortion and reverb.
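A minimal sketch of this output mapping is given below, assuming the DMX lights are reachable through an OSC-controlled bridge and that reverb depth is exposed as MIDI CC 91 (the conventional reverb-send controller); the addresses, port and controller numbers are illustrative, not the project's actual routing.

```python
# A minimal sketch of mapping the decoded arousal state to stage-light hue and a
# MIDI effect level. The OSC address, port and CC number are assumptions.
import mido
from pythonosc.udp_client import SimpleUDPClient

lights = SimpleUDPClient("127.0.0.1", 7700)   # assumed OSC-to-DMX bridge
midi_out = mido.open_output()

def map_arousal(state):
    if state == "high":
        lights.send_message("/dmx/hue", 0.0)                                   # warm / red hue
        midi_out.send(mido.Message("control_change", control=91, value=10))    # less reverb
    else:
        lights.send_message("/dmx/hue", 0.66)                                  # cool / blue hue
        midi_out.send(mido.Message("control_change", control=91, value=100))   # more reverb
```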

Video 12: BCMI demo: Performance on drum kit

User controlling a BCMI system while playing a standard drum kit, from an audience's perspective.

BCMI outputs:
Low arousal: Blue hued DMX lights and head-tracked visual projection

High arousal: Red hued DMX lights and head-tracked visual projection

Video 13: BCMI demo: Performance on electric guitar

User controlling a BCMI system while playing electric guitar, from an audience’s perspective.

BCMI outputs:

Low arousal: Blue hued DMX lights and head-tracked visual projection, increased reverb in the guitar output

High arousal: Red hued DMX lights and head-tracked visual projection, increased distortion in the guitar output

Video 14: BCMI demo: Performance on synthesiser

User controlling a BCMI system while playing synthesiser and drum machine, from an audience’s perspective.

BCMI outputs:

Low arousal: Blue hued DMX lights and head-tracked visual projection, increased reverb in the synthesiser output

High arousal: Red hued DMX lights and head-tracked visual projection, arpeggiator latch activation and higher octavisation

Video 15: BCMI demo: Control monitor for drums

Screen capture video of the BCMI control monitor as seen by the user while playing a standard drum kit.

BCMI outputs:

Low arousal: Blue hued DMX lights and head-tracked visual projection

High arousal: Red hued DMX lights and head-tracked visual projection

Video 16: BCMI demo: Control monitor for electric guitar

Screen capture video of the BCMI control monitor as seen by the user while playing an electric guitar.

BCMI outputs:

Low arousal: Blue hued DMX lights and head-tracked visual projection, increased reverb in the guitar output

High arousal: Red hued DMX lights and head-tracked visual projection, increased distortion in the guitar output

Video 17: BCMI demo: Control monitor for synthesiser

Screen capture video of the BCMI control monitor as seen by the user while playing a synthesiser.

BCMI outputs:

Low arousal: Blue hued DMX lights and head-tracked visual projection, increased reverb in the synthesiser output

High arousal: Red hued DMX lights and head-tracked visual projection, arpeggiator latch activation and higher octavisation

Videos uploaded December 2022.

Contact

Jāzeps Vītols Latvian Academy of Music

Jāzepa Vītola Latvijas Mūzikas akadēmija
