Brain Analysis
You cannot read minds – or can you? For people who cannot communicate at all, advanced speech prosthetics controlled by 'mind reading' could be a way out. A ground-breaking study conducted by our Brain Analysis team has added an important piece to the puzzle that will propel the technology forward.
The Brain Analysis team explores novel machine learning methods combined with neuroscientific approaches to understand how the human brain functions. We conduct data collection studies using different non-invasive modalities (e.g., EEG, fMRI, fNIRS) in unimodal and bimodal approaches.
Our aim is to develop novel non-invasive BCI methods to help people suffering from neurological diseases. A particular focus of our research is inner speech detection, which could 'give voice' to people suffering from amyotrophic lateral sclerosis (ALS) or locked-in syndrome (LIS).
Hardware
EEG/fMRI-compatible 64-channel headset for synchronous recordings, Brain Products
Master's theses
2024, Nathan Hiruy, EEG-based control of ground robot via Brain Computer Interface.
2023, Maxime Arnaud, Inner speech detection using bimodal inner speech dataset.
2022, Valeria Buenrostro-Leiter, Fellipe Rollin, Brain Signal Analysis for Inner Speech Detection.
2022, Lisa Jonsson, Using machine learning to analyse EEG brain signals for inner speech detection.
Awards
2022, 1st place, BCI Hackathon, PyEPosers - An ECoG Hand Pose Data Analysis Project.
Contact
Current projects
- MADHD-NET: Multimodal ADHD prediction model with brain connectivity networks
In this project, a novel framework for MADHD-NET will be developed with EEG and eye-tracking measures. The project also highlights the different connectivity networks of ADHD subjects with different attention levels (based on cognitive events) and age groups.
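A brain connectivity network of the kind mentioned above can be illustrated, in rough outline, by correlating multichannel EEG signals. The channel count, data, and threshold below are illustrative assumptions, not the project's actual recording setup or method:

```python
import numpy as np

# Hypothetical sketch: derive a functional connectivity network from
# multichannel EEG by correlating the channel time series.
rng = np.random.default_rng(0)
n_channels, n_samples = 8, 1000
eeg = rng.standard_normal((n_channels, n_samples))  # placeholder data

# Pearson correlation between every pair of channels yields an
# n_channels x n_channels connectivity matrix.
conn = np.corrcoef(eeg)

# Threshold weak links (cutoff chosen arbitrarily here) to obtain a
# sparse adjacency matrix, excluding self-connections.
adjacency = (np.abs(conn) > 0.5) & ~np.eye(n_channels, dtype=bool)

print(conn.shape)  # (8, 8)
```

In practice such networks would be built from band-filtered, artifact-cleaned recordings and compared across attention levels and age groups.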
- Understanding the Neural Dynamics of Inner Speech: A Multimodal and Longitudinal Perspective
This project bridges research in BCI, neuroimaging, and machine learning, addressing the critical challenge of decoding complex and abstract thought processes such as inner speech. While prior studies have translated brain activity into simple commands by recognizing patterns linked to specific ...
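The idea of recognizing patterns in brain activity can be sketched as a simple classification problem. Everything here – trial shapes, the two imagined words, the log-variance feature, the nearest-centroid rule – is an assumption for illustration, not the study's actual pipeline:

```python
import numpy as np

# Illustrative sketch: decode imagined words from EEG trials by mapping
# each trial to a feature vector and assigning the closest class centroid.
rng = np.random.default_rng(42)
n_trials, n_channels, n_samples = 40, 8, 256
X = rng.standard_normal((n_trials, n_channels, n_samples))  # fake trials
y = rng.integers(0, 2, n_trials)  # two imagined words, e.g. "yes"/"no"

# Feature: log-variance per channel (a crude proxy for band power).
features = np.log(X.var(axis=2))  # shape (n_trials, n_channels)

# Nearest-centroid classifier: one mean feature vector per class.
centroids = np.stack([features[y == c].mean(axis=0) for c in (0, 1)])

def predict(f):
    """Assign a trial's feature vector to the closest class centroid."""
    return int(np.argmin(np.linalg.norm(centroids - f, axis=1)))

preds = np.array([predict(f) for f in features])
accuracy = (preds == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```

Real inner-speech decoders replace the random data with recorded trials and the centroid rule with stronger models, but the structure – trials, features, labels – is the same.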
- 2nd study on inner speech decoding
Bimodal synchronous electroencephalography-functional magnetic resonance imaging dataset for inner-speech recognition
- NeuraMind: Decoding brain signals for the detection of frontotemporal dementia and Alzheimer's disease with non-invasive electroencephalography and artificial intelligence
The development and deployment of dementia and Alzheimer's detection models can contribute significantly to promoting an active and independent life for people living with disabilities in both the short and long term. Detection models can identify the early signs and risk factors associated with d...
Completed projects
- 1st study on inner speech decoding
Bimodal asynchronous electroencephalography-functional magnetic resonance imaging dataset for inner-...