Lukas (he/him) is a research associate at TU Berlin in the Department of Biopsychology and Neuroergonomics headed by Prof. Klaus Gramann. He works at the intersection of human-computer interaction (HCI) and cognitive neuroscience. In his PhD research, he published award-winning multimodal interface technology, investigating neural and movement signatures for novel, natural interaction experiences in extended reality (XR).


︎︎︎ Email
︎︎︎ Google Scholar
︎︎︎ LinkedIn
︎︎︎ CV
︎︎︎ Twitter

Publications


[12 | Oct 06 2022] The BeMoBIL Pipeline for automated analyses of multimodal mobile brain and body imaging data

Marius Klug, Sein Jeung, Anna Wunderlich, Lukas Gehrke, Janna Protzak, Zakaria Djebbara, Andreas Argubi-Wollesen, Bettina Wollesen, Klaus Gramann

︎ paper



[11 | Jun 06 2022] Neural sources of prediction errors detect unrealistic VR interactions

Lukas Gehrke, Pedro Lopes, Marius Klug, Sezen Akman, Klaus Gramann

Using prediction error negativity features, we classified VR glitches with 77% accuracy. Localizing the EEG sources driving the classification, we found midline cingulate sources and a distributed network of parieto-occipital sources to underlie the classification success.

︎ paper | Talk at Neuroergonomics Conference ‘21 (Young Investigator and Best Talk Award sponsored by NIRx)




[10 | May 2022] Toward Human Augmentation Using Neural Fingerprints of Affordances

Lukas Gehrke, Pedro Lopes, Klaus Gramann

Affordances in Everyday Life, 173-180

We discussed the idea of modeling users’ cognitive states through physiological sensors and interfacing the models’ outputs directly with electrical muscle stimulation (EMS) to intercept and manipulate the emergence of an affordance between user and object.

︎ Book Chapter



[9 | Mar 19 2021] Mobile Brain/Body Imaging of Landmark-Based Navigation with High-Density EEG

Alexandre Delaux, Jean-Baptiste de Saint Aubert, Stephen Ramanoël, Marcia Bécu, Lukas Gehrke, Marius Klug, Ricardo Chavarriaga, José-Alain Sahel, Klaus Gramann, Angelo Arleo

Eur. J. Neurosci. 1–27 (2021).


We focused on landmark-based navigation in actively behaving young adults solving a virtual reality Y-maze task. Our results confirm that combining mobile high-density EEG and biometric measures can help unravel the brain network and neural modulations subtending ecological landmark-based navigation.

︎ paper



[8 | Feb 20 2021] Single-trial regression of Spatial Exploration Behavior Indicates Posterior EEG Alpha Modulation to Reflect Egocentric Coding

Lukas Gehrke, Klaus Gramann

Eur. J. Neurosci. 1–18 (2021).


We showcased the capabilities of Mobile Brain/Body Imaging (MoBI) using Virtual Reality (VR), demonstrating several analysis approaches based on general linear models (GLM) to reveal behavior‐dependent brain dynamics. After confirming spatial learning via drawn sketch maps, we employed motion capture to image spatial exploration behavior, describing a shift from initial exploration to subsequent exploitation of the mental representation.

︎ paper



[7 | Jan 26 2021] The AudioMaze: An EEG and Motion Capture Study of Human Spatial Navigation in Sparse Augmented Reality

Makoto Miyakoshi, Lukas Gehrke, Klaus Gramann, Scott Makeig, John R. Iversen

Eur. J. Neurosci. 2021; 00: 1–25.


We developed the Audiomaze, a novel paradigm in which participants freely explore a room-sized virtual maze while EEG is recorded synchronized to motion capture. Participants (n=16) were blindfolded and explored different mazes, each in three successive trials, using their right hand as a probe to ‘feel’ for virtual maze walls. We found behavioral evidence of navigational learning in a sparse-AR environment, as well as a neural correlate of navigational learning near the lingual gyrus.

︎ paper



[6 | May 02 2019] Detecting Visuo-Haptic Mismatches in Virtual Reality using the Prediction Error Negativity of Event-Related Brain Potentials

Lukas Gehrke, Sezen Akman, Pedro Lopes, Albert Chen, Avinash Kumar Singh, Hsiang-Ting Chen, Chin-Teng Lin, Klaus Gramann

Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (CHI ‘19). ACM, New York, NY, USA, Paper 427, 11 pages.


We detected conflicts in visuo-haptic integration by analyzing event-related potentials (ERPs) during interaction with virtual objects. In our EEG study, participants touched virtual objects and received either no haptic feedback, vibration, or vibration combined with electrical muscle stimulation (EMS). We report that an early negativity component at electrode FCz (prediction error) was sensitive to unrealistic VR situations, indicating that we successfully detected haptic conflicts.

︎ paper ︎ data/code



[5 | May 20 2019] Extracting Motion-Related Subspaces from EEG in Mobile Brain/Body Imaging Studies using Source Power Comodulation.

Lukas Gehrke*, Luke Guerdan* and Klaus Gramann | * contributed equally

9th International IEEE/EMBS Conference on Neural Engineering (NER), San Francisco, CA, USA, 2019, pp. 344–347


We proposed the use of a supervised spatial filtering method, Source Power Co-modulation (SPoC), for extracting source components that co-modulate with body motion. We illustrated the approach by investigating the link between hand and head movement kinematics and the power dynamics of EEG sources while participants explored an invisible maze in virtual reality.


︎ paper


[4 | Nov 16 2019] Neurofeedback during Creative Expression as a Therapeutic Tool

Stephanie Scott, Lukas Gehrke

Springer Series on Bio- and Neurosystems, Vol. 10, Jose L. Contreras-Vidal et al. (Eds): Mobile Brain-Body Imaging and the Neuroscience of Art, Innovation and Creativity, 978-3-030-24325-8


Using EEG power, we provided live visual feedback of white lines on a black background, borrowing from Joy Division’s famous album cover. This closed-loop neurofeedback stimulates creativity by making one aware of one’s own brain activity.


︎ paper


[3 | Nov 30 2018] MoBI - Mobile Brain/Body Imaging

Evelyn Jungnickel, Lukas Gehrke, Marius Klug, Klaus Gramann

Academic Press, Neuroergonomics, 59–63, 2019


Mobile brain/body imaging (MoBI) is a method to record and analyze brain dynamics and motor behavior in naturalistic conditions. In this chapter, we give an overview of its suitability for investigating a wide range of scientific problems, including analyses of human brain dynamics aided by information derived from movement, and studies of motor behavior that use brain imaging as an additional source of information.


︎ paper


[2 | Jul 28 2018] The Invisible Maze Task (IMT): Interactive Exploration of Sparse Virtual Environments to Investigate Action-Driven Formation of Spatial Representations.

Lukas Gehrke, John R. Iversen, Scott Makeig, Klaus Gramann

Creem-Regehr S., Schöning J., Klippel A. (eds) Spatial Cognition XI. Spatial Cognition 2018. Lecture Notes in Computer Science, vol 11034. Springer, Cham


The neuroscientific study of human navigation has been limited by requiring participants to remain stationary during data recordings. With the Invisible Maze Task (IMT), we provide a novel VR paradigm to investigate freely moving, naturally interacting navigators.


︎ paper ︎ code



[1 | Nov 16 2018] Human cortical dynamics during full-body heading changes

Klaus Gramann, Friederike U. Hohlefeld, Lukas Gehrke, Marius Klug

Scientific Reports 11 (1), 1-12


We contrasted physical rotation of participants with a traditional joystick setup in which rotations were based on visual flow only. We show that varying rotation velocities were accompanied by pronounced beta synchronization during physical rotation, but not during joystick control.


︎ paper