NeuroXR: Current Research and Opportunities

About This Presentation

A keynote speech given by Mark Billinghurst at the NeuroXR workshop at the ISMAR 2025 conference. This talk was given on October 12th 2025. It describes current research and opportunities for the use of neurophysiological signals and affective computing in Extended Reality.


Slide Content

Mark Billinghurst
[email protected]
October 12th 2025
Neuro-XR: Current Research and Opportunities

FOUNDATIONS

1967 – IBM 1401 – half of the computers in the world, $10,000/month to run

Jacques Vidal (1973)
•Coined the term Brain-Computer Interface
Vidal, J. J. (1973). Toward direct brain-computer communication. Annual Review of Biophysics and Bioengineering, 2(1), 157-180.

UCLA BCI Lab

BCI Publications Per Year (1977 – 2025)
~ 30,000 papers

Major Research Trends
•1920s–1960s - Early Foundations
•1924 invention of EEG (Hans Berger)
•1970s – Establishing the Field
•1977 – Vidal’s first non-invasive human study
•1980s - 1990s - Development of Core Non-Invasive Paradigms
•1988 - Farwell and Donchin introduce the P300 Speller paradigm
•2000s–Present - Invasive BCI Breakthroughs and Clinical Translation
•2004 - BrainGate – first invasive human trial
•2016 onwards – rise of private companies (Neuralink, Synchron)

Current State of the Art
•Moderate cost reliable EEG hardware
•OpenBCI, Emotiv, Neurosity, Muse, etc.
•Excellent software tools
•OpenViBE, EEGLAB, MNE-Python, etc. (MNE-Python sketch below)
•Fast input performance
•~60 wpm from BrainGate (2023)
•Implantable clinical trials underway
•BrainGate (2004), Synchron (2019), Neuralink (2024), etc.
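
As an aside on how approachable these tools are, a minimal sketch of loading and filtering a recording with MNE-Python (one of the tools above) might look like this; "session.edf" is a placeholder file and the PSD settings are illustrative:

```python
# Sketch: load, band-pass filter, and inspect alpha-band PSD with MNE-Python.
# "session.edf" is a placeholder for a recording from any consumer EEG headset.
import mne

raw = mne.io.read_raw_edf("session.edf", preload=True)
raw.filter(l_freq=1.0, h_freq=40.0)  # band-pass 1-40 Hz

# Welch power spectral density over the alpha band (8-13 Hz), per channel
spectrum = raw.compute_psd(method="welch", fmin=8.0, fmax=13.0)
psds, freqs = spectrum.get_data(return_freqs=True)
print(psds.shape, freqs[0], freqs[-1])
```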

BCI and XR Publications
~1,400 XR papers

BCI and AR/VR Publications

Roz Picard (1995)
•Coined the term Affective Computing
Picard, R. W. (1995). Affective computing. MIT Media Laboratory Perceptual Computing Section Technical Report No. 321. Cambridge, MA, 2139, 92.

Affective Computing
•Roz Picard – MIT Media Lab
•Systems that recognize emotion

Affective Computing Publications (1998 – 2025)
~ 7,500 papers

Major Research Trends
•Mid 1990s - Early Foundations
•1997 - Affective Computing book – Roz Picard
•Late 1990s – Early 2000s Early projects
•1998 – IBM BlueEyes project – emotion sensing in practical system
•2000s – 2010s – Advancing Interaction and Modeling
•2001 – USC-ICT – modeling emotions in virtual characters
•2010s – Present – Real-World Integration and Modern Deep Learning
•2010 – Establishment of IEEE Transactions on Affective Computing
•2009 onwards – commercialization (Affectiva, Empatica)

Current State of the Art
•Widely available physiological sensors
•Shimmer, Emotibit, Plux, Empatica, etc.
•Excellent software tools
•PyAffecCT, EmoSense, EmotiEffLib, etc.
•Consumer wearable devices
•Apple, Fitbit, Garmin, Samsung, etc.
•Many companies operating
•Affectiva, nViso, Realeyes, Emteq, etc.

Affective Computing and XR Publications
~450 XR papers

Affective Computing and AR/VR Publications

Landscape Summary
•Long history in BCI/Affective computing research
•Large research communities, papers from 1980/90s.
•Many devices and tools available
•Relatively low cost, open source, commercially available
•But, little XR research
•~5% of current BCI/Affective Computing publications
•Especially in the AR space (~20% of BCI XR publications)
•Obstacles to Overcome
•Integrating multiple sensors
•Combining different raw data
•Need for multi-skilled researchers

Typical Neuro-XR setup

PhysioHMD (2018)
Toolkit for collecting physiological data from HMDs
•EEG, EMG, EOG, EDA
•Open-source software
Bernal, G., Yang, T., Jain, A., & Maes, P. (2018, October). PhysioHMD: a conformable, modular toolkit for collecting physiological data from head-mounted displays. In Proceedings of the 2018 ACM International Symposium on Wearable Computers (pp. 160-167).

OpenBCI Galea: Multiple Physiological Sensors in VR HMD
•Incorporates a range of sensors on the HMD faceplate and over the head
•EMG – muscle movement
•EOG – eye movement
•EEG – brain activity
•EDA – skin conductance; PPG – heart rate

Cognixion Axon-R
•See-through AR HMD
•8 integrated EEG sensors
•8 additional channels (ECG, EMG, EOG)
•BCI studio software

RESEARCH PROJECTS

Example Research Projects
•Measuring Presence
•Towards an objective measure
•Adaptive VR Training
•Sensing and adapting to cognitive load
•Measuring Trust in AI Agents
•Using neurophysiological cues to measure trust
•Emotionally Responsive VR
•Adaptive VR experience based on emotional state

Neurophysiological Measures of Presence
•Measuring Presence using multiple neurophysiological measures
•Combining physiological and neurological signals
Dey, A., Phoon, J., Saha, S., Dobbins, C., & Billinghurst, M. (2020, November). A Neurophysiological Approach for Measuring Presence in Immersive Virtual Environments. In 2020 IEEE International Symposium on Mixed and Augmented Reality (ISMAR) (pp. 474-485). IEEE.

Experiment Design
•Independent Variable
•Presence level of VE; High (HP), Low (LP)
•High quality
•Better visual, interaction, realistic hand
•Low quality
•Reduced visual, no interaction, cartoon hand
•Between-subjects design
•Reduce variability

Measures
•Physiological Measures
•raw electrocardiogram (ECG) (Shimmer)
•heart rate
•galvanic skin response (GSR) (Shimmer)
•phasic and tonic electrodermal activity (EDA) (see sketch below)
•electroencephalogram (EEG) (Emotiv)
•brain activity
•Subjective Measures (Presence Surveys)
•Slater-Usoh-Steed (SUS) questionnaire
•Witmer & Singer survey
•Subjects
•24 subjects, aged 20-30, 2 groups of 12
HTC Vive + Emotiv
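
The phasic/tonic decomposition mentioned above separates fast skin-conductance responses from the slow baseline level; a minimal sketch using NeuroKit2 (a library not named in the talk, shown here with a synthetic signal):

```python
# Sketch: splitting a GSR signal into phasic and tonic EDA with NeuroKit2.
# The signal is synthetic; in the study this came from a Shimmer sensor.
import neurokit2 as nk

eda = nk.eda_simulate(duration=60, sampling_rate=128)  # synthetic GSR
signals, info = nk.eda_process(eda, sampling_rate=128)

phasic = signals["EDA_Phasic"]  # fast skin-conductance responses
tonic = signals["EDA_Tonic"]    # slow baseline level
print(phasic.abs().mean(), tonic.mean())
```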

Results – Subjective Surveys
•Significant difference in Presence in both the Witmer & Singer and SUS surveys
(Charts: Witmer & Singer, SUS)

Results – EEG Analysis
•14 channels of EEG data
•Processing Alpha, Theta, Beta bands
•Multiple methods – Chirplet Transform, Power Spectral Density (sketch below), Power Load Index
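
Of these, Power Spectral Density is the most standard; a hedged SciPy sketch of per-band power for 14-channel data (the array is a random placeholder, and 128 Hz is the Emotiv headset's typical sampling rate):

```python
# Sketch: theta/alpha/beta band power from 14-channel EEG via Welch PSD.
# `eeg` is a random placeholder for a (channels x samples) Emotiv recording.
import numpy as np
from scipy.signal import welch

fs = 128
eeg = np.random.randn(14, fs * 60)

freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2)  # psd shape: (14, n_freqs)

bands = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
for name, (lo, hi) in bands.items():
    mask = (freqs >= lo) & (freqs < hi)
    band_power = psd[:, mask].mean(axis=1)  # mean PSD in band, per channel
    print(name, band_power.round(4))
```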

•Significant differences in brain activity between LP and HP environment
•Overall cognitive engagement is higher in the HP than the LP environment

Other Physiological Cues
•Significant difference in ECG
•No difference in EDA results
(Chart: heart rate values)

Lessons Learned
•Key Findings
•Higher presence in calm virtual environments can be characterised by increased heart rate and elevated beta, theta, and alpha activity in the brain
•Approaching a neurophysiological measure of presence
•Limitations
•Simple Virtual Environment
•Consumer grade EEG
•Participants seated/limited movement

EEG-based Adaptive VR Training
Dey, A., Chatburn, A., & Billinghurst, M. (2019). Exploration of an EEG-based cognitively adaptive training system in virtual reality. In 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) (pp. 220-226). IEEE.
Goal: Create an adaptive VR training system
using workload calculated from EEG

Our System
•EEG electrodes: Oz, O1, O2, Pz, P3, and P4

Adaptation/Calibration
●Establish baseline (alpha power) – innate cognitive load capacity
●Two sets of n(1, 2)-back tasks to calibrate the user's own capacity
●Measure alpha activity (task load), calculate the mean of the two tasks
●Mean → CL baseline
●In the experimental task, adapt content (as sketched below)
○load > baseline → decrease difficulty level
○load < baseline → increase difficulty level
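
A minimal sketch of this calibrate-then-adapt loop, assuming alpha-power readings arrive once per trial (all names and numbers are illustrative, not the authors' implementation):

```python
# Sketch of the calibrate-then-adapt loop above; values are illustrative.

def baseline_from_calibration(alpha_1back, alpha_2back):
    """CL baseline = mean alpha power across the two n-back calibration tasks."""
    return (alpha_1back + alpha_2back) / 2.0

def adapt_level(load, baseline, level, max_level=20):
    """Load above baseline -> easier; load below baseline -> harder."""
    if load > baseline:
        return max(level - 1, 0)
    if load < baseline:
        return min(level + 1, max_level)
    return level

baseline = baseline_from_calibration(alpha_1back=4.2, alpha_2back=5.0)
level = 10
for load in [5.1, 4.9, 4.3, 4.0]:  # placeholder per-trial alpha-power readings
    level = adapt_level(load, baseline, level)
    print(level)  # 9, 8, 9, 10
```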

Experimental Task
•Target selection
•Varying number of objects, colors, shapes, and movement
•Increasing difficulty levels (0 - 20)

Experimental Task
(Screenshots: Difficulty - Low, Difficulty - High)

User Study
●Participants
●14 subjects (6 women)
●20 – 41 years old, 28 years average
●No experience with VR
●Measures
○Response time
○Brain activity (alpha power)
•5 minutes fixed trial time

Adaptation

Results – Response Time
•No difference between easiest and hardest levels
(Chart: response time (sec.) across increasing levels)

Results – Time Frequency Representation
•Task Load
•Significant alpha synchronisation in the hardest difficulty levels of
the task when compared to the easiest difficulty levels
(Panels: Easiest, Hardest, Difference)

Key Finding + Limitations
•Findings
•Similar task time but increased brain activity
•Increased cognitive effort at higher levels to sustain performance
•Adaptive VR training can increase the user’s cognitive load without
affecting task performance
•But:
•Task: should be similar to real-world tasks
•Behaviour: difficulty levels could be designed differently
•EEG: only alpha activity (ignoring theta), only 12 electrodes

Understanding: Trust and Agents
•Many Agents require trust
•Guidance, collaboration, etc
•Would you trust an agent?
•How can you measure trust?
•Subjective/Objective measures
According to AAA, 71% of surveyed Americans are afraid to ride in a fully self-driving vehicle.

Measuring Trust
•How to reliably measure trust?
•Using physiological sensors (EEG, GSR, HRV)
•Subjective measures (STS, SMEQ, NASA-TLX)
•Relationship between cognitive load (CL) and trust?
•Novelty:
•Use EEG, GSR, HRV to evaluate trust at different CL
•Implemented custom VR environment with virtual agent
•Compare physiological, behavioral, subjective measures
Gupta, K., Hajika, R., Pai, Y. S., Duenser, A., Lochner, M., & Billinghurst, M. (2020, March). Measuring human trust in a virtual assistant using physiological sensing in virtual reality. In 2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) (pp. 756-765). IEEE.

Experimental Task
•Target selection + N-back memory task
•Agent voice guidance

Experiment Design
•Two factors
•Cognitive Load (Low, High)
•Low = N-Back with N = 1
•High = N-Back with N = 2
•Agent Accuracy (No, Low, High)
•No = No agent
•Low = 50% accurate
•High = 100% accurate
•Within Subject Design
•24 subjects (12 Male), 23-35 years old
•All experienced with virtual assistants
(Diagram: 2 × 3 experimental design)

Results
•Physiological Measures
•EEG: significant difference in alpha band power with CL
•GSR/HRV: significant differences in FFT mean/peak frequency
•Performance
•Better with more accurate agent; no effect of CL
•Subjective Measures
•Significant differences in STS scores with accuracy and CL
•SMEQ showed a significant effect of CL
•NASA-TLX showed significant effects of CL and accuracy
•Overall
•Trust in virtual agents can be measured using a combination of physiological, performance, and subjective measures
”I don’t trust you anymore!!”

Context-Aware Empathic VR
•VR application that identifies and responds to the user's emotional changes
•Emotion prediction model (EEG, EDA, HRV)
•Context aware empathic agent
•Emotion adaptive VR environment
Gupta, K., Zhang, Y., Gunasekaran, T. S., Krishna, N., Pai, Y. S., & Billinghurst, M. (2024). CAEVR: Biosignals-driven Context-aware Empathy in Virtual Reality. IEEE Transactions on Visualization and Computer Graphics, 30(5), 2671-2681.

Key Research Questions
•RQ1: How can physiological signals be used to predict emotions and
facilitate context-aware empathic interactions (CAEIxs) in VR?
•RQ2: What are the effects of CAEIxs on elicited emotions, cognitive
load, and empathy towards a virtual agent in VR?
•RQ3: How can the impact of CAEIxs on users’ emotional and
cognitive load during VR experiences be evaluated?

The CAEVR System

Empathic Appraisal
•BioEmVR: A generalized emotion recognition model
•Used EEG, HRV, and EDA Data
•Gradient boosting classifier recognized 4 emotional states with 92% accuracy (sketch below)
•Happy, Stressed, Bored, Relaxed
•Self-Projected Appraisal: To assess context and user emotional state
•Determine situational emotions based on user actions and context
•E.g., moving slowly and running late might mean the person is stressed
•Empathic Emotion Features: Synthesizing contextual and emotion information
•Tailor the VR environment based on user’s emotional and situational needs
•E.g., a user who is happy but situationally stressed might benefit from positive reinforcement
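
The classification step might look like the following scikit-learn sketch; the features are random placeholders standing in for EEG/HRV/EDA feature vectors, so this shows only the shape of the approach, not the paper's implementation or its 92% result:

```python
# Sketch: 4-class emotion recognition with gradient boosting (scikit-learn).
# Random features stand in for real EEG/HRV/EDA feature vectors.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 32))    # 32-D physiological feature vectors
y = rng.integers(0, 4, size=400)  # happy / stressed / bored / relaxed

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = GradientBoostingClassifier().fit(X_tr, y_tr)
print("accuracy:", clf.score(X_te, y_te))  # near chance on random features
```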

Empathic Response
•Used to express affective states
•Changing lighting and colours; the virtual agent's verbal and non-verbal cues
•Emotion-Adaptive Responses
•VR environment colours change depending on the user's emotional state (mapping sketched below)
•Happy = Yellow, Stress = Red
•Empathic Tone
•VR agent provides speech feedback with different tones
•Context-Aware Empathic Assistance
•Agent dialogue varies depending on user’s emotional state
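
Taken together, these responses amount to a small lookup from the predicted emotion to an environment colour and agent tone; a minimal sketch (the Bored and Relaxed entries are assumptions, since the slides only specify Happy = Yellow and Stress = Red):

```python
# Sketch: emotion-adaptive response table. Only the happy/stressed colours
# come from the slides; the other entries are illustrative assumptions.
RESPONSES = {
    "happy":    {"color": "yellow", "tone": "cheerful"},
    "stressed": {"color": "red",    "tone": "calming"},
    "bored":    {"color": "blue",   "tone": "encouraging"},  # assumption
    "relaxed":  {"color": "green",  "tone": "neutral"},      # assumption
}

def respond(emotion: str) -> dict:
    """Pick the environment colour and agent tone for a predicted emotion."""
    return RESPONSES.get(emotion, {"color": "neutral", "tone": "neutral"})

print(respond("stressed"))  # {'color': 'red', 'tone': 'calming'}
```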

Context-Aware Empathic VR Experience
AR glasses capable of adding a color filter based on the participant's emotional state.
Low-saturation colors evoke a sense of depression, while high-saturation ones lead people to feel cheerful and pleasant.

https://www.youtube.com/watch?v=cLLNXl3c1mY

System Evaluation
•Photo-taking task
•Take pictures of monuments in VR
•Independent Variables
•Emotion-Adaptive Environment (EA)
•Context-Aware Empathic Agent (CAE)
•2×2 within-subjects design, 17 subjects (7 male)
•Hypotheses
•H1: EA and CAE will improve user’s emotional states, presence, empathy
•H2: EA and CAE will significantly affect cognitive aspects (cognitive load, flow)
Conditions   No-CAE            CAE
No-EA        A: No-EA-No-CAE   C: No-EA-CAE
EA           B: EA-No-CAE      D: EA-CAE

Measures
•Subjective Measures
•IPQ for Presence
•NASA-TLX for Cognitive Load
•SAM scale for Emotional State
•Flow Short State Questionnaire (FSSQ) for flow state
•Game Experience Questionnaire (GEQ) for affect and empathy
•Physiological Measures
•EEG, Electrodermal Activity (EDA), & Heart Rate Variability (HRV)

Key Results
•SAM: Significant effect of EA and CAE on Valence
•GEQ: Significant effect of CAE on Empathy
•EA and CAE influenced EDA and HRV metrics like RMSSD
•EEG: Significant effects of CAE on FA-Theta and FB-Beta

Conclusions
•Hypotheses
•Use of EA and CAE improved emotional experience (H1)
•CAE interventions improved user’s empathy towards agent (H1)
•Integrating EA and CAE can impact cognitive load (H2)
•No impact on Flow state or Presence
•Research Questions
•RQ1: BioEmVR can predict emotions and facilitate CAEIxs (92% accuracy)
•RQ2: Using CAE virtual agents can enrich user experience (emotions, empathy)
•RQ3: Need to use a multi-faceted approach for evaluation (survey, physio. cues)
•Overall: the user's VR experience can be enhanced by adapting to real-time emotional feedback – integrating EEG, EDA and HRV

EMPATHIC COMPUTING
An important research opportunity for Neuro-XR

Modern Communication Technology Trends
1.Improved Content Capture
•Move from sharing faces to sharing places
2.Increased Network Bandwidth
•Sharing natural communication cues
3.Implicit Understanding
•Recognizing behaviour and emotion

(Diagram: Experience Capture + Natural Collaboration + Implicit Understanding → Empathic Computing)

"Empathy is Seeing with the Eyes of another, Listening with the Ears of another, and Feeling with the Heart of another."
– Alfred Adler

Empathic Computing Research Focus
Can we develop systems that allow us to share what we are seeing, hearing and feeling with others?

Sharing Heart Rate in VR
•HTC Vive HMD
•Heart rate sensor
•Empatica E4
Dey, A., Piumsomboon, T., Lee, Y., & Billinghurst, M. (2017). Effects of sharing physiological states of players in a collaborative virtual reality gameplay. In Proceedings of CHI 2017 (pp. 4045-4056).

VR Environments
•Butterfly World: calm scene, collect butterflies
•Zombie Attack: scary scene, fighting zombies

Experiment Design
•Key Question
•What is the impact of sharing heart rate feedback?
•Two Independent Variables
•Game Experience (Zombies vs butterflies)
•Heart Rate Feedback (On/Off)
•Measures
•Heart rate (player)
•PANAS Scale (Emotion)
•Inclusion of Other in the Self (IOS) scale (Connection)

Results
•Significant difference in Heart Rate
•Sharing HR improves positive affect (PANAS)
•Sharing HR created a subjective connection between collaborators
(Charts: heart rate data, Likert data)

Using Neuro-XR for Empathic Computing
•Sharing cognitive state
•Enhancing remote collaboration in XR
•Measuring brain synchronisation
•Can real-world synchronisation also happen in VR?
•Responding to synchronisation
•Adaptive XR that encourages synchronisation

Showing Cognitive Load in Collaboration
•Measure physiological cues
•Brain activity
•Heart rate
•Eye gaze
•Show user state
•Cognitive load
•Attention
Sasikumar, P., ... & Billinghurst, M. (2024). A user study on sharing physiological cues in VR assembly tasks. In 2024 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) (pp. 765-773).

Demo

User Study
•Aim
•How visual cues of physiological state affect collaboration and awareness
•Task (28 people/ 14 pairs)
•Motorbike repair
•Different levels of complexity
•Found
•Users had a preference for monitoring their partner's attentional state,
•but paid little attention to physiological cues and were unsure how to interpret them
(Charts: % of time looking at physiological cues; user preference ranking)

Brain Synchronization

Pre-training (Finger Pointing) Session Start

Post-Training (Finger Pointing) Session End

Brain Synchronization in VR


NeuralDrum: Empathic Shared MR Experiences
•Using brain synchronicity to increase connection
•Collaborative VR drumming experience
•Measure brain activity using 3 EEG electrodes
•Use PLV (phase-locking value) to calculate synchronization (sketch below)
•More synchronization increases graphics effects/immersion
Pai, Y. S., Hajika, R., Gupta, K., Sasikumar, P., & Billinghurst, M. (2020). NeuralDrum: Perceiving Brain Synchronicity in XR Drumming. In SIGGRAPH Asia 2020 Technical Communications (pp. 1-4).
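
The phase-locking value can be computed from two electrode signals via the Hilbert transform; a minimal NumPy/SciPy sketch with synthetic stand-in signals:

```python
# Sketch: phase-locking value (PLV) between two EEG channels, the measure
# NeuralDrum uses for inter-brain synchronization. Signals are synthetic.
import numpy as np
from scipy.signal import hilbert

fs = 250
t = np.arange(0, 10, 1 / fs)
x = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)        # player A
y = np.sin(2 * np.pi * 10 * t + 0.3) + 0.5 * np.random.randn(t.size)  # player B

phase_x = np.angle(hilbert(x))  # instantaneous phase via analytic signal
phase_y = np.angle(hilbert(y))
plv = np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))  # 0 = none, 1 = locked
print(f"PLV = {plv:.2f}")
```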

Set Up
•HTC Vive HMD
•OpenBCI
•3 EEG electrodes

Results
"It’s quite interesting, I actually felt like my
body was exchanged with my partner."
Poor Player Good Player

CONCLUSIONS

Conclusions
•Significant increase in BCI, AC research
•Hardware and software becoming widely available
•However, little research in XR
•Many applications for NeuroXR
•Measuring Presence
•Adaptive VR experiences
•Creating empathic VR systems
•Empathic Computing
•Focus on enhancing collaboration
•Opportunities to apply research in Neuro-XR

Empathic Computing Journal
•Looking for submissions
•Any topic relevant to Empathic Computing
•Open Access, currently free to publish
Submit intent at https://forms.gle/XXHkWh5UVQazbuTx7

www.empathiccomputing.org
@marknb00
[email protected]

Tools for Neuro-XR: Octopus Sensing
●Octopus-sensing (usage sketch below)
○Simple unified interface for
●Simultaneous data acquisition
●Simultaneous data recording
○Study design components
●Octopus-sensing-monitoring
○Real-time monitoring
●Octopus-sensing-visualizer
○Offline synchronous data visualizer
●Octopus-sensing-processing
○Real-time processing
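
For flavour, a sketch of the acquisition pattern, loosely adapted from the library's documented usage; exact class and argument names should be checked against the Octopus Sensing documentation, and the device, output path, and IDs here are placeholders:

```python
# Sketch of the Octopus Sensing acquisition pattern (verify names against
# the docs); the Shimmer device, output path, and IDs are placeholders.
from octopus_sensing.device_coordinator import DeviceCoordinator
from octopus_sensing.devices import Shimmer3Streaming
from octopus_sensing.common.message_creators import start_message, stop_message

coordinator = DeviceCoordinator()
coordinator.add_devices([Shimmer3Streaming(name="Shimmer", output_path="./output")])

# Markers keep all devices' recordings aligned with the stimulus
coordinator.dispatch(start_message("experiment-01", "stimulus-01"))
# ... present the stimulus to the participant here ...
coordinator.dispatch(stop_message("experiment-01", "stimulus-01"))
coordinator.terminate()
```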

Octopus Sensing Visualizer
●Visualizing raw or processed data using a config file

Octopus Sensing
●Multiplatform (Linux, Mac, Windows)
●Open-source (https://github.com/octopus-sensing)
●Supports various sensors
a. OpenBCI
b. Brainflow
c. Shimmer3
d. Camera
e. Audio
f. Network (Unity and Matlab)
Saffaryazdi, N., Gharibnavaz, A., & Billinghurst, M. (2022). Octopus Sensing: A Python library for human behavior studies. Journal of Open Source Software, 7(71), 4045.