IVE 2024 Short Course Lecture 9 - Empathic Computing in VR
About This Presentation
IVE 2024 Short Course Lecture 9 on Empathic Computing in VR.
This lecture was given by Kunal Gupta on July 17th 2024 at the University of South Australia.
Slide Content
Empathic Computing in VR: A Bio-Sensing Approach
Kunal Gupta
●VR Needs to Understand Users’ Emotions and Individual Differences
●VR Needs to Adapt Based on Psycho-Physiological Responses
●Trustworthy Virtual Agents Need to Respond Empathetically
●Empathic Systems Should Be Context-Aware
What is an Emotion?
Emotions are Real, but Not Objective
Emotions are Real in the Same Sense as Money is Real
A conscious mental reaction subjectively experienced as strong feeling, usually directed toward a specific object, and typically accompanied by physiological and behavioral changes in the body.
Source: Verywell
The Mind Explained
Emotions Are Universal
— CHARLES DARWIN, “The Expression of the Emotions in Man and Animals”
What Emotion Do You See in the Face?
Source: “How Emotions Are Made” - Lisa Feldman Barrett
What Emotion Do You See in her Face Now?
Source: “How Emotions Are Made” - Lisa Feldman Barrett
RQ 3
How can context-aware and empathic VR systems be developed to accurately assess and respond to users’ emotional states?
RQ 3: How can Context-Aware Empathic Interactions (CAEIxs) in VR environments enhance the emotional and cognitive aspects of UX?
Empathy is “an affective response more appropriate to another’s situation than one’s own”
— Martin L. Hoffman
Model of Empathy
PAM (Perception-Action Model) of Empathy (de Waal, 2008)
●Empathic Appraisal / Cognitive Empathy: detects user emotions, appraises event causes by self-projection, and forms an empathic emotion.
●Empathic Response: involves adapting and expressing responses to the perceived empathic emotion.
Empathic Appraisal
●AffectivelyVR (Physiological Emotions): a real-time VR emotion recognition system using EEG, EDA, and HRV with up to 96.1% generalised accuracy, enabling CAEIxs.
●Self-Projected Appraisal (SPA) (Situational Emotions): projects oneself into the user’s situation to understand their emotional state and appraise the event that caused the emotional response; mainly responsible for providing contextual understanding.
●Empathic Emotion Features (EEFs) = AffectivelyVR + SPA: provide a real-time blend of AffectivelyVR’s physiological emotions and SPA’s situational emotions to guide empathic VR interactions.
AffectivelyVR: Learnings
●Brainwaves (EEG), Heart Rate Variability (HRV), and Skin Sweat Responses (EDA) can sense emotional states in VR
●The AffectivelyVR framework can be used to design emotion recognition systems
●Machine Learning classification techniques such as SVM and KNN can be implemented to model personalized emotions with up to 96.5% cross-validation accuracy (a hedged sketch follows).
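As a concrete illustration of that last point, here is a minimal scikit-learn sketch of the classification step. The feature matrix `X` and labels `y` are random placeholders standing in for features extracted from EEG/HRV/EDA windows; the actual AffectivelyVR feature set, kernel, and hyperparameters are not specified on the slide.

```python
# Minimal sketch: SVM and KNN cross-validation on placeholder
# physiological features (not the thesis's actual feature set).
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 32))    # placeholder EEG/HRV/EDA window features
y = rng.integers(1, 5, size=200)  # placeholder labels: 1=Happy ... 4=Relax

for name, clf in [("SVM", SVC(kernel="rbf", C=1.0)),
                  ("KNN", KNeighborsClassifier(n_neighbors=5))]:
    pipe = make_pipeline(StandardScaler(), clf)   # scale, then classify
    scores = cross_val_score(pipe, X, y, cv=5)    # 5-fold cross-validation
    print(f"{name}: {scores.mean():.3f} +/- {scores.std():.3f}")
```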
Self-Projected Appraisal
●Emulates the user’s perspective to understand needs and desires.
●Contextual understanding defines interaction states.
●Goal: Elicit situational emotions (SE) based on user activity and goals.
●Examples:
○Navigation: stress if there is insufficient time to reach the destination; happiness if on track.
○Photo-taking: sadness if insufficient photos remain; happiness upon capturing unique perspectives.
●Uses contextual appraisal rules (a hedged sketch follows this list):
○Navigation: aim to visit the maximum number of monuments; elicits emotions based on estimated time (ET) vs. remaining journey time (RJT).
○Photo-taking: emphasizes multiple shots for diverse perspectives; emotions based on remaining photos (RP) and monuments left (ML).
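A minimal sketch of appraisal rules in this spirit, assuming the simple comparisons described above; the function names and exact conditions are illustrative, not taken from the thesis.

```python
# Hedged sketch of SPA-style contextual appraisal rules.
def appraise_navigation(estimated_time: float, remaining_journey_time: float) -> str:
    """Situational emotion for navigation: stress if the estimated time (ET)
    exceeds the remaining journey time (RJT), else happy."""
    return "stress" if estimated_time > remaining_journey_time else "happy"

def appraise_photo_taking(remaining_photos: int, monuments_left: int) -> str:
    """Situational emotion for photo-taking: sad if the remaining photos (RP)
    cannot cover the monuments left (ML), else happy."""
    return "sad" if remaining_photos < monuments_left else "happy"

print(appraise_navigation(estimated_time=5.0, remaining_journey_time=3.0))  # stress
print(appraise_photo_taking(remaining_photos=7, monuments_left=2))          # happy
```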
Empathic Emotion Features
●Goal: Guide interactions based on the user’s emotional and contextual info.
●Categories:
○Empathic Emotion (EE): reflects the user’s current emotion (PE or SE); enhances emotional connection and bonding.
○Comfort & Reassurance (CR): provides emotional support and validation, promoting well-being.
○Motivation & Support (MS): encourages action and goal-pursuit.
○Positive Reinforcement & Encouragement (PRE): boosts self-esteem and confidence.
○Urgency & Pressure (UP): drives immediate action when needed.
Empathic Emotion Features
●Selection Criteria (sketched in code below):
○Positive SEs use PE; negative SEs use SE for EE.
○CR, MS, PRE, UP: Boolean-based (1 = focus on that category).
●The EEF table in this research is logically designed around the features involved in context-relevant empathic interactions.
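The selection rule can be encoded compactly. This is a sketch only: the EE rule follows the criteria above, but the Boolean CR/MS/PRE/UP assignments are assumed stand-ins, since the slide does not reproduce the actual EEF table.

```python
# Illustrative encoding of EEF selection; booleans are assumed, not the
# thesis's actual table.
from dataclasses import dataclass

POSITIVE = {"happy", "relax"}  # positive situational emotions

@dataclass
class EEF:
    ee: str    # Empathic Emotion: PE if the SE is positive, else the SE itself
    cr: bool   # Comfort & Reassurance
    ms: bool   # Motivation & Support
    pre: bool  # Positive Reinforcement & Encouragement
    up: bool   # Urgency & Pressure

def build_eef(pe: str, se: str) -> EEF:
    ee = pe if se in POSITIVE else se          # slide's stated EE rule
    return EEF(ee=ee,
               cr=(se in {"stress", "sad"}),   # assumed mapping
               ms=(se == "sad"),               # assumed mapping
               pre=(se in POSITIVE),           # assumed mapping
               up=False)                       # UP driven by task timing elsewhere

print(build_eef(pe="happy", se="stress"))
```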
Empathic Response Strategies
●Emotion Contagion: the tendency to automatically mimic and synchronise responses with another person’s emotional states.
●Emotion Regulation: the process of managing one’s own emotional reactions to better respond to a situation.
Empathic Response
●Empathic Tone: enhances emotional understanding, builds trust, and bolsters system credibility [Breazeal et al., Rosenzweig et al.].
●Empathic Dialogue: in relational virtual agents, acknowledges user emotions, enhancing trust and rapport.
Empathic Tone
●Background: Neural network-based speech emotion recognition was explored in prior research [104]. Gap identified: no specific research on the best TTS tone style as an empathic response for each emotional state.
●Approach: Azure TTS was utilized to enable a virtual agent to express emotion through voice. The EmpatheticDialogues dataset was chosen for its range of emotion-laden conversations [Rashkin et al.].
●Goal: Identify the most empathetic tone for four key emotional states.
●Examples:
○For the “Happy” emotion, conversations from the dataset tagged as ‘happy’ were used.
○For the “Stress” emotion, conversations labeled ‘stress’ were selected, and so on.
Empathic Tone
●Pilot Study Method:
○Conversations were played with multiple tones like ‘angry’, ‘cheerful’, and ‘sad’ using the “en-US-JennyNeural” voice.
○Six participants rated empathy levels post-listening on a 5-point Likert scale.
○Results were mapped on a 2D Valence-Arousal plot to determine the best tone for each emotion.
●Findings: ‘excited’ was the most empathetic tone for Happy, ‘empathetic’ for Stress, ‘hopeful’ for Sadness, and ‘whispering’ for Relaxation (see the SSML sketch below).
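A minimal sketch of how these findings could be turned into Azure TTS markup. The style names match Azure's documented express-as styles for en-US-JennyNeural; the `generate_ssml` helper itself is illustrative, not the thesis's code.

```python
# Map the pilot-study findings to Azure TTS express-as styles and emit SSML.
TONE_FOR_EMOTION = {
    "happy": "excited",
    "stress": "empathetic",
    "sad": "hopeful",
    "relax": "whispering",
}

def generate_ssml(text: str, emotion: str, voice: str = "en-US-JennyNeural") -> str:
    """Wrap text in SSML with the empathic tone chosen for the emotion."""
    style = TONE_FOR_EMOTION.get(emotion, "friendly")  # assumed fallback style
    return (
        '<speak version="1.0" xmlns="http://www.w3.org/2001/10/synthesis" '
        'xmlns:mstts="https://www.w3.org/2001/mstts" xml:lang="en-US">'
        f'<voice name="{voice}">'
        f'<mstts:express-as style="{style}">{text}</mstts:express-as>'
        '</voice></speak>'
    )

print(generate_ssml("Splendid shot! You have 7 photos left.", "happy"))
```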
Empathic Dialogue
●Objective: Design virtual agents that offer contextually relevant empathic responses.
●Incorporating Empathy: Use Empathic Emotion Features (EEFs).
●Role: Guides the selection of empathic dialogues tailored to user emotions and context.
●Past Approaches:
○Rule-based state machine systems [187]
○AI techniques with deep learning [245]
●Our Structure: Context-Aware Empathic Assistance (CAEA)
○CAEA = Empathic Dialogue + Assistance + UP (if applicable)
Empathic Dialogue
●EEFs Implementation (composed as sketched after the examples below):
○Based on: type of assistance, user emotions (both physiological and situational), and the desired empathic response (e.g., CR, MS, PRE, UP).
●Example (Photo-taking):
○Scenario: User is happy with sufficient photos remaining.
○EEFs Suggestion: Use the PRE feature.
○Response: “Splendid shot! You have 7 photos left. Do you want to save this?”
●Example (Navigation):
○Scenario: User is stressed in a relaxed situation.
○EEFs Suggestion: Use the CR feature.
○Response: “It's okay, take a moment. I'm here to help. Walk towards the right.”
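A sketch of the CAEA composition rule (Empathic Dialogue + Assistance + UP). The dialogue templates are illustrative stand-ins for the thesis's dialogue set, chosen to reproduce the two example responses above.

```python
# Hedged sketch: assemble a CAEA utterance from an EEF category,
# assistance content, and an optional urgency (UP) suffix.
DIALOGUE = {
    "PRE": "Splendid shot!",
    "CR": "It's okay, take a moment. I'm here to help.",
    "MS": "You can do this.",  # assumed template
}

def build_caea(eef_category: str, assistance: str, urgent: bool = False) -> str:
    parts = [DIALOGUE.get(eef_category, ""), assistance]
    if urgent:  # UP appended only when applicable
        parts.append("Hurry, time is running out!")  # assumed UP phrasing
    return " ".join(p for p in parts if p)

print(build_caea("PRE", "You have 7 photos left. Do you want to save this?"))
print(build_caea("CR", "Walk towards the right."))
```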
Emotion-Adaptive Responses
●Environmental Colors - Emotions Relationship: research shows specific colors like orange evoke tenderness, purple incites anguish, bright/saturated colors are pleasant, green induces relaxation, and yellow promotes feelings of joy and empathy.
●Reappraisal Technique: used the “emotional awareness and acceptance” technique, promoting mindfulness of emotions and enhancing understanding and regulation.
●Color Assignments: based on these insights, PVPA (Happy) was linked with yellow, NVPA (Stress) with red, NVNA (Sad) with blue, and PVNA (Relax) with green.
Emotion-Adaptive Responses
●Implementation of EAR:
○Texture Application: used a texture on the user’s ‘MainCamera’, simulating the feeling of wearing coloured sunglasses.
○Emotion-Adaptive Cues: focused on color adaptation for emotion cues rather than empathic ones.
○Color Manipulation: adjusted the texture’s color based on the physiological emotion (PE) derived from AffectivelyVR, following the established color assignments.
VRdoGraphy
Context: Positive Psychotherapy
A VR photography competition in which participants try their best to capture 8 monuments.
●Photowalks: 4
●Monuments per walk: 2
●Time per walk: 7 minutes
The top 3 photos win a $20 Westfield voucher, and each participant receives a postcard of their favorite VR photo.
Experimental Procedure
Start → Pre-Study Procedure (20 minutes) → 4 × [Rest (1 minute) → Condition (10 minutes) → Questionnaires (4 minutes)] → Post-Study Procedure (10 minutes) → Finish
Total Duration: ~90 minutes
System Architecture
Biosignal Manager
●Core Functionality: gathers and streams biosignals (16-channel EEG, EDA, PPG) during VR sessions using the Lab Streaming Layer (LSL) protocol and the LSL4Unity API (a pylsl sketch follows).
●Sampling Rates: EEG sampled at 125 Hz; EDA and PPG at 128 Hz.
●Data Synchronization: implements event markers like start, stop, CFA, and photo capture within Unity to address asynchronous data collection risks.
●Enhanced Features: captures timestamps and predicted emotional states from AffectivelyVR, and stores data in CSV for offline analysis.
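The slide's pipeline uses LSL4Unity inside Unity; the same LSL pattern looks like this in Python with pylsl. The stream names ("Markers", type "EEG") and source id are assumptions, since the actual CAEVR stream identifiers are not given.

```python
# Hedged sketch of the Biosignal Manager's LSL plumbing using pylsl.
from pylsl import StreamInfo, StreamInlet, StreamOutlet, resolve_byprop

# Event-marker outlet: irregular-rate string stream for events such as
# start/stop/photo capture (names assumed).
marker_info = StreamInfo(name="Markers", type="Markers", channel_count=1,
                         nominal_srate=0, channel_format="string",
                         source_id="caevr_markers")
marker_outlet = StreamOutlet(marker_info)
marker_outlet.push_sample(["photo_capture"])

# EEG inlet: pull samples with LSL timestamps so the asynchronous
# EEG/EDA/PPG streams can be aligned offline.
streams = resolve_byprop("type", "EEG", timeout=5.0)
inlet = StreamInlet(streams[0])
sample, timestamp = inlet.pull_sample(timeout=1.0)
print(timestamp, sample)
```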
AffectivelyVR
●Mapping Emotions in the VR Photowalk Experience:
○Happy (PVPA): linked with joy and excitement, perhaps from capturing an exceptional photograph or the thrill of potentially winning.
○Relaxation (PVNA): representing a contented calm, possibly after securing a satisfactory shot or immersing in the VR scenery.
○Stress (NVPA): reflecting anxiety, possibly due to not getting the ideal shot, feeling time-constrained, or facing competition.
○Sad (NVNA): indicating disappointment, like missing a photo opportunity or not performing as hoped.
●Continuous to Discrete Emotion Integration:
○While emotions inherently span a continuous spectrum, discrete categorizations simplify data interpretation, user feedback, and system interventions in VR.
○The chosen discrete emotions (Happy, Relaxation, Stress, Sad) are contextually relevant to the VR photography contest experience and its rewards.
AffectivelyVR
●Emotion Classification:
○The ‘AffectivelyVR ML’ model predicts emotions like ‘happy’ (PVPA), ‘stress’ (NVPA), ‘sad’ (NVNA), and ‘relaxation’ (PVNA) with 83.6% cross-validation accuracy.
○These emotions were tailored to the VR photography experience of the study.
●Data Collection & Storage:
○Biosignal data is collected from the Biosignal Manager system via LSL.
○Data is initially buffered for 60 seconds (7500 EEG samples and 7680 EDA/PPG samples).
●Data Processing & Overlapping Window:
○After reaching capacity, the first 30 seconds of data are deleted, ensuring a continuous 60-second window with a 30-second overlap.
○Emotional states are predicted every 30 seconds after the initial 60 seconds (a buffer sketch follows).
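A minimal sketch of that overlapping-window scheme for the EEG stream: a 60-second buffer (7500 samples at 125 Hz) that slides forward by 30 seconds, yielding a prediction every 30 seconds after the first full window. `classify` is a placeholder standing in for the AffectivelyVR preprocessing, feature extraction, and ML model.

```python
# Sliding 60 s window with 30 s overlap for 125 Hz EEG.
from collections import deque

FS_EEG = 125
WINDOW = 60 * FS_EEG   # 7500 samples = 60 s buffer
HOP = 30 * FS_EEG      # 3750 samples = 30 s step

buffer = deque(maxlen=WINDOW)  # oldest samples fall off the front automatically
since_last = 0

def classify(window):
    """Placeholder for the AffectivelyVR ML model."""
    return 1  # 1=Happy, 2=Stress, 3=Sad, 4=Relaxation

def on_sample(sample):
    """Called once per incoming EEG sample from LSL; returns an emotion
    code to stream back to Unity, or None between predictions."""
    global since_last
    buffer.append(sample)
    since_last += 1
    if len(buffer) == WINDOW and since_last >= HOP:
        since_last = 0
        return classify(list(buffer))
    return None
```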
AffectivelyVR
●Emotion Prediction & Streaming:
○Data undergoes preprocessing and feature extraction, then is classified with the AffectivelyVR ML model.
○Predicted emotional states are streamed back to Unity with classifications: 1 for Happy, 2 for Stress, 3 for Sad, and 4 for Relaxation.
Companion Manager
●Functionality & Activation: manages companion interactions, offers voice assistance, oversees photos, and aids in time management; only activates upon an assistance request.
●Contextual Data Collection: extracts and streams navigation data like direction, journey duration, on-time status, and photo details to the Companion Adaptor via LSL.
●Companion Visualization & Speech: portrayed as an orb-shaped avatar with speech enabled by Microsoft Azure’s TTS SDK; allows customization and integration with SSML for nuanced speech playback in Unity.
●Trigger Points & Management: the companion activates during events like photo capture, the halfway time mark, the last minute, and the last ten seconds. It tracks and manages photo count and storage locally.
Companion Adaptor
●System Foundation:
○Developed using Python 3.7.
○Aim: collect context from the Companion Manager system, determine the relevant TTS tone style, empathic dialogue, and assistance content, and then produce the SSML file for the TTS SDK.
●Data Collection and Emotional Analysis:
○Used the Lab Streaming Layer (LSL) to gather contextual data.
○Derived the situational emotion (SE) as per Self-Projected Appraisal (SPA).
●Merged the current emotional state (physiological emotion) with the SE to generate Empathic Emotion Features (EEFs) as per the guidelines in the EEF table.
●Used the Empathic Emotion (EE) to pick the right tone style and empathic dialogue.
Companion Adaptor
●Assistance Content Determination:
○Content is set based on activity type and the given info (e.g., for navigation, “Walk towards Right” if the direction to the next monument is “Right”).
○For photo activities, it provides photo count details (e.g., “You have 5 photos left. Do you wish to save this photo?”).
●In CAE systems, the empathic dialogue is combined with the assistance content; non-CAE systems receive the assistance content alone.
●File Generation and Communication:
○Generates an SSML file with the chosen tone and content.
○Once file creation is complete, an acknowledgment is sent to Unity’s Companion Manager system through LSL.
RQ 3: Experimental Methodology
●2x2 within-subject study with 14 participants (7 female)
●Independent Variables:
○Emotion-Adaptive Environment (EA)
○Empathic Companion (EC)
●Dependent Variables:
○Physiological: Electroencephalogram (EEG), Electrodermal Activity (EDA), and Heart Rate Variability (HRV)
○Subjective: Self-Assessment Manikin (SAM) scale for emotional state, IGroup Presence Questionnaire (IPQ) for presence, NASA-TLX for cognitive load, Game Experience Questionnaire (GEQ) for positive/negative affect and empathy, Flow Short State Questionnaire (FSSQ) for flow state, and System Trust Scale (STS) for the user’s trust in the agent
○Behavioral: task performance, error rate, rate of assistance
●Conditions:
○No-EA: NEC-NEA (No-EC), EC-NEA (EC)
○EA: NEC-EA (No-EC), EC-EA (EC)
RQ 3: CAEIxs Enhancing User Experience in VR
●Valence Shift: CAE substantially enhanced valence (SAM ratings, p < 0.001), indicating an improved emotional state.
●Empathy in VR: significant increase in empathy towards virtual agents when CAE is present (GEQ scores, p = 0.004).
●Cognitive Demand: CAE linked to lower task load perception (NASA-TLX scores, p = 0.01), suggesting more effortless interactions.
●Trust Dynamics: CAE exerted a significant effect on the System Trust Scale (STS ratings, p = 0.01).
RQ 3: CAEIxs Enhancing User Experience in VR
●Physiological Indicators: EA influenced both skin conductance (EDA-PA, p = 0.002) and heart rate variability (LF/HF, p < 0.01), marking physiological engagement.
●EEG Correlations: CAE significantly impacted specific EEG markers like Frontal Central Theta (FC-Theta, p < 0.006) and Frontal Parietal Beta (FP-Beta, p < 0.006), possibly related to cognitive processing and attention.
Lessons Learned
●Adaptability in Research:
○Flexibility to adapt experiments and theories based on emerging data
○The importance of iterating design based on user feedback and physiological data insights
●Methodological Rigor:
○The need for comprehensive data collection methodologies to ensure the robustness of findings
○The importance of balancing qualitative insights with quantitative and behavioral data for a well-rounded understanding
●Interdisciplinary Approach:
○Leveraging knowledge from psychology, HCI, and data analysis to enrich VR research
○Collaborating across disciplines to innovate and refine empathic VR technologies

Lessons Learned
●User-Centered Design:
○Placing user experience at the forefront when developing VR systems
○Understanding the underlying process of personalization is the key to effective empathic response in VR
●Technological Challenges:
○Navigating the limitations of current VR and sensor technology
○Recognizing the importance of advancing hardware for more nuanced data capture and interaction
●Ethical Considerations:
○Addressing privacy concerns with biometric data usage
○Considering the psychological impact of immersive experiences on diverse user groups
ABOUT ME
Research Fellow at the Empathic Computing Laboratory within the Auckland Bioengineering Institute at The University of Auckland, working with Prof. Mark Billinghurst.
Research revolves around the junction of Empathic Computing, Virtual Reality, Digital Agents, and Context-Aware Interactions.
INDUSTRY EXPERIENCE
Worked as a UX Researcher at Google, India
●Various projects related to Google Search
●Social Impact: education, civic services, and health
Interaction Designer at an assistive wearable startup in India
●Lechal - smart insoles assisting in eyes-free navigation through haptic feedback and tracking fitness metrics
●Lechal 2.0 - additional feature: fall risk intervention for the elderly by understanding the user’s walking behavior and predicting fall risk
Experimental Conditions
A. NEC-NEA (No Empathic Companion - No Emotion Adaptation)
●Baseline condition with goal and tasks
●No color filter
●VA provides on-demand assistance, with recommendations for navigation and task assistance

B. NEC-EA (No Empathic Companion - Emotion Adaptation)
●Baseline condition with goal and tasks
●Color filters are added depending on the real-time emotional state
●VA provides on-demand assistance, with recommendations for navigation and task assistance
Experimental Conditions
C. EC-NEA (Empathic Companion - No Emotion Adaptation)
●Baseline condition with goal and tasks
●No color filter
●VA provides on-demand assistance, with recommendations for navigation and task assistance, starting with a context-relevant empathic dialogue in an emotional tone

D. EC-EA (Empathic Companion - Emotion Adaptation)
●Baseline condition with goal and tasks
●Color filters are added depending on the real-time emotional state
●VA provides on-demand assistance, with recommendations for navigation and task assistance, starting with a context-relevant empathic dialogue in an emotional tone
CAEVR System
AR glasses capable of adding a color filter based on the participant’s emotional state.
●Low-saturation colors evoke a sense of depression, while high-saturation ones lead people to feel cheerful and pleasant