IVE 2024 Short Course - Lecture 18 - Hacking Emotions in VR Collaboration

marknb00 · 45 slides · Jul 21, 2024

About This Presentation

IVE 2024 short course on the Psychology of XR, Lecture 18 on Hacking Emotions in VR Collaboration.

This lecture was given by Theo Teo on July 19th 2024 at the University of South Australia.


Slide Content

Hacking Emotions in VR Collaboration
Sharing Cues that Facilitate Emotional Consensus
Theo Teo
Empathic Computing Lab
The Psychology of XR
19 July 2024

Existing Works on XR Social Computing

XR Social Computing since 2012
•Visual cues have been widely used for sharing thoughts and emotions
•Auditory and tactile feedback are minor modalities, but increasing since 2020
•2× more publications in the last 2 years
[Charts: number of publications on XR Social Computing; three major modalities used in XR Social Computing]

So.. Is this area well-studied?
Not really…

Human Interaction

Human Interaction | Task Collaboration

Human Interaction | Task Collaboration | Psychology XR

Existing Gaps
•Limited studies on sharing emotions in a collaboration scenario
•Emotional consensus
•Conflict resolution
•Engagement and Communication
•Empathy
•Cognitive Empathy
•Emotional Empathy
•Why it happens
•How it happens

Journey towards Hacking Emotions
•Virtual Reality collaboration
•Sharing and understanding emotions between collaborators
•Addresses existing gaps
•Technique that is transferable (design implementation)

Identifying the Research Topic
•Sharing emotions was built on the foundation of 'putting the cue closer to your body or perspective'
•Is there a way to bring this to a collaboration scenario where there is always a task to focus on (together)?

Prologue: Bodily Emotional Cues
•Sharing emotions
•Facial expressions
•Facial emojis
•Emotional cues / effects
•Attaching emotional cues
•Own Hand
•Grabbing Object

A Quick Glance
Hand | Object | Gaze

Is there a way for
in-depth perception?

3D Face on Grabbing Object
3D Face on Hand

2D Face vs 3D Face - A Pilot Test
•2D Face
•Less obstructing
•More natural (akin to screen display)
•Harder to notice
•3D Face
•Able to see from wider angle
•More suitable for interaction tasks
•Emotions were understood better

Impact of Facial Cue Positions on Understanding Emotions - A User Study
•What role does the facial cue's position play in emotional understanding during VR collaboration?
•Three conditions (attachment positions)
•Baseline (no facial cue)
•One's own hand
•One's own grabbed object

The Furniture Negotiation Task
•Dyad study
•Furniture Arrangement based on task requirements
•Instructions were unique to each side (Conflict, Non-conflict)

[Figure: in-task satisfaction probes]
"How satisfied are you?" / "How satisfied do you think your partner is?"
Each rated on a 5-point scale from 1 (not very satisfied) to 5 (very satisfied)

[Figure: dyad discussion scene, repeated across three slides, with speech bubbles: "This is.. GREAT!!!", "I preferred A because B and C..", "I'm so sorry for breathing the air!!"]
Measures overlaid per slide: blink and gaze focus; then blink, heart rate, and galvanic skin response

Results
•Correlations on emotional consensus were found when the facial cue was attached to the hand (MH)
•Self-reported emotional consensus is better when there is a facial cue
A: "I feel…"  B: "I think my partner feels…"  C: "My partner feels…"  D: "My partner thinks I feel…"

Results - Continued
Participant comments:
•"Unnatural feeling…"
•"A bit scary to me to see other's face on my hand.."
•"Facial cue is not that obvious"
•"hand face was upsetting to look at"
•"It makes me scare, so I don't pay attention to the avatar"
•"I didn't like a face staring me on the object"
•"looks a bit... mystical (like Harry Porter)"
•"face was a bit creepy"
•"Cue on Hand was the easiest way to see my partners emotions"
•"having two separate representations of my partner"
•"Facial cue on Hand is helpful"
•"Seeing the face on the hand was cool."
•"Poke my partner"
•"I could easily see my partner's face at all times when moving objects"
•"having facial cue pop up on object reminded me to check in with how my partner was feeling."


Key takeaways
•Attaching facial cues closer to the view of interest can influence awareness and expression of emotions between collaborators
•There might also be some effects of the hand-attached facial cue on gaze behaviour
•Except…
•The 3D face is unnatural, creepy and scary
•The task could be made more 'engaging'

Improving the Facial Cue from our Findings
•Less creepy, while maintaining similar or better interpretation of emotions
•Task to include only a mutual goal, but complex enough to require discussion
•More variations on attachable positions for the facial cue

SelfHand

SelfObject

PartnerHand

PartnerObject

Revising the Study of Facial Cue Positions on Understanding Emotions
•What role does the facial cue's position play in emotional understanding during VR collaboration?
•Five conditions (attachment positions)
•Baseline (no facial cue)
•One's own hand
•One's own grabbed object
•Task partner's hand
•Task partner's grabbed object

Urban Planning Discussion Task
•Dyad study
•Discuss and arrange task objects onto the correct task grid
•Instructions were split in half (one half for each side)
•A red button that randomly appears every 20-25 s to increase the workload
•Emotion Sync Indicator
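The distractor-button timing described above can be sketched as follows. This is only an illustration of the stated 20-25 s random interval; the function name and parameters are assumptions, not the study's actual code.

```python
import random

def button_schedule(duration_s, lo=20.0, hi=25.0):
    """Sample the times (in seconds) at which the red workload button
    appears: each gap is drawn uniformly from the 20-25 s interval."""
    times, t = [], 0.0
    while True:
        t += random.uniform(lo, hi)
        if t >= duration_s:
            break
        times.append(t)
    return times

random.seed(7)  # reproducible demo run
print(button_schedule(120.0))
```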

Demo Submitted to SA24 - XR
Theo Teo, Allison Jing, Xuan Tien, Gun A. Lee

So.. next step?

Adaptive Environments: Create environments that change in real time based on the group's emotional state, for instance calming visuals and sounds when stress levels are high.
Emotion Recognition: Implement AI systems that analyse facial expressions, body language, and voice tones to assess and respond to participants' emotional states in real time.
Group Biofeedback: Study the impact of displaying aggregated biofeedback data to the group to enhance empathy and understanding of each other's emotional states.
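As one concrete reading of the group-biofeedback and adaptive-environment ideas above, a group stress index could be aggregated from per-user signals and mapped to an ambient preset. This is a hedged sketch: the weights, threshold, names, and sample values are all illustrative assumptions, not an existing system.

```python
def group_stress(users):
    """Average per-user stress, where each user supplies normalized
    heart rate ('hr') and galvanic skin response ('gsr') in [0, 1]."""
    per_user = [0.5 * u["hr"] + 0.5 * u["gsr"] for u in users.values()]
    return sum(per_user) / len(per_user)

def ambient_preset(stress, threshold=0.6):
    """Switch to calming visuals/sounds when group stress is high."""
    return "calming" if stress > threshold else "neutral"

# Hypothetical readings from a dyad
readings = {"userA": {"hr": 0.9, "gsr": 0.8},
            "userB": {"hr": 0.7, "gsr": 0.6}}
print(ambient_preset(group_stress(readings)))
```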

On top of that..
•Multimodal empathy / emotional sharing
•Visual + haptic; audio + haptic; haptic + haptic, etc.
•Expand the haptic modality
•Thermal and vibrotactile feedback, pressure, actuators, and more
•Empathic XR that involves 3 or more users
•Group emotions vs. individual emotions; selective emotion sharing
•Further into hacking emotions
•Embodiment, cognitive load, attention, etc.

Summing Up
“Build empathy and better understanding through sharing each other's emotional states to enhance collaboration in Mixed Reality” (Ens et al., 2019 survey on future MR collaboration topics and directions)
•Studying the understanding of emotions, particularly empathy, in immersive collaboration is still an expanding topic
•Positioning emotional cues closer to yourself or your partner could be helpful for understanding emotions

Hacking Emotions in VR Collaboration
Sharing Cues that Facilitate Emotional Consensus
Thank you!