IVE 2024 Short Course - Lecture 18 - Hacking Emotions in VR Collaboration
About This Presentation
IVE 2024 short course on the Psychology of XR, Lecture 18 on Hacking Emotions in VR Collaboration.
This lecture was given by Theo Teo on 19 July 2024 at the University of South Australia.
Slide Content
Hacking Emotions in VR Collaboration
Sharing Cues that Facilitate Emotional Consensus
Theo Teo
Empathic Computing Lab
The Psychology of XR
19 July 2024
Existing Works on XR Social Computing
XR Social Computing since 2012
•Visual cues have been widely used for sharing thoughts and emotions
•Auditory and tactile feedback are minor but increasing since 2020
•Publications have doubled in the last two years
[Charts: # of publications on XR Social Computing; the three major modalities used in XR Social Computing]
So.. Is this area well-studied?
Not really…
Human Interaction | Task Collaboration | Psychology of XR
Existing Gaps
•Limited studies on sharing emotions in a collaboration scenario
•Emotional consensus
•Conflict resolution
•Engagement and Communication
•Empathy
•Cognitive Empathy
•Emotional Empathy
•Why it happens
•How it happens
Journey towards Hacking Emotions
•Virtual Reality collaboration
•Sharing and understanding emotions between collaborators
•Addresses existing gaps
•Technique that is transferable (design implementation)
Identifying the Research Topic
•Sharing emotions was built on the foundation of ‘putting the cue closer to your body or perspective’
•Is there a way to bring this to a collaboration scenario where there is always a task to focus on (together)?
2D Face vs 3D Face - A Pilot Test
•2D Face
•Less obstructing
•More natural (akin to screen display)
•Harder to notice
•3D Face
•Able to see from wider angle
•More suitable for interaction tasks
•Emotions were understood better
Impact of Facial Cue Positions on Understanding Emotions - A User Study
•What is the role of position in VR collaboration that concerns emotional understanding?
•Three conditions (attachment positions), sketched below:
•Baseline (No Facial Cue)
•Own Hand
•Own Grabbed Object
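The three conditions differ only in what the partner's facial cue is parented to. Below is a minimal sketch of that attachment logic in plain Python with hypothetical names; the actual system would be built in a VR engine such as Unity, where scene-graph parenting does this work automatically.

```python
from dataclasses import dataclass

@dataclass
class Node:
    """A scene-graph node with a position relative to its parent."""
    name: str
    local_pos: tuple = (0.0, 0.0, 0.0)
    parent: "Node | None" = None

    def world_pos(self) -> tuple:
        # Accumulate offsets up the parent chain (rotation omitted for brevity).
        if self.parent is None:
            return self.local_pos
        px, py, pz = self.parent.world_pos()
        x, y, z = self.local_pos
        return (px + x, py + y, pz + z)

def attach_facial_cue(cue, condition, hand, grabbed=None):
    """Re-parent the partner's facial cue according to the study condition."""
    if condition == "baseline":
        cue.parent = None                    # no attachment; cue not shown
    elif condition == "own_hand":
        cue.parent = hand                    # cue follows the hand each frame
        cue.local_pos = (0.0, 0.05, 0.0)     # hover slightly above it
    elif condition == "own_object" and grabbed is not None:
        cue.parent = grabbed                 # cue follows the grabbed object

hand = Node("right_hand", (0.3, 1.2, -0.2))
cue = Node("partner_face_cue")
attach_facial_cue(cue, "own_hand", hand)
print(cue.world_pos())   # -> (0.3, 1.25, -0.2)
```

Parenting the cue rather than placing it at a fixed world position is what keeps it "closer to your body or perspective" as the hand or object moves.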
The Furniture Negotiation Task
•Dyad study
•Furniture Arrangement based on task requirements
•Instructions were unique to each side (Conflict, Non-conflict)
[Study interface: an in-VR questionnaire asks “How satisfied are you?” and “How satisfied do you think your partner is?”, each rated on a 5-point scale from 1 (not very satisfied) to 5 (very satisfied)]
[Figure: dialogue excerpts from the task (“This is.. GREAT!!!”, “I preferred A because B and C..”, “I’m so sorry for breathing the air!!”) annotated with the signals captured during the study: blink, gaze focus, heart rate, and galvanic skin response]
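One plausible way to turn the captured heart-rate and GSR streams into a shared-emotion signal is windowed correlation between the partners. A sketch follows; the 30-sample window, the 1 Hz sampling, and the use of Pearson correlation are assumptions, not details given in the talk.

```python
import numpy as np

def windowed_sync(sig_a, sig_b, win=30):
    """Pearson correlation of two equally sampled signals, per window."""
    scores = []
    for start in range(0, len(sig_a) - win + 1, win):
        a = sig_a[start:start + win]
        b = sig_b[start:start + win]
        # np.corrcoef returns the 2x2 correlation matrix; take the off-diagonal.
        scores.append(np.corrcoef(a, b)[0, 1])
    return np.array(scores)

# Example: two heart-rate traces sampled at 1 Hz for five minutes.
rng = np.random.default_rng(0)
hr_a = 70 + rng.normal(0, 2, 300)
hr_b = hr_a + rng.normal(0, 2, 300)   # partner B loosely tracks partner A
print(windowed_sync(hr_a, hr_b).round(2))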
Results
•Correlations with emotional consensus were found when the facial cue was attached to the hand (MH)
•Self-reported emotional consensus is better when there is a facial cue
A: I feel… B: I think my partner feels… C: My partner feels… D: My partner thinks I feel… (one possible scoring of these items is sketched below)
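The slides do not give the exact scoring, but one simple way to compute consensus from the four items A-D is to treat each partner's estimation error as a normalized distance. This is a hypothetical sketch, not the study's actual analysis.

```python
def consensus_scores(a, b, c, d, scale_max=5):
    """a: I feel...            b: I think my partner feels...
       c: my partner feels...  d: my partner thinks I feel...  (1..scale_max)"""
    # How well I read my partner: my guess (b) vs their actual report (c).
    i_read_partner = 1 - abs(b - c) / (scale_max - 1)
    # How well my partner read me: their guess (d) vs my actual report (a).
    partner_read_me = 1 - abs(d - a) / (scale_max - 1)
    return i_read_partner, partner_read_me

# Perfect estimate of the partner; partner off by one step when estimating me:
print(consensus_scores(a=4, b=3, c=3, d=5))   # -> (1.0, 0.75)
```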
Results - Continued
•“Unnatural feeling…”
•“A bit scary to me to see other's face on my hand..”
•“Facial cue is not that obvious”
•“hand face was upsetting to look at”
•“It makes me scare, so I don't pay attention to the avatar”
•“I didn't like a face staring me on the object”
•“looks a bit... mystical (like Harry Porter)”
•“face was a bit creepy”
•“Cue on Hand was the easiest way to see my partners emotions”
•“having two separate representations of my partner”
•“Facial cue on Hand is helpful”
•“Seeing the face on the hand was cool.”
•“Poke my partner”
•“I could easily see my partner’s face at all times when moving objects”
•“having facial cue pop up on object reminded me to check in with how my partner was feeling.”
Key takeaways
•Attaching facial cues closer to the view of interest can influence awareness and expression of emotions between collaborators
•There might also be some effects of a hand-attached facial cue on gaze behaviour too..
•Except…
•The 3D face is unnatural, creepy and scary
•The task could be made more ‘engaging’
Improving the Facial Cue from our Findings
•Less creepy, while maintaining similar or better interpretation of emotions
•Task includes only a mutual goal, but is complex enough to require discussion
•More variations on attachable positions for the facial cue
[Figure: the four facial cue attachment positions - SelfHand, SelfObject, PartnerHand, PartnerObject]
Revising the Study of Facial Cue Positions on Understanding Emotions
•What is the role of position in VR collaboration that concerns emotional understanding?
•Five conditions (attachment positions); a possible ordering scheme is sketched after this list:
•Baseline (No Facial Cue)
•Own Hand
•Own Grabbed Object
•Task Partner’s Hand
•Task Partner’s Grabbed Object
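The ordering of five within-subject conditions would typically be counterbalanced across dyads. A minimal sketch using a simple cyclic rotation follows; the study's actual ordering scheme is not stated in the slides.

```python
CONDITIONS = ["Baseline", "SelfHand", "SelfObject", "PartnerHand", "PartnerObject"]

def rotated_order(dyad_index):
    """Each successive dyad starts one condition later in the cycle."""
    k = dyad_index % len(CONDITIONS)
    return CONDITIONS[k:] + CONDITIONS[:k]

for i in range(5):                  # five dyads cover every starting condition
    print(i, rotated_order(i))
```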
Urban Planning Discussion Task
•Dyad study
•Discuss and arrange task objects onto the correct task grid
•Instructions were split in half (one for each side)
•A red button randomly appears every 20-25 s to increase the workload
•Emotion Sync Indicator (both mechanics are sketched below)
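A hedged sketch of the two task mechanics named above: a distractor button that reappears at a random 20-25 s interval, and a simple sync check comparing the partners' current emotion estimates. The valence scale and the tolerance value are assumptions.

```python
import random

def next_button_delay():
    """Seconds until the red button reappears, uniform in [20, 25]."""
    return random.uniform(20.0, 25.0)

def in_emotion_sync(valence_a, valence_b, tol=0.2):
    """True when both partners' valence (-1..1) lies within `tol`."""
    return abs(valence_a - valence_b) <= tol

print(f"next button in {next_button_delay():.1f} s, "
      f"partners in sync: {in_emotion_sync(0.5, 0.4)}")
```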
Demo Submitted to SA24 - XR
Theo Teo, Allison Jing, Xuan Tien, Gun A. Lee
So.. next step?
•Adaptive Environments: create environments that change in real-time based on the group's emotional state; for instance, calming visuals and sounds when stress levels are high.
•Emotion Recognition: implement AI systems that analyze facial expressions, body language, and voice tones to assess and respond to participants' emotional states in real-time.
•Group Biofeedback: study the impact of displaying aggregated biofeedback data to the group to enhance empathy and understanding of each other's emotional states (a minimal sketch combining these ideas follows).
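Below is a minimal sketch of the adaptive-environment idea combined with group biofeedback aggregation. All thresholds, parameter names, and the 0-1 arousal scale are illustrative assumptions.

```python
from statistics import mean

def group_arousal(per_user_arousal):
    """Aggregate per-user arousal estimates (0 = calm, 1 = stressed)."""
    return mean(per_user_arousal)

def choose_ambience(arousal):
    """Map group arousal to environment parameters."""
    if arousal > 0.7:              # group is stressed: switch to calming preset
        return {"skybox": "sunset", "audio": "slow_ambient", "light_level": 0.4}
    return {"skybox": "default", "audio": "neutral", "light_level": 0.8}

print(choose_ambience(group_arousal([0.9, 0.8, 0.6])))   # -> calming preset
```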
On top of that..
•Multimodal empathy / emotional sharing
•Visual + Haptic; Audio + Haptic; Haptic + Haptic etc..
•Expand the haptic modality
•Thermal, vibrotactile feedback, pressure, actuators and many more..
•Empathy XR that involves 3 or more users
•Group emotions vs individual emotion; selective emotion
•Further onto hacking emotions
•Embodiment, cognitive load, attention etc.
Summing Up
“Build empathy and better understanding through sharing each other's emotional states to enhance collaboration in Mixed Reality” (Ens et al., 2019 survey on future MR collaboration topics and directions)
•Studying the understanding of emotions, particularly empathy, in immersive collaboration is still an expanding topic
•Positioning emotions closer to yourself or your partner could be helpful and useful for understanding emotions
Hacking Emotions in VR Collaboration
Sharing Cues that Facilitate Emotional Consensus
Thank you!