Research Directions in Heads-Up Computing


About This Presentation

Keynote speech given by Mark Billinghurst at the workshop on Heads Up Computing at the UbiComp 2024 conference. Given on October 5th 2024. The talk discusses some research directions in Heads-Up Computing.


Slide Content

RESEARCH DIRECTIONS IN
HEADS-UP COMPUTING
Mark Billinghurst
[email protected]
Workshop on Heads-up Computing
UbiComp 2024
October 5th, 2024

1967 – IBM 1401 – half of the computers in the world, $10,000/month to run

2013 Google Glass

The Incredible Disappearing Computer
1960-70s: Room
1970-80s: Desk
1980-90s: Lap
1990-2000s: Hand
2010-present: Head

Heads-Up Computing
A new interaction paradigm that offers
seamless computing support for
humans' daily activities.
Zhao, S., Tan, F., & Fennedy, K. (2023). Heads-Up Computing: Moving Beyond the Device-Centered Paradigm. Communications of the ACM, 66(9), 56-63.

Moving from device-centric interactions to human-centric interactions

Devices Becoming Available
•Socially acceptable form factor, all-day use, enhancing everyday activities
Meta Ray-Ban
Vuzix Z100
Xreal Air 2

Meta Orion

Many Possible Research Directions
•Collaborative Systems
•Novel Interfaces
•Cross Reality Systems
•Hybrid Interfaces
•Prototyping Tools
•Social issues

COLLABORATIVE SYSTEMS

Face-to-Face Communication
A wide variety of communication cues are used:
•Audio: speech, paralinguistic cues, para-verbals, prosodics, intonation
•Visual: gaze, gesture, facial expression, body position
•Environmental: object manipulation, writing/drawing, spatial relationships, object presence

Smart Glasses for Remote Collaboration
•Camera + Processing + AR Display + Connectivity
•First-person, ego-vision collaboration

AR View / Remote Expert View

Empathy Glasses
•Combines eye tracking, display, and facial expression sensing
•Implicit cues – eye gaze, facial expression
Pupil Labs + Epson BT-200 + AffectiveWear
Masai, K., Sugimoto, M., Kunze, K., & Billinghurst, M. (2016, May). Empathy Glasses. In Proceedings of the 34th Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. ACM.

Remote Collaboration
•Eye gaze pointer and remote pointing
•Face expression display
•Implicit cues for remote collaboration

Shared Sphere – 360 Video Sharing
Shared live 360 video between Host User and Guest User
Lee, G. A., Teo, T., Kim, S., & Billinghurst, M. (2017). Mixed reality collaboration through sharing a live panorama. In SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications (pp. 1-4).

3D Live Scene Capture
•Use cluster of RGBD sensors
•Fuse into a single 3D point cloud (see sketch below)
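
A minimal sketch of the fusion step in Python, assuming each RGBD sensor's extrinsic (sensor-to-world) pose has already been calibrated; the sensor data and poses shown are illustrative:

```python
import numpy as np

def fuse_point_clouds(clouds, extrinsics):
    """Fuse per-sensor point clouds into one cloud in world coordinates.

    clouds     : list of (N_i, 3) arrays, points in each sensor's own frame
    extrinsics : list of (4, 4) arrays, calibrated sensor-to-world transforms
    """
    fused = []
    for points, T in zip(clouds, extrinsics):
        # Transform each sensor's points into the shared world frame
        homo = np.hstack([points, np.ones((points.shape[0], 1))])
        fused.append((homo @ T.T)[:, :3])
    return np.vstack(fused)

# Illustrative use with two sensors
cloud_a = np.random.rand(1000, 3)
cloud_b = np.random.rand(1000, 3)
T_a = np.eye(4)                      # sensor A at the world origin
T_b = np.eye(4); T_b[0, 3] = 1.5     # sensor B offset 1.5 m along x
world_cloud = fuse_point_clouds([cloud_a, cloud_b], [T_a, T_b])
```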

Scene Capture and Sharing
Scene Reconstruction / Remote Expert / Local Worker

AR View / Remote Expert View

View Sharing Evolution
•Increased immersion
•Improved scene understanding
•Better collaboration
2D → 360° → 3D

NOVEL INTERFACES

Interface Challenge
•Need to support
•Natural user interaction
•Rich input expression
•In a way that
•Is socially acceptable
•Ergonomic, reliable
•Can be used all day
•Uses few devices

Avoiding Gorilla Arm
•Design interface to reduce mid-air gestures

Multi-Scale Gesture
•Combine different gesture types
•In-air gestures – natural but imprecise
•Micro-gesture – fine scale gestures
•Gross motion + fine tuning interaction (see sketch below)
Ens, B., Quigley, A., Yeo, H. S., Irani, P., Piumsomboon, T., & Billinghurst, M. (2018). Counterpoint: Exploring Mixed-Scale Gesture Interaction for AR Applications. In Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems (p. LBW120). ACM.
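
A minimal sketch of the coarse/fine idea, assuming an upstream tracker reports hand displacement and whether a micro-gesture is active; the gain values are illustrative:

```python
# Illustrative gains: in-air gestures move the cursor quickly but coarsely,
# micro-gestures (e.g. thumb rubbing a fingertip) refine the position slowly.
COARSE_GAIN = 1.0
FINE_GAIN = 0.05

def update_cursor(cursor_pos, hand_delta, micro_gesture_active):
    """Apply a coarse or fine gain to hand motion depending on gesture scale."""
    gain = FINE_GAIN if micro_gesture_active else COARSE_GAIN
    return [c + gain * d for c, d in zip(cursor_pos, hand_delta)]

# Coarse placement with a full-arm movement, then fine tuning with a micro-gesture
pos = [0.0, 0.0, 0.0]
pos = update_cursor(pos, [0.30, 0.10, 0.00], micro_gesture_active=False)
pos = update_cursor(pos, [0.02, -0.01, 0.00], micro_gesture_active=True)
```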

https://www.youtube.com/watch?v=TRfqNtt1VxY&t=23s

Gaze and Pinch Interaction
Pfeuffer, K., Mayer, B., Mardanbegi, D., & Gellersen, H. (2017, October). Gaze + pinch interaction in virtual reality. In Proceedings of the 5th Symposium on Spatial User Interaction (pp. 99-108).
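
The core idea is that gaze indicates the target and a pinch confirms the action. A minimal sketch, assuming a unit-length gaze direction and a simple angular threshold (both illustrative):

```python
import math
from dataclasses import dataclass

@dataclass
class Target:
    name: str
    position: tuple            # (x, y, z) in world coordinates

def gaze_target(gaze_origin, gaze_dir, targets, max_angle_deg=3.0):
    """Return the target closest to the gaze ray, if within a small cone.

    Assumes gaze_dir is a unit-length (x, y, z) direction.
    """
    best, best_angle = None, max_angle_deg
    for t in targets:
        to_t = [p - o for p, o in zip(t.position, gaze_origin)]
        dist = math.sqrt(sum(v * v for v in to_t)) or 1e-9
        cos = sum(a * b for a, b in zip(gaze_dir, to_t)) / dist
        angle = math.degrees(math.acos(max(-1.0, min(1.0, cos))))
        if angle < best_angle:
            best, best_angle = t, angle
    return best

def on_frame(gaze_origin, gaze_dir, pinch_down, targets, selection):
    """Per-frame update: gaze chooses the target, a pinch confirms it."""
    hovered = gaze_target(gaze_origin, gaze_dir, targets)
    if pinch_down and hovered is not None:
        selection = hovered
    return hovered, selection
```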

https://www.youtube.com/watch?v=NzLrZSF8aDM

Physiological Sensor Input
•Using physiological sensors for implicit input
•Systems that recognize user intent/activity (see sketch below)
•EEG
•Measuring brain activity
•EMG
•Measuring muscle activity
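
A minimal sketch of the kind of features such systems compute, assuming raw EEG/EMG samples are already available as arrays; the sampling rate and threshold are illustrative:

```python
import numpy as np

def bandpower(signal, fs, low, high):
    """Average spectral power of `signal` in the [low, high] Hz band."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= low) & (freqs <= high)
    return power[mask].mean()

def emg_activation(emg_window, threshold=0.1):
    """Detect a muscle activation from the RMS amplitude of a short EMG window."""
    rms = np.sqrt(np.mean(np.square(emg_window)))
    return rms > threshold

# Illustrative: alpha-band (8-12 Hz) power from 2 s of synthetic EEG sampled at 250 Hz
fs = 250
eeg = np.random.randn(2 * fs)
alpha_power = bandpower(eeg, fs, 8, 12)
```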

https://www.youtube.com/watch?v=K34p7RwjWt0

Meta EMG Demo
•https://www.youtube.com/watch?v=WmxLiXAo9ko

Sensor-Enhanced VR HMDs
•HP Omnicept: eye tracking, heart rate, pupillometry, and face camera
•Project Galea: EEG, EMG, EDA, PPG, EOG, eye gaze, etc.

Multiple Physiological Sensors into HMD
•Incorporate range of sensors on HMD faceplate and over head
•EMG – muscle movement
•EOG – Eye movement
•EEG – Brain activity
•EDA, PPG – Heart rate

Showing Cognitive Load in Collaboration
•Measure physiological cues
•Brain activity
•Heart rate
•Eye gaze
•Show user state (see sketch below)
•Cognitive load
•Attention
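
A minimal sketch of how such a state estimate might be derived, assuming RR intervals (e.g. from PPG) and pupil diameter are already extracted; the index and its equal weighting are illustrative, not a validated measure:

```python
import numpy as np

def rmssd(rr_intervals_ms):
    """Root mean square of successive RR-interval differences (a common HRV measure)."""
    diffs = np.diff(rr_intervals_ms)
    return np.sqrt(np.mean(diffs ** 2))

def cognitive_load_index(rr_intervals_ms, pupil_mm, baseline_rmssd, baseline_pupil_mm):
    """Crude load index: HRV tends to drop and pupil diameter to rise under load."""
    hrv_drop   = max(0.0, 1.0 - rmssd(rr_intervals_ms) / baseline_rmssd)
    pupil_rise = max(0.0, np.mean(pupil_mm) / baseline_pupil_mm - 1.0)
    return 0.5 * hrv_drop + 0.5 * pupil_rise   # illustrative equal weighting
```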

Demo

CROSS REALITY SYSTEMS

Transitioning Along the RV Continuum
Reality – Augmented Reality – Augmented Virtuality – Virtuality (Mixed Reality spans AR and AV)
Using the same device to experience Reality, AR and VR

Why Transition Along the RV Continuum ?
•Combining the best of AR/VR
•Augmented Reality is preferred for:
•augmenting real objects
•co-located collaboration
•Virtual Reality is preferred for:
•experiencing the world immersively
•sharing views, remote collaboration

Collaboration Between Points on the RV Continuum
Reality – Augmented Reality – Augmented Virtuality – Virtuality (Mixed Reality)
Collaboration between people in reality, AR, and VR

Why Collaboration Across the RV Continuum ?
•Enabling everyone to connect
•Shared experience, shared understanding
•Using the ideal interface for the task
•E.g. AR for interacting in real world, VR for providing remote help

The MagicBook (2001)
Reality – Augmented Reality (AR) – Augmented Virtuality (AV) – Virtuality
Billinghurst, M., Kato, H., & Poupyrev, I. (2001). The MagicBook: a transitional AR interface. Computers & Graphics, 25(5), 745-753.

MagicBook Demo

Features
•Seamless transition from Reality to Virtuality
•Reliance on real decreases as virtual increases
•Supports egocentric and exocentric views
•User can pick appropriate view
•Independent Views
•Privacy, role division, scalability
•Collaboration on multiple levels:
•Physical Object, AR Object, Immersive Virtual Space
•Egocentric + exocentric collaboration
•multiple multi-scale users

Apple Vision Pro (2024)
•Transitioning from AR to VR
•Spatial Computing – interface seamlessly blending with real world

Cross Reality (CR)
An emerging technology focusing on the concurrent usage of, or the transition between, multiple systems at different points on the reality-virtuality continuum (RVC).
Simeone, Adalberto L., Mohamed Khamis, Augusto Esteves, Florian Daiber, Matjaž Kljun, Klen Čopič Pucihar, Poika Isokoski, and Jan Gugenheimer. "International workshop on cross-reality (XR) interaction." In Companion Proceedings of the 2020 Conference on Interactive Surfaces and Spaces, pp. 111-114. 2020.

HYBRID SYSTEMS

The Incredible Disappearing Computer
1960-70s: Room
1970-80s: Desk
1980-90s: Lap
1990-2000s: Hand
2010-present: Head

Computers Don’t Disappear
Room, Desk, Lap, Hand, Head: earlier form factors persist alongside new ones (1960s, 1980s, 2010s)

Current Generation of Smart Glasses
•Trend towards lightweight thin displays
•Offload processing onto second device
•High bandwidth connectivity to cloud services

Hybrid Interfaces
AR HMDs tethered to handheld devices/phones
○New opportunity for interaction/display
○Wide field of view of the HMD and precise input of the HHD

Motivation
Complementary nature of HMDs/HHDs

Previous Work
Our previous work
•Use touch input on phone to interact with AR HMD content
•Use tablet to provide 2D view of 3D AR conferencing space
Bleeker 2013, Budhiraja 2013

Example: Secondsight
A framework for prototyping cross-device interfaces
•Enables an AR HMD to "extend" the screen of a smartphone
Key Features
•Can simulate a range of HMD fields of view
•Enables world-fixed or device-fixed content placement (see sketch below)
•Supports touch-screen input, free-hand gestures, and head-pose selection
Reichherzer, C., Fraser, J., Rompapas, D. C., & Billinghurst, M. (2021, May). Secondsight: A framework for cross-device augmented reality interfaces. In Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems (pp. 1-6).
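
A minimal sketch of the two placement modes, assuming 4x4 homogeneous poses for the phone and the content; the specific offsets are illustrative, not SecondSight's actual code:

```python
import numpy as np

def device_fixed_pose(phone_pose_world, offset_local):
    """Content that follows the phone: compose a fixed local offset with the phone pose."""
    return phone_pose_world @ offset_local

def world_fixed_pose(anchor_pose_world):
    """Content pinned in the world: its pose ignores subsequent phone motion."""
    return anchor_pose_world.copy()

# Illustrative 4x4 homogeneous poses: a panel 20 cm above the phone,
# versus the same panel left where it was when it was anchored.
phone_pose = np.eye(4); phone_pose[:3, 3] = [0.1, 1.2, 0.3]
offset = np.eye(4);     offset[:3, 3] = [0.0, 0.2, 0.0]
panel_follows_phone  = device_fixed_pose(phone_pose, offset)
panel_stays_in_world = world_fixed_pose(phone_pose @ offset)
```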

Content Placement

Input

Map application

PROTOTYPING TOOLS

Challenges with Prototyping HUC Systems
•Cross platform support
•Need for programming skills
•Building collaborative systems
•Need to build multiple different interfaces
•Connecting multiple devices/components
•Difficult to prototype hardware/display systems

Second Sight Implementation
Hardware
•Meta 2 AR glasses (82° FOV)
•Samsung Galaxy S8 phone
•OptiTrack motion capture system
Software
•Unity game engine
•Mirror networking library

Google – The Cross-device Toolkit (XDTK)
•Open-source toolkit enabling communication between Android devices and Unity
•Handles
•Device discovery/communication
•Sensor data streaming (see sketch below)
•ARCore pose information
•https://github.com/google/xdtk
Gonzalez, E. J., Patel, K., Ahuja, K., & Gonzalez-Franco, M. (2024, March). XDTK: A Cross-Device Toolkit for Input & Interaction in XR. In 2024 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW) (pp. 467-470). IEEE.
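
A minimal sketch of the kind of device-to-engine sensor streaming a cross-device toolkit handles, written as plain UDP/JSON in Python; this is not XDTK's actual protocol or API, and the host, port, and sample format are illustrative:

```python
import json
import socket
import time

UNITY_HOST = "192.168.1.42"   # hypothetical address of the machine running Unity
UNITY_PORT = 5005             # hypothetical port the Unity-side receiver listens on

def stream_sensor_samples(read_sample, hz=30):
    """Send one JSON-encoded sensor sample per tick over UDP."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    while True:
        sample = read_sample()   # e.g. {"device": "phone-1", "accel": [...], "touch": [...]}
        sock.sendto(json.dumps(sample).encode("utf-8"), (UNITY_HOST, UNITY_PORT))
        time.sleep(1.0 / hz)
```

In practice XDTK's Android and Unity components take care of discovery, connection management, and the message format, so applications consume events rather than hand-rolling a socket loop like this.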

VRception Toolkit
•First multi-user and multi-environment rapid-prototyping toolkit for non-experts
•Designed for rapid prototyping of CR systems
•Supports prototyping in VR or in Unity3D
Gruenefeld, U., et al. (2022, April). VRception: Rapid prototyping of cross-reality systems in virtual reality. In Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems (pp. 1-15).

https://www.youtube.com/watch?v=EWzP9_FAtL8

SOCIAL ISSUES

Social Acceptance
•People don’t want to look silly
•Only 12% of 4,600 adults would be willing to wear AR glasses
•20% of mobile AR browser users experience social issues
•Acceptance depends more on social than technical issues
•Needs further study (ethnographic, field tests, longitudinal)

Ethical Issues
•Persuasive Technology
•Affecting emotions
•Behavior modification
•Privacy Concerns
•Facial recognition
•Space capture
•Personal data
•Safety Concerns
•Sim sickness, Distraction
•Long term effects
Pase, S. (2012). Ethical considerations in augmented reality applications. In Proceedings of the International Conference on e-Learning, e-Business, Enterprise Information Systems, and e-Government (EEE) (p. 1). The Steering Committee of The World Congress in Computer Science, Computer Engineering and Applied Computing (WorldComp).

Example: I-Xray Face Recognition
https://www.youtube.com/watch?v=tpOAiAAHDoA

Diminished Reality

CONCLUSIONS

Summary
•Heads-Up Computing – new interaction paradigm
•Moving from device focused to human focused
•Important research areas
•Remote communication, Cross Reality, Novel Interfaces
•Prototyping Tools, Privacy and Security
•New research opportunities emerging
•XR + AI + Sensing + ??

www.empathiccomputing.org
@marknb00
[email protected]