Ai lecture about VR technology discuss.pptx

ALAMGIRHOSSAIN256982 · 108 slides · Oct 12, 2024

About This Presentation

lecture about VR


Slide Content

LECTURE 3: VR TECHNOLOGY Muhammad Asim Khan

Recap – Lecture 2: Presence; Perception and VR; Human Perception (sight, hearing, touch, smell, taste); VR Technology (visual display).

Presence: "The subjective experience of being in one place or environment even when physically situated in another." Witmer, B. G., & Singer, M. J. (1998). Measuring presence in virtual environments: A presence questionnaire. Presence: Teleoperators and Virtual Environments, 7(3), 225-240.

How do We Perceive Reality? We understand the world through our senses: sight, hearing, touch, taste, smell (and others). Two basic processes: Sensation – gathering information; Perception – interpreting information.

Simple Sensing/Perception Model

Creating the Illusion of Reality: fooling human perception by using technology to generate artificial sensations – computer-generated sights, sounds, smells, etc.

Reality vs. Virtual Reality In a VR system there are input and output devices between human perception and action

Using Technology to Stimulate Senses: simulate the output (e.g. simulate a real scene), map the output to devices (graphics to HMD), and use the devices to stimulate the senses (the HMD stimulates the eyes). Example: visual simulation – 3D graphics → HMD → vision system → brain, across the human-machine interface.

Creating an Immersive Experience. Head Mounted Display – immerse the eyes. Projection/Large Screen – immerse the head/body. Future Technologies – neural implants, contact lens displays, etc.

HMD Basic Principles Use display with optics to create illusion of virtual screen

Key Properties of HMDs. Lens: focal length, field of view, ocularity, interpupillary distance, eye relief, eye box. Display: resolution, contrast, power, brightness, refresh rate. Ergonomics: size, weight, wearability.

VR Display Taxonomy

TRACKING

Tracking in VR: Head Tracking, Hand Tracking. Need for tracking: the user turns their head and the VR graphics scene changes; the user wants to walk through a virtual scene; the user reaches out and grabs a virtual object; the user wants to use a real prop in VR. All of these require technology to track the user or object and continuously provide information about position and orientation.

Degrees of Freedom. Degree of freedom (DoF) = independent movement about an axis. 3 DoF orientation = roll, pitch, yaw (rotation about the x, y, or z axis). 3 DoF translation = movement along the x, y, z axes. Different requirements: a user turning their head in VR needs a 3 DoF orientation tracker; moving in VR needs a 6 DoF tracker – (r, p, y) and (x, y, z).
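As a rough illustration of the difference (not from the lecture), here is a minimal Python sketch of a 3 DoF orientation versus a full 6 DoF pose; the field names and axis convention are just one common choice.

```python
from dataclasses import dataclass

@dataclass
class Orientation3DoF:
    """Rotation only: what a 3 DoF orientation tracker reports."""
    roll: float   # rotation about the forward axis, degrees
    pitch: float  # rotation about the side axis, degrees
    yaw: float    # rotation about the up axis, degrees

@dataclass
class Pose6DoF:
    """Rotation plus translation: what a 6 DoF tracker reports."""
    orientation: Orientation3DoF   # (r, p, y)
    x: float                       # position along x, metres
    y: float                       # position along y, metres
    z: float                       # position along z, metres

# Turning the head only needs the 3 DoF orientation...
head = Orientation3DoF(roll=0.0, pitch=-10.0, yaw=45.0)
# ...while moving through the scene needs the full 6 DoF pose.
walking = Pose6DoF(orientation=head, x=1.2, y=1.7, z=-0.5)
```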

Tracking and Rendering in VR Tracking fits into the graphics pipeline for VR
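To make "tracking fits into the graphics pipeline" concrete, here is a hedged Python sketch of one frame of a simplified VR render loop; the `tracker` and `renderer` objects and their methods are hypothetical placeholders, not any real SDK.

```python
import numpy as np

def view_matrix_from_pose(position, rotation):
    """Build a view matrix by inverting the tracked head pose
    (rotation: 3x3 matrix, position: 3-vector)."""
    view = np.eye(4)
    view[:3, :3] = rotation.T               # inverse of a rotation is its transpose
    view[:3, 3] = -rotation.T @ position    # inverse translation
    return view

def render_frame(tracker, renderer, eye_offsets):
    """One iteration of a simplified VR render loop.
    `tracker` and `renderer` are hypothetical placeholder objects."""
    position, rotation = tracker.latest_pose()       # 1. sample head tracking
    for eye, offset in eye_offsets.items():          # 2. build a view matrix per eye
        eye_position = position + rotation @ offset  #    shift by half the IPD
        view = view_matrix_from_pose(eye_position, rotation)
        renderer.draw_scene(eye, view)               # 3. render the scene for this eye
    renderer.present()                               # 4. scan out to the HMD display
```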

Tracking Technologies. Active (device sends out a signal): mechanical, magnetic, ultrasonic; GPS, WiFi, cell location. Passive (device senses the world): inertial sensors (compass, accelerometer, gyro); computer vision – marker based, natural feature tracking. Hybrid tracking: combined sensors (e.g. vision + inertial).

Tracking Types: Mechanical Tracker, Magnetic Tracker, Inertial Tracker, Ultrasonic Tracker, Optical Tracker (Marker-Based Tracking, Markerless Tracking, Specialized Tracking, Edge-Based Tracking, Template-Based Tracking, Interest Point Tracking).

Mechanical Tracker (Active). Idea: mechanical arms with joint sensors. Examples: Microscribe, Sutherland. ++: high accuracy, haptic feedback. --: cumbersome, expensive.

Magnetic Tracker (Active). Idea: measure the difference between a magnetic transmitter and a receiver. ++: 6 DOF, robust. --: wired, sensitive to metal, noisy, expensive; error increases with distance. Example: Flock of Birds (Ascension).

Example: Razer Hydra. Developed by Sixense. Magnetic source + 2 wired controllers. Short range (1–2 m). Precision of 1 mm and 1°. $600 USD.

Razer Hydra Demo https://www.youtube.com/watch?v=jnqFdSa5p7w

Inertial Tracker (Passive). Idea: measure linear acceleration and angular rotation rates (accelerometer/gyroscope). ++: no transmitter, cheap, small, high frequency, wireless. --: drift, hysteresis, only 3 DOF. Examples: IS300 (Intersense), Wii Remote.

Optical Tracker (Passive). Idea: image processing and computer vision. Specialized (infrared, retro-reflective, stereoscopic) or monocular-based vision tracking. Examples: ART, HiBall.

Outside-In vs. Inside-Out Tracking

Example: Vive Lighthouse Tracking. Outside-in tracking system: 2 base stations, each with 2 laser scanners and an LED array. Headworn/handheld sensors: 37 photo-sensors in the HMD, 17 in the hand controller, plus additional IMU sensors (500 Hz). Performance: a tracking server fuses the sensor samples; sampling rate 250 Hz, 4 ms latency. See http://doc-ok.org/?p=1478
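As a rough sketch of the core Lighthouse idea discussed in the doc-ok.org write-up referenced above (timing a laser sweep against the base station's sync flash to recover an angle), the Python snippet below assumes a 60 Hz sweep; the numbers are purely illustrative, and the real system fuses horizontal and vertical sweeps from two base stations with the IMU.

```python
import math

SWEEP_HZ = 60.0                    # assumed rotor speed: one full sweep every 1/60 s
SWEEP_PERIOD = 1.0 / SWEEP_HZ

def hit_time_to_angle(t_sync, t_hit):
    """Convert the delay between the base station's sync flash and the laser
    hitting one photodiode into a sweep angle (radians)."""
    fraction = (t_hit - t_sync) / SWEEP_PERIOD   # fraction of a full rotation
    return 2.0 * math.pi * fraction

# Illustrative numbers: a photodiode is hit 4.63 ms after the sync flash
angle = hit_time_to_angle(t_sync=0.0, t_hit=0.00463)
print(f"sweep angle is about {math.degrees(angle):.1f} degrees")   # ~100.0
```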

Lighthouse Components. Base station: IR LED array, 2 × scanned lasers. Head Mounted Display: 37 photo sensors, 9-axis IMU.

Lighthouse Setup

Lighthouse Tracking. Base station scanning: https://www.youtube.com/watch?v=avBt_P0wg_Y Room tracking: https://www.youtube.com/watch?v=oqPaaMR4kY4

Example: Oculus Quest. Inside-out tracking: four cameras on the corners of the display, searching for visual features. On setup, it creates a map of the room.

Oculus Quest Tracking https://www.youtube.com/watch?v=2jY3B_F3GZk

Occipital Bridge Engine/Structure Core. Inside-out tracking using structured light; better than room-scale tracking. Integrated into the Bridge HMD. https://structure.io/

https://www.youtube.com/watch?v=qbkwew3bfWU

Tracking Coordinate Frames. There can be several coordinate frames to consider: the head pose with respect to the real world, the coordinate frame of the tracking system with respect to the HMD, and the position of the hand in the coordinate frame of the hand tracker.

Example: Finding your hand in VR, using Lighthouse and Leap Motion. Multiple coordinate frames: the Leap Motion tracks the hand in the Leap Motion coordinate frame (H_LM); the Leap Motion is fixed in the HMD coordinate frame (LM_HMD); the HMD is tracked in the VR coordinate frame (HMD_VR) using Lighthouse. Where is your hand in the VR coordinate frame? Combine the transformations from each coordinate frame: H_VR = H_LM × LM_HMD × HMD_VR.
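A minimal numpy sketch of this transform chain, using 4×4 homogeneous matrices with column vectors (so the chain is applied right to left: Leap Motion frame → HMD frame → VR frame, the same chain the slide describes); the rotations and offsets below are placeholders, not real calibration values.

```python
import numpy as np

def make_transform(rotation, translation):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

# Placeholder transforms (in a real system these come from calibration and tracking):
LM_to_HMD = make_transform(np.eye(3), [0.0, 0.0, -0.08])  # Leap Motion mounted ~8 cm in front of the HMD
HMD_to_VR = make_transform(np.eye(3), [0.0, 1.7, 0.0])    # head pose from Lighthouse tracking

# Hand position reported by the Leap Motion, in its own frame (homogeneous point)
hand_LM = np.array([0.1, -0.2, 0.3, 1.0])

# Chain the transforms: Leap Motion frame -> HMD frame -> VR frame
hand_VR = HMD_to_VR @ LM_to_HMD @ hand_LM
print(hand_VR[:3])   # the hand position in the VR (Lighthouse) coordinate frame
```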

HAPTIC/TACTILE DISPLAYS

Haptic Feedback. Greatly improves realism. The hands and wrist are most important: high density of touch receptors. Two kinds of feedback: Touch feedback – information on texture, temperature, etc.; does not resist user contact. Force feedback – information on weight and inertia; actively resists contact motion.

Active Haptics Actively resists motion Key properties Force resistance Frequency Response Degrees of Freedom Latency

Example: Phantom Omni Combined stylus input/haptic output 6 DOF haptic feedback

Phantom Omni Demo https://www.youtube.com/watch?v=REA97hRX0WQ

Haptic Glove Many examples of haptic gloves Typically use mechanical device to provide haptic feedback

Passive Haptics. Not controlled by the system; use real props (Styrofoam for walls). Pros: cheap, large scale, accurate. Cons: not dynamic, limited use.

UNC Being There Project

Passive Haptic Paddle Using physical props to provide haptic feedback http://www.cs.wpi.edu/~gogo/hive/

Tactile Feedback Interfaces. Goal: stimulate the skin's tactile receptors, using different technologies: air bellows, jets, actuators (commercial), micropin arrays, electrical stimulation (research), neuromuscular stimulation (research).

Vibrotactile Cueing Devices Vibrotactile feedback has been incorporated into many devices Can we use this technology to provide scalable, wearable touch cues?

Vibrotactile Feedback Projects TactaBoard and TactaVest Navy TSAS Project

Example: HaptX Glove https://www.youtube.com/watch?v=4K-MLVqD1_A

Teslasuit. Full body haptic feedback via electrical muscle stimulation. https://teslasuit.io/

https://www.youtube.com/watch?v=74QvAfxHdQY

AUDIO DISPLAYS

Audio Displays Spatialization vs. Localization Spatialization is the processing of sound signals to make them emanate from a point in space This is a technical topic Localization is the ability of people to identify the source position of a sound This is a human topic, i.e., some people are better at it than others.

Audio Display Properties Presentation Properties Number of channels Sound stage Localization Masking Amplification Logistical Properties Noise pollution User mobility Interface with tracking Integration Portability Throughput Safety Cost

Audio Displays: Head-worn. Ear buds, on-ear, open back, closed, bone conduction.

Head-Related Transfer Functions (HRTFs) A set of functions that model how sound from a source at a known location reaches the eardrum
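As an illustration of how an HRTF pair is typically applied, the sketch below convolves a mono signal with left and right head-related impulse responses for one direction to produce a binaural (2-channel) signal for headphones; the impulse responses here are random placeholders rather than real measurements.

```python
import numpy as np

def spatialize(mono, hrir_left, hrir_right):
    """Apply an HRTF pair (as measured impulse responses for one direction)
    to a mono signal, producing a 2-channel binaural signal for headphones."""
    left = np.convolve(mono, hrir_left)
    right = np.convolve(mono, hrir_right)
    return np.stack([left, right], axis=1)

# Placeholder data: one second of noise and 256-tap impulse responses
rng = np.random.default_rng(0)
mono = rng.standard_normal(44100)
hrir_left = rng.standard_normal(256) * 0.01
hrir_right = rng.standard_normal(256) * 0.01

binaural = spatialize(mono, hrir_left, hrir_right)
print(binaural.shape)   # (44355, 2): ready for left/right headphone playback
```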

Measuring HRTFs. Put microphones in a manikin's or a person's ears, play sounds from fixed positions, and record the response.

Capturing 3D Audio for Playback Binaural recording 3D Sound recording, from microphones in simulated ears Hear some examples (use headphones) http://binauralenthusiast.com/examples/

OSSIC 3D Audio Headphones https://www.ossic.com/3d-audio/

OSSIC Demo https://www.youtube.com/watch?v=WjvofhhzTik

VR INPUT DEVICES

VR Input Devices Physical devices that convey information into the application and support interaction in the Virtual Environment

Mapping Between Input and Output

Motivation. Mouse and keyboard are good for desktop UI tasks: text entry, selection, drag and drop, scrolling, rubber banding, … 2D mouse for 2D windows. What devices are best for 3D input in VR? Use multiple 2D input devices? Use new types of devices?

Input Device Characteristics Size and shape, encumbrance Degrees of Freedom Integrated (mouse) vs. separable (Etch-a-sketch) Direct vs. indirect manipulation Relative vs. Absolute input Relative: measure difference between current and last input (mouse) Absolute: measure input relative to a constant point of reference (tablet) Rate control vs. position control Isometric vs. Isotonic Isometric: measure pressure or force with no actual movement Isotonic: measure deflection from a center point (e.g. mouse)
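To illustrate the relative vs. absolute distinction above, here is a small hypothetical sketch: a relative device (mouse) reports deltas that are accumulated into a cursor position, while an absolute device (tablet) reports a position in its own fixed reference frame that is mapped onto the screen.

```python
def apply_relative(cursor, delta, gain=1.0):
    """Relative device (e.g. mouse): accumulate deltas into the current position."""
    x, y = cursor
    dx, dy = delta
    return (x + gain * dx, y + gain * dy)

def apply_absolute(device_pos, device_size, screen_size):
    """Absolute device (e.g. tablet): map a position measured in the device's
    fixed reference frame directly onto the screen."""
    dx, dy = device_pos
    dw, dh = device_size
    sw, sh = screen_size
    return (dx / dw * sw, dy / dh * sh)

cursor = (100, 100)
cursor = apply_relative(cursor, delta=(5, -3))              # mouse moved a little
print(cursor)                                               # (105.0, 97.0)
print(apply_absolute((75, 50), (150, 100), (1920, 1080)))   # tablet centre -> (960.0, 540.0)
```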

Hand Input Devices Devices that integrate hand input into VR World-Grounded input devices Devices fixed in real world (e.g. joystick) Non-Tracked handheld controllers Devices held in hand, but not tracked in 3D (e.g. Xbox controller) Tracked handheld controllers Physical device with 6 DOF tracking inside (e.g. Vive controllers) Hand-Worn Devices Gloves, EMG bands, rings, or devices worn on hand/arm Bare Hand Input Using technology to recognize natural hand input

World Grounded Devices. Devices constrained or fixed in the real world. Not ideal for VR: constrains user motion. Good for a VR vehicle metaphor; used in location-based entertainment, e.g. the Disney Aladdin Magic Carpet VR ride.

Non-Tracked Handheld Controllers Devices held in hand Buttons, joysticks, game controllers, etc. Traditional video game controllers Xbox controller

Tracked Handheld Controllers (3 or 6 DoF). HTC Vive Controllers, Oculus Touch Controllers. Handheld controller with 6 DOF tracking; combines button/joystick input plus tracking. One of the best options for VR applications: a physical prop enhancing VR presence, providing proprioceptive, passive haptic touch cues, and a direct mapping to real hand motion.

Example: Sixense STEM Wireless motion tracking + button input Electromagnetic tracking, 8 foot range, 5 tracked receivers http://sixense.com/wireless

Sixense Demo Video https://www.youtube.com/watch?v=2lY3XI0zDWw

Example: WMR Handheld Controllers Windows Mixed Reality Controllers Left and right hand Combine computer vision + IMU tracking Track both in and out of view Button input, Vibration feedback

https://www.youtube.com/watch?v=rkDpRllbLII

Cubic Mouse. Plastic box with a Polhemus Fastrak inside (magnetic 6 DOF tracking), 3 translating rods, 6 buttons. Two-handed interface; supports object rotation, zooming, cutting plane, etc. Fröhlich, B., & Plate, J. (2000). The cubic mouse: a new device for three-dimensional input. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 526-531). ACM.

Cubic Mouse Video https://www.youtube.com/watch?v=1WuH7ezv_Gs

Hand Worn Devices Devices worn on hands/arms Glove, EMG sensors, rings, etc. Advantages Natural input with potentially rich gesture interaction Hands can be held in comfortable positions – no line of sight issues Hands and fingers can fully interact with real objects

Myo Arm Band https://www.youtube.com/watch?v=1f_bAXHckUY

Data Gloves Bend sensing gloves Passive input device Detecting hand posture and gestures Continuous raw data from bend sensors Fiber optic, resistive ink, strain-gauge Large DOF output, natural hand output Pinch gloves Conductive material at fingertips Determine if fingertips touching Used for discrete input Object selection, mode switching, etc.

How Pinch Gloves Work. Contact between conductive fabric completes a circuit. Each finger receives voltage in turn (T3–T7); look for the output voltage at different times.
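A small Python simulation of the scan described above (each finger line driven in its own time slot, and a closed circuit read as a pinch); the contact set here simply stands in for the real conductive-fabric circuit.

```python
def scan_pinches(fingers, touching):
    """Simulate the pinch-glove scan: each finger is driven in its own time
    slot (T3..T7); any other finger that sees a voltage in that slot is in
    contact with the driven finger (its fabric circuit is closed)."""
    pinches = []
    for driven in fingers:                                      # one time slot per finger
        for sensed in fingers:
            if sensed != driven and frozenset({driven, sensed}) in touching:
                pinches.append((driven, sensed))                # voltage detected: pinch
    return pinches

fingers = ["thumb", "index", "middle", "ring", "pinky"]
touching = {frozenset({"thumb", "index"})}   # the user pinches thumb and index finger
print(scan_pinches(fingers, touching))
# [('thumb', 'index'), ('index', 'thumb')] -> one pinch, seen from both time slots
```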

Example: CyberGlove. Invented to support sign language. Technology: thin electrical strain gauges over the fingers; bending a sensor changes its resistance. 18-22 sensors per glove, 120 Hz sampling, sensor resolution 0.5°. Very expensive: > $10,000/glove. http://www.cyberglovesystems.com

How CyberGlove Works. Strain gauges at the joints, connected to an A/D converter.

Demo Video https://www.youtube.com/watch?v=IUNx4FgQmas

StretchSense Wearable motion capture sensors Capacitive sensors Measure stretch, pressure, bend, shear Many applications Garments, gloves, etc. http://stretchsense.com/

StretchSense Glove Demo https://www.youtube.com/watch?v=wYsZS0p5uu8

Comparison of Glove Performance From Burdea, Virtual Reality Technology, 2003

Bare Hands. Using computer vision to track bare-hand input creates a compelling sense of presence and natural interaction. Challenges that need to be solved: no sense of touch, line of sight to the sensor required, fatigue from holding hands in front of the sensor.

Leap Motion. IR-based sensor for hand tracking ($50 USD). HMD + Leap Motion = hand input in VR. Technology: 3 IR LEDs and 2 wide-angle cameras; the LEDs generate patternless IR light, IR reflections are picked up by the cameras, and software performs the hand tracking. Performance: 1 m range, 0.7 mm accuracy, 200 Hz. https://www.leapmotion.com/

Example: Leap Motion https://www.youtube.com/watch?v=QD4qQBL0X80

Non-Hand Input Devices Capturing input from other parts of the body Head Tracking Use head motion for input Eye Tracking Largely unexplored for VR Microphones Audio input, speech Full-Body tracking Motion capture, body movement

Eye Tracking Technology Shine IR light into eye and look for reflections Advantages Provides natural hands-free input Gaze provides cues as to user attention Can be combined with other input technologies

Example: FOVE VR Headset. Eye tracker integrated into a VR HMD; gaze-driven user interface, foveated rendering. https://www.youtube.com/watch?v=8dwdzPaqsDY

Pupil Labs VIVE/Oculus Add-ons Adds eye-tracking to HTC Vive/Oculus Rift HMDs Mono or stereo eye-tracking 120 Hz eye tracking, gaze accuracy of 0.6° with precision of 0.08° Open source software for eye-tracking https://pupil-labs.com/pupil/

HTC Vive Pro Eye HTC Vive Pro with integrated eye-tracking Tobii systems eye-tracker Easy calibration and set-up Auto-calibration software compensates for HMD motion

https://www.youtube.com/watch?v=y_jdjjNrJyk

Full Body Tracking Adding full-body input into VR Creates illusion of self-embodiment Significantly enhances sense of Presence Technologies Motion capture suit, camera based systems Can track large number of significant feature points

Camera Based Motion Capture. Use multiple cameras and reflective markers on the body. E.g. OptiTrack (www.optitrack.com): 120-360 fps, < 10 ms latency, < 1 mm accuracy.

Optitrack Demo https://www.youtube.com/watch?v=tBAvjU0ScuI

Wearable Motion Capture: PrioVR. Wearable motion capture system: 8-17 inertial sensors + wireless data transmission. 30-40 m range, 7.5 ms latency, 0.09° precision. Supports full range of motion, no occlusion. www.priovr.com

PrioVR Demo https://www.youtube.com/watch?v=q72iErtvhNc

Pedestrian Devices. Pedestrian input in VR: walking/running in VR. Virtuix Omni – special shoes, http://www.virtuix.com. Cyberith Virtualizer – socks + slippery surface, http://cyberith.com

Cyberith Virtualizer Demo https://www.youtube.com/watch?v=R8lmf3OFrms

Virtusphere. Fully immersive sphere: supports walking and running in VR, with the person inside a giant trackball. http://www.virtusphere.com

Virtusphere Demo https://www.youtube.com/watch?v=5PSFCnrk0GI

Omnidirectional Treadmills. Infinadeck: 2-axis treadmill, flexible material; tracks the user to keep them in the centre, giving limitless walking input in VR. www.infinadeck.com
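As a hedged illustration of "tracks the user to keep them in the centre" (not Infinadeck's actual control law), a simple proportional controller could drive the two belt axes opposite to the user's displacement from the treadmill centre:

```python
def belt_velocity(user_pos, centre=(0.0, 0.0), gain=2.0):
    """Proportional centring: command the 2-axis belt to carry the user back
    toward the treadmill centre (illustrative only, not the real control law)."""
    ux, uy = user_pos
    cx, cy = centre
    return (-gain * (ux - cx), -gain * (uy - cy))

# The tracked user has drifted 0.3 m forward and 0.1 m right of centre
print(belt_velocity((0.3, 0.1)))   # (-0.6, -0.2): belt runs to pull them back
```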

Infinadeck Demo https://www.youtube.com/watch?v=seML5CQBzP8

Comparison Between Devices. From Jerald (2015): comparing hand and non-hand input.

Input Device Taxonomies. Help to determine which devices can be substituted for each other and which devices to use for particular tasks. Many different approaches: separate the input device from the interaction technique (Foley 1974); map basic interactive tasks to devices (Foley 1984) – basic tasks: select, position, orient, etc.; devices: mouse, joystick, touch panel, etc.; consider degrees of freedom and properties sensed (Buxton 1983) – motion, position, pressure; distinguish between absolute/relative input and individual axes (Mackinlay 1990) – separate translation and rotation axes instead of using DOF.

Foley and Wallace Taxonomy (1974) Separate device from interaction technique

Buxton Input Device Taxonomy (Buxton 1983). Classified according to degrees of freedom and the property sensed. M = device uses an intermediary between the hand and the sensing system; T = touch sensitive.