Artificial Intelligence Lecture Slide-06

About This Presentation

Artificial Intelligence


Slide Content

Vision and Knowledge-Based Gesture Recognition for Human-Robot Interaction
Md. Hasanuzzaman, Ph.D. student
31 January 2006

Contents
Introduction
Knowledge Modeling for Gesture-based HRI
Person-Centric Gesture Recognition/Interpretation
Person-Centric Gesture-Based Human-Robot Interaction

Introduction
Motivation
Current technologies
Objectives
Proposed system architecture
Contributions

Motivation
Example scenario of a human-robot symbiosis system…
[Figure: a knowledge-based software platform connecting dialog, face & gesture recognition, and collaboration]

Architecture of the Knowledge-Based HRI System Using Visual Gestures

Overview of the System
Knowledge management using SPAK (Software Platform for Agent and Knowledge Management)
Knowledge base, inference engine, SPAK GUI, network gateway
Vision-based image analysis and recognition
Face detection from a cluttered background
Person identification using the eigenface method (see the sketch after this list)
Skin-like region segmentation using person-specific skin-color information
Face and hand pose classification using the subspace method
Person-centric gesture interpretation/recognition
Learning of new users and gestures
Robot components
Robovie (eyes, mouth, neck, arms) and Aibo (body)
Robots respond with speech (text-to-speech converter) and actions
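The slides name the eigenface method for person identification but give no code; below is a minimal sketch of the general eigenface idea (PCA over training face images, then nearest neighbour in the projected space) written with numpy. The function names, the flattened-image layout, and the number of components are illustrative assumptions, not the author's implementation.

```python
# Minimal eigenface-style person identification sketch (illustrative only).
# Assumes each face image has already been detected, cropped and resized
# to a fixed size, then flattened into a 1-D vector.
import numpy as np

def train_eigenfaces(faces, labels, n_components=20):
    """faces: (n_samples, n_pixels) array of flattened face images."""
    mean = faces.mean(axis=0)
    centered = faces - mean
    # PCA via SVD; rows of vt are the eigenfaces.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    eigenfaces = vt[:n_components]
    # Project every training face into the eigenface subspace.
    projections = centered @ eigenfaces.T
    return mean, eigenfaces, projections, list(labels)

def identify_person(face, model):
    """Return the label of the nearest training face in eigenspace."""
    mean, eigenfaces, projections, labels = model
    coeffs = (face - mean) @ eigenfaces.T
    distances = np.linalg.norm(projections - coeffs, axis=1)
    return labels[int(np.argmin(distances))]

# Usage sketch: identify which known user ("Hasan", "Cho", ...) is present,
# so that person-specific skin-colour models and gesture meanings can be applied.
# model = train_eigenfaces(training_faces, ["Hasan", "Cho", ...])
# user = identify_person(new_face_vector, model)
```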

Frame-Based Knowledge Model for Gesture-Based HRI
(SG = Static Gesture, DG = Dynamic Gesture)
[Figure: frame hierarchy with a Root frame above the class frames User, Robot, Gesture, Pose and Behavior; User has instances such as Hasan and Cho, Robot has Robovie and Aibo, Gesture splits into Static (SG1, SG2, …) and Dynamic (DG1, DG2, …), Pose includes Face and Thumb, and Behavior splits into RobovieAct (Act1, Act2, …) and AiboAct (AiboAct1, AiboAct2, …)]

Knowledge Representation
Class frames are defined for user, robot, gesture, behavior (robot action) and pose.
The user frame includes instances of all known users (Hasan, Cho, etc.).
The robot frame includes instances of all robots (Aibo, Robovie, etc.).
The gesture frame includes all the static and dynamic gestures.
The pose frame includes all the recognizable poses.
The behavior frame includes all the predefined robot actions.
Knowledge is managed with a frame-based Software Platform for Agent and Knowledge Management (SPAK); a minimal frame structure is sketched below.
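SPAK's internal structures are not shown in the slides, so the following is only a rough Python sketch of how a frame-based model like this could be encoded: class frames for the five categories, instance frames linked by an a-kind-of slot, and optional semantic links. All class names and attributes are assumptions made for illustration.

```python
# Rough sketch of a frame-based knowledge representation (not SPAK's actual API).
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Slot:
    name: str            # e.g. "mFace", "mRobot"
    type: str            # e.g. "Instance", "String"
    argument: str        # e.g. "FACE", "Robovie", "Hasan"
    condition: str = "Any"

@dataclass
class Frame:
    name: str                                  # e.g. "TwoHand", "RaiseTwoArms"
    frame_type: str = "Class"                  # "Class" or "Instance"
    a_kind_of: Optional[str] = None            # parent frame, e.g. "Static_Gesture"
    has_part: Optional[str] = None
    semantic_link_from: Optional[str] = None
    semantic_link_to: Optional[str] = None     # e.g. a robot-behavior frame
    slots: list[Slot] = field(default_factory=list)

# Class frames for the five categories named on this slide.
knowledge_base = {
    name: Frame(name) for name in ("User", "Robot", "Gesture", "Pose", "Behavior")
}
# Instance frames, e.g. known users and robots.
knowledge_base["Hasan"] = Frame("Hasan", "Instance", a_kind_of="User")
knowledge_base["Cho"] = Frame("Cho", "Instance", a_kind_of="User")
knowledge_base["Robovie"] = Frame("Robovie", "Instance", a_kind_of="Robot")
knowledge_base["Aibo"] = Frame("Aibo", "Instance", a_kind_of="Robot")
```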

Example of SPAK Knowledge Editor

Instance-Frame “TwoHand”
Frame: TwoHand
Type: Instance
A-kind-of: Static_Gesture
Has-part: Null
Semantic-link-from: Null
Semantic-link-to: Robot behavior
Slot 1: Name mFace, Type Instance, Condition Any, Argument FACE
Slot 2: Name mLeftHand, Type Instance, Condition Any, Argument LEFTHAND
Slot 3: Name mRightHand, Type Instance, Condition Any, Argument RIGHTHAND

Instance-Frame “RaiseTwoArms”
Frame: RaiseTwoArms
Type: Instance
A-kind-of: RobovieAct
Has-part: Null
Semantic-link-from: Null
Semantic-link-to: Null
Slot 1: Name mRobot, Type Instance, Argument Robovie
Slot 2: Name mUser, Type Instance, Argument Hasan
Slot 3: Name mGesture, Type Instance, Argument TwoHand
Slot 4: Name OnInstantiate, Type String, Argument Function name
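The two instance frames above suggest the inference chain: when pose frames matching a gesture frame's slots are all activated, the gesture frame fires and instantiates the linked robot-action frame, whose OnInstantiate function is then executed. The sketch below imitates that chain in Python; it is a guess at the mechanism, not SPAK's inference engine, and every identifier in it is assumed.

```python
# Illustrative sketch of frame activation (not SPAK's actual inference engine).

def raise_two_arms():
    """Stand-in for the 'OnInstantiate' function of the RaiseTwoArms frame."""
    print("Robovie: raising both arms")

# Gesture frame: the poses its slots require, and the action frame it links to.
GESTURE_FRAMES = {
    "TwoHand": {
        "required_poses": {"FACE", "LEFTHAND", "RIGHTHAND"},  # mFace, mLeftHand, mRightHand
        "action": "RaiseTwoArms",
    },
}

# Action frames: which robot/user/gesture they apply to and what they do.
ACTION_FRAMES = {
    "RaiseTwoArms": {"robot": "Robovie", "user": "Hasan", "on_instantiate": raise_two_arms},
}

def infer(user, activated_poses):
    """Activate any gesture frame whose slot conditions are met, then its action."""
    for gesture, frame in GESTURE_FRAMES.items():
        if frame["required_poses"] <= set(activated_poses):
            action = ACTION_FRAMES[frame["action"]]
            if action["user"] == user:
                action["on_instantiate"]()
            return gesture
    return None

# infer("Hasan", ["FACE", "LEFTHAND", "RIGHTHAND"])  # -> activates RaiseTwoArms
```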

Gesture Frame “TwoHand” in SPAK
Robovie Action Frame “RaiseTwoArms” in SPAK

Person-Centric Gesture Interpretation
Face detection and person identification.
Skin-like region segmentation using person-specific skin-color information.
Face and hand pose classification using the subspace method (sketched below).
Gesture interpretation/recognition.
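The slides do not spell out which subspace method is used for pose classification; the sketch below shows one common variant (a CLAFIC-style classifier that builds a small PCA subspace per pose class and picks the class whose subspace captures the input patch best). It illustrates the general technique only, and all names and parameters are assumptions.

```python
# CLAFIC-style subspace classifier sketch for face/hand pose patches.
# Each training patch is a flattened, normalized image vector.
import numpy as np

def build_subspaces(patches_by_class, dim=10):
    """patches_by_class: {pose_name: (n_samples, n_pixels) array}."""
    subspaces = {}
    for pose, patches in patches_by_class.items():
        # Orthonormal basis of the top principal directions for this class.
        _, _, vt = np.linalg.svd(patches, full_matrices=False)
        subspaces[pose] = vt[:dim]          # (dim, n_pixels)
    return subspaces

def classify_pose(patch, subspaces):
    """Return the pose whose subspace captures the largest share of the patch."""
    x = patch / (np.linalg.norm(patch) + 1e-12)
    scores = {pose: float(np.sum((basis @ x) ** 2))
              for pose, basis in subspaces.items()}
    return max(scores, key=scores.get)

# Usage sketch: each segmented skin region (face, left hand, right hand) is
# classified into a pose name such as FACE, LEFTHAND or RIGHTHAND, which is
# then passed on for gesture interpretation.
```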

Gesture Recognition
Static gestures are recognized using the frame-based approach, combining the pose classification results of the three skin-like regions.
Example: if two hands and one face are present in the image, then the gesture is ‘TwoHand’.
Dynamic gestures are recognized from the transition of face poses over a sequence of time steps.
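To make the rule above concrete, here is a small sketch of how static gestures could be read off the per-frame pose labels, plus one invented example of reading a dynamic gesture off a short history of face poses. The ‘TwoHand’ rule follows the slide; the other rules, pose names, and the dynamic gesture are purely illustrative assumptions.

```python
# Sketch of rule-based gesture recognition from classified pose labels.

def recognize_static_gesture(pose_labels):
    """pose_labels: set of pose names found in the current image."""
    poses = set(pose_labels)
    if {"FACE", "LEFTHAND", "RIGHTHAND"} <= poses:
        return "TwoHand"                      # rule stated on the slide above
    if "FACE" in poses and "LEFTHAND" in poses:
        return "LeftHand"                     # assumed rule, for illustration
    if "FACE" in poses and "RIGHTHAND" in poses:
        return "RightHand"                    # assumed rule, for illustration
    return None

def recognize_dynamic_gesture(face_pose_history):
    """face_pose_history: list of face-pose names over recent time steps."""
    # Invented example rule: repeated left/right face turns read as a head shake.
    transitions = [(a, b) for a, b in zip(face_pose_history, face_pose_history[1:]) if a != b]
    if transitions.count(("FACE_LEFT", "FACE_RIGHT")) + \
       transitions.count(("FACE_RIGHT", "FACE_LEFT")) >= 2:
        return "HeadShake"
    return None
```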

Example of Static Gesture ‘TwoHand’

Sample Visual Output of Gesture ‘One’

Static Gesture
The image analysis and recognition module sends pose names after classifying the poses.
The SPAK inference engine processes this information and activates the corresponding pose frames.
If the combination of activated pose frames supports a predefined gesture, the corresponding gesture frame is activated.
Example of the gesture frame ‘TwoHand’ in SPAK

Example of Activated Pose Frames (face, left-hand, right-hand)

Example of Activated Gesture Frame ‘TwoHand’

Contents
Introduction
Knowledge Modeling for Gesture-based HRI
Person-Centric Gesture Recognition/Interpretation
User and Gestures Adaptation
Person-Centric Gesture-Based Human-Robot Interaction
Conclusions and Future Work

Person-Centric Gesture-Based Human-Robot Interaction
Use a Software Platform for Agent and Knowledge Management (SPAK) for decision making.
The image analysis unit sends pose names (or the gesture name, for dynamic gestures) and the user name to SPAK (a sketch of such a message follows below).
The SPAK inference engine processes these facts, activates the corresponding frames to interpret the gesture, and activates the robot action. (Example scenarios in SPAK)
The robot responds to gesture commands with body actions and speech.
An entertainment robot, Aibo, and a humanoid robot, Robovie, are used for gesture-based HRI.
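The slides state that the image analysis unit sends pose names and the user name to SPAK (through the network gateway mentioned earlier), but give no wire format. The snippet below is a hypothetical illustration of such a hand-off as one JSON message over TCP; the host, port, and field names are assumptions, not SPAK's actual protocol.

```python
# Hypothetical hand-off from the vision module to SPAK (not the real protocol).
import json
import socket

def send_recognition_result(user, poses, gesture=None,
                            host="localhost", port=9000):
    """Send the current user name, pose names and (optional) dynamic-gesture
    name to the knowledge-management side as one JSON line."""
    message = {
        "user": user,                 # e.g. "Hasan" or "Cho"
        "poses": poses,               # e.g. ["FACE", "LEFTHAND", "RIGHTHAND"]
        "dynamic_gesture": gesture,   # e.g. None or a dynamic-gesture name
    }
    with socket.create_connection((host, port)) as conn:
        conn.sendall((json.dumps(message) + "\n").encode("utf-8"))

# send_recognition_result("Hasan", ["FACE", "LEFTHAND", "RIGHTHAND"])
```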

Example of Person-Centric Aibo Robot Action Frames
Aibo Action for User “Cho”
Aibo Action for User “Hasan”

Human-Robovie Interaction (User ‘Cho’)
Interaction scenario:
Cho: comes to Robovie
Robovie: Hi Cho, how are you? (speech + action)
Cho: Shows ‘ThumbUp’ gesture
Robovie: Oh good! Do you want to play now? (speech)
Cho: Shows ‘Ok’ gesture
Robovie: Oh, thanks! (speech)
Cho: Shows ‘LeftHand’ gesture
Robovie: Raises left arm (speech + mimic action)
Cho: Shows ‘RightHand’ gesture
Robovie: Raises right arm (speech + mimic action)
Cho: Shows ‘TwoHand’ gesture
Robovie: Bye-bye (speech + action)

Human-Robovie Interaction (User ‘Cho’)

Human-Robovie Interaction (User ‘Hasan’)
Interaction scenario:
Hasan: comes to Robovie
Robovie: Hi Hasan, how are you? (speech + action)
Hasan: Shows ‘ThumbUp’ gesture
Robovie: Oh sad! Do you want to play now? (speech)
Hasan: Shows ‘Ok’ gesture
Robovie: Oh, thanks! (speech)
Hasan: Shows ‘RightHand’ gesture
Robovie: Raises right arm (speech + mimic action)
Hasan: Shows ‘LeftHand’ gesture
Robovie: Raises left arm (speech + mimic action)
Hasan: Shows ‘TwoHand’ gesture
Robovie: Raises two arms (speech + mimic action)
Hasan: Shows ‘FistUp’ gesture
Robovie: Bye-bye (speech + action)

Human-Robovie Interaction (User ‘Hasan’)

Human-Robot Interaction (Interaction with Aibo)
Gesture | Aibo action for User 1 ‘Hasan’ | Aibo action for User 2 ‘Cho’
One | STAND UP | WALK FORWARD
Two | WALK FORWARD | WALK BACKWARD
Three | WALK BACKWARD | KICK
… | … | …
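As a concrete reading of the table above, here is a small sketch of the person-centric mapping from (user, gesture) to an Aibo action. The action strings come directly from the table; the dictionary layout and the respond_to_gesture helper are assumptions for illustration.

```python
# Person-centric (user, gesture) -> Aibo action lookup, taken from the table above.
AIBO_ACTIONS = {
    ("Hasan", "One"): "STAND UP",
    ("Hasan", "Two"): "WALK FORWARD",
    ("Hasan", "Three"): "WALK BACKWARD",
    ("Cho", "One"): "WALK FORWARD",
    ("Cho", "Two"): "WALK BACKWARD",
    ("Cho", "Three"): "KICK",
}

def respond_to_gesture(user, gesture):
    """Pick the action this user's gesture maps to; ignore unknown pairs."""
    action = AIBO_ACTIONS.get((user, gesture))
    if action is not None:
        print(f"Aibo action for {user}: {action}")   # placeholder for the robot command
    return action

# respond_to_gesture("Cho", "Three")  # -> "KICK"
```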

Human-Aibo Interaction (User Hasan)

Human-Aibo Interaction (User Cho)

Questions?
Thanks for your kind attention.