Motivation
Example Scenario of Human-Robot Symbiosis System…
[Figure: example scenario diagram with a knowledge-based software platform at the center and the labels Dialog, Face & Gesture, and Collaboration.]
Architecture of the Knowledge-Based HRI System Using Visual Gestures
Overview of the System
Knowledge management using SPAK (Software Platform for Agent and
Knowledge Management)
Knowledge base, Inference Engine, SPAK GUI, Network Gateway
Vision-based image analysis and recognition
Face detection from a cluttered background
Person identification using the eigenface method
Skin-like region segmentation using person-specific skin-color information
Face and hand pose classification using the subspace method
Person-centric gesture interpretation/recognition
Learning of new users and gestures
Robot Components
Robovie (eyes, mouth, neck, arms) and Aibo (body)
Robots respond with speech (text-to-speech converter) and actions
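Taken together, the items above describe a pipeline: the vision modules turn a camera frame into symbolic facts (a user name plus pose names) that SPAK then reasons over to drive the robot. Below is a minimal sketch of that flow; every function is a stub standing in for a real module, and all names and signatures are illustrative assumptions rather than the system's actual API.

    # Minimal pipeline sketch; each stub stands in for a real vision module.
    def detect_face(image):
        return {"bbox": (0, 0, 64, 64)}          # stub: face detection

    def identify_person(face):
        return "Hasan"                            # stub: eigenface-based identification

    def segment_skin_regions(image, user):
        return ["region_a", "region_b", "region_c"]  # stub: person-specific skin segmentation

    def classify_pose(region):
        # stub: subspace-method pose classification
        return {"region_a": "FACE", "region_b": "LEFTHAND", "region_c": "RIGHTHAND"}[region]

    def frame_to_facts(image):
        """Turn one camera frame into the symbolic facts sent to SPAK."""
        face = detect_face(image)
        if face is None:
            return None
        user = identify_person(face)
        poses = [classify_pose(r) for r in segment_skin_regions(image, user)]
        return {"user": user, "poses": poses}

    print(frame_to_facts("camera_frame"))
    # {'user': 'Hasan', 'poses': ['FACE', 'LEFTHAND', 'RIGHTHAND']}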
Frame-Based Knowledge Model
for Gesture-Based HRI
SG=Static Gesture, DG=Dynamic Gesture
[Figure: frame hierarchy tree. Root has the child frames User, Robot, Gesture, Pose, and Behavior. User: Hasan, Cho, …; Robot: Robovie, Aibo, …; Gesture: Static (SG1, SG2, …) and Dynamic (DG1, DG2, …); Pose: Face, Thumb, …; Behavior: RobovieAct (Act1, Act2, …) and AiboAct (AiboAct1, AiboAct2, …).]
Knowledge Representation
Class frames are defined for user, robot, gesture, behavior (robot
action) and pose.
The user frame includes instances of all known users (Hasan,
Cho, etc.)
The robot frame includes instances of all robots (Aibo, Robovie,
etc.)
The gesture frame includes all the static and dynamic gestures
The pose frame includes all the recognizable poses
The behavior frame includes all the predefined robot actions
Uses a frame-based Software Platform for Agent and Knowledge
Management (SPAK)
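To make the frame idea concrete, here is a toy rendering of class and instance frames in Python. The field names (a-kind-of, slots) follow the slides, but the data structures themselves are an assumption for illustration, not SPAK's actual data model.

    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class Slot:
        name: str
        type: str                        # e.g. "Instance" or "String"
        argument: str                    # required value/class for the slot
        condition: Optional[str] = None  # e.g. "Any"

    @dataclass
    class Frame:
        name: str
        frame_type: str                  # "Class" or "Instance"
        a_kind_of: Optional[str] = None  # parent frame in the hierarchy
        slots: List[Slot] = field(default_factory=list)

    # Class frames for the five categories above, plus a few instance frames.
    frames = [
        Frame("User", "Class"), Frame("Robot", "Class"), Frame("Gesture", "Class"),
        Frame("Pose", "Class"), Frame("Behavior", "Class"),
        Frame("Hasan", "Instance", a_kind_of="User"),
        Frame("Cho", "Instance", a_kind_of="User"),
        Frame("Robovie", "Instance", a_kind_of="Robot"),
        Frame("Aibo", "Instance", a_kind_of="Robot"),
    ]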
Example of SPAK Knowledge Editor
Instance-Frame “TwoHand”
Frame: TwoHand
Type: Instance
A-kind-of: Static_Gesture
Has-part: Null
Semantic-link-from: Null
Semantic-link-to: Robot behavior
Slot1: Name=mFace,      Type=Instance, Condition=Any, Argument=FACE
Slot2: Name=mLeftHand,  Type=Instance, Condition=Any, Argument=LEFTHAND
Slot3: Name=mRightHand, Type=Instance, Condition=Any, Argument=RIGHTHAND
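Read as data, the frame says that 'TwoHand' is a static gesture which becomes active when a face, a left hand and a right hand have all been classified. A toy rendering follows; the field names mirror the slide, while the activation check is an assumption about how the slot conditions are evaluated.

    # The 'TwoHand' instance frame above, written as plain data.
    TWO_HAND = {
        "frame": "TwoHand", "type": "Instance", "a_kind_of": "Static_Gesture",
        "slots": [
            {"name": "mFace",      "condition": "Any", "argument": "FACE"},
            {"name": "mLeftHand",  "condition": "Any", "argument": "LEFTHAND"},
            {"name": "mRightHand", "condition": "Any", "argument": "RIGHTHAND"},
        ],
    }

    def gesture_satisfied(gesture_frame, observed_poses):
        """The gesture frame activates when every slot's pose is observed."""
        return all(slot["argument"] in observed_poses for slot in gesture_frame["slots"])

    print(gesture_satisfied(TWO_HAND, {"FACE", "LEFTHAND", "RIGHTHAND"}))  # True
    print(gesture_satisfied(TWO_HAND, {"FACE", "LEFTHAND"}))               # False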
Instance-Frame “RaiseTwoArms”
Frame: RaiseTwoArms
Type: Instance
A-kind-of: RobovieAct
Has-part: Null
Semantic-link-from: Null
Semantic-link-to: Null
Slot1: Name=mRobot,        Type=Instance, Argument=Robovie
Slot2: Name=mUser,         Type=Instance, Argument=Hasan
Slot3: Name=mGesture,      Type=Instance, Argument=TwoHand
Slot4: Name=OnInstantiate, Type=String,   Argument=Function name
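This frame ties a specific robot (Robovie), user (Hasan) and gesture (TwoHand) to one robot behavior, with the OnInstantiate slot naming the function to call when the frame activates; that binding is what makes the interaction person-centric. The dispatch sketch below is an assumed simplification of SPAK's inference, not its real mechanism.

    # The 'RaiseTwoArms' action frame as plain data, plus a toy lookup that
    # selects an action frame from the current (robot, user, gesture) facts.
    RAISE_TWO_ARMS = {
        "frame": "RaiseTwoArms", "a_kind_of": "RobovieAct",
        "mRobot": "Robovie", "mUser": "Hasan", "mGesture": "TwoHand",
        "on_instantiate": lambda: print("Robovie: raise both arms"),
    }

    ACTION_FRAMES = [RAISE_TWO_ARMS]

    def select_action(robot, user, gesture):
        """Return the first action frame whose slots match the current facts."""
        for frame in ACTION_FRAMES:
            if (frame["mRobot"], frame["mUser"], frame["mGesture"]) == (robot, user, gesture):
                return frame
        return None

    action = select_action("Robovie", "Hasan", "TwoHand")
    if action is not None:
        action["on_instantiate"]()  # the OnInstantiate slot names the function to call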
Gesture Frame “TwoHand” in SPAK
Robovie Action Frame “RaiseTwoArms”
in SPAK
Person-Centric Gesture
Interpretation
Face detection and person identification.
Skin-like region segmentation using person-specific skin-color information.
Face and hand pose classification using the subspace method.
Gesture interpretation/recognition
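The slides do not spell out the segmentation rule, so the sketch below assumes a simple per-user chrominance range in YCrCb space; the user names are from the slides, but the numeric ranges and the thresholding scheme are placeholders meant only to illustrate what "person-specific skin-color information" could look like in code.

    import numpy as np

    # Hypothetical per-user skin-color models: (Cr, Cb) ranges learned from that
    # user's face region. The numbers are placeholders, not values from the work.
    SKIN_MODELS = {
        "Hasan": {"cr": (140, 175), "cb": (95, 125)},
        "Cho":   {"cr": (135, 170), "cb": (100, 130)},
    }

    def skin_mask(ycrcb_image, user):
        """Binary mask of skin-like pixels using the identified user's color model."""
        model = SKIN_MODELS[user]
        cr, cb = ycrcb_image[..., 1], ycrcb_image[..., 2]
        return ((cr >= model["cr"][0]) & (cr <= model["cr"][1]) &
                (cb >= model["cb"][0]) & (cb <= model["cb"][1]))

    # Tiny usage example on a fake 2x2 YCrCb image.
    img = np.array([[[80, 150, 110], [80, 200, 60]],
                    [[90, 160, 100], [20, 100, 200]]], dtype=np.uint8)
    print(skin_mask(img, "Hasan"))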
Gesture Recognition
Static gestures are recognized using a frame-based approach, combining the pose classification results of the three skin-like regions.
Example: if two hands and one face are present in the image, the gesture is 'TwoHand'.
Dynamic gestures are recognized from the transition of face poses over a sequence of time steps.
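The exact matching scheme for dynamic gestures is not given; the sketch below assumes that a dynamic gesture is a predefined sequence of face-pose labels and that recognition means finding that sequence in the observed pose stream. The gesture names (NodYes, ShakeNo) and pose labels are invented for illustration.

    # Hypothetical dynamic-gesture patterns as sequences of face-pose labels.
    DYNAMIC_GESTURES = {
        "NodYes":  ["FACE_UP", "FACE_DOWN", "FACE_UP"],
        "ShakeNo": ["FACE_LEFT", "FACE_RIGHT", "FACE_LEFT"],
    }

    def collapse(sequence):
        """Drop consecutive duplicates: ['A', 'A', 'B'] -> ['A', 'B']."""
        out = []
        for label in sequence:
            if not out or out[-1] != label:
                out.append(label)
        return out

    def recognize_dynamic(face_pose_sequence):
        """Return the first dynamic gesture whose transition pattern appears."""
        seq = collapse(face_pose_sequence)
        for name, pattern in DYNAMIC_GESTURES.items():
            # check whether `pattern` occurs as a contiguous run in `seq`
            for i in range(len(seq) - len(pattern) + 1):
                if seq[i:i + len(pattern)] == pattern:
                    return name
        return None

    print(recognize_dynamic(["FACE_UP", "FACE_UP", "FACE_DOWN", "FACE_UP"]))  # NodYes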
Example of Static Gesture ‘TwoHand’
Static Gesture
The image analysis and recognition module sends pose names after classifying the poses.
The SPAK inference engine processes this information and activates the corresponding pose frames.
If the combination of activated pose frames supports a predefined gesture, the corresponding gesture frame is activated.
Example of the gesture frame 'TwoHand' in SPAK
Example of Activated Pose Frames
(face, left-hand, right-hand)
Example of Activated Gesture
Frame ‘TwoHand’
Contents
Introduction
Knowledge Modeling for Gesture-based HRI
Person-Centric Gesture Recognition/Interpretation
User and Gestures Adaptation
Person-Centric Gesture-based Human-Robot
Interaction
Conclusions and Future Work
Person-Centric Gesture-Based
Human-Robot Interaction
Use a Software Platform for Agent and Knowledge Management
(SPAK) for decision making.
Image analysis unit sends pose names (or the gesture name, for dynamic gestures) and the user name to SPAK.
SPAK inference engine processes the facts and activates the corresponding frames to interpret the gesture and trigger the robot action. (Example scenarios in SPAK)
Robot responds to gesture commands with body actions and speech.
Use an entertainment robot (Aibo) and a humanoid robot (Robovie) for gesture-based HRI.
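The image analysis unit reaches SPAK through its network gateway. The wire format is not shown in the slides, so the client sketch below invents a simple JSON-over-TCP message purely for illustration; the host, port and field names are assumptions.

    import json
    import socket

    # Hypothetical client in the image analysis unit: package the recognized user
    # name and pose names as one message and send it to SPAK's network gateway.
    SPAK_HOST, SPAK_PORT = "localhost", 9000

    def send_facts_to_spak(user, poses):
        message = json.dumps({"user": user, "poses": poses}) + "\n"
        with socket.create_connection((SPAK_HOST, SPAK_PORT), timeout=2.0) as conn:
            conn.sendall(message.encode("utf-8"))

    # Example: the vision side reports what it saw in the current frame.
    # send_facts_to_spak("Hasan", ["FACE", "LEFTHAND", "RIGHTHAND"])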
Example of Person-Centric Aibo
Robot Action Frames
Aibo Action for User “Cho”
Aibo Action for User “Hasan”
Human-Robovie Interaction
(User ‘Cho’)
Interaction Scenarios:
Cho: Comes to Robovie
Robovie: Hi Cho, how are you? (speech + action)
Cho: Shows 'ThumbUp' gesture
Robovie: Oh good! Do you want to play now? (speech)
Cho: Shows 'Ok' gesture
Robovie: Oh, thanks! (speech)
Cho: Shows 'LeftHand' gesture
Robovie: Raises left arm (speech + mimic action)
Cho: Shows 'RightHand' gesture
Robovie: Raises right arm (speech + mimic action)
Cho: Shows 'TwoHand' gesture
Robovie: Bye-bye (speech + action)
Human-Robot Interaction
(Interaction with Aibo)
Gesture   Aibo Action (User 'Hasan')   Aibo Action (User 'Cho')
One       STAND UP                     WALK FORWARD
Two       WALK FORWARD                 WALK BACKWARD
Three     WALK BACKWARD                KICK
…         …                            …
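The same table written as a lookup makes the person-centric mapping explicit: one gesture, two different Aibo actions depending on who was identified. This is a toy rendering; in the real system the mapping lives in SPAK action frames rather than a Python dict.

    # Person-centric gesture-to-action mapping for Aibo, taken from the table above.
    AIBO_ACTIONS = {
        "Hasan": {"One": "STAND UP", "Two": "WALK FORWARD", "Three": "WALK BACKWARD"},
        "Cho":   {"One": "WALK FORWARD", "Two": "WALK BACKWARD", "Three": "KICK"},
    }

    def aibo_action(user, gesture):
        return AIBO_ACTIONS.get(user, {}).get(gesture)

    print(aibo_action("Hasan", "Two"))  # WALK FORWARD
    print(aibo_action("Cho", "Two"))    # WALK BACKWARD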