Hand gesture recognition PROJECT PPT.pptx


Slide Content

HAND GESTURE RECOGNITION WITH CONVOLUTIONAL NEURAL NETWORKS. BATCH 20: 19HU1A05B6 – SHAIK SHABBEER ALI, 19HU1A0579 – GATTA SAI KUMAR, 19HU1A05C1 – THOLUCHURI GOWTHAM, 19HU1A0568 – CHAKKIRALA YESU BABU. MENTOR: DR. K. VENKATESWARA RAO

CONTENTS: INTRODUCTION, PROBLEM STATEMENT, LITERATURE SURVEY, EXISTING SYSTEM, PROPOSED SYSTEM, SYSTEM CONFIGURATION, METHODOLOGY, RESULTS, SYSTEM TESTING, CONCLUSION, FUTURE ENHANCEMENTS, REFERENCES

PROBLEM STATEMENT: Speech-impaired people use hand signs and gestures to communicate, and others often have difficulty understanding this language. Hence there is a need for a system that recognizes the different signs and gestures and conveys their meaning, bridging the gap between speech-impaired people and the rest of society.

INTRODUCTION: Communication is the imparting, sharing, and conveying of information, news, ideas, and feelings. Sign language is a form of non-verbal communication that is gaining impetus and a strong foothold due to its applications in a large number of fields; its most prominent use is by differently-abled persons, such as the deaf. A gesture is a movement of the hand or head that expresses something.

LITERATURE SURVEY
SL NO. | TITLE | AUTHOR | YEAR
01 | Hand Gesture Recognition Based on Computer Vision | Munir Oudah, Ali Al-Naji and Javaan Chahl | 2020
02 | Design of Human Machine Interactive System Based on Hand Gesture Recognition | Xiaofei Ji, Zhibo Wang | 2019
03 | Hand Gesture Recognition for Real Time Human Interaction System | Poonam Sonwalkar, Tanuja Sakhare, Ashwini Patil, Sonal Kale | 2015

EXISTING SYSTEM, MODEL 1: Hand Gesture Recognition on Digital Image Processing Using MATLAB. Developed by a team of researchers and engineers working in the field of computer vision and image processing, this model combines digital image processing techniques with machine learning algorithms. Limitations: limited recognition of dynamic gestures; high computational requirements; sensitivity to hand orientation and position.

EXISTING SYSTEM, MODEL 2: System for Recognition of Indian Sign Language of Deaf People Using Otsu's Algorithm. Developed by a team of researchers and engineers from SIT, India, it uses Otsu's algorithm and image processing techniques to classify hand gestures. Limitations: low accuracy; difficulty in adapting to new users; limited number of hand gestures.
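As an illustration only (not taken from the slides), a minimal OpenCV sketch of Otsu-based hand segmentation could look like this; the input file name hand.jpg is hypothetical:

    # Otsu's thresholding for hand segmentation with OpenCV (illustrative sketch).
    import cv2

    def segment_hand_otsu(frame_bgr):
        """Binarize a hand image using Otsu's automatic threshold selection."""
        gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
        blurred = cv2.GaussianBlur(gray, (5, 5), 0)  # reduce noise before thresholding
        # Otsu's method picks the threshold that minimizes intra-class variance
        _, mask = cv2.threshold(blurred, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
        return mask

    if __name__ == "__main__":
        image = cv2.imread("hand.jpg")  # hypothetical input image
        if image is not None:
            cv2.imwrite("hand_mask.png", segment_hand_otsu(image))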

PROPOSED SYSTEM: MODEL NAME, MODEL ARCHITECTURE, MODEL DESIGN

PROPOSED SYSTEM, MODEL NAME: Our proposed system is a sign language recognition system using convolutional neural networks. It recognizes various hand gestures by capturing video and converting it into frames; the hand pixels are then segmented, and the resulting image is sent for comparison to the trained model. The system is thus more robust in producing the exact text labels of letters.
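As a rough sketch of this pipeline (capture video, take frames, segment the hand, classify with the trained CNN), the code below uses OpenCV and a Keras model; the model file sign_cnn.h5, the 64x64 grayscale input size, the fixed region of interest, and the A–Z label set are all assumptions, not details from the slides:

    # Webcam capture -> hand segmentation -> CNN prediction (illustrative sketch).
    import cv2
    import numpy as np
    from tensorflow.keras.models import load_model

    LABELS = [chr(c) for c in range(ord("A"), ord("Z") + 1)]  # assumed letter labels

    model = load_model("sign_cnn.h5")  # hypothetical trained model file
    cap = cv2.VideoCapture(0)          # capture video from the default camera

    while True:
        ok, frame = cap.read()                     # one frame at a time
        if not ok:
            break
        roi = frame[100:300, 100:300]              # assumed hand region of interest
        gray = cv2.cvtColor(roi, cv2.COLOR_BGR2GRAY)
        _, mask = cv2.threshold(gray, 0, 255,      # segment hand pixels (Otsu)
                                cv2.THRESH_BINARY + cv2.THRESH_OTSU)
        x = cv2.resize(mask, (64, 64)).astype("float32") / 255.0
        probs = model.predict(x.reshape(1, 64, 64, 1), verbose=0)[0]
        cv2.putText(frame, LABELS[int(np.argmax(probs))], (100, 90),
                    cv2.FONT_HERSHEY_SIMPLEX, 1.5, (0, 255, 0), 2)
        cv2.imshow("Sign Language Recognition", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break

    cap.release()
    cv2.destroyAllWindows()

This is only a sketch of the described flow, not the project's actual code; in practice the region of interest and preprocessing must match whatever the model was trained on.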

MODEL ARCHITECTURE

MODEL DESIGN

SEQUENCE DIAGRAM

STATE CHART DIAGRAM

SYSTEM CONFIGURATION
Software requirements: OS: Windows or Mac; SDK: OpenCV, TensorFlow, NumPy, Keras
Hardware requirements: Camera: 3 MP; RAM: 8 GB; Processor: Intel 4; HDD: 10 GB; GPU: 4 GB
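A quick way to confirm the listed SDKs are installed is a small version-check script (a convenience sketch, not part of the slides):

    # Print versions of the SDKs named in the system configuration.
    import cv2
    import numpy as np
    import tensorflow as tf

    print("OpenCV:", cv2.__version__)
    print("NumPy:", np.__version__)
    print("TensorFlow:", tf.__version__)
    print("Keras (bundled with TensorFlow):", tf.keras.__version__)
    print("GPU devices:", tf.config.list_physical_devices("GPU"))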

METHODOLOGY: Training Model, Preprocessing, Image Scaling, Segmentation. Algorithm: CNN.
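For the CNN step, a small Keras model of the kind such projects typically use is sketched below; the layer sizes, the 64x64 grayscale input, and the 26 output classes (letters A–Z) are assumptions rather than details given on the slides:

    # Illustrative CNN for letter classification (layer choices are assumptions).
    from tensorflow.keras import layers, models

    def build_sign_cnn(input_shape=(64, 64, 1), num_classes=26):
        """Small CNN that classifies segmented hand images into letter labels."""
        model = models.Sequential([
            layers.Conv2D(32, (3, 3), activation="relu", input_shape=input_shape),
            layers.MaxPooling2D((2, 2)),   # downsample feature maps
            layers.Conv2D(64, (3, 3), activation="relu"),
            layers.MaxPooling2D((2, 2)),
            layers.Flatten(),
            layers.Dense(128, activation="relu"),
            layers.Dropout(0.5),           # reduce overfitting
            layers.Dense(num_classes, activation="softmax"),
        ])
        model.compile(optimizer="adam",
                      loss="categorical_crossentropy",
                      metrics=["accuracy"])
        return model

    model = build_sign_cnn()
    model.summary()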

RESULTS Screenshot of the result obtained for letter A

RESULTS Screenshot of the result obtained for letter W

RESULTS Screenshot of the result obtained for letter L

SYSTEM TESTING: 1. Unit Testing, 2. Integration Testing, 3. White-Box Testing, 4. Black-Box Testing, 5. Acceptance Testing
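As an example of the unit-testing level, a pytest-style test of a preprocessing helper could look like the sketch below; the preprocess function is a hypothetical stand-in, since the slides do not show the project's code:

    # Illustrative unit test for a frame-preprocessing helper (pytest style).
    import cv2
    import numpy as np

    def preprocess(frame_gray, size=(64, 64)):
        """Resize a grayscale frame and scale pixel values to [0, 1]."""
        return cv2.resize(frame_gray, size).astype("float32") / 255.0

    def test_preprocess_shape_and_range():
        frame = np.random.randint(0, 256, (200, 200), dtype=np.uint8)  # synthetic frame
        out = preprocess(frame)
        assert out.shape == (64, 64)                    # scaled to the CNN input size
        assert out.min() >= 0.0 and out.max() <= 1.0    # values normalized

Running pytest on this file executes the test.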

CONCLUSION: We developed an effective method for dynamic hand gesture recognition with 2D convolutional neural networks, which gives accurate results across varied conditions. Future work will include more adaptive selection of the optimal hyper-parameters of the CNNs and investigation of robust classifiers that can classify higher-level dynamic gestures, including activities and motion contexts.

FUTURE ENHANCEMENTS: The proposed sign language recognition system, which currently recognizes sign language letters, can be extended to recognize gestures and facial expressions. Instead of displaying letter labels, displaying full sentences would give a more natural translation and improve readability. The scope can be widened to different sign languages, and more training data can be added to detect letters with higher accuracy. The project can also be extended to convert the recognized signs to speech, as sketched below.
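For the sign-to-speech extension, one option is an offline text-to-speech library such as pyttsx3; the slides do not name a specific tool, so the library choice is an assumption:

    # Speak recognized text aloud with pyttsx3 (illustrative; library choice is an assumption).
    import pyttsx3

    def speak(text):
        engine = pyttsx3.init()  # offline text-to-speech engine
        engine.say(text)
        engine.runAndWait()

    speak("HELLO")  # e.g. letters accumulated into a word by the recognizer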

REFERENCES
[1] S. Mitra and T. Acharya. Gesture recognition: A survey. IEEE Transactions on Systems, Man, and Cybernetics, Part C, 37:311–324, 2007.
[2] V. I. Pavlovic, R. Sharma, and T. S. Huang. Visual interpretation of hand gestures for human-computer interaction: A review. PAMI, 19:677–695, 1997.
[3] J. J. LaViola Jr. An introduction to 3D gestural interfaces. In SIGGRAPH Course, 2014.
[4] S. B. Wang, A. Quattoni, L. Morency, D. Demirdjian, and T. Darrell. Hidden conditional random fields for gesture recognition. In CVPR, pages 1521–1527, 2006.

THANK YOU