**Project Proposal: Sign-to-Speech Conversion System with Mimic Robotic Hand for Biomedical Applications**
#### **1. Project Title:** Dual-Functional Assistive Device: Sign-to-Speech Conversion System Combined with a Mimic Robotic Hand for Patients with Speech Impairment and Limb Loss
#### **2. Project Overview:**
The proposed project aims to develop a dual-functional assistive device that integrates a sign-to-speech conversion system with a mimic robotic hand. The project addresses the communication challenges faced by two distinct patient groups: those who can make hand gestures but cannot speak (e.g., patients with speech impairments) and those who have lost their hands or fingers and need a robotic prosthetic to communicate and interact with their environment.
The sign-to-speech system uses a glove embedded with flex sensors and an Arduino Uno to detect finger gestures and convert them into audible speech via a Bluetooth-connected app. The mimic robotic hand, equipped with servo motors, replicates these gestures, providing an alternative communication method for individuals without hands. Together, these systems form a versatile, inclusive assistive device for various medical applications.
#### **3. Objectives:**
1. **Design and Implement a Sign-to-Speech System:** Develop a glove-based system using flex sensors, an Arduino Uno, an LCD, and a Bluetooth module to detect specific hand gestures and convert them into pre-recorded speech.
2. **Develop a Mimic Robotic Hand:** Create a robotic prosthetic hand controlled by servo motors and flex sensors to replicate hand gestures, providing an alternative communication mode for individuals without hands.
3. **Integrate Both Systems into a Unified Device:** Ensure seamless integration between the sign-to-speech system and the mimic robotic hand for a comprehensive assistive communication tool.
4. **Create a Mobile Application:** Develop an Android app to display the text of the detected gestures and output the corresponding speech, enhancing the usability and accessibility of the system.
5. **Evaluate Usability and Effectiveness:** Conduct tests and simulations to evaluate the device's effectiveness, accuracy, and usability for patients in different healthcare settings.
#### **4. Background and Justification:**
Communication is a fundamental need for every individual. However, patients with speech impairments or those who have lost their hands face significant challenges in communicating their needs. The proposed project combines two solutions—sign-to-speech conversion and a mimic robotic hand—into a single device that addresses these challenges comprehensively:
- **For Patients with Speech Impairments:** The sign-to-speech system enables individuals to communicate by making specific hand gestures, which are then translated into audible speech. This can be critical in hospital settings, rehabilitation centers, and daily life.
- **For Patients with Limb Loss:** The mimic robotic hand serves as a prosthetic device that can replicate gestures, helping patients without hands communicate or interact with their environment. This has applications in rehabilitation, prosthetic development, and adaptive devices.
#### **5. Technical Approach:**
1. **Hardware Components:**
- **Flex Sensors:** Attached to the glove to detect finger bending and translate it into corresponding voltage changes (a minimal reading sketch follows this list).
- **Arduino Uno:** Serves as the microcontroller for processing sensor data and controlling both the LCD display and Bluetooth communication.
- **Bluetooth Module (HC-05/HC-06):** Connects to a smartphone app to transmit data and play corresponding speech.
- **LCD Display:** Provides visual feedback of the detected gestures.
- **Servo Motors (for Robotic Hand):** Used to replicate hand movements and gestures based on input from the flex sensors.
- **Mimic Robotic Hand:** A 3D-printed or mechanically constructed hand that performs gestures based on the detected inputs from the glove.
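As a minimal sketch of how one flex sensor could be read, the code below assumes the sensor is wired as a voltage divider from 5 V to analog pin A0 with a 10 kΩ fixed resistor to ground; the pin, resistor value, and supply voltage are assumptions for illustration, not final design choices.

```cpp
// Minimal flex-sensor read (assumed wiring: flex sensor 5V->A0, 10k resistor A0->GND).
const int FLEX_PIN = A0;        // assumed analog input for one finger
const float VCC = 5.0;          // Arduino Uno supply voltage
const float R_FIXED = 10000.0;  // assumed fixed divider resistor, in ohms

void setup() {
  Serial.begin(9600);
}

void loop() {
  int adc = analogRead(FLEX_PIN);              // raw ADC value, 0..1023
  if (adc < 1) adc = 1;                        // avoid dividing by zero below
  float vOut = adc * VCC / 1023.0;             // voltage at the divider midpoint
  float rFlex = R_FIXED * (VCC / vOut - 1.0);  // sensor resistance rises as the finger bends
  Serial.print("ADC=");
  Serial.print(adc);
  Serial.print("  R=");
  Serial.println(rFlex);
  delay(200);
}
```

Logging both the raw ADC value and the computed resistance makes it easy to collect per-finger calibration limits before choosing gesture thresholds.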
2. **Software Components:**
- **Arduino IDE:** Programming environment to code the logic for reading sensor data, gesture recognition, and communication with the Bluetooth module.
- **Android Application:** A custom app developed to receive Bluetooth data, display the gesture text, and output the speech.
- **Servo Control Algorithms:** Control algorithms to manage the movements of the robotic hand in sync with the gestures detected by the glove (a single-finger example follows this list).
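The following sketch illustrates how one finger of the robotic hand could track its flex sensor; the pin assignments and the 200–800 raw range are placeholder calibration values, not measured ones.

```cpp
#include <Servo.h>

Servo indexServo;
const int FLEX_PIN  = A0;   // assumed flex sensor input for the index finger
const int SERVO_PIN = 9;    // assumed servo signal pin on the robotic hand

void setup() {
  indexServo.attach(SERVO_PIN);
}

void loop() {
  int raw = analogRead(FLEX_PIN);         // raw reading between straight and fully bent
  raw = constrain(raw, 200, 800);         // clamp to assumed calibration limits
  int angle = map(raw, 200, 800, 0, 180); // convert the reading to a servo angle
  indexServo.write(angle);                // robotic finger mirrors the glove
  delay(20);                              // roughly 50 Hz update rate
}
```

The same mapping would be repeated per finger, with each finger's calibration limits taken from the flex-sensor logging step above.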
3. **Algorithm and System Design:**
- **Gesture Recognition Algorithm:** Map sensor values to predefined gestures using threshold-based detection to distinguish between different finger movements (a sketch combining this with the Bluetooth protocol follows this list).
- **Data Communication Protocol:** Establish a communication protocol between the Arduino, Bluetooth module, and Android app to ensure accurate and real-time gesture translation.
- **Robotic Hand Control:** Implement control algorithms that accurately translate gestures detected by the glove into servo motor movements on the robotic hand.
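To make the threshold-based recognition and the Arduino-to-app link concrete, here is a hedged sketch: each gesture is a pattern of bent/straight fingers, and a matched gesture ID is sent to the HC-05 as a short newline-terminated message. The gesture table, threshold value, pin choices, and `"G<id>"` message format are illustrative assumptions; the app side would map each ID to display text and pre-recorded speech.

```cpp
#include <SoftwareSerial.h>

SoftwareSerial bt(10, 11);                    // assumed RX/TX pins wired to the HC-05
const int flexPin[5] = {A0, A1, A2, A3, A4};  // assumed inputs, thumb..pinky
const int BEND_THRESHOLD = 500;               // assumed ADC value separating straight from bent

// Illustrative gesture table: 1 = finger bent, 0 = finger straight (thumb..pinky).
const byte GESTURES[][5] = {
  {1, 0, 0, 0, 0},   // gesture 0, e.g. "I need water"
  {1, 1, 0, 0, 0},   // gesture 1, e.g. "Call the nurse"
  {1, 1, 1, 1, 1},   // gesture 2, e.g. "Yes"
};
const int NUM_GESTURES = sizeof(GESTURES) / sizeof(GESTURES[0]);

void setup() {
  bt.begin(9600);                             // default HC-05 baud rate
}

void loop() {
  byte state[5];
  for (int i = 0; i < 5; i++)                 // classify each finger as bent or straight
    state[i] = analogRead(flexPin[i]) > BEND_THRESHOLD ? 1 : 0;

  for (int g = 0; g < NUM_GESTURES; g++) {    // threshold pattern matching
    bool match = true;
    for (int i = 0; i < 5; i++)
      if (state[i] != GESTURES[g][i]) { match = false; break; }
    if (match) {
      bt.print("G");                          // simple line protocol: "G<id>\n"
      bt.println(g);                          // the Android app maps the ID to text and speech
      break;
    }
  }
  delay(300);                                 // pause between transmissions to avoid flooding the app
}
```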
4. **Testing and Validation:**
- **Simulation Testing:** Use Proteus software to simulate the circuit and verify the functionality of both the glove-based system and the robotic hand.
- **Physical Testing:** Build prototypes for both systems and conduct usability tests with real users (e.g., volunteers with speech impairment or limb loss) to validate the effectiveness and reliability of the device.
#### **6. Project Timeline:**
| **Phase** | **Duration** | **Tasks** |
|-----------------------------|------------------|---------------------------------------------------------------------------------------|
| **Research and Design** | 4 weeks | Literature review, defining requirements, system design, and component selection. |
| **Hardware Development** | 6 weeks | Building the glove system, setting up Arduino, integrating LCD and Bluetooth modules. |
| **Robotic Hand Development**| 6 weeks | Design and construction of the robotic hand, servo motor integration. |
| **Software Development** | 8 weeks | Developing Arduino code, Android app, and gesture recognition algorithms. |
| **Integration and Testing** | 4 weeks | Integrating both systems, debugging, and performing simulation tests in Proteus. |
| **User Testing and Validation** | 4 weeks | Conduct usability tests, gather feedback, and make necessary improvements. |
| **Final Documentation and Presentation** | 2 weeks | Preparing the project report, presentation, and demonstration setup. |
#### **7. Expected Outcomes:**
- A fully functional glove-based sign-to-speech system integrated with a mimic robotic hand.
- An Android application that effectively displays and converts gestures to speech.
- Demonstration of the system's effectiveness through user testing and real-world simulations.
- A project report detailing the design, implementation, and evaluation phases.
#### **8. Conclusion:**
This project aims to bridge the gap in communication technology for patients with speech impairments and limb loss. By combining a sign-to-speech system with a mimic robotic hand, it offers a unique, dual-functional solution that can enhance the quality of life and independence of patients. The project aligns with the goals of Biomedical Engineering to innovate and provide real-world solutions to healthcare challenges, making it a valuable contribution to the field.