HUMAN.pptx: a robot project that acts like a toy

tulsamma584101 18 views 24 slides Jul 21, 2024

About This Presentation

A voice controlled humanoid robot with facial expression.


Slide Content

VISVESVARAYA TECHNOLOGICAL UNIVERSITY, "Jnana Sangama", Belgaum: 590 018. H.K.E. SOCIETY'S SIR M. VISVESVARAYA COLLEGE OF ENGINEERING (Affiliated to VTU, Belagavi; Approved by AICTE; Accredited by NAAC), Yeramarus Camp, Raichur-584135, Karnataka. 2023-2024 FINAL YEAR PROJECT PHASE-2 PRESENTATION ON "VOICE CONTROLLED HUMANOID ROBOT WITH FACIAL EXPRESSIONS" UNDER THE GUIDANCE OF DR. VISHWANATH P, DEPARTMENT OF ELECTRONICS AND COMMUNICATION ENGINEERING

VOICE CONTROLLED HUMANOID ROBOT WITH FACIAL EXPRESSIONS

TEAM MEMBERS: MD SOUDAGAR KAIF, MD SHOAIB, RAMYA, TULSAMMA, SAHANA

CONTENTS: INTRODUCTION, LITERATURE SURVEY, OBJECTIVES, IMPLEMENTATION OF PROPOSED SYSTEM, HARDWARE AND SOFTWARE REQUIREMENTS, SIGNIFICANCE, RESULTS, CONCLUSION, REFERENCES

INTRODUCTION In the era of rapid technological advancement, the convergence of robotics and the Internet of Things (IoT) has opened up boundless possibilities, revolutionizing the way we interact with machines. At the forefront of this evolution stands the humanoid robot, a marvel of engineering that bridges the gap between humans and machines like never before. Generally speaking, robots of today are still far from achieving the intelligence and flexibility of their fictional counterparts.

Intro.... Using IoT in robotic automation improves productivity, increases the lifespan of robots, and optimizes maintenance. This analysis, integrated with the Arduino code, aids the implementation of various gaits on the robot. IoT-based humanoid robots are used as research tools in several scientific areas.

LITERATURE REVIEW [1] T. Furuta and T. Tawara worked on "Design and Compact Body Humanoid Robots," implemented in the ESYS humanoid project, which mainly addressed biped walk control and four biped locomotion control strategies. [2] Qinghua Li proposed "Learning Control Methods for Compensative Trunk Motion for a Biped Walking Robot," based on the Zero Moment Point (ZMP), and developed a biped walking robot with ZMP measurement. [3] Sebastian Lohmeier presented the "25-DOF Full-Size Humanoid Robot LOLA," characterized by a redundant kinematic configuration, an extremely lightweight design, and improved mass distribution to achieve good dynamic performance. [4] O. Eiberger presented a "Humanoid Two-Arm System" based on the modular DLR-Lightweight-Robot-III and the DLR-Hand-II, developed for studying dexterous two-handed manipulation.

OBJECTIVES 1. Leveraging IoT connectivity 2. Prototyping locomotion systems 3. Showcasing predefined facial expressions 4. Safety features

BLOCK DIAGRAM FOR HUMAN LOCOMOTION: Designing the Control Interface → Sending Commands via Bluetooth → Parsing Incoming Data on Arduino → Interpreting User Input → Executing Robot Actions

METHODOLOGY 1. Designing the Control Interface: Design a user-friendly interface in the Bluetooth app that allows users to input commands or select actions for the robot. This interface includes buttons, sliders, dropdown menus, text input fields, etc. 2. Sending Commands via Bluetooth: Implement the functionality in the app to send commands or instructions to the Arduino Mega via Bluetooth. This involves establishing a Bluetooth connection with the HC-05 module and sending data packets containing the user's input. 3. Parsing Incoming Data on Arduino: Pass the commands to the Arduino Mega, which receives and interprets the data sent by the Bluetooth app. This involves reading characters or strings from the serial buffer and parsing them to extract the user's commands or instructions.

4. Interpreting User Input: Once the data is received, the Arduino code interprets the user's input to determine the desired action for the robot. This involves mapping specific input values or commands to corresponding robot movements or behaviors. 5. Executing Actions: Based on the interpreted user input, the code triggers the appropriate actions on the robot.

BLOCK DIAGRAM FOR PREDEFINED EXPRESSIONS: Choosing facial expressions → Scaling → Bitmap format; Choosing text → Waveform format → Convert to 8-bit PCM → Hex code; both paths feed the ESP32, which drives the PAM amplifier and the OLED display to produce the output.

METHODOLOGY Choosing facial expressions: In this first step, some facial expressions are chosen to display on the OLED display. Scaling: In order to match the images to the resolution of the display, the images are converted to a resolution of 128x64 pixels. Bitmap format: The images are converted to bitmap format, where each pixel's color information is encoded as a binary value (1 for illuminated, 0 for non-illuminated). Choosing text: Gathering the predefined texts that we want the robot to utter.

Waveform format: Using a text-to-speech converter, we convert the text to the corresponding audio waveform. Convert to 8-bit PCM: Unsigned 8-bit PCM is one of the simplest audio formats, making it easier to work with on the ESP32. Hex code: Iterating through the audio samples, each sample is converted to hexadecimal format. Hexadecimal representation allows for compact storage of the audio data.

HARDWARE REQUIREMENTS HC05 BLUETOOTH ARDUINO MEGA SERVO MOTOR

SPEAKER PAM (AMPLIFIER) ESP32 CONTROLLER OLED DISPLAY

SOFTWARE REQUIREMENTS Arduino IDE: Used to write the code and upload it to the physical board. Notevibes: An AI voice generator that reads digital text aloud and saves the produced audio as either a WAV or MP3 file. Audacity: Software used for recording and editing audio. Bitmap Array: Used to convert an image into code (store digital images). Tomeko: Converts a hexadecimal log into a series of bits; used to debug unexpected bit shifts when writing and reading back data to an SPI memory chip. MIT App Inventor: An intuitive, visual programming environment that allows everyone to build fully functional apps for smartphones and tablets.

ADVANTAGES Customizable and adaptable; Low cost; Accessible design; Interfacing with the Arduino Mega; Remote control

APPLICATIONS 1. Education tool 2. Physical therapy and rehabilitation 3. Medical institutions 4. Assistive technology 5. Entertainment industry

RESULTS

REFERENCES 1. M. Meghana et al., 2020, "Hand Gesture Recognition and Voice-Controlled Robot," Materials Today: Proceedings, 2214-7853. 2. Bhanu Chandu and Kirupa Ganapathy, 2020, "Voice Controlled Human Assistance Robot," International Conference on Advanced Computing & Communication Systems (ICACCS), 978-1-7281-5197-7/20. 3. P. Mahesh Reddy, Suram Pavan Kalyan Reddy, G. R. Sai Karthik, and Priya B. K., 2020, "Intuitive Voice Controlled Robot for Obstacle, Smoke and Fire Detection for Physically Challenged People," International Conference on Trends in Electronics and Informatics (ICOEI), ISBN: 978-1-7281-5518-0. 4. Linda John et al., 2020, "Voice Control Human Assistance Robot," National Conference on Technical Advancements for Social Upliftment, Proceedings of the 2nd VNC, VNC-2020.

CONCLUSION Considering the Arduino Mega, the project would likely focus on replicating basic human gaits like walking, due to the platform's limitations in processing power and motor control capability. It represents a compelling example of how robotics and IoT technologies can be harnessed to address diverse challenges and create opportunities for learning, growth, and societal advancement. This feature opens up opportunities for practical applications in fields such as rehabilitation, sports training, entertainment, and assistive technology, demonstrating the relevance and impact of STEM education in addressing real-world challenges.

THANK YOU..