An Introduction to Neural Nets using the Neurons.pptx
rahulmishraalpha
About This Presentation
This presentation provides an introduction to cognitive neurocomputing, in particular sensing a person's mood by analysing the brainwaves they generate, using deep learning tools.
Size: 10.27 MB
Language: en
Added: Jun 15, 2024
Slides: 24 pages
Slide Content
An Introduction to Neural Networks using the Neurons of our Brain. Rahul Mishra, B.Tech CSE, 2201114017
Our aim is to somehow predict the emotional state and mood of a subject from the EEG signal…
We know that there are electrical signals travelling along the axon of a neuron and chemical activity at the dendrites and the cell body. Rudimentarily, we can take these as the building blocks of the thought process and of all neural activity in our brain. How the neurons work
Different layers of neurons are arranged in a parallel fashion, giving rise to a common source of electrical signal that can be picked up by the individual electrodes of the EEG. What is this Electroencephalography (EEG)?
One of the most widely used EEG devices is the EMOTIV EPOC X Brainwear®. It has 14 channels that measure the various brainwave signals coming from different zones of the cerebral cortex.
From historical studies we can find these waveforms and their correlation with various states of the body. Getting to know the signals
The tedious job: Making a Data File
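The slide itself only shows the spreadsheet, but a minimal loading sketch might look like the following, assuming the CSV layout of the Kaggle EEG emotion dataset referenced at the end (the file name emotions.csv and the label column are assumptions):

```python
# A minimal sketch of loading the EEG feature file, assuming one 'label'
# column plus ~2548 numeric feature columns per recording window.
import pandas as pd

df = pd.read_csv("emotions.csv")           # file name is an assumption
print(df.shape)                            # e.g. (n_samples, 2549)
print(df["label"].value_counts())          # NEGATIVE / NEUTRAL / POSITIVE

# Map the text labels to integers so a model can consume them.
label_map = {"NEGATIVE": 0, "NEUTRAL": 1, "POSITIVE": 2}
y = df["label"].map(label_map)
X = df.drop(columns=["label"])
```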
Tools used
Our result
We need to somehow build a model which predicts the emotional state of the subject based on the given data fields. What do we need to know now, and how shall we do it? Neural Networks. Oh man, again those neurons
But before anything: Machine Learning is simply the use of statistical algorithms and techniques to learn from data and generalize to unseen data, allowing systems to perform tasks without explicit instructions. The algorithm or technique that finds patterns in datasets and generalizes to make decisions on unseen data is called a model. The generalization and yielding of decisions on unknown data is prediction. A model is said to learn from data if supplying more data increases its prediction accuracy. The process of feeding data to the model so that it can learn is called training the model. The process of using another set of similar but unseen data to check the precision of the model's predictions is validating the model.
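As a toy illustration of training versus validating (not the EEG model itself, just scikit-learn on synthetic data):

```python
# Train on one split of the data, validate on a held-out split.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

X_toy, y_toy = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_val, y_train, y_val = train_test_split(X_toy, y_toy, test_size=0.2, random_state=0)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)                              # training: learn from data
val_acc = accuracy_score(y_val, model.predict(X_val))    # validating: check on unseen data
print(f"validation accuracy: {val_acc:.3f}")
```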
Statistics + Programming = Machine Learning (AI). ML + Domain Knowledge = Data Science
Here is the neural network: an input of 2500 values scaled down to 784, with the mood value output on a scale of positive, negative and neutral (+, -, 0).
One simple mathematical equation describes the neural network
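The slide does not reproduce the equation in text, but the standard form for one fully connected layer (in the notation popularised by the 3Blue1Brown videos cited in the references) is:

```latex
% Each layer's activation vector is an affine transform of the previous
% layer's activations passed through a nonlinearity sigma (e.g. sigmoid or ReLU):
a^{(l)} = \sigma\!\left( W^{(l)} a^{(l-1)} + b^{(l)} \right)
```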
Thought experiment: sit and set all of these dials, the weights and biases, by hand, so as to get our desired result out.
So, how does the model exactly know which neural network (which set of weights) works out during training? The Cost Function
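The exact cost used in the slides is not shown; a common textbook choice (again following the 3Blue1Brown presentation) is the mean squared error over all training examples, though classification code usually uses cross-entropy instead:

```latex
% Average, over all n training examples x, of the squared distance between
% the network's output \hat{y}(x) and the desired label y(x):
C(W, b) = \frac{1}{n} \sum_{x} \left\lVert \hat{y}(x) - y(x) \right\rVert^{2}
```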
We can roughly say that learning is minimizing this cost function. The gradient descent is STOCHASTIC, i.e. the man's trip downhill in the graph is like a drunkard quickly stumbling down rather than a meticulously precise but slow descent. Look also at the Adam and SGD optimizers.
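A rough sketch of what "stochastic" means in practice: each update step uses only a small random mini-batch instead of the full dataset. This is a generic numpy illustration on a linear model, not the slide's code:

```python
# Mini-batch stochastic gradient descent on a linear model: each step uses
# a small random batch -- the "drunken" but fast descent downhill.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))
true_w = np.array([1.0, -2.0, 0.5, 3.0, 0.0])
y = X @ true_w + rng.normal(scale=0.1, size=1000)

w = np.zeros(5)
lr, batch_size = 0.05, 32
for step in range(500):
    idx = rng.choice(len(X), size=batch_size, replace=False)
    Xb, yb = X[idx], y[idx]
    grad = 2 * Xb.T @ (Xb @ w - yb) / batch_size   # gradient of squared error on the batch
    w -= lr * grad                                  # step downhill
print(w)  # ends up close to true_w
```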
Fine, but how does this adjustment of weights and biases happen?
Backpropagation
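A toy numpy illustration of backpropagation for a one-hidden-layer network with sigmoid activations and squared-error loss (purely illustrative; sizes and learning rate are arbitrary):

```python
# Gradients flow backwards layer by layer via the chain rule, then every
# weight is nudged against its gradient.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
X = rng.normal(size=(64, 10))                      # 64 samples, 10 features
y = rng.integers(0, 2, size=(64, 1)).astype(float)

W1, b1 = rng.normal(scale=0.1, size=(10, 16)), np.zeros(16)
W2, b2 = rng.normal(scale=0.1, size=(16, 1)),  np.zeros(1)
lr = 0.5

for _ in range(200):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # backward pass (chain rule)
    d_out = (out - y) * out * (1 - out)            # error at the output layer
    dW2, db2 = h.T @ d_out, d_out.sum(axis=0)
    d_h = (d_out @ W2.T) * h * (1 - h)             # propagate error to the hidden layer
    dW1, db1 = X.T @ d_h, d_h.sum(axis=0)
    # gradient step
    W2 -= lr * dW2 / len(X); b2 -= lr * db2 / len(X)
    W1 -= lr * dW1 / len(X); b1 -= lr * db1 / len(X)
```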
Oh man, enough of maths, where is the code piece… Splitting the data into sets for training, validating and testing is done here. Then comes the creation of the different layers of the deep learning model and the association of activation functions, with 3 output classes and 2549 data columns.
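A sketch of what those code screenshots describe, reusing X and y from the loading sketch above; the layer widths and activations are assumptions rather than the exact values in the slides:

```python
# Split the data, then build a small fully connected network in Keras.
from sklearn.model_selection import train_test_split
from tensorflow import keras

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
X_train, X_val, y_train, y_val = train_test_split(X_train, y_train, test_size=0.25, random_state=42)

model = keras.Sequential([
    keras.layers.Input(shape=(X.shape[1],)),        # 2548 EEG features per row
    keras.layers.Dense(256, activation="relu"),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(3, activation="softmax"),     # NEGATIVE / NEUTRAL / POSITIVE
])
model.summary()
```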
Things which almost made it to the slides: overfitting, batch normalization and dropout. The final validation of the data, applying the SGD technique, and the actual learning and validation take place here…
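And a sketch of those remaining pieces: dropout and batch normalization against overfitting, an SGD-family optimizer, and the fit/evaluate calls where the learning and validation actually happen. All hyperparameters here are illustrative assumptions:

```python
# Regularised version of the model plus the training and evaluation calls.
model = keras.Sequential([
    keras.layers.Input(shape=(X.shape[1],)),
    keras.layers.Dense(256, activation="relu"),
    keras.layers.BatchNormalization(),
    keras.layers.Dropout(0.3),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dropout(0.3),
    keras.layers.Dense(3, activation="softmax"),
])
model.compile(optimizer=keras.optimizers.SGD(learning_rate=0.01, momentum=0.9),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

history = model.fit(X_train, y_train,
                    validation_data=(X_val, y_val),   # validation during learning
                    epochs=50, batch_size=32)
test_loss, test_acc = model.evaluate(X_test, y_test)  # final check on unseen data
```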
A look ahead to Neuralink… the first leap towards the cyborg.
References
Kaggle Course on Deep Learning: https://www.kaggle.com/learn/intro-to-deep-learning
Code and dataset from Gabriel Atkin: https://www.kaggle.com/code/gcdatkin/eeg-emotion-prediction
Graphics from 3Blue1Brown: https://www.youtube.com/c/3blue1brown