AI Convolutional Neural Network for Rice Grain Classification.


About This Presentation

This project focuses on the classification of rice grains using an AI-driven Convolutional Neural Network (CNN). As rice is a staple food for billions globally, accurate classification of different varieties is essential for quality control and market expansion. This project specifically explores th...


Slide Content

Rice Grain CNN Classification Model
By Mahir Kardame P2725883

Overview of the Project
Situation
•KRBL Limited required a way to identify & classify different types of rice grains: Arborio, Basmati, Ipsala, Jasmine, and Karacadag.
•Goal: expand & grow their agricultural and exporting business into other rice markets.
•Aim: develop a CNN rice grain classifier model to help KRBL Limited with this business expansion.
Solution
•Utilized an image dataset consisting of 75,000 segmented rice grain images.
•Developed a Convolutional Neural Network (CNN) to classify rice grains from images using MATLAB's Deep Learning Toolbox.
Findings
•The RGCNN model successfully classifies 5 rice grain varieties from images.
•Excellent accuracy & precision with quick convergence times.
•Implementation helps ensure KRBL maintains consistently high quality standards across all rice products.

Splitting the Rice Image Dataset for Model Evaluation
A total of 75,000 segmented rice grain images were provided:
•15,000 images per rice class (5 rice grain varieties in total).
•Time taken increases with the number of images used.
•Quick training & validation times are required to make the model practical for industry use.
1,500 rice grain images were therefore split into different ratios (a minimal MATLAB sketch of this split follows below):
•Training (70%) – the CNN model learns & identifies patterns in the images.
•Validation (15%) – crucial to minimize potential overfitting.
•Testing (15%) – ensures unbiased evaluation on unseen rice grain images.
Future Recommendations
•Identify and address any potential outliers.
•Check for potentially duplicated images.
•Handle images affected by noise:
1) Blurred images
2) Low-resolution images
3) Images featuring unusually sized grain shapes
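The slides do not include code, but the 70/15/15 split described above could be produced with MATLAB's Deep Learning Toolbox roughly as follows. This is a minimal sketch, assuming the images sit in a folder named "riceImages" with one subfolder per variety and that a balanced subset of 300 images per class (1,500 in total) is drawn for speed; the folder name and subset size are illustrative assumptions, not values stated in the presentation.

    % Load the rice images; labels are taken from the subfolder names.
    % 'riceImages' is an assumed folder name, not one given in the slides.
    imds = imageDatastore('riceImages', ...
        'IncludeSubfolders', true, 'LabelSource', 'foldernames');

    % Keep a small, balanced subset (assumed: 300 images per class, 1,500 total)
    % so training and validation stay quick.
    imdsSubset = splitEachLabel(imds, 300, 'randomized');

    % 70% training, 15% validation, remaining 15% testing.
    [imdsTrain, imdsVal, imdsTest] = splitEachLabel(imdsSubset, 0.70, 0.15, 'randomized');

Because splitEachLabel splits per label, each of the three subsets keeps the same balance across the 5 varieties.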

Pre-processing by Image Data Augmentation
Techniques implemented (sketched in code below):
•Reflection
•Rotation
•Scaling
•Translation (x, y)
(Slide shows example rice grain images pre-augmentation and post-augmentation.)
Benefits
•Reduces the influence of noise in the input data.
•Improves the model's ability to learn diverse features of rice grains.
•Augments the diversity of the training image data for the RGCNN model:
 - Reduces the chance of overfitting.
 - Improves generalization to unseen rice grains.
 - Produces very stable, non-chaotic training graphs.
 - The RGCNN model achieves higher accuracy & precision results.
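A minimal sketch of how the four listed augmentations could be expressed with MATLAB's imageDataAugmenter; the ranges below are illustrative assumptions rather than values reported in the presentation, and imdsTrain/imdsVal come from the split sketch above.

    % Random reflection, rotation, scaling, and x/y translation.
    augmenter = imageDataAugmenter( ...
        'RandXReflection',  true, ...       % reflection
        'RandRotation',     [-30 30], ...   % rotation in degrees (assumed range)
        'RandScale',        [0.9 1.1], ...  % scaling (assumed range)
        'RandXTranslation', [-10 10], ...   % translation in x, pixels (assumed)
        'RandYTranslation', [-10 10]);      % translation in y, pixels (assumed)

    % Apply the augmentations on the fly while resizing to AlexNet's
    % 227x227x3 input size; only the training set is augmented.
    inputSize    = [227 227 3];
    augimdsTrain = augmentedImageDatastore(inputSize, imdsTrain, ...
        'DataAugmentation', augmenter);
    augimdsVal   = augmentedImageDatastore(inputSize, imdsVal);

Because the augmentations are applied on the fly rather than stored on disk, each epoch sees slightly different grain images, which is what drives the overfitting reduction mentioned above.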

RGCNN Model Network Architecture
AlexNet
- Simple architecture: 8 layers in total, 5 convolutional layers & 3 fully connected layers.
Advantages of the AlexNet implementation (a transfer learning sketch follows below):
•Faster training & validation times due to the simple structure.
•Max pooling layers allow the model to capture visual patterns of rice grains (texture & shape).
•This allows for better classification accuracy & precision results.
GoogleNet
- Complex architecture: 22 layers, including convolutional layers, pooling layers, inception modules & fully connected layers.
•Better classification results but with longer convergence times.
•Unstable & chaotic training & validation graphs.
•Prone to overfitting: the model becomes too specialized to the training image data (capturing noise) rather than capturing general patterns.
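If AlexNet is adapted by transfer learning (the standard MATLAB pattern; the slides do not show the exact network surgery used), its final layers are swapped for ones sized to the 5 rice classes. A minimal sketch, assuming the pretrained AlexNet support package is installed:

    % Load pretrained AlexNet and keep everything except its final
    % fully connected / softmax / classification layers.
    net            = alexnet;
    layersTransfer = net.Layers(1:end-3);

    numClasses = 5;   % Arborio, Basmati, Ipsala, Jasmine, Karacadag
    layers = [
        layersTransfer
        fullyConnectedLayer(numClasses)   % new output layer for the 5 varieties
        softmaxLayer
        classificationLayer];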

Training Options
1) Stochastic Gradient Descent with Momentum (SGDM)
- Optimization algorithm used for faster convergence by incorporating momentum to accelerate gradient descent.
•Results are still good (94.60%).
•Still displays chaotic training behavior (not stable).
•Requires less memory.
•Convergence times are quicker (42.9 s).
2) Adaptive Moment Estimation (ADAM) with Dynamic Learning Rate
- Optimization algorithm known for its adaptive learning rates.
- Dynamically adjusts the learning rate after a certain epoch value is reached.
•Results are excellent (98.90%).
•Requires more memory.
•More training & convergence time is required.
(Both optimizers are sketched as MATLAB trainingOptions below.)
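A minimal sketch of the two optimizer configurations using MATLAB's trainingOptions; the epoch counts, learning rates, batch size, and drop schedule below are illustrative assumptions rather than the project's actual settings, and augimdsTrain/augimdsVal and layers come from the earlier sketches.

    % SGDM: momentum-accelerated gradient descent.
    optsSGDM = trainingOptions('sgdm', ...
        'Momentum',         0.9, ...
        'InitialLearnRate', 1e-3, ...
        'MaxEpochs',        8, ...
        'MiniBatchSize',    64, ...
        'ValidationData',   augimdsVal, ...
        'Plots',            'training-progress');

    % ADAM with a dynamic (piecewise) learning rate: the rate is dropped
    % by a factor after a fixed number of epochs.
    optsADAM = trainingOptions('adam', ...
        'InitialLearnRate',    1e-3, ...
        'LearnRateSchedule',   'piecewise', ...
        'LearnRateDropPeriod', 5, ...        % drop every 5 epochs (assumed)
        'LearnRateDropFactor', 0.1, ...      % multiply the rate by 0.1 (assumed)
        'MaxEpochs',           10, ...
        'MiniBatchSize',       64, ...
        'ValidationData',      augimdsVal, ...
        'Plots',               'training-progress');

    % Train the adapted AlexNet with one of the two option sets.
    trainedNet = trainNetwork(augimdsTrain, layers, optsADAM);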

Parameters
1) Epochs
•More epochs = slower convergence times.
 - Excellent accuracy results.
 - Higher chance of overfitting/unstable behavior.
•Fewer epochs = quicker convergence times.
 - Stable validation but chaotic training results, due to insufficient training.
2) Learning Rate
•High LR = weaker accuracy & precision results.
 - Displays a chaotic & unstable graph.
•Low LR = quicker convergence times with strong accuracy.
3) MiniBatchSize
•High MBS = extremely quick convergence time but slightly weaker accuracy & precision results.
 - Provides an extremely stable validation plot.
•Low MBS = very slow convergence time.
 - Initially chaotic & unstable training & validation.
Performance Metrics
1) Accuracy – measure of how well the RGCNN model identifies different rice varieties.
 - Provides a good evaluation of overall classification performance (bar chart).
 - Summarizes the model's ability to classify grains accurately.
2) Confusion Matrix – detailed insights into true/false predictions.
 - Helps visualize which rice grains are correctly classified and which ones are being confused with each other.
(A sketch of computing both metrics follows below.)
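A minimal sketch of evaluating the trained network on the held-out test images, assuming trainedNet and imdsTest from the earlier sketches:

    % Resize the test images to the network input size (no augmentation).
    augimdsTest = augmentedImageDatastore([227 227 3], imdsTest);

    YPred = classify(trainedNet, augimdsTest);   % predicted rice varieties
    YTrue = imdsTest.Labels;                     % ground-truth labels

    accuracy = mean(YPred == YTrue);             % overall classification accuracy
    confusionchart(YTrue, YPred, 'Title', 'RGCNN confusion matrix');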

Results Tables
•Arborio & Ipsala accuracies of 98.8%.
•Karacadag accuracy of 99.6% (near perfect).
•Basmati & Jasmine had the lowest accuracies (97.5% – 97.8%); the RGCNN model confused these two rice grains with each other the most.
•Average accuracy of 98.95%.
•Average precision of 98.90%.
•Average time taken for training & testing runs of 76 seconds.
•Displayed stable, non-chaotic graphs.
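For reference, per-class and average precision figures like those quoted above are typically derived from the confusion matrix; a short sketch, assuming YTrue and YPred from the evaluation sketch earlier:

    % Rows of C are true classes, columns are predicted classes.
    C = confusionmat(YTrue, YPred);
    precision = diag(C) ./ sum(C, 1)';   % TP ./ (TP + FP) per class
    recall    = diag(C) ./ sum(C, 2);    % TP ./ (TP + FN) per class
    avgPrecision = mean(precision);
    avgRecall    = mean(recall);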

Pre-Tuned RGCNN Model
•Displays 'chaotic & unstable' behavior.
•Indicates that the model's learning process is unpredictable.
•The model was overfitting and failing to generalize well to unseen images.
Fully Optimized RGCNN Model
•Obtains excellent accuracy & precision results.
•Displays strong, stable behavior during training & validation tests.

Rice Grain Classification Model Performance
Strengths
- The RGCNN model produces a confusion matrix for each rice grain class.
- Time efficient (average of 76 seconds), making it practical for industry use.
- Average precision of 98.52%.
- Average accuracy of 98.95%.
Limitations
- The RGCNN model was getting mixed up between Arborio & Ipsala and between Basmati & Jasmine, as demonstrated by the confusion matrix, due to similarities in the appearance of those rice grains.
- Although training & validation is quick compared with other classification models, training times could still be reduced with funding for better GPUs.

Final Conclusions
Implementation of the RGCNN model:
•Near perfect classification accuracy in rice grain sorting.
•Very time effective, and therefore industry ready.
•Detailed analysis of any misclassification/confusion between rice grains.
Integration into the agricultural process:
•A reliable & efficient tool for the scientists and farmers at KRBL Limited factories.
•KRBL Limited can improve quality control standards.
Effects:
•Quality assurance – helps ensure KRBL maintains consistent quality standards across all products.
•Compliance & regulation – ensures products meet regulatory standards and export requirements in new markets (e.g. Ipsala in Turkey).
Key Limitations
•Confusion mainly between Basmati & Jasmine rice grains, due to similarities in their appearance (shape, texture, size).
•The model can still be made more time effective (quicker training & testing times can be achieved).
Recommendations & Effects
•Hyperspectral imaging – reduces confusion, as the RGCNN model can distinguish rice grains more easily and improve precision values, leading to improved classification results.
•Investment in more powerful GPUs – leads to quicker classification as training & validation time is reduced, making the model practical for industry use.
•Identify and remove outlier images – reduces potential overfitting and improves the accuracy & generalization of the RGCNN model.

References
1- https://pub.towardsai.net/impact-of-optimizers-in-image-classifiers-3b04ed20823a
2- https://uk.mathworks.com/help/deeplearning/ug/train-residual-network-for-image-classification.html
3- https://uk.mathworks.com/help/deeplearning/ug/pretrained-convolutional-neural-networks.html
4- https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8701132/
5- https://www.linkedin.com/pulse/importance-data-preprocessing-augmentation-computer-vision-vaddoriya/