Deep Learning Powered Offline Signature Verification Pipeline Presentation
Size: 6.51 MB
Language: en
Added: Sep 27, 2025
Slides: 36 pages
Slide Content
Deep Learning Powered Offline Signature Verification Pipeline Evaluated by : Supervised by : In collaboration with :
Plan: 01 Introduction 02 Problem Statement 03 Fundamentals 04 Proposed Solution 05 Conclusion and Perspectives 06 Acknowledgements
01 Introduction Manual processing typically takes 2–3 weeks per claim, and operational costs can reach up to 15% of the total claim amount.
02 Problem Statement Given all the challenges that signatures present, could we verify them automatically using AI-powered systems to reduce fraud, manual effort, and processing delays? Challenges: real-world scenarios; intra-personal inconsistencies in signatures (no two signatures from the same individual are exactly alike); a manual, slow verification process.
03 Signature Types Handwritten: ink-based signature done manually on paper. Electronic: digital expression of consent via devices. Biometric: signature using physical or behavioral traits.
04 Signature Verification Types Online vs offline: Online captures dynamic signing traits in real time; Offline analyzes static images of completed signatures. Writer-dependent (WD) vs writer-independent (WI): WD is trained on a specific signer's data; WI works across unknown or new signers.
05 Signature Forgery Types Genuine: authentic signature written by the actual person. Skilled forgery: carefully copied by someone trained or practiced. Unskilled forgery: poor imitation without knowledge or precision. Random forgery: completely unrelated signature by another person.
06 Fundamentals - Machine Learning Machine learning is a branch of AI built on statistical models and algorithms that enable systems to make predictions and decisions. By identifying patterns in their training data, ML algorithms can improve and adapt over time, enriching their capabilities. The machine learning process: receive data, analyze data, find patterns, make predictions, send answer.
07 Fundamentals - Deep Learning Deep learning is a subset of machine learning that uses artificial neural networks to learn autonomously, make intelligent decisions, and determine prediction accuracy without human intervention. Because deep learning models analyze data continuously, they build extensive knowledge over time and draw conclusions by taking in information, consulting data reserves, and determining an answer. (Example classes: cat, dog, bird.)
08 Fundamentals - Transfer Learning Transfer learning enables models to leverage knowledge from large, general-purpose datasets and apply it to more specific tasks with limited data. By reusing pre-trained features, we improve performance and reduce the need for extensive labeled data in domain-specific applications.
09 Fundamentals - Siamese Neural Networks Siamese Neural Networks (SNNs) are a specialized type of neural network designed to compare two inputs and determine their similarity. Unlike traditional neural networks, which process a single input to produce an output, SNNs take two inputs and pass them through identical subnetworks. Identical networks Similarity function
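The shared-subnetwork idea can be sketched in plain Python: both inputs pass through the *same* embedding function (a toy one-layer network here, standing in for the identical subnetworks), and a distance on the embeddings plays the role of the similarity function. The weights and layer shape are purely illustrative.

```python
import math

def embed(vec, weights):
    """Toy shared subnetwork: one linear layer followed by ReLU.
    Both branches of the Siamese network call this same function."""
    return [max(0.0, sum(w * x for w, x in zip(row, vec))) for row in weights]

def siamese_distance(a, b, weights):
    """Euclidean distance between the two shared embeddings:
    small for similar inputs, large for dissimilar ones."""
    ea, eb = embed(a, weights), embed(b, weights)
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(ea, eb)))
```

Because the two branches share parameters, identical inputs always map to identical embeddings (distance 0), which is exactly the property a similarity-learning architecture needs.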
10 Motivation Real-world scenarios, signature variability, and the limits of existing signature verification systems motivate this work. While signature verification has been widely studied, most existing research focuses narrowly on isolated signature image comparison, overlooking document context, layout structure, signature extraction, and cleaning processes.
11 Proposed Approach
12 Part 1: Signature Detection Signature detection is an object detection task; for it, we use transfer learning, as it allows leveraging pre-trained models to achieve high accuracy with limited data. Model selection candidates: DETR, Conditional DETR, YOLOS, YOLOv8l, YOLOv10s, ...
13 Part 1: Signature Detection After testing different models, we chose YOLOv8s for its speed, low inference time, and high precision.
14 Part 1: Signature Detection Annotated public dataset (396 training images); after augmentation: 1,118 training images. Augmentations: 90° rotate (clockwise, counter-clockwise, upside down); rotation between -10° and +10°; shear ±4° horizontal, ±3° vertical; brightness between -8% and +8%; exposure between -13% and +13%; blur up to 1.1 px; noise up to 0.97% of pixels.
15 Part 1: Signature Detection Hyperparameter optimization: to enhance the performance of the signature detection model, we employed an open-source hyperparameter optimization framework designed specifically for machine learning workflows and conducted 20 different experiments.
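The slide does not name the optimization framework or the search space, so as a stand-in, here is a minimal random-search sketch illustrating the core idea: sample hyperparameters, score each trial, and keep the best of 20.

```python
import random

def random_search(objective, space, n_trials=20, seed=0):
    """Sample `n_trials` hyperparameter settings uniformly from `space`
    (a dict of name -> (low, high)) and return the best one found."""
    rng = random.Random(seed)
    best_params, best_score = None, float("-inf")
    for _ in range(n_trials):
        params = {name: rng.uniform(lo, hi) for name, (lo, hi) in space.items()}
        score = objective(params)  # e.g. validation mAP of a detection run
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score

# Toy objective standing in for "train the detector and measure mAP":
best, score = random_search(lambda p: -(p["lr"] - 0.5) ** 2,
                            {"lr": (0.0, 1.0)})
```

A real run would replace the toy objective with a full train-and-validate cycle per trial; dedicated frameworks add smarter samplers and early pruning of bad trials on top of this loop.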
16 Part 1: Signature Detection Results: high precision, recall, and accuracy; no confusion between signatures, text, and logos.
17 Part 1: Signature Detection Post-processing: single signature selection, cropping, and resizing.
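The post-processing steps above can be sketched in plain Python: keep only the highest-confidence detection, crop it with a little padding, and resize the crop to a fixed shape. The box format, padding, and output size here are illustrative assumptions, not values from the slides.

```python
def select_crop_resize(detections, image, out_h=128, out_w=256, pad=4):
    """Keep the highest-confidence box, crop it (with padding), and
    resize the crop to (out_h, out_w) with nearest-neighbor sampling.
    `image` is a list of pixel rows; each detection is (x1, y1, x2, y2, conf)."""
    # Single signature selection: highest confidence wins.
    x1, y1, x2, y2, _ = max(detections, key=lambda d: d[4])
    h, w = len(image), len(image[0])
    # Pad the box, clamped to the image borders.
    x1, y1 = max(0, x1 - pad), max(0, y1 - pad)
    x2, y2 = min(w, x2 + pad), min(h, y2 + pad)
    crop = [row[x1:x2] for row in image[y1:y2]]
    # Nearest-neighbor resize to the fixed input shape of the next stage.
    ch, cw = len(crop), len(crop[0])
    return [[crop[y * ch // out_h][x * cw // out_w] for x in range(out_w)]
            for y in range(out_h)]
```

In practice this would operate on arrays from the detector's output rather than nested lists, but the selection/crop/resize logic is the same.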
18 Proposed Approach
19 Part 2: Signature Cleaning We can frame signature cleaning as an image translation problem, where the goal is to learn a transformation from a source domain (noisy signatures) to a target domain (clean signatures).
20 Part 2: Signature Cleaning The Approach: Our approach involves two steps: Data Generation: using prior knowledge of real-world noise (e.g., text, lines, and salt-and-pepper artifacts), we create synthetic noisy signature samples. Translation/Learning: an image translation model learns to map noisy inputs to clean signatures using these paired examples.
21 Part 2: Signature Cleaning Data generator. Generating synthetic data: clean signature + salt-and-pepper noise + addition of lines + superimposition of text = noisy sample.
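One step of the generator, the salt-and-pepper noise, can be sketched in plain Python (the noise amount is an illustrative parameter; line drawing and text superimposition would be analogous further steps, not shown):

```python
import random

def add_salt_and_pepper(image, amount=0.01, seed=None):
    """Flip roughly `amount` of the pixels to pure black (pepper)
    or pure white (salt). `image` is a list of rows of 0-255 values;
    the original is left untouched."""
    rng = random.Random(seed)
    noisy = [row[:] for row in image]
    h, w = len(image), len(image[0])
    for _ in range(int(amount * h * w)):
        y, x = rng.randrange(h), rng.randrange(w)
        noisy[y][x] = 0 if rng.random() < 0.5 else 255
    return noisy
```

Pairing each clean signature with its corrupted copy yields exactly the (noisy, clean) training pairs the translation model needs.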
23 Part 2: Signature Cleaning We take a paired-data simulation approach using a convolutional autoencoder (CAE), because it allows us to learn a mapping from noisy to clean signature images. A CAE has two parts: the encoder compresses the input (noisy signature) into a compact feature representation; the decoder reconstructs a clean version of the signature from that compressed form. This encoder-decoder structure is well suited to denoising tasks.
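A minimal PyTorch sketch of such a denoising CAE follows; the layer counts, channel widths, and kernel sizes are illustrative assumptions, since the slides do not specify the architecture.

```python
import torch
import torch.nn as nn

class ConvAutoencoder(nn.Module):
    """Encoder compresses a grayscale signature image 4x spatially;
    decoder reconstructs a clean image of the same size."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),   # H/2 x W/2
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),  # H/4 x W/4
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(32, 16, 3, stride=2, padding=1,
                               output_padding=1), nn.ReLU(),
            nn.ConvTranspose2d(16, 1, 3, stride=2, padding=1,
                               output_padding=1), nn.Sigmoid(),    # back to H x W
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))
```

Training would minimize a pixel reconstruction loss (e.g. MSE) between the model's output on a noisy input and the corresponding clean target from the synthetic pairs.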
24 Part 2: Signature Cleaning (Example triplets: noisy image, predicted, original.)
25 Part 2: Signature Cleaning
26 Proposed Approach
27 Part 3: Signature Verification The contrastive loss function encourages genuine signature pairs to have similar representations (closer in embedding space) and forged ones to remain distant. To assess the performance of different similarity measures and activation functions, we conducted controlled experiments.
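For a single pair, the contrastive loss can be written out directly; the margin value below is an illustrative assumption, as the slides do not give it.

```python
def contrastive_loss(distance, label, margin=1.0):
    """Contrastive loss for one pair of embeddings.
    `distance` is the embedding-space distance between the two signatures;
    `label` is 1 for a genuine pair, 0 for a forged pair.
    Genuine pairs are penalized for being far apart; forged pairs are
    penalized only while they are closer than `margin`."""
    return (label * distance ** 2
            + (1 - label) * max(0.0, margin - distance) ** 2)
```

Averaging this over a batch of labeled pairs gives the training objective: it pulls genuine pairs together and pushes forged pairs at least `margin` apart, after which they contribute no gradient.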
28 Part 3: Signature Verification
29 Evaluation To validate our approach, we recruited volunteers and asked each participant to sign a set of real-world documents from an insurance company three times. Additionally, we provided each participant with a signed document from another individual in the experiment. Labels: original vs. forgeries.
30 Evaluation
Demonstration
32 Conclusions YOLOv8s, the object detection model, proved highly efficient and accurate in localizing signatures. To clean up noise, we employed a convolutional autoencoder trained on synthetically augmented data, improving accuracy. We used a convolutional Siamese network designed to distinguish between genuine and forged signatures.
33 LIME Explanation (True: Forged, Pred: Forged). Perspectives: domain-specific adaptation and fine-tuning (documents provided by the client, public datasets); continuous model retraining for adaptive forgery detection; explainable AI for trustworthy results.
Thank you for your attention, and a special thanks to: