Google ML-Kit - Understanding on-device machine learning
VishrutGoyani1
About This Presentation
This is a guide on how you can use Google's ML Kit for machine learning applications on mobile.
Slide Content
Understanding on-device machine learning: ML-Kit by Google. Vishrut Goyani, Android developer. vishrutgoyani9 | vishrut.goyani3
What is ML-Kit?
Tell me more, I'm listening. ML Kit is a mobile SDK that brings Google's on-device machine learning expertise to Android and iOS apps. Use powerful yet easy-to-use Vision and Natural Language APIs to solve common challenges in your apps or create brand-new user experiences. ML Kit's APIs all run on-device, allowing for real-time use cases, for example when you want to process a live camera stream. This also means the functionality is available offline.
Offerings
Vision: Text recognition, Face detection, Face mesh detection, Pose detection, Selfie segmentation, Subject segmentation, Document scanner, Barcode scanning, Image labelling, Object detection, Digital ink recognition
Natural language: Language identification, Translation, Smart reply, Entity extraction
Wait, there is one more!
Firebase ML
Firebase ML - Is it the same?
What is it? Firebase Machine Learning is a mobile SDK that brings Google's machine learning expertise to Android and Apple apps in a powerful yet easy-to-use package. Firebase ML provides the ability to deploy custom models to your users' devices by uploading them to Google's servers. It allows you to keep your app's initial install size small, and you can swap the ML model without having to republish your app.
Offerings
Vision: Text recognition, Image labeling, Object detection and tracking, Face detection and contour tracing, Barcode scanning
Natural language: Language identification, Translation, Smart reply
Well, here's the difference
ML-Kit: On-device machine learning for common tasks; faster for simpler tasks; excellent offline performance; simpler to set up and use; you need to update the app when updating the model.
Firebase ML: On-device and cloud-based machine learning; higher accuracy for complex tasks; limited offline performance; more complex for cloud-based models; update your models in the cloud without requiring an app update.
Image labelling using ML-Kit
Let’s Cook! Ingredients: Android Studio; Java/Kotlin; knowledge of asynchronous programming (Threads or Coroutines); an Android device or emulator (obviously, a physical device is better).
First things first - Dependencies!
The difference :-
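A minimal sketch of what the two dependency options likely look like in the app-level Gradle file (Kotlin DSL; the version numbers are assumptions, check the ML Kit release notes for current ones):

```kotlin
// app/build.gradle.kts -- version numbers are assumptions
dependencies {
    // Base model: the labeler and Google's general-purpose model are bundled into the app
    implementation("com.google.mlkit:image-labeling:17.0.8")

    // Custom model: labeler that runs your own TensorFlow Lite model
    implementation("com.google.mlkit:image-labeling-custom:17.0.2")
}
```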
Base model capabilities :-
Custom model implementation :- Download custom model | Implementation guide
Dependencies:-
In your app-level Gradle file (the same file the dependency lines above go into); also configure it so that Gradle doesn't compress the model file when building the app:
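A sketch of that configuration, assuming the custom model ships as a .tflite file inside the app:

```kotlin
// app/build.gradle.kts -- keep the .tflite model uncompressed so ML Kit can read it directly
android {
    aaptOptions {
        noCompress("tflite")
    }
    // On newer Android Gradle Plugin versions the same setting lives in the
    // androidResources block instead of aaptOptions.
}
```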
Create an InputImage object :-
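A sketch of the two common ways to build an InputImage, assuming the photos come either from memory or from a gallery Uri:

```kotlin
import android.content.Context
import android.graphics.Bitmap
import android.net.Uri
import com.google.mlkit.vision.common.InputImage

// From a Bitmap already in memory; rotationDegrees comes from the camera or EXIF data
fun imageFromBitmap(bitmap: Bitmap, rotationDegrees: Int): InputImage =
    InputImage.fromBitmap(bitmap, rotationDegrees)

// From a file Uri (e.g. a photo in the gallery); may throw IOException if the file can't be read
fun imageFromUri(context: Context, uri: Uri): InputImage =
    InputImage.fromFilePath(context, uri)
```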
Calculate image rotation
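One possible helper for the rotation value, assuming you only have the device's Surface rotation constant (with CameraX, imageProxy.imageInfo.rotationDegrees already gives you this number directly):

```kotlin
import android.view.Surface

// Maps the device's Surface rotation constant to the degrees value InputImage expects.
fun surfaceRotationToDegrees(surfaceRotation: Int): Int = when (surfaceRotation) {
    Surface.ROTATION_90 -> 90
    Surface.ROTATION_180 -> 180
    Surface.ROTATION_270 -> 270
    else -> 0
}
```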
To load the downloaded custom model
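A sketch of loading the model, assuming the downloaded .tflite file is bundled in the app's assets folder ("model.tflite" is a placeholder name):

```kotlin
import com.google.mlkit.common.model.LocalModel

// Point this at the custom model you downloaded and placed under app/src/main/assets/.
// For a file fetched at runtime, setAbsoluteFilePath() can be used instead.
val localModel = LocalModel.Builder()
    .setAssetFilePath("model.tflite")
    .build()
```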
Make an instance of the image labeler, for the base model and for the custom model:
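A sketch of both labeler instances; the confidence threshold and result count are example values, not prescribed settings:

```kotlin
import com.google.mlkit.common.model.LocalModel
import com.google.mlkit.vision.label.ImageLabeling
import com.google.mlkit.vision.label.custom.CustomImageLabelerOptions
import com.google.mlkit.vision.label.defaults.ImageLabelerOptions

// Base model: Google's bundled general-purpose labeler with default options
val baseLabeler = ImageLabeling.getClient(ImageLabelerOptions.DEFAULT_OPTIONS)

// Custom model: wrap the LocalModel from the previous step
val localModel = LocalModel.Builder().setAssetFilePath("model.tflite").build()
val customLabeler = ImageLabeling.getClient(
    CustomImageLabelerOptions.Builder(localModel)
        .setConfidenceThreshold(0.5f)   // example threshold
        .setMaxResultCount(5)           // example result count
        .build()
)
```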
Run the image labeler :-
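A sketch of running the labeler; process() returns a Play Services Task, so results arrive asynchronously on listeners:

```kotlin
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.label.ImageLabel
import com.google.mlkit.vision.label.ImageLabeler

fun labelImage(labeler: ImageLabeler, image: InputImage) {
    labeler.process(image)
        .addOnSuccessListener { labels: List<ImageLabel> ->
            // handle the labels here (see the next step)
        }
        .addOnFailureListener { e ->
            // inference failed, e.g. the image could not be processed
            e.printStackTrace()
        }
}
```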
Get information about labeled objects. Now you can use this information to group images by label name.
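A sketch of reading the label data and bucketing photos by label name; groupedPhotos and photoUri are hypothetical names used only for illustration:

```kotlin
import android.net.Uri
import com.google.mlkit.vision.label.ImageLabel

fun groupByLabel(
    photoUri: Uri,
    labels: List<ImageLabel>,
    groupedPhotos: MutableMap<String, MutableList<Uri>>
) {
    for (label in labels) {
        val name = label.text              // e.g. "Dog"
        val confidence = label.confidence  // 0.0f..1.0f
        val index = label.index            // position in the model's label map
        if (confidence >= 0.7f) {          // example threshold, tune as needed
            groupedPhotos.getOrPut(name) { mutableListOf() }.add(photoUri)
        }
    }
}
```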
Finally, we have done it…!
Wait, you got 10,000+ photos? I’m screwed.
Time to go asynchronous 🙌🏻
Threads or Coroutines?
The difference
Threads: Execution of threads is heavyweight; complex, and cancellation can lead to unexpected behavior; best suited for simple, long-running tasks; OS-managed; even basic tasks require fine-grained control.
Coroutines: Coroutines are lightweight; structured and graceful cancellation; can be used for complex tasks efficiently; dispatcher-managed; handle complex asynchronous tasks efficiently.
Hence proved, coroutines are the way to go 🙌🏻 Implementation: in your Gradle file,
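A sketch of the dependency lines, assuming Gradle Kotlin DSL; the versions are placeholders:

```kotlin
// app/build.gradle.kts -- use the latest kotlinx.coroutines release
dependencies {
    implementation("org.jetbrains.kotlinx:kotlinx-coroutines-android:1.7.3")
    // Optional but useful here: adds Task.await() so ML Kit's Task results
    // can be consumed from coroutines without callbacks
    implementation("org.jetbrains.kotlinx:kotlinx-coroutines-play-services:1.7.3")
}
```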
Basics: launch, async and runBlocking. To work with coroutines, Kotlin provides three basic building blocks: launch, async, and runBlocking. 1. launch — launch is used for fire-and-forget coroutines. It's perfect for cases where you don't need to compute any result.
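A minimal launch example (the delay and println stand in for real work):

```kotlin
import kotlinx.coroutines.CoroutineScope
import kotlinx.coroutines.Dispatchers
import kotlinx.coroutines.delay
import kotlinx.coroutines.launch

fun main() {
    // Fire and forget: start the work, don't wait for a result
    CoroutineScope(Dispatchers.Default).launch {
        delay(100)  // stand-in for real work
        println("Background work finished")
    }
    Thread.sleep(200)  // keep the JVM alive long enough in this toy example
}
```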
2. async — async is used when you need a result computed in a coroutine. It starts a new coroutine and returns a Deferred<T>, a non-blocking future that represents a promise to provide a result later.
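A minimal async example:

```kotlin
import kotlinx.coroutines.async
import kotlinx.coroutines.runBlocking

fun main() = runBlocking {
    // async starts the computation and immediately returns a Deferred<Int>
    val deferred = async {
        (1..10).sum()  // stand-in for an expensive computation
    }
    // await() suspends until the value is ready, without blocking the thread
    println("Result = ${deferred.await()}")
}
```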
3. runBlocking — runBlocking is a bridge between the non-coroutine world and the coroutine world. It's a way to start a top-level main coroutine.
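A minimal runBlocking example:

```kotlin
import kotlinx.coroutines.delay
import kotlinx.coroutines.runBlocking

// runBlocking blocks the current thread until the coroutine inside completes.
// That makes it suitable for main() functions and tests, not for Android UI code.
fun main() = runBlocking {
    delay(500)
    println("Done after half a second")
}
```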
Now, let’s talk about dispatchers
Example :
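A small example showing the common dispatchers (Dispatchers.Main is only available on Android, so it appears here as a comment):

```kotlin
import kotlinx.coroutines.Dispatchers
import kotlinx.coroutines.launch
import kotlinx.coroutines.runBlocking

fun main() = runBlocking {
    launch(Dispatchers.Default) {  // CPU-bound work: image decoding, ML inference
        println("Default runs on ${Thread.currentThread().name}")
    }
    launch(Dispatchers.IO) {       // blocking I/O: reading files, network calls
        println("IO runs on ${Thread.currentThread().name}")
    }
    // Dispatchers.Main (Android only) is where UI updates go, typically via
    // withContext(Dispatchers.Main) { ... }
}
```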
Want to know more? Scan this code and go ahead 👍🏼
Time to apply it in our code
Replacing linear execution with coroutines
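One way this could look, as a sketch: fan the photos out with async and convert each Task result into a suspending call via Task.await() from kotlinx-coroutines-play-services. The function name, parameters and result shape are illustrative, not the deck's exact code:

```kotlin
import android.content.Context
import android.net.Uri
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.label.ImageLabeler
import kotlinx.coroutines.Dispatchers
import kotlinx.coroutines.async
import kotlinx.coroutines.awaitAll
import kotlinx.coroutines.coroutineScope
import kotlinx.coroutines.tasks.await
import kotlinx.coroutines.withContext

// Labels many photos concurrently instead of one-by-one.
suspend fun labelAllPhotos(
    context: Context,
    labeler: ImageLabeler,
    photoUris: List<Uri>
): Map<Uri, List<String>> = coroutineScope {
    photoUris.map { uri ->
        async(Dispatchers.Default) {
            // read the file off the main thread, then run inference
            val image = withContext(Dispatchers.IO) { InputImage.fromFilePath(context, uri) }
            val labels = labeler.process(image).await()  // suspend instead of callbacks
            uri to labels.map { it.text }
        }
    }.awaitAll().toMap()
}
```

For very large libraries it may be worth capping concurrency (for example with a Semaphore, or by chunking the list) so thousands of images aren't decoded at the same time.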
Now you can say, Yesss! It’s done 🤩
Thank you! I’m here vishrutgoyani9 vishrut_goyani9 vishrut.goyani3 vishrut.goyani9