Innovation in Machine Learning (ML) 1.pptx

prasetyoganteng · 10 slides · Aug 26, 2024

About This Presentation

MACHINE LEARNING INNOVATION


Slide Content

Innovation in ML. Autonomous vehicles: vehicles that can operate without human involvement. Currently, the public is most familiar with the Autopilot function on cars such as Tesla's, which acts as an extra assistant for the driver in certain situations. Recently, Volkswagen announced that it would begin introducing ChatGPT as a voice assistant in some of its vehicles in 2024.

Self-driving cars: in the USA and EU, 40-45% of fatal crashes happen in the dark, despite there being, in some cases, 60% less traffic on the road.

Internet of Things (IoT) and machine learning. Combining IoT technology with AI opens up unique opportunities. AI can enhance IoT's capabilities by enabling data analysis, predictive insights, automation and intelligent decision-making. AI can also compound the benefits of IoT by adding human-like awareness and decision-making to the environment.
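The "predictive insights" idea above can be made concrete with a minimal sketch: a simple statistical check over a stream of IoT sensor readings. The temperature values and the z-score threshold are illustrative assumptions, not part of any real IoT API.

```python
# A minimal sketch of ML-style analysis on IoT sensor data: a z-score
# check flags anomalous readings, standing in for "predictive insights".
# The readings below are hypothetical smart-home temperatures (degrees C).

from statistics import mean, stdev

def flag_anomalies(readings, threshold=2.0):
    """Return indices of readings more than `threshold` standard
    deviations from the mean of the stream."""
    mu = mean(readings)
    sigma = stdev(readings)
    return [i for i, r in enumerate(readings)
            if abs(r - mu) > threshold * sigma]

temps = [21.0, 21.2, 20.9, 21.1, 35.5, 21.0, 20.8]
print(flag_anomalies(temps))  # → [4], the 35.5 spike
```

A real deployment would replace the hand-rolled z-score with a trained model, but the pipeline shape (ingest sensor data, analyze, act on flagged events) is the same.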

IoT for Smart Home

Deep learning. Deep learning is an AI method that teaches computers to process data in a way inspired by the human brain. Deep learning models can recognise complex patterns in images, text, sound and other data to produce accurate insights and predictions. Many companies use deep learning to harness AI, including Google DeepMind, OpenAI and IBM. Deep learning powers many image recognition tools, natural language processing (NLP) systems and speech recognition software, and thrives in more complex environments.
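The pattern-recognition claim above can be illustrated with a tiny network: XOR is a pattern no single linear layer can separate, but two stacked layers can. The weights below are hand-chosen for illustration rather than learned; real deep learning fits them to data with frameworks such as PyTorch or TensorFlow.

```python
# A minimal sketch of why layers matter: two hidden units (roughly OR
# and AND) feed one output unit, together computing XOR, which is not
# linearly separable. Weights are illustrative, not trained.

import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def xor_net(x1, x2):
    h_or  = sigmoid(20 * x1 + 20 * x2 - 10)   # fires if either input is 1
    h_and = sigmoid(20 * x1 + 20 * x2 - 30)   # fires only if both are 1
    return sigmoid(20 * h_or - 20 * h_and - 10)  # "OR and not AND" = XOR

for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x1, x2, round(xor_net(x1, x2)))  # prints the XOR truth table
```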

Comparison of Deep Learning Architectures

Convolutional Neural Networks (CNNs)
- Strengths: very good for image-related projects
- Weaknesses: require substantial data for optimal performance
- Performance: high accuracy in image-related projects
- Training time: moderate
- Resource requirements: moderate to high GPU requirements

Recurrent Neural Networks (RNNs)
- Strengths: effective for sequential data
- Weaknesses: sensitive to vanishing/exploding gradient problems
- Performance: strong in projects involving time dependencies
- Training time: longer for deep networks
- Resource requirements: moderate GPU requirements

Long Short-Term Memory (LSTM)
- Strengths: mitigates vanishing/exploding gradient issues
- Weaknesses: higher computational demands than basic RNNs
- Performance: efficient at capturing long-term dependencies
- Training time: longer for complex networks
- Resource requirements: moderate to high GPU requirements

Gated Recurrent Unit (GRU)
- Strengths: simpler architecture than LSTM
- Weaknesses: may not perform as well on certain projects
- Performance: balances performance and computational cost
- Training time: faster than LSTM
- Resource requirements: moderate GPU requirements

Autoencoders
- Strengths: useful for unsupervised learning and data compression
- Weaknesses: sensitive to noise and outliers
- Performance: efficient at learning data representations
- Training time: moderate
- Resource requirements: moderate GPU requirements

Generative Adversarial Networks (GANs)
- Strengths: exceptional at generating realistic data
- Weaknesses: prone to mode collapse and training instability
- Performance: high-quality data generation
- Training time: sensitive to hyperparameter tuning
- Resource requirements: high GPU requirements

Transformer Architecture
- Strengths: enables efficient parallelization
- Weaknesses: may struggle with very long sequences
- Performance: state-of-the-art in NLP projects
- Training time: faster due to parallelization
- Resource requirements: moderate to high GPU requirements
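The gating that lets LSTMs mitigate vanishing gradients can be sketched in a few lines. The scalar weights below are illustrative assumptions (real LSTMs use learned weight matrices); the point is the additive cell update `c = f * c_prev + i * g`, through which gradients flow mostly undiminished when the forget gate stays near 1.

```python
# A minimal sketch of one LSTM cell step with scalar state, to make the
# gate mechanism concrete. Weights are hand-picked for illustration.

import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, w):
    """One step; `w` maps each gate name to (input weight, recurrent
    weight, bias)."""
    f = sigmoid(w['f'][0] * x + w['f'][1] * h_prev + w['f'][2])    # forget gate
    i = sigmoid(w['i'][0] * x + w['i'][1] * h_prev + w['i'][2])    # input gate
    g = math.tanh(w['g'][0] * x + w['g'][1] * h_prev + w['g'][2])  # candidate
    o = sigmoid(w['o'][0] * x + w['o'][1] * h_prev + w['o'][2])    # output gate
    c = f * c_prev + i * g   # additive update: the path gradients flow through
    h = o * math.tanh(c)
    return h, c

# A forget gate biased near 1 (sigmoid(4) ≈ 0.982) preserves the cell
# state across steps with zero input, instead of squashing it to zero.
w = {'f': (0.0, 0.0, 4.0), 'i': (1.0, 0.0, 0.0),
     'g': (1.0, 0.0, 0.0), 'o': (0.0, 0.0, 4.0)}

h, c = 0.0, 1.0
for x in [0.0, 0.0, 0.0]:
    h, c = lstm_step(x, h, c, w)
print(round(c, 3))  # → 0.947, i.e. 0.982 ** 3: slow decay, not collapse
```

A plain RNN in the same situation multiplies the state by a weight and a tanh derivative every step, which is what drives its gradients to vanish or explode over long sequences.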

Unmanned systems: UAV forces (e.g., Elbit Systems) in military warfare.