AI and Machine Learning Libraries in 21st Century.ppt

About This Presentation

AI and Machine Learning Libraries


Slide Content

AI & Machine Learning Libraries
By Logan Kearsley

Purpose
The purpose of this project is to design a system that combines the
capabilities of multiple types of AI and machine learning systems, such as
neural networks and subsumption architectures, to produce a more
flexible and versatile hybrid system.

Goals
The end goal is to produce a set of basic library functions and architecture
descriptions for easy manipulation of the AI/ML subsystems (particularly
neural networks), and to use those to build an AI system capable of teaching
itself how to complete tasks specified by a human-defined heuristic and of
altering learned behaviors to cope with changes in its operational
environment with minimal human intervention.

Other Projects
No other similar projects are known.
Builds on previous work done on multilayer perceptrons and subsumption
architecture.
Differs in trying to find ways to combine the different approaches to AI.

Design & Programming
Modular / Black Box Design
The end user should be able to put together a working AI system with
minimal knowledge of how the internals work
Programming done in C
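
The slides describe the design only at a high level, so here is a minimal sketch of what such a black-box C interface might look like; the type and function names (perceptron_t, perceptron_create, and so on) are assumptions for illustration, not the project's actual API.

```c
/* Hypothetical black-box interface sketch: all names here are assumed,
 * not taken from the project. The caller only ever sees an opaque handle. */
#ifndef AILIB_H
#define AILIB_H

#include <stddef.h>

typedef struct perceptron perceptron_t;   /* internals hidden from the user */

/* Create a single-layer perceptron with the given input/output sizes. */
perceptron_t *perceptron_create(size_t n_inputs, size_t n_outputs);

/* One forward pass: inputs[n_inputs] -> outputs[n_outputs]. */
void perceptron_run(perceptron_t *net, const double *inputs, double *outputs);

/* One delta-rule training step toward the target output vector. */
void perceptron_train(perceptron_t *net, const double *inputs,
                      const double *targets, double learning_rate);

/* Release all memory owned by the network. */
void perceptron_destroy(perceptron_t *net);

#endif /* AILIB_H */
```

Keeping the struct definition out of the header is what lets the internals change without any change to user code, which is the point of the black-box design.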

Testing
Perceptron Neural Nets
Forced learning (see the test sketch below): make sure the net will learn
arbitrary input-output mappings after a certain number of exposures
Subsumption Architecture
Simple test problems: does it run the right code for each sub-problem?
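
A sketch of such a forced-learning test, assuming the hypothetical interface sketched earlier: expose the net to one fixed mapping (logical AND here) for a set number of iterations, then check that its outputs have converged.

```c
#include <assert.h>
#include <math.h>

/* Forced-learning test sketch: train on logical AND, a linearly separable
 * mapping a single-layer perceptron should handle, then verify the outputs. */
static void test_forced_learning(void)
{
    perceptron_t *net = perceptron_create(2, 1);
    const double inputs[4][2]  = {{0, 0}, {0, 1}, {1, 0}, {1, 1}};
    const double targets[4][1] = {{0}, {0}, {0}, {1}};
    double out[1];

    for (int epoch = 0; epoch < 1000; ++epoch)        /* fixed number of exposures */
        for (int i = 0; i < 4; ++i)
            perceptron_train(net, inputs[i], targets[i], 0.1);

    for (int i = 0; i < 4; ++i) {
        perceptron_run(net, inputs[i], out);
        assert(fabs(out[0] - targets[i][0]) < 0.5);   /* mapping was learned */
    }
    perceptron_destroy(net);
}
```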

Algorithms
Perceptrons (see the first code sketch below):
Delta-rule learning: weights are adjusted based on the distance
between the net's current output and the optimal output
Matrix simulation: weights are stored in an I (# of inputs) by O (# of
outputs) matrix for each layer, rather than simulating each neuron
individually.
Subsumption Architecture (see the second sketch below):
Scheduler takes a list of function pointers to task-specific functions
Task functions return an output or null
Highest-priority non-null task has its output executed each iteration
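
A minimal sketch of the two perceptron points above, assuming one flat weight array per layer; the names and memory layout are illustrative rather than taken from the project's code.

```c
#include <stddef.h>

/* One layer simulated as a weight matrix: n_in * n_out entries, where
 * w[i * n_out + o] is the weight from input i to output o. */
typedef struct {
    size_t n_in, n_out;
    double *w;
} layer_t;

/* Forward pass: each output is the weighted sum of all inputs. */
static void layer_run(const layer_t *l, const double *in, double *out)
{
    for (size_t o = 0; o < l->n_out; ++o) {
        out[o] = 0.0;
        for (size_t i = 0; i < l->n_in; ++i)
            out[o] += in[i] * l->w[i * l->n_out + o];
    }
}

/* Delta rule: each weight moves in proportion to the output error
 * (target minus actual), the input that fed it, and a learning rate. */
static void layer_delta_rule(layer_t *l, const double *in, const double *out,
                             const double *target, double rate)
{
    for (size_t o = 0; o < l->n_out; ++o) {
        double err = target[o] - out[o];
        for (size_t i = 0; i < l->n_in; ++i)
            l->w[i * l->n_out + o] += rate * err * in[i];
    }
}
```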
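And a similarly hedged sketch of the scheduler loop: an array of task function pointers ordered by priority, where the first non-null output is the one executed each iteration. The command_t type and the execute callback are assumptions standing in for whatever the project's task outputs actually are.

```c
#include <stddef.h>

typedef struct command command_t;                /* opaque task output */
typedef command_t *(*task_fn)(void *world_state);

/* One scheduler iteration: tasks[0] is the highest-priority behaviour;
 * the first task to return a non-NULL command subsumes everything below it. */
void subsumption_step(task_fn *tasks, size_t n_tasks, void *world_state,
                      void (*execute)(const command_t *))
{
    for (size_t i = 0; i < n_tasks; ++i) {
        command_t *cmd = tasks[i](world_state);
        if (cmd != NULL) {
            execute(cmd);
            return;
        }
    }
}
```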

Algorithms
Perceptron structure: individual neurons vs. weight matrix (diagram)

Algorithms
Subsumption Architecture (diagram)

Problems
Back-Propagation is really confusing!
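
Since the hidden-layer update was the sticking point, here is a minimal sketch of textbook two-layer back-propagation with sigmoid units, reusing the matrix layout from the earlier sketch; it is an illustration under those assumptions, not the project's implementation.

```c
#include <stddef.h>
#include <math.h>

static double sigmoid(double x) { return 1.0 / (1.0 + exp(-x)); }

/* w1: I x H matrix (input -> hidden), w2: H x O matrix (hidden -> output).
 * h and y are caller-provided scratch buffers of length H and O. */
static void backprop_step(size_t I, size_t H, size_t O,
                          double *w1, double *w2,
                          const double *x, const double *target,
                          double *h, double *y, double rate)
{
    /* Forward pass through both layers. */
    for (size_t j = 0; j < H; ++j) {
        double s = 0.0;
        for (size_t i = 0; i < I; ++i) s += x[i] * w1[i * H + j];
        h[j] = sigmoid(s);
    }
    for (size_t o = 0; o < O; ++o) {
        double s = 0.0;
        for (size_t j = 0; j < H; ++j) s += h[j] * w2[j * O + o];
        y[o] = sigmoid(s);
    }

    /* Backward pass: the output delta is the delta rule scaled by the
     * sigmoid derivative; the hidden delta is the output error passed
     * back through the old hidden->output weights. */
    for (size_t j = 0; j < H; ++j) {
        double back = 0.0;
        for (size_t o = 0; o < O; ++o) {
            double d_out = (target[o] - y[o]) * y[o] * (1.0 - y[o]);
            back += d_out * w2[j * O + o];         /* read old weight first */
            w2[j * O + o] += rate * d_out * h[j];  /* then update it */
        }
        double d_hid = back * h[j] * (1.0 - h[j]);
        for (size_t i = 0; i < I; ++i)
            w1[i * H + j] += rate * d_hid * x[i];
    }
}
```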

Results & Conclusions
Single-layer perceptron works well
Capable of learning arbitrary individual mappings, but not an arbitrary
combination of them (for example, a single layer cannot represent XOR)
Multi-layer nets should learn arbitrary combinations, but the learning
algorithm for the hidden layers (back-propagation) is confusing.
Can't re-use all of the same single-layer functions
Plan Change
Originally, wanted to create a working system
Now, the project goal is to produce useful function libraries; working
systems are just for testing the code