Perceptron Network
Weights between the input and output units are adjusted. Weights between the sensory and associator units are fixed. The goal of a perceptron net is to classify the input pattern as a member, or not a member, of a particular class.
(Figure: single-layer perceptron with inputs X1..Xn, bias b, weights W1..Wn, and output y.)
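The perceptron learning rule above can be sketched in a few lines. This is a minimal illustration, assuming bipolar inputs/targets, a zero threshold, and the AND function as training data (none of which are specified in the slides):

```python
def step(net, theta=0.0):
    # Bipolar step activation with threshold theta
    if net > theta:
        return 1
    elif net < -theta:
        return -1
    return 0

def train_perceptron(samples, lr=1.0, epochs=10):
    # Only the weights between input and output units are adjusted;
    # w and b start at zero (illustrative choice)
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for x, t in samples:
            net = b + sum(wi * xi for wi, xi in zip(w, x))
            if step(net) != t:
                # Perceptron rule: move weights toward the target class
                w = [wi + lr * t * xi for wi, xi in zip(w, x)]
                b += lr * t
    return w, b

# Bipolar AND data: member (+1) vs. not a member (-1) of the class
data = [((1, 1), 1), ((1, -1), -1), ((-1, 1), -1), ((-1, -1), -1)]
w, b = train_perceptron(data)
```

After training, every pattern in `data` is classified correctly by `step(b + w·x)`.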
Adaline Network
Receives input from several units and from one bias unit. Inputs are +1 or -1, and the weights carry a + or - sign. The calculated net input is applied to a quantizer function to restore the output to +1 or -1. The actual output is compared with the target output.
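A sketch of the Adaline, again assuming bipolar AND data and an illustrative learning rate. Unlike the perceptron, the weight update (the delta/LMS rule) compares the target with the raw net input, and the quantizer is applied only when producing the final +1/-1 output:

```python
def train_adaline(samples, lr=0.1, epochs=50):
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for x, t in samples:
            net = b + sum(wi * xi for wi, xi in zip(w, x))
            err = t - net  # compare target with net input, before quantizing
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def quantize(net):
    # Quantizer restores the output to +1 or -1
    return 1 if net >= 0 else -1

data = [((1, 1), 1), ((1, -1), -1), ((-1, 1), -1), ((-1, -1), -1)]
w, b = train_adaline(data)
```

The weights converge near the least-squares solution (w1 = w2 = 0.5, b = -0.5 for this data), after which the quantized outputs match the targets.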
Madaline Network
Contains "n" units in the input layer, "m" units in the Adaline layer, and "1" unit in the Madaline layer. Each neuron in the Adaline and Madaline layers has a bias of excitation 1. The Adaline layer lies between the input layer and the Madaline (output) layer. Used in communication systems, equalizers, and noise-cancellation devices.
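A fixed-weight sketch of this layered structure, using the classic XOR example: two Adaline units feed one Madaline output unit. The specific weights below are a hand-set illustration (training, e.g. with the MRI rule, is omitted):

```python
def quantize(net):
    return 1 if net >= 0 else -1

def adaline(x, w, b):
    # One Adaline unit: bias of excitation 1 times b, plus weighted inputs
    return quantize(b + sum(wi * xi for wi, xi in zip(w, x)))

def madaline_xor(x):
    # Adaline (hidden) layer between the input and the Madaline layer
    z1 = adaline(x, (1, -1), -1)    # fires only for input (+1, -1)
    z2 = adaline(x, (-1, 1), -1)    # fires only for input (-1, +1)
    # Single Madaline output unit: bipolar OR of the two Adaline outputs
    return adaline((z1, z2), (1, 1), 1)
```

With these weights the net computes bipolar XOR, a function a single Adaline cannot represent.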
Back Propagation Network
A multilayer feedforward network consisting of input, hidden, and output layers. The hidden and output layers have biases whose activation is 1. Error signals are propagated backward during the learning phase. The inputs sent to the BPN, and the outputs obtained, may be binary or bipolar.
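A small 2-2-1 BPN sketch with sigmoid units, trained on binary XOR. The layer sizes, learning rate, and epoch count are illustrative assumptions; the point is the two phases: a forward pass, then error signals travelling backward from the output layer to the hidden layer:

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

random.seed(0)
# Hidden and output units each get a bias whose activation is 1
w_h = [[random.uniform(-0.5, 0.5) for _ in range(3)] for _ in range(2)]  # [b, w1, w2]
w_o = [random.uniform(-0.5, 0.5) for _ in range(3)]                      # [b, v1, v2]

def forward(x):
    h = [sigmoid(w[0] + w[1] * x[0] + w[2] * x[1]) for w in w_h]
    y = sigmoid(w_o[0] + w_o[1] * h[0] + w_o[2] * h[1])
    return h, y

def mse(samples):
    return sum((t - forward(x)[1]) ** 2 for x, t in samples) / len(samples)

data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]  # binary XOR
lr = 0.5
before = mse(data)
for _ in range(2000):
    for x, t in data:
        h, y = forward(x)
        # Backward phase: output error signal, then hidden error signals
        d_o = (t - y) * y * (1 - y)
        d_h = [d_o * w_o[j + 1] * h[j] * (1 - h[j]) for j in range(2)]
        w_o[0] += lr * d_o
        for j in range(2):
            w_o[j + 1] += lr * d_o * h[j]
            w_h[j][0] += lr * d_h[j]
            w_h[j][1] += lr * d_h[j] * x[0]
            w_h[j][2] += lr * d_h[j] * x[1]
after = mse(data)
```

Training drives the mean squared error on the four patterns below its initial value.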
Auto Associative Memory Network
The training input and target output vectors are the same. The input layer consists of n input units and the output layer consists of n output units. Input and output units are connected through weighted interconnections. The input and output vectors are perfectly correlated with each other, component by component.
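A minimal sketch of such a memory using Hebbian weights for a single bipolar pattern: since the training input and target output are the same vector x, the weight matrix is simply W = x xᵀ. The stored pattern and the noisy probe below are illustrative:

```python
def hebb_weights(pattern):
    # Hebb rule with input vector == target vector: W[i][j] = x[i] * x[j]
    n = len(pattern)
    return [[pattern[i] * pattern[j] for j in range(n)] for i in range(n)]

def recall(W, x):
    # Net input of each output unit through the weighted interconnections,
    # then a bipolar sign function
    net = [sum(W[i][j] * x[j] for j in range(len(x))) for i in range(len(W))]
    return [1 if v >= 0 else -1 for v in net]

stored = [1, -1, 1, -1]
W = hebb_weights(stored)
noisy = [1, -1, 1, 1]   # stored pattern with one component flipped
```

Presenting the stored pattern reproduces it exactly, and the noisy probe is also mapped back to the stored pattern.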
Maxnet
Symmetrical weights are present over the weighted interconnections. The weights between neurons are inhibitory and fixed. A Maxnet with this structure can be used as a subnet to select the particular node whose net input is the largest.
(Figure: fully interconnected Maxnet with units X1..Xm, self-excitation weight 1, and fixed mutual inhibition.)
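A sketch of the Maxnet competition. The fixed symmetric weights are 1 on each unit's self-connection and -eps between distinct units; eps is assumed to be less than 1/m (m = number of units) so the iteration converges, and the initial activations are assumed to have a unique maximum (a tie would never resolve):

```python
def maxnet(activations, eps=0.15):
    a = [max(0.0, v) for v in activations]
    # Iterate until only one unit keeps a positive activation
    while sum(1 for v in a if v > 0) > 1:
        total = sum(a)
        # Each unit keeps its own activation (self-weight 1) and is
        # inhibited by -eps times the activations of all other units
        a = [max(0.0, v - eps * (total - v)) for v in a]
    return a.index(max(a))

winner = maxnet([0.2, 0.4, 0.6, 0.8])
```

The surviving unit is the one whose initial net input was largest, so `winner` is index 3 here.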
Mexican Hat Net
Neurons are arranged in a linear order such that positive connections exist between Xi and its neighbouring units, and negative connections between Xi and far-away units. The positive region is one of cooperation and the negative region one of competition. The sizes of these regions depend on the magnitudes of the positive and negative weights.
(Figure: linear array X(i-3)..X(i+3) centred on Xi, with weights W, W1, W2, W3 decreasing with distance from Xi.)
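One update step of such a net can be sketched as a convolution with a symmetric kernel. The kernel values below are an assumed example: the centre weight W and the neighbouring W1 are positive (cooperation region), while the outer W2 weights are negative (competition region); activations are clipped to [0, x_max]:

```python
def mexican_hat_step(x, kernel, x_max=2.0):
    # kernel = [W2, W1, W, W1, W2]: symmetric, positive near the centre,
    # negative farther out; units beyond the array ends contribute nothing
    r = len(kernel) // 2
    out = []
    for i in range(len(x)):
        s = 0.0
        for k in range(-r, r + 1):
            j = i + k
            if 0 <= j < len(x):
                s += kernel[k + r] * x[j]
        out.append(min(max(s, 0.0), x_max))  # clip activation to [0, x_max]
    return out

signal = [0.0, 0.5, 1.0, 0.5, 0.0]
kernel = [-0.4, 0.6, 1.0, 0.6, -0.4]       # illustrative [W2, W1, W, W1, W2]
enhanced = mexican_hat_step(signal, kernel)
```

One step enhances the central peak through cooperation while the negative flanks suppress activity far from it, the contrast-enhancement behaviour the slide describes.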