Multi-Layer Perceptron (MLP)
A multi-layer perceptron is a feedforward neural network with at least one hidden layer (Li Min Fu).
Limitations of the Single-Layer Perceptron
A single-layer perceptron cannot recognize nonlinear patterns; the classic example is the XOR problem.
Solution for the XOR Problem
Truth table (bipolar encoding, matching the sign activation below):
x1    x2    x1 XOR x2
-1    -1       -1
-1     1        1
 1    -1        1
 1     1       -1
Solution for the XOR Problem
A two-layer network with inputs x1, x2 and a bias input +1 solves XOR. Each unit applies the sign function:
phi(v) =  1  if v > 0
phi(v) = -1  if v <= 0
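To make the two-layer solution concrete, here is a minimal sketch of such a network in Python. The particular hidden-unit weights are an illustrative choice (the slide does not give specific weights); any weights that carve the input space into the two "disagreeing inputs" regions work.

```python
def sign(v):
    # sign activation: phi(v) = 1 if v > 0, else -1
    return 1 if v > 0 else -1

def xor_net(x1, x2):
    # hidden layer: two sign neurons (weights chosen for illustration)
    h1 = sign(x1 - x2 - 0.5)   # fires (+1) only for input (+1, -1)
    h2 = sign(x2 - x1 - 0.5)   # fires (+1) only for input (-1, +1)
    # output neuron: fires when either hidden unit fires
    return sign(h1 + h2 + 0.5)

for x1 in (-1, 1):
    for x2 in (-1, 1):
        print(x1, x2, "->", xor_net(x1, x2))
```

The single-layer perceptron fails on XOR because no single line separates the two classes; the hidden layer above first maps the inputs into a space where they become linearly separable.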
Learning Algorithm: Backpropagation
The backpropagation algorithm adjusts the weights of the network in order to minimize the average squared error. Function signals propagate forward (forward step); error signals propagate backward (backward step).
BP has two phases:
- Forward pass: computes the 'function signal' -- feedforward propagation of the input pattern signals through the network.
- Backward pass: computes the 'error signal' -- propagates the error backwards through the network, starting at the output units (where the error is the difference between the actual and desired output values).
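The two phases can be sketched for a single training pattern on a tiny 2-2-1 sigmoid network. The weights and input below are arbitrary illustrative values (not from the slides); the backward pass computes a local gradient (delta) per unit, and a finite-difference check confirms the resulting weight gradient.

```python
import math

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

# illustrative fixed weights for a 2-2-1 network (biases omitted for brevity)
W1 = [[0.3, -0.4], [0.2, 0.6]]   # W1[j][i]: input i -> hidden unit j
W2 = [0.7, -0.5]                 # hidden unit j -> output unit
x, d = [1.0, 0.5], 1.0           # input pattern and desired output

def forward(W1, W2, x):
    # forward pass: function signals flow input -> output
    h = [sigmoid(W1[j][0] * x[0] + W1[j][1] * x[1]) for j in range(2)]
    y = sigmoid(W2[0] * h[0] + W2[1] * h[1])
    return h, y

h, y = forward(W1, W2, x)
e = d - y                        # error at the output unit (desired - actual)

# backward pass: error signals flow output -> input
delta_o = e * y * (1 - y)                              # local gradient, output
delta_h = [delta_o * W2[j] * h[j] * (1 - h[j]) for j in range(2)]

# backprop gives -dE/dw for E = 0.5*e^2; e.g. for hidden weight W1[0][0]:
g_analytic = delta_h[0] * x[0]

# sanity check against a finite-difference estimate of the same gradient
eps = 1e-6
W1[0][0] += eps
_, y2 = forward(W1, W2, x)
W1[0][0] -= eps
g_numeric = -((0.5 * (d - y2) ** 2 - 0.5 * (d - y) ** 2) / eps)
print(abs(g_analytic - g_numeric))   # difference should be near zero
```

A full training loop simply repeats these two phases for every pattern and adds `learning_rate * delta * input_signal` to each weight.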
Activation Function: Sigmoid
phi(v) = 1 / (1 + exp(-a*v)), where a is the slope parameter.
[Figure: sigmoid curves over v in [-10, 10]; the transition steepens with increasing a.]