# MULTILAYER PERCEPTRON

## Presentation titled "MULTILAYER PERCEPTRON" — presentation transcript:

MULTILAYER PERCEPTRON
Nurochman, Teknik Informatika UIN Sunan Kalijaga Yogyakarta

Review: Single-Layer Perceptron
[Diagram: inputs X1, X2, X3 with weights w1, w2, w3 feed a summing unit Σ xi·wi; the activation function f produces the output y.]

Activation Functions
Binary step function (hard limit)
Binary step function with threshold

Activation Functions
Bipolar function
Bipolar function with threshold

Activation Functions
Linear (identity) function
Binary sigmoid function
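The functions listed above can be sketched in Python (a minimal illustration; the function names and default thresholds are my own choices, not from the slides):

```python
import math

def binary_step(v, theta=0.0):
    """Binary step (hard limit): 1 if net reaches the threshold, else 0."""
    return 1 if v >= theta else 0

def bipolar_step(v, theta=0.0):
    """Bipolar step: +1 at or above the threshold, -1 otherwise."""
    return 1 if v >= theta else -1

def linear(v):
    """Linear (identity) function."""
    return v

def binary_sigmoid(v):
    """Binary sigmoid: smooth, output in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-v))

print(binary_step(0.5), bipolar_step(-0.5), linear(2.0), binary_sigmoid(0.0))
# → 1 -1 2.0 0.5
```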

Learning Algorithm
Initialize the learning rate (α), threshold (𝛉), weights, and bias
Compute the net input and the output y for each training pattern

Learning Algorithm
If y ≠ target, update the weights and bias:
Wi_new = Wi_old + α·t·Xi
b_new = b_old + α·t
Repeat from step 2 until the weights no longer change
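The steps above can be sketched as a training loop (a hedged sketch: the update rule and stopping condition follow the slides, while the function name, the bipolar AND training data, and the parameter defaults are my own choices):

```python
def train_perceptron(samples, alpha=1.0, theta=0.2, epochs=100):
    """Perceptron learning as on the slides: on error, w_i += alpha*t*x_i, b += alpha*t."""
    n = len(samples[0][0])
    w = [0.0] * n
    b = 0.0
    for _ in range(epochs):
        changed = False
        for x, t in samples:
            net = sum(wi * xi for wi, xi in zip(w, x)) + b
            # bipolar output with a dead zone of width 2*theta around zero
            y = 1 if net > theta else (-1 if net < -theta else 0)
            if y != t:
                w = [wi + alpha * t * xi for wi, xi in zip(w, x)]
                b += alpha * t
                changed = True
        if not changed:  # stop when an epoch makes no update
            break
    return w, b

samples = [((1, 1), 1), ((1, -1), -1), ((-1, 1), -1), ((-1, -1), -1)]  # bipolar AND
w, b = train_perceptron(samples)
print(w, b)
# → [1.0, 1.0] -1.0
```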

Problem "OR" (weights w1 = 1, w2 = 1): Y = 1 if net ≥ 1, 0 if net < 1

| X1 | X2 | net | Y |
|----|----|-----|---|
| 1  | 1  | 2   | 1 |
| 1  | 0  | 1   | 1 |
| 0  | 1  | 1   | 1 |
| 0  | 0  | 0   | 0 |

It SUCCEEDS in recognizing the OR pattern.

Problem "AND" (weights w1 = 1, w2 = 1): Y = 1 if net ≥ 2, 0 if net < 2

| X1 | X2 | net | Y |
|----|----|-----|---|
| 1  | 1  | 2   | 1 |
| 1  | 0  | 1   | 0 |
| 0  | 1  | 1   | 0 |
| 0  | 0  | 0   | 0 |

It SUCCEEDS in recognizing the AND pattern.

Problem "X1 AND NOT(X2)" (weights w1 = 2, w2 = −1): Y = 1 if net ≥ 2, 0 if net < 2

| X1 | X2 | net | Y |
|----|----|-----|---|
| 1  | 1  | 1   | 0 |
| 1  | 0  | 2   | 1 |
| 0  | 1  | −1  | 0 |
| 0  | 0  | 0   | 0 |

It SUCCEEDS in recognizing the pattern.
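The three successful gates can be checked with a few lines of Python (a sketch; the weights and thresholds are the ones given on the slides):

```python
def slp(x1, x2, w1, w2, theta):
    """Single-layer perceptron with a hard threshold: Y = 1 if net >= theta, else 0."""
    net = w1 * x1 + w2 * x2
    return 1 if net >= theta else 0

inputs = [(1, 1), (1, 0), (0, 1), (0, 0)]
print([slp(x1, x2, 1, 1, 1) for x1, x2 in inputs])   # OR:            → [1, 1, 1, 0]
print([slp(x1, x2, 1, 1, 2) for x1, x2 in inputs])   # AND:           → [1, 0, 0, 0]
print([slp(x1, x2, 2, -1, 2) for x1, x2 in inputs])  # X1 AND NOT X2: → [0, 1, 0, 0]
```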

Problem "XOR"

| X1 | X2 | Y |
|----|----|---|
| 1  | 1  | 0 |
| 1  | 0  | 1 |
| 0  | 1  | 1 |
| 0  | 0  | 0 |

FAILED! Any weights with f(1,0) = f(0,1) = 1 and f(0,0) = 0 force f(1,1) = 1, so the required f(1,1) = 0 cannot be satisfied.
FAILED!

Solution: XOR = (x1 ∧ ¬x2) ∨ (¬x1 ∧ x2)
It turns out a hidden layer is needed. [Network diagram: X1, X2 feed hidden units Z1, Z2, which feed output Y; the weights 2, −1, and 1 appear on the connections.]
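Following the decomposition XOR = (x1 ∧ ¬x2) ∨ (¬x1 ∧ x2), a two-layer network solves the problem. This sketch reuses the AND-NOT weights (2, −1) and the OR weights (1, 1) from the earlier slides; the arrangement into one function is my own:

```python
def step(net, theta):
    """Hard-threshold unit: 1 if net >= theta, else 0."""
    return 1 if net >= theta else 0

def xor(x1, x2):
    z1 = step(2 * x1 - 1 * x2, 2)   # z1 = x1 AND NOT x2
    z2 = step(-1 * x1 + 2 * x2, 2)  # z2 = x2 AND NOT x1
    return step(z1 + z2, 1)         # y  = z1 OR z2

print([xor(x1, x2) for x1, x2 in [(1, 1), (1, 0), (0, 1), (0, 0)]])
# → [0, 1, 1, 0]
```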

Table

Multi-Layer Perceptron
An MLP is a feedforward neural network with at least one hidden layer (Li Min Fu). Limitation of the single-layer perceptron neural network: nonlinear pattern recognition — the XOR problem.

Solution for the XOR Problem
[Network diagram: inputs x1, x2 feed a network computing X1 XOR X2; weights −1 and 1 appear on the connections.]

Solution for the XOR Problem (cont.)
[Network diagram: x1, x2 with weights +1, −1 and a bias of 0.1 on the output unit.]
φ(v) = 1 if v > 0, −1 if v ≤ 0, where φ is the sign function.

Input to hidden layer (hidden weights (1, −1) and (−1, 1), bias −1; e.g. Net1(−1, −1) = (−1·1 + −1·(−1)) + (−1) = −1):

| x1 | x2 | Net1 | f1 | Net2 | f2 |
|----|----|------|----|------|----|
| −1 | −1 | −1 | −1 | −1 | −1 |
| −1 | 1  | −3 | −1 | 1  | 1  |
| 1  | −1 | 1  | 1  | −3 | −1 |
| 1  | 1  | −1 | −1 | −1 | −1 |

Hidden to output layer (output weights (1, 1), bias 0.1; e.g. Net(−1, −1) = (−1·1 + −1·1) + 0.1 = −1.9):

| z1 | z2 | Net  | Y  |
|----|----|------|----|
| −1 | −1 | −1.9 | −1 |
| −1 | 1  | 0.1  | 1  |
| 1  | −1 | 0.1  | 1  |
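The hidden and output computations on the last two slides can be reproduced directly. The hidden weights (1, −1) and (−1, 1) with bias −1, and the output weights (1, 1) with bias 0.1, are read off from the worked arithmetic above (a sketch, since the slides show only the numbers):

```python
def sign(v):
    """Sign activation: +1 if v > 0, else -1."""
    return 1 if v > 0 else -1

def xor_bipolar(x1, x2):
    z1 = sign(1 * x1 - 1 * x2 - 1)      # hidden unit 1: weights (1, -1), bias -1
    z2 = sign(-1 * x1 + 1 * x2 - 1)     # hidden unit 2: weights (-1, 1), bias -1
    return sign(1 * z1 + 1 * z2 + 0.1)  # output unit: weights (1, 1), bias 0.1

print([xor_bipolar(x1, x2) for x1, x2 in [(-1, -1), (-1, 1), (1, -1), (1, 1)]])
# → [-1, 1, 1, -1]
```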

Learning Algorithm: the Backpropagation Algorithm
It adjusts the weights of the NN in order to minimize the average squared error. Function signals propagate forward (forward step); error signals propagate backward (backward step).

BP has two phases:
Forward pass phase: computes the 'functional signal' — feedforward propagation of input pattern signals through the network.
Backward pass phase: computes the 'error signal' — propagates the error backwards through the network, starting at the output units (where the error is the difference between actual and desired output values).
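A minimal version of the two phases, trained on the XOR data with binary-sigmoid units (an illustrative sketch: the network size, learning rate, epoch count, and random initialization are my own choices, not from the slides):

```python
import math
import random

def sigmoid(v):
    """Binary sigmoid activation."""
    return 1.0 / (1.0 + math.exp(-v))

random.seed(0)
# 2 inputs -> 2 hidden units -> 1 output, all sigmoidal
w1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]  # input->hidden weights
b1 = [random.uniform(-1, 1) for _ in range(2)]                      # hidden biases
w2 = [random.uniform(-1, 1) for _ in range(2)]                      # hidden->output weights
b2 = random.uniform(-1, 1)                                          # output bias
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
lr = 0.5

def forward(x):
    """Forward pass: compute the functional signals."""
    z = [sigmoid(w1[j][0] * x[0] + w1[j][1] * x[1] + b1[j]) for j in range(2)]
    y = sigmoid(w2[0] * z[0] + w2[1] * z[1] + b2)
    return z, y

def sq_error():
    return sum((forward(x)[1] - t) ** 2 for x, t in data)

e_before = sq_error()
for _ in range(10000):
    for x, t in data:
        z, y = forward(x)
        dy = (y - t) * y * (1 - y)                               # backward pass: output error signal
        dz = [dy * w2[j] * z[j] * (1 - z[j]) for j in range(2)]  # hidden error signals
        for j in range(2):                                       # gradient-descent weight updates
            w2[j] -= lr * dy * z[j]
            w1[j][0] -= lr * dz[j] * x[0]
            w1[j][1] -= lr * dz[j] * x[1]
            b1[j] -= lr * dz[j]
        b2 -= lr * dy
e_after = sq_error()
print(e_before, e_after)  # squared error before and after training
```

The `dz` terms are computed from the old `w2` before the output weights are updated, matching the order of the two phases.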

Activation Function
[Plot: sigmoidal function; the curves saturate at 1, and increasing the slope parameter a makes the transition steeper.]
