MULTILAYER PERCEPTRON


Presentation titled "MULTILAYER PERCEPTRON". Presentation transcript:

1 MULTILAYER PERCEPTRON
Nurochman, Teknik Informatika UIN Sunan Kalijaga Yogyakarta

2 Review SLP
[Diagram: a single-layer perceptron; inputs X1, X2, X3 with weights w1, w2, w3 feed the summation Σ xi.wi, an activation function f is applied, and the result is the output]

3 Activation Functions
Binary step function (hard limit)
Binary step function (with threshold)

4 Activation Functions
Bipolar function
Bipolar function with threshold

5 Activation Functions
Linear (identity) function
Binary sigmoid function
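
A minimal Python sketch of the activation functions on slides 3-5; the parameter names (theta for the threshold, a for the sigmoid slope) are my own, not from the slides:

    import math

    def binary_step(net, theta=0.0):
        # Binary step (hard limit / threshold): 1 if net >= theta, else 0
        return 1 if net >= theta else 0

    def bipolar_step(net, theta=0.0):
        # Bipolar step: +1 if net >= theta, else -1
        return 1 if net >= theta else -1

    def linear(net):
        # Linear (identity) function
        return net

    def binary_sigmoid(net, a=1.0):
        # Binary sigmoid: output in (0, 1); larger a gives a steeper curve
        return 1.0 / (1.0 + math.exp(-a * net))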

6 Learning Algorithm
Initialize the learning rate (α), the threshold (θ), and the weights and bias. Compute the output y = f(Σ xi.wi + b) for each training pattern.

7 Learning Algorithm
If y ≠ target, update the weights and bias:
Wi(new) = Wi(old) + α.t.Xi
b(new) = b(old) + α.t
Repeat from step 2 until no weights are updated in a full pass.
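
A sketch of this rule in Python. Because the update α.t.Xi does nothing when t = 0, bipolar targets t in {-1, +1} are assumed here; the function and variable names are mine:

    def train_perceptron(samples, alpha=1.0, theta=0.0, max_epochs=100):
        # samples: list of (inputs, target) pairs with targets in {-1, +1}
        n_inputs = len(samples[0][0])
        w = [0.0] * n_inputs            # step 1: initialize weights and bias
        b = 0.0
        for _ in range(max_epochs):
            changed = False
            for x, t in samples:
                net = sum(wi * xi for wi, xi in zip(w, x)) + b
                y = 1 if net >= theta else -1     # step 2: compute the output
                if y != t:                        # step 3: update only on error
                    w = [wi + alpha * t * xi for wi, xi in zip(w, x)]
                    b += alpha * t
                    changed = True
            if not changed:   # stop when a full pass makes no update
                break
        return w, b

    # Example: bipolar AND converges to w = [1, 1], b = -1 in a few epochs
    w, b = train_perceptron([([1, 1], 1), ([1, -1], -1),
                             ([-1, 1], -1), ([-1, -1], -1)])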

8 Problem "OR" (weights w1 = 1, w2 = 1)
Y = 1 if net >= 1, 0 if net < 1

X1  X2  net        Y
1   1   1+1 = 2    1
1   0   1+0 = 1    1
0   1   0+1 = 1    1
0   0   0+0 = 0    0

It SUCCEEDS in recognizing the OR pattern.

9 Problem "AND" (weights w1 = 1, w2 = 1)
Y = 1 if net >= 2, 0 if net < 2

X1  X2  net        Y
1   1   1+1 = 2    1
1   0   1+0 = 1    0
0   1   0+1 = 1    0
0   0   0+0 = 0    0

It SUCCEEDS in recognizing the AND pattern.

10 Problem "X1 and not(X2)" (weights w1 = 2, w2 = -1)
Y = 1 if net >= 2, 0 if net < 2

X1  X2  net          Y
1   1   2-1 = 1      0
1   0   2+0 = 2      1
0   1   0-1 = -1     0
0   0   0+0 = 0      0

It SUCCEEDS in recognizing the pattern.
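
The three single-layer solutions above can be checked in a few lines of Python; the helper name slp and its signature are mine:

    def slp(x1, x2, w1, w2, theta):
        # Single-layer perceptron: fire when the weighted sum reaches theta
        net = x1 * w1 + x2 * w2
        return 1 if net >= theta else 0

    for x1 in (0, 1):
        for x2 in (0, 1):
            assert slp(x1, x2, 1, 1, 1) == (x1 or x2)        # slide 8: OR
            assert slp(x1, x2, 1, 1, 2) == (x1 and x2)       # slide 9: AND
            assert slp(x1, x2, 2, -1, 2) == (x1 and not x2)  # slide 10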

11 How about XOR?

12 Problem "XOR"

X1  X2  Y
1   1   0
1   0   1
0   1   1
0   0   0

FAILED! No single set of weights and threshold can give F(1,0) = F(0,1) = 1 and still give F(1,1) = 0.

13 Solution: XOR = (x1 ^ ~x2) V (~x1 ^ x2)
It turns out a hidden layer is needed: inputs X1, X2 feed hidden units Z1, Z2, which feed the output Y.
[Diagram residue: the two-layer network with weights 2, -1, 1]
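
A small Python sketch of this decomposition, reusing the gates already solved on slides 8 and 10 (0/1 inputs; the function names are mine):

    def step(net, theta):
        return 1 if net >= theta else 0

    def xor_two_layer(x1, x2):
        z1 = step(2 * x1 - 1 * x2, 2)   # x1 and not(x2): weights 2, -1
        z2 = step(2 * x2 - 1 * x1, 2)   # x2 and not(x1): weights 2, -1
        return step(z1 + z2, 1)         # z1 OR z2: weights 1, 1

    for x1 in (0, 1):
        for x2 in (0, 1):
            print(x1, x2, xor_two_layer(x1, x2))  # 1 exactly when x1 != x2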

14 Table
[The truth table for the two-layer XOR network was shown as an image on this slide]

15 Multi-Layer Perceptron
An MLP is a feedforward neural network with at least one hidden layer (Li Min Fu). It addresses the limitation of the single-layer perceptron, which cannot recognize nonlinear patterns such as the XOR problem.

16 Solution for XOR Problem
[Diagram: a two-layer network computing X1 XOR X2 on bipolar inputs x1, x2, with hidden weights +1, -1 and -1, +1]

17 Solution for XOR Problem
[Diagram: the same network with its biases; the output unit has bias 0.1]
φ(v) = 1 if v > 0, φ(v) = -1 if v <= 0, where φ is the sign function.

18 Input to Hidden Layer
Hidden weights: z1 receives (+1, -1), z2 receives (-1, +1); both units have bias -1.

x1  x2  net1 = x1 - x2 - 1    f1   net2 = -x1 + x2 - 1    f2
-1  -1  (-1 + 1) - 1 = -1     -1   (+1 - 1) - 1 = -1      -1
-1  +1  (-1 - 1) - 1 = -3     -1   (+1 + 1) - 1 = +1      +1
+1  -1  (+1 + 1) - 1 = +1     +1   (-1 - 1) - 1 = -3      -1
+1  +1  (+1 - 1) - 1 = -1     -1   (-1 + 1) - 1 = -1      -1

19 Hidden to Output Layer
Output weights: (+1, +1), bias 0.1. The input patterns (-1, -1) and (+1, +1) both map to z = (-1, -1), so three distinct hidden vectors remain.

z1  z2  net = z1 + z2 + 0.1     Y
-1  -1  (-1 - 1) + 0.1 = -1.9   -1
-1  +1  (-1 + 1) + 0.1 = +0.1   +1
+1  -1  (+1 - 1) + 0.1 = +0.1   +1
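
Putting slides 17-19 together as runnable Python (weights and biases as reconstructed above; the function names are mine):

    def sign(v):
        # The sign activation from slide 17: +1 if v > 0, else -1
        return 1 if v > 0 else -1

    def xor_net(x1, x2):
        z1 = sign(x1 * 1 + x2 * -1 - 1)   # input to hidden, unit z1
        z2 = sign(x1 * -1 + x2 * 1 - 1)   # input to hidden, unit z2
        return sign(z1 + z2 + 0.1)        # hidden to output

    for x1 in (-1, 1):
        for x2 in (-1, 1):
            print(x1, x2, xor_net(x1, x2))  # +1 exactly when x1 != x2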

20 Learning Algorithm: Backpropagation
It adjusts the weights of the NN in order to minimize the average squared error.
Function signals flow in the forward step; error signals flow in the backward step.

21 BP has two phases
Forward pass phase: computes the 'function signal', the feedforward propagation of input pattern signals through the network.
Backward pass phase: computes the 'error signal', propagating the error backwards through the network starting at the output units (where the error is the difference between actual and desired output values).
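
A minimal sketch of the two phases in Python, for a 2-2-1 sigmoid network on 0/1-encoded XOR. The network size, initialization, learning rate, and epoch count are my assumptions, not values from the slides:

    import math, random

    def sigmoid(v):
        return 1.0 / (1.0 + math.exp(-v))

    random.seed(0)
    w_h = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]  # per hidden unit: w1, w2, bias
    w_o = [random.uniform(-1, 1) for _ in range(3)]                      # output unit: w1, w2, bias
    data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]
    alpha = 0.5

    for epoch in range(10000):
        for x, t in data:
            # Forward pass: propagate function signals through the network
            z = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w_h]
            y = sigmoid(w_o[0] * z[0] + w_o[1] * z[1] + w_o[2])
            # Backward pass: propagate error signals from the output back
            delta_o = (t - y) * y * (1 - y)          # sigmoid derivative is y(1 - y)
            delta_h = [delta_o * w_o[j] * z[j] * (1 - z[j]) for j in range(2)]
            w_o = [w_o[0] + alpha * delta_o * z[0],
                   w_o[1] + alpha * delta_o * z[1],
                   w_o[2] + alpha * delta_o]
            for j in range(2):
                w_h[j] = [w_h[j][0] + alpha * delta_h[j] * x[0],
                          w_h[j][1] + alpha * delta_h[j] * x[1],
                          w_h[j][2] + alpha * delta_h[j]]

    # Outputs should approach the targets; convergence is not guaranteed
    # for every random initialization on XOR.
    for x, t in data:
        z = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w_h]
        y = sigmoid(w_o[0] * z[0] + w_o[1] * z[1] + w_o[2])
        print(x, t, round(y, 3))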

22 Activation Function
Sigmoidal function with slope parameter a: φ(v) = 1 / (1 + e^(-a.v)).
[Plot: sigmoid curves saturating at 1; increasing a steepens the curve]


