
MULTILAYER PERCEPTRON Nurochman, Teknik Informatika UIN Sunan Kalijaga Yogyakarta.


1 MULTILAYER PERCEPTRON Nurochman, Teknik Informatika UIN Sunan Kalijaga Yogyakarta

2 Review of the SLP: [diagram] inputs X1, X2, …, Xi, each multiplied by a weight w1, w2, …, wi, feed a summing junction Σ xi·wi; the activation function f(y) then produces the output.

3 Activation Functions: binary step function (hard limit); binary step function with a threshold.

4 Activation Functions: bipolar function; bipolar function with a threshold.

5 Activation Functions: linear (identity) function; binary sigmoid function.
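The activation functions on slides 3-5 can be sketched as plain Python functions. The slides' own formulas were images, so the definitions below are the standard textbook forms and should be read as assumptions:

```python
import math

def binary_step(v, theta=0.0):
    # Hard-limit / threshold step: 1 if v >= theta, else 0
    return 1 if v >= theta else 0

def bipolar(v, theta=0.0):
    # Bipolar step: +1 at or above the threshold, -1 below it
    return 1 if v >= theta else -1

def linear(v):
    # Identity function: output equals the net input
    return v

def binary_sigmoid(v):
    # Binary sigmoid: smooth, maps any v into (0, 1)
    return 1.0 / (1.0 + math.exp(-v))
```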

6 Learning Algorithm: initialize the learning rate (α), the threshold (θ), the weights, and the bias; then compute the output y = f(Σ xi·wi + b).

7 Learning Algorithm: if y ≠ target, update the weights and bias: Wi_new = Wi_old + α·t·Xi and b_new = b_old + α·t. Repeat from step 2 until no weight update occurs.
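The two steps above can be sketched as a training loop. Note that the rule w_new = w_old + α·t·x only converges when the targets are bipolar (±1), so this sketch assumes bipolar inputs/targets and a sign activation; those choices are assumptions, not taken from the slides:

```python
def sign(v):
    return 1 if v >= 0 else -1

def train_perceptron(samples, alpha=1.0, max_epochs=20):
    w, b = [0.0, 0.0], 0.0
    for _ in range(max_epochs):
        updated = False
        for (x1, x2), t in samples:
            y = sign(w[0] * x1 + w[1] * x2 + b)   # step 2: compute the output
            if y != t:                            # step 3: update only on error
                w[0] += alpha * t * x1
                w[1] += alpha * t * x2
                b += alpha * t
                updated = True
        if not updated:                           # repeat until no update occurs
            break
    return w, b

# Bipolar OR: output is +1 unless both inputs are -1
or_data = [((-1, -1), -1), ((-1, 1), 1), ((1, -1), 1), ((1, 1), 1)]
w, b = train_perceptron(or_data)
```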

8 The "OR" Problem (weights w1 = w2 = 1; Y = 1 if net ≥ 1, 0 if net < 1)
X1  X2  net  Y
1   1   2    1
1   0   1    1
0   1   1    1
0   0   0    0
It SUCCEEDS in recognizing the pattern.

9 The "AND" Problem (weights w1 = w2 = 1; Y = 1 if net ≥ 2, 0 if net < 2)
X1  X2  net  Y
1   1   2    1
1   0   1    0
0   1   1    0
0   0   0    0
It SUCCEEDS in recognizing the pattern.

10 The "X1 AND NOT(X2)" Problem (weights w1 = 2, w2 = −1; Y = 1 if net ≥ 2, 0 if net < 2)
X1  X2  net  Y
1   1   1    0
1   0   2    1
0   1   −1   0
0   0   0    0
It SUCCEEDS in recognizing the pattern.
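The three examples above can be checked mechanically with a single threshold unit. The weight w2 = −1 for "x1 AND NOT x2" is inferred from the net values, so treat it as an assumption:

```python
def slp(x1, x2, w1, w2, theta):
    # Single-layer perceptron with a hard threshold theta
    net = w1 * x1 + w2 * x2
    return 1 if net >= theta else 0

# Verify all three gates over every binary input pair
for x1 in (0, 1):
    for x2 in (0, 1):
        assert slp(x1, x2, 1, 1, 1) == (x1 or x2)         # OR:  theta = 1
        assert slp(x1, x2, 1, 1, 2) == (x1 and x2)        # AND: theta = 2
        assert slp(x1, x2, 2, -1, 2) == (x1 and not x2)   # x1 AND NOT x2
```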

11 HOW ABOUT XOR?

12 The "XOR" Problem: F(0,0) = 0, F(0,1) = 1, F(1,0) = 1, F(1,1) = 0. It FAILS: no choice of weights and threshold can separate these classes with a single line.

13 Solution: XOR = (x1 ∧ ¬x2) ∨ (¬x1 ∧ x2). It turns out a hidden layer is needed: inputs X1 and X2 feed hidden units Z1 and Z2, which feed the output Y.
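The decomposition above maps directly onto a two-layer network of threshold units; the weights and thresholds below are hand-chosen for illustration:

```python
def unit(x1, x2, w1, w2, theta):
    # A single threshold unit: 1 if the weighted sum reaches theta
    return 1 if w1 * x1 + w2 * x2 >= theta else 0

def xor(x1, x2):
    z1 = unit(x1, x2, 2, -1, 2)   # hidden unit: z1 = x1 AND NOT x2
    z2 = unit(x1, x2, -1, 2, 2)   # hidden unit: z2 = NOT x1 AND x2
    return unit(z1, z2, 1, 1, 1)  # output unit: y = z1 OR z2
```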

14 Table
X1  X2  Z1  Z2  Y
1   1   0   0   0
1   0   1   0   1
0   1   0   1   1
0   0   0   0   0

15 Multi-Layer Perceptron: an MLP is a feedforward neural network with at least one hidden layer (Li Min Fu). Topics: limitations of the single-layer perceptron; neural networks for nonlinear pattern recognition; the XOR problem.

16 Solution for the XOR Problem: [network diagram] inputs x1 and x2, with the truth table for X1 XOR X2.

17 Solution for the XOR Problem: [network diagram with bias input +1 and inputs x1, x2] φ(v) = 1 if v > 0, −1 if v ≤ 0; φ is the sign function.

18 Input to Hidden Layer (bipolar inputs; each unit has bias −1; the weights, Net1 = x1 + x2 − 1 and Net2 = −x1 − x2 − 1, are inferred from the net values shown)
x1  x2  Net1  f1  Net2  f2
1   1   1     1   −3    −1
1   −1  −1    −1  −1    −1
−1  1   −1    −1  −1    −1
−1  −1  −3    −1  1     1

19 Hidden to Output Layer (output net = −z1 − z2 − 1.9, matching the values shown)
Z1  Z2  Net   Y
1   −1  −1.9  −1
−1  −1  0.1   1
−1  −1  0.1   1
−1  1   −1.9  −1
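The two tables above are consistent with one particular set of weights: Net1 = x1 + x2 − 1, Net2 = −x1 − x2 − 1, and output net = −z1 − z2 − 1.9. These weights are inferred from the net values on the slides, so treat them as an assumption; a sketch:

```python
def phi(v):
    # The sign function from slide 17
    return 1 if v > 0 else -1

def xor_bipolar(x1, x2):
    z1 = phi(x1 + x2 - 1)        # fires (+1) only for input (1, 1)
    z2 = phi(-x1 - x2 - 1)       # fires (+1) only for input (-1, -1)
    return phi(-z1 - z2 - 1.9)   # +1 exactly when both hidden units output -1
```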

20 Learning Algorithm: the Backpropagation algorithm adjusts the weights of the NN in order to minimize the average squared error. Function signals flow in the forward step; error signals flow in the backward step.

21 BP has two phases. Forward pass phase: computes the 'function signal' by feedforward propagation of the input pattern through the network. Backward pass phase: computes the 'error signal' and propagates the error backwards through the network, starting at the output units (where the error is the difference between the actual and desired output values).
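The two phases can be sketched with a tiny 2-2-1 sigmoid MLP trained on XOR by gradient descent; the architecture, learning rate, and epoch count here are illustrative assumptions, not taken from the slides:

```python
import math, random

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

random.seed(0)
W = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]  # hidden: [w1, w2, bias]
V = [random.uniform(-1, 1) for _ in range(3)]                      # output: [v1, v2, bias]
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
lr = 0.5

def forward(x1, x2):
    # Forward pass: propagate function signals input -> hidden -> output
    z = [sigmoid(u[0] * x1 + u[1] * x2 + u[2]) for u in W]
    y = sigmoid(V[0] * z[0] + V[1] * z[1] + V[2])
    return z, y

err_before = sum((forward(a, c)[1] - t) ** 2 for (a, c), t in data)

for _ in range(5000):
    for (x1, x2), t in data:
        z, y = forward(x1, x2)
        # Backward pass: propagate error signals output -> hidden
        dy = (y - t) * y * (1 - y)                              # output delta
        dz = [dy * V[j] * z[j] * (1 - z[j]) for j in range(2)]  # hidden deltas
        for j in range(2):                                      # weight updates
            W[j][0] -= lr * dz[j] * x1
            W[j][1] -= lr * dz[j] * x2
            W[j][2] -= lr * dz[j]
        V[0] -= lr * dy * z[0]
        V[1] -= lr * dy * z[1]
        V[2] -= lr * dy

err_after = sum((forward(a, c)[1] - t) ** 2 for (a, c), t in data)
```

After training, the total squared error over the four XOR patterns is lower than at initialization, which is the minimization the algorithm performs.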

22 Activation Function: sigmoidal function φ(v) = 1 / (1 + e^(−av)); increasing the slope parameter a makes the transition steeper.
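A quick way to see the effect of increasing a, assuming the standard sigmoidal form φ(v) = 1 / (1 + e^(−av)):

```python
import math

def sigmoid_a(v, a=1.0):
    # Sigmoid with slope parameter a; larger a approaches the step function
    return 1.0 / (1.0 + math.exp(-a * v))
```

For any a the curve passes through 0.5 at v = 0, and for v > 0 a larger a pushes the output closer to 1.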

