MLP Feed-Forward Back Propagation Neural Net


1 MLP Feed-Forward Back Propagation Neural Net
Nurochman

2 Multi-Layer Perceptron
Marvin Minsky and Seymour Papert, in the book "Perceptrons: Introduction to Computational Geometry" (1969), discussed the strengths and limitations of the Single Layer Perceptron (SLP). An SLP cannot recognize patterns that are not linearly separable; the classic example is XOR, which cannot be solved with an SLP. The solution is to add a middle/hidden layer. An SLP can recognize AND, OR, and NOT, and X1 XOR X2 = (NOT (X1 AND X2)) AND (X1 OR X2), as shown on the next slide.

3 (NOT (X1 AND X2)) AND (X1 OR X2)
XOR solution: (NOT (X1 AND X2)) AND (X1 OR X2). [Network diagram: inputs X1 and X2 feed two hidden units computing X1 AND X2 and X1 OR X2; the AND output is inverted (NOT) and combined with the OR output by a final AND.]
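
As a side illustration of this decomposition (not from the slides), a minimal Python sketch that builds XOR out of the three linearly separable gates, each realized as a single threshold unit; the weights and thresholds are hand-picked assumptions:

# XOR from linearly separable parts: (NOT (X1 AND X2)) AND (X1 OR X2).
# Each gate is a single threshold unit with hand-picked weights (assumed values).
def step(net):
    return 1 if net > 0 else 0          # binary threshold activation

def AND(x1, x2):
    return step(x1 + x2 - 1.5)          # fires only when both inputs are 1

def OR(x1, x2):
    return step(x1 + x2 - 0.5)          # fires when at least one input is 1

def NOT(x):
    return step(0.5 - x)                # inverts a single binary input

def XOR(x1, x2):
    return AND(NOT(AND(x1, x2)), OR(x1, x2))

for a in (0, 1):
    for b in (0, 1):
        print(a, b, XOR(a, b))          # prints the XOR truth table: 0, 1, 1, 0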

4 MLP Architecture [Network diagram: input units x1 … xn feeding a hidden layer and an output layer.]

5 Backpropagation Algorithm
Initialize the weights; set the learning rate (α); set the threshold/tolerance value (θ) or a maximum number of epochs.
While the stopping condition is not met, do steps 3-10.
For each training-pattern pair, do steps 4-9.
Feed-forward phase: each input unit Xi, i = 1 to n, sends its signal to the hidden layer.
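
A minimal sketch of this initialization, assuming the 2-2-1 architecture used later in the XOR example; the random initialization scheme and the concrete values of α, the tolerance, and the epoch cap are assumptions, since the slides leave them open:

import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 2, 2, 1                  # assumed 2-2-1 network

# Initialize weights and biases (small random values are a common choice).
v  = rng.uniform(-0.5, 0.5, (n_in, n_hidden))    # input -> hidden weights
v0 = np.zeros(n_hidden)                          # hidden-unit biases
w  = rng.uniform(-0.5, 0.5, (n_hidden, n_out))   # hidden -> output weights
w0 = np.zeros(n_out)                             # output-unit biases

alpha = 0.5          # learning rate (α), assumed value
tol = 0.01           # error tolerance (θ), assumed value
max_epochs = 20000   # alternative stopping condition

The training loop and the per-phase computations are sketched after the next slides.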

6 Compute the output signal at the hidden layer.
Compute the output signal at the output layer.
Back-propagation phase: compute the error at the output layer and the size of the weight and bias corrections between the hidden and output layers.
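
The equations behind this slide did not survive the transcript; below is a sketch of the usual textbook form, assuming the binary sigmoid f(net) = 1/(1 + e^-net) (whose derivative is f·(1 - f)), input->hidden weights v with biases v0, hidden->output weights w with biases w0, target t, and learning rate alpha (all names are assumptions):

import numpy as np

def sigmoid(net):
    # Binary sigmoid activation; derivative is f(net) * (1 - f(net)).
    return 1.0 / (1.0 + np.exp(-net))

def forward_and_output_corrections(x, t, v, v0, w, w0, alpha):
    # Feed-forward: output signal of the hidden layer, then of the output layer.
    z = sigmoid(v0 + x @ v)                  # hidden-layer signals z_j
    y = sigmoid(w0 + z @ w)                  # output-layer signals y_k
    # Back-propagation, output layer: error term and weight/bias corrections
    # between the hidden and output layers.
    delta_k  = (t - y) * y * (1.0 - y)       # delta_k = (t_k - y_k) * f'(y_in_k)
    delta_w  = alpha * np.outer(z, delta_k)  # correction for hidden->output weights
    delta_w0 = alpha * delta_k               # correction for output biases
    return z, y, delta_k, delta_w, delta_w0

# Tiny example: 2 inputs, 2 hidden units, 1 output.
x, t = np.array([1.0, 0.0]), np.array([1.0])
v, v0 = np.full((2, 2), 0.2), np.zeros(2)
w, w0 = np.full((2, 1), 0.3), np.zeros(1)
print(forward_and_output_corrections(x, t, v, v0, w, w0, alpha=0.25))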

7 Compute the error at the hidden layer and the size of the weight and bias corrections between the input and hidden layers.
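
Under the same assumptions, a sketch of the hidden-layer step: the output-layer error terms delta_k are propagated back through the hidden->output weights w to get the hidden error terms, which then give the corrections between the input and hidden layers:

import numpy as np

def hidden_corrections(delta_k, w, z, x, alpha):
    # Error propagated back to each hidden unit: delta_in_j = sum_k delta_k * w_jk.
    delta_in = w @ delta_k
    # Hidden error term, using the sigmoid derivative z_j * (1 - z_j).
    delta_j = delta_in * z * (1.0 - z)
    delta_v  = alpha * np.outer(x, delta_j)  # correction for input->hidden weights
    delta_v0 = alpha * delta_j               # correction for hidden biases
    return delta_j, delta_v, delta_v0

# Tiny example continuing the shapes above (2 inputs, 2 hidden units, 1 output).
delta_k = np.array([0.096])
w = np.array([[0.3], [0.3]])
z = np.array([0.62, 0.55])
x = np.array([1.0, 0.0])
print(hidden_corrections(delta_k, w, z, x, alpha=0.25))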

8 Weight and bias update phase
Update the weights from the hidden layer to the output layer.
Update the weights from the input layer to the hidden layer.
Test the stopping condition.
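
Putting the phases together, a compact end-to-end sketch that trains the 2-2-1 network on XOR; the sigmoid activation, the random initialization, and the use of the sum of squared errors as the stopping criterion are assumptions, since the slides leave them open:

import numpy as np

rng = np.random.default_rng(1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)      # XOR targets

alpha, tol, max_epochs = 0.5, 0.01, 20000            # assumed hyperparameters
v, v0 = rng.uniform(-0.5, 0.5, (2, 2)), np.zeros(2)  # input -> hidden
w, w0 = rng.uniform(-0.5, 0.5, (2, 1)), np.zeros(1)  # hidden -> output

def sigmoid(net):
    return 1.0 / (1.0 + np.exp(-net))

for epoch in range(max_epochs):
    sse = 0.0
    for x, t in zip(X, T):
        # Feed-forward phase.
        z = sigmoid(v0 + x @ v)
        y = sigmoid(w0 + z @ w)
        # Back-propagation phase.
        delta_k = (t - y) * y * (1 - y)
        delta_j = (w @ delta_k) * z * (1 - z)
        # Weight and bias update phase.
        w += alpha * np.outer(z, delta_k); w0 += alpha * delta_k
        v += alpha * np.outer(x, delta_j); v0 += alpha * delta_j
        sse += float(np.sum((t - y) ** 2))
    if sse < tol:            # test the stopping condition
        break

print(epoch, sse)            # typically converges well before max_epochs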

9 [Network diagram of the 2-2-1 XOR network: inputs X1 and X2, hidden units Z1 and Z2 with biases B1 and B2, and output unit Y with bias B3.]

10 XOR example: α = 1, θ = 0.1, all weights = 0, all biases = 0
Z1 = 0 + ( ) = 0 -> 0
Z2 = 0 + ( ) = 0 -> 0
Y = 0 + ( ) = 0 -> 0
WZ1(new) = WZ1(old) = 0
WZ2(new) = WZ2(old) = 0
B3(new) = = -1
WX1Z1(new) = = -1
WX2Z1(new) = = -1
B1(new) = = 1
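
The terms inside the parentheses above were lost in the transcript, but the first forward pass follows directly from the stated starting point (α = 1, θ = 0.1, all weights and biases 0). A minimal sketch of that pass; the three-valued threshold activation (1 if net > θ, -1 if net < -θ, 0 otherwise) is inferred from the slide's net-to-output mappings, not stated on the slides:

THETA = 0.1

def activate(net):
    # Assumed threshold activation, inferred from values such as 0 -> 0,
    # 3 -> 1 and -1 -> -1 in the worked example.
    if net > THETA:
        return 1
    if net < -THETA:
        return -1
    return 0

# Initial state from this slide: every weight and bias starts at 0.
b1 = b2 = b3 = 0.0                 # biases of Z1, Z2 and Y
wx1z1 = wx2z1 = 0.0                # input -> Z1 weights
wx1z2 = wx2z2 = 0.0                # input -> Z2 weights
wz1 = wz2 = 0.0                    # hidden -> output weights

x1, x2 = 1, 1                      # one training pattern (the slide's pattern order was lost)

z1 = activate(b1 + wx1z1 * x1 + wx2z1 * x2)   # = 0, as on the slide
z2 = activate(b2 + wx1z2 * x1 + wx2z2 * x2)   # = 0, as on the slide
y  = activate(b3 + wz1 * z1 + wz2 * z2)       # = 0, as on the slide
print(z1, z2, y)                              # 0 0 0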

11 WX1Z2(new) = = -1
WX2Z2(new) = = -1
B2(new) = = 1
Z1 = 1 + ( ) = 3 -> 1
Z2 = 1 + ( ) = 3 -> 1
Y = -1 + ( ) = -1 -> -1

12 Z1 = 1 + ( ) = 1 -> 1
Z2 = 1 + ( ) = 1 -> 1
Y = -1 + ( ) = -1 -> -1
WZ1(new) = = 1
WZ2(new) = = 1
B3(new) = = 0
WX1Z1(new) = = -2
WX2Z1(new) = = 0
B1(new) = = 2
WX1Z2(new) = = -2
WX2Z2(new) = = 0
B2(new) = = 2
Z1 = 2 + ( ) = 4 -> 1
Z2 = 2 + ( ) = 4 -> 1
Y = 0 + ( ) = 2 -> 1

13 Z1 = 2 + ( ) = 0 -> 0
Z2 = 2 + ( ) = 0 -> 0
Y = 0 + ( ) = 0 -> 0
WZ1(new) = = 1
WZ2(new) = = 1
B3(new) = = 1
WX1Z1(new) = = -1
WX2Z1(new) = = -1
B1(new) = = 3
WX1Z2(new) = = -1
WX2Z2(new) = = -1
B2(new) = = 3
Z1 = 3 + ( ) = 3 -> 1
Z2 = 3 + ( ) = 3 -> 1
Y = 1 + ( ) = 3 -> 1

14 Z1 = 3 + ( ) = 1 -> 1
Z2 = 3 + ( ) = 1 -> 1
Y = 1 + ( ) = 3 -> 1
B1(new) = = 2
WX1Z1(new) = = 2
WX2Z1(new) = = 2
B2(new) = = 2
WX1Z2(new) = = 2
WX2Z2(new) = = 2
WZ1(new) = = 0
WZ2(new) = = 0
B3(new) = = 0
Z1 = 2 + ( ) = 6 -> 1
Z2 = 2 + ( ) = 6 -> 1
Y = 0 + ( ) = 0 -> 0

15 B1(new) = = 1
WX1Z1(new) = = 1
WX2Z1(new) = = 1
B2(new) = = 1
WX1Z2(new) = = 1
WX2Z2(new) = = 1
WZ1(new) = = -1
WZ2(new) = = -1
B3(new) = = -1
Z1 = 1 + ( ) = 3 -> 1
Z2 = 1 + ( ) = 3 -> 1
Y = -1 + ( ) = -3 -> -1

16 Any Questions?

