Self-Organizing Network Model (SOM)
Session 10
Course: T0293 / Neuro Computing
Year: 2005
Unsupervised Learning of Clusters
The neural networks discussed in earlier chapters perform their function only after supervised training. SOM is a neural network that learns without a teacher (unsupervised): it learns by grouping the input data (clustering of input data). Clustering is essentially the grouping of similar objects and the separation of dissimilar objects.
Clustering and Similarity Measures
Suppose we are given a set of patterns with no information about how many clusters exist. The clustering problem in this case is to identify the number of clusters based on certain criteria. Grouping the set of patterns { X1, X2, ..., XN } requires a decision function. The most commonly used similarity measure is the Euclidean distance:

    ||X - Xi||² = (X - Xi)ᵀ (X - Xi)

The smaller the distance, the more similar the patterns.
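As an illustrative sketch (not part of the original slides), the Euclidean distance above can serve as the decision function: a pattern is assigned to whichever prototype it is closest to. The array values and prototype vectors below are hypothetical examples.

```python
import numpy as np

def euclidean_distance(x, xi):
    """Euclidean distance ||x - xi|| between two pattern vectors."""
    diff = x - xi
    return np.sqrt(diff @ diff)  # sqrt((x - xi)^T (x - xi))

# Hypothetical prototypes: assign x to the most similar (closest) one.
prototypes = np.array([[0.0, 0.0], [5.0, 5.0], [0.0, 5.0]])
x = np.array([4.2, 4.8])
distances = [euclidean_distance(x, p) for p in prototypes]
closest = int(np.argmin(distances))  # smaller distance = more similar
print(closest, distances)
```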
Pattern Structure
Patterns in the pattern space can be distributed in two different kinds of structure, as illustrated below.

Figure 7.10 Pattern structure: (a) natural similarity, where the patterns fall into clusters; (b) no natural similarity, where Class 1, Class 2, and Class 3 are separated by straight lines.

A natural structure is always associated with pattern clustering.
SOM Algorithm

Step 1. Initialize Weights
Initialize the weights from the N inputs to the M output nodes shown in Fig. 17 to small random values. Set the initial radius of the neighborhood shown in Fig. 18.

Step 2. Present New Input

Step 3. Compute Distances to All Nodes
Compute the distance dj between the input and each output node j using

    dj = Σ (i = 0 to N-1) (xi(t) - wij(t))²

where xi(t) is the input to node i at time t and wij(t) is the weight from input node i to output node j at time t.
Step 4. Select Output Node with Minimum Distance
Select node j* as the output node with minimum dj.

Step 5. Update Weights to Node j* and Neighbors
Weights are updated for node j* and all nodes in the neighborhood defined by NEj*(t), as shown in Fig. 18. The new weights are

    wij(t+1) = wij(t) + η(t) (xi(t) - wij(t)),   for j ∈ NEj*(t), 0 ≤ i ≤ N - 1

The term η(t) is a gain term (0 < η(t) < 1) that decreases in time.

Step 6. Repeat by Going to Step 2
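As a minimal sketch of the six steps above (not taken from the original slides), the following Python code implements the basic Kohonen update on a 2-D grid of output nodes with a shrinking square neighborhood. The grid size, learning-rate schedule, and radius schedule are assumptions chosen purely for illustration.

```python
import numpy as np

def train_som(data, grid_shape=(10, 10), epochs=20,
              eta0=0.5, radius0=5.0, seed=0):
    """Basic Kohonen SOM: grid_shape output nodes, data of shape (samples, N)."""
    rng = np.random.default_rng(seed)
    rows, cols = grid_shape
    n_inputs = data.shape[1]

    # Step 1: initialize weights to small random values.
    weights = rng.uniform(-0.1, 0.1, size=(rows, cols, n_inputs))
    # Grid coordinates of each output node, used to define neighborhoods.
    coords = np.stack(np.meshgrid(np.arange(rows), np.arange(cols),
                                  indexing="ij"), axis=-1)

    n_steps = epochs * len(data)
    t = 0
    for epoch in range(epochs):
        for x in data:                      # Step 2: present new input
            frac = t / n_steps
            eta = eta0 * (1.0 - frac)       # gain term eta(t), decreasing in time
            radius = radius0 * (1.0 - frac) # neighborhood NEj*(t) shrinks over time

            # Step 3: squared Euclidean distance to every output node.
            d = np.sum((weights - x) ** 2, axis=-1)

            # Step 4: winning node j* with minimum distance.
            j_star = np.unravel_index(np.argmin(d), d.shape)

            # Step 5: update j* and all nodes within the current neighborhood.
            grid_dist = np.max(np.abs(coords - np.array(j_star)), axis=-1)
            in_neighborhood = grid_dist <= radius
            weights[in_neighborhood] += eta * (x - weights[in_neighborhood])

            t += 1                          # Step 6: repeat with the next input
    return weights

# Hypothetical usage: map 2-D points onto a 10x10 feature map.
if __name__ == "__main__":
    data = np.random.default_rng(1).random((200, 2))
    w = train_som(data)
    print(w.shape)  # (10, 10, 2)
```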
The Kohonen algorithm produces a vector quantizer by adjusting the weights.

Figure 17. Two-dimensional array of output nodes used to form feature maps. Every input x0, x1, ..., xN-1 is connected to every output node via a variable connection weight.
The Kohonen algorithm forms feature maps by using a neighborhood around each node.

Figure 18. Topological neighborhoods NEj(0), NEj(t1), NEj(t2) at different times as feature maps are formed. NEj(t) is the set of nodes considered to be in the neighborhood of node j at time t. The neighborhood starts large and slowly decreases in size over time; in this example, 0 < t1 < t2.
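As a small illustrative sketch (not from the original slides) of how NEj(t) shrinks, the snippet below enumerates a square (Chebyshev) neighborhood on a grid at three decreasing radii; the grid size, center node, and radii are assumptions.

```python
import numpy as np

def neighborhood(j, radius, grid_shape=(7, 7)):
    """Set NEj(t): nodes whose grid (Chebyshev) distance to node j is <= radius."""
    rows, cols = grid_shape
    r, c = np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij")
    mask = np.maximum(np.abs(r - j[0]), np.abs(c - j[1])) <= radius
    return list(zip(*np.where(mask)))

# Neighborhood of the center node shrinking as in Fig. 18 (0 < t1 < t2):
center = (3, 3)
for radius in (3, 2, 1):   # hypothetical radii at times 0, t1, t2
    print(radius, len(neighborhood(center, radius)))
```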