1 Session 10: Statistical Reasoning. Course: T0264/Inteligensia Semu. Year: July 2006. Version: 2/1

2 Learning Outcomes. At the end of this session, students are expected to be able to: >

3 Outline: Topic 1, Topic 2, Topic 3, Topic 4, Topic 5

4 8.1 Probability and Bayes' Theorem
Bayes' theorem. The conditional probability P(H_i | E) is given by:
P(H_i | E) = P(E | H_i) * P(H_i) / Σ_{n=1..k} P(E | H_n) * P(H_n)
where:
P(H_i | E) = the probability that hypothesis H_i is true given evidence E
P(E | H_i) = the probability that evidence E appears given that hypothesis H_i is true
P(H_i) = the probability that hypothesis H_i holds (based on earlier results), regardless of any evidence; often called the a priori probability
k = the number of possible hypotheses

5 Probability and Bayes' Theorem
The Bayes' theorem equation:
P(H_i | E) = P(E | H_i) * P(H_i) / Σ_{n=1..k} P(E | H_n) * P(H_n)

6 Probability and Bayes' Theorem
If new evidence appears, the equation becomes:
p(H | E, e) = p(H | E) * p(e | E, H) / p(e | E)

7 Probability and Bayes' Theorem
Example case: Vany has spots (bintik2) appearing on her face. The doctor suspects that Vany has chickenpox (cacar), with the following probabilities:
The probability of spots on the face if Vany has chickenpox, p(bintik2 | cacar) = 0.8
The probability that Vany has chickenpox without looking at any symptoms, p(cacar) = 0.4
The probability of spots on the face if Vany has an allergy (alergi), p(bintik2 | alergi) = 0.3
The probability that Vany has an allergy without looking at any symptoms, p(alergi) = 0.7
The probability of spots on the face if Vany has acne (jerawat), p(bintik2 | jerawat) = 0.9
The probability that Vany has acne without looking at any symptoms, p(jerawat) = 0.5
How likely is each of these hypotheses?

8 Probability and Bayes' Theorem
Applying Bayes' theorem in the same way to each hypothesis gives:
p(cacar | bintik2) = (0.8 * 0.4) / (0.8 * 0.4 + 0.3 * 0.7 + 0.9 * 0.5) = 0.32 / 0.98 ≈ 0.327
p(alergi | bintik2) = 0.21 / 0.98 ≈ 0.214
p(jerawat | bintik2) = 0.45 / 0.98 ≈ 0.459
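The results above can be checked with a minimal Python sketch (not part of the original slides; the variable names are chosen for this illustration):

# Bayes' theorem for the Vany example.
priors = {"cacar": 0.4, "alergi": 0.7, "jerawat": 0.5}        # p(H)
likelihoods = {"cacar": 0.8, "alergi": 0.3, "jerawat": 0.9}   # p(bintik2 | H)

# Denominator: sum over all hypotheses of p(bintik2 | H) * p(H)
evidence = sum(likelihoods[h] * priors[h] for h in priors)

# Posterior p(H | bintik2) for each hypothesis
for h in priors:
    posterior = likelihoods[h] * priors[h] / evidence
    print(f"p({h}|bintik2) = {posterior:.3f}")

# Expected output: cacar ~0.327, alergi ~0.214, jerawat ~0.459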

9 Probability and Bayes' Theorem
If the hypothesis is then tested against new evidence, then:
p(H | E, e) = p(H | E) * p(e | E, H) / p(e | E)
where:
E = the new evidence or new observation
e = the old evidence
p(H | E, e) = the probability that hypothesis H is true given the new evidence E and the old evidence e
p(H | E) = the probability that hypothesis H is true given evidence E
p(e | E, H) = the relationship between e and E if hypothesis H is true
p(e | E) = the relationship between e and E regardless of any hypothesis

10 Probability and Bayes' Theorem
Suppose Vany has spots on her face and the doctor suspects chickenpox with probability 0.8. In theory, someone with chickenpox also has a fever (panas), so the doctor re-examines Vany and finds that she does indeed have a fever. Suppose the probability of chickenpox given a fever is p(cacar | panas) = 0.5, the relationship between spots on the face and fever when someone has chickenpox is p(bintik2 | panas, cacar) = 0.4, and the relationship between spots on the face and fever is p(bintik2 | panas) = 0.6. Then:

11 Probability and Bayes' Theorem
p(cacar | panas, bintik2) = p(cacar | panas) * p(bintik2 | panas, cacar) / p(bintik2 | panas)
p(cacar | panas, bintik2) = 0.5 * (0.4 / 0.6) = 0.33

12 Probability and Bayes' Theorem
The relationship between the sets is shown as a Venn diagram of panas, bintik2, and cacar.

13 Certainty Factors and Rule-Based Systems
Adding Certainty Factors to Rules. An example of a MYCIN rule:
If: (1) the stain of the organism is gram-positive, and
    (2) the morphology of the organism is coccus, and
    (3) the growth conformation of the organism is clumps,
then there is suggestive evidence (0.7) that the identity of the organism is staphylococcus

14 Certainty Factors and Rule-Based Systems
Or, in internal form:
PREMISE: ($AND (SAME CNTXT GRAM GRAMPOS)
               (SAME CNTXT MORPH COCCUS)
               (SAME CNTXT CONFORM CLUMPS))
ACTION:  (CONCLUDE CNTXT IDENT STAPHYLOCOCCUS TALLY 0.7)

15 Certainty Factors and Rule-Based Systems
Measures of belief and disbelief:
1. MB[h,e]: a measure (between 0 and 1) of belief in hypothesis h given the evidence e. MB measures the extent to which the evidence supports the hypothesis. It is zero if the evidence fails to support the hypothesis.
2. MD[h,e]: a measure (between 0 and 1) of disbelief in hypothesis h given the evidence e. MD measures the extent to which the evidence supports the negation of the hypothesis. It is zero if the evidence supports the hypothesis.
3. CF[h,e] = MB[h,e] - MD[h,e]

16 Certainty Factors and Rule-Based Systems
Combining Uncertain Rules

17 Certainty Factors and Rule-Based Systems
Goals for combining rules:
Since the order in which evidence is collected is arbitrary, the combining functions should be commutative and associative.
Until certainty is reached, additional confirming evidence should increase MB (and similarly for disconfirming evidence and MD).
If uncertain inferences are chained together, then the result should be less certain than either of the inferences alone.

18 Combining Two Pieces of Evidence
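The combining functions themselves do not survive in this transcript. The standard MYCIN formulation (assumed here), for two pieces of evidence s1 and s2 bearing on the same hypothesis h, is:
MB[h, s1 ∧ s2] = 0 if MD[h, s1 ∧ s2] = 1, otherwise MB[h, s1] + MB[h, s2] * (1 - MB[h, s1])
MD[h, s1 ∧ s2] = 0 if MB[h, s1 ∧ s2] = 1, otherwise MD[h, s1] + MD[h, s2] * (1 - MD[h, s1])
CF[h, s1 ∧ s2] = MB[h, s1 ∧ s2] - MD[h, s1 ∧ s2]
These updates are commutative and associative, and confirming evidence always moves MB toward 1, which matches the goals on the previous slide.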

19 An Example of Combining Two Observations

20 The Definition of Certainty Factors
Original definitions: MB is the proportionate increase in belief in h as a result of e; similarly, MD is the proportionate decrease in belief in h as a result of e. This definition, however, is incompatible with Bayesian conditional probability, so a slightly revised definition that is compatible is used instead.
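For reference, the original MB and MD definitions as usually given for MYCIN (assumed here, since the slide's own equations are not reproduced in the transcript) are:
MB[h,e] = 1 if P(h) = 1, otherwise (max[P(h|e), P(h)] - P(h)) / (1 - P(h))
MD[h,e] = 1 if P(h) = 0, otherwise (min[P(h|e), P(h)] - P(h)) / (0 - P(h))
CF[h,e] = MB[h,e] - MD[h,e]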

21 What if the Observations are not Independent?
Scenario (a): Reconsider a rule with three antecedents and a CF of 0.7. Suppose that, if there were three separate rules instead, each would have had a CF of 0.6. In other words, the antecedents are not independent. Then, combining the three pieces of evidence with the rules above, the total would be:
0.6 + 0.6 * (1 - 0.6) = 0.84, and then 0.84 + 0.6 * (1 - 0.84) = 0.936
This is very different from 0.7 (see the sketch below).
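A minimal Python sketch of this calculation, using the combining function assumed above (the function name is illustrative):

# Combine several MB values for one hypothesis, assuming the evidence is independent.
def combine_mb(mb_values):
    total = 0.0
    for mb in mb_values:
        # MYCIN-style update: each new piece of evidence closes part of the remaining gap to 1.
        total = total + mb * (1.0 - total)
    return total

# Scenario (a): three separate rules, each contributing MB = 0.6.
print(round(combine_mb([0.6, 0.6, 0.6]), 3))   # 0.936, very different from the single-rule CF of 0.7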

22 What if the Observations are not Independent?
Scenario (c): Events:
S: the sprinkler was on last night
W: the grass is wet
R: it rained last night
MYCIN-style rules:
If: the sprinkler was on last night, then there is suggestive evidence (0.9) that the grass will be wet this morning
If: the grass is wet this morning, then there is suggestive evidence (0.8) that it rained last night

23 What if the Observations are not Independent?
Combining the rules, we get:
MB[W,S] = 0.9 {the sprinkler suggests wet grass}
MB[R,W] = 0.8 * 0.9 = 0.72 {wet grass suggests rain}
So the sprinkler made us believe that it rained.
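A short Python sketch of this chaining step, assuming the usual MYCIN chaining rule in which belief in a derived conclusion is scaled by belief in its premise (names are illustrative):

# Chained uncertain inference: the conclusion's MB is attenuated by the MB of
# the evidence it was derived from.
def chain_mb(rule_strength, mb_premise):
    return rule_strength * max(0.0, mb_premise)

mb_wet = 0.9                          # rule 1: sprinkler -> wet (0.9), sprinkler observed
mb_rain = chain_mb(0.8, mb_wet)       # rule 2: wet -> rain (0.8), applied to the derived belief
print(round(mb_rain, 2))              # 0.72: observing the sprinkler ends up supporting rain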

24 Bayesian Networks: Representing Causality Uniformly

25 Conditional Probabilities for a Bayesian Network
Attribute                         Probability
P(Wet | Sprinkler, Rain)          0.95
P(Wet | Sprinkler, ¬Rain)         0.9
P(Wet | ¬Sprinkler, Rain)         0.8
P(Wet | ¬Sprinkler, ¬Rain)        0.1
P(Sprinkler | RainySeason)        0.0
P(Sprinkler | ¬RainySeason)       1.0
P(Rain | RainySeason)             0.9
P(Rain | ¬RainySeason)            0.1
P(RainySeason)                    0.5
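To see how these tables combine, here is a minimal Python sketch (illustrative, not from the slides) that enumerates the network RainySeason -> {Sprinkler, Rain} -> Wet and computes the marginal probability that the grass is wet:

from itertools import product

# Conditional probability tables from the slide.
p_rainy_season = 0.5
p_sprinkler = {True: 0.0, False: 1.0}               # P(Sprinkler | RainySeason)
p_rain = {True: 0.9, False: 0.1}                    # P(Rain | RainySeason)
p_wet = {(True, True): 0.95, (True, False): 0.9,
         (False, True): 0.8, (False, False): 0.1}   # P(Wet | Sprinkler, Rain)

def prob(value, p_true):
    # Probability of a boolean value, given the probability that it is True.
    return p_true if value else 1.0 - p_true

# Marginal P(Wet) by enumerating all assignments of RainySeason, Sprinkler, Rain.
p_wet_total = 0.0
for rs, s, r in product([True, False], repeat=3):
    joint = (prob(rs, p_rainy_season)
             * prob(s, p_sprinkler[rs])
             * prob(r, p_rain[rs])
             * p_wet[(s, r)])
    p_wet_total += joint

print(round(p_wet_total, 4))   # 0.8175 with the table above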

26 End of Session 10. Good luck!