
1 Session 10: Statistical Reasoning. Course: T0264/Inteligensia Semu. Year: July 2006. Version: 2/1

2 Learning Outcomes. By the end of this session, students are expected to be able to: >

3 Outline of Topics: Topic 1, Topic 2, Topic 3, Topic 4, Topic 5

4 8.1 Probability and Bayes' Theorem
Bayes' Theorem. The conditional probability P(Hi | E) is given by:
P(Hi | E) = P(E | Hi) * P(Hi) / Σ n=1..k P(E | Hn) * P(Hn)
where:
P(Hi | E) = the probability that hypothesis Hi is true given the evidence E
P(E | Hi) = the probability that the evidence E appears given that hypothesis Hi is true
P(Hi) = the probability of hypothesis Hi (from earlier results), regardless of any evidence; often called the a priori probability
k = the number of possible hypotheses

5 Probability and Bayes' Theorem
The Bayes' theorem equation:
P(Hi | E) = P(E | Hi) * P(Hi) / Σ n=1..k P(E | Hn) * P(Hn)

6 Probability and Bayes' Theorem
If a new piece of evidence is obtained, the equation becomes:
P(H | E, e) = P(H | E) * P(e | E, H) / P(e | E)

7 Probability and Bayes' Theorem
Example case: Vany has symptoms in the form of spots (bintik2) appearing on her face. The doctor suspects chickenpox (cacar), an allergy (alergi), or acne (jerawat), with the following probabilities:
The probability of spots on the face if Vany has chickenpox, p(bintik2 | cacar) = 0.8
The probability that Vany has chickenpox, without looking at any symptoms, p(cacar) = 0.4
The probability of spots on the face if Vany has an allergy, p(bintik2 | alergi) = 0.3
The probability that Vany has an allergy, without looking at any symptoms, p(alergi) = 0.7
The probability of spots on the face if Vany has acne, p(bintik2 | jerawat) = 0.9
The probability that Vany has acne, without looking at any symptoms, p(jerawat) = 0.5
How likely is each of these hypotheses?

8 Probability and Bayes' Theorem
In the same way we obtain:
p(cacar | bintik2) = (0.8 * 0.4) / (0.8 * 0.4 + 0.3 * 0.7 + 0.9 * 0.5) = 0.32 / 0.98 = 0.327
p(alergi | bintik2) = 0.21 / 0.98 = 0.214
p(jerawat | bintik2) = 0.45 / 0.98 = 0.459
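The posteriors above can be checked with a short script. This is a minimal illustrative sketch (not part of the original slides), keeping the Indonesian identifiers used in the example (bintik2 = spots, cacar = chickenpox, alergi = allergy, jerawat = acne):

    # Likelihoods p(bintik2 | H) and priors p(H) taken from the example
    likelihood = {"cacar": 0.8, "alergi": 0.3, "jerawat": 0.9}
    prior = {"cacar": 0.4, "alergi": 0.7, "jerawat": 0.5}

    # Bayes' theorem, normalising over the three possible hypotheses
    evidence = sum(likelihood[h] * prior[h] for h in prior)  # 0.98
    posterior = {h: likelihood[h] * prior[h] / evidence for h in prior}

    for h, p in posterior.items():
        print(f"p({h} | bintik2) = {p:.3f}")
    # p(cacar | bintik2) = 0.327, p(alergi | bintik2) = 0.214, p(jerawat | bintik2) = 0.459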

9 Probability and Bayes' Theorem
If a new piece of evidence is observed after the hypothesis has been tested, then:
E = the new evidence or new observation
e = the old evidence
p(H | E, e) = the probability that hypothesis H is true given the new evidence E and the old evidence e
p(H | E) = the probability that hypothesis H is true given the evidence E
p(e | E, H) = the relationship between e and E if hypothesis H is true
p(e | E) = the relationship between e and E regardless of any hypothesis

10 Probability and Bayes' Theorem
Suppose Vany has spots on her face and the doctor suspects chickenpox with probability 0.8. In theory, a person with chickenpox also runs a fever (panas), so Vany is examined again and her body is indeed hot. Suppose the probability of chickenpox given a fever is p(cacar | panas) = 0.5; the relationship between facial spots and fever when someone has chickenpox is p(bintik2 | panas, cacar) = 0.4; and the relationship between facial spots and fever in general is p(bintik2 | panas) = 0.6. Then:

11 Probability and Bayes' Theorem
p(cacar | panas, bintik2) = p(cacar | panas) * p(bintik2 | panas, cacar) / p(bintik2 | panas)
p(cacar | panas, bintik2) = 0.5 * (0.4 / 0.6) = 0.33
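The same update can be reproduced in a few lines; this is only an illustrative sketch using the numbers from the slide:

    p_cacar_given_panas = 0.5            # p(cacar | panas)
    p_bintik2_given_panas_cacar = 0.4    # p(bintik2 | panas, cacar)
    p_bintik2_given_panas = 0.6          # p(bintik2 | panas)

    # p(cacar | panas, bintik2) = p(cacar | panas) * p(bintik2 | panas, cacar) / p(bintik2 | panas)
    p_cacar_given_panas_bintik2 = p_cacar_given_panas * p_bintik2_given_panas_cacar / p_bintik2_given_panas
    print(round(p_cacar_given_panas_bintik2, 2))  # 0.33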

12 Probability and Bayes' Theorem
The relationship among the sets is shown as a Venn diagram over panas (fever), bintik2 (spots), and cacar (chickenpox).

13 Certainty Factors and Rule-Based Systems
Adding Certainty Factors to Rules. An example of a MYCIN rule:
If: (1) the stain of the organism is gram-positive, and
    (2) the morphology of the organism is coccus, and
    (3) the growth conformation of the organism is clumps,
then: there is suggestive evidence (0.7) that the identity of the organism is staphylococcus.

14 Certainty Factors and Rule-Based System
Or, in internal form:
PREMISE: ($AND (SAME CNTXT GRAM GRAMPOS)
               (SAME CNTXT MORPH COCCUS)
               (SAME CNTXT CONFORM CLUMPS))
ACTION:  (CONCLUDE CNTXT IDENT STAPHYLOCOCCUS TALLY 0.7)

15 Certainty Factors and Rule-Based System
Measures of belief and disbelief:
1. MB[h,e]: a measure (between 0 and 1) of belief in hypothesis h given the evidence e. MB measures the extent to which the evidence supports the hypothesis. It is zero if the evidence fails to support the hypothesis.
2. MD[h,e]: a measure (between 0 and 1) of disbelief in hypothesis h given the evidence e. MD measures the extent to which the evidence supports the negation of the hypothesis. It is zero if the evidence supports the hypothesis.
3. CF[h,e] = MB[h,e] - MD[h,e]

16 Certainty Factors and Rule-Based System
Combining Uncertain Rules

17 Certainty Factors and Rule-Based System
Goals for combining rules:
1. Since the order in which evidence is collected is arbitrary, the combining functions should be commutative and associative.
2. Until certainty is reached, additional confirming evidence should increase MB (and similarly for disconfirming evidence and MD).
3. If uncertain inferences are chained together, then the result should be less certain than either of the inferences alone.

18 Combining Two Pieces of Evidence
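The formulas for this slide did not survive in the text. For reference, the standard MYCIN combining functions for two pieces of evidence s1 and s2 are MB[h, s1 and s2] = MB[h, s1] + MB[h, s2] * (1 - MB[h, s1]), the same rule applied to MD, and CF[h, e] = MB[h, e] - MD[h, e]. A minimal Python sketch of these functions (ignoring the special case where the opposite measure has already reached 1):

    def combine_mb(mb1, mb2):
        # MB[h, s1 and s2] = MB[h, s1] + MB[h, s2] * (1 - MB[h, s1])
        # Commutative, associative, and monotonically increasing toward 1
        return mb1 + mb2 * (1.0 - mb1)

    def combine_md(md1, md2):
        # The same combination rule applied to the measures of disbelief
        return md1 + md2 * (1.0 - md1)

    def certainty_factor(mb, md):
        # CF[h, e] = MB[h, e] - MD[h, e]
        return mb - md

    # Hypothetical example: two rules support the same hypothesis with MB 0.3 and 0.6
    print(round(combine_mb(0.3, 0.6), 2))  # 0.72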

19 An Example of Combining Two Observations

20 The Definition of Certainty Factors
Original definitions: MB is the proportionate increase in belief in h as a result of the evidence e, and, similarly, MD is the proportionate decrease in belief in h as a result of e. But this definition is incompatible with Bayesian conditional probability; the following, slightly revised one, is not.

21 What if the Observations are not Independent
Scenario (a): Reconsider a rule with three antecedents and a CF of 0.7. Suppose that, if there were three separate rules instead, each would have had a CF of 0.6. In other words, the antecedents are not independent. Then, using the combining rules, the total would be:
0.6 + 0.6 * (1 - 0.6) = 0.84, and then 0.84 + 0.6 * (1 - 0.84) = 0.936
This is very different from 0.7.
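The same figure falls out of the combining sketch given after slide 18; this is only an illustrative check:

    def combine_mb(mb1, mb2):                  # same rule as in the earlier sketch
        return mb1 + mb2 * (1.0 - mb1)

    mb = 0.0
    for rule_cf in (0.6, 0.6, 0.6):            # three separate single-antecedent rules
        mb = combine_mb(mb, rule_cf)           # 0.6 -> 0.84 -> 0.936
    print(round(mb, 3))                        # 0.936, far from the 0.7 of the single rule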

22 What if the Observations are not Independent
Scenario (c): Events:
S: the sprinkler was on last night
W: the grass is wet
R: it rained last night
MYCIN-style rules:
If: the sprinkler was on last night, then there is suggestive evidence (0.9) that the grass will be wet this morning.
If: the grass is wet this morning, then there is suggestive evidence (0.8) that it rained last night.

23 What if the Observations are not Independent
Combining the rules, we get:
MB[W,S] = 0.9 {sprinkler suggests wet}
MB[R,S] = 0.8 * 0.9 = 0.72 {wet suggests rain, so sprinkler suggests rain}
So the sprinkler made us believe that it rained.
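Written out as a tiny illustrative sketch, the chaining that produces this misleading belief in rain is simply a product of the two rule strengths:

    mb_wet_given_sprinkler = 0.9        # rule 1: sprinkler suggests wet
    mb_rain_given_wet = 0.8             # rule 2: wet suggests rain
    # Chained inference: belief in rain obtained only via the intermediate conclusion "wet"
    mb_rain_given_sprinkler = mb_rain_given_wet * mb_wet_given_sprinkler
    print(round(mb_rain_given_sprinkler, 2))  # 0.72, even though the sprinkler is an alternative explanation for the wet grass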

24 8.3 Bayesian Networks
Bayesian Networks: Representing Causality Uniformly

25 Conditional Probabilities for a Bayesian Network
Attribute                        Probability
P(Wet | Sprinkler, Rain)         0.95
P(Wet | Sprinkler, ¬Rain)        0.9
P(Wet | ¬Sprinkler, Rain)        0.8
P(Wet | ¬Sprinkler, ¬Rain)       0.1
P(Sprinkler | RainySeason)       0.0
P(Sprinkler | ¬RainySeason)      1.0
P(Rain | RainySeason)            0.9
P(Rain | ¬RainySeason)           0.1
P(RainySeason)                   0.5
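As an illustration (not part of the original slides), the table can be used to answer queries by enumerating the joint distribution of the network RainySeason -> {Sprinkler, Rain} -> Wet. A minimal Python sketch, assuming the negated table entries are reconstructed as shown above:

    from itertools import product

    p_rainy_season = 0.5
    p_sprinkler = {True: 0.0, False: 1.0}              # P(Sprinkler | RainySeason)
    p_rain = {True: 0.9, False: 0.1}                   # P(Rain | RainySeason)
    p_wet = {(True, True): 0.95, (True, False): 0.9,   # P(Wet | Sprinkler, Rain)
             (False, True): 0.8, (False, False): 0.1}

    def joint(rainy, sprinkler, rain, wet):
        # Product of the conditional probabilities along the network structure
        p = p_rainy_season if rainy else 1.0 - p_rainy_season
        p *= p_sprinkler[rainy] if sprinkler else 1.0 - p_sprinkler[rainy]
        p *= p_rain[rainy] if rain else 1.0 - p_rain[rainy]
        p *= p_wet[(sprinkler, rain)] if wet else 1.0 - p_wet[(sprinkler, rain)]
        return p

    # Marginal probability that the grass is wet, summing out the other variables
    p_wet_total = sum(joint(rs, s, r, True) for rs, s, r in product([True, False], repeat=3))
    print(round(p_wet_total, 3))   # about 0.82 with the values above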

26 End of Session 10. Good Luck

