Yeni Herdiyeni Departemen Ilmu Komputer

Presentation titled: "Yeni Herdiyeni Departemen Ilmu Komputer". Presentation transcript:

1 Bayesian Networks
Yeni Herdiyeni, Departemen Ilmu Komputer

2 Overview
Decision-theoretic techniques
  Explicit management of uncertainty and tradeoffs
  Probability theory
  Maximization of expected utility
Applications to AI problems
  Diagnosis
  Expert systems
  Planning
  Learning

3 Course Contents
Concepts in Probability
  Random variables
  Basic properties (Bayes rule)
Bayesian Networks
Inference
Applications

4 Probabilities
Probability distribution P(X | ξ)
X is a random variable: discrete or continuous
ξ is the background state of information

5 Discrete Random Variables
Finite set of possible outcomes: P(xi) ≥ 0 for each outcome, and Σi P(xi) = 1
X binary: P(X=0) + P(X=1) = 1

6 Continuous Random Variable
Probability distribution (density function) over continuous values: p(x) ≥ 0 and ∫ p(x) dx = 1

7 More Probabilities
Joint P(X=x, Y=y): probability that both X=x and Y=y
Conditional P(X=x | Y=y): probability that X=x given we know that Y=y

8 Rules of Probability
Product Rule: P(x, y) = P(x | y) P(y)
Marginalization: P(y) = Σx P(x, y); for X binary, P(y) = P(x, y) + P(¬x, y)

9 Bayes Rule
P(x | y) = P(y | x) P(x) / P(y)
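Bayes rule can be checked numerically for a binary variable. The prior and likelihood values in this sketch are made up purely for illustration:

```python
# Bayes rule: P(x | y) = P(y | x) P(x) / P(y), with P(y) obtained by
# marginalization: P(y) = P(y | x) P(x) + P(y | ~x) P(~x).
# All numbers here are hypothetical, chosen only to illustrate the rule.

def bayes(p_x, p_y_given_x, p_y_given_not_x):
    """Return P(x | y) for a binary variable X and observed evidence y."""
    p_y = p_y_given_x * p_x + p_y_given_not_x * (1 - p_x)
    return p_y_given_x * p_x / p_y

posterior = bayes(p_x=0.3, p_y_given_x=0.9, p_y_given_not_x=0.2)
print(round(posterior, 4))  # 0.27 / (0.27 + 0.14) = 0.6585
```

Note that the denominator is itself an application of the marginalization rule from the previous slide.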

10 Course Contents
Concepts in Probability
Bayesian Networks
  Basics
  Additional structure
  Knowledge acquisition
Inference
Decision making
Learning networks from data
Reasoning over time
Applications

11 Bayesian networks
Basics
  Structured representation
  Conditional independence
  Naïve Bayes model
  Independence facts

12 Naïve Bayes vs Bayesian Network
Naïve Bayes ignores correlations among the input variables, whereas in a Bayesian Network the input variables may depend on one another.

13 Bayesian Networks: Smoking → Cancer

P(S=no) = 0.80    P(S=light) = 0.15    P(S=heavy) = 0.05

                  Smoking = no   light   heavy
P(C=none   | S)       0.96       0.88    0.60
P(C=benign | S)       0.03       0.08    0.25
P(C=malig  | S)       0.01       0.04    0.15

14 Product Rule P(C,S) = P(C|S) P(S)

15 Marginalization
P(Smoke=s) = Σc P(c, s)    P(Cancer=c) = Σs P(c, s)

16 Bayes Rule Revisited

                  Cancer = none   benign   malignant
P(S=no    | C)        0.821       0.522     0.421
P(S=light | C)        0.141       0.261     0.316
P(S=heavy | C)        0.037       0.217     0.263
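The pipeline of slides 13 through 16 (CPTs → product rule → marginalization → Bayes rule) can be sketched in a few lines; rounded posteriors computed this way may differ slightly from the numbers printed on the slide:

```python
# CPTs from the Smoking -> Cancer slide.
p_s = {"no": 0.80, "light": 0.15, "heavy": 0.05}
p_c_given_s = {
    "no":    {"none": 0.96, "benign": 0.03, "malig": 0.01},
    "light": {"none": 0.88, "benign": 0.08, "malig": 0.04},
    "heavy": {"none": 0.60, "benign": 0.25, "malig": 0.15},
}

# Product rule: P(C,S) = P(C|S) P(S)
joint = {(c, s): p_c_given_s[s][c] * p_s[s]
         for s in p_s for c in p_c_given_s[s]}

# Marginalization: P(C=c) = sum over s of P(c, s)
p_c = {c: sum(joint[(c, s)] for s in p_s)
       for c in ("none", "benign", "malig")}

# Bayes rule: P(S=s | C=c) = P(c, s) / P(c)
p_s_given_c = {(s, c): joint[(c, s)] / p_c[c] for (c, s) in joint}

print(round(p_c["none"], 3))                  # 0.93
print(round(p_s_given_c[("no", "none")], 3))  # 0.826
```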

17 A Bayesian Network
Nodes: Age, Gender, Exposure to Toxics, Smoking, Cancer, Serum Calcium, Lung Tumor

18 Independence
Age and Gender are independent: P(A,G) = P(G) P(A)
P(A|G) = P(A)   (A ⊥ G)
P(G|A) = P(G)   (G ⊥ A)
P(A,G) = P(G|A) P(A) = P(G) P(A)
P(A,G) = P(A|G) P(G) = P(A) P(G)

19 Conditional Independence
Cancer is independent of Age and Gender given Smoking:
P(C | A, G, S) = P(C | S)   (C ⊥ A, G | S)

20 More Conditional Independence: Naïve Bayes
Serum Calcium and Lung Tumor are dependent.
Serum Calcium is independent of Lung Tumor given Cancer: P(L | SC, C) = P(L | C)

21 More Conditional Independence: Explaining Away
Exposure to Toxics and Smoking are independent: E ⊥ S
Exposure to Toxics is dependent on Smoking given Cancer:
P(E=heavy | C=malignant) > P(E=heavy | C=malignant, S=heavy)

22 Put it all together Smoking Gender Age Cancer Lung Tumor Serum Calcium
Exposure to Toxics

23 General Product (Chain) Rule for Bayesian Networks
P(X1, …, Xn) = Πi P(Xi | Pai), where Pai = parents(Xi)
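Under the chain rule, any joint entry is just a product of local CPT lookups, one per node. A minimal sketch with a hypothetical two-parent network (S, E → C) and made-up CPT values:

```python
from itertools import product

# Chain rule for Bayesian networks: P(x1,...,xn) = prod_i P(xi | parents(Xi)).
# The structure and all CPT numbers below are illustrative assumptions.
parents = {"S": (), "E": (), "C": ("S", "E")}  # S, E -> C
cpt = {
    "S": {(): {True: 0.2, False: 0.8}},
    "E": {(): {True: 0.1, False: 0.9}},
    "C": {(True, True):   {True: 0.5,  False: 0.5},
          (True, False):  {True: 0.3,  False: 0.7},
          (False, True):  {True: 0.2,  False: 0.8},
          (False, False): {True: 0.01, False: 0.99}},
}

def joint(assignment):
    """P(assignment) as a product of local conditional probabilities."""
    p = 1.0
    for var, pa in parents.items():
        pa_vals = tuple(assignment[q] for q in pa)
        p *= cpt[var][pa_vals][assignment[var]]
    return p

# Sanity check: the joint defined this way sums to 1 over all assignments.
total = sum(joint(dict(zip("SEC", vals)))
            for vals in product([True, False], repeat=3))
print(round(total, 6))  # 1.0
```

The network needs only 1 + 1 + 4 numbers here, instead of the 7 independent entries of the full joint table; that gap is what "compact representation" means on the later slides.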

24 Conditional Independence
A variable (node) is conditionally independent of its non-descendants given its parents.
Example: Cancer is independent of Age and Gender (non-descendants) given Exposure to Toxics and Smoking (its parents); Serum Calcium and Lung Tumor are its descendants.

25 Another non-descendant
Cancer is independent of Diet given Exposure to Toxics and Smoking.

26 Bayesian Network A Bayesian Network (also called a Belief Network or Probabilistic Network) is a graphical model for representing interactions among variables. A Bayesian Network is drawn as a graph consisting of nodes and arcs. A node represents a variable, say X, together with its probability P(X), and an arc represents a relationship between nodes. An arc from node X to node Y indicates that variable X influences variable Y; this influence is expressed by the conditional probability P(Y|X).

27 Bayesian Network - Exercise
Yeni Herdiyeni

28 Bayesian Network From the figure, the joint probability P(R,W) can be determined. If P(R) = 0.4, then P(~R) = 0.6, and suppose P(~W|~R) = 0.8. Bayes' rule can then be used to make a diagnosis.

29 Bayesian Network For example, if the grass is known to be wet, the probability of rain can be computed as follows:

30 Bayesian Network What is the probability that the grass is wet if the Sprinkler is on (whether it rained is unknown)?

31 Bayesian Network What is the probability that the Sprinkler is on given that the grass is wet, P(S|W)?

32 Bayesian Network If it is known to be raining, what is the probability that the Sprinkler is on?

33 Bayesian Network What if we add the assumption: if the weather is cloudy, the Sprinkler is most likely off?

34 Bayesian Network What is the probability that the grass is wet given that it is cloudy?

35 Bayesian Network What is the probability that the grass is wet given that it is cloudy?

36 Exercise Suppose a cat likes to walk on the roof and make noise, and if it rains the cat does not go out. What is the probability that we will hear the cat on the roof if the weather is Cloudy, P(F|C)?
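All of the sprinkler queries in slides 28 through 35 can be answered by enumerating the full joint. The slides only state P(R) = 0.4 and P(~W|~R) = 0.8, so every CPT value in the sketch below (for a Cloudy → Sprinkler/Rain → WetGrass network) is an assumption chosen for illustration. With these numbers the explaining-away effect is visible: once the grass is seen to be wet, additionally learning that it rained lowers the probability that the sprinkler was on.

```python
from itertools import product

# Cloudy -> {Sprinkler, Rain} -> WetGrass. All CPT values are assumed
# for illustration; the slides themselves only give P(R) and P(~W|~R).
p_c = {1: 0.5, 0: 0.5}                 # P(C)
p_s_c = {1: 0.1, 0: 0.5}               # P(S=1 | C)
p_r_c = {1: 0.8, 0: 0.2}               # P(R=1 | C)
p_w_sr = {(1, 1): 0.99, (1, 0): 0.9,   # P(W=1 | S, R)
          (0, 1): 0.9,  (0, 0): 0.0}

def joint(c, s, r, w):
    """Chain rule: P(c, s, r, w) = P(c) P(s|c) P(r|c) P(w|s,r)."""
    ps = p_s_c[c] if s else 1 - p_s_c[c]
    pr = p_r_c[c] if r else 1 - p_r_c[c]
    pw = p_w_sr[(s, r)] if w else 1 - p_w_sr[(s, r)]
    return p_c[c] * ps * pr * pw

def query(target, evidence):
    """P(target | evidence) by enumerating all assignments of C, S, R, W."""
    num = den = 0.0
    for c, s, r, w in product((0, 1), repeat=4):
        a = {"C": c, "S": s, "R": r, "W": w}
        if any(a[k] != v for k, v in evidence.items()):
            continue
        p = joint(c, s, r, w)
        den += p
        if all(a[k] == v for k, v in target.items()):
            num += p
    return num / den

print(round(query({"S": 1}, {"W": 1}), 3))          # 0.43
print(round(query({"S": 1}, {"W": 1, "R": 1}), 3))  # 0.194 (explaining away)
```

The same `query` function answers the cat exercise once the Cat node and its CPT are added to the network.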

37 Course Contents
Concepts in Probability
Bayesian Networks
Inference
Decision making
Learning networks from data
Reasoning over time
Applications

38 Inference
Patterns of reasoning
Basic inference
Exact inference
Exploiting structure
Approximate inference

39

40 Predictive Inference
How likely are elderly males to get malignant cancer?
P(C=malignant | Age>60, Gender=male)

41 Combined
How likely is an elderly male patient with high Serum Calcium to have malignant cancer?
P(C=malignant | Age>60, Gender=male, Serum Calcium=high)

42 Explaining away
If we see a lung tumor, the probability of heavy smoking and of exposure to toxics both go up.
If we then observe heavy smoking, the probability of exposure to toxics goes back down.

43 Inference in Belief Networks
Find P(Q=q | E=e); Q is the query variable, E the set of evidence variables.
P(q | e) = P(q, e) / P(e)
With X1, …, Xn the network variables other than Q and E:
P(q, e) = Σx1,…,xn P(q, e, x1, …, xn)

44 Basic Inference
Network A → B: P(b) = ?

45 Product Rule
Network S → C: P(C, S) = P(C | S) P(S)

46 Marginalization
P(Smoke=s) = Σc P(c, s)    P(Cancer=c) = Σs P(c, s)

47 Basic Inference
Network A → B → C:
P(b) = Σa P(a, b) = Σa P(b | a) P(a)
P(c) = Σb P(c | b) P(b)
P(c) = Σb,a P(a, b, c) = Σb,a P(c | b) P(b | a) P(a) = Σb P(c | b) Σa P(b | a) P(a)
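The sum-pushing on slide 47 is the basic step of variable elimination: compute P(b) once and reuse it, instead of enumerating every (a, b) pair for each value of c. A sketch for the chain A → B → C, with assumed binary CPTs:

```python
# Chain A -> B -> C with assumed binary CPTs (illustrative numbers only).
p_a = {1: 0.6, 0: 0.4}
p_b_a = {1: 0.7, 0: 0.2}   # P(B=1 | A)
p_c_b = {1: 0.9, 0: 0.3}   # P(C=1 | B)

# Push the sum over a inward: P(b) = sum_a P(b|a) P(a)
p_b = {b: sum((p_b_a[a] if b else 1 - p_b_a[a]) * p_a[a] for a in (0, 1))
       for b in (0, 1)}

# Then P(c=1) = sum_b P(c=1|b) P(b)
p_c1 = sum(p_c_b[b] * p_b[b] for b in (0, 1))

# Brute-force summation over the full joint gives the same answer.
brute = sum(p_a[a]
            * (p_b_a[a] if b else 1 - p_b_a[a])
            * p_c_b[b]
            for a in (0, 1) for b in (0, 1))
print(abs(p_c1 - brute) < 1e-12)  # True
```

On a chain of n binary nodes this turns an O(2^n) enumeration into n small local sums.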

48 Inference in trees
Network Y1 → X ← Y2:
P(x) = Σy1,y2 P(x | y1, y2) P(y1, y2)
     = Σy1,y2 P(x | y1, y2) P(y1) P(y2)   (because of the independence of Y1, Y2)

49 Course Contents
Concepts in Probability
Bayesian Networks
Inference
Learning networks from data
Reasoning over time
Applications

50 Course Contents
Concepts in Probability
Bayesian Networks
Inference
Applications

51 Applications
Medical expert systems
  Pathfinder
  Parenting MSN
Fault diagnosis
  Ricoh FIXIT
  Decision-theoretic troubleshooting
Vista
Collaborative filtering

52 Why use Bayesian Networks?
Explicit management of uncertainty/tradeoffs
Modularity implies maintainability
Better, flexible, and robust recommendation strategies

53 Pathfinder
Pathfinder is one of the first BN systems. It performs diagnosis of lymph-node diseases. It deals with over 60 diseases and 100 findings. Commercialized by Intellipath and Chapman Hall publishing and applied to about 20 tissue types.

54 Commercial system: Integration
Expert System with advanced diagnostic capabilities:
  uses key features to form the differential diagnosis
  recommends additional features to narrow the differential diagnosis
  recommends features needed to confirm the diagnosis
  explains correct and incorrect decisions
Video atlases and text organized by organ system
"Carousel Mode" to build customized lectures
Anatomic Pathology Information System

55 On Parenting: Selecting problem
Diagnostic indexing for Home Health site on Microsoft Network Enter symptoms for pediatric complaints Recommends multimedia content

56 On Parenting : MSN Original Multiple Fault Model

57 Single Fault approximation

58 On Parenting: Selecting problem

59 Performing diagnosis/indexing

60 RICOH Fixit Diagnostics and information retrieval

61 FIXIT: Ricoh copy machine

62 Online Troubleshooters

63 Define Problem

64 Gather Information

65 Get Recommendations

66 Vista Project: NASA Mission Control
Decision-theoretic methods for display for high-stakes aerospace decisions

67 What is Collaborative Filtering?
A way to find cool websites, news stories, music artists etc Uses data on the preferences of many users, not descriptions of the content. Firefly, Net Perceptions (GroupLens), and others offer this technology.

68 Bayesian Clustering for Collaborative Filtering
Probabilistic summary of the data Reduces the number of parameters to represent a set of preferences Provides insight into usage patterns. Inference: P(Like title i | Like title j, Like title k)

69 Applying Bayesian clustering
user classes: class1, class2, …

           class1         class2
title1   p(like)=0.2    p(like)=0.8
title2   p(like)=0.7    p(like)=0.1
title3   p(like)=0.99   p(like)=0.01
...

70 Richer model
Age, Gender, Likes soaps → User class → Watches Seinfeld, Watches NYPD Blue, Watches Power Rangers

71 What's old?
Decision theory & probability theory provide:
principled models of belief and preference;
techniques for: integrating evidence (conditioning); optimal decision making (max. expected utility); targeted information gathering (value of info.); parameter estimation from data.

72 What's new?
Bayesian networks exploit domain structure to allow compact representations of complex models.
Knowledge Acquisition, Learning, Structured Representation, Inference

73 Some Important AI Contributions
Key technology for diagnosis. Better more coherent expert systems. New approach to planning & action modeling: planning using Markov decision problems; new framework for reinforcement learning; probabilistic solution to frame & qualification problems. New techniques for learning models from data.

74 What's in our future?
Structured Representation
Better models for: preferences & utilities; not-so-precise numerical probabilities.
Inferring causality from data.
More expressive representation languages: structured domains with multiple objects; levels of abstraction; reasoning about time; hybrid (continuous/discrete) models.

