Probabilistic Reasoning over Time Session 13


Course: Artificial Intelligence
Effective Period: September 2018

Time and Uncertainty
How do we estimate the probability of a random variable that changes over time?
When a car is broken, it remains broken throughout the diagnosis process (static).
A diabetic patient, on the other hand, has evidence that changes over time (blood sugar, insulin doses, etc.) (dynamic).
We view the world as a series of snapshots (time slices).
Xt denotes the set of state variables at time t.
Et denotes the set of observable evidence variables at time t.

Time and Uncertainty
How do we construct the Bayesian network? What is the transition model?
The unconstrained model is too complex, so we need an assumption (the Markov assumption): the current state depends on only a finite, fixed number of previous states (Markov chains).
First-order and second-order chains differ in how many previous states the current state depends on.

Time and Uncertainty
First-order Markov chain: P(Xt | Xt-1)
Second-order Markov chain: P(Xt | Xt-2, Xt-1)
Sensor Markov assumption: P(Et | X0:t, E0:t-1) = P(Et | Xt)
Stationary process: the transition model and sensor model are fixed over time.

Time and Uncertainty
The complete joint distribution combines the transition model and the sensor model:
P(X0:t, E1:t) = P(X0) ∏i P(Xi | Xi-1) P(Ei | Xi)
This is the Bayesian network for the umbrella world.

Markov Chains
First-order Markov chain P(Rt | Rt-1) of the umbrella world.
Probability of rain on day 0: P(R0) = [0.8 0.2]
Transition model:
P(Rt | Rt-1) =
[0.7 0.3]
[0.3 0.7]
Probability of rain on day 1:
P(R1) = P(R0) P(Rt | Rt-1)
P(R1) = [(0.8*0.7 + 0.2*0.3)  (0.8*0.3 + 0.2*0.7)]
P(R1) = [0.62 0.38]
So the probability of rain = true is 0.62 and rain = false is 0.38.
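The day-1 computation above can be sketched in a few lines of Python (a minimal illustration; the `predict` helper name is ours, not from the slides):

```python
# One prediction step of the umbrella-world Markov chain:
# P(R1) = P(R0) applied to the transition matrix T.
# States are ordered [rain, no_rain].

def predict(prior, T):
    """Propagate a distribution one step through transition matrix T."""
    n = len(T[0])
    return [sum(prior[i] * T[i][j] for i in range(len(prior)))
            for j in range(n)]

T = [[0.7, 0.3],   # row: previous state rain    -> P(rain), P(no rain)
     [0.3, 0.7]]   # row: previous state no rain -> P(rain), P(no rain)
P_R0 = [0.8, 0.2]  # prior on day 0

P_R1 = predict(P_R0, T)
print([round(p, 2) for p in P_R1])  # [0.62, 0.38]
```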

Markov Chains
Instead of using only the probability of the previous state, we can also incorporate the probability of the sensor reading.
Observation model: P(Ut | Rt) = [0.9 0.2] (probability of seeing the umbrella given rain / no rain)
Probability of rain on day 1 given the umbrella observation:
P(R1 | u1) = α P(U1 | R1) P(R0) P(Rt | Rt-1)
P(R1 | u1) = α [0.62 0.38] ⊙ [0.9 0.2]
P(R1 | u1) = α [0.558 0.076] ≈ [0.88 0.12] after normalization
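The predict-then-update step can be written as a small function (a sketch only; `filter_step` is a hypothetical name for one step of filtering as described above):

```python
# One filtering step for the umbrella world: predict with the transition
# model, weight by the sensor model, then normalize.
# States are ordered [rain, no_rain]; the evidence is umbrella = true.

def filter_step(prior, T, sensor):
    predicted = [sum(prior[i] * T[i][j] for i in range(2)) for j in range(2)]
    unnorm = [predicted[j] * sensor[j] for j in range(2)]  # weight by sensor
    z = sum(unnorm)                                        # normalizer alpha
    return [p / z for p in unnorm]

T = [[0.7, 0.3], [0.3, 0.7]]
P_R0 = [0.8, 0.2]
sensor_u = [0.9, 0.2]  # P(umbrella=true | rain), P(umbrella=true | no rain)

P_R1 = filter_step(P_R0, T, sensor_u)
print([round(p, 2) for p in P_R1])  # [0.88, 0.12]
```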

Markov Chains Example
A child with a lower-class parent has a 60% chance of remaining in the lower class, a 40% chance of rising to the middle class, and no chance of reaching the upper class.
A child with a middle-class parent has a 30% chance of falling to the lower class, a 40% chance of remaining middle class, and a 30% chance of rising to the upper class.
A child with an upper-class parent has no chance of falling to the lower class, a 70% chance of falling to the middle class, and a 30% chance of remaining in the upper class.

Markov Chains Example
Assume that 20% of the population belongs to the lower class, 30% to the middle class, and 50% to the upper class.

Markov Chains Example
Markov transition matrix (rows: parent's class; columns: child's class; order lower / middle / upper):
[0.6 0.4 0.0]
[0.3 0.4 0.3]
[0.0 0.7 0.3]
Initial condition: [0.2 0.3 0.5]

Markov Chains Solution
To illustrate, consider the population dynamics over the next four generations: multiply the current distribution by the transition matrix once per generation.
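The four-generation dynamics can be computed directly (a minimal sketch using the matrix and initial distribution from the example):

```python
# Population dynamics of the social-class Markov chain over 4 generations.
# Classes ordered [lower, middle, upper]; rows of T index the parent's class.

T = [[0.6, 0.4, 0.0],
     [0.3, 0.4, 0.3],
     [0.0, 0.7, 0.3]]
dist = [0.2, 0.3, 0.5]  # initial population shares

for gen in range(1, 5):
    # one generation = one step through the transition matrix
    dist = [sum(dist[i] * T[i][j] for i in range(3)) for j in range(3)]
    print(gen, [round(p, 4) for p in dist])
```

The first generation, for example, works out to [0.21, 0.55, 0.24]: the upper class shrinks because 70% of upper-class children fall to the middle class.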

Inference in Temporal Model
Inference tasks:
Filtering: computing the belief state, i.e. current state estimation, P(Xt | e1:t)
Prediction: computing the posterior distribution over a future state, P(Xt+k | e1:t)
Smoothing: computing the posterior distribution over a past state, P(Xk | e1:t)
Most likely explanation: finding the state sequence that best explains the evidence, P(X1:t | e1:t)

Inference in Temporal Model
Filtering and prediction: recursively update the distribution using a forward message computed from previous states.
Prediction can be seen simply as filtering without the addition of new evidence.

Inference in Temporal Model
Filtering: the forward recursion is
P(Xt+1 | e1:t+1) = α P(et+1 | Xt+1) Σxt P(Xt+1 | xt) P(xt | e1:t)
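The forward recursion can be run over a whole evidence sequence. This sketch uses a uniform prior P(R0) = [0.5 0.5] for day 0 (an assumption taken from the textbook's version of the umbrella example, not from the slide above):

```python
# Recursive forward (filtering) pass for the umbrella world:
# f_{1:t+1} = alpha * P(e_{t+1} | X_{t+1}) * sum_x P(X_{t+1} | x) f_{1:t}(x)

T = [[0.7, 0.3], [0.3, 0.7]]   # transition model, states [rain, no_rain]
sensor = {True: [0.9, 0.2],    # P(umbrella=true  | rain / no rain)
          False: [0.1, 0.8]}   # P(umbrella=false | rain / no rain)

def forward(f, evidence):
    predicted = [sum(f[i] * T[i][j] for i in range(2)) for j in range(2)]
    unnorm = [predicted[j] * sensor[evidence][j] for j in range(2)]
    z = sum(unnorm)
    return [p / z for p in unnorm]

f = [0.5, 0.5]                 # uniform prior over R0 (assumed)
for e in [True, True]:         # umbrella observed on days 1 and 2
    f = forward(f, e)
    print([round(p, 3) for p in f])  # day 1: [0.818, 0.182], day 2: [0.883, 0.117]
```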

Inference in Temporal Model
Smoothing: the process of computing the distribution over past states given evidence up to the present.
The computation can be split into two parts: a forward message (evidence up to k) and a backward message (evidence from k+1 to t):
P(Xk | e1:t) = α f1:k × bk+1:t

Inference in Temporal Model
Smoothing: the backward message is computed by the recursion
bk+1:t = Σxk+1 P(ek+1 | xk+1) bk+2:t(xk+1) P(xk+1 | Xk)

Inference in Temporal Model Smoothing
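The forward and backward messages can be combined in code. This sketch smooths day 1 of the umbrella world given umbrella observations on days 1 and 2, again assuming the textbook's uniform prior for day 0:

```python
# Forward-backward smoothing for the umbrella world with evidence
# [umbrella on day 1, umbrella on day 2], smoothing day 1.

T = [[0.7, 0.3], [0.3, 0.7]]
sensor = {True: [0.9, 0.2], False: [0.1, 0.8]}

def forward(f, e):
    pred = [sum(f[i] * T[i][j] for i in range(2)) for j in range(2)]
    unnorm = [pred[j] * sensor[e][j] for j in range(2)]
    z = sum(unnorm)
    return [p / z for p in unnorm]

def backward(b, e):
    # b_{k+1:t}(x_k) = sum_{x'} P(e_{k+1}|x') b_{k+2:t}(x') P(x'|x_k)
    return [sum(sensor[e][j] * b[j] * T[i][j] for j in range(2))
            for i in range(2)]

evidence = [True, True]
f, fs = [0.5, 0.5], []        # uniform prior over R0 (assumed)
for e in evidence:
    f = forward(f, e)
    fs.append(f)

b = backward([1.0, 1.0], evidence[1])   # backward message from day 2
unnorm = [fs[0][i] * b[i] for i in range(2)]
z = sum(unnorm)
smoothed = [p / z for p in unnorm]
print([round(p, 3) for p in smoothed])  # [0.883, 0.117]
```

Note that the smoothed estimate for day 1 (0.883) is higher than the filtered estimate (0.818): the second umbrella observation retroactively makes rain on day 1 more likely.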

Inference in Temporal Model
Most likely explanation: suppose [true, true, false, true, true] is the umbrella sequence for the security guard's first five days on the job.
What weather sequence is most likely to explain this?
Here, we want to find the sequence with the highest probability. How?

Inference in Temporal Model
Most likely explanation: there is a recursive relationship between the most likely paths to each state xt+1 and the most likely paths to each state xt (the Markov property). Thus we can write the relationship as
maxx1..xt P(x1..xt, Xt+1 | e1:t+1) = α P(et+1 | Xt+1) maxxt [ P(Xt+1 | xt) maxx1..xt-1 P(x1..xt-1, xt | e1:t) ]

Inference in Temporal Model Most likely explanation
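The recursion above is the Viterbi algorithm. A minimal sketch for the five-day umbrella sequence (uniform day-0 prior assumed, as in the textbook; the `viterbi` helper is our own):

```python
# Viterbi: most likely weather sequence for the umbrella world.

T = [[0.7, 0.3], [0.3, 0.7]]
sensor = {True: [0.9, 0.2], False: [0.1, 0.8]}
states = ["rain", "no_rain"]

def viterbi(evidence, prior=(0.5, 0.5)):
    # m[j] = probability of the most likely path ending in state j
    m = [prior[j] * sensor[evidence[0]][j] for j in range(2)]
    back = []                              # back-pointers per time step
    for e in evidence[1:]:
        new_m, ptr = [], []
        for j in range(2):
            best_i = max(range(2), key=lambda i: m[i] * T[i][j])
            ptr.append(best_i)
            new_m.append(m[best_i] * T[best_i][j] * sensor[e][j])
        m, back = new_m, back + [ptr]
    # follow back-pointers from the best final state
    j = max(range(2), key=lambda i: m[i])
    path = [j]
    for ptr in reversed(back):
        j = ptr[j]
        path.append(j)
    return [states[j] for j in reversed(path)]

print(viterbi([True, True, False, True, True]))
# ['rain', 'rain', 'no_rain', 'rain', 'rain']
```

The guard's best explanation is that it rained every day except day 3, when the umbrella was absent.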

Hidden Markov Models
Simple Markov models: the observer knows the state directly.
Hidden Markov models: the observer knows the state only indirectly (through an output state or observed data).
The umbrella world is an HMM, since the security guard can only infer the rain state from whether the director carries an umbrella.

Hidden Markov Models
A chain of hidden states H1, H2, ..., HL, each emitting an item of observed data X1, X2, ..., XL.

Hidden Markov Models
Example: a coin HMM with hidden states fair and loaded, and observations head (H) or tail (T).
Transition probabilities: P(fair → fair) = 0.9, P(fair → loaded) = 0.1, P(loaded → loaded) = 0.9, P(loaded → fair) = 0.1.
Emission probabilities: fair coin P(H) = P(T) = 1/2; loaded coin P(H) = 3/4, P(T) = 1/4.
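The coin HMM can be used generatively: sample a hidden state sequence and the flips it emits (a sketch; the helper names and the fixed seed are our own choices):

```python
# Generative sketch of the fair/loaded coin HMM: the observer sees only
# the emitted H/T sequence, never the hidden fair/loaded states.
import random

random.seed(0)  # fixed seed for reproducibility (arbitrary choice)

TRANS = {"fair":   {"fair": 0.9, "loaded": 0.1},
         "loaded": {"loaded": 0.9, "fair": 0.1}}
EMIT = {"fair":   {"H": 0.5,  "T": 0.5},
        "loaded": {"H": 0.75, "T": 0.25}}

def sample(dist):
    """Draw one outcome from a {outcome: probability} dict."""
    r, acc = random.random(), 0.0
    for outcome, p in dist.items():
        acc += p
        if r < acc:
            return outcome
    return outcome

def generate(length, start="fair"):
    state, hidden, observed = start, [], []
    for _ in range(length):
        hidden.append(state)
        observed.append(sample(EMIT[state]))  # emit a flip
        state = sample(TRANS[state])          # move to next hidden state
    return hidden, observed

hidden, observed = generate(10)
print(hidden)    # e.g. mostly 'fair' runs with occasional 'loaded' stretches
print(observed)  # the observer sees only this H/T sequence
```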

Hidden Markov Models We don’t know the location, but we know the output of the sensors

Dynamic Bayesian Network
A dynamic Bayesian network (DBN) is a Bayesian network that represents a temporal probability model.
Example: the umbrella world.
Every HMM is a DBN with a single state variable and a single evidence variable; conversely, every discrete-variable DBN can be represented as an HMM.
The relation between HMMs and DBNs is analogous to the relation between Bayesian networks and full tabulated joint distributions.

Dynamic Bayesian Network
To construct a DBN, we must specify three kinds of information:
The prior distribution over the state variables, P(X0)
The transition model, P(Xt+1 | Xt)
The sensor model, P(Et | Xt)

Dynamic Bayesian Network Example
Monitoring a battery-powered robot moving in the X-Y plane.
State variables: position and velocity; battery charge level.
Evidence variables: position measurement; battery meter reading.
Exercise: describe the relations between these states!

DBN vs HMM

Dynamic Bayesian Network
Inference in DBNs: unrolling a dynamic Bayesian network into a full Bayesian network and applying standard inference.

References
Stuart Russell and Peter Norvig. 2010. Artificial Intelligence: A Modern Approach. Pearson Education, New Jersey. ISBN: 9780132071482.

Quiz
A survey was conducted in a city with 1000 families: 600 families are customers of the "SERBA" department store and 400 are customers of the "ADA" department store. During that month it was found that:
Of the 600 "SERBA" customers, 400 families kept shopping at "SERBA" and the other 200 shopped at "ADA".
Of the 400 "ADA" customers, 150 families kept shopping at "ADA", while the other 250 shopped at "SERBA".
Compute:
The transition probability matrix for this problem
The probabilities for "SERBA" and "ADA" in the third month, given that a family shops at "SERBA" in the first month
The probabilities for "SERBA" and "ADA" in the third month, given that a family shops at "ADA" in the first month
The steady-state customer probabilities
The expected long-run number of customers for each store

Answer
Transition probabilities:
P(first month "SERBA", second month "SERBA") = 400/600 = 0.667
P(first month "SERBA", second month "ADA") = 200/600 = 0.333
P(first month "ADA", second month "SERBA") = 250/400 = 0.625
P(first month "ADA", second month "ADA") = 150/400 = 0.375
Transition matrix:
        SERBA  ADA
SERBA   0.667  0.333
ADA     0.625  0.375
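The remaining quiz answers can be computed from the transition matrix. A sketch (the `step` helper is our own; the steady state is found by simple iteration rather than solving the eigenvector equation):

```python
# Quiz answers: third-month probabilities and the long-run (steady-state)
# distribution for the SERBA/ADA customer chain. States: [SERBA, ADA].

T = [[400/600, 200/600],   # from SERBA: stay at SERBA / switch to ADA
     [250/400, 150/400]]   # from ADA:   switch to SERBA / stay at ADA

def step(dist):
    return [sum(dist[i] * T[i][j] for i in range(2)) for j in range(2)]

# Month 3 = two transitions after month 1.
from_serba = step(step([1.0, 0.0]))
from_ada = step(step([0.0, 1.0]))
print([round(p, 3) for p in from_serba])  # [0.653, 0.347]
print([round(p, 3) for p in from_ada])    # [0.651, 0.349]

# Steady state: iterate until the distribution stops changing.
dist = [0.6, 0.4]  # initial shares (600 and 400 of 1000 families)
for _ in range(100):
    dist = step(dist)
print([round(p, 3) for p in dist],
      "-> long-run customers:", [round(1000 * p) for p in dist])
# [0.652, 0.348] -> long-run customers: [652, 348]
```

So regardless of the starting store, the chain converges to roughly 65% SERBA and 35% ADA, i.e. about 652 versus 348 of the 1000 families in the long run.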