Particle filtering contd.

Assume consistent at time t:  N(x_t | e_{1:t}) / N = P(x_t | e_{1:t})

Propagate forward: populations of x_{t+1} are
    N(x_{t+1} | e_{1:t}) = Σ_{x_t} P(x_{t+1} | x_t) N(x_t | e_{1:t})

Weight samples by their likelihood for e_{t+1}:
    W(x_{t+1} | e_{1:t+1}) = P(e_{t+1} | x_{t+1}) N(x_{t+1} | e_{1:t})

Resample to obtain populations proportional to W:
    N(x_{t+1} | e_{1:t+1}) / N
      = α W(x_{t+1} | e_{1:t+1})
      = α P(e_{t+1} | x_{t+1}) N(x_{t+1} | e_{1:t})
      = α P(e_{t+1} | x_{t+1}) Σ_{x_t} P(x_{t+1} | x_t) N(x_t | e_{1:t})
      = α′ P(e_{t+1} | x_{t+1}) Σ_{x_t} P(x_{t+1} | x_t) P(x_t | e_{1:t})
      = P(x_{t+1} | e_{1:t+1})

Chapter 15, Sections 1–5

Particle filtering performance

Approximation error of particle filtering remains bounded over time,
at least empirically; theoretical analysis is difficult.

[Figure: average absolute error vs. time step, comparing likelihood
weighting LW(25), LW(100), LW(1000), LW(10000) with particle filtering
ER/SOF(25)]

Summary

Temporal models use state and sensor variables replicated over time

Markov assumptions and stationarity assumption, so we need
    – transition model P(X_t | X_{t−1})
    – sensor model P(E_t | X_t)

Tasks are filtering, prediction, smoothing, most likely sequence;
all done recursively with constant cost per time step

Hidden Markov models have a single discrete state variable; used for speech recognition

Kalman filters allow n state variables, linear Gaussian, O(n³) update

Dynamic Bayes nets subsume HMMs, Kalman filters; exact update intractable

Particle filtering is a good approximate filtering algorithm for DBNs
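The propagate–weight–resample update derived above can be sketched in a few lines. This is a minimal illustration, not the book's code; the two-state "rain/umbrella" numbers and the helper names `transition_sample` and `sensor_likelihood` are assumptions made for the example.

```python
import random

def particle_filter_step(particles, evidence, transition_sample, sensor_likelihood):
    """One particle filtering update: propagate, weight, resample."""
    # Propagate forward: sample x_{t+1} ~ P(X_{t+1} | x_t) for each particle
    propagated = [transition_sample(x) for x in particles]
    # Weight each sample by the sensor likelihood P(e_{t+1} | x_{t+1})
    weights = [sensor_likelihood(evidence, x) for x in propagated]
    # Resample N particles with probability proportional to the weights
    return random.choices(propagated, weights=weights, k=len(particles))

# Hypothetical two-state model: state 1 = "rain", state 0 = "no rain".
def transition_sample(x):
    # P(rain_{t+1} | rain_t) = 0.7, P(rain_{t+1} | no rain_t) = 0.3
    return int(random.random() < (0.7 if x == 1 else 0.3))

def sensor_likelihood(e, x):
    # P(umbrella | rain) = 0.9, P(umbrella | no rain) = 0.2
    p = 0.9 if x == 1 else 0.2
    return p if e else 1.0 - p

random.seed(0)
particles = [random.choice([0, 1]) for _ in range(20000)]  # uniform prior
particles = particle_filter_step(particles, True, transition_sample, sensor_likelihood)
estimate = sum(particles) / len(particles)
```

With these numbers the exact posterior P(rain_1 | umbrella_1) is 0.45/0.55 ≈ 0.818, and `estimate` should land close to it; the sample counts N(x_{t+1} | e_{1:t+1}) track the filtered distribution exactly as in the consistency argument above.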