
MIT: Principles of Autonomous Decision Making (English) — Probabilistic Model

Resource type: document. Format: PDF, 16 pages, 2.38 MB.

Probabilistic Model

Estimate p(x_t):

$$\mathrm{Bel}(x_t) = p(x_t \mid z_t, a_{t-1}, z_{t-1}, a_{t-2}, \ldots, z_0)$$

Bayes' rule:

$$\mathrm{Bel}(x_t) = \frac{p(z_t \mid x_t, a_{t-1}, z_{t-1}, a_{t-2}, \ldots, z_0)\; p(x_t \mid a_{t-1}, z_{t-1}, a_{t-2}, \ldots, z_0)}{p(z_t \mid a_{t-1}, z_{t-1}, a_{t-2}, \ldots, z_0)}$$

$$= \alpha\, p(z_t \mid x_t)\; p(x_t \mid a_{t-1}, z_{t-1}, a_{t-2}, \ldots, z_0)$$
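The Bayes'-rule measurement update above can be sketched on a discrete state space. A minimal sketch, assuming a hypothetical three-cell world; the prior and sensor-likelihood values are made up for illustration:

```python
import numpy as np

# Prior belief over three discrete states (hypothetical three-cell world).
bel = np.array([1/3, 1/3, 1/3])

# Likelihood p(z_t | x_t) of the current measurement for each state
# (illustrative numbers, not from the slides).
likelihood = np.array([0.9, 0.2, 0.1])

# Bayes' rule: multiply the prior by the likelihood, then normalize.
# The normalizer alpha = 1 / p(z_t | a_{t-1}, z_{t-1}, ..., z_0).
posterior = likelihood * bel
posterior /= posterior.sum()

print(posterior)  # the state that best explains z_t gains probability mass
```

Note that alpha never has to be computed from the denominator explicitly; normalizing the product recovers it.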

Probabilistic Model

Integrate over all p(x_{t-1}):

$$\mathrm{Bel}(x_t) = \alpha\, p(z_t \mid x_t)\; p(x_t \mid a_{t-1}, z_{t-1}, a_{t-2}, \ldots, z_0)$$

$$= \alpha\, p(z_t \mid x_t) \int p(x_t \mid x_{t-1}, a_{t-1}, z_{t-1}, \ldots, z_0)\; p(x_{t-1} \mid a_{t-1}, z_{t-1}, \ldots, z_0)\, dx_{t-1}$$

$$= \alpha\, p(z_t \mid x_t) \int p(x_t \mid x_{t-1}, a_{t-1})\; \mathrm{Bel}(x_{t-1})\, dx_{t-1}$$
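On a discrete state space the integral over x_{t-1} becomes a sum, i.e., a matrix-vector product with the transition model. A minimal sketch with a hypothetical three-state transition matrix for one action (all numbers illustrative):

```python
import numpy as np

# Belief over three discrete states before the action (illustrative values).
bel_prev = np.array([0.8, 0.15, 0.05])

# Transition model T[i, j] = p(x_t = i | x_{t-1} = j, a_{t-1}) for a
# hypothetical "move right" action; each column sums to 1.
T = np.array([[0.1, 0.0, 0.0],
              [0.8, 0.1, 0.0],
              [0.1, 0.9, 1.0]])

# Prediction: bel_bar(x_t) = sum_j p(x_t | x_{t-1}=j, a_{t-1}) * bel_prev(j),
# the discrete analogue of the integral over x_{t-1}.
bel_bar = T @ bel_prev
print(bel_bar)  # still sums to 1; probability mass has shifted right
```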

Probabilistic Model

Bayes' filter gives a recursive, two-step procedure for estimating p(x_t):

$$\mathrm{Bel}(x_t) = \alpha \underbrace{p(z_t \mid x_t)}_{\text{Measurement}} \underbrace{\int p(x_t \mid x_{t-1}, a_{t-1})\; \mathrm{Bel}(x_{t-1})\, dx_{t-1}}_{\text{Prediction}}$$

How to represent Bel(x_t)?
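The recursive two-step procedure can be sketched for a discrete (histogram) representation of Bel(x_t). The transition and sensor models below are hypothetical placeholders:

```python
import numpy as np

def bayes_filter_step(bel, T, likelihood):
    """One recursive update: prediction with T[i, j] = p(x_t=i | x_{t-1}=j, a_{t-1}),
    then the measurement correction with likelihood[i] = p(z_t | x_t=i)."""
    bel_bar = T @ bel                # prediction (integral -> matrix product)
    bel_new = likelihood * bel_bar   # measurement update
    return bel_new / bel_new.sum()   # alpha normalizes the posterior

# Hypothetical three-state example (numbers chosen for illustration).
bel = np.ones(3) / 3
T = np.array([[0.2, 0.0, 0.0],
              [0.8, 0.2, 0.0],
              [0.0, 0.8, 1.0]])
bel = bayes_filter_step(bel, T, likelihood=np.array([0.1, 0.7, 0.2]))
print(bel)
```

Each call consumes only the previous belief, one action model, and one measurement, which is what makes the filter recursive.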

Kalman Filter (Kalman, 1960)

[Figure: state-space plots showing the initial belief, the posterior belief after an action is taken, and the posterior belief after sensing.]
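For Gaussian beliefs, the Bayes filter's two steps reduce to the Kalman filter. A one-dimensional sketch, with illustrative noise variances (the motion and sensor values are assumptions, not from the slides):

```python
# 1-D Kalman filter: the belief is a single Gaussian (mu, sigma2).

def kf_predict(mu, sigma2, u, q):
    """Prediction after action u, with motion-noise variance q:
    the mean shifts, and the variance grows."""
    return mu + u, sigma2 + q

def kf_update(mu, sigma2, z, r):
    """Measurement update with observation z and sensor-noise variance r:
    the mean moves toward z, and the variance shrinks."""
    k = sigma2 / (sigma2 + r)  # Kalman gain
    return mu + k * (z - mu), (1 - k) * sigma2

# Illustrative run: start uncertain, take an action (+1), then sense z = 1.2.
mu, sigma2 = 0.0, 1.0
mu, sigma2 = kf_predict(mu, sigma2, u=1.0, q=0.5)   # belief after an action
mu, sigma2 = kf_update(mu, sigma2, z=1.2, r=0.5)    # belief after sensing
print(mu, sigma2)  # mean lies between prediction and measurement
```

This mirrors the figure's progression: acting widens the belief, sensing narrows it.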

Problems

- Gaussian process and sensor noise
  - Often solved by extracting low-dimensional features
  - Data-association problem
- Kalman filters are often hard to implement
- The posterior estimate is restricted to a Gaussian

Global Localization

[Figure: state space showing the initial belief, the posterior belief after an action, and the posterior belief after sensing.]

Markov Localization (Burgard et al., 1996)

[Figure: grid-based state space showing the initial belief, the posterior belief after an action, and the posterior belief after sensing.]

Problems

- Large memory footprint
  - 50 m × 50 m map, 1° increments ≈ 343M
- Fixed cell size limits accuracy
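The ≈343M figure presumably counts grid cells: position cells times heading increments. A back-of-the-envelope check, assuming a hypothetical 5 cm cell size (the slides do not state the spatial resolution):

```python
# Rough cell count for a 50 m x 50 m map with 1-degree heading increments.
# The 0.05 m cell size is an assumption chosen to land in the right ballpark.
cell = 0.05
cells_per_side = round(50 / cell)        # 1000 position cells per axis
headings = 360                           # 1-degree increments
total = cells_per_side ** 2 * headings
print(total)                             # 360,000,000 -- same order as ~343M
```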

Monte Carlo Localization: The Particle Filter

[Figure: state space with randomly sampled particles.]

- Sample particles randomly from the distribution
- Carry around particle sets, rather than the full distribution
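Those two ideas can be sketched as a minimal 1-D particle filter. Everything here is illustrative: the state is a scalar position, and the motion and sensor models (a +1 move with noise, a Gaussian likelihood peaked at a hypothetical true position of 6.0) are made up:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# 1. Sample particles randomly from the initial (uniform) distribution.
particles = rng.uniform(0.0, 10.0, size=n)   # 1-D state for simplicity

# 2. Carry the particle set forward instead of the full distribution:
#    motion update (move +1 with noise), then weight each particle by a
#    sensor likelihood peaked at the hypothetical true position 6.0.
particles = particles + 1.0 + rng.normal(0.0, 0.2, size=n)
weights = np.exp(-0.5 * ((particles - 6.0) / 0.5) ** 2)
weights /= weights.sum()

# Resample: draw particles with replacement, in proportion to their weights.
particles = rng.choice(particles, size=n, p=weights)
print(particles.mean())  # the particle set concentrates near 6.0
```

Resampling is what keeps the set representative: low-weight particles die off and high-weight ones are duplicated.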

Using Particle Filters

- Can update the distribution by updating the particle set directly
- Can (sometimes) compute properties of the distribution directly from the particles
  - E.g., any moments: mean, variance, etc.
- If necessary, can recover a distribution from the particles
  - Fit a Gaussian by recovering the mean and covariance (Kalman filter)
  - Can fit multiple Gaussians using Expectation-Maximization
  - Can bin the data and recover a discrete multinomial (Markov localization)
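Recovering moments and a binned distribution from a weighted particle set can be sketched as follows (the particle values and weights are made up for illustration):

```python
import numpy as np

# A small, already-normalized weighted particle set (illustrative values).
particles = np.array([1.0, 2.0, 2.5, 3.0, 10.0])
weights = np.array([0.1, 0.3, 0.3, 0.25, 0.05])

# Moments directly from the particles:
mean = np.sum(weights * particles)
var = np.sum(weights * (particles - mean) ** 2)

# Fitting a single Gaussian (as for a Kalman-filter representation) is just
# taking (mean, var); binning recovers a discrete multinomial over cells
# (as in Markov localization).
hist, edges = np.histogram(particles, bins=3, weights=weights)
print(mean, var, hist)
```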
