The Unscented Particle Filter
2000/09/29
이 시은
Introduction
• Filtering – estimate the states (parameters or hidden variables) as a set of observations becomes available on-line
• To solve it – model the evolution of the system and the noise
• Resulting models – non-linearity and non-Gaussian distributions
• Extended Kalman filter – linearizes the measurement and evolution models using a Taylor series expansion
• Unscented Kalman Filter – does not apply to general non-Gaussian distributions
• Sequential Monte Carlo methods: particle filters
  – represent the posterior distribution of the states
  – any statistical estimate can be computed
  – deal with nonlinearities and non-Gaussian distributions
• Particle Filter – relies on importance sampling, so the design of the proposal distribution is key
• Proposals for the Particle Filter
  – EKF Gaussian approximation
  – UKF proposal
    • controls the rate at which the tails go to zero
    • heavy-tailed distributions
Dynamic State Space Model
• Transition equation and measurement equation
  $x_t \sim p(x_t \mid x_{t-1}), \qquad y_t \sim p(y_t \mid x_t)$
• Goal – approximate the posterior $p(x_{0:t} \mid y_{1:t})$ and one of its marginals, the filtering density $p(x_t \mid y_{1:t})$, recursively
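The two model densities above can be sketched with a small simulator. The scalar transition and measurement functions and all noise levels below are illustrative assumptions, not taken from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical scalar model, for illustration only:
#   transition:  x_t = 0.5 * x_{t-1} + v_t,  v_t ~ N(0, 1)    -> p(x_t | x_{t-1})
#   measurement: y_t = x_t**2 / 2 + w_t,     w_t ~ N(0, 0.5)  -> p(y_t | x_t)
def transition(x_prev):
    return 0.5 * x_prev + rng.normal(0.0, 1.0)

def measure(x):
    return x**2 / 2.0 + rng.normal(0.0, np.sqrt(0.5))

# Simulate a short trajectory of hidden states and on-line observations.
T = 20
x, states, obs = 0.0, [], []
for t in range(T):
    x = transition(x)
    states.append(x)
    obs.append(measure(x))
```

The quadratic measurement makes the model non-linear, which is exactly the setting where the filters discussed next differ.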
Extended Kalman Filter
• MMSE estimator based on a Taylor expansion of the nonlinear $f$ and $g$ around the state estimate $\hat{x}_{t \mid t-1}$
Unscented Kalman Filter
• Does not approximate the non-linear process and observation models
• Uses the true nonlinear models and approximates the distribution of the state random variable
• Unscented transformation
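The unscented transformation can be sketched as follows. This is a minimal version using only the scaling parameter kappa; the full UKF adds alpha/beta scaling, and the function names here are illustrative:

```python
import numpy as np

def sigma_points(mean, cov, kappa=1.0):
    """Generate 2n+1 sigma points and weights for the unscented transform.
    Minimal sketch: uses a Cholesky factor as the matrix square root."""
    n = mean.shape[0]
    S = np.linalg.cholesky((n + kappa) * cov)
    pts = [mean]
    for j in range(n):
        pts.append(mean + S[:, j])
        pts.append(mean - S[:, j])
    w = np.full(2 * n + 1, 1.0 / (2.0 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    return np.array(pts), w

# Propagate the points through the TRUE nonlinear model (no linearization)
# and recover the transformed mean as a weighted average.
mean = np.array([1.0, 0.0])
cov = np.eye(2)
pts, w = sigma_points(mean, cov, kappa=1.0)
y = np.sin(pts)                       # stand-in nonlinear observation model
y_mean = (w[:, None] * y).sum(axis=0)
```

Note the contrast with the EKF: the nonlinearity is applied exactly to each sigma point, and only the resulting distribution is approximated as Gaussian.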
Particle Filtering
• Does not require a Gaussian approximation
• Many variations, but all based on sequential importance sampling
  – degenerates with time
• Includes a resampling stage
Perfect Monte Carlo Simulation
• A set of weighted particles (samples) drawn from the posterior gives the empirical estimate
  $\hat{p}(x_{0:t} \mid y_{1:t}) = \frac{1}{N}\sum_{i=1}^{N} \delta_{x_{0:t}^{(i)}}(dx_{0:t})$
• Expectation
  $E(g_t(x_{0:t})) = \int g_t(x_{0:t})\, p(x_{0:t} \mid y_{1:t})\, dx_{0:t}$
  $\bar{E}(g_t(x_{0:t})) = \frac{1}{N}\sum_{i=1}^{N} g_t(x_{0:t}^{(i)})$
• Convergence
  $\bar{E}(g_t(x_{0:t})) \xrightarrow{a.s.} E(g_t(x_{0:t}))$
  $\sqrt{N}\left(\bar{E}(g_t(x_{0:t})) - E(g_t(x_{0:t}))\right) \Rightarrow \mathcal{N}\!\left(0,\ \operatorname{var}_{p(\cdot \mid y_{1:t})}(g_t(x_{0:t}))\right)$
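The sample-average estimator above is easy to check numerically. As an illustrative stand-in for the posterior (which we cannot normally sample directly), assume it is N(2, 1) and take g(x) = x², so the true expectation is var + mean² = 5:

```python
import numpy as np

rng = np.random.default_rng(1)

# Perfect Monte Carlo: with N i.i.d. samples x^(i) from the posterior,
# E[g(x)] is approximated by the sample average (1/N) * sum_i g(x^(i)).
N = 200_000
samples = rng.normal(2.0, 1.0, size=N)   # assumed stand-in posterior N(2, 1)
estimate = np.mean(samples**2)           # converges to E[x^2] = 1 + 2^2 = 5
```

The central limit theorem quoted above says the error of `estimate` shrinks at rate 1/√N regardless of the dimension of the state.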
Bayesian Importance Sampling
• It is impossible to sample directly from the posterior
• Instead, sample from an easy-to-sample proposal distribution $q(x_{0:t} \mid y_{1:t})$
  $E(g_t(x_{0:t})) = \int g_t(x_{0:t})\, \frac{p(x_{0:t} \mid y_{1:t})}{q(x_{0:t} \mid y_{1:t})}\, q(x_{0:t} \mid y_{1:t})\, dx_{0:t}$
  $\qquad = \int g_t(x_{0:t})\, \frac{p(y_{1:t} \mid x_{0:t})\, p(x_{0:t})}{p(y_{1:t})\, q(x_{0:t} \mid y_{1:t})}\, q(x_{0:t} \mid y_{1:t})\, dx_{0:t}$
  $\qquad = \frac{1}{p(y_{1:t})} \int g_t(x_{0:t})\, w_t(x_{0:t})\, q(x_{0:t} \mid y_{1:t})\, dx_{0:t}$
  where the importance weight is
  $w_t(x_{0:t}) = \frac{p(y_{1:t} \mid x_{0:t})\, p(x_{0:t})}{q(x_{0:t} \mid y_{1:t})}$
  Since $p(y_{1:t}) = \int w_t(x_{0:t})\, q(x_{0:t} \mid y_{1:t})\, dx_{0:t}$,
  $E(g_t(x_{0:t})) = \frac{\int g_t(x_{0:t})\, w_t(x_{0:t})\, q(x_{0:t} \mid y_{1:t})\, dx_{0:t}}{\int w_t(x_{0:t})\, q(x_{0:t} \mid y_{1:t})\, dx_{0:t}} = \frac{E_q(w_t(x_{0:t})\, g_t(x_{0:t}))}{E_q(w_t(x_{0:t}))}$
• Monte Carlo estimate with samples $x_{0:t}^{(i)} \sim q(x_{0:t} \mid y_{1:t})$:
  $\bar{E}(g_t(x_{0:t})) = \frac{\frac{1}{N}\sum_{i=1}^{N} g_t(x_{0:t}^{(i)})\, w_t(x_{0:t}^{(i)})}{\frac{1}{N}\sum_{i=1}^{N} w_t(x_{0:t}^{(i)})} = \sum_{i=1}^{N} g_t(x_{0:t}^{(i)})\, \tilde{w}_t(x_{0:t}^{(i)})$
  with normalized importance weights $\tilde{w}_t^{(i)} = w_t(x_{0:t}^{(i)}) \big/ \sum_{j=1}^{N} w_t(x_{0:t}^{(j)})$
• Asymptotic convergence and a central limit theorem hold under the following assumptions:
  – the samples $x_{0:t}^{(i)}$ are drawn i.i.d. from the proposal, the support of the proposal includes the support of the posterior, and $E(g_t(x_{0:t}))$ exists and is finite
  – the expectations of $w_t$ and of $w_t\, g_t^2(x_{0:t})$ exist and are finite
• The posterior can then be approximated by the weighted empirical measure
  $\hat{p}(x_{0:t} \mid y_{1:t}) = \sum_{i=1}^{N} \tilde{w}_t^{(i)}\, \delta_{x_{0:t}^{(i)}}(dx_{0:t})$
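The weighted estimator above can be demonstrated in one dimension. The target and proposal densities below are illustrative assumptions (a narrow "posterior" and a wider proposal whose support covers it), and `normal_pdf` is a hypothetical helper:

```python
import numpy as np

rng = np.random.default_rng(2)

def normal_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Stand-in target "posterior": N(1, 0.5^2).  Proposal q: N(0, 2^2),
# deliberately wider so its support includes the target's support.
N = 100_000
x = rng.normal(0.0, 2.0, size=N)
w = normal_pdf(x, 1.0, 0.5) / normal_pdf(x, 0.0, 2.0)  # unnormalized weights
w_tilde = w / w.sum()                                   # normalized weights
posterior_mean = np.sum(w_tilde * x)                    # should be close to 1.0
```

Normalizing the weights is what removes the unknown constant p(y_{1:t}) from the estimator.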
Sequential Importance Sampling
• Proposal distribution
• Assumptions
  – the states follow a Markov process
  – the observations are conditionally independent given the states
  $q(x_{0:t} \mid y_{1:t}) = q(x_0) \prod_{j=1}^{t} q(x_j \mid x_{0:j-1}, y_{1:j})$
– Since we can sample from the proposal and evaluate the likelihood and transition probability pointwise, we can generate a prior set of samples and iteratively compute the importance weights:
  $w_t = \frac{p(y_{1:t} \mid x_{0:t})\, p(x_{0:t})}{q(x_{0:t-1} \mid y_{1:t-1})\, q(x_t \mid x_{0:t-1}, y_{1:t})}$
  $w_t = w_{t-1}\, \frac{p(y_t \mid x_t)\, p(x_t \mid x_{t-1})}{q(x_t \mid x_{0:t-1}, y_{1:t})}$
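One step of the weight recursion can be sketched as follows. Using the transition prior as the proposal (a common choice, discussed on the next slide), the update reduces to multiplying by the likelihood. The linear-Gaussian model and `normal_pdf` helper are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)

def normal_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Illustrative model: x_t = 0.9 x_{t-1} + N(0, 1),  y_t = x_t + N(0, 1).
# With q(x_t | x_{0:t-1}, y_{1:t}) = p(x_t | x_{t-1}), the recursion becomes
# w_t = w_{t-1} * p(y_t | x_t).
N = 500
particles = rng.normal(0.0, 1.0, size=N)
weights = np.full(N, 1.0 / N)

y_t = 1.5                                               # current observation
particles = 0.9 * particles + rng.normal(0.0, 1.0, N)   # sample from p(x_t | x_{t-1})
weights *= normal_pdf(y_t, particles, 1.0)              # multiply by p(y_t | x_t)
weights /= weights.sum()                                # normalize
```

Repeating this step without resampling is exactly the SIS algorithm whose degeneracy is discussed below.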
Choice of proposal distribution
• Minimize the variance of the importance weights; the optimal choice is
  $q(x_t \mid x_{0:t-1}, y_{1:t}) = p(x_t \mid x_{0:t-1}, y_{1:t})$
• A popular choice is the transition prior
  $q(x_t \mid x_{0:t-1}, y_{1:t}) = p(x_t \mid x_{t-1})$
• Move particles towards regions of high likelihood
Degeneracy of SIS algorithm
• The variance of the importance ratios increases stochastically over time
Selection (Resampling)
• Eliminate samples with low importance ratios and multiply samples with high importance ratios
• Associate with each particle $x_{0:t}^{(i)}$ a number of children $N_i$, such that $\sum_{i=1}^{N} N_i = N$
SIR and Multinomial sampling
• Map the weighted Dirac random measure $\{x_{0:t}^{(i)}, \tilde{w}_t^{(i)}\}$ onto an equally weighted random measure $\{x_{0:t}^{(j)}, 1/N\}$
• The children counts $(N_1, \dots, N_N)$ follow a multinomial distribution
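The multinomial mapping can be sketched in a few lines; the function name is illustrative:

```python
import numpy as np

rng = np.random.default_rng(4)

def multinomial_resample(particles, w_tilde, rng):
    """Map the weighted measure {x^(i), w~^(i)} onto an equally weighted
    measure {x^(j), 1/N}: draw children counts N_i ~ Multinomial(N, w~)."""
    N = len(particles)
    counts = rng.multinomial(N, w_tilde)   # N_i children, sum_i N_i = N
    return np.repeat(particles, counts)    # each survivor now carries weight 1/N

particles = np.array([0.0, 1.0, 2.0, 3.0])
w_tilde = np.array([0.1, 0.2, 0.3, 0.4])
resampled = multinomial_resample(particles, w_tilde, rng)
```

Low-weight particles tend to get zero children (eliminated) and high-weight particles several (multiplied), which is the selection step in words above.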
i
Residual resampling
• Set $\tilde{N}_i = \lfloor N \tilde{w}_t^{(i)} \rfloor$
• Perform an SIR procedure to select the remaining $N - \sum_{i=1}^{N} \tilde{N}_i$ samples, with new weights proportional to $N \tilde{w}_t^{(i)} - \tilde{N}_i$
• Add the results to the current $\tilde{N}_i$
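The three steps above can be sketched as follows (the function name is illustrative):

```python
import numpy as np

rng = np.random.default_rng(5)

def residual_resample(particles, w_tilde, rng):
    """Residual resampling: keep floor(N * w~^(i)) copies of each particle
    deterministically, then draw the remaining samples multinomially from
    the renormalized residual weights."""
    N = len(particles)
    n_det = np.floor(N * w_tilde).astype(int)   # deterministic children counts
    n_rest = N - n_det.sum()                    # samples still to allocate
    counts = n_det
    if n_rest > 0:
        residual = N * w_tilde - n_det          # new (unnormalized) weights
        residual /= residual.sum()
        counts = n_det + rng.multinomial(n_rest, residual)
    return np.repeat(particles, counts)

particles = np.array([10.0, 20.0, 30.0, 40.0])
w_tilde = np.array([0.5, 0.25, 0.15, 0.1])
out = residual_resample(particles, w_tilde, rng)   # exactly N = 4 samples
```

The deterministic part removes most of the resampling noise, which is why residual resampling has lower variance than plain multinomial resampling.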
Minimum variance sampling
When to sample
Generic Particle Filter
1. Initialization, t = 0
2. For t = 1, 2, …
   (a) Importance sampling step: for i = 1, …, N, sample $\hat{x}_t^{(i)} \sim q(x_t \mid x_{0:t-1}^{(i)}, y_{1:t})$, evaluate the importance weights, and normalize them
   (b) Selection (resampling) step
   (c) Output
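The generic loop above can be sketched as a bootstrap-style filter. The linear-Gaussian model, the `normal_pdf` helper, and the posterior-mean output are illustrative assumptions, not from the slides:

```python
import numpy as np

rng = np.random.default_rng(6)

def normal_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Illustrative model: x_t = 0.9 x_{t-1} + N(0, 1),  y_t = x_t + N(0, 1).
# Proposal = transition prior, so the importance weight is the likelihood.
def particle_filter(ys, N=1000, rng=rng):
    x = rng.normal(0.0, 1.0, size=N)             # 1. initialization, t = 0
    means = []
    for y in ys:                                 # 2. for t = 1, 2, ...
        x = 0.9 * x + rng.normal(0.0, 1.0, N)    # (a) importance sampling step
        w = normal_pdf(y, x, 1.0)                #     evaluate importance weights
        w /= w.sum()                             #     normalize them
        means.append(np.sum(w * x))              # (c) output: posterior mean
        idx = rng.choice(N, size=N, p=w)         # (b) selection (resampling)
        x = x[idx]
    return means

ys = [0.5, 1.0, 1.2, 0.8]
means = particle_filter(ys)
```

Swapping the transition-prior proposal in step (a) for an EKF or UKF Gaussian proposal is exactly the improvement the following slides develop.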
Improving Particle Filters
• Monte Carlo (MC) assumption
  – the Dirac point-mass approximation provides an adequate representation of the posterior
• Importance sampling (IS) assumption
  – samples from the posterior can be obtained by sampling from a suitable proposal and applying importance sampling corrections
MCMC Move Step
• Introduce MCMC steps with invariant distribution $p(x_{0:t} \mid y_{1:t})$
• If the particles $\tilde{x}_{0:t}$ are already distributed according to the posterior $p(\tilde{x}_{0:t} \mid y_{1:t})$, then applying a Markov chain transition kernel $K(x_{0:t} \mid \tilde{x}_{0:t})$ with this invariant distribution still yields particles distributed according to the posterior
Designing Better Importance Proposals
• Move samples to regions of high likelihood
• Prior editing
  – ad-hoc acceptance test for proposed particles
• Local linearization
  – Taylor series expansion of the likelihood and transition prior
  – e.g. a Gaussian proposal $q(x_t^{(i)} \mid x_{0:t-1}^{(i)}, y_{1:t}) = \mathcal{N}(\bar{x}_t^{(i)}, \hat{P}_t^{(i)})$
  – improved simulated annealed sampling algorithm
Rejection methods
• If the likelihood is bounded, $p(y_t \mid x_t) \le M_t$, one can sample from the optimal importance distribution by rejection
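A rejection sampler for the optimal importance distribution $p(x_t \mid x_{t-1}, y_t) \propto p(y_t \mid x_t)\, p(x_t \mid x_{t-1})$ can be sketched as below: propose from the transition prior and accept with probability $p(y_t \mid x)/M_t$. The linear-Gaussian model and function names are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(7)

def normal_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Illustrative model: p(x_t | x_{t-1}) = N(0.9 x_{t-1}, 1), p(y_t | x_t) = N(x_t, 1).
# The Gaussian likelihood is bounded: sup_x N(y_t; x, 1) = 1/sqrt(2*pi) = M_t.
M_t = normal_pdf(0.0, 0.0, 1.0)

def sample_optimal(x_prev, y_t, rng):
    while True:
        x = 0.9 * x_prev + rng.normal(0.0, 1.0)    # propose from p(x_t | x_{t-1})
        if rng.uniform(0.0, M_t) < normal_pdf(y_t, x, 1.0):
            return x                                # accept w.p. p(y_t | x) / M_t

x_t = sample_optimal(x_prev=0.2, y_t=1.0, rng=rng)
```

The loop has no fixed run time: when the likelihood is peaked relative to the prior, the acceptance rate can be very low, which is the practical drawback of this method.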
Auxiliary Particle Filters
• Obtain approximate samples from the optimal importance distribution by introducing an auxiliary variable $k$
• Draw samples from the joint distribution of $(x_t, k)$
Unscented Particle Filter
• Use the UKF to generate the proposal distribution within a particle filter framework
Theoretical Convergence
• Theorem 1: If the importance weight
  $w_t \propto \frac{p(y_t \mid x_t)\, p(x_t \mid x_{t-1})}{q(x_t \mid x_{0:t-1}, y_{1:t})}$
  is upper bounded for any $(x_{t-1}, y_t)$, and if one of the selection schemes is used, then for all $t \ge 0$ there exists $c_t$ independent of $N$ such that for any bounded function $f_t$
  $E\left[\left(\frac{1}{N}\sum_{i=1}^{N} f_t(x_{0:t}^{(i)}) - \int f_t(x_{0:t})\, p(dx_{0:t} \mid y_{1:t})\right)^{2}\right] \le \frac{c_t\, \|f_t\|^{2}}{N}$