PARTICLE METHODS FOR MULTIMODAL FILTERING

Christian Musso, Nadia Oudjane
ONERA DTIM, BP 72, 92322, France
{musso,oudjane}@onera.fr

Abstract : We present a quick particle filter method (or bootstrap filter) with local rejection, which is an adaptation of the kernel filter. This filter generalizes the regularized filter. The conditional density of the state is estimated recursively. The proposed filter allows a precise correction step within a given computational time. In the context of the 2D tracking problem with angle and/or range measurements, simulations show a better behavior of this filter compared with the Kalman filter and with the classical bootstrap filter. We also present some results of a multiple-model particle filter which can track maneuvering targets.

Keywords : particle filter, bootstrap, tracking, non-linear filtering, Monte-Carlo, rejection

1. INTRODUCTION

We consider a target following a noisy dynamical equation which is partially observed (notations are the same as in [1]):

Xt = F(Xt−1) + Vt    (1)

Yt = H(Xt) + Wt    (2)
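As a concrete illustration of the model (1)-(2), the following sketch simulates a trajectory and its measurements. The specific F, H (a bearing-only measurement, in the spirit of the angle-measurement tracking problem) and the Gaussian noise levels are assumptions chosen for this example, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative choices for (1)-(2); F, H and the noise levels are
# assumptions for this sketch.
def F(x):
    # state transition F : R^d -> R^d (mildly contracting linear map)
    return 0.95 * x

def H(x):
    # observation H : R^d -> R^q (bearing of the position (x[0], x[1]))
    return np.array([np.arctan2(x[1], x[0])])

d, q = 2, 1
sigma_v, sigma_w = 0.1, 0.05  # standard deviations of V_t and W_t

def simulate(T, x0):
    """Draw (X_1..X_T) and (Y_1..Y_T) from the model (1)-(2)."""
    xs, ys, x = [], [], x0
    for _ in range(T):
        x = F(x) + sigma_v * rng.standard_normal(d)   # (1)
        y = H(x) + sigma_w * rng.standard_normal(q)   # (2)
        xs.append(x)
        ys.append(y)
    return np.array(xs), np.array(ys)

xs, ys = simulate(50, np.array([1.0, 1.0]))
```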

where F : R^d → R^d and H : R^d → R^q are given functions, and (Vt), (Wt) are iid variables with densities

Vt ~ p(v)dv,    Wt ~ q(w)dw    (3)

X0, the initial state of density p0, is assumed independent of (Vt) and (Wt). (Xt) is a Markov chain:

Xt / (Xt−1 = xt−1) ~ p(x − F(xt−1)) dx    (4)

(Yt) are independent conditionally on (Xt), and each Yt is, conditionally on Xt, independent of Xs (s ≠ t):

Yt / (Xt = xt) ~ q(y − H(xt)) dy    (5)

The Extended Kalman Filter (EKF) is widely used to estimate recursively the mean and the variance of the state Xt given the past measurements Y^t = (Y1, ..., Yt). The EKF assumes that the conditional density is Gaussian, but when F or H is highly non-linear, or in case of multimodality, the EKF is inefficient. The goal of non-linear filtering (NLF) is to estimate the whole law of the state Xt given the measurements Y^t. In the tracking context, for example, this makes it possible to estimate precisely the probability of presence of a target in any portion of the state space and consequently to estimate the position of the target. This approach requires no hypothesis on the linearity of F and H and no conditions on the nature of the noises V and W.

We want to estimate recursively the conditional density, denoted ft/t(x / y^t). Suppose ft−1/t−1 is known and a new measurement yt becomes available. Bayes' rule gives the formulation of ft/t(x / y^t) in two steps:

ft/t−1(x / y^{t−1}) = ∫ pt(x / xt−1) ft−1/t−1(xt−1 / y^{t−1}) dxt−1    (6)

ft/t(x / y^t) ∝ qt(yt / x) ft/t−1(x / y^{t−1})    (7)

The first step (6) is the prediction step, using the dynamical law: the predicted density ft/t−1(x / y^{t−1}) is the expectation of pt(x / Xt−1), where Xt−1 follows ft−1/t−1. The second step (7) is the correction step: qt(y / xt) = q(y − H(xt)) is the likelihood at the point xt, and pt(x / xt−1) = p(x − F(xt−1)). When the noise Wt is Gaussian with covariance Ω, q can be expressed as

qt(y / xt) = (2π)^{−d/2} det(Ω)^{−1/2} exp[ −(1/2) (y − H(xt))′ Ω^{−1} (y − H(xt)) ]    (8)
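The Gaussian likelihood (8) can be evaluated numerically as in the minimal, non-optimised sketch below; the function name and argument layout are our assumptions.

```python
import numpy as np

def gaussian_likelihood(y, x, H, Omega):
    """Evaluate q_t(y / x) of (8): a Gaussian density in the
    innovation y - H(x) with measurement-noise covariance Omega."""
    r = y - H(x)                                  # innovation
    k = Omega.shape[0]                            # measurement dimension
    norm = (2.0 * np.pi) ** (k / 2) * np.sqrt(np.linalg.det(Omega))
    # solve(Omega, r) avoids forming the explicit inverse of Omega
    return float(np.exp(-0.5 * r @ np.linalg.solve(Omega, r)) / norm)
```

For instance, with Omega the 2x2 identity and y = H(x), the value is 1/(2π), the peak of a bivariate standard Gaussian.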

The correction step confronts the new measurement with the predicted density. The recursion begins with the assumed known initial density f0/0(x) = p0(x). A natural way to solve (6) and (7) is to discretize the state space. This can be done when the dimension d of this space is low (d
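When grid discretization is impractical, the recursion (6)-(7) can be approximated by Monte-Carlo, which is the idea behind the bootstrap filter mentioned in the abstract. The sketch below is one generic bootstrap cycle (prediction by sampling the dynamics, correction by likelihood weighting and multinomial resampling), not the paper's local-rejection variant; the names and the Gaussian dynamics noise are our assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def bootstrap_step(particles, y, F, H, sigma_v, likelihood):
    """One prediction/correction cycle of a generic bootstrap filter.

    particles : (N, d) array approximating f_{t-1/t-1}
    y         : current measurement y_t
    Prediction (6): propagate each particle through the dynamics (1).
    Correction (7): weight by q_t(y_t / x), then resample."""
    N, d = particles.shape
    # Prediction (6): sample X_t ~ p_t(. / x_{t-1}) for each particle
    pred = np.array([F(x) for x in particles]) \
        + sigma_v * rng.standard_normal((N, d))
    # Correction (7): importance weights proportional to the likelihood
    w = np.array([likelihood(y, H(x)) for x in pred])
    w /= w.sum()
    # Multinomial resampling returns an equally-weighted particle cloud
    idx = rng.choice(N, size=N, p=w)
    return pred[idx]
```

On a scalar linear-Gaussian model, a step started from particles at 0 with a measurement at 1 shifts the particle mean toward the measurement, exactly as the correction step (7) dictates.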