Localization

Extended Kalman Filter Localization

from IPython.display import Image
Image(filename="ekf.png",width=600)
Figure: EKF localization simulation result.

This is a sensor fusion localization with Extended Kalman Filter (EKF).

The blue line is the true trajectory, the black line is the dead reckoning trajectory, the green points are the positioning observations (e.g. GPS), and the red line is the estimated trajectory with EKF. The red ellipse is the estimated covariance ellipse of the EKF.

Code: PythonRobotics/extended_kalman_filter.py at master · AtsushiSakai/PythonRobotics

Filter design

In this simulation, the robot has a state vector that includes 4 states at time \(t\).

\[\textbf{x}_t=[x_t, y_t, \phi_t, v_t]\]

\(x\), \(y\) are the 2D x-y position, \(\phi\) is the orientation, and \(v\) is the velocity.

In the code, “xEst” means the state vector.

\(P_t\) is the covariance matrix of the state, \(Q\) is the covariance matrix of the process noise, and \(R\) is the covariance matrix of the observation noise at time \(t\).
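As a concrete illustration, \(Q\) and \(R\) can be set up as diagonal matrices; the standard deviations below are assumptions for this sketch, not necessarily the values used in the repository.

import numpy as np

# Assumed noise standard deviations; squaring the diagonal entries gives variances.
# State order is [x, y, phi, v]; observation order is [x, y].
Q = np.diag([0.1, 0.1, np.deg2rad(1.0), 1.0]) ** 2  # process noise covariance
R = np.diag([1.0, 1.0]) ** 2                        # observation noise covariance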

The robot has a speed sensor and a gyro sensor.

So the input vector at each time step is

\[\textbf{u}_t=[v_t, \omega_t]\]

The robot also has a GNSS sensor, which means it can observe its x-y position at each time step:

\[\textbf{z}_t=[x_t,y_t]\]

Both the input and observation vectors include sensor noise.

In the code, the observation function generates the input and observation vectors with noise; a minimal sketch is shown below.
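The sketch generates a noisy GNSS observation and a noisy input from the true state. The noise standard deviations, names, and the column-vector layout are assumptions for illustration, not necessarily identical to the repository code.

import numpy as np

# Assumed sensor noise standard deviations (illustrative values)
GPS_NOISE_STD = np.diag([0.5, 0.5])                 # x, y position noise [m]
INPUT_NOISE_STD = np.diag([1.0, np.deg2rad(30.0)])  # speed [m/s] and yaw rate [rad/s] noise

def observation(x_true, u):
    """Return a noisy GNSS observation z and a noisy input ud.

    x_true: true state column vector [x, y, phi, v]^T
    u:      true input column vector [v, omega]^T
    """
    # GNSS-like observation of the true x-y position with additive noise
    z = x_true[0:2, :] + GPS_NOISE_STD @ np.random.randn(2, 1)
    # Speed and gyro measurement of the input with additive noise
    ud = u + INPUT_NOISE_STD @ np.random.randn(2, 1)
    return z, ud

# Example usage
x_true = np.array([[0.0], [0.0], [0.0], [1.0]])
u = np.array([[1.0], [0.1]])
z, ud = observation(x_true, u)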

Motion Model

The robot model is

\[\dot{x} = v \cos(\phi)\]
\[\dot{y} = v \sin(\phi)\]
\[\dot{\phi} = \omega\]

So, the motion model is

\[\textbf{x}_{t+1} = F\textbf{x}_t+B\textbf{u}_t\]

where

\(\begin{equation*} F= \begin{bmatrix} 1 & 0 & 0 & 0\\ 0 & 1 & 0 & 0\\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 0 \\ \end{bmatrix} \end{equation*}\)

\(\begin{equation*} B= \begin{bmatrix} \cos(\phi) dt & 0\\ \sin(\phi) dt & 0\\ 0 & dt\\ 1 & 0\\ \end{bmatrix} \end{equation*}\)

\(dt\) is the time interval.

This is implemented in the code; a sketch of the motion model is shown below.
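A sketch of this motion model, transcribing \(F\) and \(B\) above; the value of \(dt\) and the function name are assumptions for this sketch.

import numpy as np

DT = 0.1  # time interval dt [s] (assumed value)

def motion_model(x, u):
    """Predict the next state from x = [x, y, phi, v]^T and u = [v, omega]^T."""
    F = np.array([[1.0, 0.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0, 0.0],
                  [0.0, 0.0, 1.0, 0.0],
                  [0.0, 0.0, 0.0, 0.0]])  # velocity is supplied by the input, not propagated

    phi = x[2, 0]
    B = np.array([[DT * np.cos(phi), 0.0],
                  [DT * np.sin(phi), 0.0],
                  [0.0,              DT],
                  [1.0,              0.0]])

    return F @ x + B @ u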

Its Jacobian matrix is

\(\begin{equation*} J_F= \begin{bmatrix} \frac{\partial x}{\partial x} & \frac{\partial x}{\partial y} & \frac{\partial x}{\partial \phi} & \frac{\partial x}{\partial v}\\ \frac{\partial y}{\partial x} & \frac{\partial y}{\partial y} & \frac{\partial y}{\partial \phi} & \frac{\partial y}{\partial v}\\ \frac{\partial \phi}{\partial x} & \frac{\partial \phi}{\partial y} & \frac{\partial \phi}{\partial \phi} & \frac{\partial \phi}{\partial v}\\ \frac{\partial v}{\partial x} & \frac{\partial v}{\partial y} & \frac{\partial v}{\partial \phi} & \frac{\partial v}{\partial v}\\ \end{bmatrix} \end{equation*}\)

\(\begin{equation*}  = \begin{bmatrix} 1 & 0 & -v \sin(\phi) dt & \cos(\phi) dt\\ 0 & 1 & v \cos(\phi) dt & \sin(\phi) dt\\ 0 & 0 & 1 & 0\\ 0 & 0 & 0 & 1\\ \end{bmatrix} \end{equation*}\)
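The Jacobian can be transcribed the same way; the function name below is an assumption.

import numpy as np

DT = 0.1  # time interval dt [s] (assumed value)

def jacob_f(x, u):
    """Jacobian of the motion model with respect to the state [x, y, phi, v]."""
    phi = x[2, 0]
    v = u[0, 0]
    return np.array([[1.0, 0.0, -DT * v * np.sin(phi), DT * np.cos(phi)],
                     [0.0, 1.0,  DT * v * np.cos(phi), DT * np.sin(phi)],
                     [0.0, 0.0,  1.0,                  0.0],
                     [0.0, 0.0,  0.0,                  1.0]])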

Observation Model

The robot can get x-y position information from GPS.

So the GPS observation model is

\[\textbf{z}_{t} = H\textbf{x}_t\]

where

\(\begin{equation*} H= \begin{bmatrix} 1 & 0 & 0 & 0\\ 0 & 1 & 0 & 0\\ \end{bmatrix} \end{equation*}\)

Its Jacobian matrix is

\(\begin{equation*} J_H= \begin{bmatrix} \frac{\partial x}{\partial x} & \frac{\partial x}{\partial y} & \frac{\partial x}{\partial \phi} & \frac{\partial x}{\partial v}\\ \frac{\partial y}{\partial x} & \frac{\partial y}{\partial y} & \frac{\partial y}{\partial \phi} & \frac{\partial y}{\partial v}\\ \end{bmatrix} \end{equation*}\)

\(\begin{equation*}  = \begin{bmatrix} 1 & 0 & 0 & 0\\ 0 & 1 & 0 & 0\\ \end{bmatrix} \end{equation*}\)
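Since the observation model is linear in the state, its Jacobian is simply the constant matrix \(H\). A sketch (function names are assumptions):

import numpy as np

# H selects the x-y position from the state [x, y, phi, v]
H = np.array([[1.0, 0.0, 0.0, 0.0],
              [0.0, 1.0, 0.0, 0.0]])

def observation_model(x):
    """Map the state to the expected GPS measurement [x, y]^T."""
    return H @ x

def jacob_h():
    """Jacobian of the observation model; constant because the model is linear."""
    return H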

Extended Kalman Filter

The localization process using the Extended Kalman Filter (EKF) is:

=== Predict ===

\(x_{Pred} = Fx_t+Bu_t\)

\(P_{Pred} = J_FP_t J_F^T + Q\)

=== Update ===

\(z_{Pred} = Hx_{Pred}\)

\(y = z - z_{Pred}\)

\(S = J_H P_{Pred} J_H^T + R\)

\(K = P_{Pred} J_H^T S^{-1}\)

\(x_{t+1} = x_{Pred} + Ky\)

\(P_{t+1} = ( I - K J_H) P_{Pred}\)
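Putting the predict and update steps together, one EKF cycle can be sketched as follows. It reuses the motion_model, jacob_f, observation_model, and jacob_h sketches above, together with the assumed Q and R; this is a sketch of the standard EKF recursion, not a verbatim copy of the repository code.

import numpy as np

def ekf_estimation(x_est, P_est, z, u, Q, R):
    """One EKF predict/update cycle for the state [x, y, phi, v]."""
    # === Predict ===
    x_pred = motion_model(x_est, u)
    J_F = jacob_f(x_est, u)
    P_pred = J_F @ P_est @ J_F.T + Q

    # === Update ===
    J_H = jacob_h()
    z_pred = observation_model(x_pred)
    y = z - z_pred                          # innovation
    S = J_H @ P_pred @ J_H.T + R            # innovation covariance
    K = P_pred @ J_H.T @ np.linalg.inv(S)   # Kalman gain
    x_est = x_pred + K @ y
    P_est = (np.eye(len(x_est)) - K @ J_H) @ P_pred
    return x_est, P_est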

Unscented Kalman Filter localization


This is a sensor fusion localization with Unscented Kalman Filter (UKF).

The lines and points have the same meaning as in the EKF simulation.
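The main difference from the EKF is that the UKF propagates a set of sigma points through the nonlinear motion model instead of linearizing it. Below is a minimal sketch of sigma point generation; the scaling parameters and the use of a Cholesky factor as the matrix square root are typical choices assumed for this sketch, not necessarily those of the repository.

import numpy as np

ALPHA, KAPPA = 1e-3, 0.0  # assumed UKF scaling parameters

def generate_sigma_points(x_est, P_est):
    """Generate 2n+1 sigma points around the mean x_est with covariance P_est."""
    n = x_est.shape[0]
    lamb = ALPHA ** 2 * (n + KAPPA) - n
    gamma = np.sqrt(n + lamb)
    P_sqrt = np.linalg.cholesky(P_est)  # matrix square root of the covariance
    sigma = [x_est]
    for i in range(n):
        sigma.append(x_est + gamma * P_sqrt[:, i:i + 1])
    for i in range(n):
        sigma.append(x_est - gamma * P_sqrt[:, i:i + 1])
    return np.hstack(sigma)  # shape (n, 2n + 1)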


Particle filter localization


This is a sensor fusion localization with Particle Filter (PF).

The blue line is the true trajectory, the black line is the dead reckoning trajectory, and the red line is the estimated trajectory with PF.

It is assumed that the robot can measure distances to landmarks (RFID).

These measurements are used for PF localization; a sketch of the weight update is shown below.
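As a rough sketch of how these range measurements enter the filter, each particle's weight can be multiplied by the Gaussian likelihood of the difference between the measured and predicted distance to each landmark. The names and noise value below are assumptions for illustration.

import numpy as np

RANGE_STD = 0.2  # assumed range measurement noise standard deviation [m]

def gauss_likelihood(x, sigma):
    """Gaussian probability density of x with zero mean and standard deviation sigma."""
    return np.exp(-x ** 2 / (2 * sigma ** 2)) / np.sqrt(2 * np.pi * sigma ** 2)

def update_weights(particles, weights, z, landmarks):
    """Reweight particles using measured distances z[i] to landmarks[i].

    particles: (N, 2) array of particle x-y positions
    weights:   (N,) array of particle weights
    z:         (M,) array of measured distances
    landmarks: (M, 2) array of landmark (RFID) positions
    """
    for dist, lm in zip(z, landmarks):
        d_pred = np.linalg.norm(particles - lm, axis=1)  # predicted distance per particle
        weights = weights * gauss_likelihood(d_pred - dist, RANGE_STD)
    return weights / np.sum(weights)  # normalize so the weights sum to one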


Histogram filter localization


This is a 2D localization example with the Histogram filter.

The red cross is the true position, and the black points are the RFID positions.

The blue grid shows the position probability of the histogram filter.

In this simulation, x and y are unknown, while yaw is known.

The filter integrates speed input and range observations from RFID for localization.

An initial position estimate is not needed.
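A rough sketch of the grid observation update: every cell's probability is multiplied by the Gaussian likelihood of the measured distance to an RFID landmark, given that cell's position. The grid resolution, noise value, and function name are assumptions for this sketch.

import numpy as np

XY_RESOLUTION = 0.5  # grid cell size [m] (assumed)
RANGE_STD = 1.0      # range measurement noise standard deviation [m] (assumed)

def observation_update(grid_map, x0, y0, rfid_xy, measured_range):
    """Weight each grid cell by the likelihood of the measured range.

    grid_map: 2D array of cell probabilities
    x0, y0:   world coordinates of grid cell (0, 0)
    rfid_xy:  (x, y) position of the RFID landmark
    """
    nx, ny = grid_map.shape
    for ix in range(nx):
        for iy in range(ny):
            cx = x0 + ix * XY_RESOLUTION  # cell center in world coordinates
            cy = y0 + iy * XY_RESOLUTION
            expected = np.hypot(cx - rfid_xy[0], cy - rfid_xy[1])
            err = measured_range - expected
            grid_map[ix, iy] *= np.exp(-err ** 2 / (2 * RANGE_STD ** 2))
    return grid_map / np.sum(grid_map)  # renormalize to a probability distribution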
