Sensor Fusion and Tracking Toolbox

Design, simulate, and test multisensor tracking and positioning systems

Sensor Fusion and Tracking Toolbox™ includes algorithms and tools for designing, simulating, and testing systems that fuse data from multiple sensors to maintain situational awareness and localization. Reference examples provide a starting point for multi-object tracking and sensor fusion development for surveillance and autonomous systems, including airborne, spaceborne, ground-based, shipborne, and underwater systems.

You can fuse data from real-world sensors, including active and passive radar, sonar, lidar, EO/IR, IMU, and GPS. You can also generate synthetic data from virtual sensors to test your algorithms under different scenarios. The toolbox includes multi-object trackers and estimation filters for evaluating architectures that combine grid-level, detection-level, and object- or track-level fusion. It also provides metrics, including OSPA and GOSPA, for validating performance against ground truth scenes.
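
The OSPA metric mentioned above combines localization error with a cardinality penalty for missed or false tracks. As a rough illustration (the toolbox's implementation is in MATLAB; this stdlib-only Python sketch uses a brute-force assignment that is only practical for small sets):

```python
import math
from itertools import permutations

def ospa(truths, tracks, c=10.0, p=2):
    """Simplified OSPA distance between two sets of 2D points.

    c is the cutoff distance, p the order. Brute-force assignment."""
    m, n = len(truths), len(tracks)
    if m == 0 and n == 0:
        return 0.0
    if m > n:                       # make truths the smaller set
        truths, tracks, m, n = tracks, truths, n, m

    def d(a, b):                    # distance, saturated at the cutoff c
        return min(c, math.dist(a, b))

    # Best assignment of the smaller set into the larger one
    best = min(
        sum(d(a, b) ** p for a, b in zip(truths, perm))
        for perm in permutations(tracks, m)
    )
    # Unassigned objects (cardinality mismatch) each pay the full cutoff c
    return ((best + c ** p * (n - m)) / n) ** (1 / p)

truths = [(0.0, 0.0), (10.0, 0.0)]
tracks = [(0.5, 0.0), (10.0, 0.5), (50.0, 50.0)]  # one false track
score = ospa(truths, tracks)
```

A perfect tracker scores 0; the false track here pulls the score toward the cutoff.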

For simulation acceleration or rapid prototyping, the toolbox supports C code generation.

Get Started

Learn the basics of Sensor Fusion and Tracking Toolbox


Applications

Examples for autonomous system tracking, surveillance system tracking, localization, and hardware connectivity

Orientation, Position, and Coordinate Systems

Quaternions, Euler angles, rotation matrices, and conversions
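
One such conversion, Euler angles to a unit quaternion, can be sketched in plain Python (the toolbox works in MATLAB; this assumes the common ZYX yaw-pitch-roll rotation sequence):

```python
import math

def euler_zyx_to_quat(yaw, pitch, roll):
    """ZYX (yaw-pitch-roll) Euler angles in radians -> unit quaternion (w, x, y, z)."""
    cy, sy = math.cos(yaw / 2), math.sin(yaw / 2)
    cp, sp = math.cos(pitch / 2), math.sin(pitch / 2)
    cr, sr = math.cos(roll / 2), math.sin(roll / 2)
    return (cr * cp * cy + sr * sp * sy,   # w
            sr * cp * cy - cr * sp * sy,   # x
            cr * sp * cy + sr * cp * sy,   # y
            cr * cp * sy - sr * sp * cy)   # z

# A 90-degree yaw about the z-axis
q = euler_zyx_to_quat(math.pi / 2, 0.0, 0.0)
```

Note that Euler-angle conventions vary by rotation sequence; always confirm which one a given function assumes.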

Trajectory and Scenario Generation

Ground-truth waypoint- and rate-based trajectories and scenarios
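
A waypoint trajectory in its simplest form interpolates position between timestamped waypoints. A minimal piecewise-linear sketch in Python (the toolbox's MATLAB trajectory objects additionally handle orientation, velocity, and smoother interpolation):

```python
from bisect import bisect_right

def sample_waypoint_trajectory(times, waypoints, t):
    """Piecewise-linear 2D position at time t for timestamped waypoints."""
    if not times[0] <= t <= times[-1]:
        raise ValueError("t outside trajectory time span")
    # Index of the segment containing t
    i = min(bisect_right(times, t), len(times) - 1) - 1
    t0, t1 = times[i], times[i + 1]
    a = (t - t0) / (t1 - t0)                    # fraction along the segment
    (x0, y0), (x1, y1) = waypoints[i], waypoints[i + 1]
    return (x0 + a * (x1 - x0), y0 + a * (y1 - y0))

times = [0.0, 10.0, 20.0]
waypoints = [(0.0, 0.0), (100.0, 0.0), (100.0, 100.0)]
pos = sample_waypoint_trajectory(times, waypoints, 15.0)  # -> (100.0, 50.0)
```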

Sensor Models

Radar, sonar, lidar, EO/IR, IMU, and GPS sensor simulation

Inertial Sensor Fusion

IMU and GPS sensor fusion to determine orientation and position
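
The core idea of inertial fusion is blending a gyroscope's smooth but drifting rate integration with a drift-free but noisy absolute reference (accelerometer tilt, GPS). A complementary filter is the simplest stand-in (not the toolbox's Kalman-based fusion; a one-axis Python sketch):

```python
def complementary_filter(pitch, gyro_rate, accel_pitch, dt, alpha=0.98):
    """One-axis complementary filter.

    Blends the integrated gyro rate (smooth, but drifts) with the
    accelerometer-derived tilt (noisy, but drift-free).
    """
    return alpha * (pitch + gyro_rate * dt) + (1 - alpha) * accel_pitch

# With zero gyro rate, the estimate converges toward the accelerometer tilt
pitch = 0.0
for _ in range(500):
    pitch = complementary_filter(pitch, gyro_rate=0.0, accel_pitch=0.1, dt=0.01)
```

A Kalman-based fusion filter plays the same role but weights the two sources by their estimated uncertainties instead of a fixed alpha.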

Estimation Filters

Kalman and particle filters, linearization functions, and motion models
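
To illustrate the Kalman filter and constant-velocity motion model named above, here is one predict-update cycle written out in plain Python (the toolbox supplies these as MATLAB objects; this sketch uses a 1D state and a simplified diagonal process noise):

```python
def kalman_cv_step(x, P, z, dt, q=0.01, r=1.0):
    """One predict+update cycle of a 1D constant-velocity Kalman filter.

    x = [position, velocity]; P = 2x2 covariance (nested lists);
    z = position measurement; q, r = process/measurement noise variances."""
    # Predict: x <- F x and P <- F P F' + Q, with F = [[1, dt], [0, 1]]
    xp = [x[0] + dt * x[1], x[1]]
    Pp = [[P[0][0] + dt * (P[1][0] + P[0][1]) + dt * dt * P[1][1] + q,
           P[0][1] + dt * P[1][1]],
          [P[1][0] + dt * P[1][1],
           P[1][1] + q]]
    # Update with H = [1, 0]: innovation y, innovation variance S, gain K
    y = z - xp[0]
    S = Pp[0][0] + r
    K = [Pp[0][0] / S, Pp[1][0] / S]
    xn = [xp[0] + K[0] * y, xp[1] + K[1] * y]
    Pn = [[(1 - K[0]) * Pp[0][0], (1 - K[0]) * Pp[0][1]],
          [Pp[1][0] - K[1] * Pp[0][0], Pp[1][1] - K[1] * Pp[0][1]]]
    return xn, Pn

# Track a target moving at 1 m/s from position-only measurements
x, P = [0.0, 0.0], [[1.0, 0.0], [0.0, 1.0]]
for k in range(1, 51):
    x, P = kalman_cv_step(x, P, z=float(k), dt=1.0)
```

After a few dozen steps the estimated position and velocity settle near the true values, even though velocity is never measured directly.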

Multi-Object Trackers

Multi-sensor multi-object trackers, data association, and track fusion
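
Data association decides which detection updates which track. A global-nearest-neighbor assignment minimizes the total pairing cost; a brute-force Python sketch (real trackers use an efficient assignment algorithm such as Hungarian/JVC, so this is only for small problems):

```python
from itertools import permutations

def gnn_assign(costs):
    """Global nearest neighbor: assign tracks to detections minimizing total cost.

    costs[i][j] is the cost of pairing track i with detection j; assumes a
    square cost matrix. Brute force over all permutations."""
    n = len(costs)
    best = min(permutations(range(n)),
               key=lambda perm: sum(costs[i][perm[i]] for i in range(n)))
    return list(best)

# Track-to-detection distances; the cheapest pairing is 0->1, 1->0, 2->2
costs = [[9.0, 1.0, 8.0],
         [2.0, 9.0, 7.0],
         [6.0, 5.0, 0.5]]
assignment = gnn_assign(costs)   # -> [1, 0, 2]
```

Note that a greedy per-track nearest neighbor can differ from this global optimum when detections are contested by multiple tracks.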

Visualization and Analytics

Multi-object theater plots, detection and object tracks, and track metrics