Get Started with Sensor Fusion and Tracking Toolbox

Design and simulate multisensor tracking and navigation systems

Sensor Fusion and Tracking Toolbox™ includes algorithms and tools for the design, simulation, and analysis of systems that fuse data from multiple sensors to maintain position, orientation, and situational awareness. Reference examples provide a starting point for implementing components of airborne, ground-based, shipborne, and underwater surveillance, navigation, and autonomous systems.

The toolbox includes multi-object trackers, sensor fusion filters, motion and sensor models, and data association algorithms that let you evaluate fusion architectures using real and synthetic data. With Sensor Fusion and Tracking Toolbox you can import and define scenarios and trajectories, stream signals, and generate synthetic data for active and passive sensors, including RF, acoustic, EO/IR, and GPS/IMU sensors. You can also evaluate system accuracy and performance with standard benchmarks, metrics, and animated plots.

For simulation acceleration or desktop prototyping, the toolbox supports C code generation.



Part 1: What is Sensor Fusion?
An overview of what sensor fusion is and how it helps in the design of autonomous systems.

Part 2: Fusing Mag, Accel, and Gyro to Estimate Orientation
Use magnetometer, accelerometer, and gyroscope measurements to estimate an object's orientation.
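The core idea behind this kind of fusion can be sketched with a simple complementary filter: the accelerometer gives a drift-free but noisy tilt estimate, while the integrated gyroscope rate is smooth but drifts. This is an illustrative Python sketch of that principle, not the toolbox's MATLAB API; the function names and the 0.98 blend factor are assumptions for the example.

```python
import math

def accel_roll_pitch(ax, ay, az):
    # Recover roll and pitch from the gravity direction measured by the
    # accelerometer (assumes the sensor is not accelerating).
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    return roll, pitch

def complementary(prev_angle, gyro_rate, accel_angle, dt, alpha=0.98):
    # Blend the integrated gyro angle (accurate short-term, drifts long-term)
    # with the accelerometer angle (noisy short-term, stable long-term).
    return alpha * (prev_angle + gyro_rate * dt) + (1 - alpha) * accel_angle
```

Heading (yaw) cannot be observed from gravity alone, which is where the magnetometer comes in: the same blending idea applies, with the magnetometer supplying the slow, drift-free reference.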

Part 3: Fusing GPS and IMU to Estimate Pose
Use GPS and an IMU to estimate an object’s orientation and position.
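GPS/IMU fusion is commonly framed as a Kalman filter: the IMU's acceleration drives a high-rate prediction, and the lower-rate GPS position fix corrects the accumulated drift. The sketch below shows that predict/update cycle for a 1-D position/velocity state; it is a minimal illustration with assumed noise values (`q`, `r`), not the toolbox implementation.

```python
import numpy as np

def predict(x, P, a, dt, q=0.1):
    # State x = [position, velocity]; IMU acceleration a drives the prediction.
    F = np.array([[1.0, dt], [0.0, 1.0]])          # constant-velocity dynamics
    B = np.array([0.5 * dt * dt, dt])              # acceleration input model
    x = F @ x + B * a
    Q = q * np.array([[dt**4 / 4, dt**3 / 2],      # process noise from
                      [dt**3 / 2, dt**2]])         # acceleration uncertainty
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, z, r=4.0):
    # A GPS position measurement z corrects the drift in the predicted state.
    H = np.array([[1.0, 0.0]])                     # GPS observes position only
    y = z - (H @ x)[0]                             # innovation
    S = (H @ P @ H.T)[0, 0] + r                    # innovation variance
    K = (P @ H[0]) / S                             # Kalman gain, shape (2,)
    x = x + K * y
    P = P - np.outer(K, H[0]) @ P
    return x, P
```

In a full pose estimator the same structure extends to 3-D position plus the orientation states from Part 2.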

Part 4: Tracking a Single Object With an IMM Filter
Track a single object by estimating its state with an interacting multiple model (IMM) filter.
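An IMM filter runs several motion-model filters in parallel (for example constant velocity and constant turn) and maintains a probability for each model. The sketch below shows just the model-probability update and the probability-weighted combination step, which is the heart of the approach; the Markov transition matrix and likelihood values in the test are assumed numbers for illustration.

```python
import numpy as np

def imm_combine(mu, M, x_models, likelihoods):
    # mu: prior model probabilities; M[i, j]: probability of switching
    # from model i to model j between time steps.
    c = M.T @ mu                      # predicted model probabilities
    mu_new = c * likelihoods          # weight by each filter's measurement fit
    mu_new = mu_new / mu_new.sum()    # renormalize
    # Combined estimate: probability-weighted sum of the model states.
    x_comb = sum(m * x for m, x in zip(mu_new, x_models))
    return mu_new, x_comb
```

When the target maneuvers, the likelihood of the maneuvering model rises, its probability grows, and the combined estimate shifts toward that model automatically.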

Part 5: How to Track Multiple Objects at Once
An introduction to two common problems in multi-object tracking: data association and track maintenance.
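Data association decides which detection updates which track; detections left over may start new tracks, which is where track maintenance takes over. A minimal sketch of the association step is greedy nearest-neighbor assignment inside a distance gate (positions reduced to scalars for brevity); the `gate` value and function name are assumptions for the example, and practical trackers use stronger assignment algorithms.

```python
def associate(tracks, detections, gate=5.0):
    # Greedily pair each track with its nearest unused detection
    # within the gating distance.
    assignments = {}
    used = set()
    for ti, t in enumerate(tracks):
        best, best_d = None, gate
        for di, d in enumerate(detections):
            if di in used:
                continue
            dist = abs(d - t)
            if dist < best_d:
                best, best_d = di, dist
        if best is not None:
            assignments[ti] = best
            used.add(best)
    # Detections with no track are candidates for new tracks.
    unassigned = [di for di in range(len(detections)) if di not in used]
    return assignments, unassigned
```

Track maintenance then applies confirmation and deletion logic, for example requiring several consecutive hits before confirming a track and dropping tracks that go unassigned for too many updates.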