Use inertial sensor fusion algorithms to estimate orientation and position over time. The algorithms are optimized for different sensor configurations, output requirements, and motion constraints. To learn more about inertial sensor fusion algorithms and their uses, see Determine Orientation Using Inertial Sensors and Determine Pose Using Inertial Sensors and GPS.
|Object|Description|
|---|---|
|`ecompass`|Orientation from magnetometer and accelerometer readings|
|`imufilter`|Orientation from accelerometer and gyroscope readings|
|`ahrsfilter`|Orientation from accelerometer, gyroscope, and magnetometer readings|
|`ahrs10filter`|Height and orientation from MARG and altimeter readings|
|`complementaryFilter`|Orientation estimation from a complementary filter|
|`insfilterMARG`|Estimate pose from MARG and GPS data|
|`insfilterAsync`|Estimate pose from asynchronous MARG and GPS data|
|`insfilterErrorState`|Estimate pose from IMU, GPS, and monocular visual odometry (MVO) data|
|`insfilterNonholonomic`|Estimate pose with nonholonomic constraints|
|`insfilter`|Create inertial navigation filter|
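The complementary-filter approach listed above blends two orientation sources: gyroscope integration, which is smooth but drifts, and the accelerometer tilt estimate, which is drift-free but noisy. A minimal single-axis sketch in Python (illustrative only; the function name, axis conventions, and default gain here are assumptions, not the toolbox implementation):

```python
import math

def complementary_pitch(pitch_prev, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    """One filter step for pitch (rotation about the y-axis), in radians.

    pitch_prev : previous pitch estimate
    gyro_rate  : angular rate about y (rad/s)
    accel_x/z  : accelerometer readings; at rest they measure gravity,
                 so a_x = g*sin(pitch), a_z = g*cos(pitch) (sign
                 conventions vary by sensor frame)
    alpha      : blend gain; closer to 1 trusts the gyro more
    """
    # Short-term estimate: integrate the angular rate (drifts over time).
    pitch_gyro = pitch_prev + gyro_rate * dt
    # Long-term estimate: tilt from the gravity direction (noisy, no drift).
    pitch_accel = math.atan2(accel_x, accel_z)
    # High-pass the gyro path, low-pass the accelerometer path.
    return alpha * pitch_gyro + (1.0 - alpha) * pitch_accel
```

Called in a loop with a stationary sensor, the estimate converges to the accelerometer tilt angle at a rate set by `alpha`; during fast motion the gyro term dominates, suppressing accelerometer noise.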
- This example shows how to use 6-axis and 9-axis fusion algorithms to compute orientation.
- This example shows how to use Kalman filters to fuse IMU and GPS readings to determine pose.
- This example shows how to align and preprocess logged sensor data.
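The IMU/GPS fusion step in the pose examples can be sketched as a small linear Kalman filter: IMU acceleration drives the prediction, and GPS position corrects it. This Python sketch (state layout, noise values, and function names are assumptions for illustration, not the toolbox's filter) tracks a single axis with state `[position, velocity]`:

```python
def kf_predict(x, P, accel, dt, q=0.1):
    """Predict with a constant-acceleration model driven by IMU accel."""
    pos = x[0] + x[1] * dt + 0.5 * accel * dt * dt
    vel = x[1] + accel * dt
    # Covariance propagation P = F P F' + Q with F = [[1, dt], [0, 1]].
    p00 = P[0][0] + dt * (P[0][1] + P[1][0]) + dt * dt * P[1][1] + q * dt
    p01 = P[0][1] + dt * P[1][1]
    p10 = P[1][0] + dt * P[1][1]
    p11 = P[1][1] + q * dt
    return [pos, vel], [[p00, p01], [p10, p11]]

def kf_update(x, P, z, r=4.0):
    """Correct with a GPS position fix; measurement model H = [1, 0]."""
    y = z - x[0]                        # innovation
    s = P[0][0] + r                     # innovation covariance
    k0, k1 = P[0][0] / s, P[1][0] / s   # Kalman gain
    x = [x[0] + k0 * y, x[1] + k1 * y]
    P = [[(1 - k0) * P[0][0], (1 - k0) * P[0][1]],
         [P[1][0] - k1 * P[0][0], P[1][1] - k1 * P[0][1]]]
    return x, P
```

Between GPS fixes, `kf_predict` can be called at the (much higher) IMU rate, with `kf_update` applied only when a fix arrives; this prediction/correction split is what lets the filters above handle asynchronous sensor rates.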