estimateGravityRotation

Estimate gravity rotation using IMU measurements and factor graph optimization

Since R2023a

    Description

    The estimateGravityRotation function estimates the gravity rotation that transforms input poses to the local navigation reference frame of the IMU, using IMU measurements and factor graph optimization. The gravity rotation transforms the gravity vector from the local navigation reference frame of the IMU to the pose reference frame.

    The accelerometer measurements contain a constant gravitational acceleration that does not contribute to motion. You must remove this component from the measurements for accurate fusion with other sensor data. The input pose reference frame may not always match the local navigation reference frame of the IMU, North-East-Down (NED) or East-North-Up (ENU), in which the gravity direction is known. So, it is necessary to transform the input poses into the local navigation frame to remove the known gravity effect. The estimated rotation aligns the input pose reference frame with the local navigation reference frame of the IMU.

    [gRot,info] = estimateGravityRotation(poses,gyroscopeReadings,accelerometerReadings,Name=Value) estimates the rotation required to transform the gravity vector from the local navigation reference frame of the IMU (NED or ENU) to the input pose reference frame.

    Note

    Input poses must be in the initial IMU reference frame unless you specify the SensorTransform name-value argument, in which case the poses can be in a different reference frame.

    Examples

    Specify input poses in the first camera pose reference frame.

    poses = [0.1 0 0 0.7071 0 0 0.7071; ...
             0.1 0.4755 -0.1545 0.7071 0 0 0.7071];

    Specify 10 gyroscope and accelerometer readings between consecutive camera frames.

    accelReadings = repmat([97.9887 -3.0315 -22.0285],10,1);
    gyroReadings = zeros(10,3);

    Specify IMU parameters.

    params = factorIMUParameters(SampleRate=100, ...
                                 ReferenceFrame="NED");

    Specify a transformation, consisting of a 3-D translation and rotation, that transforms the input poses from the initial camera pose reference frame to the initial IMU pose reference frame. The initial sensor reference frame has the first sensor pose at its origin.

    sensorTransform = se3(eul2rotm([-pi/2 0 0]),[0 0.1 0]);

    Specify factor graph solver options.

    opts = factorGraphSolverOptions(MaxIterations=50);

    Estimate gravity rotation using IMU measurements between camera pose estimates.

    [gRot,solutionInfo] = estimateGravityRotation(poses, ...
                          {gyroReadings},{accelReadings}, ...
                          IMUParameters=params, ...
                          SensorTransform=sensorTransform, ...
                          SolverOptions=opts)
    gRot = 3×3
    
        0.0058   -0.6775   -0.7355
        0.1024    0.7320   -0.6736
        0.9947   -0.0713    0.0736
    
    
    solutionInfo = struct with fields:
                 InitialCost: 2.3123e+03
                   FinalCost: 19.0067
          NumSuccessfulSteps: 23
        NumUnsuccessfulSteps: 18
                   TotalTime: 0.0016
             TerminationType: 0
            IsSolutionUsable: 1
            OptimizedNodeIDs: [1 3 4 6 7]
                FixedNodeIDs: [0 2 5]
    
    

    Use gravity rotation to transform gravity vector from local navigation frame to initial camera pose reference frame.

    % Gravity direction in NED frame is along Z-Axis.
    gravityDirectionNED = [0; 0; 1];
    % Gravity direction in pose reference frame.
    gravityDirection = gRot*gravityDirectionNED
    gravityDirection = 3×1
    
       -0.7355
       -0.6736
        0.0736
    
    
    % Gravity vector in pose reference frame.
    gravityMagnitude = 9.81;
    gravity = gravityDirection*gravityMagnitude
    gravity = 3×1
    
       -7.2149
       -6.6076
        0.7223
    
    

    Input Arguments

    Camera or lidar poses, estimated by stereo visual-inertial or lidar-inertial systems respectively, in the same metric units as the IMU measurements, specified as one of these pose types:

    • N-by-7 matrix, where each row is of the form [x y z qw qx qy qz]. Each row defines the xyz-position, in meters, and quaternion orientation, [qw qx qy qz].

    • Array of se3 objects.

    • Camera pose table returned by the poses (Computer Vision Toolbox) function of the imageviewset (Computer Vision Toolbox) object.

    • Array of rigidtform3d (Image Processing Toolbox) objects.
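
    For instance, the two matrix-form poses from the example above can also be specified as an array of se3 objects. This is a minimal sketch; it assumes the quaternion [0.7071 0 0 0.7071] is converted to an equivalent rotation matrix with the quaternion object's rotmat method.

    % Sketch: the two example poses as an se3 array instead of an
    % N-by-7 matrix. The rotation matrix is derived from the example
    % quaternion [qw qx qy qz] = [0.7071 0 0 0.7071].
    R = rotmat(quaternion(0.7071,0,0,0.7071),"point");
    posesSE3 = [se3(R,[0.1 0 0]); se3(R,[0.1 0.4755 -0.1545])];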

    Gyroscope readings between consecutive camera views or poses, specified as an (N-1)-element cell array of M-by-3 matrices, in radians per second. N is the total number of camera views or poses specified in poses. M is the number of gyroscope readings between consecutive camera views and the three columns of gyroscopeReadings represent the [x y z] measurements between the camera views or poses.

    Note that there can be a different number of readings between consecutive camera views.

    Data Types: cell

    Accelerometer readings between consecutive camera views or poses, specified as an (N-1)-element cell array of M-by-3 matrices, in meters per second squared. N is the total number of camera views or poses specified in poses. M is the number of accelerometer readings between consecutive camera views and the three columns of accelerometerReadings represent the [x y z] measurements between the camera views or poses.

    Note that there can be a different number of readings between consecutive camera views.

    Data Types: cell
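
    As an illustration of the cell-array layout (with placeholder values, not real sensor data): suppose there are three poses, so N = 3. Both cell arrays then need N-1 = 2 elements, and the number of readings M may differ between elements.

    % Sketch: IMU readings between N = 3 poses, so both cell arrays have
    % N-1 = 2 elements. M differs between consecutive views (10, then 12).
    gyroReadings  = {zeros(10,3); zeros(12,3)};        % rad/s
    accelReadings = {repmat([0 0 -9.81],10,1); ...     % m/s^2, placeholder
                     repmat([0 0 -9.81],12,1)};        % values only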

    Name-Value Arguments

    Specify optional pairs of arguments as Name1=Value1,...,NameN=ValueN, where Name is the argument name and Value is the corresponding value. Name-value arguments must appear after other arguments, but the order of the pairs does not matter.

    Example: estimateGravityRotation(poses,gyroscopeReadings,accelerometerReadings,IMUParameters=factorIMUParameters(SampleRate=100)) estimates the gravity rotation using the specified IMU parameters.

    IMU parameters, specified as a factorIMUParameters object.

    Example: IMUParameters=factorIMUParameters(SampleRate=100)

    Solver options, specified as a factorGraphSolverOptions object.

    Example: SolverOptions=factorGraphSolverOptions(MaxIterations=50)

    Transformation, consisting of a 3-D translation and rotation, that transforms a quantity such as a pose or a point from the input pose reference frame to the initial IMU sensor reference frame, specified as an se3 object.

    For example, if the input poses are camera poses in the initial camera sensor reference frame, then the sensor transform rotates and translates a pose or a point in the initial camera sensor reference frame to the initial IMU sensor reference frame. The initial sensor reference frame has the very first sensor pose at its origin.

    Example: SensorTransform=se3(eul2rotm([-pi/2,0,0]),[0,0.1,0])
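
    As a sketch of what the sensor transform does, the transform from the example above can map a point expressed in the initial camera frame into the initial IMU frame. The point coordinates here are illustrative only, and this assumes the se3 object's transform method.

    % Sketch: apply the sensor transform to a point in the initial camera
    % reference frame to express it in the initial IMU reference frame.
    sensorTransform = se3(eul2rotm([-pi/2 0 0]),[0 0.1 0]);
    ptCamera = [1 2 3];                          % illustrative point, camera frame
    ptIMU = transform(sensorTransform,ptCamera)  % same point, IMU frame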

    Output Arguments

    Gravity rotation, returned as a 3-by-3 matrix, an se3 object, or a rigidtform3d (Image Processing Toolbox) object, depending on the input pose type. It contains the rotation required to transform the gravity vector from the local navigation reference frame of the IMU (NED or ENU) to the input pose reference frame.

    Factor graph optimization solution information, returned as a structure. The fields of the structure are:

    InitialCost

    Initial cost of the nonlinear least-squares problem formulated by the factor graph, before optimization.

    FinalCost

    Final cost of the nonlinear least-squares problem formulated by the factor graph, after optimization.

    Note

    Cost is the sum of error terms, known as residuals, where each residual is a function of a subset of factor measurements.

    NumSuccessfulSteps

    Number of iterations in which the solver decreases the cost. This value includes the initialization iteration at 0 in addition to the minimizer iterations.

    NumUnsuccessfulSteps

    Number of iterations that are numerically invalid or in which the solver does not decrease the cost.

    TotalTime

    Total solver optimization time in seconds.

    TerminationType

    Termination type, returned as an integer in the range [0, 2]:

    • 0 — The solver found a solution that meets the convergence criterion and decreases the cost after optimization.

    • 1 — The solver could not find a solution that meets the convergence criterion after running for the maximum number of iterations.

    • 2 — The solver terminated due to an error.

    IsSolutionUsable

    Solution is usable if 1 (true), not usable if 0 (false).

    Use this information to check whether the optimization required for alignment has converged.

    Data Types: struct
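
    For example, assuming gRot is returned as a 3-by-3 matrix as in the example above, a minimal convergence check before using the estimate might look like this:

    % Sketch: check that the optimization converged before trusting gRot.
    if solutionInfo.IsSolutionUsable && solutionInfo.TerminationType == 0
        % Gravity vector in the input pose reference frame (NED convention).
        gravity = (gRot*[0; 0; 1])*9.81;
    else
        warning("Gravity rotation estimate may not have converged.")
    end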

    References

    [1] Campos, Carlos, Richard Elvira, Juan J. Gomez Rodriguez, Jose M. M. Montiel, and Juan D. Tardos. “ORB-SLAM3: An Accurate Open-Source Library for Visual, Visual–Inertial, and Multimap SLAM.” IEEE Transactions on Robotics 37, no. 6 (December 2021): 1874–90. https://doi.org/10.1109/TRO.2021.3075644.

    Extended Capabilities

    C/C++ Code Generation
    Generate C and C++ code using MATLAB® Coder™.

    Version History

    Introduced in R2023a