I'm trying to implement sensor fusion of IMU and GPS in Simulink with a MATLAB script example as a guide. I'm getting the wrong fused position and orientation.

I'm stuck while trying to implement sensor fusion for the IMU and GPS Simulink blocks. I have used a MATLAB Function block to add a fusion filter, with this MATLAB example as a guide: https://www.mathworks.com/help/fusion/ug/imu-and-gps-fusion-for-inertial-navigation.html
I suspect the states and/or covariances are not being updated correctly, because my position estimate wanders off until it gets a GPS update. I am passing the new state and covariance matrix at each time step; you can see my result in the graph below. The position error of the MATLAB example is small, between 1 and 2 m, whereas mine is extremely oscillatory.
I want to know what I can improve in this model: is my implementation technique correct? Am I on the wrong track trying to implement the insfilterMARG filter from MATLAB inside a MATLAB Function block in Simulink?
If this approach is completely wrong, how can I implement real-time sensor fusion in Simulink?
I have attached all the necessary files in a zip file; run the .m file first and then run the Simulink model to reproduce the graph below.
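For context, the structure inside my MATLAB Function block is roughly along the following lines (a simplified sketch of what I described above; the function name, signal names, and noise covariances are placeholders, not my actual values):

function [pos, orient, stateOut, covOut] = fuseStep(accel, gyro, mag, gpsLLA, gpsVel, hasGPS, stateIn, covIn)
% Sketch: the filter is rebuilt every step and its State/StateCovariance
% are carried between time steps as Simulink signals.
filt = insfilterMARG;
filt.State = stateIn;                 % 22-element state vector from the previous step
filt.StateCovariance = covIn;         % 22-by-22 covariance from the previous step

predict(filt, accel, gyro);           % accel in m/s^2, gyro in rad/s

fusemag(filt, mag, eye(3)*0.5);       % placeholder magnetometer noise covariance
if hasGPS
    fusegps(filt, gpsLLA, eye(3)*4, gpsVel, eye(3)*0.25);  % placeholder GPS noise
end

[pos, q] = pose(filt);                % pose returns position, orientation, velocity
orient = compact(q);                  % quaternion as a [w x y z] row vector
stateOut = filt.State;
covOut = filt.StateCovariance;
end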

Answers (1)

Ryan Salvo on 7 Dec 2022
Hi Mehreen,
The ZIP file attached to your question does not contain the Simulink model, but from the IMUandGPS.m file included, I suspect that the orientation is being incorrectly transposed when quat2rotm is executed on line 9. Try using the rotmat command instead with the "frame" rotation type.
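As a rough illustration of the two conventions (assuming q is the quaternion orientation returned by the filter's pose method; the variable names are just placeholders):

R1 = quat2rotm(compact(q));   % point-rotation convention
R2 = rotmat(q, "frame");      % frame-rotation convention; R2 is the transpose of R1
% Using the convention that does not match how the filter defines its
% orientation effectively transposes the rotation and skews the estimate.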
If the issue persists, I recommend creating a service request to resolve it.
Thanks,
Ryan
