IMU orientation using AHRS filter

Hello, I am having IMU orientation troubles. I am using the AHRS filter to output angular velocity and quaternions relative to the NED reference frame.
To do this I use the code:
FUSE = ahrsfilter;
[quaternions, angularVelocity] = FUSE(acc,gyr,mag);
Where acc, gyr, and mag are accelerometer, gyroscope, and magnetometer readings, respectively. Each of acc, gyr, and mag is N-by-3.
I then want to rotate the sensor signals so they are aligned with the NED reference frame. I am trying to do this with the following code:
for i = 1:size(acc,1)
    acc(i,:) = rotateframe(quaternions(i), acc(i,:));
    gyr(i,:) = rotateframe(quaternions(i), gyr(i,:));
end
After I rotate the accelerometer and gyroscope readings, the rotated signals are not correct. However, the angularVelocity output from the ahrsfilter is correct. Does anyone have any ideas why this may be?
Thank you

2 Comments

Hey, did you solve this problem? I have the same problem. I made the sensor frame coincide completely with the NED (or ENU) frame, but the result still has a rotation angle.
Hi, exactly the same question! Have you solved this?


Answers (1)

James Tursa on 23 Aug 2020
I haven't used either of those functions, but from reading the doc maybe you need to use the conjugate of the quaternions in your rotateframe call. I.e., if the sensor signals are in the BODY frame and the quaternions are NED->BODY, then maybe you need the conjugate of the quaternions to do the BODY->NED conversion.
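A minimal sketch of that suggestion, assuming the variable names from the question (`quaternions`, `acc`, `gyr`) and the documented ahrsfilter convention that its orientation quaternion maps NED to the sensor body frame:

```matlab
% Sketch: if quaternions(i) describes the NED -> BODY frame rotation,
% its conjugate describes BODY -> NED, so body-frame readings can be
% expressed in the NED frame. rotateframe accepts N-by-3 input directly,
% so no per-sample loop is needed.
qBodyToNed = conj(quaternions);        % element-wise quaternion conjugate
accNED = rotateframe(qBodyToNed, acc); % accelerometer readings in NED
gyrNED = rotateframe(qBodyToNed, gyr); % gyroscope readings in NED
```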

8 Comments

Hi James, thank you very much for the help. This method is sort of working. The pitch and yaw are now aligned; however, the roll about the Y axis is still not correct. I've copied my code here:
acc = [accX, accY, accZ];
gyr = [gyrX, gyrY, gyrZ];
mag = [magX, magY, magZ];
%Fusion Processing Loop and align reference frames
gyr(:,1) = -gyr(:,1);
gyr(:,3) = -gyr(:,3);
acc(:,1) = -acc(:,1);
mag(:,1) = -mag(:,1);
mag(:,3) = -mag(:,3);
qyaw = quaternion([sqrt(2)/2 0 0 sqrt(2)/2]);
for i = 1:length(acc)
[orientation(i), angularVelocity(i,:)] = aFilter(acc(i,:),gyr(i,:),mag(i,:));
% orientation(i) = orientation(i)*qyaw;
end
release(aFilter);
orientation = conj(orientation); % conjugate: NED->BODY becomes BODY->NED
%rotate the accelerometers and gyroscope signals
for i = 1:size(acc,1)
acc(i,:) = rotateframe(orientation(i), acc(i,:));
gyr(i,:) = rotateframe(orientation(i), gyr(i,:));
end
If you see anything glaringly wrong with this, please feel free to let me know.
Thanks again -- really appreciated.
Flipping the signs of some of the components of the data inputs does not seem like a correct thing to be doing. If the sensors are fixed in the object body and are giving BODY frame readings, then the filter should be able to handle them as is.
If you leave the data signs as is, call the filter, and then use the quaternion conjugate to change coordinate frames of the data, maybe you can explain why you don't like the results. What is it about that result that doesn't match what you expect, and why?
Hi James, thank you so much for thinking about this with me.
I've attached three pictures to help explain my problem. The AHRS filter outputs angular velocity in the body sensor frame, which is great, but I still need to rotate my acceleration signal down the road, so I need to make sure the way I rotate the acc and gyr signals is correct. I am checking this by comparing the rotated gyr signal against the angularVelocity output from the AHRS filter. If the two match perfectly, then the sensor has been rotated correctly.
After the rotation the yaw looks great, but the pitch and roll do not. I've attached images to explain this.
It looks like the sensor is rotated off by some amount between the X and Y axes, which is why the signal during roll and pitch is split between the axes.
Thanks so much for your help.
James Tursa on 24 Aug 2020
Edited: James Tursa on 24 Aug 2020
Is the angularVelocity result from the filter coordinatized in the BODY frame? What do you get when you plot it on top of the original gyr data? E.g., scale the original gyr data to get it into rad/sec units and then plot it with angularVelocity. Do they match up (except for a bias)?
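A sketch of that check, assuming the variable names from the thread and that the raw `gyr` readings are in deg/s while `angularVelocity` from the filter is in rad/s (both assumptions; adjust the scaling to match your sensor's actual units):

```matlab
% Sketch: overlay the raw gyro data (scaled to rad/s) on the filter's
% angularVelocity output. If the filter's result is in the body frame,
% the two traces should line up, apart from the filter's bias estimate.
gyrRad = deg2rad(gyr);             % scale raw gyro readings to rad/s
figure;
plot(gyrRad(:,1), 'b'); hold on;   % raw X-axis angular rate
plot(angularVelocity(:,1), 'r--'); % filter's X-axis angular rate
legend('raw gyr X (rad/s)', 'angularVelocity X');
xlabel('sample'); ylabel('angular rate (rad/s)');
```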
Hi James, I can't tell you how much I appreciate the help.
Yes, the angularVelocity result from the AHRS filter is in the SENSOR BODY FRAME. Here is the angular velocity from the AHRS filter plotted with the raw gyroscope angular velocity signal.
Yes, the original data and the angular velocity from the AHRS filter match up.
So then I don't understand why you are comparing a rotated gyr signal against the angularVelocity filter result. The original gyr signal is in BODY frame and that already matches the angularVelocity result in BODY frame, so why the rotated gyr data comparison? I am still not following.
OK, perhaps I am not understanding my own problem. I am going to work on this myself more and hopefully I'll figure it out. I may post back here in the future if I can explain it better.
Thanks for your help James.
Adam


Asked on 22 Aug 2020
Commented on 6 Dec 2022
