Should the track and measurement be in the same frame or coordinate system, and if not, how will the assignment work?
I'm trying to achieve centralized tracking with raw camera and radar measurements. I track in world coordinates and use the measurement function in the Kalman filter to map my state to the raw measurement. I distinguish between the sensors using SensorIndex, since each sensor has a different measurement function. I have a few questions:
1) If I have only camera measurements at this time instant, the tracker (trackerGNN, trackerJPDA, trackerTOMHT) will predict the state using the Kalman filter model for the camera, based on the SensorIndex I specified. It will then look for detections to assign, and if there is a radar detection that satisfies the assignment cost, it should assign the track to that radar detection. At the next update step, will it switch to the radar measurement model specified by its SensorIndex, since it now has to update with the radar measurement and compute the residual from this equation:
y = Z - h(x)
where
y is the residual,
Z is the measurement,
h() is the measurement function, which maps my state to the measurement space,
x is my track state in world coordinates.
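(For illustration, here is a small sketch of what I mean by that residual, with a made-up state layout and radar measurement model; these are just assumptions for the example, not my real setup.)
% Sketch of the residual for one radar detection.
% Assumed state layout: x = [px; vx; py; vy] in world coordinates,
% radar measuring [range; azimuth].
x = [10; 1; 5; 0];                                      % predicted track state (world frame)
Z = [11.3; 0.45];                                       % raw radar measurement [range; az in rad]
hradar = @(s) [hypot(s(1), s(3)); atan2(s(3), s(1))];   % maps world state to radar space
y = Z - hradar(x);                                      % residual in measurement space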
2) The normalized distance for the assignment is calculated from the residual, so my residual calculation happens in measurement space, but I would like to track in state space, which is world coordinates. How can I achieve this?
Answers (1)
Elad Kivelevitch
on 3 Aug 2023
You mention that your measurement function already knows how to handle different sensor models based on SensorIndex. So, the tracker should be able to do the right thing without any additional work for you.
In other words, given a measurement function h of the following format:
function zexp = h(state, measurementParameters)
% measurementParameters is supplied to the function by the tracker from
% the objectDetection.MeasurementParameters property.
if measurementParameters.SensorIndex == 1 % Let's say 1 is the vision model
    zexp = hvision(state, measurementParameters);
else % Use radar model
    zexp = hradar(state, measurementParameters);
end
end
The hvision and hradar measurement functions should allow you to convert from the state in global coordinates to the expected measurement in the vision and radar measurement spaces, respectively. The measurementParameters passed to those functions define things like where the sensor is mounted, how it is oriented, etc. They are carried in the objectDetection, as described in https://www.mathworks.com/help/fusion/ug/convert-detections-into-objectDetection-format.html
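As a rough sketch of what those two functions could look like (the 6-element constant-velocity state layout, the pinhole camera model, and the OriginPosition/Orientation/Intrinsics fields read from measurementParameters are assumptions for illustration, not a fixed API):
function zexp = hradar(state, measurementParameters)
% Assumed state layout: [px; vx; py; vy; pz; vz] in world coordinates.
pos = state([1 3 5]);
% Rotate/translate the world position into the radar frame using the
% mounting information carried in measurementParameters (assumed fields,
% with Orientation taken as the world-to-sensor rotation matrix).
relPos = measurementParameters.Orientation * (pos - measurementParameters.OriginPosition);
range = norm(relPos);
az    = atan2(relPos(2), relPos(1));
el    = asin(relPos(3) / range);
zexp  = [az; el; range];    % expected radar measurement
end

function zexp = hvision(state, measurementParameters)
% Same world position, projected into the image with a pinhole model.
pos    = state([1 3 5]);
camPos = measurementParameters.Orientation * (pos - measurementParameters.OriginPosition);
K      = measurementParameters.Intrinsics;   % assumed 3x3 intrinsic matrix
p      = K * camPos;
zexp   = p(1:2) / p(3);                      % expected pixel coordinates [u; v]
end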
The trackers assign detections from each sensor separately, using the SensorIndex property of the objectDetections. The cost calculation uses the expected measurement from the function above.
Note that the assignment cost is:
cost = (z - h(x))' / (R + H*P*H') * (z - h(x)) + log(det(R + H*P*H'))
The first part is the Mahalanobis distance, while the log(det()) part is a compensation to avoid assigning detections to tracks with large uncertainty (which causes H*P*H' to be large).
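As a concrete sketch of that calculation (the numbers and the simple linear position measurement are made up, just to make the arithmetic explicit):
% Sketch: normalized-distance cost for one track/detection pair.
% A 2-D linear position measurement z = H*x is assumed for simplicity.
x = [10; 1; 5; 0];                        % predicted state [px; vx; py; vy]
P = diag([4 1 4 1]);                      % predicted state covariance
H = [1 0 0 0; 0 0 1 0];                   % measurement Jacobian
R = 0.25*eye(2);                          % measurement noise
z = [10.6; 4.7];                          % received measurement

res  = z - H*x;                           % residual, z - h(x)
S    = R + H*P*H';                        % innovation covariance
cost = res' / S * res + log(det(S));      % Mahalanobis distance + log-det term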
The tracker uses the measurement functions to update the track state in global coordinates. You don't need to do anything else other than defining the measurement functions correctly.
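For completeness, a minimal sketch of how this could be wired up end to end (the filter initialization function, state size, noise values, and detection contents below are placeholders, not a recommended configuration):
% In initMyFilter.m: point the EKF at the shared measurement function h above.
function filter = initMyFilter(detection)
filter = trackingEKF( ...
    'StateTransitionFcn', @constvel, ...              % constant-velocity motion model
    'MeasurementFcn', @h, ...                         % switches on SensorIndex internally
    'State', zeros(6,1), ...                          % placeholder initial state
    'StateCovariance', 100*eye(6), ...
    'MeasurementNoise', detection.MeasurementNoise);
end

% At setup time (placeholder mounting info and measurement values):
cameraParams = struct('SensorIndex', 1, 'OriginPosition', [0;0;0], ...
    'Orientation', eye(3), 'Intrinsics', eye(3));      % placeholder camera parameters
radarParams  = struct('SensorIndex', 2, 'OriginPosition', [0;0;0], 'Orientation', eye(3));
tracker = trackerGNN('FilterInitializationFcn', @initMyFilter);

t = 0;
camDet   = objectDetection(t, [320; 240], 'SensorIndex', 1, ...
    'MeasurementParameters', cameraParams);            % e.g. pixel coordinates
radarDet = objectDetection(t, [0.4; 0.1; 35], 'SensorIndex', 2, ...
    'MeasurementParameters', radarParams);              % e.g. [az; el; range]
confirmedTracks = tracker({camDet; radarDet}, t);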