Simulate Radar Sensor Mounted On UAV

The radar sensor enables a UAV to detect other vehicles in the airspace, so that the UAV can predict the motion of those vehicles and maintain clearance from them. This example shows how to simulate a radar sensor mounted on a UAV using the uavScenario and radarDataGenerator objects. During the scenario simulation, the radarDataGenerator object generates flight tracks of another vehicle in the scenario. The ego vehicle can use this track information to determine whether a collision is imminent and whether a change to its flight plan is required.

Create UAV Scenario with Custom Radar Sensor

The test scenario consists of two UAVs. The fixed-wing UAV is the target vehicle, and the multirotor UAV tracks it using a mounted radar sensor.

% Use a fixed random seed for simulation repeatability.
rng(0)

% Create a scenario that runs for 10 seconds.
s = uavScenario("StopTime",10,"HistoryBufferSize",200);

% Create a fixed-wing target that moves from [30 0 0] to [20 10 0].
target = uavPlatform("Target",s,"Trajectory",waypointTrajectory([30 0 0; 20 10 0],"TimeOfArrival",[0 10])); 
updateMesh(target,"fixedwing", {1}, [1 0 0], eul2tform([0 0 pi]));

% Create a quadrotor that moves from [0 0 0] to [10 10 0].
egoMultirotor = uavPlatform("EgoVehicle",s,"Trajectory",waypointTrajectory([0 0 0; 10 10 0],"TimeOfArrival",[0 10]));
updateMesh(egoMultirotor,"quadrotor",{1},[0 1 0],eul2tform([0 0 pi]));

% Mount a radar on the quadrotor.
radarSensor = radarDataGenerator("no scanning","SensorIndex",1,"UpdateRate",10,...
    "FieldOfView",[120 80],...
    "HasElevation", true,...
    "ElevationResolution", 3,...
    "AzimuthResolution", 1, ...
    "RangeResolution", 10, ... meters
    "RangeRateResolution",3,...
    "RangeLimits", [0 750],...
    "TargetReportFormat","Tracks",...
    "TrackCoordinates",'Scenario',...
    "HasINS", true,...
    "HasFalseAlarms",true,...
    "FalseAlarmRate",1e-7,...
    "HasRangeRate",true);
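
Before running the simulation, you can optionally sanity-check the configured sensor geometry. This is a minimal sketch that queries the sensor with coverageConfig, the same function the simulation loop uses later to orient the field-of-view patch; the displayed field names assume the standard coverage-configuration structure returned for a radarDataGenerator.

```matlab
% Inspect the coverage geometry implied by the radar configuration above.
% coverageConfig returns a struct describing the sensor's look angle,
% field of view, and range in the sensor frame.
cfg = coverageConfig(radarSensor);
disp(cfg.FieldOfView)   % azimuth and elevation extent, in degrees
disp(cfg.Range)         % maximum detection range, in meters
```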

% Create the sensor. ExampleHelperUAVRadar inherits from the uav.SensorAdaptor class.
radar = uavSensor("Radar",egoMultirotor,ExampleHelperUAVRadar(radarSensor),"MountingAngles", [0 0 0]);

Preview the scenario using the show3D function.

[ax,plotFrames] = show3D(s);
xlim([-5,15]);
ylim([-5,35]);
hold on

Figure: 3-D preview of the scenario showing the fixed-wing target and the multirotor ego vehicle meshes.

Simulate and Visualize Radar Detections

Set up the scenario, run the simulation, and check the detections.

% Add detection and sensor field of view to the plot.
trackSquare = plot3(plotFrames.NED,nan,nan,nan,"-");
radarDirection = hgtransform("Parent",plotFrames.EgoVehicle.Radar,"Matrix",eye(4));
% Span +/- half of the azimuth field of view, converted from degrees to radians.
coverageAngles = linspace(-radarSensor.FieldOfView(1)/360*pi, radarSensor.FieldOfView(1)/360*pi, 128);
coveragePatch = patch([0 radarSensor.RangeLimits(2)*cos(coverageAngles) 0], ...
    [0 radarSensor.RangeLimits(2)*sin(coverageAngles) 0],...
    "blue","FaceAlpha",0.3,...
    "Parent",radarDirection);
hold(ax,"off");

% Start simulation.
setup(s);
while advance(s)
    % Update sensor readings and read data.
    updateSensors(s);
    
    % Plot updated radar FOV.
    egoPose = read(egoMultirotor);
    radarFOV = coverageConfig(radarSensor, egoPose(1:3),quaternion(egoPose(10:13)));
    radarDirection.Matrix = eul2tform([radarFOV.LookAngle(1)/180*pi 0 0]);
    
    % Obtain detections from the radar and visualize them.
    [isUpdated,time,confTracks,numTracks,config] = read(radar);
    if numTracks > 0
        % The track state follows the constant-velocity convention
        % [x; vx; y; vy; z; vz], so positions are elements 1, 3, and 5.
        trackSquare.XData = [trackSquare.XData,confTracks(1).State(1)];
        trackSquare.YData = [trackSquare.YData,confTracks(1).State(3)];
        trackSquare.ZData = [trackSquare.ZData,confTracks(1).State(5)];
        drawnow limitrate
    end
    
    show3D(s,"FastUpdate", true,"Parent",ax);
    pause(0.1);
end

Figure: simulation view showing the radar field-of-view wedge and the accumulated track of the target UAV.

The target UAV track is visualized during the simulation. Using this track, the ego vehicle can determine whether a collision is imminent, which enables you to implement obstacle avoidance algorithms and test them with this scenario.
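
The decision logic itself is not part of this example, but a simple proximity check can be built directly on the track output. The sketch below assumes the constant-velocity track state layout [x; vx; y; vy; z; vz] and uses a hypothetical safety radius; it computes the closest point of approach (CPA) between the ego vehicle and the tracked target from a single track update inside the simulation loop.

```matlab
% Hypothetical collision check from one track update (illustrative sketch).
% Assumes the track state is [x; vx; y; vy; z; vz] and that egoPose comes
% from read(egoMultirotor): position in elements 1:3, velocity in 4:6.
safetyRadius = 5;                               % meters (example value)
pT = confTracks(1).State([1 3 5]);              % target position
vT = confTracks(1).State([2 4 6]);              % target velocity
pE = egoPose(1:3)'; vE = egoPose(4:6)';         % ego position and velocity
dp = pT - pE; dv = vT - vE;                     % relative position and velocity
tCPA = max(0, -dot(dp,dv)/max(dot(dv,dv),eps)); % time of closest approach
dCPA = norm(dp + tCPA*dv);                      % predicted miss distance
if dCPA < safetyRadius
    disp("Predicted conflict: consider replanning the ego flight path.")
end
```

Here safetyRadius and the displayed message are illustrative placeholders; a practical avoidance algorithm would also account for track uncertainty, available in confTracks(1).StateCovariance.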