Generate High Definition Scene from Lane Detections and OpenStreetMap
This example shows how to generate a High-Definition (HD) scene containing roads with lanes by using the data extracted from OpenStreetMap®, Global Positioning System (GPS), and lane detections.
Lanes in the roads of virtual driving scenarios are essential for safety assessment of automated driving applications. You can create a real-world scenario from a Standard-Definition (SD) map imported from OpenStreetMap®. However, SD maps lack the detailed lane information that an autonomous system requires for navigation. This example shows a workflow that generates an HD map with detailed lane information from an SD map by using recorded lane detections and GPS sensor data.
This figure shows the steps of the workflow.
Load Sensor Data
This example requires the Scenario Builder for Automated Driving Toolbox™ support package. Check whether the support package is installed and, if it is not, install it by using Get and Manage Add-Ons.
checkIfScenarioBuilderIsInstalled
Download a ZIP file containing the GPS data, lane detections, and camera information, and then unzip the file. In this example, you use the camera data for visual validation of the generated scene.
dataFolder = tempdir;
dataFilename = "PolysyncSensorData_23a.zip";
url = "https://ssd.mathworks.com/supportfiles/driving/data/"+dataFilename;
filePath = fullfile(dataFolder,dataFilename);
if ~isfile(filePath)
    websave(filePath,url);
end
unzip(filePath,dataFolder);
dataset = fullfile(dataFolder,"PolysyncSensorData");
data = load(fullfile(dataset,"sensorData.mat"));
Load the GPS data.
gpsData = data.GPSData;
gpsData is a table with these columns:

timeStamp — Time, in microseconds, at which the GPS data was collected.
latitude — Latitude coordinate values of the ego waypoints. Units are in degrees.
longitude — Longitude coordinate values of the ego waypoints. Units are in degrees.
altitude — Altitude coordinate values of the ego waypoints. Units are in meters.

Display the first five entries of gpsData.
gpsData(1:5,:)
ans=5×4 table
timeStamp latitude longitude altitude
__________ ________ _________ ________
1.4616e+15 45.532 -122.62 34.432
1.4616e+15 45.532 -122.62 34.434
1.4616e+15 45.532 -122.62 34.395
1.4616e+15 45.532 -122.62 34.379
1.4616e+15 45.532 -122.62 34.383
Load the recorded lane detection data. Alternatively, you can generate lane detections by processing raw camera data. For more information on how to generate lane detections from camera data, see the Extract Lane Information from Recorded Camera Data for Scene Generation example.
laneDetections = data.LaneDetections
laneDetections = 
  laneData with properties:

           TimeStamp: [715×1 double]
    LaneBoundaryData: {715×1 cell}
     LaneInformation: []
           StartTime: 1.4616e+15
             EndTime: 1.4616e+15
          NumSamples: 715
laneDetections is a laneData object. For more information on how to store lane detection data in a laneData object, see the Preprocess Lane Detections for Scenario Generation example.

Read and display the first five entries of laneDetections.
sampleData = readData(laneDetections, RowIndices=(1:5)')
sampleData=5×3 table
TimeStamp LaneBoundary1 LaneBoundary2
__________ ___________________________ ___________________________
1.4616e+15 {1×1 parabolicLaneBoundary} {1×1 parabolicLaneBoundary}
1.4616e+15 {1×1 parabolicLaneBoundary} {1×1 parabolicLaneBoundary}
1.4616e+15 {1×1 parabolicLaneBoundary} {1×1 parabolicLaneBoundary}
1.4616e+15 {1×1 parabolicLaneBoundary} {1×1 parabolicLaneBoundary}
1.4616e+15 {1×1 parabolicLaneBoundary} {1×1 parabolicLaneBoundary}
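Each detection in this table is a parabolicLaneBoundary object. As an optional check, not part of the original workflow, you can evaluate one detected boundary at a few longitudinal distances ahead of the ego vehicle by using the computeBoundaryModel function. This minimal sketch uses illustrative distance values of 0 to 30 meters.

% Evaluate the first detected lane boundary at longitudinal distances of
% 0-30 meters ahead of the ego vehicle. computeBoundaryModel returns the
% lateral offset of the boundary, in meters, at each distance.
firstBoundary = sampleData.LaneBoundary1{1};
xAhead = (0:5:30)';
yOffset = computeBoundaryModel(firstBoundary,xAhead);
disp([xAhead yOffset])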
Load the camera data recorded from a forward-facing monocular camera mounted on the ego vehicle.
cameraData = data.CameraData;
The camera data is a table with two columns:

timeStamp — Time, in microseconds, at which the image data was captured.
fileName — Filenames of the images in the data set.

The images are located in the Camera folder under the dataset directory. Create a table that contains the full file path of each image and its timestamp by using the helperUpdateTable function.
imageFolder = "Camera";
cameraData = helperUpdateTable(cameraData,dataset,imageFolder);
Display the first five entries of cameraData.
cameraData(1:5,:)
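As an optional check, not part of the original workflow, you can read and display the first camera frame to confirm that the updated file paths resolve correctly. This sketch assumes the fileName column stores the image paths; the string conversion handles both cell array and string array storage.

% Read and display the first camera frame to confirm the file paths resolve.
firstFramePath = string(cameraData.fileName(1));
firstFrame = imread(firstFramePath);
imshow(firstFrame)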
Remove data from the workspace to save memory.
clear data;
Crop and Preprocess Sensor Data
Crop the GPS data, lane detections, and camera data relative to the GPS timestamp range by using the helperCropData function.
startTime = gpsData.timeStamp(1);
endTime = gpsData.timeStamp(end);

% Pack all the tables in a cell array.
recordedData = {gpsData,laneDetections,cameraData};

% Crop the data.
recordedData = helperCropData(recordedData,startTime,endTime);
The timestamps in this data set are not in the POSIX® format that Scenario Builder functions require. Convert the timestamps to POSIX time by using the helperNormTimeInSecs function, and specify these options:

scale — Scale for converting the timestamps. Specify 1e-6 to convert microseconds to seconds, or 1 if the recorded timestamps are already in seconds.
offset — Offset that sets the simulation start time. Driving scenarios support a start time of 0.
scale = 1e-6;
offset = startTime;
recordedData = helperNormTimeInSecs(recordedData,offset,scale);
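For reference, this minimal sketch shows the conversion that the helper is assumed to apply to each timestamp column: subtract the recording start time, then rescale from microseconds to seconds. See the helperNormTimeInSecs function for the exact implementation.

% Assumed equivalent conversion for a single timestamp vector: shift by the
% recording start time (offset), then rescale from microseconds to seconds.
tSeconds = (gpsData.timeStamp - offset)*scale;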
Extract the GPS data, lane detections, and camera data with the updated timestamp values from recordedData.
gpsData = recordedData{1,1};
laneDetections = recordedData{1,2};
cameraData = recordedData{1,3};
Remove recordedData from the workspace.
clear recordedData;
Extract Map Roads Using GPS Data
Create a geographic player using the geoplayer object and display the full route using the GPS data.
zoomLevel = 16;
center = mean([gpsData.latitude gpsData.longitude]);
player = geoplayer(center(1),center(2),zoomLevel);
plotRoute(player,gpsData.latitude,gpsData.longitude);
Obtain the geographic bounding box coordinates from the GPS data by using the getMapROI function.
mapStruct = getMapROI(gpsData.latitude,gpsData.longitude);
Download the map file that covers the region of interest from the OpenStreetMap (OSM) website. OpenStreetMap provides access to worldwide, crowd-sourced map data. The data is licensed under the Open Data Commons Open Database License (ODbL). For more information on the ODbL, see the Open Data Commons Open Database License site.
url = mapStruct.osmUrl;
filename = "drive_map.osm";
websave(filename,url,weboptions(ContentType="xml"));
Extract the road properties and the geographic reference coordinates by using the roadprops function. You later use the reference coordinates to identify the ego roads.
[roadData,geoReference] = roadprops("OpenStreetMap",filename);
Display the first five entries of roadData.
roadData(1:5,:)
ans=5×10 table
RoadID JunctionID RoadName RoadCenters RoadWidth BankAngle Heading Lanes LeftBoundary RightBoundary
______ __________ _____________________________ _____________ _________ _____________ _____________ ____________ ______________ ______________
1 0 "5510377-617349836-617349835" {15×3 double} 7.35 {15×1 double} {15×1 double} 1×1 lanespec { 83×3 double} { 83×3 double}
2 0 "5510384" { 5×3 double} 7.35 { 5×1 double} { 5×1 double} 1×1 lanespec { 16×3 double} { 16×3 double}
3 0 "5510384" {14×3 double} 7.35 {14×1 double} {14×1 double} 1×1 lanespec { 73×3 double} { 73×3 double}
4 0 "5511492" {15×3 double} 7.35 {15×1 double} {15×1 double} 1×1 lanespec {161×3 double} {161×3 double}
5 0 "5511492" { 5×3 double} 7.35 { 5×1 double} { 5×1 double} 1×1 lanespec { 93×3 double} { 93×3 double}
Visualize the imported roads by using the helperPlotRoads function. Notice that the roads do not contain lane information. Define cropWindow, of the form [x y width height], to crop the map, zoom in, and display it. The x and y elements are the coordinates of the top-left corner of the crop window.
cropWindow = [-30 -30 60 60];
helperPlotRoads(roadData,cropWindow);
Select Ego Roads from Road Network
Convert the geographic GPS coordinates to local East-North-Up (ENU) coordinates by using the latlon2local function. The transformed coordinates define the trajectory waypoints of the ego vehicle. Units are in meters.
[xEast,yNorth,zUp] = latlon2local(gpsData.latitude,gpsData.longitude,gpsData.altitude,geoReference);
waypoints = [xEast,yNorth,zUp];
Raw GPS data often contains noise. Smooth the GPS waypoints by using the smoothdata function.
window = round(size(waypoints,1)*0.2);
waypoints = smoothdata(waypoints,"rloess",window);
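To visually verify the effect of smoothing, you can plot the raw and smoothed waypoints in the ENU frame. This optional sketch reuses the xEast and yNorth values computed earlier.

% Compare the raw and smoothed ego waypoints in the ENU frame.
figure
plot(xEast,yNorth,".",DisplayName="Raw GPS waypoints")
hold on
plot(waypoints(:,1),waypoints(:,2),LineWidth=1.5,DisplayName="Smoothed waypoints")
hold off
xlabel("East (m)")
ylabel("North (m)")
legend
axis equal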
If GPS data suffers from inaccuracies in position and orientation, then you must improve ego vehicle localization to generate accurate ego trajectory. For more information, see the Ego Vehicle Localization Using GPS and IMU Fusion for Scenario Generation example.
Create the ego trajectory using the waypoints and their corresponding times of arrival by using the waypointTrajectory (Sensor Fusion and Tracking Toolbox) System object™. This example does not consider the altitude values of the waypoints, so set all altitude values to 0. You must set the ReferenceFrame property of this System object to ENU because Scenario Builder for Automated Driving Toolbox® supports the ENU format for local coordinate data.
waypoints = double([waypoints(:,1:2) zeros(size(zUp))]);
egoTrajectory = waypointTrajectory(waypoints,gpsData.timeStamp,ReferenceFrame="ENU");
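As an optional check, you can sample the interpolated ego pose at a few times within the trajectory duration by using the lookupPose object function of waypointTrajectory. The sample times below are chosen only for illustration.

% Query the interpolated ego position (meters, ENU) and orientation
% (quaternion) at five times within the trajectory duration.
sampleTimes = linspace(gpsData.timeStamp(1),gpsData.timeStamp(end-1),5)';
[egoPositions,egoOrientations] = lookupPose(egoTrajectory,sampleTimes);
egoPositions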
Select the properties of the roads on which the ego vehicle travels by using the selectActorRoads function.
egoRoadData = selectActorRoads(roadData,egoTrajectory.Waypoints);
Display the egoRoadData table.
egoRoadData
egoRoadData=1×10 table
RoadID JunctionID RoadName RoadCenters RoadWidth BankAngle Heading Lanes LeftBoundary RightBoundary
______ __________ __________________ ______________ _________ _____________ _____________ ____________ ______________ ______________
76 0 "Banfield Freeway" {439×3 double} 10.95 {35×1 double} {35×1 double} 1×1 lanespec {619×3 double} {619×3 double}
Visualize the selected roads by using the helperPlotRoads function. Notice that the selected ego roads do not have lane information.
helperPlotRoads(egoRoadData,cropWindow,'DataTips',true);
Generate Lane Specifications for Ego Roads
Visualize the lane detections and camera images by using the birdsEyePlot function and the helperPlotLanes function.
% Initialize the figure with bird's-eye plot.
currentFigure = figure(Position=[0 0 1400 600]);
hPlot = axes(uipanel(currentFigure,Position=[0 0 0.5 1],Title="Recorded Lanes"));
bep = birdsEyePlot(XLim=[0 60],YLim=[-35 35],Parent=hPlot);
camPlot = axes(uipanel(currentFigure,Position=[0.5 0 0.5 1],Title="Camera View"));
helperPlotLanes(bep,camPlot,laneDetections,cameraData);
Create the reference lane specifications by using the lanespec function and a dictionary object. Specify three lanes based on visual inspection of the camera data, and extract the RoadID of the ego road from the egoRoadData table. The reference lane specifications provide the correct number of lanes for each road in the scene, because roads imported from OpenStreetMap do not always contain this information. Create a lanespec object with the specified number of lanes and assign it to the respective RoadID in the reference lane specifications.
numOfLanes = 3;
roadID = egoRoadData.RoadID(1);
refLaneSpec = dictionary;
refLaneSpec(roadID) = {lanespec(numOfLanes)};
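As an optional check, you can retrieve the lanespec object stored for the ego road and confirm its number of lanes and default lane widths.

% Retrieve and inspect the reference lane specification for the ego road.
specCell = refLaneSpec(roadID);   % dictionary lookup returns the cell value
egoLaneSpec = specCell{1};
egoLaneSpec.NumLanes              % expected value: 3
egoLaneSpec.Width                 % default lane widths, in meters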
Specify the lane index for the first waypoint of the ego vehicle. You must number the lanes in left-to-right order along the travel direction of the ego vehicle. For this example, the first waypoint of the ego vehicle is in the third lane, as shown in the camera image.
firstEgoWaypointLaneIdx = 3;
Update the lane specifications with the information extracted from the lane detections by using the updateLaneSpec function. The camera image shows that all three lanes of the ego road have nearly the same width. To replicate the ego lane width for the non-detected lanes, set the ReplicateUpdatedLaneWidth argument to true.
egoRoadsWithUpdatedLanes = updateLaneSpec(laneDetections,egoRoadData,refLaneSpec,...
egoTrajectory,firstEgoWaypointLaneIdx,ReplicateUpdatedLaneWidth=true);
Visualize the ego roads with updated lane specifications by using the helperPlotRoads function.
helperPlotRoads(egoRoadsWithUpdatedLanes,cropWindow);
Simulate Scenario with Ego Vehicle and Updated Lanes
Define the parameters to create a driving scenario. Specify endTime as the last GPS timestamp, and calculate sampleTime as the minimum difference between two consecutive GPS timestamps.
endTime = gpsData.timeStamp(end);
sampleTime = min(diff(gpsData.timeStamp));
Create a driving scenario by using the drivingScenario object. Specify the SampleTime and StopTime for the driving scenario.
scenario = drivingScenario(SampleTime=sampleTime,StopTime=endTime);
Add the roads with updated lane specifications by using the helperAddRoads2Scene function. This helper function adds roads to the driving scenario by using the road function.
scenario = helperAddRoads2Scene(scenario,egoRoadsWithUpdatedLanes);
Add the ego vehicle trajectory to the scenario by using the helperAddEgo2Scene function.
scenario = helperAddEgo2Scene(scenario,egoTrajectory);
Visualize the generated scenario and compare it with the recorded camera data by using the helperVisualizeHDSceneWithEgo function.
currentFigure = figure(Name="Generated Scene",Position=[0 0 1400 600]);
helperVisualizeHDSceneWithEgo(currentFigure,scenario,cameraData);
Create and Import RoadRunner HD Map
An HD map contains lane information, which is useful for automated driving applications such as sensing, perception, localization, and planning. Create a RoadRunner HD Map from the generated driving scenario by using the getRoadRunnerHDMap function.
rrHDMap = getRoadRunnerHDMap(scenario);
Note that the getRoadRunnerHDMap function has some limitations. For more details, see getRoadRunnerHDMap.
Visualize the HD map roads by using the helperPlotRoads function.
helperPlotRoads(rrHDMap,cropWindow)
Write the RoadRunner HD Map to a binary file, which you can import into RoadRunner.
write(rrHDMap,"RoadRunnerHDMapFromLanes.rrhd");
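Optionally, you can verify the exported file by reading it back and plotting it. This sketch assumes the read and plot object functions of the roadrunnerHDMap object; check the roadrunnerHDMap reference page for the exact syntax.

% Read the binary HD map file back and plot it to verify the export.
rrMapCheck = roadrunnerHDMap;
read(rrMapCheck,"RoadRunnerHDMapFromLanes.rrhd");
figure
plot(rrMapCheck)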
To open RoadRunner using MATLAB, specify the path to your RoadRunner project. This code shows the path for a sample project folder location in Windows®.
rrProjectPath = "C:\RR\MyProject";
Specify the path to your local RoadRunner installation folder. This code shows the path for the default installation location in Windows.
rrAppPath = "C:\Program Files\RoadRunner R2022b\bin\win64";
Open RoadRunner using the specified path to your project.
rrApp = roadrunner(rrProjectPath,InstallationFolder=rrAppPath);
Import the RoadRunnerHDMapFromLanes.rrhd scene file into RoadRunner.
importScene(rrApp,fullfile(pwd,"RoadRunnerHDMapFromLanes.rrhd"),"RoadRunner HD Map");
This figure shows a scene built using the RoadRunner Scene Builder Tool.
Note: Use of the Scene Builder Tool requires a RoadRunner Scene Builder license.
See Also
Functions
getMapROI | roadprops | selectActorRoads | updateLaneSpec | cubicLaneBoundary | parabolicLaneBoundary
Related Topics
- Overview of Scenario Generation from Recorded Sensor Data
- Smooth GPS Waypoints for Ego Localization
- Extract Lane Information from Recorded Camera Data for Scene Generation
- Extract Vehicle Track List from Recorded Lidar Data for Scenario Generation
- Extract Vehicle Track List from Recorded Camera Data for Scenario Generation
- Generate Scenario from Actor Track List and GPS Data