Tata Motors European Technical Centre Accelerates Development of Autonomous Vehicle Control Algorithms with Model-Based Design

“A small team of engineers pulled together an autonomous vehicle with off-the-shelf hardware and control algorithms developed and implemented with Model-Based Design. Though the system isn’t production-ready, it does demonstrate important design concepts with a pragmatic design approach.”

Dr. Mark Tucker, Lead Engineer, Tata Motors European Technical Centre

Challenge

Build and demonstrate an autonomous vehicle for the UK Autodrive project

Solution

Use Model-Based Design to model, simulate, and generate embedded code for motion planning and vehicle control algorithms

Results

  • Real-time controller implementation accelerated
  • Debugging simplified
  • Development time focused on design

Trials for TMETC’s autonomous vehicle in Coventry, UK.

In its 2013 Autumn Statement, the UK government introduced measures to encourage the development of self-driving cars in the UK. In July 2014, the UK’s innovation agency, Innovate UK, launched the “Introducing driverless cars to UK roads” competition. UK Autodrive was one of three projects awarded funding. The project brought together leading automotive companies, academic institutions, legislators, insurers, and other stakeholders in a three-year trial of self-driving vehicles and connected car technologies, with the aim of establishing the UK as a global hub for the research, development, and integration of self-driving vehicles and associated technologies.

As part of UK Autodrive, Tata Motors European Technical Centre (TMETC) developed autonomous driving software and deployed it in a Tata Hexa SUV equipped with off-the-shelf drive-by-wire hardware. A small team of engineers from TMETC developed the sensor perception, motion planning, and vehicle control algorithms. Model-Based Design with MATLAB® and Simulink® enabled this team to move quickly from design on paper to simulations and then to running on an embedded ECU in the vehicle.

“With Simulink, we could concentrate on the high-level design implementation rather than low-level coding,” says Dr. Mark Tucker, Lead Engineer at TMETC. “This was important to us, as delivering a functional vehicle was our goal, not demonstrating our coding skills.”

Challenge

The TMETC team aimed to deliver a demonstrable self-driving vehicle with a small team of engineers while keeping the project on schedule and on budget. To meet these objectives, they relied on off-the-shelf components where possible and looked for ways to shorten development time for core control algorithms.

A principal design challenge was integrating the many disparate elements of the system. These elements included radar, lidar, GPS, inertial measurement, and mono vision, as well as algorithms for sensor fusion, motion planning, simultaneous localization and mapping, and vehicle control.

All communication between elements had to be logged to comply with UK guidance on automated vehicle testing, particularly “The Pathway to Driverless Cars: A Code of Practice for Testing,” published by the Department for Transport. The team decided to use the Robot Operating System (ROS) middleware to address the integration and logging requirements. As a result, the algorithms they wrote needed ROS interfaces, and the team needed a way to visualize and analyze logged ROS data.
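One way to inspect such logs offline is through the rosbag support in MATLAB and Robotics System Toolbox, roughly as sketched below; the bag file name and topic are placeholders, not the team’s actual logs.

    % Minimal sketch: inspecting a logged ROS bag in MATLAB with
    % Robotics System Toolbox. File name and topic are placeholders.
    bag = rosbag('trial_run.bag');              % open the logged bag file
    bag.AvailableTopics                         % list the topics that were recorded

    odomSel  = select(bag, 'Topic', '/odom');   % pick one topic of interest
    odomMsgs = readMessages(odomSel);           % read its messages as a cell array

    % Extract planar position from the odometry messages and plot the path
    x = cellfun(@(m) m.Pose.Pose.Position.X, odomMsgs);
    y = cellfun(@(m) m.Pose.Pose.Position.Y, odomMsgs);
    plot(x, y); xlabel('x (m)'); ylabel('y (m)'); title('Logged vehicle path');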

Roof-mounted sensors on the autonomous vehicle.

Solution

TMETC’s engineers used Simulink to model, simulate, and generate code for the motion planning and vehicle control algorithms deployed in the autonomous Hexa.

The team developed three vehicle control algorithms: pure pursuit, lane keeping, and model predictive control. To evaluate each algorithm, they integrated it with simple lateral and longitudinal models of the vehicle and ran closed-loop simulations.
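As an illustration of this kind of closed-loop evaluation, the sketch below pairs a basic pure pursuit steering law with a kinematic bicycle model in MATLAB. The wheelbase, look-ahead distance, speed, and reference path are illustrative assumptions, not values from the TMETC models.

    % Minimal sketch: pure pursuit steering in closed loop with a kinematic
    % bicycle model. All parameters and the path are illustrative assumptions.
    L  = 2.85;          % wheelbase (m), assumed
    Ld = 6.0;           % look-ahead distance (m), assumed
    v  = 5.0;           % constant speed (m/s) for this sketch
    dt = 0.05;          % simulation step (s)
    s    = linspace(0, 80, 400)';
    path = [s, 2*sin(s/10)];          % gently curving reference path

    x = 0; y = 0; heading = 0;        % initial pose on the path
    N = 320;                          % 16 s of simulated driving
    pose = zeros(N, 2);
    for k = 1:N
        % Pick the path point whose range is closest to the look-ahead distance
        % (simplified; a real implementation would only consider points ahead)
        d = hypot(path(:,1) - x, path(:,2) - y);
        [~, idx] = min(abs(d - Ld));
        target = path(idx, :);

        % Pure pursuit steering law: delta = atan(2*L*sin(alpha) / Ld)
        alpha = atan2(target(2) - y, target(1) - x) - heading;
        delta = atan2(2*L*sin(alpha), Ld);

        % Kinematic bicycle model update
        x = x + v*cos(heading)*dt;
        y = y + v*sin(heading)*dt;
        heading = heading + v*tan(delta)/L*dt;
        pose(k, :) = [x, y];
    end

    plot(path(:,1), path(:,2), '--', pose(:,1), pose(:,2));
    legend('Reference path', 'Vehicle'); xlabel('x (m)'); ylabel('y (m)');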

The pure pursuit approach lacked sufficient stability, and the lane-keeping approach performed relatively poorly in urban centers, where tight curves must be navigated at low speed. The model predictive controller performed well in simulations spanning a range of operating scenarios.

The team refined the lateral and longitudinal model predictive controllers, which use reference set points, measurements of the vehicle dynamics, and a model of those dynamics to generate optimal control sequences for steering, accelerating, and braking to follow the planned trajectory.
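A lateral controller of this kind can be prototyped in a few lines with Model Predictive Control Toolbox, roughly as sketched below. The plant matrices, horizons, constraints, and weights are placeholder assumptions for illustration, not the settings used in the TMETC controller.

    % Minimal sketch: lateral MPC with an illustrative linear error model.
    % States are [lateral error; heading error], input is steering angle.
    % All numbers are placeholder assumptions.
    Ts = 0.05;                              % controller sample time (s)
    v  = 10;                                % assumed constant speed (m/s)
    L  = 2.85;                              % assumed wheelbase (m)
    A  = [0 v; 0 0];
    B  = [0; v/L];
    plant = c2d(ss(A, B, eye(2), 0), Ts);   % discretize the continuous model

    mpcobj = mpc(plant, Ts, 20, 5);         % prediction horizon 20, control horizon 5
    mpcobj.MV.Min = -0.5;                   % steering limits (rad)
    mpcobj.MV.Max =  0.5;
    mpcobj.Weights.OutputVariables = [1 0.5];   % weight lateral error more heavily

    % Closed-loop simulation: command a 0.5 m lateral offset (e.g., a lane shift)
    T = 100;                                % number of simulation steps
    r = [0.5*ones(T,1), zeros(T,1)];        % reference for [lateral; heading] error
    sim(mpcobj, T, r);                      % plots the closed-loop response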

The team ran hardware-in-the-loop tests to verify the hardware interfaces.

The TMETC team generated code from their motion planning algorithms with Embedded Coder® and deployed it to a Linux-based PC installed in the vehicle. Using Simulink Real-Time™, they deployed their vehicle control algorithms to Speedgoat target hardware installed in the vehicle.
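From the MATLAB command line, generating code for one of these models might look roughly like the sketch below; the model name is a placeholder, and the settings shown are a generic Embedded Coder (ERT) configuration rather than the team’s actual build setup.

    % Minimal sketch: generating embedded C code from a Simulink model with
    % Embedded Coder. The model name and settings are placeholders.
    mdl = 'motion_planner';                          % hypothetical model name
    load_system(mdl);

    set_param(mdl, 'SystemTargetFile', 'ert.tlc');   % Embedded Coder target
    set_param(mdl, 'GenerateMakefile', 'on');
    slbuild(mdl);                                    % generate and compile the code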

The team then conducted on-road tests, logging data via ROS as well as directly from the vehicle controller. They analyzed and visualized the data using RViz, MATLAB, and Robotics System Toolbox™. To debug and further refine the control algorithms, they played logged driving scenario data back through the controllers in simulation.
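One way to replay such logs is to convert a bag selection into a timeseries and feed it into the controller model as an external input, roughly as sketched below; the bag file, topic, and model names are placeholders.

    % Minimal sketch: replaying logged drive data through a controller model
    % in simulation. File, topic, and model names are placeholders.
    bag = rosbag('coventry_trial.bag');
    sel = select(bag, 'Topic', '/vehicle/speed');   % hypothetical logged signal
    loggedTs = timeseries(sel);                     % messages as a MATLAB timeseries

    mdl = 'vehicle_controller';                     % hypothetical controller model
    load_system(mdl);
    set_param(mdl, 'LoadExternalInput', 'on', ...   % drive root inports from the log
                   'ExternalInput', 'loggedTs');
    out = sim(mdl);                                 % step through the logged scenario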

TMETC successfully demonstrated their autonomous vehicle on a mixture of urban roads and grid-based streets in the UK Autodrive project’s vehicle trials in Coventry and Milton Keynes.

Results

  • Real-time controller implementation accelerated. “As soon as we were ready for testing on the vehicle, we used Simulink Real-Time to deploy our vehicle controller to the Speedgoat hardware,” says Tucker.
  • Debugging simplified. “Simulink enabled us to play back data from on-road tests in simulations,” says Tucker. “We could stop the simulation at any point, making it possible to dig into the control model to see what was happening and resolve any quirks we identified in our algorithms.”
  • Development time focused on design. “All the motion planning and vehicle control code was generated from our Simulink models,” says Tucker. “This saved us a lot of time because we could concentrate on the high-level design, not implementing equations and handling exceptions in code. Coding our control algorithms by hand would have been a much larger task.”