Autoware Tutorial


Autoware, open source, autonomous driving, ROS


Autoware is the largest open-source community for autonomous driving. It has found widespread adoption: Autoware is used by 100+ companies, runs on 30+ vehicles, is deployed in 20+ countries, and is used for education in 5 countries. Autoware.AI is the original Autoware project, built on ROS 1 and launched as a research and development platform for researchers, developers, and students interested in autonomous driving technology. Autoware.Auto, now launching, is the next generation of Autoware based on ROS 2. It is managed by an open-source community manager, applies best-in-class software engineering practices, and is based on a redesigned architecture.

This tutorial will start with a holistic overview of all Autoware projects, followed by a set of invited talks presenting algorithms and modules available in Autoware. Part of the workshop will be dedicated to a hands-on tutorial, including how to bring up Autoware on a computer.

The goal of this tutorial is to introduce Autoware to the IV community and to encourage researchers to publish their academic results as open source alongside the papers they present at the conference.

Please click below for details and for submission of presentation proposals.

Tentative Agenda

  • 9:00-10:00 Welcome and Introduction to Autoware
  • 10:00-10:30 Algorithms in Autoware 1
  • 10:30-11:00 Coffee Break and Networking
  • 11:00-11:30 Algorithms in Autoware 2
  • 11:30-12:30 Building Applications with Autoware 1
  • 12:30-13:30 Lunch and Networking
  • 13:30-14:15 Building Applications with Autoware 2
  • 14:15-15:00 Simulation and Autoware
  • 15:00-15:30 Coffee Break and Lightning Talks
  • 15:30-16:00 How to get started with Autoware
  • 16:00-16:45 Panel: Autoware and its impact on autonomous driving
  • 16:45-17:00 Feedback and Fin

Please find more details here: https://www.autoware.org/agenda

Extended Object Tracking and Sensor Fusion:
Theory and Applications Tutorial


Sensor fusion, multi-object-tracking, extended objects, data association, clustering, radar, lidar, vision, autonomous vehicles, environment perception


Environment perception for autonomous vehicles involves the important tasks of detecting, classifying, and tracking all objects of interest in the vicinity of the vehicle, such as vehicles, pedestrians, and bicyclists. Solving these tasks accurately and robustly is paramount for the safe operation of the vehicle. In this context, this tutorial provides an overview of sensor fusion and multi-object tracking techniques. The focus lies on high-resolution sensors (e.g., lidar, radar, and camera devices) that give rise to multiple detections per object. In the first part of the tutorial, the modeling of different object and measurement types is discussed and suitable (single) object tracking approaches are presented. The second part of the tutorial introduces recent data association and clustering methods for tracking multiple objects. Finally, the last part of the tutorial demonstrates applications and discusses pitfalls of these approaches in real-world scenarios.
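To make the "multiple detections per object" setting concrete, the grouping step that precedes tracking can be sketched as a simple distance-gated, single-link clustering of 2D detections. This is an illustrative toy sketch only (the gate value and the connected-components approach are assumptions, not the tutorial's method); practical systems use tuned methods such as DBSCAN or model-based extended-object associations.

```python
import math

def cluster_detections(points, gate=1.0):
    """Group detections whose chain of mutual distances stays within `gate`.

    Single-link clustering via union-find: nearby returns from a
    high-resolution sensor are assumed to stem from one extended object.
    """
    n = len(points)
    parent = list(range(n))

    def find(i):
        # Find the root of i's component, with path compression.
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    def union(i, j):
        parent[find(i)] = find(j)

    # Link every pair of detections closer than the gate distance.
    for i in range(n):
        for j in range(i + 1, n):
            if math.dist(points[i], points[j]) <= gate:
                union(i, j)

    # Collect detections by component root.
    clusters = {}
    for i in range(n):
        clusters.setdefault(find(i), []).append(points[i])
    return list(clusters.values())

# Two well-separated groups of lidar returns yield two clusters.
dets = [(0.0, 0.0), (0.3, 0.1), (0.1, 0.4), (5.0, 5.0), (5.2, 4.9)]
print(len(cluster_detections(dets, gate=1.0)))  # 2
```

Each resulting cluster would then be fed to a (single) extended-object tracker, which is the hand-off the first two parts of the tutorial cover.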