Around the world, companies are racing to create fully autonomous vehicles. But according to Professor Steven Waslander (UTIAS), there’s one scenario facing all competitors that hasn’t yet received as much attention as it deserves: winter.
“Winter conditions aggravate the remaining challenges in autonomous driving,” says Waslander. “Reduced visibility limits perception performance, and slippery road surfaces are a big challenge for vehicle control.”
To drive safely in all conditions, including winter, Waslander says that autonomous vehicles will need to fully observe their surroundings despite limits to their sensor range, get advance warning of challenging situations and react very quickly to changing conditions.
Waslander — along with fellow U of T Engineering professors and members of the U of T Robotics Institute including Timothy Barfoot, Jonathan Kelly and Angela Schoellig (all UTIAS) — is leading a new partnership that will take on these challenges by bringing together the best minds from academia and industry.
WinTOR: Autonomous Driving in Adverse Conditions is a new collaboration that aims to transform Toronto into a global hub for research and development related to autonomous driving in winter. Corporate partners include leading companies in the autonomous vehicle sector such as General Motors Canada, LG Electronics, Applanix and Algolux.
The team already has a track record of success. Last year, Waslander and his collaborators published the Canadian Adverse Driving Conditions (CADC) dataset. Created using the Autonomoose, a self-driving vehicle designed by Waslander and his team, the openly available dataset records real winter driving conditions from roads in southwestern Ontario.
The CADC, which is already being used by researchers from around the world to train new AI software, joins a long list of research accomplishments from each of the group’s professors. Their expertise covers the full extent of autonomous driving perception and planning domains, including object detection and tracking, robust state estimation and calibration, localization and mapping, prediction and planning, and safety-critical learning control.
All four professors are advisors to the aUToronto team, a group of undergraduate and graduate students who have designed and built a self-driving electric vehicle called Zeus. The U of T Engineering team has placed first in the international AutoDrive Challenge in each of the four annual competitions to date.
The new partnership is supported by more than $12 million in funding from a variety of sources. These include a grant recently announced by the Ontario Research Fund – Research Excellence program, as well as funding from the Natural Sciences and Engineering Research Council, and direct and in-kind donations from the project partners. More than 20 individuals will work as part of the team, including graduate and undergraduate students, professors and engineers.
“We are excited to understand how we can apply the collaborative research under this program to real-world scenarios,” says Louis Nastro, Director, Land and Autonomous Vehicle Strategy at Applanix. “It gives us an opportunity to attract highly talented individuals with the experience needed to join our team, and helps Canada establish itself as a leading provider of advancement in autonomy.”
“Algolux’s mission is to solve the issue of computer vision robustness in harsh driving conditions, a fundamental problem not effectively addressed by current approaches,” says Felix Heide, co-founder and CTO of Algolux. “As a Canadian company, we are thrilled to bring our expertise to this project and continue to advance the state-of-the-art in perception technologies.”
“We are excited and proud to be a partner of this initiative,” says Kevin Ferreira, Director of the LG Electronics Toronto AI Lab. “It is challenging to drive social impact with game-changing technology in a fast-moving industry such as autonomous driving. Partnerships and collaborations such as this initiative are an effective strategy to make contributions to the research community, and to deliver impactful applications that improve safety while driving.”
The project is divided into three broad themes:
- Sensor Filtering for Object Detection — New ways of analyzing the data from sensors such as visual cameras, radar and lidar will help to separate the signals that represent real objects from the noise caused by falling or blowing snow. Strategies will include both pre-processing techniques and improved artificial intelligence algorithms trained to be aware of the limits of their own performance.
- Sensor Fusion, Localization and Tracking — While today’s self-driving cars can reliably determine where they are in relation to their surroundings, the techniques they use start to break down under adverse driving conditions. The team will leverage new algorithmic strategies in vision and lidar registration, as well as new sensing options such as ground-penetrating and automotive radar, to make localization algorithms more resilient in adverse conditions.
- Prediction, Planning and Control — This theme will enable self-driving cars of the future to change the way they drive in response to winter hazards. For example, they might take a slightly different path to avoid a snowdrift, or slow down when driving over a section of road that their sensors have perceived as particularly slippery. They will learn the implications of adverse weather on the vehicles around them, and be able to assess the increased uncertainty of outcomes, enabling them to plan actions that can be executed reliably in winter conditions.
As ambitious as the current plan is, the team is just getting started. They are leveraging their network — including the Robotics Institute, one of more than a dozen Institutional Strategic Initiatives across U of T — to grow their team.
“We continue to seek additional faculty, partners and funding to grow the effort,” says Waslander. “We have many more ideas to work on, from multi-hypothesis prediction and interaction planning to attentive perception and explainable, efficient AI for autonomous driving.”