Project SNOW

Self-driving Navigation Optimized for Winter

Self-driving cars are expected on our roads soon. In Project SNOW (Self-driving Navigation Optimized for Winter), we focus on the largely unexplored problem of autonomous driving in winter, which still raises reliability concerns. We have the expertise to automatically build 3D maps of an environment while moving through it with robots, and we aim to use this knowledge to investigate mapping and control solutions for the challenging conditions of Canadian weather.

The main goal of this project is to extend the current technology used for autonomous driving toward the unstructured and dynamic environments generated by winter conditions (e.g., a snow-covered forest). The project addresses applications such as autonomous driving in remote areas, autonomous refueling, search and rescue missions, Canadian Arctic sovereignty, and freight transport on Northern ice roads. Our research concentrates on maps built by an unmanned ground vehicle (UGV), which will be able to adapt to environmental changes caused by snow and wind while allowing robust, real-time localization of the vehicle. These maps will also serve as the foundation for novel path planning algorithms handling deformable obstacles and environments (e.g., deep snow under a vehicle). The project is carried out in partnership with General Dynamics Land Systems - Canada (GDLS-C).

To achieve the goals of the project, the following specific objectives are defined:

  • Objective 1—Mapping and Localization: to develop algorithms to allow the UGV to localize and map its environment in winter conditions.
  • Objective 2—Path Planning and Control: to develop algorithms for planning paths and adapting the behavior of the UGV according to weather conditions.
  • Objective 3—Field Testing and Integration: to carry out an extensive series of experiments using the UGV in a snow-covered forest.

Publications

  1. Deschênes, S.-P., Baril, D., Boxan, M., Laconte, J., Giguère, P., & Pomerleau, F. (2024). Saturation-Aware Angular Velocity Estimation: Extending the Robustness of SLAM to Aggressive Motions. Proceedings of the IEEE International Conference on Robotics and Automation (ICRA). Submitted. https://arxiv.org/abs/2310.07844
  2. Courcelle, C., Baril, D., Pomerleau, F., & Laconte, J. (2023). On the Importance of Quantifying Visibility for Autonomous Vehicles under Extreme Precipitation. In H. Abut, G. Schmidt, K. Takeda, J. Lambert, & J. H. L. Hansen (Eds.), Towards Human-Vehicle Harmonization (Vol. 3, pp. 239–250). De Gruyter. https://doi.org/10.1515/9783110981223-018
  3. Pomerleau, F. (2023). Robotics in Snow and Ice. In M. H. Ang, O. Khatib, & B. Siciliano (Eds.), Encyclopedia of Robotics (pp. 1–6). Springer Berlin Heidelberg.
  4. Kubelka, V., Vaidis, M., & Pomerleau, F. (2022). Gravity-constrained point cloud registration. Proceedings of the IEEE International Conference on Intelligent Robots and Systems (IROS), 4873–4879. https://doi.org/10.1109/IROS47612.2022.9981916
  5. Baril, D., Deschênes, S.-P., Gamache, O., Vaidis, M., LaRocque, D., Laconte, J., Kubelka, V., Giguère, P., & Pomerleau, F. (2022). Kilometer-scale autonomous navigation in subarctic forests: challenges and lessons learned. Field Robotics, 2(1), 1628–1660. https://doi.org/10.55417/fr.2022050
  6. Vaidis, M., Giguère, P., Pomerleau, F., & Kubelka, V. (2021). Accurate outdoor ground truth based on total stations. 2021 18th Conference on Robots and Vision (CRV). https://arxiv.org/abs/2104.14396
  7. Baril, D., Grondin, V., Deschênes, S.-P., Laconte, J., Vaidis, M., Kubelka, V., Gallant, A., Giguère, P., & Pomerleau, F. (2020). Evaluation of Skid-Steering Kinematic Models for Subarctic Environments. 2020 17th Conference on Computer and Robot Vision (CRV), 198–205. https://doi.org/10.1109/CRV50864.2020.00034 (Best Robotic Vision Paper Award!)

From theory to practice

The following video shows the day we received the UGV, a Warthog robot made by the Canadian company Clearpath Robotics. We received the robot in August 2019 and have since begun to put our plans into practice:

Integration of the UGV hardware and software

Since receiving the UGV, we have followed a detailed plan of tasks to meet our goals, one of them being to test the robot in a snowy forest during the winter of 2019. The first step was to integrate the sensor suite we had planned for the UGV. This also involved building a solid metal frame with our own hands.

Once the electrical integration was finished, we continued with the software. The mapping software required integration with the UGV's system, mainly communication with its low-level computer, which provides time synchronization and wheel odometry. After finishing these tasks, it was finally time to start testing. We found a convenient place for initial tests in front of our faculty building, and by September 2019, once all the important details were settled, we were able to proceed to the Montmorency forest.
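For readers curious what wheel odometry involves, a skid-steer platform like the Warthog is often approximated by an ideal differential-drive model. The sketch below is purely illustrative; the function and parameter names (e.g., `track_width`) are assumptions, not the Warthog's actual interface.

```python
import math

def integrate_odometry(pose, v_left, v_right, track_width, dt):
    """Advance a planar pose (x, y, theta) by one time step.

    v_left, v_right: wheel-surface velocities [m/s]
    track_width: distance between left and right wheel tracks [m]
    dt: time step [s]
    """
    x, y, theta = pose
    v = 0.5 * (v_left + v_right)              # forward velocity
    omega = (v_right - v_left) / track_width  # turning rate
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return (x, y, theta)

# Drive straight for one second at 1 m/s, sampled at 100 Hz:
pose = (0.0, 0.0, 0.0)
for _ in range(100):
    pose = integrate_odometry(pose, 1.0, 1.0, 1.2, 0.01)
print(pose)  # ~ (1.0, 0.0, 0.0)
```

On snow, wheel slip makes this ideal model optimistic, which is one reason odometry alone is not enough for localization in winter conditions.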

The main goal of the first tests in the forest was to prove that we could safely deploy the robot, record all the necessary data, and possibly run the mapping on board the robot's computers. We discovered some bugs, of course, but they were all quick to fix, and after a few sessions we were able to generate large 3D maps of the forest.

Since the mapping functionality seemed to work fine, we proceeded to implement a basic path-following capability. The robot would first be driven by hand over a desired path while recording its position. Then, a software controller would repeat the path indefinitely, based on the onboard mapping and localization capabilities. The initial tuning of the controller showed some minor oscillations around the learned trajectory:
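The teach-and-repeat scheme described above can be sketched with a simple pure-pursuit-style controller: find the closest taught waypoint, look ahead along the path, and steer toward that point. This is a minimal illustration under assumed names and gains, not the controller actually running on the UGV.

```python
import math

def nearest_index(path, x, y):
    """Index of the taught waypoint closest to the current position."""
    return min(range(len(path)),
               key=lambda i: (path[i][0] - x) ** 2 + (path[i][1] - y) ** 2)

def repeat_command(path, pose, lookahead=1.0, v=0.5):
    """Pure-pursuit-style command (v, omega) toward a lookahead waypoint."""
    x, y, theta = pose
    i = nearest_index(path, x, y)
    # Advance along the taught path until the lookahead distance is reached.
    while i + 1 < len(path) and math.hypot(path[i][0] - x, path[i][1] - y) < lookahead:
        i += 1
    gx, gy = path[i]
    heading_error = math.atan2(gy - y, gx - x) - theta
    # Wrap the error to [-pi, pi] so the robot turns the short way around.
    heading_error = math.atan2(math.sin(heading_error), math.cos(heading_error))
    omega = 2.0 * heading_error  # proportional steering gain (assumed)
    return v, omega

# Taught path: a straight 5 m segment; the robot starts 0.5 m off to the side.
path = [(0.1 * k, 0.0) for k in range(51)]
v, omega = repeat_command(path, (0.0, 0.5, 0.0))
print(omega)  # negative: the robot steers back toward the taught path
```

Too high a steering gain, or a mismatch between commanded and executed motion, is exactly the kind of thing that produces oscillations around the taught trajectory.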

Fortunately, we were able to find the source of the problem: a misalignment between the commanded and the actually executed turning rate of the robot. After fixing it, the controller gave us much more satisfying results:
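A mismatch of this kind can be modeled as a scale factor between the commanded and measured turning rates, which a simple least-squares fit recovers from logged data. The sketch below uses made-up numbers and a hypothetical function name, purely to illustrate the idea.

```python
def fit_turn_rate_gain(commanded, measured):
    """Least-squares gain k minimizing sum((measured - k * commanded)**2)."""
    num = sum(c * m for c, m in zip(commanded, measured))
    den = sum(c * c for c in commanded)
    return num / den

# Fictional logged samples [rad/s]: the robot turns less than commanded.
commanded = [0.2, 0.4, 0.6, 0.8, 1.0]
measured = [0.17, 0.33, 0.50, 0.68, 0.84]
k = fit_turn_rate_gain(commanded, measured)
print(round(k, 2))  # → 0.84

# To actually get 0.5 rad/s, the controller should command 0.5 / k.
corrected_command = 0.5 / k
```

Once such a gain is identified, the controller's output can be rescaled so that the commanded and executed turning rates agree.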

First review meeting

At the end of October 2019, the first review meeting took place. It lasted two days and was held for our partner GDLS-C. We presented the integrated system and its capabilities so that the partner could replicate the results on their identical UGV. The following video summarizes what was achieved:

Team

Media Coverage