TY - GEN
T1 - 360° Real-Time 3D Multi-object Detection and Tracking for Autonomous Vehicle Navigation
AU - Del Egido, Javier
AU - Gómez-Huélamo, Carlos
AU - Bergasa, Luis M.
AU - Barea, Rafael
AU - López-Guillén, Elena
AU - Araluce, Javier
AU - Gutiérrez, Rodrigo
AU - Antunes, Miguel
N1 - Publisher Copyright:
© 2021, The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Switzerland AG.
PY - 2021
Y1 - 2021
N2 - This paper presents a real-time 3D Multi-Object Detection and Tracking (DAMOT) method proposed for the UAH autonomous electric car. It allows the vehicle to recognize objects 360° around it and uniquely identify them so as to follow their trajectories in the scene, receiving only a 3D point cloud through the ROS framework. First, we describe our proposed 3D object detector, based on PointPillars [11], which processes LiDAR data to locate objects in space, obtaining their dimensions and positions. Secondly, we use BEV-MOT [7], our Multi-Object Tracking technique, to uniquely identify each object over a Bird’s-Eye View (BEV) through a combination of a 2D Kalman filter and the Hungarian algorithm, allowing the ego-vehicle to follow the trajectories of surrounding objects. A comparison of the performance of our proposal with other state-of-the-art methods is carried out by applying the KITTI-3DMOT evaluation tool extracted from AB3DMOT [21] on the KITTI [5] validation dataset. Finally, we validate our DAMOT pipeline in several traffic scenarios implemented in the CARLA [4] open-source driving simulator using the AB4COGT tool, designed by the authors, studying its performance in a controlled but realistic urban environment with real-time execution, and providing several demonstration videos (https://cutt.ly/3rU113d).
AB - This paper presents a real-time 3D Multi-Object Detection and Tracking (DAMOT) method proposed for the UAH autonomous electric car. It allows the vehicle to recognize objects 360° around it and uniquely identify them so as to follow their trajectories in the scene, receiving only a 3D point cloud through the ROS framework. First, we describe our proposed 3D object detector, based on PointPillars [11], which processes LiDAR data to locate objects in space, obtaining their dimensions and positions. Secondly, we use BEV-MOT [7], our Multi-Object Tracking technique, to uniquely identify each object over a Bird’s-Eye View (BEV) through a combination of a 2D Kalman filter and the Hungarian algorithm, allowing the ego-vehicle to follow the trajectories of surrounding objects. A comparison of the performance of our proposal with other state-of-the-art methods is carried out by applying the KITTI-3DMOT evaluation tool extracted from AB3DMOT [21] on the KITTI [5] validation dataset. Finally, we validate our DAMOT pipeline in several traffic scenarios implemented in the CARLA [4] open-source driving simulator using the AB4COGT tool, designed by the authors, studying its performance in a controlled but realistic urban environment with real-time execution, and providing several demonstration videos (https://cutt.ly/3rU113d).
KW - 3D Multi-object Tracking
KW - Autonomous navigation
KW - CARLA
KW - DAMOT
KW - LiDAR
KW - Real-time
KW - ROS
UR - https://www.scopus.com/pages/publications/85097402629
U2 - 10.1007/978-3-030-62579-5_17
DO - 10.1007/978-3-030-62579-5_17
M3 - Conference contribution
AN - SCOPUS:85097402629
SN - 9783030625788
T3 - Advances in Intelligent Systems and Computing
SP - 241
EP - 255
BT - Advances in Physical Agents II - Proceedings of the 21st International Workshop of Physical Agents WAF 2020
A2 - Bergasa, Luis M.
A2 - Ocaña, Manuel
A2 - Barea, Rafael
A2 - López-Guillén, Elena
A2 - Revenga, Pedro
PB - Springer Science and Business Media Deutschland GmbH
T2 - 21st International Workshop of Physical Agents, WAF 2020
Y2 - 19 November 2020 through 20 November 2020
ER -