TY - GEN
T1 - Integrating OpenFace 2.0 Toolkit for Driver Attention Estimation in Challenging Accidental Scenarios
AU - Araluce, Javier
AU - Bergasa, Luis M.
AU - Gómez-Huélamo, Carlos
AU - Barea, Rafael
AU - López-Guillén, Elena
AU - Arango, Felipe
AU - Pérez-Gil, Óscar
N1 - Publisher Copyright:
© 2021, The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Switzerland AG.
PY - 2021
Y1 - 2021
N2 - Gaze estimation is a typical approach to monitor driver attention on the road scene. This indicator is of great importance for safe driving and for the design of the takeover control strategy in Level 3 and Level 4 automation systems. Nowadays, most eye gaze tracking techniques are intrusive and costly, which limits their applicability in real vehicles. On the other hand, current databases used for gaze validation address the driver attention task with critical situations in simulation, but they do not include actual accidents. This paper presents a low-cost and non-intrusive camera-based gaze mapping system integrating the open-source state-of-the-art OpenFace 2.0 Toolkit [3] to visualise driver attention on prerecorded real traffic scenes through a heat map. The proposal has been validated using the recent and challenging public dataset DADA2000 [9], which has 2000 video sequences with annotated driving scenarios based on real accidents. We compare our system with an expensive desktop-mounted eye-tracker, obtaining on-par results and showing it is a good tool for driver attention monitoring, suitable for the design of takeover systems and driving scenario awareness systems for automated vehicles.
AB - Gaze estimation is a typical approach to monitor driver attention on the road scene. This indicator is of great importance for safe driving and for the design of the takeover control strategy in Level 3 and Level 4 automation systems. Nowadays, most eye gaze tracking techniques are intrusive and costly, which limits their applicability in real vehicles. On the other hand, current databases used for gaze validation address the driver attention task with critical situations in simulation, but they do not include actual accidents. This paper presents a low-cost and non-intrusive camera-based gaze mapping system integrating the open-source state-of-the-art OpenFace 2.0 Toolkit [3] to visualise driver attention on prerecorded real traffic scenes through a heat map. The proposal has been validated using the recent and challenging public dataset DADA2000 [9], which has 2000 video sequences with annotated driving scenarios based on real accidents. We compare our system with an expensive desktop-mounted eye-tracker, obtaining on-par results and showing it is a good tool for driver attention monitoring, suitable for the design of takeover systems and driving scenario awareness systems for automated vehicles.
KW - Accidental scenarios
KW - Computer vision
KW - Driver attention
KW - Gaze estimation
KW - Heat map
UR - https://www.scopus.com/pages/publications/85097397904
U2 - 10.1007/978-3-030-62579-5_19
DO - 10.1007/978-3-030-62579-5_19
M3 - Conference contribution
AN - SCOPUS:85097397904
SN - 9783030625788
T3 - Advances in Intelligent Systems and Computing
SP - 274
EP - 288
BT - Advances in Physical Agents II - Proceedings of the 21st International Workshop of Physical Agents WAF 2020
A2 - Bergasa, Luis M.
A2 - Ocaña, Manuel
A2 - Barea, Rafael
A2 - López-Guillén, Elena
A2 - Revenga, Pedro
PB - Springer Science and Business Media Deutschland GmbH
T2 - 21st International Workshop of Physical Agents, WAF 2020
Y2 - 19 November 2020 through 20 November 2020
ER -