Open Access

ARTICLE

Convergence of Stereo Vision-Based Multimodal YOLOs for Faster Detection of Potholes

Sungan Yoon, Jeongho Cho*

Department of Electrical Engineering, Soonchunhyang University, Asan, 31538, Korea

* Corresponding Author: Jeongho Cho. Email: email

Computers, Materials & Continua 2022, 73(2), 2821-2834. https://doi.org/10.32604/cmc.2022.027840

Abstract

Road potholes can cause serious social problems, such as unexpected vehicle damage and traffic accidents. Efficient road management requires technologies that find potholes quickly, and research on such technologies has been actively conducted. The three-dimensional (3D) reconstruction method offers relatively high accuracy and is usable in practice, but its application is limited by long data processing times and high sensor maintenance costs. The two-dimensional (2D) vision method has the advantage of inexpensive sensors that are easy to deploy. Although the 2D vision method using a convolutional neural network (CNN) has recently shown improved pothole detection performance and adaptability, a large amount of data is required to train the CNN sufficiently. Therefore, we propose a method that improves the learning performance of a CNN-based object detection model by artificially generating synthetic pothole-like data to augment the training data. Additionally, to make defective areas appear more contrasting, a transformed disparity map (TDM) was calculated using stereo-vision cameras, and the detection performance of the model was further improved through late fusion with RGB (red, green, blue) images. Consequently, through the convergence of multimodal You Only Look Once (YOLO) frameworks trained on RGB images and TDMs, respectively, detection performance was enhanced by 10.7% compared with using only RGB images. Furthermore, the superiority of the proposed method was confirmed by a data processing speed twice that of the existing 3D reconstruction method.
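As a rough illustration of the two ideas described in the abstract, the sketch below (Python with OpenCV and NumPy, an assumption, since no code is published on this page) computes a disparity-based map in which road depressions stand out and then merges bounding boxes from two detector streams with cross-modal non-maximum suppression. The function names, the per-row road-level subtraction, and the fusion rule are illustrative assumptions, not the authors' implementation of the TDM or of late fusion.

```python
import numpy as np
import cv2


def transformed_disparity(left_gray, right_gray):
    """Sketch of a disparity-derived map in which road depressions stand out.

    Assumption: an SGBM disparity map minus a per-row road-level estimate;
    the paper's exact TDM computation is not reproduced here.
    """
    sgbm = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=5)
    # OpenCV returns disparity as 16.4 fixed point; rescale to pixel units.
    disp = sgbm.compute(left_gray, right_gray).astype(np.float32) / 16.0
    road_level = np.median(disp, axis=1, keepdims=True)  # rough road level per row
    tdm = np.clip(road_level - disp, 0, None)            # depressions become bright
    return cv2.normalize(tdm, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)


def late_fusion(dets_rgb, dets_tdm, score_thr=0.25, iou_thr=0.5):
    """Merge detections from the RGB and TDM detector streams.

    dets_*: lists of (x1, y1, x2, y2, confidence) tuples. Cross-modal NMS is
    one plausible late-fusion rule, used here purely for illustration.
    """
    dets = dets_rgb + dets_tdm
    if not dets:
        return []
    boxes_xywh = [[x1, y1, x2 - x1, y2 - y1] for x1, y1, x2, y2, _ in dets]
    scores = [float(c) for *_, c in dets]
    keep = cv2.dnn.NMSBoxes(boxes_xywh, scores, score_thr, iou_thr)
    return [dets[i] for i in np.array(keep).flatten()]
```

In this sketch, each modality's detector runs independently and only the resulting boxes are combined, which is the general sense in which "late fusion" is used; how the paper weights or thresholds the two streams is not specified on this page.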

Keywords


Cite This Article

APA Style
Yoon, S., Cho, J. (2022). Convergence of stereo vision-based multimodal YOLOs for faster detection of potholes. Computers, Materials & Continua, 73(2), 2821-2834. https://doi.org/10.32604/cmc.2022.027840
Vancouver Style
Yoon S, Cho J. Convergence of stereo vision-based multimodal YOLOs for faster detection of potholes. Comput Mater Contin. 2022;73(2):2821-2834. https://doi.org/10.32604/cmc.2022.027840
IEEE Style
S. Yoon and J. Cho, “Convergence of Stereo Vision-Based Multimodal YOLOs for Faster Detection of Potholes,” Comput. Mater. Contin., vol. 73, no. 2, pp. 2821-2834, 2022. https://doi.org/10.32604/cmc.2022.027840



Copyright © 2022 The Author(s). Published by Tech Science Press.
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.