Open Access
ARTICLE
Review on Video Object Tracking Based on Deep Learning
College of Computer Science and Technology, China University of Mining and Technology, Xuzhou, 221116, China.
Mine Digitization Engineering Research Center of the Ministry of Education, China University of Mining and Technology, Xuzhou, 221116, China.
Key Laboratory of Wireless Sensor Network & Communication, Shanghai Institute of Microsystem and Information Technology, Chinese Academy of Sciences, Shanghai, 201899, China.
Information Communication Technology Department, Wollo University, Dessie, Ethiopia.
*Corresponding Author: Wei Chen. Email: .
Journal of New Media 2019, 1(2), 63-74. https://doi.org/10.32604/jnm.2019.06253
Abstract
Video object tracking is an important research topic in computer vision, with a wide range of applications in video surveillance, robotics, human-computer interaction, and so on. Although many moving object tracking algorithms have been proposed, many difficulties remain in practical tracking, such as illumination change, occlusion, motion blur, scale change, and appearance change. The development of object tracking technology therefore remains challenging. The emergence of deep learning theory and methods provides a new opportunity for object tracking research, and it also forms the main theoretical framework for the moving object tracking algorithms studied in this paper. This paper classifies and summarizes existing deep learning-based target tracking algorithms and, building on previous work and our own understanding, proposes several improvements to existing methods. In addition, existing deep learning target tracking methods still struggle to meet real-time requirements; how to design the network and tracking pipeline to improve both speed and accuracy remains an open research problem.
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.