Open Access

ARTICLE

Monocular Visual SLAM for Markerless Tracking Algorithm to Augmented Reality

Tingting Yang1,*, Shuwen Jia1, Ying Yu1, Zhiyong Sui2

1 School of Information and Intelligence Engineering, University of Sanya, Sanya, 572000, China
2 Erik Jonsson School of Engineering and Computer Science, The University of Texas at Dallas, Richardson, 75080-3021, USA

* Corresponding Author: Tingting Yang.

Intelligent Automation & Soft Computing 2023, 35(2), 1691-1704. https://doi.org/10.32604/iasc.2023.027466

Abstract

Augmented Reality (AR) aims to integrate virtual content seamlessly into the user's view of the real world. Ideally, the virtual content should behave exactly like real objects, which requires a correct and precise estimate of the user's (or the camera's) viewpoint with respect to the coordinate system of the virtual content. Building three-dimensional (3D) maps of real scenes in real time is therefore essential for augmented reality. In this paper, we integrate Simultaneous Localization and Mapping (SLAM) technology into augmented reality and implement a markerless AR system based on the ORB-SLAM2 framework. We propose an improved Oriented FAST and Rotated BRIEF (ORB) feature extraction method and an optimized keyframe selection strategy, and we apply the Progressive Sample Consensus (PROSAC) algorithm to the plane estimation needed for AR rendering, thereby addressing the increased system runtime caused by the loss of large amounts of texture information in images. Comparative experiments and data analysis show that our method yields better results. Improved variants of the PROSAC algorithm that are better suited to detecting planar feature points remain a direction for future work.
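To make the two building blocks named in the abstract concrete, the following is a minimal sketch (not the authors' implementation) of ORB feature extraction and PROSAC-style plane/homography estimation using OpenCV. All parameter values are illustrative assumptions; OpenCV exposes PROSAC sampling through the `cv2.USAC_PROSAC` flag in versions 4.5 and later.

```python
# Illustrative sketch only: ORB keypoint extraction and a PROSAC-style
# homography fit between two views, approximating the pipeline the
# abstract describes. Parameter values are assumptions, not the
# authors' settings.
import cv2
import numpy as np

def extract_orb_features(image_path, n_features=1000):
    """Detect ORB keypoints and compute binary descriptors on a grayscale image."""
    img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    orb = cv2.ORB_create(nfeatures=n_features)
    keypoints, descriptors = orb.detectAndCompute(img, None)
    return keypoints, descriptors

def match_and_fit_plane(kp1, desc1, kp2, desc2):
    """Match ORB descriptors, then estimate a homography for the dominant
    plane. Matches are sorted best-first, the ordering PROSAC exploits."""
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(desc1, desc2), key=lambda m: m.distance)
    if len(matches) < 4:  # a homography needs at least 4 correspondences
        return None, None
    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    # cv2.USAC_PROSAC requires OpenCV >= 4.5; 3.0 px is an assumed
    # reprojection-error threshold for inlier classification.
    H, inlier_mask = cv2.findHomography(src, dst, cv2.USAC_PROSAC, 3.0)
    return H, inlier_mask
```

In an AR pipeline of this kind, the recovered homography (or the plane it implies) would anchor virtual content to the detected surface; the inlier mask separates planar feature points from outliers.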


Cite This Article

T. Yang, S. Jia, Y. Yu and Z. Sui, "Monocular visual SLAM for markerless tracking algorithm to augmented reality," Intelligent Automation & Soft Computing, vol. 35, no. 2, pp. 1691–1704, 2023.



This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.