Open Access
ARTICLE
Monocular Depth Estimation with Sharp Boundary
1 Faculty of Intelligent Manufacturing, Wuyi University, Jiangmen, 529000, China
2 China-Germany (Jiangmen) Artificial Intelligence Institute, Jiangmen, 529000, China
3 Zhuhai 4DAGE Network Technology, Zhuhai, 519000, China
* Corresponding Author: Yan Cui. Email:
(This article belongs to the Special Issue: Recent Advances in Virtual Reality)
Computer Modeling in Engineering & Sciences 2023, 136(1), 573-592. https://doi.org/10.32604/cmes.2023.023424
Received 25 April 2022; Accepted 22 September 2022; Issue published 05 January 2023
Abstract
Monocular depth estimation is a fundamental task in computer vision, and its accuracy has improved tremendously over the past decade with the development of deep learning. However, blurry boundaries in predicted depth maps remain a serious problem. Researchers have found that boundary blur is mainly caused by two factors. First, low-level features, which contain boundary and structure information, may be lost in deep networks during the convolution process. Second, during backpropagation, the model tends to ignore errors in boundary regions because they occupy only a small fraction of the image. Focusing on these factors, we propose two countermeasures to mitigate the boundary blur problem. First, we design a scene understanding module and a scale transform module to build a lightweight fused feature pyramid, which deals effectively with low-level feature loss. Second, we propose a boundary-aware depth loss function that emphasizes the depth values in boundary regions. Extensive experiments show that our method predicts depth maps with clearer boundaries, and that its depth accuracy on NYU-Depth V2, SUN RGB-D, and iBims-1 is competitive.
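The core idea of a boundary-aware depth loss can be sketched as follows. This is a minimal illustrative example, not the paper's exact formulation: it assumes boundaries are detected by thresholding the gradient magnitude of the ground-truth depth, and that boundary pixels are simply up-weighted in an L1 loss so their errors are not drowned out by the much larger interior region. The function name, the weight value, and the gradient threshold are all hypothetical choices for illustration.

```python
import numpy as np

def boundary_aware_depth_loss(pred, gt, boundary_weight=4.0, grad_thresh=0.1):
    """Illustrative boundary-weighted L1 depth loss (hypothetical sketch).

    Pixels near depth discontinuities, detected via the gradient magnitude
    of the ground-truth depth map, receive a larger weight than interior
    pixels, so boundary errors contribute more to the total loss.
    """
    # Gradient magnitude of the ground-truth depth as a simple boundary indicator.
    gy, gx = np.gradient(gt)
    boundary = np.sqrt(gx**2 + gy**2) > grad_thresh

    # Per-pixel weights: boundary_weight on boundaries, 1 elsewhere.
    weights = np.where(boundary, boundary_weight, 1.0)

    # Weighted mean absolute error.
    return np.sum(weights * np.abs(pred - gt)) / np.sum(weights)
```

With this weighting, an error of a given magnitude placed on a boundary pixel yields a larger loss than the same error in a smooth interior region, which is the mechanism the abstract attributes to the boundary-aware loss.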
This work is licensed under a Creative Commons Attribution 4.0 International License , which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.