Open Access
ARTICLE
Perceptual Image Outpainting Assisted by Low-Level Feature Fusion and Multi-Patch Discriminator
1 College of Computer Science, Chengdu University of Information Technology, Chengdu, 610225, China
2 Xihua University, Chengdu, 610039, China
3 University of Agriculture Faisalabad, Pakistan
Computers, Materials & Continua 2022, 71(3), 5021-5037. https://doi.org/10.32604/cmc.2022.023071
Received 27 August 2021; Accepted 11 November 2021; Issue published 14 January 2022
Abstract
Recently, deep learning-based image outpainting has made notable progress in the field of computer vision. However, because existing methods fail to fully extract image information, they often generate unnatural and blurry outpainting results. To address this issue, we propose a perceptual image outpainting method that takes advantage of low-level feature fusion and a multi-patch discriminator. Specifically, we first fuse the texture information in the low-level feature maps of the encoder, and then combine these aggregated features with the semantic (or structural) information of the deep feature maps, so that richer texture information is available for generating more authentic outpainting images. We also introduce a multi-patch discriminator to enhance the generated texture; it judges the generated image at different feature levels and thereby drives the network to produce more natural and clearer outpainting results. Moreover, we introduce perceptual loss and style loss to further improve the texture and style of the outpainted images. Compared with existing methods, our method produces finer outpainting results. Experimental results on the Places2 and Paris StreetView datasets demonstrate the effectiveness of our method for image outpainting.
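The perceptual and style losses mentioned in the abstract are conventionally computed on feature maps extracted from a pretrained backbone (e.g., VGG): the perceptual loss compares the feature maps directly, while the style loss compares their Gram matrices. The following is a minimal NumPy sketch of these two losses under that standard formulation; the specific layers, weights, and distance metric used in the paper are not given here, so the function names and shapes are illustrative.

```python
import numpy as np

def gram_matrix(feat):
    """Gram matrix of a feature map with shape (C, H, W),
    normalized by the number of elements."""
    c, h, w = feat.shape
    f = feat.reshape(c, h * w)          # flatten spatial dimensions
    return (f @ f.T) / (c * h * w)      # channel-by-channel correlations

def perceptual_loss(feat_gen, feat_real):
    """Mean absolute difference between feature maps (content match)."""
    return np.abs(feat_gen - feat_real).mean()

def style_loss(feat_gen, feat_real):
    """Mean absolute difference between Gram matrices (texture match)."""
    return np.abs(gram_matrix(feat_gen) - gram_matrix(feat_real)).mean()
```

In practice these losses are evaluated at several backbone layers and summed with per-layer weights; the style term penalizes mismatched texture statistics even when the exact pixel layout differs, which is why it complements the pixel-wise and adversarial terms.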
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.