Open Access
ARTICLE
Image-to-Image Style Transfer Based on the Ghost Module
1 College of Computer Science and Technology, Harbin Engineering University, Harbin, 150001, China
2 Heilongjiang Hengxun Technology Co., Ltd., Harbin, 150001, China
3 College of Engineering and Information Technology, Georgia Southern University, Statesboro, 30458, GA, USA
* Corresponding Author: Liguo Zhang. Email:
Computers, Materials & Continua 2021, 68(3), 4051-4067. https://doi.org/10.32604/cmc.2021.016481
Received 03 January 2021; Accepted 17 March 2021; Issue published 06 May 2021
Abstract
The technology for image-to-image style transfer, a prevalent image processing task, has developed rapidly. The purpose of style transfer is to extract a texture from the source image domain and transfer it to the target image domain using a deep neural network. However, existing methods typically incur a large computational cost. To achieve efficient style transfer, we introduce a novel Ghost module into the GANILLA architecture to produce more feature maps from cheap operations. We then utilize an attention mechanism to transform images with various styles, and optimize the original generative adversarial network (GAN) with more efficient computation for image-to-illustration translation. The experimental results show that our proposed method produces results consistent with human visual perception while maintaining image quality. Moreover, it reduces the high computational cost and resource consumption of style transfer. Comparisons on both subjective and objective evaluation indicators show that our proposed method outperforms existing methods.
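The Ghost module mentioned above generates part of the output feature maps with an ordinary convolution and derives the rest from those "primary" maps via cheap per-map (depthwise) operations. The following is a minimal NumPy sketch of that idea, not the paper's implementation: the function name `ghost_module`, the use of a 1x1 pointwise convolution for the primary maps, and the 3x3 depthwise kernels for the ghost maps are illustrative assumptions.

```python
import numpy as np

def ghost_module(x, w_primary, w_cheap):
    """Simplified Ghost-style feature generation (illustrative sketch).

    x:         input feature maps, shape (c_in, h, w)
    w_primary: pointwise (1x1) conv weights, shape (n_primary, c_in)
    w_cheap:   per-map 3x3 depthwise kernels, shape (n_primary, 3, 3)

    Returns 2 * n_primary feature maps: the primary maps from an ordinary
    convolution, plus one "ghost" map derived from each primary map by a
    cheap depthwise operation.
    """
    c_in, h, w = x.shape
    # Primary maps: ordinary pointwise convolution across input channels.
    primary = np.tensordot(w_primary, x, axes=([1], [0]))  # (n_primary, h, w)

    # Ghost maps: cheap 3x3 depthwise filtering of each primary map,
    # with edge padding to preserve spatial size.
    padded = np.pad(primary, ((0, 0), (1, 1), (1, 1)), mode="edge")
    ghost = np.empty_like(primary)
    for i in range(primary.shape[0]):
        acc = np.zeros((h, w))
        for di in range(3):
            for dj in range(3):
                acc += w_cheap[i, di, dj] * padded[i, di:di + h, dj:dj + w]
        ghost[i] = acc

    # Concatenate along the channel axis: doubled output at roughly half
    # the cost of computing all maps with full convolutions.
    return np.concatenate([primary, ghost], axis=0)
```

Because the depthwise pass touches each primary map once with a tiny kernel, its cost is small next to a full convolution producing the same number of extra channels, which is the efficiency argument behind the Ghost module.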
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.