Search Results (14)
  • Open Access

    ARTICLE

    Image Style Transfer for Exhibition Hall Design Based on Multimodal Semantic-Enhanced Algorithm

    Qing Xie*, Ruiyun Yu

    CMC-Computers, Materials & Continua, Vol.84, No.1, pp. 1123-1144, 2025, DOI:10.32604/cmc.2025.062712 - 09 June 2025

    Abstract Although existing style transfer techniques have made significant progress in the field of image generation, there are still some challenges in the field of exhibition hall design. The existing style transfer methods mainly focus on the transformation of single dimensional features, but ignore the deep integration of content and style features in exhibition hall design. In addition, existing methods are deficient in detail retention, especially in accurately capturing and reproducing local textures and details while preserving the content image structure. In addition, point-based attention mechanisms tend to ignore the complexity and diversity of image features…
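
    As general background for this and the other style-transfer entries below, the sketch here shows the classic Gram-matrix style loss and feature-space content loss that most neural style transfer methods build on; it is textbook material, not the paper's multimodal semantic-enhanced algorithm, and the function names are illustrative.

        # Generic background sketch (PyTorch): Gram-matrix style loss and
        # feature-space content loss, as used by classical style transfer.
        import torch

        def gram_matrix(features: torch.Tensor) -> torch.Tensor:
            # features: (batch, channels, height, width)
            b, c, h, w = features.shape
            flat = features.view(b, c, h * w)
            # Channel-to-channel correlations, normalised by the number of elements.
            return torch.bmm(flat, flat.transpose(1, 2)) / (c * h * w)

        def style_loss(generated_feats, style_feats):
            # Squared Gram-matrix differences summed over the chosen feature layers.
            return sum(torch.mean((gram_matrix(g) - gram_matrix(s)) ** 2)
                       for g, s in zip(generated_feats, style_feats))

        def content_loss(generated_feat, content_feat):
            # Mean squared error between deep feature maps at one content layer.
            return torch.mean((generated_feat - content_feat) ** 2)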

  • Open Access

    ARTICLE

    Expo-GAN: A Style Transfer Generative Adversarial Network for Exhibition Hall Design Based on Optimized Cyclic and Neural Architecture Search

    Qing Xie*, Ruiyun Yu

    CMC-Computers, Materials & Continua, Vol.83, No.3, pp. 4757-4774, 2025, DOI:10.32604/cmc.2025.063345 - 19 May 2025

    Abstract This study presents a groundbreaking method named Expo-GAN (Exposition-Generative Adversarial Network) for style transfer in exhibition hall design, using a refined version of the Cycle Generative Adversarial Network (CycleGAN). The primary goal is to enhance the transformation of image styles while maintaining visual consistency, an area where current CycleGAN models often fall short. These traditional models typically face difficulties in accurately capturing expansive features as well as the intricate stylistic details necessary for high-quality image transformation. To address these limitations, the research introduces several key modifications to the CycleGAN architecture. Enhancements to the generator involve…
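
    Because the paper builds on CycleGAN, a minimal sketch of the standard cycle-consistency objective that such models optimize is given below. This is generic CycleGAN background in PyTorch-style Python, not the modified Expo-GAN losses; the weight lam=10 is the conventional default, used purely for illustration.

        import torch

        def cycle_consistency_loss(G, F, real_x, real_y, lam: float = 10.0):
            """Standard CycleGAN cycle loss; G maps domain X -> Y, F maps Y -> X."""
            # Forward cycle: x -> G(x) -> F(G(x)) should reconstruct x.
            forward = torch.mean(torch.abs(F(G(real_x)) - real_x))
            # Backward cycle: y -> F(y) -> G(F(y)) should reconstruct y.
            backward = torch.mean(torch.abs(G(F(real_y)) - real_y))
            return lam * (forward + backward)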

  • Open Access

    ARTICLE

    PhotoGAN: A Novel Style Transfer Model for Digital Photographs

    Qiming Li1, Mengcheng Wu1, Daozheng Chen1,2,*

    CMC-Computers, Materials & Continua, Vol.83, No.3, pp. 4477-4494, 2025, DOI:10.32604/cmc.2025.062969 - 19 May 2025

    Abstract Image style transfer is a research hotspot in the field of computer vision. For this job, many approaches have been put forth. These techniques do, however, still have some drawbacks, such as high computing complexity and content distortion caused by inadequate stylization. To address these problems, PhotoGAN, a new Generative Adversarial Network (GAN) model is proposed in this paper. A deeper feature extraction network has been designed to capture global information and local details better. Introducing multi-scale attention modules helps the generator focus on important feature areas at different scales, further enhancing the effectiveness of…
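
    The sketch below shows one simple way a multi-scale spatial attention module can be built: attention maps are computed at several resolutions and fused, so features are reweighted both globally and locally. It is a simplified stand-in written for illustration, not PhotoGAN's actual module; the scale set and layer choices are assumptions.

        import torch
        import torch.nn as nn
        import torch.nn.functional as F

        class MultiScaleSpatialAttention(nn.Module):
            # Toy multi-scale attention: score maps at several pooled resolutions
            # are upsampled and averaged into a single spatial attention map.
            def __init__(self, channels: int, scales=(1, 2, 4)):
                super().__init__()
                self.scales = scales
                self.score = nn.Conv2d(channels, 1, kernel_size=1)

            def forward(self, x: torch.Tensor) -> torch.Tensor:
                b, c, h, w = x.shape
                maps = []
                for s in self.scales:
                    pooled = F.avg_pool2d(x, kernel_size=s) if s > 1 else x
                    attn = torch.sigmoid(self.score(pooled))      # (b, 1, h/s, w/s)
                    maps.append(F.interpolate(attn, size=(h, w),
                                              mode="bilinear", align_corners=False))
                attention = torch.stack(maps, dim=0).mean(dim=0)  # fuse the scales
                return x * attention                              # reweight features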

  • Open Access

    ARTICLE

    Improving Machine Translation Formality with Large Language Models

    Murun Yang1,*, Fuxue Li2

    CMC-Computers, Materials & Continua, Vol.82, No.2, pp. 2061-2075, 2025, DOI:10.32604/cmc.2024.058248 - 17 February 2025

    Abstract Preserving formal style in neural machine translation (NMT) is essential, yet often overlooked as an optimization objective of the training processes. This oversight can lead to translations that, though accurate, lack formality. In this paper, we propose how to improve NMT formality with large language models (LLMs), which combines the style transfer and evaluation capabilities of an LLM and the high-quality translation generation ability of NMT models to improve NMT formality. The proposed method (namely INMTF) encompasses two approaches. The first involves a revision approach using an LLM to revise the NMT-generated translation, ensuring a…
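
    The revision approach described above can be pictured as a two-stage pipeline: the NMT model drafts a translation and an LLM rewrites the draft in a formal register. The sketch below only illustrates that flow; translate and llm_complete are hypothetical stand-ins for an NMT model and an LLM client, and the prompt wording is illustrative rather than taken from the paper.

        # Hypothetical two-stage pipeline: NMT draft, then LLM formality revision.
        REVISION_PROMPT = (
            "Rewrite the following {target_lang} translation in a formal register, "
            "preserving its meaning exactly.\n\nTranslation: {draft}\nFormal revision:"
        )

        def formal_translate(source_text: str, target_lang: str,
                             translate, llm_complete) -> str:
            draft = translate(source_text, target_lang)   # NMT draft translation
            prompt = REVISION_PROMPT.format(target_lang=target_lang, draft=draft)
            return llm_complete(prompt).strip()           # LLM rewrites it formally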

  • Open Access

    ARTICLE

    Explicitly Color-Inspired Neural Style Transfer Using Patchified AdaIN

    Bumsoo Kim1, Wonseop Shin2, Yonghoon Jung1, Youngsup Park3, Sanghyun Seo1,4,*

    CMES-Computer Modeling in Engineering & Sciences, Vol.141, No.3, pp. 2143-2164, 2024, DOI:10.32604/cmes.2024.056079 - 31 October 2024

    Abstract Arbitrary style transfer aims to perceptually reflect the style of a reference image in artistic creations with visual aesthetics. Traditional style transfer models, particularly those using adaptive instance normalization (AdaIN) layer, rely on global statistics, which often fail to capture the spatially local color distribution, leading to outputs that lack variation despite geometric transformations. To address this, we introduce Patchified AdaIN, a color-inspired style transfer method that applies AdaIN to localized patches, utilizing local statistics to capture the spatial color distribution of the reference image. This approach enables enhanced color awareness in style transfer, adapting…
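
    A minimal sketch of the patch-wise AdaIN idea follows: each local patch of the content features is renormalized with the statistics of the spatially corresponding style patch, rather than with one global mean and standard deviation. It assumes the style features have been resized to the content resolution and that the spatial dimensions are divisible by the patch size; it approximates the idea and is not the authors' exact formulation.

        import torch

        def patch_stats(feat: torch.Tensor, patch: int, eps: float = 1e-5):
            # Per-patch mean/std over non-overlapping patch x patch windows.
            # Assumes H and W are divisible by `patch`.
            b, c, h, w = feat.shape
            x = feat.view(b, c, h // patch, patch, w // patch, patch)
            mean = x.mean(dim=(3, 5), keepdim=True)
            std = (x.var(dim=(3, 5), keepdim=True) + eps).sqrt()
            return x, mean, std

        def patchified_adain(content: torch.Tensor, style: torch.Tensor,
                             patch: int = 8) -> torch.Tensor:
            # Style features are assumed to share the content feature resolution.
            c_x, c_mean, c_std = patch_stats(content, patch)
            _, s_mean, s_std = patch_stats(style, patch)
            out = s_std * (c_x - c_mean) / c_std + s_mean
            return out.view(*content.shape)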

  • Open Access

    ARTICLE

    Constructive Robust Steganography Algorithm Based on Style Transfer

    Xiong Zhang1,2, Minqing Zhang1,2,3,*, Xu’an Wang1,2,3,*, Siyuan Huang1,2, Fuqiang Di1,2

    CMC-Computers, Materials & Continua, Vol.81, No.1, pp. 1433-1448, 2024, DOI:10.32604/cmc.2024.056742 - 15 October 2024

    Abstract Traditional information hiding techniques achieve information hiding by modifying carrier data, which can easily leave detectable traces that may be detected by steganalysis tools. Especially in image transmission, both geometric and non-geometric attacks can cause subtle changes in the pixels of the image during transmission. To overcome these challenges, we propose a constructive robust image steganography technique based on style transformation. Unlike traditional steganography, our algorithm does not involve any direct modifications to the carrier data. In this study, we constructed a mapping dictionary by setting the correspondence between binary codes and image categories and…
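
    The mapping-dictionary idea can be illustrated with a small sketch: fixed-length binary chunks of the secret message are mapped to image categories, and the sender transmits (style-transferred) images of those categories instead of modifying any carrier pixels. The chunk length and category names below are invented for illustration only.

        from itertools import product

        BITS_PER_IMAGE = 3
        CATEGORIES = ["lake", "forest", "city", "desert",
                      "mountain", "beach", "bridge", "garden"]   # 2**3 categories

        # binary code -> category, plus the inverse used at extraction time
        CODE_TO_CATEGORY = {"".join(bits): cat
                            for bits, cat in zip(product("01", repeat=BITS_PER_IMAGE),
                                                 CATEGORIES)}
        CATEGORY_TO_CODE = {cat: code for code, cat in CODE_TO_CATEGORY.items()}

        def embed(bitstring: str) -> list:
            # Pad to a multiple of BITS_PER_IMAGE, then pick one category per chunk.
            bitstring += "0" * ((-len(bitstring)) % BITS_PER_IMAGE)
            return [CODE_TO_CATEGORY[bitstring[i:i + BITS_PER_IMAGE]]
                    for i in range(0, len(bitstring), BITS_PER_IMAGE)]

        def extract(categories: list) -> str:
            # The receiver classifies each received image and recovers the bits.
            return "".join(CATEGORY_TO_CODE[c] for c in categories)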

  • Open Access

    ARTICLE

    Robust Information Hiding Based on Neural Style Transfer with Artificial Intelligence

    Xiong Zhang1,2, Minqing Zhang1,2,3,*, Xu An Wang1,2,3, Wen Jiang1,2, Chao Jiang1,2, Pan Yang1,4

    CMC-Computers, Materials & Continua, Vol.79, No.2, pp. 1925-1938, 2024, DOI:10.32604/cmc.2024.050899 - 15 May 2024

    Abstract This paper proposes an artificial intelligence-based robust information hiding algorithm to address the issue of confidential information being susceptible to noise attacks during transmission. The algorithm we designed aims to mitigate the impact of various noise attacks on the integrity of secret information during transmission. The method we propose involves encoding secret images into stylized encrypted images and applies adversarial transfer to both the style and content features of the original and embedded data. This process effectively enhances the concealment and imperceptibility of confidential information, thereby improving the security of such information during transmission and…

  • Open Access

    ARTICLE

    PP-GAN: Style Transfer from Korean Portraits to ID Photos Using Landmark Extractor with GAN

    Jongwook Si1, Sungyoung Kim2,*

    CMC-Computers, Materials & Continua, Vol.77, No.3, pp. 3119-3138, 2023, DOI:10.32604/cmc.2023.043797 - 26 December 2023

    Abstract The objective of style transfer is to maintain the content of an image while transferring the style of another image. However, conventional methods face challenges in preserving facial features, especially in Korean portraits where elements like the “Gat” (a traditional Korean hat) are prevalent. This paper proposes a deep learning network designed to perform style transfer that includes the “Gat” while preserving the identity of the face. Unlike traditional style transfer techniques, the proposed method aims to preserve the texture, attire, and the “Gat” in the style image by employing image sharpening and face landmark,…
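
    The image-sharpening preprocessing mentioned in the abstract can be pictured with classic unsharp masking, a generic operator that boosts high-frequency detail; this is illustrative and not necessarily the exact operator used in PP-GAN.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def unsharp_mask(image: np.ndarray, sigma: float = 2.0,
                         amount: float = 1.0) -> np.ndarray:
            # Grayscale uint8 image assumed for simplicity.
            img = image.astype(np.float32)
            blurred = gaussian_filter(img, sigma=sigma)   # low-pass version
            sharpened = img + amount * (img - blurred)    # boost high frequencies
            return np.clip(sharpened, 0, 255).astype(np.uint8)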

  • Open Access

    ARTICLE

    ECGAN: Translate Real World to Cartoon Style Using Enhanced Cartoon Generative Adversarial Network

    Yixin Tang*

    CMC-Computers, Materials & Continua, Vol.76, No.1, pp. 1195-1212, 2023, DOI:10.32604/cmc.2023.039182 - 08 June 2023

    Abstract Visual illustration transformation from real-world to cartoon images is one of the famous and challenging tasks in computer vision. Image-to-image translation from real-world to cartoon domains poses issues such as a lack of paired training samples, lack of good image translation, low feature extraction from the previous domain images, and lack of high-quality image translation from the traditional generator algorithms. To solve the above-mentioned issues, paired independent model, high-quality dataset, Bayesian-based feature extractor, and an improved generator must be proposed. In this study, we propose a high-quality dataset to reduce the effect of paired training…

  • Open Access

    ARTICLE

    APST-Flow: A Reversible Network-Based Artistic Painting Style Transfer Method

    Meng Wang*, Yixuan Shao, Haipeng Liu

    CMC-Computers, Materials & Continua, Vol.75, No.3, pp. 5229-5254, 2023, DOI:10.32604/cmc.2023.036631 - 29 April 2023

    Abstract In recent years, deep generative models have been successfully applied to perform artistic painting style transfer (APST). The difficulties might lie in the loss of reconstructing spatial details and the inefficiency of model convergence caused by the irreversible en-decoder methodology of the existing models. Aiming to this, this paper proposes a Flow-based architecture with both the en-decoder sharing a reversible network configuration. The proposed APST-Flow can efficiently reduce model uncertainty via a compact analysis-synthesis methodology, thereby the generalization performance and the convergence stability are improved. For the generator, a Flow-based network using Wavelet additive coupling…
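
    The reversibility that flow-based generators rely on comes from coupling layers. Below is a minimal additive coupling layer, the generic building block rather than APST-Flow's wavelet-based variant; the transform is exactly invertible, so the en-decoder loses no spatial information. The channel split and the small convolutional subnetwork are illustrative choices.

        import torch
        import torch.nn as nn

        class AdditiveCoupling(nn.Module):
            # Generic additive coupling: y1 = x1, y2 = x2 + m(x1), inverted exactly
            # by y2 - m(y1). Assumes an even number of channels.
            def __init__(self, channels: int):
                super().__init__()
                half = channels // 2
                self.m = nn.Sequential(
                    nn.Conv2d(half, half, kernel_size=3, padding=1),
                    nn.ReLU(inplace=True),
                    nn.Conv2d(half, half, kernel_size=3, padding=1),
                )

            def forward(self, x: torch.Tensor) -> torch.Tensor:
                x1, x2 = x.chunk(2, dim=1)
                return torch.cat([x1, x2 + self.m(x1)], dim=1)

            def inverse(self, y: torch.Tensor) -> torch.Tensor:
                y1, y2 = y.chunk(2, dim=1)
                return torch.cat([y1, y2 - self.m(y1)], dim=1)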

Displaying results 1-10 of 14 (page 1).