Search Results (2)
  • Open Access

    ARTICLE

    FSA-Net: A Cost-efficient Face Swapping Attention Network with Occlusion-Aware Normalization

    Zhipeng Bin, Huihuang Zhao*, Xiaoman Liang, Wenli Chen

    Intelligent Automation & Soft Computing, Vol.37, No.1, pp. 971-983, 2023, DOI:10.32604/iasc.2023.037270 - 29 April 2023

    Abstract: The main challenges in face swapping are the preservation and adaptive superimposition of the attributes of two images. In this study, the Face Swapping Attention Network (FSA-Net) is proposed to generate photorealistic face swaps. Existing face-swapping methods ignore blending attributes or mismatch facial keypoints (cheeks, mouth, eyes, nose, etc.), which causes artifacts and makes the generated face silhouette unrealistic. To address this problem, a novel reinforced multi-aware attention module, referred to as RMAA, is proposed for handling facial fusion and expression occlusion flaws. The framework includes two stages. In the first stage, a …
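    The abstract above names a reinforced multi-aware attention module (RMAA) and, per the title, an occlusion-aware normalization, but gives no internals. Purely as an illustration of what occlusion-aware normalization could look like, here is a minimal PyTorch sketch; the class `OcclusionAwareNorm` and every design choice in it are assumptions for this sketch, not the paper's actual method.

    ```python
    # Hypothetical sketch of an occlusion-aware normalization layer (PyTorch).
    # Assumption: statistics are computed only over pixels an occlusion mask
    # marks as visible; occluded pixels pass through unnormalized.
    import torch
    import torch.nn as nn

    class OcclusionAwareNorm(nn.Module):
        def __init__(self, num_channels: int, eps: float = 1e-5):
            super().__init__()
            self.eps = eps
            self.gamma = nn.Parameter(torch.ones(1, num_channels, 1, 1))
            self.beta = nn.Parameter(torch.zeros(1, num_channels, 1, 1))

        def forward(self, x: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
            # x: (N, C, H, W) features; mask: (N, 1, H, W), 1 = visible pixel.
            visible = mask.sum(dim=(2, 3), keepdim=True).clamp(min=1.0)
            mean = (x * mask).sum(dim=(2, 3), keepdim=True) / visible
            var = ((x - mean) ** 2 * mask).sum(dim=(2, 3), keepdim=True) / visible
            x_hat = (x - mean) / torch.sqrt(var + self.eps)
            # Normalize visible regions; leave occluded regions untouched.
            return mask * (self.gamma * x_hat + self.beta) + (1 - mask) * x
    ```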

  • Open Access

    REVIEW

    An Overview of Face Manipulation Detection

    Xingwang Ju*

    Journal of Cyber Security, Vol.2, No.4, pp. 197-207, 2020, DOI:10.32604/jcs.2020.014310 - 07 December 2020

    Abstract: Owing to the power of modern editing tools, new types of fake faces are being created and synthesized, which has attracted great attention on social media. Humans often cannot reliably distinguish a manipulated face from a real one, so the detection of face manipulation has become a critical issue in digital media forensics. This paper provides an overview of recent deep learning detection models for face manipulation. Several public datasets used for face manipulation detection are introduced. On this basis, the challenges for research and potential future directions …
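    The review above surveys deep-learning detectors for manipulated faces. A common baseline in this area is to fine-tune a pretrained CNN as a binary real/fake classifier; the sketch below shows that generic recipe with PyTorch and torchvision. The `build_detector` helper and the 0 = real / 1 = fake label convention are illustrative assumptions, not a specific model from the review.

    ```python
    # Generic baseline: fine-tune a pretrained ResNet-18 to classify face
    # crops as real or manipulated. Illustrative only; not a model from
    # the review itself.
    import torch
    import torch.nn as nn
    from torchvision import models

    def build_detector() -> nn.Module:
        backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
        backbone.fc = nn.Linear(backbone.fc.in_features, 2)  # real vs. fake
        return backbone

    model = build_detector()
    faces = torch.randn(4, 3, 224, 224)   # batch of cropped face images
    logits = model(faces)                 # (4, 2) class scores
    preds = logits.argmax(dim=1)          # 0 = real, 1 = fake (assumed labels)
    ```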
