Open Access
ARTICLE
Multi-Purpose Forensics of Image Manipulations Using Residual-Based Feature
1 Southwest University of Science and Technology, Mianyang, 621010, China.
2 Binghamton University, State University of New York, New York, 13902, USA.
* Corresponding Author: Hui Zeng.
Computers, Materials & Continua 2020, 65(3), 2217-2231. https://doi.org/10.32604/cmc.2020.011006
Received 14 April 2020; Accepted 12 June 2020; Issue published 16 September 2020
Abstract
Multi-purpose forensics is an important tool for forged-image detection. In this paper, we propose a universal feature set for multi-purpose forensics that is capable of simultaneously identifying several typical image manipulations, including spatial low-pass Gaussian blurring, median filtering, re-sampling, and JPEG compression. To eliminate the influence of diverse image content on the effectiveness and robustness of the feature, we introduce a residual group containing several high-pass filtered residuals. The partial correlation coefficient is extracted from the residual group to measure neighborhood correlations in a purely linear way. In addition, we combine autoregressive coefficients and transition probabilities to form the proposed composite feature, which measures how manipulations change neighborhood relationships in both linear and non-linear ways. After a series of dimension reductions, the proposed feature set accelerates both training and testing for multi-purpose forensics. The feature set is then fed into a multi-class classifier to train a multi-purpose detector. Experimental results show that the proposed detector can identify several typical image manipulations, and is superior to more complicated deep CNN-based methods in terms of detection accuracy and time efficiency on low-resolution JPEG-compressed images.
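To illustrate the core idea of the feature, the sketch below computes a small residual group (simple first-order difference residuals, a minimal stand-in for the paper's larger set of high-pass filters) and a first-order partial correlation coefficient between residual planes. The function names and the choice of three difference filters are illustrative assumptions, not the paper's exact residual set.

```python
import numpy as np

def highpass_residuals(img):
    """A toy residual group: first-order horizontal, vertical, and
    diagonal difference residuals (high-pass filtered versions of img).
    The paper's actual residual group uses a larger filter bank."""
    h = img[:, 1:] - img[:, :-1]        # horizontal difference
    v = img[1:, :] - img[:-1, :]        # vertical difference
    d = img[1:, 1:] - img[:-1, :-1]     # diagonal difference
    return h, v, d

def partial_corr(x, y, z):
    """First-order partial correlation of x and y, controlling for z:
    measures the linear relationship between x and y with the
    common influence of z removed."""
    rxy = np.corrcoef(x, y)[0, 1]
    rxz = np.corrcoef(x, z)[0, 1]
    ryz = np.corrcoef(y, z)[0, 1]
    return (rxy - rxz * ryz) / np.sqrt((1 - rxz**2) * (1 - ryz**2))

# Demo on a random 64x64 "image"; a real pipeline would use grayscale pixels.
rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(64, 64)).astype(np.float64)
h, v, d = highpass_residuals(img)
# Crop all residual planes to a common 63x63 support before comparing.
pc = partial_corr(h[:-1, :].ravel(), v[:, :-1].ravel(), d.ravel())
```

Manipulations such as Gaussian blurring or median filtering alter the neighborhood correlations that statistics like `pc` capture, which is what makes residual-domain features content-independent discriminators.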
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.