Open Access
ARTICLE
PRNU Extraction from Stabilized Video: A Patch Maybe Better than a Bunch
1 Qilu University of Technology (Shandong Academy of Sciences), Shandong Provincial Key Laboratory of Computer Networks, Jinan, 250353, China
2 Shandong Computer Science Center, Shandong Provincial Key Laboratory of Computer Networks, Jinan, 250014, China
3 University of Connecticut, Mansfield, CT, 06269, USA
* Corresponding Author: Jian Li. Email:
Computer Systems Science and Engineering 2021, 36(1), 189-200. https://doi.org/10.32604/csse.2021.014138
Received 01 September 2020; Accepted 16 October 2020; Issue published 23 December 2020
Abstract
This paper presents an algorithm for extracting Photo-Response Non-Uniformity (PRNU) noise from stabilized video. Stabilized video undergoes in-camera processing such as rolling shutter correction, so the PRNU noise in adjacent frames is misaligned by the global and local frame registration performed during this processing. The misalignment prevents the reference PRNU noise and the test PRNU noise from being extracted and matched accurately. We design a maximum likelihood estimation method for extracting the PRNU noise from stabilized video frames. In addition, unlike most prior arts, which match the PRNU noise over the whole frame, we propose a new patch-based matching strategy that reduces the influence of frame misalignment on the PRNU noise. After extracting the reference and test PRNU noises, we match them over overlapping patches rather than over the whole frame, which differs from the traditional matching method. We conduct experiments on 224 stabilized videos taken by 13 smartphones in the VISION database. The area under the curve of the proposed algorithm is 0.841, significantly higher than the 0.805 achieved by whole-frame matching in the traditional algorithm. Experimental results demonstrate the good performance and effectiveness of the proposed strategy compared with prior arts.
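As a rough illustration of the patch-based matching idea summarized above, the sketch below compares a reference PRNU fingerprint and a test noise residual over overlapping patches instead of the whole frame. The patch size, stride, and the use of the maximum per-patch correlation as the decision statistic are assumptions made for illustration only, not the paper's exact settings.

import numpy as np

def ncc(a: np.ndarray, b: np.ndarray) -> float:
    # Normalized cross-correlation between two equally sized patches.
    a = a - a.mean()
    b = b - b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def patch_match_score(reference_prnu: np.ndarray,
                      test_residual: np.ndarray,
                      patch: int = 128,
                      stride: int = 64) -> float:
    # Slide overlapping patches over both arrays and keep the best per-patch
    # correlation; a local match is less sensitive to the misalignment that
    # stabilization introduces than a single whole-frame correlation.
    h, w = reference_prnu.shape
    best = -1.0
    for i in range(0, h - patch + 1, stride):
        for j in range(0, w - patch + 1, stride):
            score = ncc(reference_prnu[i:i + patch, j:j + patch],
                        test_residual[i:i + patch, j:j + patch])
            best = max(best, score)
    return best

# Hypothetical usage: K is a fingerprint estimated from reference frames and
# W is the noise residual of a test frame; tau would be chosen on a validation
# set, e.g., from the ROC curve used to report AUC.
# same_camera = patch_match_score(K, W) > tau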
This work is licensed under a Creative Commons Attribution 4.0 International License , which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.