Forensic detection of re-quantization and re-sampling.
Doctoral thesis, UCL (University College London).
This thesis investigates the forensic detection of re-quantization and re-sampling, operations that often occur when multimedia content has been tampered with. Detection is based on statistical classification techniques, namely the Fisher linear discriminant (FLD) and the support vector machine (SVM).

Successively compressing an image with different quality factors is a typical example of uniform re-quantization. In the first part of the thesis, we investigate the forensic detection of uniform re-quantization in images. Three features are introduced, based on the observation that uniform re-quantization (i) introduces discontinuities in the signal histogram and (ii) induces periodic artifacts. After validating the discriminative potential of these features on synthetic Gaussian signals, we propose a system to detect JPEG re-compression. Both linear (FLD) and non-linear (SVM) classifiers are examined for comparison. Experimental results clearly demonstrate the ability of the proposed features to detect JPEG re-compression, as well as their competitiveness with prior approaches to the same goal.

Successively compressing a speech signal with different speech encodings is a typical example of non-uniform re-quantization. In the second part of the thesis, we investigate the forensic detection of non-uniform re-quantization in speech signals. Two detection algorithms, based on the non-periodic histogram artifacts present in the time domain and the DFT domain respectively, are compared with each other. Comparative experiments indicate that both detection algorithms produce reliable results, with high area under the curve (AUC) values across a range of experimental scenarios. In general, the time-domain detection performs slightly better than the DFT-domain detection; the latter, however, has the advantage of feeding lower-dimensional input vectors to the FLD classifier.

Re-sizing an image is a typical example of re-sampling.
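The histogram artifacts exploited in the first part can be illustrated with a small synthetic experiment. The sketch below is not the thesis's actual feature extraction; the quantization steps (3 followed by 2) and the signal parameters are arbitrary choices made here to make the effect visible: double uniform quantization leaves periodic empty bins in the histogram, while a single quantization does not.

```python
import numpy as np

rng = np.random.default_rng(0)
signal = rng.normal(0.0, 50.0, 100_000)  # synthetic Gaussian signal

def quantize(x, step):
    # Uniform quantization: round to the nearest multiple of `step`.
    return np.round(x / step) * step

single = quantize(signal, 2)               # one quantization, step 2
double = quantize(quantize(signal, 3), 2)  # step 3, then re-quantize with step 2

# Count empty histogram bins in the well-populated centre of the range.
# Re-quantization maps the step-3 lattice onto the step-2 lattice
# unevenly, so some bins receive no samples at all, in a periodic pattern.
bins = np.arange(-90, 91, 2)               # bin edges aligned with step 2
empty_single = int(np.sum(np.histogram(single, bins=bins)[0] == 0))
empty_double = int(np.sum(np.histogram(double, bins=bins)[0] == 0))
print(empty_single, empty_double)          # double quantization shows gaps
```

A classifier such as the FLD or SVM used in the thesis can then be trained on features summarizing these discontinuities.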
In the third part of the thesis, we investigate the forensic detection of re-sampling in images. A new method is proposed to detect re-sampled imagery. The method examines the normalized energy density present within windows of varying size in the second derivative of the image in the frequency domain, and exploits this characteristic to derive a 19-dimensional feature vector that is used to train an SVM classifier. Comparison with prior work reveals that the proposed algorithm performs similarly for re-sampling rates greater than one, and outperforms prior work for re-sampling rates less than one. Experiments are performed for both bilinear and bicubic interpolation, with qualitatively similar results observed for each. Results are also provided for the detection of re-sampled imagery after noise corruption and JPEG compression. As expected, performance degrades as the noise level increases or the JPEG quality factor declines.
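The effect underlying the second-derivative feature can be sketched in one dimension. This is only a toy illustration under our own assumptions (a 1-D signal, factor-2 linear interpolation, a single spectral score), not the thesis's windowed 19-dimensional feature vector: interpolated samples are linear mixes of their neighbours, so the second derivative nearly vanishes on the original sample grid, and the magnitude of the second derivative acquires a strong periodic component visible as a spectral peak.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4096
original = rng.normal(size=n)

# 1-D analogue of bilinear up-sampling by a factor of 2: every inserted
# sample is the average of its two neighbours, hence fully predictable.
upsampled = np.interp(np.arange(0, n - 1, 0.5), np.arange(n), original)

def nyquist_score(x):
    """Relative strength of |second derivative| at the Nyquist frequency.

    Interpolation makes the second derivative (nearly) zero on the
    original grid, so its magnitude is periodic with the re-sampling
    period and its spectrum peaks there; a genuine signal gives a
    roughly flat spectrum instead.
    """
    p = np.abs(np.diff(x, n=2))                 # |discrete 2nd derivative|
    mag = np.abs(np.fft.rfft(p - p.mean()))     # remove DC before scoring
    return mag[-1] / np.median(mag[1:-1])       # peak vs. typical bin

score_orig = nyquist_score(original)
score_up = nyquist_score(upsampled)
print(score_orig, score_up)  # the re-sampled signal stands out clearly
```

The thesis instead measures normalized energy densities within windows of varying size and classifies the resulting feature vector with an SVM; the score above merely shows why re-sampling leaves a detectable spectral signature.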
Title: Forensic detection of re-quantization and re-sampling
Additional information: Permission for digitisation not received
UCL classification: UCL > School of BEAMS > Faculty of Engineering Science > Computer Science