Forensics of High Quality and Nearly Identical JPEG Image Recompression
Pasquini, Cecilia; Boato, Giulia
2016-01-01
Abstract
We address the known problem of detecting a previous compression in JPEG images, focusing on the challenging case of high and very high quality factors (≥ 90) as well as repeated compression with identical or nearly identical quality factors. We first revisit the approaches based on Benford-Fourier analysis in the DCT domain and on block convergence analysis in the spatial domain, both of which were originally conceived for specific scenarios. Leveraging decision tree theory, we then design a combined approach that complements their discriminatory capabilities, obtaining a set of novel detectors targeted at high quality grayscale JPEG images.
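To make the first ingredient concrete, the following is a minimal, illustrative sketch (not the authors' implementation) of the kind of first-digit statistic on block-DCT coefficients that Benford-Fourier style detectors of prior JPEG compression build on: the empirical first-digit distribution of AC coefficients is compared against a generalized Benford reference, and a deviation from it hints at an earlier quantization. Function names such as `first_digit_histogram` and the chi-square comparison are hypothetical choices for this sketch.

```python
# Hedged sketch: first-digit (Benford-style) statistic of block-DCT AC coefficients.
import numpy as np
from scipy.fft import dctn

def first_digit_histogram(gray_image, block=8):
    """Empirical distribution of first digits of AC block-DCT coefficients."""
    h, w = gray_image.shape
    h, w = h - h % block, w - w % block
    counts = np.zeros(9)
    for i in range(0, h, block):
        for j in range(0, w, block):
            coeffs = dctn(gray_image[i:i + block, j:j + block].astype(float),
                          norm="ortho")
            ac = np.abs(coeffs).ravel()[1:]   # drop the DC term
            ac = ac[ac >= 1]                  # ignore near-zero coefficients
            digits = (ac / 10 ** np.floor(np.log10(ac))).astype(int)
            counts += np.bincount(digits, minlength=10)[1:10]
    return counts / counts.sum()

def benford_reference(q=1.0, s=0.0):
    """Generalized Benford law p(d) proportional to log10(1 + 1/(s + d**q))."""
    d = np.arange(1, 10)
    p = np.log10(1 + 1 / (s + d ** q))
    return p / p.sum()

if __name__ == "__main__":
    # Stand-in image: a never-compressed image would follow the reference
    # more closely than a previously JPEG-compressed one.
    img = np.random.randint(0, 256, (256, 256)).astype(float)
    ref = benford_reference()
    chi2 = np.sum((first_digit_histogram(img) - ref) ** 2 / ref)
    print(f"chi-square distance to Benford reference: {chi2:.4f}")
```

A detector along these lines would threshold (or feed to a classifier, e.g. the decision-tree combination mentioned above) such a deviation score, possibly computed per DCT frequency rather than pooled as done here for brevity.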
| File | Access | Type | License | Size | Format |
|---|---|---|---|---|---|
| IH2016.pdf | Restricted (archive administrators only) | Publisher's layout version | All rights reserved | 1.37 MB | Adobe PDF |