Training and Testing Texture Similarity Metrics for Structurally Lossless Compression

Kaixuan Zhang, Zhaochen Shi, Jana Zujovic, Huib De Ridder, Rene Van Egmond, David L. Neuhoff, Thrasyvoulos N. Pappas

Research output: Contribution to journal › Article › Scientific › peer-review

Abstract

We present a systematic approach for training and testing structural texture similarity metrics (STSIMs) so that they can be used to exploit texture redundancy for structurally lossless image compression. The training and testing are based on a set of image distortions that reflect the characteristics of the perturbations present in natural texture images. We conduct empirical studies to determine the perceived similarity scale across all pairs of original and distorted textures. We then introduce a data-driven approach for training the Mahalanobis formulation of STSIM based on the resulting annotated texture pairs. Experimental results demonstrate that training yields significant improvements in metric performance. We also show that the performance of the trained STSIM metrics is competitive with state-of-the-art metrics based on convolutional neural networks, at substantially lower computational cost.
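The Mahalanobis formulation mentioned in the abstract scores a texture pair by a quadratic form over the difference of their feature vectors, with the weight matrix learned from the annotated pairs. The following is a minimal sketch of that quadratic form only; the feature extraction, the vector dimension, and the matrix M shown here are illustrative placeholders, not the authors' implementation or trained parameters.

```python
import numpy as np

def mahalanobis_distance(f_x: np.ndarray, f_y: np.ndarray, M: np.ndarray) -> float:
    """Quadratic-form distance d(x, y) = sqrt((f_x - f_y)^T M (f_x - f_y)).

    f_x, f_y : feature vectors of texture statistics for the two images
               (placeholders for the STSIM features used in the paper).
    M        : positive semidefinite weight matrix (placeholder for the
               matrix learned from annotated texture pairs).
    """
    d = f_x - f_y
    return float(np.sqrt(d @ M @ d))

# Usage with random stand-in features; the dimension is chosen arbitrarily.
rng = np.random.default_rng(0)
dim = 16
f_x, f_y = rng.normal(size=dim), rng.normal(size=dim)
A = rng.normal(size=(dim, dim))
M = A @ A.T  # any PSD matrix serves as a stand-in for the trained matrix
print(mahalanobis_distance(f_x, f_y, M))
```

In this formulation, training amounts to choosing M so that the resulting distances agree with the perceived similarity scale obtained from the empirical studies.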

Original language: English
Pages (from-to): 1614-1626
Number of pages: 13
Journal: IEEE Transactions on Image Processing
Volume: 33
DOIs
Publication status: Published - 2024

Bibliographical note

Green Open Access added to TU Delft Institutional Repository ‘You share, we take care!’ – Taverne project, https://www.openaccess.nl/en/you-share-we-take-care. Otherwise, as indicated in the copyright section: the publisher is the copyright holder of this work, and the author uses Dutch legislation to make this work public.

Keywords

  • Databases
  • Distortion
  • Distortion measurement
  • Image coding
  • Measurement
  • Redundancy
  • Training
