MSU Defenses Video Quality Metrics Benchmark

Check the robustness of your IQA/VQA metrics to adversarial attacks

Project head: Aleksandr Gushchin
G&M Lab head: Dr. Dmitriy Vatolin
Measurements, analysis:
Khaled Abud, Georgii Bychkov,
Ekaterina Shumitskaya, Vyacheslav Napadovsky
Anastasia Antsiferova, Sergey Lavrushkin

Key features of the Benchmark

  • 20+ evaluated defense methods of different types (purification, adversarial training, certified robustness)
  • 14 adversarial white-box (WB) and black-box (BB) attacks, including FGSM-based, Universal Adversarial Perturbation (UAP)-based, and perceptually-aware attacks
  • Automatic cloud-based pipeline for evaluation
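To illustrate the kind of white-box attack listed above, here is a minimal FGSM-style sketch against a toy differentiable quality metric. The metric, weights, and epsilon value are illustrative assumptions, not the benchmark's actual code:

```python
import numpy as np

def toy_metric(img, w):
    # Hypothetical differentiable quality metric: a weighted sum of pixels.
    return float(np.sum(w * img))

def fgsm_attack(img, w, eps=0.03):
    # FGSM step: perturb the input in the direction of the sign of the
    # metric's gradient to inflate the predicted quality score.
    grad = w  # analytic gradient of the linear toy metric w.r.t. the image
    adv = img + eps * np.sign(grad)
    return np.clip(adv, 0.0, 1.0)  # keep pixels in the valid range

rng = np.random.default_rng(0)
img = rng.random((8, 8))
w = rng.standard_normal((8, 8))
adv = fgsm_attack(img, w)
```

For a real no-reference metric the gradient would come from backpropagation through the model rather than a closed form, but the single signed-gradient step is the same.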

What’s new

  • 30.06.2024 — Alpha version of the benchmark released


The dataset can be found here.


Non-adaptive leaderboard for adversarial purification defenses. Evaluated metrics are averaged across all images and attacks. For each defense, the parameter values that yield the highest correlations on adversarial images are selected:
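The selection rule above can be sketched as follows: for each defense-parameter value, average the rank correlation between metric scores and subjective scores across attacks, then keep the value with the highest average. The function names and data layout below are illustrative assumptions, not the benchmark's pipeline:

```python
import numpy as np

def srocc(x, y):
    # Spearman rank-order correlation, computed as Pearson correlation on ranks.
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    return float(np.corrcoef(rx, ry)[0, 1])

def best_defense_param(scores_by_param, subjective):
    # scores_by_param: {param_value: {attack_name: metric scores per image}}
    # Average the correlation with subjective scores across attacks for each
    # parameter value and return the value with the highest average.
    avg = {
        p: float(np.mean([srocc(s, subjective) for s in per_attack.values()]))
        for p, per_attack in scores_by_param.items()
    }
    return max(avg, key=avg.get)

subjective = np.arange(10.0)
scores_by_param = {
    0.1: {"fgsm": subjective.copy(), "uap": subjective.copy()},      # correlates
    0.5: {"fgsm": subjective[::-1].copy(), "uap": subjective[::-1].copy()},  # anti-correlates
}
```

In this toy setup the parameter value 0.1 wins, since its scores track the subjective ranking perfectly across both attacks.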

Methodology and dataset


Cite us

You can find the full text of our paper through the link.

The citation will be added later.


We would greatly appreciate any suggestions and ideas on how to improve our benchmark. Please contact us via email:

30 Jun 2024
See Also
Video Colorization Benchmark
Explore the best video colorization algorithms
Learning-Based Image Compression Benchmark
The first extensive comparison of learned image compression algorithms
Super-Resolution Quality Metrics Benchmark
Discover 66 Super-Resolution Quality Metrics and choose the most appropriate for your videos
Video Saliency Prediction Benchmark
Explore the best video saliency prediction (VSP) algorithms
Super-Resolution for Video Compression Benchmark
Learn about the best SR methods for compressed videos and choose the best model to use with your codec
Metrics Robustness Benchmark
Check your image or video quality metric for robustness to adversarial attacks