MSU Full-Reference Video Quality Metrics Benchmark
Discover the newest metrics and find the most appropriate method for your tasks
Anastasia Antsiferova
Diverse dataset
- 40 different video codecs of 10 compression standards
- 2500+ compressed streams
- 780,000+ subjective scores
- 10,000+ viewers
- User-generated content
VQA and IQA metrics
- 20+ metrics without variations
- The biggest leaderboard of neural-network-based video quality metrics
- Calculations over U and V planes
- Metrics with different weighted averages over planes
Various charts
- Bar chart with the overall metrics performance
- Comparison on different compression standards with 95% confidence intervals
- Speed-Quality chart
Note
This page is a part of MSU Video Quality Benchmark, which you can find here.
Results
The chart below shows the correlation of metrics with subjective scores on our dataset. You can choose the type of correlation and compression standard of codecs used for compression. We recommend that you focus on Spearman’s rank correlation coefficient.
The results of the comparison on different compression standards and different bitrates ranges, as well as full-reference and no-reference metrics detailed analysis, are presented on the leaderboard page.
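As a minimal illustration of the recommended correlation measure, the sketch below computes Spearman's rank correlation coefficient (SROCC) between a metric's scores and subjective scores using SciPy; the score values are made up for the example and are not from the benchmark dataset.

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical metric outputs and mean opinion scores (MOS) for the same
# set of compressed videos; real values would come from a benchmark run.
metric_scores = np.array([0.91, 0.85, 0.78, 0.60, 0.55, 0.42])
subjective_mos = np.array([4.5, 4.1, 3.9, 3.0, 2.8, 2.1])

# SROCC is rank-based: it measures monotonic agreement with the subjective
# scores and is insensitive to the metric's absolute scale.
srocc, p_value = spearmanr(metric_scores, subjective_mos)
print(f"SROCC = {srocc:.3f}")  # prints: SROCC = 1.000 (ranks agree perfectly)
```

Because SROCC depends only on ranks, a metric does not need to be calibrated to the MOS scale to score well, which is why it is a common choice for comparing quality metrics.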
Methodology and dataset
To see all steps of metrics evaluation and the description of our dataset visit the methodology page.
How to submit your method
Find out the strong and weak sides of your method and compare it to the best commercial and free methods.
We kindly invite you to participate in our benchmark. To do this, follow the steps below:
Send us an email to vqa@videoprocessing.ai with the following information:
ref — path to reference video (for full-reference metrics)
dist — path to distorted video
output — path to output of your algorithm
t — threshold, if it's required in your algorithm
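For illustration, a metric exposing the parameters listed above might look like the following argparse sketch; the flag names and defaults here are assumptions for the example, since the benchmark only asks you to describe your tool's actual interface.

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    # Hypothetical CLI mirroring the parameters listed above; the benchmark
    # does not require these exact names, only an equivalent interface.
    p = argparse.ArgumentParser(description="Example full-reference metric CLI")
    p.add_argument("--ref", help="path to reference video (full-reference metrics)")
    p.add_argument("--dist", required=True, help="path to distorted video")
    p.add_argument("--output", required=True, help="path to output of the algorithm")
    p.add_argument("--t", type=float, default=None,
                   help="threshold, if the algorithm requires one")
    return p

# Example invocation with illustrative file names.
args = build_parser().parse_args(
    ["--ref", "ref.yuv", "--dist", "dist.yuv", "--output", "out.json", "--t", "0.5"]
)
print(args.ref, args.dist, args.output, args.t)
```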
You can verify the results of current participants or estimate the performance of your method on public samples of our dataset. Just send us an email with a request to share them with you.
Our policy:
- We won't publish the results of your method without your permission.
- Because the dataset is private, we share only its public samples.
Information about all other participants can be found on the participants page.
Cite us
@inproceedings{NEURIPS2022_59ac9f01,
author = {Antsiferova, Anastasia and Lavrushkin, Sergey and Smirnov, Maksim and Gushchin, Aleksandr and Vatolin, Dmitriy and Kulikov, Dmitriy},
booktitle = {Advances in Neural Information Processing Systems},
editor = {S. Koyejo and S. Mohamed and A. Agarwal and D. Belgrave and K. Cho and A. Oh},
pages = {13814--13825},
publisher = {Curran Associates, Inc.},
title = {Video compression dataset and benchmark of learning-based video-quality metrics},
url = {https://proceedings.neurips.cc/paper_files/paper/2022/file/59ac9f01ea2f701310f3d42037546e4a-Paper-Datasets_and_Benchmarks.pdf},
volume = {35},
year = {2022}
}
You can find the full text of our paper via the link above.
Contacts
We would highly appreciate any suggestions and ideas on how to improve our benchmark. Please contact us via email: vqa@videoprocessing.ai.
You can also subscribe to updates on our benchmark:
MSU Video Quality Measurement Tool
Widest Range of Metrics & Formats
- Modern & classical metrics: SSIM, MS-SSIM, PSNR, VMAF, and 10+ more
- No-reference analysis & video characteristics: blurring, blocking, noise, scene-change detection, NIQE, and more
Fastest Video Quality Measurement
- GPU support: up to 11.7x faster metric calculation
- Real-time measurement
- Unlimited file size
Main MSU VQMT page on compression.ru
Crowd-sourced subjective quality evaluation platform
- Conduct comparison of video codecs and/or encoding parameters
What is it?
Subjectify.us is a web platform for conducting fast crowd-sourced subjective comparisons.
The service is designed for the comparison of images, video, and sound processing methods.
Main features
- Pairwise comparison
- Detailed report
- Providing all of the raw data
- Filtering out answers from cheating respondents
Subjectify.us
MSU Benchmark Collection
- Video Colorization Benchmark
- Super-Resolution for Video Compression Benchmark
- Defenses for Image Quality Metrics Benchmark
- Learning-Based Image Compression Benchmark
- Super-Resolution Quality Metrics Benchmark
- Video Saliency Prediction Benchmark
- Metrics Robustness Benchmark
- Video Upscalers Benchmark
- Video Deblurring Benchmark
- Video Frame Interpolation Benchmark
- HDR Video Reconstruction Benchmark
- No-Reference Video Quality Metrics Benchmark
- Full-Reference Video Quality Metrics Benchmark
- Video Alignment and Retrieval Benchmark
- Mobile Video Codecs Benchmark
- Video Super-Resolution Benchmark
- Shot Boundary Detection Benchmark
- The VideoMatting Project
- Video Completion
- Codecs Comparisons & Optimization
- VQMT
- MSU Datasets Collection
- Metrics Research
- Video Quality Measurement Tool 3D
- Video Filters
- Other Projects