MSU Super-Resolution for Video Compression Benchmark

Discover SR methods for compressed videos and choose the best model to use with your codec

G&M Lab head: Dr. Dmitriy Vatolin
Project adviser: Dr. Dmitriy Kulikov
Measurements, analysis: Evgeniy Bogatyrev, Egor Sklyarov



Diverse dataset

  • H.264, H.265, H.266, AV1, AVS3 codec standards
  • More than 260 test videos
  • 6 different bitrates

Various charts

  • Visual comparison for more than
    80 SR+codec pairs
  • RD curves and bar charts
    for 5 objective metrics
  • SR+codec pairs ranked by BSQ-rate

Extensive report

  • 80+ pages with different plots
  • 15 SOTA SR methods
    and 6 objective metrics
  • Extensive subjective comparison
    with 5300+ valid participants
  • Powered by Subjectify.us


The pipeline of our benchmark

What’s new

  • 21.07.2023 Added E-MoEVRT.
  • 12.06.2023 Added RKPQ-4xSR.
  • 21.07.2022 Added SR codecs to the benchmark. Added MDTVSFA correlation to the correlation chart.
  • 12.04.2022 Uploaded the results of extensive subjective comparison. See “Subjective score” in Charts section.
  • 25.03.2022 Added VRT, BasicVSR, RBPN, and COMISR. Updated Leaderboards and Visualizations sections.
  • 14.03.2022 Uploaded new dataset. Updated the Methodology.
  • 26.10.2021 Updated the Methodology.
  • 12.10.2021 Published October Report. Added 2 new videos to the dataset. Updated Charts section and Visualizations.
  • 28.09.2021 Improved the Leaderboards section to make it more user-friendly, updated the Methodology and added ERQAv1.1 metric.
  • 21.09.2021 Added 2 new videos to the dataset, new plots to the Charts section, and new Visualizations.
  • 14.09.2021 Public beta version released.
  • 31.08.2021 Alpha version released.

Charts

In this section, you can see RD curves, which show the bitrate/quality trade-off of each SR+codec pair, and bar charts, which show the BSQ-rate calculated for objective metrics and subjective scores.

Read about the participants here.
You can see the information about codecs in the methodology.
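For intuition, here is a minimal Python sketch of an area-based RD-curve comparison in the spirit of BSQ-rate: bitrate is interpolated as a function of quality, integrated over the quality range shared by both curves, and the ratio of the integrals is reported. The exact BSQ-rate definition is given in the methodology; the point format, grid size, and the bsq_rate_like helper are illustrative assumptions.

import numpy as np

def bsq_rate_like(ref_points, test_points):
    # ref_points/test_points: lists of (bitrate, quality) tuples for the
    # reference codec and the SR+codec pair; values < 1 favor the pair.
    ref = np.array(sorted(ref_points, key=lambda p: p[1]))
    test = np.array(sorted(test_points, key=lambda p: p[1]))
    # Quality interval covered by both RD curves.
    lo = max(ref[:, 1].min(), test[:, 1].min())
    hi = min(ref[:, 1].max(), test[:, 1].max())
    q = np.linspace(lo, hi, 200)
    # Interpolate bitrate as a function of quality and integrate.
    ref_rate = np.interp(q, ref[:, 1], ref[:, 0])
    test_rate = np.interp(q, test[:, 1], test[:, 0])
    return np.trapz(test_rate, q) / np.trapz(ref_rate, q)

# Example: the SR+codec pair reaches each quality level at ~60% of the bitrate.
reference = [(1000, 30.0), (2000, 33.0), (4000, 36.0)]
sr_codec = [(600, 30.0), (1200, 33.0), (2400, 36.0)]
print(f"BSQ-rate-like score: {bsq_rate_like(reference, sr_codec):.2f}")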


Charts with metrics

You can choose the test sequence, the codec that was used to compress it, and the metric.

If the BSQ-rate of a method equals 0, the method should be considered much better than the reference codec (the codec with no SR).


Correlation of metrics with subjective assessment

We calculated objective metrics on the crops used for the subjective comparison and computed the correlation between the subjective and objective results. Below you can see the average correlation of the metrics over all test cases.

* ERQA-MDTVSFA is calculated by multiplying MDTVSFA and ERQA values over the video.
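To illustrate, here is a hedged sketch of such a correlation computation, assuming Spearman rank correlation (the benchmark's exact correlation coefficient is described in the methodology); the score arrays are made-up placeholders, and ERQA-MDTVSFA is the element-wise product described in the note above.

import numpy as np
from scipy.stats import spearmanr

# Hypothetical per-crop scores: one value per SR+codec pair on one test case.
subjective = np.array([0.81, 0.45, 0.62, 0.93, 0.30])
erqa = np.array([0.70, 0.42, 0.55, 0.88, 0.35])
mdtvsfa = np.array([0.75, 0.50, 0.60, 0.90, 0.28])

# Combined metric: per-video product of MDTVSFA and ERQA values.
erqa_mdtvsfa = erqa * mdtvsfa

for name, values in [("ERQA", erqa), ("MDTVSFA", mdtvsfa),
                     ("ERQA-MDTVSFA", erqa_mdtvsfa)]:
    rho, _ = spearmanr(values, subjective)
    print(f"{name}: Spearman rho = {rho:.3f}")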

Speed/BSQ-rate trade-off

Read about Frames per Second (FPS) calculation here. Read about BSQ-rate over Subjective score here.

Visualization

In this section, you can choose a sequence and see a cropped part of a frame from it, a shifted Y-PSNR visualization, and an ERQAv2.0 visualization for this crop. For shifted Y-PSNR, we find the shift that maximizes Y-PSNR and apply MSU VQMT PSNR to the frames with this shift. See the methodology for more information.
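The measurement itself is done with MSU VQMT; purely to illustrate the shift search, here is a numpy sketch that tries small integer offsets of the distorted luma plane and keeps the one that maximizes PSNR. The ±3-pixel search range and the overlap-only comparison are assumptions.

import numpy as np

def psnr(a, b):
    mse = np.mean((a.astype(np.float64) - b.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)

def shifted_y_psnr(ref_y, dist_y, max_shift=3):
    # ref_y, dist_y: 2-D uint8 luma planes of equal size.
    h, w = ref_y.shape
    best = -float("inf")
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            # Compare only the regions that overlap under shift (dy, dx).
            r = ref_y[max(dy, 0):h + min(dy, 0), max(dx, 0):w + min(dx, 0)]
            d = dist_y[max(-dy, 0):h + min(-dy, 0), max(-dx, 0):w + min(-dx, 0)]
            best = max(best, psnr(r, d))
    return best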


Leaderboards

SR+codec pairs leaderboard

The table below compares all pairs of Super-Resolution algorithms and codecs. Each column shows the BSQ-rate over a specific metric, and you can sort the table by any column.
All methods that took part in the subjective comparison are ranked by BSQ-rate over the subjective score; the other methods are ranked by BSQ-rate over ERQA.
If the BSQ-rate of a method equals 0, the method should be considered much better than the reference codec.
If the BSQ-rate of a method tends to infinity (marked '∞'), the method should be considered much worse than the reference codec.
"TBP" means that the SR+codec pair did not take part in the subjective comparison.

Table columns: Rank, SR + codec, and BSQ-rate over Y-VMAF, ERQAv2.0, Y-PSNR, Y-MS-SSIM, and LPIPS.

SR codecs

You can find information about SR codecs on the participants page.

You can choose the test sequence and the metric.


Submit your method

Verify your method’s ability to restore compressed videos and compare it with other algorithms.
You can go to the page with information about other participants.

1. Download input data. Download the low-resolution input videos as sequences of frames in PNG format.
There are 2 available options:
  1. Download 1 folder with all videos joined into one sequence here.
    Neighboring videos are separated by 5 black frames, which are skipped
    during evaluation (see the sketch after this list).

  2. If you are concerned that this joining may affect your method's performance,
    you can download 269 folders, one per video, here.
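Here is the sketch referenced above: a hedged example of splitting the joined sequence back into individual videos by detecting the 5-frame black separators. The brightness threshold and the glob pattern are assumptions, not the benchmark's official splitting code.

import numpy as np
from pathlib import Path
from PIL import Image

def split_by_black_frames(frame_dir, threshold=4.0, separator_len=5):
    frames = sorted(Path(frame_dir).glob("*.png"))
    videos, current, pending = [], [], []
    for path in frames:
        mean = np.asarray(Image.open(path).convert("L")).mean()
        if mean < threshold:
            pending.append(path)
            if len(pending) == separator_len:  # a full separator: close the video
                if current:
                    videos.append(current)
                    current = []
                pending = []
        else:
            current.extend(pending)  # dark frames inside a video: keep them
            pending = []
            current.append(path)
    if current:
        videos.append(current)
    return videos  # list of per-video frame-path lists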


2. Apply your algorithm. Apply your Super-Resolution algorithm to upscale the frames to 1920×1080 resolution.
You can also send us your method's code or an executable file
with instructions on how to run it, and we will run it ourselves.
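A minimal sketch of this step, with bicubic interpolation standing in for an actual SR model; the folder names are placeholders, and the upscale function is where your network's inference would go.

from pathlib import Path
from PIL import Image

TARGET = (1920, 1080)  # required output resolution

def upscale(frame):
    # Placeholder: replace this call with your Super-Resolution model.
    return frame.resize(TARGET, Image.BICUBIC)

src, dst = Path("input_frames"), Path("output_frames")
dst.mkdir(exist_ok=True)
for path in sorted(src.glob("*.png")):
    upscale(Image.open(path)).save(dst / path.name)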

3. Send us the result. Send an email to sr-codecs-benchmark@videoprocessing.ai with the following
information:
    A. The name of your method as it should appear in our benchmark
    B. A way for us to download your method's output frames (e.g., a link
    to a cloud drive)
    C. (Optional) Any additional information about the method:
      1. The full name of your model
      2. The parameters that were used
      3. A link to the code of your model, if it is available
      4. A link to the paper about your model, if it is available
      5. The execution time of your algorithm and information about the GPU used, if any
      6. Any other additional information

You can verify the results of the current participants or estimate the performance of your method on public samples of our dataset. Just send an email to sr-codecs-benchmark@videoprocessing.ai with a request to share them with you.

Our policy:

  • We won't publish the results of your method without your permission.
  • We share only public samples of our dataset, as the full dataset is private.

Download the Report

Download the report (free download; PDF, 25.5 MB)

Released on October 12, 2021:

  • 75 SR+codec pairs
  • x264, x265, aomenc, VVenC, uavs3 codecs
  • 3 Full HD video sequences
  • 6 objective metrics (Y-PSNR, YUV-MS-SSIM, Y-VMAF, Y-VMAF NEG, LPIPS, ERQA) and a subjective comparison
  • 80+ pages with plots


Cite Us

To refer to our benchmark in your work, cite our paper:

@article{bogatyrev2023compressed,
  author={Bogatyrev, Evgeney and Molodetskikh, Ivan and Vatolin, Dmitriy},
  journal={arXiv preprint arXiv:2305.04844},
  title={Compressed video quality assessment for super-resolution: a benchmark and a quality metric},
  year={2023},
}

Contact Us

For questions and suggestions, please contact us: sr-codecs-benchmark@videoprocessing.ai



MSU Video Quality Measurement Tool

The tool for performing video/image quality analysis using reference or no-reference metrics

Widest Range of Metrics & Formats

  • Modern & classical metrics: SSIM, MS-SSIM, PSNR, VMAF, and 10+ more
  • No-reference analysis & video characteristics:
    Blurring, Blocking, Noise, Scene change detection, NIQE, and more

Fastest Video Quality Measurement

  • GPU support
    Up to 11.7x faster calculation of metrics with GPU
  • Real-time measurement
  • Unlimited file size

  • Main MSU VQMT page on compression.ru

Crowd-sourced subjective quality evaluation platform

  • Conduct comparisons of video codecs and/or encoding parameters

What is it?

Subjectify.us is a web platform for conducting fast crowd-sourced subjective comparisons.

The service is designed for the comparison of images, video, and sound processing methods.

Main features

  • Pairwise comparison
  • Detailed report
  • All of the raw data provided
  • Answers from cheating respondents filtered out

  • Subjectify.us