MSU Video Frame Interpolation Benchmark

Discover the best algorithm for high-quality and smooth slow motion videos

Powered by
G&M Lab head: Dr. Dmitriy Vatolin
Measurements, analysis: Andrey Akifev, Konstantin Kozhemiakov

Key features of the Benchmark

  • Comparison of 8 methods of video frame interpolation
  • A new dataset with gaming content and real-life footage
  • 5 objective metrics for interpolation quality assessment and speed measurement of algorithms
  • Subjective comparison with more than 400 participants (powered by Subjectify.us)
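As a rough illustration of how full-reference metrics such as PSNR compare an interpolated frame against ground truth, here is a minimal sketch (the frames are synthetic toy data, not benchmark frames, and this is not the benchmark's own measurement code):

```python
import numpy as np

def psnr(gt: np.ndarray, pred: np.ndarray, peak: float = 255.0) -> float:
    """Peak signal-to-noise ratio between a ground-truth and an interpolated frame."""
    mse = np.mean((gt.astype(np.float64) - pred.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # frames are identical
    return 10.0 * np.log10(peak ** 2 / mse)

# Toy example: a uniform per-pixel difference of 1 between prediction and ground truth.
gt = np.zeros((64, 64, 3))
pred = np.ones((64, 64, 3))
print(round(psnr(gt, pred), 2))  # 48.13
```

Higher PSNR means the interpolated frame is closer to the real intermediate frame; SSIM, MS-SSIM, VMAF, and LPIPS capture structural and perceptual similarity instead of raw pixel error.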

Leaderboard

The table below compares all Video Frame Interpolation methods; you can sort it by any metric.
To browse the papers and implementations of the algorithms, go to the Participants tab. Details of the comparison are given in the Methodology tab.


Rank Algorithm Subjective PSNR SSIM VMAF LPIPS MS-SSIM FPS

* These algorithms do not require computing power

Charts

Speed/Quality trade-off


Visualizations

This section presents visualizations of all algorithms.

  • The first line shows full-sized frames
  • The second line shows crops from the interpolated intermediate frames
  • The third and fourth lines show PSNR and SSIM error maps, respectively
You can select up to three models and one of the test videos for comparison. Use the sliding window to zoom into a region of interest and examine interpolation artifacts in detail.
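An error map of this kind is essentially a normalized per-pixel squared error between ground truth and the interpolated frame. A hedged sketch (the display scaling is an assumption; the benchmark's exact rendering may differ):

```python
import numpy as np

def error_map(gt: np.ndarray, pred: np.ndarray) -> np.ndarray:
    """Per-pixel squared error, averaged over color channels (brighter = worse)."""
    err = np.mean((gt.astype(np.float64) - pred.astype(np.float64)) ** 2, axis=-1)
    peak = err.max()
    if peak == 0:
        return np.zeros(err.shape, dtype=np.uint8)  # identical frames: all-black map
    return (err / peak * 255.0).astype(np.uint8)  # scale to [0, 255] for display

# A single wrong pixel lights up in the map.
gt = np.zeros((8, 8, 3))
pred = gt.copy()
pred[3, 3] = 255
print(error_map(gt, pred)[3, 3])  # 255
```

Regions where a method hallucinates or misaligns content show up as bright areas in such maps, which makes them convenient for spotting localized interpolation artifacts.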



Drag the red rectangle over the area you want to crop.


Your method submission

Evaluate the interpolation quality of your Video Frame Interpolation algorithm and compare it with other solutions. You can see information about the other participants here.

1. Download input data
   Download the low-frame-rate videos.
2. Apply your algorithm
   Interpolate the intermediate frames of the low-FPS videos using your algorithm.
   You can also send us your method's code or an executable file, and we will run it ourselves.
3. Send us the result
   Send an email to vfi-benchmark@videoprocessing.ai with the following information:
    A. Name of your method that will be specified in our benchmark
    B. Link to a cloud drive (Google Drive, OneDrive, Dropbox, etc.) containing the output frames.
      You can send us files in the following formats:
      1) .png
      2) .mov, if you assemble the frames into a video yourself
      Please read the evaluation section of the Methodology before submitting your algorithm.
    C. (Optional) Execution time of your algorithm and information about the GPU used
    D. (Optional) Any additional information about the method:
      1. Full name of your model
      2. The parameter set that was used
      3. A link to the code of your model, if it is available
      4. A link to the paper about your model, if it is available
      5. Any other additional information
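If you choose the .mov route, one way to pack numbered .png frames into a video is via ffmpeg. The sketch below only builds the command; the codec choice (FFV1, picked here because lossless encoding leaves metric scores unaffected) and the frame-name pattern are assumptions — check the evaluation section of the Methodology for the required settings:

```python
import subprocess

def frames_to_mov(pattern: str, fps: int, out_path: str, run: bool = False) -> list:
    """Build (and optionally run) an ffmpeg command that packs an image
    sequence into a .mov container. FFV1 is an assumed lossless codec
    choice, not the benchmark's official requirement."""
    cmd = [
        "ffmpeg",
        "-framerate", str(fps),  # rate of the input image sequence
        "-i", pattern,           # e.g. "frames/%05d.png"
        "-c:v", "ffv1",          # lossless video codec
        out_path,
    ]
    if run:
        subprocess.run(cmd, check=True)  # requires ffmpeg on PATH
    return cmd

print(" ".join(frames_to_mov("frames/%05d.png", 60, "result.mov")))
```

Submitting raw .png frames avoids any risk that the container or codec alters the pixels before evaluation.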

Contacts

We would greatly appreciate any suggestions and ideas on how to improve our benchmark. For questions and proposals, please contact us at vfi-benchmark@videoprocessing.ai

You can also subscribe to updates on our benchmark:


MSU Video Quality Measurement Tool


    A tool for video/image quality analysis using reference or no-reference metrics

Widest Range of Metrics & Formats

  • Modern & classical metrics
    SSIM, MS-SSIM, PSNR, VMAF, and 10+ more
  • No-reference analysis & video characteristics
    Blurring, blocking, noise, scene-change detection, NIQE, and more

Fastest Video Quality Measurement

  • GPU support
    Up to 11.7x faster metric calculation with GPU
  • Real-time measurement
  • Unlimited file size

  • Main MSU VQMT page on compression.ru

Crowd-sourced subjective quality evaluation platform

  • Compare video codecs and/or encoding parameters

What is it?

Subjectify.us is a web platform for conducting fast crowd-sourced subjective comparisons.

The service is designed for comparing image, video, and sound processing methods.

Main features

  • Pairwise comparison
  • Detailed report
  • Providing all of the raw data
  • Filtering out answers from cheating respondents

  • Subjectify.us
04 Oct 2022
See Also
Real-World Stereo Color and Sharpness Mismatch Dataset
Download new real-world video dataset of stereo color and sharpness mismatches
Super-Resolution Quality Metrics Benchmark
Discover 66 Super-Resolution Quality Metrics and choose the most appropriate for your videos
Learning-Based Image Compression Benchmark
The first extensive comparison of Learned Image Compression algorithms
Video Saliency Prediction Benchmark
Explore the best video saliency prediction (VSP) algorithms
Super-Resolution for Video Compression Benchmark
Learn about the best SR methods for compressed videos and choose the best model to use with your codec
Metrics Robustness Benchmark
Check your image or video quality metric for robustness to adversarial attacks