A Stacked Deep MEMC Network for Frame Rate Up Conversion and its Application to HEVC

Bibliographic Details
Main Authors: Nguyen Van Thang, Kyujoong Lee, Hyuk-Jae Lee
Format: Article
Language:English
Published: IEEE 2020-01-01
Series:IEEE Access
Subjects: Frame rate up conversion, video frame interpolation, optical flow, HEVC, MEMC, CNN
Online Access:https://ieeexplore.ieee.org/document/9042307/
author Nguyen Van Thang
Kyujoong Lee
Hyuk-Jae Lee
collection DOAJ
description Optical flow estimation and video frame interpolation form a chicken-and-egg problem: the quality of each depends on the other. This paper presents a stack of deep networks that first synthesizes an initial intermediate frame, estimates intermediate optical flows from that frame, and then generates the final interpolated frame by combining the initial synthesis with two frames warped by the learned intermediate flows. The primary benefit is that the two problems are glued into a single comprehensive framework that is learned jointly, using both an analysis-by-synthesis technique for optical flow estimation and Convolutional Neural Network (CNN) kernel-based frame synthesis. The proposed network is the first attempt to merge the two previous branches of approaches, optical-flow-based synthesis and CNN-kernel-based synthesis, into one comprehensive network. Experiments on several challenging datasets show that the proposed network outperforms state-of-the-art methods by significant margins for video frame interpolation and that its estimated optical flows are more accurate for challenging motion. Furthermore, the proposed Motion Estimation Motion Compensation (MEMC) network substantially enhances the quality of compressed videos.
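As a reading aid, the pipeline described above can be viewed as a three-stage stack: an initial CNN synthesis of the intermediate frame, estimation of two intermediate flows anchored on that frame, and a fusion of the initial guess with the two flow-warped inputs. The PyTorch sketch below illustrates only this structure; the module names, layer widths, and the plain-CNN stand-in for the paper's adaptive-kernel synthesis stage are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of a stacked MEMC-style interpolation pipeline (assumed structure,
# not the published code): synthesize a first middle frame, estimate intermediate
# flows from it, warp the inputs, and fuse everything into the final frame.
import torch
import torch.nn as nn
import torch.nn.functional as F


def conv_block(c_in, c_out):
    return nn.Sequential(nn.Conv2d(c_in, c_out, 3, padding=1), nn.ReLU(inplace=True))


class InitialSynthesis(nn.Module):
    """Stage 1: first guess of the middle frame from the two input frames."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(conv_block(6, 32), conv_block(32, 32),
                                 nn.Conv2d(32, 3, 3, padding=1))

    def forward(self, f0, f1):
        return self.net(torch.cat([f0, f1], dim=1))


class FlowEstimator(nn.Module):
    """Stage 2: intermediate flows from the synthesized middle frame to each input."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(conv_block(9, 32), conv_block(32, 32),
                                 nn.Conv2d(32, 4, 3, padding=1))

    def forward(self, f0, f1, f_mid):
        flows = self.net(torch.cat([f0, f1, f_mid], dim=1))
        return flows[:, :2], flows[:, 2:]          # flow mid->0, flow mid->1


def backward_warp(img, flow):
    """Sample `img` at locations displaced by `flow` (in pixels, x then y channels)."""
    b, _, h, w = img.shape
    ys, xs = torch.meshgrid(torch.arange(h, device=img.device),
                            torch.arange(w, device=img.device), indexing="ij")
    grid = torch.stack((xs, ys), dim=-1).float()              # (H, W, 2) pixel coords
    grid = grid.unsqueeze(0) + flow.permute(0, 2, 3, 1)       # add per-pixel displacement
    gx = 2.0 * grid[..., 0] / (w - 1) - 1.0                   # normalize to [-1, 1]
    gy = 2.0 * grid[..., 1] / (h - 1) - 1.0
    return F.grid_sample(img, torch.stack((gx, gy), dim=-1), align_corners=True)


class Fusion(nn.Module):
    """Stage 3: blend the first guess with the two flow-warped frames."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(conv_block(9, 32), conv_block(32, 32),
                                 nn.Conv2d(32, 3, 3, padding=1))

    def forward(self, f_mid, w0, w1):
        return self.net(torch.cat([f_mid, w0, w1], dim=1))


class StackedMEMC(nn.Module):
    def __init__(self):
        super().__init__()
        self.synth, self.flow, self.fuse = InitialSynthesis(), FlowEstimator(), Fusion()

    def forward(self, f0, f1):
        f_mid = self.synth(f0, f1)                  # first intermediate guess
        flow0, flow1 = self.flow(f0, f1, f_mid)     # flows from the middle frame to f0, f1
        w0, w1 = backward_warp(f0, flow0), backward_warp(f1, flow1)
        return self.fuse(f_mid, w0, w1)             # final interpolated frame


if __name__ == "__main__":
    f0, f1 = torch.rand(1, 3, 64, 64), torch.rand(1, 3, 64, 64)
    print(StackedMEMC()(f0, f1).shape)              # torch.Size([1, 3, 64, 64])
```

Since the abstract states that the two problems are learned together, a faithful implementation would backpropagate a loss on the fused output through all three stages rather than training them separately.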
format Article
id doaj.art-0bb74930ef634f999df2dd46a48d35e9
institution Directory Open Access Journal
issn 2169-3536
language English
publishDate 2020-01-01
publisher IEEE
series IEEE Access
doi 10.1109/ACCESS.2020.2982039
article_number 9042307
volume 8
pages 58310-58321
orcid Nguyen Van Thang https://orcid.org/0000-0003-0841-8586
orcid Kyujoong Lee https://orcid.org/0000-0002-3080-3010
affiliation Nguyen Van Thang: Department of Electrical and Computer Engineering, Seoul National University, Seoul, South Korea
affiliation Kyujoong Lee: Department of Electronic Engineering, Sun Moon University, Asan, South Korea
affiliation Hyuk-Jae Lee: Department of Electrical and Computer Engineering, Seoul National University, Seoul, South Korea
title A Stacked Deep MEMC Network for Frame Rate Up Conversion and its Application to HEVC
topic Frame rate up conversion
video frame interpolation
optical flow
HEVC
MEMC
CNN
url https://ieeexplore.ieee.org/document/9042307/