An Automated Method for Identification of Key Frames in Bharatanatyam Dance Videos


Bibliographic Details
Main Authors: Himadri Bhuyan, Partha Pratim Das, Jatindra Kumar Dash, Jagadeesh Killi
Format: Article
Language: English
Published: IEEE 2021-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/9429237/
Description
Summary: Identifying key frames is the first and necessary step toward solving a variety of other Bharatanatyam analysis problems. The paper aims to separate the momentarily stationary frames (key frames) from the motion frames of a dance video. The proposed key frame (KF) localization is novel, simple, and effective compared with existing dance-video analysis methods, and it differs from the standard KF detection algorithms used for other human-motion videos. In the basic structure of the dance, the KFs occurring during a performance are often not completely stationary, and their degree of stillness varies with the dance form and the performer. Hence, it is difficult to fix a global threshold (on the quantum of motion) that works across dancers and performances. Earlier approaches compute the threshold iteratively. The novelty of this paper lies in (a) formulating an adaptive threshold, (b) adopting a machine learning (ML) approach, and (c) generating an effective feature for KF detection by combining three-frame differencing with a bit-plane technique. For ML, a Support Vector Machine (SVM) and a Convolutional Neural Network (CNN) are used as classifiers. The proposed approaches are compared and analyzed against the earlier ones. Finally, the proposed ML techniques emerge as the winner, with around 90% accuracy.
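
To illustrate the kind of feature the abstract describes, the following minimal Python/OpenCV sketch combines three-frame differencing with a bit-plane mask and applies a simple adaptive threshold over the per-frame motion scores. This is not the authors' implementation: the input file name, the choice of bit plane, and the threshold statistic are illustrative assumptions.

    # Illustrative sketch (assumptions: file name, bit-plane index, threshold rule).
    import cv2
    import numpy as np

    def motion_feature(prev_f, curr_f, next_f, plane=4):
        """Three-frame differencing combined with a single bit plane (assumed design)."""
        g = [cv2.cvtColor(f, cv2.COLOR_BGR2GRAY) for f in (prev_f, curr_f, next_f)]
        d1 = cv2.absdiff(g[1], g[0])
        d2 = cv2.absdiff(g[2], g[1])
        diff = cv2.bitwise_and(d1, d2)        # pixels that changed in both neighbouring pairs
        mask = (diff >> plane) & 1            # keep only the chosen bit plane of the difference
        return float(mask.mean())             # fraction of high-motion pixels in the frame

    cap = cv2.VideoCapture("adavu_clip.mp4")  # hypothetical input video
    frames = []
    ok, frame = cap.read()
    while ok:
        frames.append(frame)
        ok, frame = cap.read()
    cap.release()

    scores = [motion_feature(frames[i - 1], frames[i], frames[i + 1])
              for i in range(1, len(frames) - 1)]

    # Adaptive threshold derived from the clip's own motion statistics
    # (the paper's actual formulation may differ).
    thr = float(np.mean(scores))
    key_frames = [i + 1 for i, s in enumerate(scores) if s < thr]
    print("candidate key frames:", key_frames)

In the ML variants the paper describes, the per-frame feature would instead be fed to an SVM or CNN classifier rather than compared against a threshold.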
ISSN:2169-3536