Internet Data-based Media Resource Mining Method for Artistic Talent Cultivation



Bibliographic Details
Main Author: Wang Lina
Format: Article
Language: English
Published: Sciendo 2024-01-01
Series: Applied Mathematics and Nonlinear Sciences
Online Access: https://doi.org/10.2478/amns.2023.2.01522
Description
Summary: The cultivation of innovative artistic talents is a top priority of art education, and making full use of Internet media resources can supply art development with talents of high artistic literacy. This paper begins with the integration of media resources in Internet data and builds a framework for the dimensions and processes of media resource knowledge integration. A BERT model combined with a multi-head self-attention mechanism extracts features from media text resources in Internet data, a matching pursuit algorithm decomposes the signals of audio media resources, and a temporal hierarchical attention model recommends video resources for art talent cultivation. To verify the effectiveness of these algorithms, each is validated experimentally. The results show that as the IR value of the audio signal decomposition increases from 1.49 to 10.36, the ROC-AUC value for art-cultivation media audio decreases from 0.95±0.02 to 0.93±0.04. After 200 iterations, the video resource recommendation algorithm's runtime fluctuates between 0.21 ms and 9.81 ms, depending on the user's behavior sequence. Deeply mining Internet data with such technology can diversify the media resources available for art talent training and improve the quality of that training.
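The audio step in the abstract refers to matching pursuit (sometimes rendered "matching tracking" in translation), a greedy signal decomposition over a dictionary of atoms. The paper's actual dictionary and stopping criterion for audio media are not given; the following is only a minimal NumPy sketch with a hypothetical unit-norm dictionary `D`:

```python
import numpy as np

def matching_pursuit(x, D, n_iter=10):
    """Greedily decompose signal x over dictionary D (unit-norm atoms as columns)."""
    residual = x.astype(float).copy()
    coeffs = np.zeros(D.shape[1])
    for _ in range(n_iter):
        corr = D.T @ residual            # correlation of residual with every atom
        k = np.argmax(np.abs(corr))      # pick the best-matching atom
        coeffs[k] += corr[k]             # accumulate its coefficient
        residual -= corr[k] * D[:, k]    # remove its contribution from the residual
    return coeffs, residual

# Toy example with an orthonormal dictionary: the signal is recovered exactly.
D = np.eye(2)
coeffs, res = matching_pursuit(np.array([3.0, 4.0]), D, n_iter=2)
```

With a non-orthogonal, overcomplete dictionary the residual shrinks but generally does not vanish; the number of iterations (or a residual-energy threshold) controls the trade-off between sparsity and reconstruction error.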
ISSN: 2444-8656