Optimizing Session-Aware Recommenders: A Deep Dive into GRU-Based Latent Interaction Integration
This study introduces session-aware recommendation models, leveraging GRU (Gated Recurrent Unit) and attention mechanisms for advanced latent interaction data integration. A primary advancement is enhancing latent context, a critical factor for boosting recommendation accuracy. We address the existing models’ rigidity by dynamically blending short-term (most recent) and long-term (historical) preferences, moving beyond static period definitions.
Main Authors: | Ming-Yen Lin, Ping-Chun Wu, Sue-Chen Hsueh |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2024-02-01 |
Series: | Future Internet |
Subjects: | recommender system; session-aware recommendation; latent-context information; long-term and short-term preference; gated recurrent unit |
Online Access: | https://www.mdpi.com/1999-5903/16/2/51 |
author | Ming-Yen Lin; Ping-Chun Wu; Sue-Chen Hsueh |
collection | DOAJ |
description | This study introduces session-aware recommendation models, leveraging GRU (Gated Recurrent Unit) and attention mechanisms for advanced latent interaction data integration. A primary advancement is enhancing latent context, a critical factor for boosting recommendation accuracy. We address the existing models’ rigidity by dynamically blending short-term (most recent) and long-term (historical) preferences, moving beyond static period definitions. Our approaches, pre-combination (LCII-Pre) and post-combination (LCII-Post), with fixed (Fix) and flexible learning (LP) weight configurations, are thoroughly evaluated. We conducted extensive experiments to assess our models’ performance on public datasets such as Amazon and MovieLens 1M. Notably, on the MovieLens 1M dataset, LCII-Pre<sub>Fix</sub> achieved a 1.85% and 2.54% higher Recall@20 than II-RNN and BERT4Rec<sub>+st+TSA</sub>, respectively. On the Steam dataset, LCII-Post<sub>LP</sub> outperformed these models by 18.66% and 5.5%. Furthermore, on the Amazon dataset, LCII showed a 2.59% and 1.89% improvement in Recall@20 over II-RNN and CAII. These results affirm the significant enhancement our models bring to session-aware recommendation systems, showcasing their potential for both academic and practical applications in the field. |
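The abstract's fixed ("Fix") versus learned ("LP") blending of short- and long-term preferences can be sketched as a convex combination of two session representations. The following is an illustrative reconstruction only, not the paper's actual LCII implementation: the `blend` helper, the scalar logit `w`, and the toy vectors are all invented for the example.

```python
import math

def sigmoid(x: float) -> float:
    """Squash a real-valued logit into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def blend(short_pref, long_pref, w):
    """Convex combination of a short-term (current-session) and a
    long-term (historical) preference vector. A constant w mimics a
    fixed-weight ("Fix") configuration; treating w as a trainable
    logit mimics a learned-weight ("LP") configuration."""
    alpha = sigmoid(w)  # blending weight in (0, 1)
    return [alpha * s + (1.0 - alpha) * l
            for s, l in zip(short_pref, long_pref)]

# Toy 3-dimensional preference vectors (invented for illustration).
short_pref = [0.9, 0.1, 0.0]   # derived from the current session
long_pref  = [0.2, 0.5, 0.3]   # derived from past sessions

# w = 0 gives alpha = 0.5, i.e. an equal fixed blend.
blended = blend(short_pref, long_pref, 0.0)
print(blended)
```

In the paper's terms, a pre-combination variant (LCII-Pre) would apply such a blend before the recurrent encoding step, while a post-combination variant (LCII-Post) would blend the encoded outputs; the convex-combination idea is the same in both.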
first_indexed | 2024-03-07T22:31:58Z |
format | Article |
id | doaj.art-bdecc95885dc440aa27b0b110b4657a7 |
institution | Directory Open Access Journal |
issn | 1999-5903 |
language | English |
last_indexed | 2024-03-07T22:31:58Z |
publishDate | 2024-02-01 |
publisher | MDPI AG |
record_format | Article |
series | Future Internet |
doi | 10.3390/fi16020051 |
citation | Future Internet, vol. 16, iss. 2, art. 51, MDPI AG, 2024-02-01 |
author_affiliation | Ming-Yen Lin: Department of Information Engineering and Computer Science, Feng Chia University, Taichung 402, Taiwan |
author_affiliation | Ping-Chun Wu: Department of Information Engineering and Computer Science, Feng Chia University, Taichung 402, Taiwan |
author_affiliation | Sue-Chen Hsueh: Department of Information Management, Chaoyang University of Technology, Taichung 413, Taiwan |
title | Optimizing Session-Aware Recommenders: A Deep Dive into GRU-Based Latent Interaction Integration |
topic | recommender system; session-aware recommendation; latent-context information; long-term and short-term preference; gated recurrent unit |
url | https://www.mdpi.com/1999-5903/16/2/51 |