A Transformer-BERT Integrated Model-Based Automatic Conversation Method Under English Context


Bibliographic Details
Main Authors: Xing'an Li, Tangfa Liu, Longlong Zhang, Fayez Alqahtani, Amr Tolba
Format: Article
Language: English
Published: IEEE, 2024-01-01
Series: IEEE Access
Subjects: Large language model; automatic conversation; semantic context; natural language processing
Online Access: https://ieeexplore.ieee.org/document/10497586/
collection DOAJ
description Contextual understanding in complex conversation scenarios remains a challenging problem, and most existing methods lack this capability. To bridge this gap, this paper formulates a novel composite large language model. Taking the English context as the scene, a Transformer-BERT integrated automatic conversation model is proposed. First, the unidirectional BERT-based automatic conversation model is improved by introducing an attention mechanism, which is expected to enhance feature expression for conversation texts by linking context to identify long, difficult sentences. In addition, a bidirectional Transformer encoder is used as the input layer before the BERT encoder. Through these two modules, dynamic language training on English situational conversations is completed to build the automatic conversation model. The proposed model is further assessed on a large volume of real-world English conversational context. The experimental results show that, compared with traditional rule-based or machine learning methods, the proposal significantly improves response quality and fluency in the English context: it understands context more accurately, captures subtle semantic differences, and generates more coherent responses.
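The architecture the abstract describes, a bidirectional Transformer encoder acting as an input layer whose output feeds a BERT-style encoder stack, can be sketched in plain NumPy. This is a minimal illustrative sketch, not the authors' implementation: all class names, dimensions, and initializations here are assumptions, and real BERT layers additionally use multi-head attention, layer normalization, and pretrained weights.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x, Wq, Wk, Wv):
    # Scaled dot-product self-attention over the full sequence:
    # bidirectional, since every token attends to every other token.
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = q @ k.T / np.sqrt(k.shape[-1])
    return softmax(scores) @ v

class EncoderLayer:
    """One Transformer encoder layer: self-attention plus a ReLU
    feed-forward block, each with a residual connection (layer
    normalization omitted for brevity)."""
    def __init__(self, d, rng):
        self.Wq, self.Wk, self.Wv = (rng.standard_normal((d, d)) * 0.1 for _ in range(3))
        self.W1 = rng.standard_normal((d, 4 * d)) * 0.1
        self.W2 = rng.standard_normal((4 * d, d)) * 0.1
    def __call__(self, x):
        x = x + self_attention(x, self.Wq, self.Wk, self.Wv)
        return x + np.maximum(x @ self.W1, 0.0) @ self.W2

class TransformerBertModel:
    """Composition from the abstract: a bidirectional Transformer
    encoder serves as the input layer, and its output is passed to a
    (here heavily simplified) BERT-style encoder stack."""
    def __init__(self, d=32, bert_layers=2, seed=0):
        rng = np.random.default_rng(seed)
        self.input_encoder = EncoderLayer(d, rng)                # input layer
        self.bert = [EncoderLayer(d, rng) for _ in range(bert_layers)]
    def __call__(self, token_embeddings):
        h = self.input_encoder(token_embeddings)
        for layer in self.bert:
            h = layer(h)
        return h  # contextual representation, one vector per token

# Usage: 10 tokens of a conversation turn, embedding size 32.
model = TransformerBertModel(d=32)
out = model(np.random.default_rng(1).standard_normal((10, 32)))
print(out.shape)  # (10, 32)
```

The sketch only shows the data flow; a response generator on top of these contextual representations would be needed for an actual conversation system.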
format Article
id doaj.art-cc28bec2e878408387b739fc97d0093d
institution Directory Open Access Journal
issn 2169-3536
language English
publishDate 2024-01-01
publisher IEEE
series IEEE Access
doi 10.1109/ACCESS.2024.3388100
volume 12
pages 55757-55767
affiliations:
Xing'an Li (https://orcid.org/0009-0006-8256-6359), Department of Foreign Languages and Literature, Gongqing College of Nanchang University, Jiujiang, China
Tangfa Liu (https://orcid.org/0009-0000-9140-8527), School of Economics and Management, Gannan University of Science and Technology, Ganzhou, China
Longlong Zhang, Department of Foreign Languages and Literature, Gongqing College of Nanchang University, Jiujiang, China
Fayez Alqahtani (https://orcid.org/0000-0001-8972-5953), Department of Software Engineering, College of Computer and Information Sciences, King Saud University, Riyadh, Saudi Arabia
Amr Tolba (https://orcid.org/0000-0003-3439-6413), Department of Computer Science, Community College, King Saud University, Riyadh, Saudi Arabia
title A Transformer-BERT Integrated Model-Based Automatic Conversation Method Under English Context
topic Large language model
automatic conversation
semantic context
natural language processing
url https://ieeexplore.ieee.org/document/10497586/