Explaining pretrained language models' understanding of linguistic structures using construction grammar

Construction Grammar (CxG) is a paradigm from cognitive linguistics that emphasizes the connection between syntax and semantics. Rather than rules operating on lexical items, it posits constructions as the central building blocks of language, i.e., linguistic units of varying granularity that combine syntax and semantics. As a first step toward assessing the compatibility of CxG with the syntactic and semantic knowledge demonstrated by state-of-the-art pretrained language models (PLMs), we investigate their ability to classify and understand one of the most commonly studied constructions, the English comparative correlative (CC). We conduct experiments examining, on the one hand, the classification accuracy of a syntactic probe and, on the other, the models' behavior in a semantic application task, with BERT, RoBERTa, and DeBERTa as the example PLMs. Our results show that all three investigated PLMs, as well as OPT, recognize the structure of the CC but fail to use its meaning. While PLMs have been claimed to reach human-like performance on many NLP tasks, this finding indicates that they still suffer from substantial shortcomings in central domains of linguistic knowledge.
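To make the two experiments concrete, here is a minimal sketch of the first one, the syntactic probe: a lightweight classifier trained on frozen PLM representations to decide whether a sentence instantiates the CC. This is an illustration, not the paper's actual setup; the choice of bert-base-uncased, the mean-pooled sentence representation, the logistic-regression probe, and the toy sentences are all assumptions.

# Sketch of a syntactic probe for the comparative correlative (CC).
# Assumptions: bert-base-uncased as the PLM, mean pooling over the final
# hidden layer, and a logistic-regression probe; the labeled sentences
# below are illustrative placeholders, not the paper's data.
import torch
from transformers import AutoTokenizer, AutoModel
from sklearn.linear_model import LogisticRegression

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

sentences = [
    "The more you practice, the better you get.",                # CC
    "The longer the meeting ran, the more restless we became.",  # CC
    "The more expensive option was better.",                     # not CC
    "The better team won the longer match.",                     # not CC
]
labels = [1, 1, 0, 0]

def embed(sentence):
    """Mean-pool the final hidden layer as a frozen sentence representation."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state  # (1, seq_len, 768)
    return hidden.mean(dim=1).squeeze(0).numpy()

probe = LogisticRegression(max_iter=1000).fit([embed(s) for s in sentences], labels)
print(probe.predict([embed("The harder they pushed, the faster we ran.")]))

The second experiment, the semantic application task, asks whether a model uses the CC's meaning rather than merely recognizing its form. A hedged sketch of one way to test this with a masked language model follows; the prompt and the candidate words are assumptions, not the paper's stimuli.

# Sketch of a semantic application test: after a CC premise, does the
# model prefer the continuation entailed by the construction's meaning?
# The prompt design is an illustrative assumption.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

prompt = ("The more you read, the more you know. "
          "Anna read a lot this year, so now she knows [MASK].")
inputs = tokenizer(prompt, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero()[0, 1]
for word in ["more", "less"]:
    token_id = tokenizer.convert_tokens_to_ids(word)
    # A CC-competent model should score 'more' higher than 'less' here.
    print(word, logits[0, mask_pos, token_id].item())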

Bibliographic Details
Main Authors: Leonie Weissweiler, Valentin Hofmann, Abdullatif Köksal, Hinrich Schütze
Affiliations: Center for Information and Language Processing, LMU Munich, Munich, Germany; Munich Center for Machine Learning, Munich, Germany; Faculty of Linguistics, University of Oxford, Oxford, United Kingdom
Format: Article
Language: English
Published: Frontiers Media S.A., 2023-10-01
Series: Frontiers in Artificial Intelligence
ISSN: 2624-8212
DOI: 10.3389/frai.2023.1225791
Subjects: NLP; probing; construction grammar; computational linguistics; large language models
Online Access: https://www.frontiersin.org/articles/10.3389/frai.2023.1225791/full