Prioritizing information during working memory: Beyond sustained internal attention.
Working memory (WM) has limited capacity. This leaves attention with the important role of allowing into storage only the most relevant information. It is increasingly evident that attention is equally crucial for prioritizing representations within WM as the importance of individual items changes. Retrospective prioritization has been proposed to result from a focus of internal attention highlighting one of several representations. Here, we suggest an updated model, in which prioritization acts in multiple steps: first orienting towards and selecting a memory, and then reconfiguring its representational state in the service of upcoming task demands. Reconfiguration sets up an optimized perception-action mapping, obviating the need for sustained attention. This view is consistent with recent literature, makes testable predictions, and links WM with task switching and action preparation.

Main Authors: | Myers, N; Stokes, M; Nobre, A |
---|---|
Format: | Journal article |
Language: | English |
Published: | Elsevier, 2017 |
Institution: | University of Oxford |