Learning-augmented weighted paging
We consider a natural semi-online model for weighted paging, where at any time the algorithm is given predictions, possibly with errors, about the next arrival of each page. The model is inspired by Belady's classic optimal offline algorithm for unweighted paging, and extends the recently studied model for learning-augmented paging [45, 50, 52] to the weighted setting.
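To make the prediction model concrete, the following is a minimal, illustrative sketch of Belady's classic furthest-in-future eviction rule for unweighted paging, run on predicted next-arrival times rather than the true ones. This is background only and not the paper's algorithm; the weighted-setting algorithms described in the abstract are more involved. The class name, method names, the per-request prediction map, and the tie-breaking are our own illustrative choices.

```python
# Illustrative sketch only: Belady's furthest-in-future rule driven by
# (possibly erroneous) predictions of each page's next arrival, in the
# UNWEIGHTED setting. Not the paper's weighted-paging algorithm.

class PredictiveBeladyCache:
    def __init__(self, capacity):
        self.capacity = capacity   # cache size k
        self.cache = set()         # pages currently held
        self.faults = 0            # number of page faults (the cost, unweighted)

    def request(self, page, predicted_next_arrival):
        """Serve a request for `page`.

        `predicted_next_arrival` maps pages to the predicted time of their next
        request; pages absent from the map are treated as never arriving again.
        With perfect predictions this is exactly Belady's offline rule.
        """
        if page in self.cache:
            return  # hit: no cost
        self.faults += 1
        if len(self.cache) >= self.capacity:
            # Evict the cached page whose predicted next request is furthest away.
            victim = max(
                self.cache,
                key=lambda p: predicted_next_arrival.get(p, float("inf")),
            )
            self.cache.remove(victim)
        self.cache.add(page)


# Tiny usage example: cache of size 2, hand-made predictions per request time.
cache = PredictiveBeladyCache(capacity=2)
requests = ["a", "b", "c", "a", "b"]
predictions = [
    {"a": 3, "b": 1, "c": 2},
    {"a": 3, "b": 4, "c": 2},
    {"a": 3, "b": 4},
    {"b": 4},
    {},
]
for t, page in enumerate(requests):
    cache.request(page, predictions[t])
print(cache.faults)  # 4, which is optimal for this sequence with a cache of size 2
```

With perfect predictions this sketch is simply Belady's optimal offline policy for unweighted paging; the difficulty the paper addresses is the weighted setting, where, as the abstract notes, it was previously unclear how to exploit such predictions at all.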
Main Authors: | Bansal, N, Coester, C, Kumar, R, Purohit, M, Veez, E
---|---|
Format: | Conference item
Language: | English
Published: | Society for Industrial and Applied Mathematics, 2022
_version_ | 1826309845323612160 |
---|---|
author | Bansal, N Coester, C Kumar, R Purohit, M Veez, E |
author_facet | Bansal, N Coester, C Kumar, R Purohit, M Veez, E |
author_sort | Bansal, N |
collection | OXFORD |
description | We consider a natural semi-online model for weighted paging, where at any time the algorithm is given predictions, possibly with errors, about the next arrival of each page. The model is inspired by Belady's classic optimal offline algorithm for unweighted paging, and extends the recently studied model for learning-augmented paging [45, 50, 52] to the weighted setting.
For the case of perfect predictions, we provide an ℓ-competitive deterministic and an O(log ℓ)-competitive randomized algorithm, where ℓ is the number of distinct weight classes. Both these bounds are tight, and imply an O(log W)- and O(log log W)-competitive ratio, respectively, when the page weights lie between 1 and W. Previously, it was not known how to use these predictions in the weighted setting and only bounds of k and O(log k) were known, where k is the cache size. Our results also generalize to the interleaved paging setting and to the case of imperfect predictions, with the competitive ratios degrading smoothly from O(ℓ) and O(log ℓ) to O(k) and O(log k), respectively, as the prediction error increases.
Our results are based on several insights on structural properties of Belady's algorithm and the sequence of page arrival predictions, and novel potential functions that incorporate these predictions. For the case of unweighted paging, the results imply a very simple potential function based proof of the optimality of Belady's algorithm, which may be of independent interest.
|
first_indexed | 2024-03-07T07:41:45Z |
format | Conference item |
id | oxford-uuid:f8f430f4-23fd-49b9-9cf4-1c24230151cf |
institution | University of Oxford |
language | English |
last_indexed | 2024-03-07T07:41:45Z |
publishDate | 2022 |
publisher | Society for Industrial and Applied Mathematics |
record_format | dspace |
spelling | oxford-uuid:f8f430f4-23fd-49b9-9cf4-1c24230151cf 2023-04-20T12:43:42Z Learning-augmented weighted paging Conference item http://purl.org/coar/resource_type/c_5794 uuid:f8f430f4-23fd-49b9-9cf4-1c24230151cf English Symplectic Elements Society for Industrial and Applied Mathematics 2022 Bansal, N Coester, C Kumar, R Purohit, M Veez, E We consider a natural semi-online model for weighted paging, where at any time the algorithm is given predictions, possibly with errors, about the next arrival of each page. The model is inspired by Belady's classic optimal offline algorithm for unweighted paging, and extends the recently studied model for learning-augmented paging [45, 50, 52] to the weighted setting. For the case of perfect predictions, we provide an ℓ-competitive deterministic and an O(log ℓ)-competitive randomized algorithm, where ℓ is the number of distinct weight classes. Both these bounds are tight, and imply an O(log W)- and O(log log W)-competitive ratio, respectively, when the page weights lie between 1 and W. Previously, it was not known how to use these predictions in the weighted setting and only bounds of k and O(log k) were known, where k is the cache size. Our results also generalize to the interleaved paging setting and to the case of imperfect predictions, with the competitive ratios degrading smoothly from O(ℓ) and O(log ℓ) to O(k) and O(log k), respectively, as the prediction error increases. Our results are based on several insights on structural properties of Belady's algorithm and the sequence of page arrival predictions, and novel potential functions that incorporate these predictions. For the case of unweighted paging, the results imply a very simple potential function based proof of the optimality of Belady's algorithm, which may be of independent interest. |
spellingShingle | Bansal, N Coester, C Kumar, R Purohit, M Veez, E Learning-augmented weighted paging |
title | Learning-augmented weighted paging |
title_full | Learning-augmented weighted paging |
title_fullStr | Learning-augmented weighted paging |
title_full_unstemmed | Learning-augmented weighted paging |
title_short | Learning-augmented weighted paging |
title_sort | learning augmented weighted paging |
work_keys_str_mv | AT bansaln learningaugmentedweightedpaging AT coesterc learningaugmentedweightedpaging AT kumarr learningaugmentedweightedpaging AT purohitm learningaugmentedweightedpaging AT veeze learningaugmentedweightedpaging |