When are local queries useful for robust learning?

Gourdeau et al. (2019) showed that distributional assumptions are necessary for the robust learnability of concept classes under the exact-in-the-ball robust risk when the learner only has access to random examples. In this paper, we study learning models where the learner is given more power through the use of local queries, and give the first distribution-free algorithms that perform robust empirical risk minimization (ERM) for this notion of robustness. The first learning model we consider uses local membership queries (LMQ), where the learner can query the label of points near the training sample. We show that, under the uniform distribution, LMQs do not increase the robustness threshold of conjunctions, nor of any superclass such as decision lists and halfspaces. Faced with this negative result, we introduce the local equivalence query (LEQ) oracle, which returns whether the hypothesis and target concept agree in the perturbation region around a point in the training sample, along with a counterexample if one exists. We show a separation result: on the one hand, if the query radius λ is strictly smaller than the adversary's perturbation budget ρ, then distribution-free robust learning is impossible for a wide variety of concept classes; on the other hand, the setting λ = ρ allows us to develop robust ERM algorithms. We then bound the query complexity of these algorithms using online learning guarantees, and further improve these bounds for the special case of conjunctions. We conclude by giving robust learning algorithms for halfspaces with margin on both {0,1}^n and ℝ^n.
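As an illustrative sketch (not code from the paper): the LEQ oracle described in the abstract can be realized, for boolean concepts over {0,1}^n with Hamming-ball perturbations, by brute-force enumeration of the ball around a sample point. The function names and the exhaustive-search strategy here are assumptions for illustration only; the paper's query complexity bounds do not depend on such an implementation.

```python
from itertools import combinations

def hamming_ball(x, radius):
    """Yield every point within Hamming distance `radius` of bit-vector x."""
    n = len(x)
    for r in range(radius + 1):
        for idx in combinations(range(n), r):
            z = list(x)
            for i in idx:
                z[i] = 1 - z[i]  # flip the chosen coordinates
            yield tuple(z)

def local_equivalence_query(hypothesis, target, x, radius):
    """LEQ oracle: check whether `hypothesis` and `target` agree on the
    Hamming ball of the given radius around x. Returns (True, None) if
    they agree everywhere in the ball, else (False, counterexample)."""
    for z in hamming_ball(x, radius):
        if hypothesis(z) != target(z):
            return False, z
    return True, None
```

For example, with target c(x) = x0 ∧ x1 and hypothesis h(x) = x0, querying around the point (1, 1, 0) with radius 1 returns the counterexample (1, 0, 0), where the two functions disagree.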


Bibliographic Details
Main authors: Gourdeau, P, Kanade, V, Kwiatkowska, M, Worrell, J
Type: Conference item
Language: English
Publication: Curran Associates, 2023
id oxford-uuid:ec8d88fc-dd57-4a0b-8b13-f7f60cee6ce0
institution University of Oxford