Multi-Fidelity Neural Architecture Search With Knowledge Distillation
Neural architecture search (NAS) aims to find the optimal architecture of a neural network for a problem or a family of problems. Evaluating neural architectures is very time-consuming. One possible way to mitigate this issue is to use low-fidelity evaluations, namely training on...
Main Authors: Ilya Trofimov, Nikita Klyuchnikov, Mikhail Salnikov, Alexander Filippov, Evgeny Burnaev
Format: Article
Language: English
Published: IEEE, 2023-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/10007805/
Similar Items
- NAS-Bench-NLP: Neural Architecture Search Benchmark for Natural Language Processing
  by: Nikita Klyuchnikov, et al.
  Published: (2022-01-01)
- DistilNAS: Neural Architecture Search With Distilled Data
  by: Swaroop N. Prabhakar, et al.
  Published: (2022-01-01)
- High-dimensional multi-fidelity Bayesian optimization for quantum control
  by: Marjuka F Lazin, et al.
  Published: (2023-01-01)
- A Derivative-Free Line-Search Algorithm for Simulation-Driven Design Optimization Using Multi-Fidelity Computations
  by: Riccardo Pellegrini, et al.
  Published: (2022-02-01)
- A novel heuristic target-dependent neural architecture search method with small samples
  by: Leiyang Fu, et al.
  Published: (2022-11-01)