DistilNAS: Neural Architecture Search With Distilled Data

Can we perform Neural Architecture Search (NAS) with a smaller subset of the target dataset and still fare better in terms of performance, with a significant reduction in search cost? In this work, we propose a method, called DistilNAS, which utilizes a curriculum learning based approach to distill the tar...

Bibliographic Details
Main Authors: Swaroop N. Prabhakar, Ankur Deshwal, Rahul Mishra, Hyeonsu Kim
Format: Article
Language: English
Published: IEEE 2022-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/9963961/