Progressive skeletonization: trimming more fat from a network at initialization

Recent studies have shown that skeletonization (pruning parameters) of networks at initialization provides all the practical benefits of sparsity both at inference and training time, while only marginally degrading their performance. However, we observe that beyond a certain level of sparsity (appro...
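To make the idea of "pruning parameters at initialization" concrete, here is a minimal, hedged sketch in the spirit of single-shot connection-sensitivity pruning (SNIP-style saliency). It is an illustration only, not the paper's FORCE objective or its iterative schedule; the function name `prune_at_init`, the toy MLP, and the saliency choice are assumptions made for this example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def prune_at_init(model, inputs, targets, sparsity=0.95):
    """Return {param_name: binary mask} keeping the (1 - sparsity) fraction of
    weights with the largest |weight * gradient| saliency, computed once at
    initialization (SNIP-style; not the paper's FORCE method)."""
    model.zero_grad()
    loss = F.cross_entropy(model(inputs), targets)
    loss.backward()

    # Saliency of each weight: |w * dL/dw|, for weight matrices only (skip biases).
    saliencies = {
        name: (p * p.grad).abs().detach()
        for name, p in model.named_parameters()
        if p.grad is not None and p.dim() > 1
    }
    scores = torch.cat([s.flatten() for s in saliencies.values()])
    k = int((1.0 - sparsity) * scores.numel())      # number of weights to keep
    threshold = torch.topk(scores, k).values.min()  # global keep threshold

    return {name: (s >= threshold).float() for name, s in saliencies.items()}

# Toy usage: mask a small MLP at initialization, before any training.
model = nn.Sequential(nn.Flatten(), nn.Linear(784, 300), nn.ReLU(), nn.Linear(300, 10))
x, y = torch.randn(64, 1, 28, 28), torch.randint(0, 10, (64,))
masks = prune_at_init(model, x, y, sparsity=0.95)
```

The masks would then be applied multiplicatively to the corresponding weight tensors during training; the paper's observation is that one-shot schemes of this kind degrade sharply at extreme sparsities, which motivates its progressive (iterative) skeletonization.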


Bibliographic details
Authors: de Jorge, P, Sanyal, A, Behl, HS, Torr, PHS, Rogez, G, Dokania, PK
Format: Conference item
Language: English
Published: OpenReview 2020
