Memorizing without overfitting: Bias, variance, and interpolation in overparameterized models

The bias-variance trade-off is a central concept in supervised learning. In classical statistics, increasing the complexity of a model (e.g., the number of parameters) reduces bias but also increases variance. Until recently, it was commonly believed that optimal performance is achieved at intermediate...


Bibliographic Details
Main Authors: Jason W. Rocks, Pankaj Mehta
Format: Article
Language: English
Published: American Physical Society 2022-03-01
Series: Physical Review Research
Online Access: http://doi.org/10.1103/PhysRevResearch.4.013201