Carathéodory sampling for stochastic gradient descent

Many problems require optimizing empirical risk functions over large datasets. Gradient descent methods that calculate the full gradient in every descent step do not scale to such datasets. Various flavours of Stochastic Gradient Descent (SGD) replace the expensive summation that computes the full...
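As a point of reference for the contrast drawn in the abstract, the following is a minimal sketch (not taken from the paper) of replacing the full-gradient summation with a mini-batch estimate, using an illustrative least-squares empirical risk; the function names, batch size, and data are assumptions for the example only.

import numpy as np

def full_gradient(w, X, y):
    # Full gradient of the least-squares empirical risk:
    # sums over all n samples, so each step costs O(n * d).
    return X.T @ (X @ w - y) / len(y)

def sgd_gradient(w, X, y, batch_size, rng):
    # Stochastic estimate: the sum over all n samples is replaced
    # by an average over a small uniformly sampled mini-batch.
    idx = rng.choice(len(y), size=batch_size, replace=False)
    Xb, yb = X[idx], y[idx]
    return Xb.T @ (Xb @ w - yb) / batch_size

rng = np.random.default_rng(0)
n, d = 10_000, 20
X = rng.standard_normal((n, d))
y = X @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n)

w = np.zeros(d)
lr = 0.1
for step in range(500):
    # Each SGD step touches only batch_size samples instead of all n.
    w -= lr * sgd_gradient(w, X, y, batch_size=32, rng=rng)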

Full description

Bibliographic Details
Main Authors: Cosentino, F, Oberhauser, H, Abate, A
Format: Internet publication
Language: English
Published: 2020