Sequential Training of Neural Networks With Gradient Boosting
This paper presents a novel technique based on gradient boosting to train the final layers of a neural network (NN). Gradient boosting is an additive expansion algorithm in which a series of models are trained sequentially to approximate a given function. A neural network can also be seen as an addi...
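The abstract's description of gradient boosting as an additive expansion can be illustrated with a short sketch. The snippet below is a generic gradient-boosting loop for squared-error loss, using scikit-learn decision-tree weak learners and synthetic data chosen purely for illustration; it shows a series of models trained sequentially on the residuals of the running ensemble, and is not the paper's specific procedure for training the final layers of a neural network.

```python
# Minimal sketch of gradient boosting as an additive expansion:
# each stage fits a new model to the negative gradient of the loss
# (the residuals, for squared error) of the ensemble built so far.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))            # toy inputs (assumed for illustration)
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)

n_stages = 50          # number of sequentially trained models
learning_rate = 0.1    # shrinkage applied to each additive term

# Stage 0: initialise the expansion with a constant model (the mean).
prediction = np.full_like(y, y.mean())
models = []

for _ in range(n_stages):
    residuals = y - prediction                   # negative gradient of squared loss
    tree = DecisionTreeRegressor(max_depth=2)    # weak learner for this stage
    tree.fit(X, residuals)
    models.append(tree)
    # Add the new model's contribution to the running additive expansion.
    prediction += learning_rate * tree.predict(X)

print("training MSE:", np.mean((y - prediction) ** 2))
```

In the paper's setting, the sequentially trained components are final layers of the network rather than trees, but the additive structure described in the abstract is the same.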
| Main Authors: | Seyedsaman Emami, Gonzalo Martinez-Munoz |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2023-01-01 |
| Series: | IEEE Access |
| Subjects: | |
| Online Access: | https://ieeexplore.ieee.org/document/10110967/ |
Similar Items

- A Gradient Boosting Approach for Training Convolutional and Deep Neural Networks
  by: Seyedsaman Emami, et al.
  Published: (2023-01-01)
- Deep Learning for Multi-Output Regression Using Gradient Boosting
  by: Seyedsaman Emami, et al.
  Published: (2024-01-01)
- XBNet: An extremely boosted neural network
  by: Tushar Sarkar
  Published: (2022-09-01)
- Taxonomic identification of hoverfly specimens using neural network and gradient boosting machine techniques
  by: Dunja Popovic, et al.
  Published: (2020-09-01)
- A boosting ensemble learning based hybrid light gradient boosting machine and extreme gradient boosting model for predicting house prices
  by: Racheal Sibindi, et al.
  Published: (2023-04-01)