Greedy layerwise training of convolutional neural networks

This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.

Bibliographic Details
Main Author: Trinh, Loc Quang
Other Authors: Aleksander Mądry
Format: Thesis
Language: eng
Published: Massachusetts Institute of Technology, 2019
Subjects: Electrical Engineering and Computer Science
Online Access: https://hdl.handle.net/1721.1/123128
Abstract

Thesis: M. Eng., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2019. Cataloged from the student-submitted PDF version of the thesis. Includes bibliographical references (pages 61-63). 63 pages.

Layerwise training presents an alternative to end-to-end backpropagation for training deep convolutional neural networks. Although previous work failed to demonstrate the viability of layerwise training, especially on large-scale datasets such as ImageNet, recent work has shown that layerwise training of specific architectures can be highly competitive: on ImageNet, layerwise-trained networks perform comparably to many state-of-the-art end-to-end trained networks. In this thesis, we compare the performance gap between the two training procedures across a wide range of network architectures and further analyze the possible limitations of layerwise training. Our results show that layerwise training quickly saturates after a certain critical layer, due to the overfitting of early layers within the networks. We discuss several approaches we took to address this issue and to help layerwise training improve across multiple architectures.

From a fundamental standpoint, this study emphasizes the need to open the black box that is the modern deep neural network and to investigate the interactions between intermediate hidden layers, all through the lens of layerwise training.

by Loc Quang Trinh.

MIT theses are protected by copyright. They may be viewed, downloaded, or printed from this source, but further reproduction or distribution in any format is prohibited without written permission. (http://dspace.mit.edu/handle/1721.1/7582)
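The greedy layerwise procedure the abstract refers to can be sketched as follows: each new layer is trained together with its own auxiliary classifier while all earlier layers stay frozen, then the auxiliary head is discarded and the next layer is stacked on top. The snippet below is a minimal toy illustration under assumed settings (a tiny synthetic dataset, fully connected ReLU layers, and full-batch gradient descent), not the thesis's code, which studies convolutional networks on datasets such as ImageNet.

```python
# Hypothetical minimal sketch of greedy layerwise training.
# Each hidden layer is trained with a throwaway auxiliary linear
# classifier while all previously trained layers remain frozen.
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def train_layer(h, y, width, n_classes, steps=300, lr=0.5):
    """Train one hidden layer plus its auxiliary classifier on frozen input h."""
    n, d = h.shape
    W = rng.normal(0.0, np.sqrt(2.0 / d), (d, width))            # hidden weights
    V = rng.normal(0.0, np.sqrt(1.0 / width), (width, n_classes))  # aux head
    onehot = np.eye(n_classes)[y]
    for _ in range(steps):
        a = relu(h @ W)                  # hidden activation
        p = softmax(a @ V)               # aux classifier prediction
        g = (p - onehot) / n             # softmax cross-entropy gradient
        dV = a.T @ g                     # gradient w.r.t. aux head
        da = g @ V.T
        da[a <= 0] = 0.0                 # ReLU gradient
        dW = h.T @ da                    # gradient w.r.t. hidden weights
        W -= lr * dW
        V -= lr * dV
    return W, V

# Toy two-class problem (assumed data, for illustration only).
X = rng.normal(size=(200, 10))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

h = X
for depth in range(3):                   # greedily stack three layers
    W, V = train_layer(h, y, width=16, n_classes=2)
    h = relu(h @ W)                      # freeze the layer, feed forward

# Evaluate with the last layer's auxiliary head.
acc = (softmax(h @ V).argmax(axis=1) == y).mean()
print(f"train accuracy after greedy stacking: {acc:.2f}")
```

Note that each call to `train_layer` only ever sees the frozen representation `h` produced by the layers beneath it; no gradient flows back through earlier layers, which is precisely what distinguishes this procedure from end-to-end backpropagation.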