Subgradient methods for convex minimization

Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2002.

Bibliographic Details
Main Author: Nedić, Angelia
Other Authors: Dimitri P. Bertsekas.
Format: Thesis
Language: eng
Published: Massachusetts Institute of Technology 2005
Subjects: Electrical Engineering and Computer Science.
Online Access: http://hdl.handle.net/1721.1/16843

Description: Includes bibliographical references (p. 169-174). This electronic version was submitted by the student author; the certified thesis is available in the Institute Archives and Special Collections.

Abstract: Many optimization problems arising in various applications require minimization of an objective cost function that is convex but not differentiable. Such a minimization arises, for example, in model construction, system identification, neural networks, pattern classification, and various assignment, scheduling, and allocation problems. To solve convex but not differentiable problems, we have to employ special methods that can work in the absence of differentiability, while taking advantage of convexity and possibly other special structures that our minimization problem may possess. In this thesis, we propose and analyze some new methods that can solve convex (not necessarily differentiable) problems. In particular, we consider two classes of methods: incremental and variable metric.

Physical Description: 174 p.

Rights: M.I.T. theses are protected by copyright. They may be viewed from this source for any purpose, but reproduction or distribution in any format is prohibited without written permission. See provided URL for inquiries about permission: http://dspace.mit.edu/handle/1721.1/7582
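
The incremental scheme mentioned in the abstract applies a subgradient step to one component of a sum of convex functions at a time. The sketch below is the editor's minimal illustration of a projected incremental subgradient iteration, not code from the thesis; the component functions f_i(x) = |a_i·x - b_i|, the ball constraint, and the diminishing stepsize rule alpha_k = c/(k+1) are assumptions chosen only for this example.

    # Minimal sketch (illustrative, not from the thesis): incremental subgradient
    # method for minimizing f(x) = sum_i f_i(x) over a convex set X, with
    # f_i(x) = |a_i . x - b_i| (nondifferentiable) and X a Euclidean ball.
    import numpy as np

    def project_onto_ball(x, radius=10.0):
        # Euclidean projection onto the ball {x : ||x|| <= radius}.
        norm = np.linalg.norm(x)
        return x if norm <= radius else (radius / norm) * x

    def incremental_subgradient(A, b, num_passes=200, c=1.0):
        # Cycle through the components f_i(x) = |a_i . x - b_i|, taking one
        # projected subgradient step per component with stepsize c / (k + 1).
        m, n = A.shape
        x = np.zeros(n)
        k = 0
        for _ in range(num_passes):
            for i in range(m):
                residual = A[i] @ x - b[i]
                g = np.sign(residual) * A[i]      # a subgradient of f_i at x
                x = project_onto_ball(x - (c / (k + 1)) * g)
                k += 1
        return x

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        A = rng.standard_normal((50, 3))
        x_true = np.array([1.0, -2.0, 0.5])
        b = A @ x_true                            # consistent data, optimal value 0
        print("estimate:", incremental_subgradient(A, b))  # close to x_true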