Algebraic and Geometric Structure in Machine Learning and Optimization Algorithms.
Record Type:
Electronic resources : Monograph/item
Title/Author:
Algebraic and Geometric Structure in Machine Learning and Optimization Algorithms.
Author:
Charles, Zachary.
Published:
Ann Arbor : ProQuest Dissertations & Theses, 2017.
Description:
243 p.
Notes:
Source: Dissertations Abstracts International, Volume: 79-07, Section: B.
Contained By:
Dissertations Abstracts International, 79-07B.
Subject:
Applied Mathematics.
Online resource:
https://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=10687436
ISBN:
9780355523447
Thesis (Ph.D.)--The University of Wisconsin - Madison, 2017.
This item must not be sold to any third party vendors.
As the scope and importance of machine learning applications widen, it becomes increasingly important to design machine learning and optimization methods that efficiently and provably obtain desired outputs. We may wish to guarantee that an optimization algorithm quickly converges to a global minimum, or that a machine learning model generalizes well to unseen data. We would like better ways to design and analyze such algorithms so that we can make such guarantees. In this thesis, we take an algebraic and geometric approach to understanding machine learning and optimization algorithms. Many optimization problems, whether implicitly or explicitly, contain large amounts of algebraic and geometric structure, arising from the feasible region of the problem or from the optimization algorithm being analyzed. A similar phenomenon occurs in machine learning, where there is geometric structure both in the loss function used to evaluate a model and in the method used to train it. As we develop more complicated models and methods (such as deep neural networks), this structure becomes increasingly useful for understanding important properties of our machine learning and optimization algorithms. We show that various problems in both areas have algebraic or geometric structure, and that we can use this structure to design more accurate and efficient algorithms. We apply this approach in four primary areas. First, we show that by exploiting underlying algebraic structure, we can design improved optimization methods for a control-theoretic problem. Second, we use the geometry of algebraic subspaces to address structured clustering problems, even in the presence of missing data. Third, we show that if certain geometric properties hold, we can directly bound the generalization error of models learned by machine learning algorithms. Finally, we show that we can use tools from linear algebra and random matrix theory to design more robust distributed optimization algorithms.
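The final contribution above concerns robustness in distributed optimization. As an illustrative sketch only — not the thesis's own construction, which draws on linear algebra and random matrix theory — coordinate-wise median aggregation is one standard way a parameter server can tolerate a corrupted or straggling worker where plain gradient averaging fails. The names `aggregate`, `honest`, and `corrupt` below are hypothetical, chosen for the example:

```python
# Sketch: robust gradient aggregation in distributed optimization.
# Several workers send gradient estimates; one is corrupted.
# A coordinate-wise median tolerates a minority of bad workers,
# while the plain mean is dragged arbitrarily far by one outlier.

from statistics import mean, median

def aggregate(grads, how):
    # grads: list of per-worker gradient vectors (lists of floats)
    agg = median if how == "median" else mean
    # zip(*grads) groups the i-th coordinate across all workers
    return [agg(coords) for coords in zip(*grads)]

true_grad = [1.0, -2.0, 0.5]
# Three honest workers with small noise around the true gradient
honest = [[g + eps for g in true_grad] for eps in (-0.1, 0.0, 0.1)]
# One adversarial worker sending a huge bogus vector
corrupt = [[1e6, 1e6, 1e6]]

print(aggregate(honest + corrupt, "mean"))    # blown up by the outlier
print(aggregate(honest + corrupt, "median"))  # stays near true_grad
```

The design point: the median's breakdown resistance comes at the cost of a small bias under noise-free averaging, which is the kind of accuracy/robustness trade-off that more refined schemes try to improve on.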
ISBN: 9780355523447
Subjects--Topical Terms: Applied Mathematics.
Subjects--Index Terms: Algebra
LDR  03332nmm a2200409 4500
001  2280550
005  20210907071054.5
008  220723s2017 ||||||||||||||||| ||eng d
020    $a 9780355523447
035    $a (MiAaPQ)AAI10687436
035    $a (MiAaPQ)wisc:14989
035    $a AAI10687436
040    $a MiAaPQ $c MiAaPQ
100 1  $a Charles, Zachary. $3 3559092
245 10 $a Algebraic and Geometric Structure in Machine Learning and Optimization Algorithms.
260 1  $a Ann Arbor : $b ProQuest Dissertations & Theses, $c 2017
300    $a 243 p.
500    $a Source: Dissertations Abstracts International, Volume: 79-07, Section: B.
500    $a Publisher info.: Dissertation/Thesis.
500    $a Advisor: Boston, Nigel.
502    $a Thesis (Ph.D.)--The University of Wisconsin - Madison, 2017.
506    $a This item must not be sold to any third party vendors.
520    $a As the scope and importance of machine learning applications widen, it becomes increasingly important to design machine learning and optimization methods that efficiently and provably obtain desired outputs. We may wish to guarantee that an optimization algorithm quickly converges to a global minimum, or that a machine learning model generalizes well to unseen data. We would like better ways to design and analyze such algorithms so that we can make such guarantees. In this thesis, we take an algebraic and geometric approach to understanding machine learning and optimization algorithms. Many optimization problems, whether implicitly or explicitly, contain large amounts of algebraic and geometric structure, arising from the feasible region of the problem or from the optimization algorithm being analyzed. A similar phenomenon occurs in machine learning, where there is geometric structure both in the loss function used to evaluate a model and in the method used to train it. As we develop more complicated models and methods (such as deep neural networks), this structure becomes increasingly useful for understanding important properties of our machine learning and optimization algorithms. We show that various problems in both areas have algebraic or geometric structure, and that we can use this structure to design more accurate and efficient algorithms. We apply this approach in four primary areas. First, we show that by exploiting underlying algebraic structure, we can design improved optimization methods for a control-theoretic problem. Second, we use the geometry of algebraic subspaces to address structured clustering problems, even in the presence of missing data. Third, we show that if certain geometric properties hold, we can directly bound the generalization error of models learned by machine learning algorithms. Finally, we show that we can use tools from linear algebra and random matrix theory to design more robust distributed optimization algorithms.
590    $a School code: 0262.
650  4 $a Applied Mathematics. $3 1669109
650  4 $a Statistics. $3 517247
650  4 $a Computer science. $3 523869
653    $a Algebra
653    $a Algorithms
653    $a Data science
653    $a Geometry
653    $a Machine learning
653    $a Optimization
690    $a 0364
690    $a 0463
690    $a 0984
710 2  $a The University of Wisconsin - Madison. $b Mathematics. $3 2101076
773 0  $t Dissertations Abstracts International $g 79-07B.
790    $a 0262
791    $a Ph.D.
792    $a 2017
793    $a English
856 40 $u https://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=10687436
Holdings (1 record):
Inventory Number: W9432283
Location Name: Electronic resources (電子資源)
Item Class: 11.線上閱覽_V (online reading)
Material Type: E-book (電子書)
Call Number: EB
Usage Class: Normal (一般使用)
Loan Status: On shelf
No. of Reservations: 0