Monte Carlo Tree Methods for Nonlinear Optimization.
Record Type:
Electronic resources : Monograph/item
Title/Author:
Monte Carlo Tree Methods for Nonlinear Optimization / Zhai, Yaoguang.
Author:
Zhai, Yaoguang.
Published:
Ann Arbor : ProQuest Dissertations & Theses, 2024.
Description:
176 p.
Notes:
Source: Dissertations Abstracts International, Volume: 85-10, Section: B.
Contained By:
Dissertations Abstracts International, 85-10B.
Subject:
Computer science.
Online resource:
https://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=30994656
ISBN:
9798382303819
Monte Carlo Tree Methods for Nonlinear Optimization.
Zhai, Yaoguang.
Monte Carlo Tree Methods for Nonlinear Optimization.
- Ann Arbor : ProQuest Dissertations & Theses, 2024 - 176 p.
Source: Dissertations Abstracts International, Volume: 85-10, Section: B.
Thesis (Ph.D.)--University of California, San Diego, 2024.
As highly nonlinear continuous functions become the prevalent model of computation, NP-hard optimization problems over the continuous domain pose significant challenges to AI/ML algorithms and systems, especially in terms of their robustness and safety. The key to nonlinear optimization is to efficiently search through input regions with potentially widely varying numerical properties to achieve low-regret descent and fast progress toward the optima. Monte Carlo Tree Search (MCTS) methods have recently been introduced to improve global optimization by computing better partitioning of the search space that balances exploration and exploitation.
This dissertation investigates the application of Monte Carlo tree methods for nonlinear optimization (encompassing black-box and non-convex optimization) to identify the global optimum, and the crafting of training datasets designed to boost transferability and reduce dataset size in computational molecular dynamics. To tackle global optimization challenges, the study integrates sampling strategies with MCTS frameworks, employing diverse local optimization techniques to highlight promising samples. These techniques span stochastic search, Gaussian process regression, numerical overapproximation of the objective function, and analysis of first- and second-order information. In the realm of training dataset development, computational simulations of water serve as a practical case study. An active learning framework is introduced to efficiently condense the size of the training dataset while preserving its quality and comprehensiveness. The research further explores model transferability by assessing various subsets for training set inclusion in the simulation of water molecules, thereby uncovering the model's adaptability challenges across different scenarios.
The findings affirm that Monte Carlo tree methods provide cost-effective strategies for managing the complexities inherent in state space exploration. By applying these methods to a range of application areas, the dissertation underscores the robustness and utility of sampling techniques in advancing machine learning research.
ISBN: 9798382303819
Subjects--Topical Terms:
Computer science.
Subjects--Index Terms:
Global optimization
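
The abstract above describes MCTS-style partitioning of a continuous search space that balances exploration and exploitation. As an illustrative aid only, the following is a minimal Python sketch of that generic idea: a UCT-style selection rule over a recursively bisected one-dimensional interval. It is not the dissertation's algorithm, and every name in it (Node, uct_score, select, expand_and_sample, backup, mcts_minimize) is hypothetical; the 1-D bisection is a deliberate simplification.

import math
import random

class Node:
    # One node of the partition tree: a 1-D interval with visit/score statistics.
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
        self.visits = 0
        self.best = float("inf")   # lowest objective value seen in this interval
        self.children = []

def uct_score(child, parent_visits, c=1.4):
    # Exploitation term (negated best value, since we minimize) plus the usual
    # sqrt(ln N / n) exploration bonus; unvisited children are tried first.
    if child.visits == 0:
        return float("inf")
    return -child.best + c * math.sqrt(math.log(parent_visits) / child.visits)

def select(root):
    # Descend the tree by UCT score, recording the path for the backup step.
    path = [root]
    node = root
    while node.children:
        node = max(node.children, key=lambda ch: uct_score(ch, node.visits))
        path.append(node)
    return path

def expand_and_sample(leaf, f, n_samples=8):
    # Bisect the leaf interval and evaluate a few random samples in each half.
    mid = (leaf.lo + leaf.hi) / 2.0
    leaf.children = [Node(leaf.lo, mid), Node(mid, leaf.hi)]
    values = []
    for child in leaf.children:
        for _ in range(n_samples):
            y = f(random.uniform(child.lo, child.hi))
            child.visits += 1
            child.best = min(child.best, y)
            values.append(y)
    return values

def backup(path, values):
    # Propagate sample counts and the best value seen back up to the root.
    for node in path:
        node.visits += len(values)
        node.best = min(node.best, min(values))

def mcts_minimize(f, lo, hi, iters=50):
    root = Node(lo, hi)
    for _ in range(iters):
        path = select(root)
        values = expand_and_sample(path[-1], f)
        backup(path, values)
    return root.best

# Example: a simple nonconvex test function on [-10, 10].
print(mcts_minimize(lambda x: math.sin(3.0 * x) + 0.1 * x * x, -10.0, 10.0))

In this sketch the exploration constant c and the per-leaf sample budget play the role the abstract assigns to sampling strategies and local optimization techniques for highlighting promising regions.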
Monte Carlo Tree Methods for Nonlinear Optimization.
LDR
:03319nmm a2200385 4500
001
2398167
005
20240812064355.5
006
m o d
007
cr#unu||||||||
008
251215s2024 ||||||||||||||||| ||eng d
020
$a
9798382303819
035
$a
(MiAaPQ)AAI30994656
035
$a
AAI30994656
040
$a
MiAaPQ
$c
MiAaPQ
100
1
$a
Zhai, Yaoguang.
$3
3768076
245
1 0
$a
Monte Carlo Tree Methods for Nonlinear Optimization.
260
1
$a
Ann Arbor :
$b
ProQuest Dissertations & Theses,
$c
2024
300
$a
176 p.
500
$a
Source: Dissertations Abstracts International, Volume: 85-10, Section: B.
500
$a
Advisor: Gao, Sicun; Paesani, Francesco.
502
$a
Thesis (Ph.D.)--University of California, San Diego, 2024.
520
$a
As highly nonlinear continuous functions become the prevalent model of computation, NP-hard optimization problems over the continuous domain pose significant challenges to AI/ML algorithms and systems, especially in terms of their robustness and safety. The key to nonlinear optimization is to efficiently search through input regions with potentially widely varying numerical properties to achieve low-regret descent and fast progress toward the optima. Monte Carlo Tree Search (MCTS) methods have recently been introduced to improve global optimization by computing better partitioning of the search space that balances exploration and exploitation. This dissertation investigates the application of Monte Carlo tree methods for nonlinear optimization (encompassing black-box and non-convex optimization) to identify the global optimum, and the crafting of training datasets designed to boost transferability and reduce dataset size in computational molecular dynamics. To tackle global optimization challenges, the study integrates sampling strategies with MCTS frameworks, employing diverse local optimization techniques to highlight promising samples. These techniques span stochastic search, Gaussian process regression, numerical overapproximation of the objective function, and analysis of first- and second-order information. In the realm of training dataset development, computational simulations of water serve as a practical case study. An active learning framework is introduced to efficiently condense the size of the training dataset while preserving its quality and comprehensiveness. The research further explores model transferability by assessing various subsets for training set inclusion in the simulation of water molecules, thereby uncovering the model's adaptability challenges across different scenarios. The findings affirm that Monte Carlo tree methods provide cost-effective strategies for managing the complexities inherent in state space exploration. By applying these methods to a range of application areas, the dissertation underscores the robustness and utility of sampling techniques in advancing machine learning research.
590
$a
School code: 0033.
650
4
$a
Computer science.
$3
523869
650
4
$a
Computational chemistry.
$3
3350019
650
4
$a
Physical chemistry.
$3
1981412
653
$a
Global optimization
653
$a
Machine learning
653
$a
Monte Carlo Tree Search
653
$a
Computational simulations
690
$a
0984
690
$a
0219
690
$a
0800
690
$a
0494
710
2
$a
University of California, San Diego.
$b
Computer Science and Engineering.
$3
1018473
773
0
$t
Dissertations Abstracts International
$g
85-10B.
790
$a
0033
791
$a
Ph.D.
792
$a
2024
793
$a
English
856
4 0
$u
https://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=30994656
Items (1 record):
Inventory Number: W9506487
Location Name: Electronic resources (電子資源)
Item Class: 11.線上閱覽_V (online reading)
Material type: E-book (電子書)
Call number: EB
Usage Class: Normal (一般使用)
Loan Status: On shelf
No. of reservations: 0