Localization for Autonomous Navigation Systems in Large-Scale Outdoor Environments.
Record type: Bibliographic, electronic resource : Monograph/item
Title/Author: Localization for Autonomous Navigation Systems in Large-Scale Outdoor Environments.
Author: Yu, Yang.
Publisher: Ann Arbor : ProQuest Dissertations & Theses, 2021
Description: 116 p.
Notes: Source: Dissertations Abstracts International, Volume: 83-11, Section: B.
Contained by: Dissertations Abstracts International, 83-11B.
Subject: Calibration.
Electronic resource: http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=29221682
ISBN: 9798438718482
Yu, Yang.
Localization for Autonomous Navigation Systems in Large-Scale Outdoor Environments. - Ann Arbor : ProQuest Dissertations & Theses, 2021 - 116 p.
Source: Dissertations Abstracts International, Volume: 83-11, Section: B.
Thesis (Ph.D.)--Hong Kong University of Science and Technology (Hong Kong), 2021.
This item must not be sold to any third party vendors.
Accurate localization is one of the most fundamental abilities of fully autonomous robots, including unmanned aerial vehicles (UAVs) [90, 46, 22], unmanned ground vehicles (UGVs) [121, 116, 98], and unmanned surface vehicles (USVs) [122, 109, 10]. Although numerous approaches achieve attractive performance on Simultaneous Localization and Mapping (SLAM) tasks in static and indoor scenarios, performing these tasks robustly in large-scale and appearance-changing environments remains challenging. For example, practical applications of mobile robots usually suffer from ineffective observations due to appearance changes, sensor limitations, insufficient computational resources in large-scale environments, and accumulated drift after long-term operation. To overcome these challenges, multi-sensor systems with sensor fusion can provide denser, higher-frequency, and higher-dimensional measurements. Cameras, light detection and ranging (LiDAR) sensors, inertial measurement units (IMUs), and wheel encoders are common sensors for autonomous systems, especially for UGVs.
In this thesis, I propose sensor-fusion-based state estimators and localization systems for large-scale outdoor environments. The thesis is divided into visual-based and LiDAR-based chapters. In the visual-based localization chapter, an omnidirectional visual-inertial state estimator is first proposed. It uses panoramic images and inertial measurements to achieve not only robust robot pose estimation but also online calibration of the multiple sensors and online estimation of robot velocity and sensor biases. Then, I propose a complete visual localization system for a large-scale outdoor port scene. It combines learning-based semantic segmentation results with a prior map to achieve robust, high-accuracy localization. I also use the proposed visual state estimator to compensate for wheel-odometry errors. In the multi-LiDAR localization chapter, I start with an automatic multi-LiDAR calibration method. Motion-based and appearance-based calibration techniques are combined to calibrate the sensors without any extra sensors, calibration targets, or prior knowledge of the surroundings. Based on that, I introduce a LiDAR-based localization approach that requires no pre-built 3D map and is also suitable for challenging port scenes. Within this localization approach, I also introduce LiDAR-wheel-encoder odometry with a four-wheel-steering model. Through a series of experiments in both simulated and real-world large-scale challenging environments, I show that the proposed approaches achieve robust and accurate performance in indoor and outdoor mobile robot localization scenarios.
ISBN: 9798438718482
Subjects--Topical Terms: Calibration.
MARC record:
LDR  03910nmm a2200385 4500
001  2347719
005  20220823142341.5
008  241004s2021 ||||||||||||||||| ||eng d
020    $a 9798438718482
035    $a (MiAaPQ)AAI29221682
035    $a (MiAaPQ)HongKongSciTech_991013039228303412
035    $a AAI29221682
040    $a MiAaPQ $c MiAaPQ
100 1  $a Yu, Yang. $3 1672361
245 10 $a Localization for Autonomous Navigation Systems in Large-Scale Outdoor Environments.
260 1  $a Ann Arbor : $b ProQuest Dissertations & Theses, $c 2021
300    $a 116 p.
500    $a Source: Dissertations Abstracts International, Volume: 83-11, Section: B.
500    $a Advisor: Liu, Ming.
502    $a Thesis (Ph.D.)--Hong Kong University of Science and Technology (Hong Kong), 2021.
506    $a This item must not be sold to any third party vendors.
520    $a Accurate localization is one of the most fundamental abilities of fully autonomous robots, including unmanned aerial vehicles (UAVs) [90, 46, 22], unmanned ground vehicles (UGVs) [121, 116, 98], and unmanned surface vehicles (USVs) [122, 109, 10]. Although numerous approaches achieve attractive performance on Simultaneous Localization and Mapping (SLAM) tasks in static and indoor scenarios, performing these tasks robustly in large-scale and appearance-changing environments remains challenging. For example, practical applications of mobile robots usually suffer from ineffective observations due to appearance changes, sensor limitations, insufficient computational resources in large-scale environments, and accumulated drift after long-term operation. To overcome these challenges, multi-sensor systems with sensor fusion can provide denser, higher-frequency, and higher-dimensional measurements. Cameras, light detection and ranging (LiDAR) sensors, inertial measurement units (IMUs), and wheel encoders are common sensors for autonomous systems, especially for UGVs. In this thesis, I propose sensor-fusion-based state estimators and localization systems for large-scale outdoor environments. The thesis is divided into visual-based and LiDAR-based chapters. In the visual-based localization chapter, an omnidirectional visual-inertial state estimator is first proposed. It uses panoramic images and inertial measurements to achieve not only robust robot pose estimation but also online calibration of the multiple sensors and online estimation of robot velocity and sensor biases. Then, I propose a complete visual localization system for a large-scale outdoor port scene. It combines learning-based semantic segmentation results with a prior map to achieve robust, high-accuracy localization. I also use the proposed visual state estimator to compensate for wheel-odometry errors. In the multi-LiDAR localization chapter, I start with an automatic multi-LiDAR calibration method. Motion-based and appearance-based calibration techniques are combined to calibrate the sensors without any extra sensors, calibration targets, or prior knowledge of the surroundings. Based on that, I introduce a LiDAR-based localization approach that requires no pre-built 3D map and is also suitable for challenging port scenes. Within this localization approach, I also introduce LiDAR-wheel-encoder odometry with a four-wheel-steering model. Through a series of experiments in both simulated and real-world large-scale challenging environments, I show that the proposed approaches achieve robust and accurate performance in indoor and outdoor mobile robot localization scenarios.
590    $a School code: 1223.
650  4 $a Calibration. $3 2068745
650  4 $a Optimization. $3 891104
650  4 $a Signal processing. $3 533904
650  4 $a Robots. $3 529507
650  4 $a Unmanned aerial vehicles. $3 3560267
650  4 $a Navigation systems. $3 3686148
650  4 $a Robotics. $3 519753
650  4 $a Vehicles. $3 2145288
650  4 $a Construction. $3 3561054
650  4 $a Cameras. $3 524039
650  4 $a Sensors. $3 3549539
650  4 $a Maps. $3 544078
650  4 $a Mapping. $3 3355992
650  4 $a Outdoors. $3 3564312
650  4 $a Algorithms. $3 536374
650  4 $a Semantics. $3 520060
650  4 $a Clouds. $3 561052
650  4 $a Aerospace engineering. $3 1002622
650  4 $a Civil engineering. $3 860360
650  4 $a Computer science. $3 523869
650  4 $a Electrical engineering. $3 649834
650  4 $a Engineering. $3 586835
650  4 $a Linguistics. $3 524476
650  4 $a Transportation. $3 555912
690    $a 0771
690    $a 0538
690    $a 0543
690    $a 0984
690    $a 0544
690    $a 0537
690    $a 0290
690    $a 0709
710 2  $a Hong Kong University of Science and Technology (Hong Kong). $3 1022235
773 0  $t Dissertations Abstracts International $g 83-11B.
790    $a 1223
791    $a Ph.D.
792    $a 2021
793    $a English
856 40 $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=29221682
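The MARC fields above can also be read programmatically. The following is a minimal sketch, not part of the catalogue itself, using the pymarc library and assuming the record has been exported from the system as a binary MARC (ISO 2709) file; the file name pqdd_29221682.mrc is hypothetical. It extracts the title (245), author (100), ISBN (020), topical subjects (650), and electronic-resource URL (856) shown in the record.

from pymarc import MARCReader

# Hypothetical export of the record above in binary MARC (ISO 2709) form.
with open("pqdd_29221682.mrc", "rb") as fh:
    for record in MARCReader(fh):
        # 245 $a -- title proper
        title = record.get_fields("245")[0].get_subfields("a")[0]
        # 100 $a -- personal-name main entry (author)
        author = record.get_fields("100")[0].get_subfields("a")[0]
        # 020 $a -- ISBN
        isbn = record.get_fields("020")[0].get_subfields("a")[0]
        # 650 $a -- topical subject headings
        subjects = [f.get_subfields("a")[0] for f in record.get_fields("650")]
        # 856 $u -- electronic resource URL
        url = record.get_fields("856")[0].get_subfields("u")[0]

        print(title, author, isbn, url, sep="\n")
        print("; ".join(subjects))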
Holdings
Barcode: W9470157
Location: Electronic resources
Circulation category: 11.線上閱覽_V (online viewing)
Material type: E-book
Call number: EB
Usage type: Normal (一般使用)
Loan status: On shelf
Holds: 0