Wu, Dongjie and Zhong, Xunyu and Peng, Xiafu and Hu, Huosheng and Liu, Qiang (2022) Multimodal Information Fusion for High-Robustness and Low-Drift State Estimation of UGVs in Diverse Scenes. IEEE Transactions on Instrumentation and Measurement, 71. pp. 1-15. DOI https://doi.org/10.1109/tim.2022.3205687
Abstract
Currently, the autonomous positioning of unmanned ground vehicles (UGVs) still suffers from insufficient persistence and poor reliability, especially in challenging scenarios where satellite signals are denied or sensing modalities such as vision or laser are degraded. Based on multimodal information fusion and failure detection (FD), this article proposes a high-robustness and low-drift state estimation system suitable for multiple scenes, which integrates light detection and ranging (LiDAR), inertial measurement units (IMUs), a stereo camera, encoders, and an attitude and heading reference system (AHRS) in a loosely coupled manner. First, a state estimator with a variable fusion mode is designed based on the error-state extended Kalman filter (ES-EKF); it can fuse the encoder-AHRS subsystem (EAS), the visual-inertial subsystem (VIS), and the LiDAR subsystem (LS), and it can change its integration structure online by selecting a fusion mode. Second, to improve the robustness of the whole system in challenging environments, an information manager is created, which judges the health status of each subsystem via degeneration metrics and then selects online the appropriate information sources and variables to enter the estimator according to that status. Finally, the proposed system is extensively evaluated on datasets collected from six typical scenes: street, field, forest, forest-at-night, street-at-night, and tunnel-at-night. The experimental results show that our framework achieves accuracy and robustness better than or comparable to existing publicly available systems.
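The two ideas in the abstract — an ES-EKF-style estimator and an information manager that admits only healthy subsystems — can be illustrated with a minimal sketch. The subsystem names (EAS, VIS, LS) come from the abstract; the threshold values, the `select_fusion_mode` helper, and the simplified measurement update are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

# Illustrative degeneration-metric thresholds per subsystem; the paper's
# actual metrics and thresholds are not given in the abstract.
HEALTH_THRESHOLDS = {"VIS": 0.5, "LS": 0.3, "EAS": 0.8}

def select_fusion_mode(metrics):
    """Mimic the information manager: return the subsystems whose
    degeneration metric indicates they are healthy enough to fuse."""
    return [name for name, m in metrics.items() if m >= HEALTH_THRESHOLDS[name]]

def ekf_update(x, P, z, H, R):
    """Standard Kalman measurement update (the correction step an
    error-state EKF applies to the error state)."""
    S = H @ P @ H.T + R                      # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
    x = x + K @ (z - H @ x)                  # corrected state
    P = (np.eye(len(x)) - K @ H) @ P         # corrected covariance
    return x, P

# Example: LiDAR degraded (e.g., a long featureless tunnel), so only
# VIS and EAS measurements would enter the estimator this cycle.
mode = select_fusion_mode({"VIS": 0.6, "LS": 0.1, "EAS": 0.9})
```

In a loosely coupled design like the one described, each admitted subsystem contributes its own `(z, H, R)` to successive `ekf_update` calls, so dropping an unhealthy subsystem simply skips its update rather than restructuring the filter.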
| Item Type: | Article |
| --- | --- |
| Uncontrolled Keywords: | State estimation; Laser radar; Cameras; Reliability; Kalman filters; Feature extraction; Degradation; Error-state extended Kalman filter (ES-EKF); failure detection (FD) and handling; light detection and ranging (LiDAR)-inertial-visual-encoder odometry; multimodal information fusion |
| Divisions: | Faculty of Science and Health; Faculty of Science and Health > Computer Science and Electronic Engineering, School of |
| SWORD Depositor: | Unnamed user with email elements@essex.ac.uk |
| Depositing User: | Unnamed user with email elements@essex.ac.uk |
| Date Deposited: | 25 Nov 2022 17:19 |
| Last Modified: | 30 Oct 2024 20:50 |
| URI: | http://repository.essex.ac.uk/id/eprint/34099 |
Available files
Filename: Multimodal_Information_Fusion_for_High-Robustness_and_Low-Drift_State_Estimation_of_UGVs_in_Diverse_Scenes.pdf