Zhenyu Liu, Jing Wang, and Fuli Zhang


Keywords: ORB-SLAM3, visual SLAM, feature extraction, adaptive threshold, illumination


Simultaneous localisation and mapping (SLAM) is crucial for autonomous robots to navigate and interact with their environment. Traditional visual SLAM systems, such as ORB-SLAM3, often struggle with illumination changes because they rely on fixed thresholds for feature extraction. This limitation hampers their adaptability and robustness under diverse lighting conditions, which is critical for real-world applications. To address this challenge, and inspired by the human eye’s adaptive response to varying light intensity, this study proposes an improved ORB-SLAM3. In the proposed algorithm, feature extraction thresholds are dynamically adjusted based on real-time analysis of image brightness, mirroring how the human eye regulates pupil size under varying illumination to maintain visual quality. An adaptive threshold strategy is designed using a linear mapping function, enhancing the extraction of stable and reliable features under both low- and high-light conditions. Experimental results show that our approach outperforms the standard ORB-SLAM3 with a negligible increase in time consumption. This enhancement significantly improves the illumination adaptability and robustness of the SLAM system, providing critical technological support for the effective operation of autonomous robots in practical applications.
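The abstract describes a linear mapping from image brightness to the feature-extraction threshold. A minimal sketch of how such a mapping could look is given below; the threshold range, brightness bounds, and function name are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def adaptive_fast_threshold(image, t_min=7, t_max=25,
                            b_low=40.0, b_high=200.0):
    """Linearly map mean image brightness to a FAST corner threshold.

    All constants here (t_min, t_max, b_low, b_high) are illustrative
    assumptions, not the parameters used in the paper. Darker images
    get a lower threshold (more candidate features survive); brighter
    images get a higher one (suppressing noise responses).
    """
    b = float(np.mean(image))                 # average grey level
    b = min(max(b, b_low), b_high)            # clamp to the mapped range
    frac = (b - b_low) / (b_high - b_low)     # 0 at b_low, 1 at b_high
    return int(round(t_min + frac * (t_max - t_min)))
```

The returned value would replace the fixed FAST threshold ORB-SLAM3 passes to its ORB extractor, so the detector relaxes in dim frames and tightens in bright ones.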
