Generative Models, Attention Mechanisms, and Adaptive Methods for Robot Navigation in Complex Environments—A Survey

Bibliographic Details
Main Authors: Harinath Sridharan, Nambala Ramsai, Bhaskar Vundurthy
Format: Article
Language: English
Published: IEEE 2025-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/11084772/
Description
Summary: Autonomous mobile robots, equipped with multiple sensors, have traditionally been used to perform search, rescue, and similar tasks. In the last decade, a number of new application scenarios have emerged, including automated logistics handling in warehouse-like environments (in the context of e-commerce) and robot-assisted personal care. These scenarios demand highly accurate object recognition and semantic knowledge. Safety requirements have also become more stringent as robots interact more closely with humans and attempt to recognize their gestures and movements. Moreover, mobile robots increasingly operate in pedestrian-rich zones such as shopping malls. Navigation strategies for mobile robots therefore need to be revisited, as classical approaches are typically inadequate in these new settings. This survey studies the role of contemporary artificial intelligence approaches in enabling successful robotic navigation in a variety of complex environments. In particular, we discuss how generative models, attention mechanisms, and adaptive methods have helped mobile robots navigate cluttered, uneven, and even unknown indoor and outdoor environments. We also point to several interesting directions for future work.
ISSN: 2169-3536