Real‐Time Motion Generation for Robot Manipulators in Complex Dynamic Environments

Bibliographic Details
Main Authors: Tianyu Zhang, Hongguang Wang, Peng Lv, Xin'an Pan, Daqian Wang, Bingbing Yuan, Huiyang Yu
Format: Article
Language: English
Published: Wiley 2025-07-01
Series: Advanced Intelligent Systems
Online Access: https://doi.org/10.1002/aisy.202400738
Description
Summary: Motion generation in human–robot shared workspaces often faces challenges such as discontinuous movement and slow responses. To address these challenges, a real‐time motion generation method for robot manipulators in complex dynamic environments is proposed. The method autonomously generates safe trajectories and significantly enhances path smoothness and reaction time. An online minimum distance calculation algorithm between human and robot manipulators is developed via an inertial measurement unit‐based motion capture system. The algorithm employs a unified geometric representation of the “human–robot–environment” model using superquadric surfaces. It introduces a closed‐form Minkowski sum‐based minimum distance method for efficient calculations. Additionally, a collision‐free motion generator is proposed that integrates global planning with local control, where the controller simultaneously considers all minimum distances, enhancing motion continuity and human–robot safety. Experiments on robot‐manipulator‐assisted aviation refueling are conducted. The results demonstrate that the method performs tasks safely and efficiently in complex dynamic environments, with notable improvements in path smoothness and reaction time.
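The abstract's unified "human–robot–environment" model is built from superquadric surfaces. As an illustrative sketch only (not the paper's implementation, and with semi-axes `a` and exponents `e1`, `e2` as assumed parameter names), the standard superquadric inside-outside function that underlies such representations can be evaluated as follows:

```python
def superquadric_F(p, a=(1.0, 1.0, 1.0), e1=1.0, e2=1.0):
    """Inside-outside function of an origin-centred superquadric.

    F(p) < 1: point p lies inside the surface;
    F(p) == 1: p lies on the surface;
    F(p) > 1: p lies outside.
    a = (a1, a2, a3) are semi-axis lengths; e1, e2 are shape
    exponents (e1 = e2 = 1 yields an ordinary ellipsoid).
    """
    x, y, z = p
    # Combine the x/y terms first, then blend with the z term.
    xy = (abs(x / a[0]) ** (2.0 / e2) + abs(y / a[1]) ** (2.0 / e2)) ** (e2 / e1)
    return xy + abs(z / a[2]) ** (2.0 / e1)

# With unit semi-axes and e1 = e2 = 1 the surface is the unit sphere:
print(superquadric_F((2.0, 0.0, 0.0)))  # 4.0  -> outside
print(superquadric_F((0.0, 0.0, 0.5)))  # 0.25 -> inside
```

Because a single scalar function classifies any point against a body, the same representation can describe robot links, human limb segments, and workspace obstacles uniformly, which is what makes the distance queries described in the abstract tractable online.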
ISSN: 2640-4567