State of the Field
Recent advances in robot navigation focus on helping robots traverse complex, real-world environments with greater efficiency and adaptability. A notable trend is the integration of vision-language models to improve semantic understanding and navigation planning: systems like SysNav and BEACON leverage these models for robust object navigation and occlusion handling, respectively, while frameworks such as DreamToNav and OpenFrontier explore generative and open-vocabulary approaches to intuitive human-robot interaction. The shift from reactive to map-based strategies, as seen in work like Uni-Walker, emphasizes retaining learned knowledge across tasks and addresses the challenge of catastrophic forgetting. In parallel, datasets such as STONE provide scalable multi-modal training resources for traversability prediction in off-road scenarios. Collectively, these efforts point toward more autonomous, flexible, and efficient robotic navigation systems capable of operating in diverse and unpredictable environments.
Papers
SysNav: Multi-Level Systematic Cooperation Enables Real-World, Cross-Embodiment Object Navigation
Object navigation (ObjectNav) in real-world environments is a complex problem that requires simultaneously addressing multiple challenges, including complex spatial structure, long-horizon planning an...
BEACON: Language-Conditioned Navigation Affordance Prediction under Occlusion
Language-conditioned local navigation requires a robot to infer a nearby traversable target location from its current observation and an open-vocabulary, relational instruction. Existing vision-langua...
From Reactive to Map-Based AI: Tuned Local LLMs for Semantic Zone Inference in Object-Goal Navigation
Object-Goal Navigation (ObjectNav) requires an agent to find and navigate to a target object category in unknown environments. While recent Large Language Model (LLM)-based agents exhibit zero-shot re...
APPLV: Adaptive Planner Parameter Learning from Vision-Language-Action Model
Autonomous navigation in highly constrained environments remains challenging for mobile robots. Classical navigation approaches offer safety assurances but require environment-specific parameter tunin...
T2Nav: Algebraic-Topology-Aware Temporal Graph Memory and Loop Detection for Zero-Shot Visual Navigation
Deploying autonomous agents in real-world environments is challenging, particularly for navigation, where systems must adapt to situations they have not encountered before. Traditional learning approa...
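The core idea of a temporal graph memory with loop detection can be illustrated generically: store one observation embedding per step and flag a loop when the current embedding closely matches a sufficiently old node. The sketch below is a minimal illustration of that idea only, not T2Nav's algebraic-topology machinery; the class name, cosine-similarity test, and thresholds are all assumptions.

```python
import math

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

class GraphMemory:
    """Toy temporal graph memory: nodes are stored in visit order."""

    def __init__(self, sim_threshold=0.95, min_gap=3):
        self.nodes = []                      # embeddings, oldest first
        self.sim_threshold = sim_threshold   # how close counts as "same place"
        self.min_gap = min_gap               # ignore the most recent nodes

    def add(self, embedding):
        """Store the embedding; return the index of a matching old node, or None."""
        loop = None
        # Only compare against nodes at least `min_gap` steps in the past,
        # so trivially similar consecutive frames do not trigger a loop.
        for i, old in enumerate(self.nodes[: len(self.nodes) - self.min_gap + 1]):
            if cosine(embedding, old) >= self.sim_threshold:
                loop = i
                break
        self.nodes.append(embedding)
        return loop
```

In a full system the detected loop would close a cycle in the graph, letting the agent recognize it has returned to a previously visited region.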
DreamToNav: Generalizable Navigation for Robots via Generative Video Planning
We present DreamToNav, a novel autonomous robot framework that uses generative video models to enable intuitive, human-in-the-loop control. Instead of relying on rigid waypoint navigation, users provi...
Lifelong Embodied Navigation Learning
Embodied navigation agents powered by large language models have shown strong performance on individual tasks but struggle to continually acquire new navigation skills, which suffer from catastrophic ...
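A common generic remedy for catastrophic forgetting, independent of this paper's specific method, is experience replay: keep a small buffer of past-task samples and mix them into each new-task batch. The sketch below uses reservoir sampling so the buffer stays a uniform sample of everything seen; the class and parameter names are illustrative assumptions.

```python
import random

class ReplayBuffer:
    """Fixed-capacity buffer of past-task samples (reservoir sampling)."""

    def __init__(self, capacity, seed=0):
        self.capacity = capacity
        self.items = []
        self.seen = 0
        self.rng = random.Random(seed)

    def add(self, item):
        """Insert an item; every item ever seen has equal chance of staying."""
        self.seen += 1
        if len(self.items) < self.capacity:
            self.items.append(item)
        else:
            j = self.rng.randrange(self.seen)
            if j < self.capacity:
                self.items[j] = item

    def mixed_batch(self, new_samples, replay_fraction=0.5):
        """New-task samples plus a proportion of replayed old-task samples."""
        k = min(len(self.items), int(len(new_samples) * replay_fraction))
        return list(new_samples) + self.rng.sample(self.items, k)
```

Training on such mixed batches keeps gradients from old tasks in the loop, which is the basic mechanism replay-based continual-learning methods rely on.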
OpenFrontier: General Navigation with Visual-Language Grounded Frontiers
Open-world navigation requires robots to make decisions in complex everyday environments while adapting to flexible task requirements. Conventional navigation approaches often rely on dense 3D reconst...
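The notion of a "frontier" has a standard, simple form that is useful background here: on an occupancy grid, a frontier cell is a known-free cell bordering unexplored space. The sketch below is classic Yamauchi-style frontier detection, not OpenFrontier's visual-language grounding; the grid encoding (0 = free, 1 = occupied, -1 = unknown) and function name are assumptions.

```python
def find_frontiers(grid):
    """Return (row, col) cells that are free and border at least one unknown cell."""
    rows, cols = len(grid), len(grid[0])
    frontiers = []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] != 0:
                continue  # only free cells can be frontiers
            # Check 4-connected neighbours for unknown space.
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == -1:
                    frontiers.append((r, c))
                    break
    return frontiers

grid = [
    [0, 0, -1],
    [0, 1, -1],
    [0, 0,  0],
]
print(find_frontiers(grid))  # → [(0, 1), (2, 2)]
```

Vision-language approaches like the one summarized above replace the purely geometric choice among such frontiers with semantically grounded selection, but the underlying frontier set is computed in essentially this way.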
SEA-Nav: Efficient Policy Learning for Safe and Agile Quadruped Navigation in Cluttered Environments
Efficiently training quadruped robot navigation in densely cluttered environments remains a significant challenge. Existing methods are either limited by a lack of safety and agility in simple obstacl...
STONE Dataset: A Scalable Multi-Modal Surround-View 3D Traversability Dataset for Off-Road Robot Navigation
Reliable off-road navigation requires accurate estimation of traversable regions and robust perception under diverse terrain and sensing conditions. However, existing datasets lack both scalability an...