Robotics researchers have unveiled a turnkey platform, BestMan, that streamlines the creation and deployment of mobile manipulators. By tackling long-standing hurdles in embodied artificial intelligence—namely, complex software stacks, poor modularity, and mismatched simulation and hardware interfaces—BestMan empowers innovators to transition from algorithm design to real-world trials in hours rather than weeks.
Embodied AI Set to Improve Healthcare, Manufacturing & Home Assistance
Embodied AI, where robots perceive and act in physical environments, holds promise for breakthroughs in healthcare, manufacturing, logistics, and home assistance. Yet today’s teams often waste weeks adapting code from simulators to real machines or rewriting entire modules to accommodate new sensors or arms. BestMan’s unified approach not only speeds up research but also lowers the barrier for small labs and startups to field intelligent robots—potentially accelerating the arrival of safer, more capable service robots in everyday settings.
Key Advances Boost Flexibility and Slash Development Overhead
The study highlights several core advances that collectively enhance flexibility and reduce development overhead:
- Streamlined Skill Chain: BestMan integrates perception, planning, motion, and control into a single workflow, reducing the effort needed to coordinate software layers.
- Plug-and-Play Modularity: Core functions are packaged as interchangeable modules, so researchers can swap in a new vision, planning, or grasping algorithm without rewriting the surrounding code.
- Simulation-to-Reality API: Identical interfaces work in both virtual environments and on physical robots, enabling code tested in simulation to run unmodified on real hardware.
- Hardware Decoupling: Mobile bases, robotic arms, and grippers are defined as independent building blocks, so teams can mix and match hardware without changing the software.
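In practice, this kind of modularity means each capability lives behind a small, stable interface. The following sketch is illustrative only (the class and method names are assumptions, not BestMan's actual API), but it shows how a grasping module could be swapped without touching the surrounding code:

```python
from abc import ABC, abstractmethod

class GraspPlanner(ABC):
    """Interface that every interchangeable grasping module implements.
    (Hypothetical names for illustration, not BestMan's real API.)"""
    @abstractmethod
    def plan_grasp(self, object_pose):
        ...

class TopDownGrasp(GraspPlanner):
    def plan_grasp(self, object_pose):
        x, y, z = object_pose
        # Approach the object from above.
        return {"approach": "top-down", "target": (x, y, z + 0.10)}

class SideGrasp(GraspPlanner):
    def plan_grasp(self, object_pose):
        x, y, z = object_pose
        # Approach the object from the side instead.
        return {"approach": "side", "target": (x - 0.10, y, z)}

class Manipulator:
    """Robot assembled from independent building blocks; swapping a
    module requires no change to the calling code."""
    def __init__(self, grasp_planner: GraspPlanner):
        self.grasp_planner = grasp_planner

    def pick(self, object_pose):
        return self.grasp_planner.plan_grasp(object_pose)

robot = Manipulator(TopDownGrasp())
print(robot.pick((0.5, 0.2, 0.0)))   # uses the top-down strategy
robot.grasp_planner = SideGrasp()    # swap the module in place
print(robot.pick((0.5, 0.2, 0.0)))   # same calling code, new strategy
```

Because `Manipulator` depends only on the abstract interface, a new grasping strategy, vision model, or planner can be dropped in as a one-line change.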
“We’ve designed BestMan so that perception, planning, motion, and control form a single, seamless pipeline,” explains Prof. Yan Ding. “Researchers can now focus on innovation instead of wrestling with integration.”
Plug-and-Play Modules and Streamlined Skill Chain Simplify Robotics
Built on the open-source PyBullet simulator, BestMan wraps each core capability—such as scene perception, task planning, navigation, and control—into clearly defined modules with standard interfaces. A lightweight middleware layer ensures that high-level commands like “move forward” or “grasp object” invoke the same calls in simulation and on physical robots. YAML configuration files let users tweak parameters without writing code, and a Blender plugin provides richer visual demonstrations than typical physics engines allow.
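The sim-to-real idea can be pictured as a thin middleware layer that routes identical high-level calls to either a simulated or a physical backend. This is a minimal, hypothetical sketch; the names below are illustrative and not taken from BestMan's codebase:

```python
class SimBackend:
    """Simulated backend, e.g. stepping a base in PyBullet."""
    def move_forward(self, dist):
        return f"sim: stepped base forward {dist} m"
    def grasp_object(self, name):
        return f"sim: closed gripper on '{name}'"

class RealBackend:
    """Physical backend, e.g. sending commands to hardware drivers."""
    def move_forward(self, dist):
        return f"real: sent velocity command for {dist} m to base driver"
    def grasp_object(self, name):
        return f"real: triggered gripper on '{name}'"

class Robot:
    """Middleware layer: user code talks only to this class, so moving
    from simulation to hardware is a one-line change at construction."""
    def __init__(self, backend):
        self._backend = backend
    def move_forward(self, dist):
        return self._backend.move_forward(dist)
    def grasp_object(self, name):
        return self._backend.grasp_object(name)

def demo(robot):
    # The same script runs unmodified against either backend.
    print(robot.move_forward(0.5))
    print(robot.grasp_object("mug"))

demo(Robot(SimBackend()))
demo(Robot(RealBackend()))
```

The payoff is that a script validated against the simulated backend needs no edits to drive the real robot, only a different backend object at construction time.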
Freely Available Platform Aims to Accelerate Embodied AI Research
With its rigorous yet accessible design, BestMan aims to accelerate embodied AI research and cut years off the development cycle for new robot applications. The platform is freely available to the academic and industrial community, and ongoing work will expand supported sensors, arms, and task libraries—bringing adaptable, intelligent robots ever closer to daily life.
Published in Frontiers of Computer Science in March 2025 (https://doi.org/10.1007/s11704-025-41109-6), this research was a collaborative effort among Chongqing University, Xi’an Jiaotong-Liverpool University, and the Shanghai Artificial Intelligence Laboratory.