JornadasAutomatica2025

ADAMSim: PyBullet-Based Simulation Environment for Research on Domestic Mobile Manipulator Robots

General integration of the modules developed for ADAMSim.

Abstract

This paper introduces ADAMSim, a PyBullet-based simulation environment tailored to the Ambidextrous Domestic Autonomous Manipulator (ADAM), developed to support research in navigation, manipulation, and learning for domestic robotics. The simulator accurately replicates the structure and behavior of the physical robot, enabling robust sim-to-real and real-to-sim algorithm transfer. ADAMSim follows a modular design, with modules for navigation, arm and hand kinematics, perception, and ROS communication. This architecture allows synchronized operation between the real robot and its digital twin. Several example applications were developed, ranging from vision and grasping tasks to navigation and teleoperation, including experiments that run the simulated and real robots simultaneously. Its open-source and flexible design makes ADAMSim a powerful tool for safe and reproducible algorithm development and experimentation in household robotics. The platform is also intended to serve as a test bed for future research in indoor mapping, advanced manipulation learning, and educational projects.

(a) Learning a manipulation task from demonstrations using a real-sim-real pipeline, (b) simultaneous navigation with the real and simulated ADAM environments.

ADAMSim Modules

ADAMSim is structured in a modular fashion, allowing flexibility, scalability, and ease of integration with both simulated and real robotic systems. The core modules are:

Navigation Module: Enables control of the mobile base, supporting both manual teleoperation and autonomous waypoint navigation. It includes continuous movement controllers, real-time odometry tracking, and customizable speed settings. Obstacle insertion and detection are also supported, allowing experiments with dynamic path planning.

Navigation Module

(a) ADAM’s motion is defined by the linear velocity v, the angular velocity ω, and the wheel velocities vr and vl, while θ determines the robot’s orientation. (b) Example of using the Lidar information for navigation; red rays indicate obstacles (purple and yellow boxes) detected by ray casting.
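As a minimal sketch of the differential-drive relations in the figure above, the snippet below maps a commanded linear velocity v and angular velocity ω to the wheel speeds vr and vl and integrates the base pose; the wheel radius and wheel separation used here are placeholder values, not ADAM's actual parameters.

```python
import math

# Placeholder geometry (assumptions, not ADAM's real dimensions).
WHEEL_RADIUS = 0.10      # wheel radius R [m]
WHEEL_SEPARATION = 0.50  # distance between the two wheels [m]

def wheel_speeds(v, omega):
    """Map linear velocity v [m/s] and angular velocity omega [rad/s]
    to right/left wheel angular speeds [rad/s]."""
    v_r = (v + omega * WHEEL_SEPARATION / 2.0) / WHEEL_RADIUS
    v_l = (v - omega * WHEEL_SEPARATION / 2.0) / WHEEL_RADIUS
    return v_r, v_l

def integrate_odometry(x, y, theta, v, omega, dt):
    """Unicycle-model odometry update for the base pose (x, y, theta)."""
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return x, y, theta
```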

Arm and Hand Kinematics Module: Provides full kinematic control of both arms and grippers. Users can command the robot using joint angles or Cartesian goals, with inverse kinematics and trajectory generation included. It supports both high-level control (goal reaching) and low-level joint manipulation.

Arm Kinematics Module

Sequence of movements of the right arm passing through different waypoints with specific position and orientation.
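A minimal sketch of this kind of Cartesian goal reaching with PyBullet's built-in inverse kinematics is shown below; the URDF, end-effector link index, and target pose are placeholders (a stock PyBullet arm stands in for ADAM's actual model).

```python
import math
import pybullet as p
import pybullet_data

p.connect(p.DIRECT)                                   # headless physics server
p.setAdditionalSearchPath(pybullet_data.getDataPath())
arm = p.loadURDF("kuka_iiwa/model.urdf", useFixedBase=True)  # stand-in arm

END_EFFECTOR_LINK = 6                                 # placeholder link index
target_pos = [0.5, 0.2, 0.6]                          # Cartesian goal [m]
target_orn = p.getQuaternionFromEuler([0, math.pi, 0])

# Solve IK for the Cartesian waypoint and command the joints towards it.
q_goal = p.calculateInverseKinematics(arm, END_EFFECTOR_LINK,
                                      target_pos, target_orn)
for joint_index, q in enumerate(q_goal):
    p.setJointMotorControl2(arm, joint_index, p.POSITION_CONTROL,
                            targetPosition=q)

for _ in range(240):                                  # let the controller converge
    p.stepSimulation()
```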

Perception Module: Simulates onboard RGB-D sensors and provides synthetic visual data. This is ideal for training vision models and performing object detection, segmentation, and grasp planning using realistic 3D input.

Perception Module

Example of vision with the RGB-D sensor. ADAMSim synthetically provides RGB, depth, and segmentation-mask images.
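The snippet below is a minimal sketch of how such synthetic RGB, depth, and segmentation images can be rendered with PyBullet's camera API; the camera pose, field of view, and resolution are placeholder values, not the calibration of ADAM's onboard sensor.

```python
import pybullet as p

p.connect(p.DIRECT)

WIDTH, HEIGHT = 640, 480                      # placeholder resolution
view = p.computeViewMatrix(cameraEyePosition=[0.2, 0.0, 1.4],
                           cameraTargetPosition=[1.0, 0.0, 0.8],
                           cameraUpVector=[0, 0, 1])
proj = p.computeProjectionMatrixFOV(fov=60, aspect=WIDTH / HEIGHT,
                                    nearVal=0.05, farVal=5.0)

_, _, rgb, depth, seg = p.getCameraImage(WIDTH, HEIGHT,
                                         viewMatrix=view,
                                         projectionMatrix=proj)
# rgb:   HEIGHT x WIDTH RGBA pixel buffer
# depth: normalized depth buffer in [0, 1]
# seg:   per-pixel body/link ids, usable as a segmentation mask
```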

ROS Communication Module: Creates a bridge between the simulated and real robots via ROS. This module allows for synchronized execution and bidirectional data transfer between ADAMSim and the physical ADAM robot, supporting both sim-to-real and real-to-sim experiments.

ROS Communication Module

Diagram of the connections between the real robot and the simulator through the ROS bridge.
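A minimal sketch of such a bridge node is shown below: it steps the simulation, publishes the simulated joint states, and forwards incoming joint commands into PyBullet. The topic names, message layout, and URDF path are assumptions for illustration, not the actual ADAMSim interface.

```python
import pybullet as p
import rospy
from sensor_msgs.msg import JointState

p.connect(p.DIRECT)
robot = p.loadURDF("robot.urdf")              # placeholder model path
joints = list(range(p.getNumJoints(robot)))

def on_command(msg):
    # Apply joint positions commanded from the real robot / other ROS nodes.
    for idx, q in zip(joints, msg.position):
        p.setJointMotorControl2(robot, idx, p.POSITION_CONTROL, targetPosition=q)

rospy.init_node("adamsim_bridge")
pub = rospy.Publisher("/adamsim/joint_states", JointState, queue_size=10)
rospy.Subscriber("/adamsim/joint_commands", JointState, on_command)

rate = rospy.Rate(240)                        # simulation step rate [Hz]
while not rospy.is_shutdown():
    p.stepSimulation()
    state = JointState()
    state.header.stamp = rospy.Time.now()
    state.position = [p.getJointState(robot, i)[0] for i in joints]
    pub.publish(state)
    rate.sleep()
```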

ADAMSim Examples

To evaluate the simulator, several example applications were carried out, both entirely in simulation and by connecting the simulation to the real robot. These examples demonstrate how ADAMSim's modularity supports fast, efficient, and simple algorithm development, as well as debugging without requiring access to the real robot. Several use cases execute tasks entirely in simulation, showcasing how the ADAM robot can be used directly in the PyBullet-based environment. Two additional use cases highlight the ROS bridge between the simulated model and the real robot: the first involves object manipulation, where a custom Learning from Demonstration algorithm generates solutions for manipulating two objects simultaneously; the second shows the robot navigating autonomously in the simulated environment and, through the communication bridge, replicating the same motion on the real robot.

Manipulation with LfD method of real and simulated objects.

Simultaneous navigation using the real and the simulated ADAM.

Moving arms and hands simultaneously in simulated environment.

Camera information obtained from ADAMSim.

Simulated robotic hands grasping and moving a simulated bottle.

Teleoperation of the arms and hands using sliders.

Teleoperation of the robot base and detection of simulated boxes with the 2D Lidar.
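As a complementary illustration of the Lidar-based box detection above, the following minimal sketch emulates a planar Lidar with PyBullet's batched ray casting; the number of rays, maximum range, and mounting height are placeholder values, not those of ADAM's actual sensor.

```python
import math
import pybullet as p

p.connect(p.DIRECT)

NUM_RAYS = 180                   # angular resolution (placeholder)
MAX_RANGE = 5.0                  # maximum range [m] (placeholder)
origin = [0.0, 0.0, 0.3]         # sensor position on the base (placeholder)

ray_from = [origin] * NUM_RAYS
ray_to = [[origin[0] + MAX_RANGE * math.cos(2 * math.pi * i / NUM_RAYS),
           origin[1] + MAX_RANGE * math.sin(2 * math.pi * i / NUM_RAYS),
           origin[2]]
          for i in range(NUM_RAYS)]

results = p.rayTestBatch(ray_from, ray_to)
# Each entry is (objectId, linkIndex, hitFraction, hitPosition, hitNormal);
# objectId >= 0 marks a ray that hit an obstacle, e.g. a simulated box.
hits = [r for r in results if r[0] >= 0]
```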