Basic Simulation Overview
Welcome to the DISCOVERSE Basic Simulation Tutorial! This tutorial guides you through creating and running robot simulations in DISCOVERSE, a high-fidelity robot simulation platform built on the MuJoCo physics engine with support for 3D Gaussian Splatting rendering.
🎯 Learning Objectives
After completing this tutorial, you will be able to:
- Understand the configuration system and basic architecture of DISCOVERSE
- Create and configure robot simulation environments
- Use real robot models (such as AirbotPlay, MMK2)
- Configure sensors and rendering options
- Run basic robot operation tasks
📋 Prerequisites
Before you begin, please make sure you have:
- ✅ Completed the Installation Guide
- ✅ Run the Quick Start
- ✅ Read the Basic Concepts
🏗️ DISCOVERSE Architecture
DISCOVERSE is built on the following core technologies:
MuJoCo Physics Engine
- High-precision rigid body dynamics simulation
- Supports contact and friction simulation
- Real-time physics computation
3D Gaussian Splatting Rendering
- High-fidelity scene rendering
- Supports realistic visual effects
- Switchable with traditional MuJoCo renderer
Robot Model Support
- AirbotPlay: 7-DOF robotic arm, suitable for tabletop manipulation tasks
- MMK2: Dual-arm mobile robot, supporting complex collaboration tasks
- LeapHand: 16-DOF dexterous hand, enabling fine manipulation
Sensor System
- RGB Cameras: Multi-view visual information
- Depth Cameras: 3D perception capability
- LiDAR: High-precision point cloud data
- IMU: Inertial measurement units
- Tactile Sensors: Force and touch feedback
📁 Project Structure
Understanding DISCOVERSE's directory structure helps you quickly locate required files:
DISCOVERSE/
├── discoverse/              # Core simulation framework
│   ├── envs/                # Environment definitions
│   ├── robots/              # Robot model interfaces
│   ├── sensors/             # Sensor implementations
│   └── utils/               # Utility functions
├── mjcf/                    # MuJoCo scene description files
│   ├── robots/              # Robot MJCF files
│   ├── objects/             # Object models
│   └── tasks/               # Task scene files
├── models/                  # 3D models and assets
│   ├── meshes/              # Mesh files (.obj, .stl)
│   ├── 3dgs/                # 3D Gaussian Splatting models
│   └── textures/            # Texture files
├── examples/                # Example scripts
│   ├── robots/              # Basic robot examples
│   ├── tasks_airbot_play/   # AirbotPlay task examples
│   ├── tasks_mmk2/          # MMK2 task examples
│   └── mocap_ik/            # Inverse kinematics examples
├── scripts/                 # Utility scripts
└── data/                    # Generated data storage
🔧 Core Configuration System
DISCOVERSE uses a unified configuration system to manage simulation parameters:
BaseConfig Class
The core configuration class contains all simulation settings:
from discoverse.envs import BaseConfig

# Create basic configuration
cfg = BaseConfig()

# Core simulation parameters
cfg.mjcf_file_path = "mjcf/robots/airbot_play.xml"  # Scene file
cfg.timestep = 0.002    # Physics timestep
cfg.decimation = 10     # Control decimation
cfg.sync = True         # Real-time sync
cfg.headless = False    # Show GUI

# Rendering configuration
cfg.render_set = {
    "fps": 30,      # Frame rate
    "width": 640,   # Image width
    "height": 480   # Image height
}

# Sensor configuration
cfg.obs_rgb_cam_id = [0, 1]   # RGB camera IDs
cfg.obs_depth_cam_id = [0]    # Depth camera IDs

# High-fidelity rendering (optional)
cfg.use_gaussian_renderer = False
Configuration Options Explained
Basic Simulation Parameters
- mjcf_file_path: Path to the MuJoCo scene file (.xml or .mjb)
- timestep: Physics simulation time step (typically 0.001-0.002 seconds)
- decimation: Control frequency reduction factor (actual control rate = 1/(timestep × decimation))
- sync: Enable real-time synchronization (useful for teleoperation)
- headless: Run without GUI (useful for data generation or server deployment)
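Plugging in the example values from the configuration above, the effective control rate is easy to check:

```python
# Effective control rate = 1 / (timestep * decimation)
timestep = 0.002   # physics step of 2 ms
decimation = 10    # 10 physics steps per control step

control_hz = 1 / (timestep * decimation)
print(control_hz)  # 50.0 -> the controller acts at 50 Hz
```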
Rendering Parameters
- render_set: Dictionary containing rendering settings
- fps: Target frame rate for visualization
- width/height: Rendered image dimensions
Sensor Configuration
- obs_rgb_cam_id: List of RGB camera IDs to use for observations
- obs_depth_cam_id: List of depth camera IDs to use for observations
Advanced Features
- use_gaussian_renderer: Enable 3D Gaussian Splatting for high-fidelity rendering
- rb_link_list: Robot body names (for 3DGS rendering)
- obj_list: Interactive object names (for 3DGS rendering)
- gs_model_dict: Mapping from body names to 3DGS model paths
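Putting these options together, a 3DGS setup might look like the sketch below. The body names and .ply paths are placeholders for illustration; use the names defined in your own MJCF scene and the model files you actually have.

```python
# Illustrative 3DGS configuration (names and paths are placeholders)
cfg.use_gaussian_renderer = True
cfg.rb_link_list = ["base_link", "link1", "link2"]   # robot bodies to render
cfg.obj_list = ["block"]                             # interactive objects
cfg.gs_model_dict = {
    "base_link": "models/3dgs/base_link.ply",
    "link1": "models/3dgs/link1.ply",
    "link2": "models/3dgs/link2.ply",
    "block": "models/3dgs/block.ply",
}
```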
🤖 Robot Platforms Overview
AirbotPlay
A 7-DOF robotic arm perfect for learning and tabletop manipulation:
Specifications:
- 7 degrees of freedom
- Parallel gripper
- Reach: ~65cm
- Payload: 1kg
Use Cases:
- Pick and place tasks
- Object manipulation
- Basic learning algorithms
Example Configuration:
cfg.mjcf_file_path = "mjcf/robots/airbot_play.xml"
MMK2 Dual-Arm Mobile Robot
A sophisticated dual-arm mobile platform for complex tasks:
Specifications:
- Dual 7-DOF arms
- Mobile base with omnidirectional wheels
- Stereo cameras
- Lift mechanism
Use Cases:
- Multi-object manipulation
- Mobile manipulation
- Collaboration tasks
Example Configuration:
cfg.mjcf_file_path = "mjcf/robots/mmk2.xml"
LeapHand
A 16-DOF dexterous hand for fine manipulation:
Specifications:
- 16 degrees of freedom
- 4 fingers with tactile sensing
- High dexterity for complex grasping
Use Cases:
- Dexterous manipulation
- In-hand manipulation
- Object reorientation
🔄 Simulation Workflow
A typical simulation workflow in DISCOVERSE follows these steps:
1. Environment Setup: Configure robot, scene, and sensors
2. Initialization: Load models and initialize physics
3. Control Loop: Execute actions and collect observations
4. Data Collection: Save trajectories and sensor data
5. Analysis: Process and analyze results
Basic Simulation Loop
import discoverse

# 1. Create environment
env = discoverse.make("AirbotPlayManipulation")

# 2. Reset environment
obs = env.reset()

# 3. Run simulation loop
for step in range(1000):
    # Generate action (random or from policy)
    action = env.action_space.sample()

    # Execute action
    obs, reward, done, info = env.step(action)

    # Check if episode finished
    if done:
        obs = env.reset()

env.close()
🎮 Interactive Features
DISCOVERSE provides rich interactive features for development and debugging:
Keyboard Controls
- Basic Controls: Help (h), Reset (r), Reload scene (F5)
- View Controls: Camera switching ([/]), Free camera (Esc)
- Rendering: Toggle Gaussian rendering (Ctrl+g), Depth view (Ctrl+d)
Mouse Controls
- Left Drag: Rotate camera view
- Right Drag: Pan camera view
- Scroll Wheel: Zoom in/out
Real-time Debugging
- Print States: Press 'p' to print robot joint states and poses
- Visual Markers: Add visual markers for debugging kinematics
- Sensor Visualization: Real-time display of camera and LiDAR data
📊 Performance Considerations
Simulation Speed
- Physics Timestep: Smaller timesteps increase accuracy but reduce speed
- Decimation: Higher decimation reduces control frequency but improves performance
- Headless Mode: Disable GUI for faster data generation
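Combining these knobs, a throughput-oriented setup might look like the sketch below. It reuses the BaseConfig attributes introduced earlier; whether disabling real-time sync and lowering the control rate is acceptable depends on your task.

```python
# Sketch of a data-generation-oriented configuration
cfg.headless = True    # no GUI window -> faster data generation
cfg.sync = False       # do not throttle simulation to real time
cfg.timestep = 0.002   # keep the physics step small enough for stable contacts
cfg.decimation = 20    # fewer control steps per simulated second (25 Hz here)
```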
Memory Usage
- 3DGS Models: Can be memory-intensive; adjust quality settings as needed
- Sensor Data: Large images/point clouds can consume significant memory
- Batch Processing: Process data in batches for large datasets
GPU Acceleration
- 3DGS Rendering: Requires CUDA-capable GPU
- LiDAR Simulation: Benefits from GPU acceleration with Taichi
- Parallel Environments: Multiple environments can share GPU resources
🐛 Debugging Tips
Common Issues
- Scene Loading Errors: Check MJCF file paths and model references
- Rendering Problems: Verify GPU drivers and CUDA installation
- Physics Instability: Adjust timestep and contact parameters
- Performance Issues: Profile bottlenecks and optimize accordingly
Diagnostic Tools
- check_installation.py: Verify system setup
- Verbose Logging: Enable detailed logging for debugging
- Visual Debugging: Use wireframe and collision visualization
- Profiling: Monitor CPU/GPU usage and memory consumption
📚 Learning Path
To master DISCOVERSE basic simulation, we recommend following this learning path:
1. Environment Setup - Configure your first simulation environment
2. Robot Control - Learn robot control interfaces and kinematics
3. Sensor Integration - Add sensors and process data
4. Task Design - Create custom manipulation tasks
5. Advanced Features - Explore 3DGS rendering and advanced capabilities
🎯 What's Next?
After completing this overview, you're ready to:
- Set up your first custom simulation environment
- Learn detailed robot control mechanisms
- Integrate sensors for perception tasks
- Explore advanced rendering capabilities
Let's start with Environment Setup to begin your hands-on journey with DISCOVERSE!
This tutorial provides a comprehensive foundation for robot simulation in DISCOVERSE. Each subsequent section will dive deeper into specific aspects of the framework.