[Project Page] | [Video] | [Data]
Xudong Han1, Haoran Sun1, 2, Ning Guo1, Sheng Ge1, Jia Pan2, Fang Wan1, Chaoyang Song1
1 Southern University of Science and Technology, 2 The University of Hong Kong
This repository contains the code and a brief introduction to DLRG's approach for the euROBIN Manipulation Skill Versatility Challenge (MSVC) at the 2024 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2024). For more details, please refer to the project page.
The robot system used in the competition consists of:
- Collaborative robot: Universal Robots UR10e
- Adaptive gripper: Robotiq Hand-E
- Fingertip: 3D-printed fingertip
- RGB-D camera: Intel RealSense D435i
- Camera bracket: CNC-milled camera bracket
- Fill light (Optional): ZHIYUN FIVERAY M20
The task board and a Hikvision NP-Y1-S smoke detector serve as the task objects.
The repository is developed in C++ and Python and has been tested on Ubuntu 20.04 with the following dependencies:
- Intel RealSense SDK 2.0
- ONNX Runtime
- OpenCV
- Protocol Buffers
- Real-time Data Exchange (RTDE)
- Rerun
- Ultralytics YOLOv8
First, clone the repository:
git clone https://github.com/ancorasir/DesignLearnRG_euROBIN.git
cd DesignLearnRG_euROBIN
Then, install the Python dependencies:
pip install -r requirements.txt
The repository has three main components: vision detection, robot motion, and the user interface. All three need to run together to complete the task.
To detect the task board and the screen on it, a vision detector is provided in `./vision/`. To build and run the vision detector:
cd vision
mkdir build
cd build
cmake ..
cmake --build .
./main
The vision detector will output the detected objects' positions and orientations in the robot base frame.
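For reference, below is a minimal detection sketch in Python. Note that this is illustrative only: the actual detector in `./vision/` is implemented in C++ with ONNX Runtime, and the model file and stream settings here are assumptions.

```python
# Illustrative sketch: grab one RealSense color frame and run YOLOv8 on it.
# Assumptions: pyrealsense2 and ultralytics are installed; yolov8n.pt is a
# stand-in for the actual task-board weights used in the competition.
import numpy as np
import pyrealsense2 as rs
from ultralytics import YOLO

model = YOLO("yolov8n.pt")  # placeholder model, not the competition weights

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.color, 640, 480, rs.format.bgr8, 30)
pipeline.start(config)

try:
    frames = pipeline.wait_for_frames()
    color = np.asanyarray(frames.get_color_frame().get_data())
    results = model(color)
    for box in results[0].boxes:
        # class id, confidence, and pixel-space bounding box for each detection
        print(int(box.cls), float(box.conf), box.xyxy.tolist())
finally:
    pipeline.stop()
```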
The code for robot motion is provided in `./motion/`. To run the tasks in the normal order:
cd motion
python scripts/execute_task.py
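The motion scripts command the UR10e over RTDE. As a rough idea of what a single motion command looks like, here is a minimal sketch assuming the SDU ur_rtde Python bindings; the IP address and pose values are placeholders, not the competition waypoints.

```python
# Minimal sketch of one RTDE motion command (assumes `pip install ur_rtde`).
import rtde_control
import rtde_receive

ROBOT_IP = "192.168.1.10"  # assumption: replace with your UR10e's address

rtde_c = rtde_control.RTDEControlInterface(ROBOT_IP)
rtde_r = rtde_receive.RTDEReceiveInterface(ROBOT_IP)

# Linear move in tool space: [x, y, z, rx, ry, rz] (meters / axis-angle radians)
target_pose = [0.3, -0.2, 0.4, 0.0, 3.14, 0.0]  # placeholder pose
rtde_c.moveL(target_pose, 0.25, 0.5)  # speed 0.25 m/s, acceleration 0.5 m/s^2

print("Actual TCP pose:", rtde_r.getActualTCPPose())
rtde_c.stopScript()
```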
You can also change the task order by reordering the task execution function calls, as sketched below. To modify the motions for a specific task, edit the task files in `./motion/tasks/`.
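Conceptually, reordering the tasks amounts to reordering a list of task functions. The names below are hypothetical placeholders, not the actual identifiers in `./motion/scripts/execute_task.py`:

```python
# Hypothetical sketch of task reordering; replace these stubs with the real
# task execution functions from motion/scripts/execute_task.py.
def press_button(): ...
def move_slider(): ...
def plug_probe(): ...

TASK_SEQUENCE = [press_button, move_slider, plug_probe]  # edit this order

for task in TASK_SEQUENCE:
    task()  # each call commands the UR10e through one board sub-task
```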
An interface is provided in `./interface/` to visualize and record the robot data, including a 3D scene view and data curves; you can also set the order of robot actions from it. To open the interface, run:
cd interface
python server.py
The interface will be available at `127.0.0.1:8000`.
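The visualization is built on Rerun. As a rough sketch of how robot data can be streamed to it (entity paths and values here are illustrative, and the rerun-sdk API may differ between versions):

```python
# Illustrative sketch: log a TCP position and a joint angle to Rerun.
# Assumes rerun-sdk is installed; entity paths and values are made up.
import rerun as rr

rr.init("msvc_interface", spawn=True)  # opens a local Rerun viewer

rr.set_time_seconds("time", 0.1)
rr.log("robot/tcp", rr.Points3D([[0.3, -0.2, 0.4]]))  # TCP position in meters
rr.log("robot/joint0", rr.Scalar(1.57))               # joint angle in radians
```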
The data recorded during trials is available on Google Drive and includes:
- Positions and velocities of UR10e's joints
- Trajectories of UR10e's tool center point (TCP)
- Images captured by the Intel RealSense D435i camera
All data is recorded at 10 Hz. To visualize a trial, place the data into the `./data/` folder and run:
cd interface
python server.py -m log-data -d {DATA FOLDER NAME}
`{DATA FOLDER NAME}` is the name of the folder, formatted as the time at which data recording started. For example, to view the data in `./data/20240929-054829/`, `{DATA FOLDER NAME}` should be `20240929-054829`. After running the code, you can view the data at `127.0.0.1:8000`.
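Beyond the built-in viewer, the logs can also be inspected offline. Below is a minimal sketch with pandas; the file name `joints.csv` is hypothetical, so check the actual layout of your downloaded folder:

```python
# Hypothetical sketch: load one trial's joint log with pandas.
# The file name joints.csv is an assumption about the data layout.
import pandas as pd

df = pd.read_csv("data/20240929-054829/joints.csv")  # hypothetical file name
print(df.head())  # joint positions and velocities, sampled at 10 Hz
```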
This repository is released under the MIT License. See LICENSE for more information.