
Design & Learning Research Group's (DLRG) Solution for euROBIN MSVC @ IROS 2024

Fastest Automated Task Board Solution Award
[Project Page] | [Video] | [Data]

Xudong Han¹, Haoran Sun¹·², Ning Guo¹, Sheng Ge¹, Jia Pan², Fang Wan¹, Chaoyang Song¹
¹ Southern University of Science and Technology, ² The University of Hong Kong

[Teaser figure]

Overview

This repository contains the code and a brief introduction to DLRG's solution for the euROBIN Manipulation Skill Versatility Challenge (MSVC) at the 2024 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2024). For more details, please refer to the project page.

Hardware Setup

The robot system used in the competition is built around a UR10e robot arm and an Intel RealSense D435i camera (see the project page for the complete hardware list).

The task objects are the task board and a Hikvision NP-Y1-S smoke detector.

Software Dependencies

The repository is developed in C++ and Python, and has been tested on Ubuntu 20.04 with the following dependencies:

Quick Start

First, clone the repository:

git clone https://github.com/ancorasir/DesignLearnRG_euROBIN.git
cd DesignLearnRG_euROBIN

Then, install the Python dependencies:

pip install -r requirements.txt

There are three main components in the repository: vision detection, robot motion, and the user interface. All three need to run together to complete the tasks.

Vision Detection

A vision detector is provided in ./vision/ to detect the task board and the screen on it. To build and run the detector:

cd vision
mkdir build
cd build
cmake ..
cmake --build .
./main

The vision detector will output the detected objects' positions and orientations in the robot base frame.
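
For readers unfamiliar with this last step, here is a minimal Python sketch (not the repository's actual C++ implementation) of how a pose detected in the camera frame can be re-expressed in the robot base frame. The transforms below are hypothetical placeholders; in practice the camera-to-base transform comes from hand-eye calibration.

# Minimal sketch: chaining homogeneous transforms to move a detected pose
# from the camera frame into the robot base frame. All values are illustrative.
import numpy as np

def to_homogeneous(R, t):
    """Build a 4x4 homogeneous transform from a rotation matrix and translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical inputs: task-board pose in the camera frame (e.g. from detection),
# and the camera pose in the robot base frame (from hand-eye calibration).
T_cam_board = to_homogeneous(np.eye(3), np.array([0.10, -0.05, 0.60]))
T_base_cam = to_homogeneous(np.eye(3), np.array([0.40, 0.00, 0.50]))

# Compose the transforms: task-board pose in the robot base frame.
T_base_board = T_base_cam @ T_cam_board
print(T_base_board[:3, 3])  # board position in the base frame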

Robot Motion

The code for robot motion is provided in ./motion/. To run the tasks in normal order, run:

cd motion
python scripts/execute_task.py

You can also change the order of the tasks by reordering the task execution function calls (sketched below). To modify the motions for a specific task, edit the task files in ./motion/tasks/.
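
As a hypothetical illustration of that reordering (the function names below are not the repository's actual API), the task sequence can be held in a list, so changing the execution order is just reordering the list:

# Illustrative sketch of a reorderable task sequence; function names are placeholders.
def press_button(): ...
def open_door(): ...
def probe_circuit(): ...

# Reordering the tasks is just reordering this list.
TASK_SEQUENCE = [press_button, open_door, probe_circuit]

for task in TASK_SEQUENCE:
    task()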

User Interface

An interface is provided in ./interface/, including a 3D scene and data curves, to visualize and record the robot data; you can also set the order of robot actions from it. To open the interface, run:

cd interface
python server.py

The interface will be available at http://127.0.0.1:8000.

Data Availability

The data recorded during the trials are available on Google Drive and include:

  • Positions and velocities of the UR10e's joints
  • Trajectories of the UR10e's tool center point (TCP)
  • Images captured by the Intel RealSense D435i camera

All data are recorded at 10 Hz. To visualize a recording, place it in the ./data/ folder and run:

cd interface
python server.py -m log-data -d {DATA FOLDER NAME}

{DATA FOLDER NAME} is the name of the data folder, which is formatted as the time at which recording started. For example, to view the data in ./data/20240929-054829/, set {DATA FOLDER NAME} to 20240929-054829. After running the command, you can view the data at http://127.0.0.1:8000.
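
You can also inspect a recording outside the interface. Below is a minimal Python sketch; the file name joint_states.csv and its column layout are assumptions, so adjust them to the actual files in ./data/{DATA FOLDER NAME}/.

# Minimal sketch: load a (hypothetical) CSV of joint positions and plot it.
import numpy as np
import matplotlib.pyplot as plt

# Assumed file name and layout: one named column per joint, one row per sample.
data = np.genfromtxt("data/20240929-054829/joint_states.csv",
                     delimiter=",", names=True)

t = np.arange(len(data)) / 10.0  # samples are recorded at 10 Hz
for name in data.dtype.names:
    plt.plot(t, data[name], label=name)

plt.xlabel("time [s]")
plt.ylabel("joint position [rad]")
plt.legend()
plt.show()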

License

This repository is released under the MIT License. See LICENSE for more information.
