Running RTD (Autonomous Reachability-based Manipulator Trajectory Design) on various platforms, simulating it in PyBullet, and creating rendered animations.
It would be a good idea to install the Python code in a virtual environment using either conda or venv.
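For example, a minimal sketch of either option (the environment name and Python version are arbitrary examples, not requirements of this repository):

```bash
# Option 1: conda
conda create -n rtd-pybullet python=3.8
conda activate rtd-pybullet

# Option 2: venv
python3 -m venv .venv
source .venv/bin/activate
```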
This installation assumes that you have correctly compiled armour's dependencies, which include ipopt.
rtd-pybullet has a number of dependencies. To generate nice visuals, you'll need Blender. On Ubuntu, Blender can be installed with:
sudo snap install blender --classic
There are also a number of Python dependencies that can be installed by
pip install -r requirements.txt
Make sure Blender is installed first, then install the Blender plugin; look at the Blender documentation for installing plugins on Windows or macOS.
git clone https://github.com/roahmlab/rtd-pybullet
cd rtd-pybullet
git submodule update --init --recursive
pip install -e .
cd rtd-pybullet/zonopy
pip install -e .
cd armtd-dev/cuda-dev/PZsparse-Bernstein/build
cmake ..
make
If the above doesn't work, there might be an issue with using Pybind11
as a submodule. You can get around this by installing it using pip.
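One possible workaround is installing pybind11 from PyPI; whether CMake picks it up automatically depends on your configuration, so treat this as a sketch:

```bash
# The [global] extra also installs pybind11's headers and CMake config files into the environment prefix
pip install "pybind11[global]"
```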
We need to add the path to PYTHONPATH so that Python can find the pybind module created by CMake:
export PYTHONPATH=$PYTHONPATH:$PWD
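If you want this to persist across shells, one option (assuming a bash shell; replace the path below with the absolute path to your PZsparse-Bernstein build directory) is:

```bash
echo 'export PYTHONPATH=$PYTHONPATH:/path/to/rtd-pybullet/armtd-dev/cuda-dev/PZsparse-Bernstein/build' >> ~/.bashrc
```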
Go to scripts/
and run test_bullet_planner.py
to see what happens.
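For example, assuming the environment from the installation steps above is active:

```bash
cd scripts
python test_bullet_planner.py
```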
Go to scripts/ and run test_zonopy.py to see what happens. The zonopy environment will generate an obstacle avoidance task in which a Kinova arm starts from a random position and moves toward a random goal position, with random obstacles placed around it.
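For example, from the repository root:

```bash
python scripts/test_zonopy.py
```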
In order to visualize the reachable sets in PyBullet and in Blender, they have to be saved as mesh files so that they can be imported. Running zonotope/FO2stl_zonopy.m in MATLAB will save those reachable sets as convex sets in .stl format. Don't forget to change the folder paths.
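For example, one way to run the conversion headlessly (a sketch; it assumes MATLAB R2019a or newer for the -batch flag, and that the folder paths inside the script have already been updated):

```bash
cd zonotope
matlab -batch "FO2stl_zonopy"
```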
Since pybullet_blender_recorder works best with the .urdf format, we convert those files to .urdf first. This is done using pymeshlab and object2urdf, which make the process easy. Running utils/stl2obj2urdf.py will do the job.
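For example, assuming pymeshlab and object2urdf were installed via requirements.txt:

```bash
python utils/stl2obj2urdf.py
```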
The pre-computed trajectories are discrete-time and are assumed to be tracked perfectly. We can instead track the trajectories in PyBullet with a much smaller time step and real physics. The PyBullet environment is set up here. Run scripts/bullet_zonopy_obstacle_avoidance.py to see the PyBullet simulation. In the meantime, the motions of the arm and the reachable sets are saved as .pkl files.
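For example:

```bash
python scripts/bullet_zonopy_obstacle_avoidance.py
```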
With all the .pkl files, it is easy to do the rendering by importing them into Blender. Check out the instructions here.
TODO:
- Improve the installation instructions
- Remove ARMOUR and zonopy as submodules