# A Hyperspectral Imaging Guided Robotic Grasping System

[Under review, IEEE RAL 2025]

This is the official code release of *A Hyperspectral Imaging Guided Robotic Grasping System*.

[paper] [project] [code] [Datasets] [CAD files]

## Environment

The complete deployment of the project includes the following components:

- Model Training and Inference
- Robotic Manipulation
- PRISM Control (Hyperspectral Camera, Motors)

Because the hyperspectral camera control interface requires Windows, the project is developed on Windows 10. Model training and inference, however, can be run on any platform that supports PyTorch, such as Ubuntu 20.04 (tested).

## Installation

1. Create a conda environment and install PyTorch

   This code is tested with Python 3.10.14 on Ubuntu 20.04 and Windows 10.

   ```bash
   conda create -n prism python=3.10
   conda activate prism
   # PyTorch with CUDA 11.8
   pip3 install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu118
   ```

2. Install dependencies

   ```bash
   pip install joblib
   pip install tqdm
   pip install tensorboard
   pip install omegaconf
   pip install opencv-python
   pip install matplotlib
   pip install scipy
   pip install scikit-learn
   pip install plantcv
   pip install spectral
   pip install numpy==1.26.4
   pip install h5py
   ```
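After installation, a quick sanity check is to confirm that PyTorch can see the CUDA device. This is a minimal sketch, not a script shipped with the repository:

```python
# Minimal environment sanity check (hypothetical helper, not part of PRISM).
import torch

print(f"PyTorch version: {torch.__version__}")
print(f"CUDA available:  {torch.cuda.is_available()}")
if torch.cuda.is_available():
    # Name of the first visible CUDA device, e.g. the training GPU.
    print(f"Device: {torch.cuda.get_device_name(0)}")
```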

## Inference and Visualization

This has only been tested in the PyCharm environment. Please untick the "Run with Python Console" and "View > Scientific Mode" options in the run configuration.

Run the command below to play the PRISM working animation:

```bash
python scripts/prism_animation.py
```

After it finishes, you should see printed 3D affordance results with grasps, and visualizations at `run_realworld/gym_outputs/drawer_open/`.

## Train and Test

You can modify the config parameter `model_type` in `config/train.yaml` to choose which model to train:

```bash
python scripts/train.py
```

You can also run the test script to evaluate the trained model:

```bash
python scripts/test.py
```
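Since `omegaconf` is among the dependencies, the training script presumably loads `config/train.yaml` along these lines. This is a hypothetical sketch: only `model_type` and the config path come from this README; the merge behaviour is an assumption.

```python
from omegaconf import OmegaConf

# Load the training configuration; `model_type` selects which model to train.
cfg = OmegaConf.load("config/train.yaml")
print(f"Training model: {cfg.model_type}")

# OmegaConf also supports command-line overrides, if the script merges them:
#   python scripts/train.py model_type=<other_model>
cfg = OmegaConf.merge(cfg, OmegaConf.from_cli())
```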

## Control Codes

All C++ device control code is in the `c_device` folder. This includes control modules for Modbus devices, the Nachi robot, and the Specim line-scan camera.

```
c_device
├── libModbus   # Modbus device control
├── nachi       # Nachi robot control
└── specim      # Specim line-scan camera control
```
