FeelAnyForce: Estimating Contact Force Feedback from Tactile Sensation for Vision-Based Tactile Sensors

Updates

  • 01/30/25: Accepted to ICRA 2025!
  • 03/01/25: Dataset released
  • 03/24/25: First code release

⚙️ Installation

1️⃣ Clone this repository

To clone this repository, run:

git clone https://github.com/prgumd/FeelAnyForce.git
cd FeelAnyForce

2️⃣ Create and activate the virtual environment

To create and activate a virtual environment for this repository, run:

conda env create -f env.yml
conda activate anyforce_env

or, using virtualenv and pip:

virtualenv anyforce_env
source anyforce_env/bin/activate
pip install -r requirements.txt

3️⃣ Download the data

Create the dataset folder and download the data with:

bash dataset_downloader.sh
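
To confirm the download completed, you can check that the split files used by the training step below are in place. A minimal sketch (the paths are the ones referenced in the training command; that the script places them under ./dataset/ is an assumption):

import os

# The training step below expects these split files inside ./dataset/.
for f in ["dataset/TacForce_train_set.csv", "dataset/TacForce_val_set.csv"]:
    print(f, "found" if os.path.exists(f) else "MISSING")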

4️⃣ Training from scratch

To train the network from scratch, run:

python training_multi_head.py \
  --epochs 100 \
  --labels_train ./dataset/TacForce_train_set.csv \
  --labels_val ./dataset/TacForce_val_set.csv \
  --ckpt_dir $ckpt_dir \
  --data_basedir ./dataset/ \
  --training_name TacForce_v1

🎬 Real-Time Prediction on GelSight Mini

Download the pretrained checkpoint, then run the live demo:

wget -P ckpt/ https://huggingface.co/amirsh1376/FeelAnyForce/resolve/main/checkpoint_v1.pth.tar
python online_test.py --checkpoint ckpt/checkpoint_v1.pth.tar
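
If you need to adapt the checkpoint outside of online_test.py, a minimal sketch for inspecting what it contains (whether it stores a raw state dict or a wrapper dictionary is an assumption to verify, not a documented interface):

import torch

# Load the checkpoint on CPU; the file is a standard PyTorch archive.
ckpt = torch.load("ckpt/checkpoint_v1.pth.tar", map_location="cpu")

# Checkpoints are typically dictionaries; listing the keys shows what is
# stored (e.g. model weights, optimizer state, epoch counter).
if isinstance(ckpt, dict):
    for key in ckpt:
        print(key)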

🩹 Calibration

To record calibration data, you can 3D print a fixture for your force sensor along with a mount that allows the tactile sensor to be attached securely to the force sensor.

We’ve designed an indenter that replicates the indentation from a robot’s end effector. The setup holds the indenter steady against the tactile sensor, giving the force sensor readings time to settle. It also includes placeholders for calibration weights, so a range of forces can be applied through the indenter.

python record_deForce.py --record_count 100 --interval 1  

Feel free to adjust the --interval and --record_count parameters to suit your data collection needs.

Running the script generates a .csv file containing paths to the saved tactile and depth images, along with the corresponding force readings.
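
To sanity-check a recording before training on it, you can load the CSV with pandas. A minimal sketch (the exact column headers depend on your recording, so inspect them rather than assuming names):

import pandas as pd

# Load the calibration log produced by record_deForce.py.
df = pd.read_csv("calibration_sphere/calibration.csv")

# Inspect the recorded columns and the spread of the force readings.
print(df.columns.tolist())
print(df.describe())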

📊 Training with Calibration Data

It’s recommended to balance your dataset for a more uniform force distribution, as discussed on the project website; a sketch of one way to do this follows below.
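
One simple approach is to bin samples by force magnitude and cap the number of samples per bin. A minimal sketch, assuming the force components are stored in columns named fx, fy, and fz (hypothetical names; substitute your file's actual headers):

import numpy as np
import pandas as pd

df = pd.read_csv("calibration_sphere/calibration.csv")

# fx, fy, fz are assumed column names for the force components.
magnitude = np.sqrt(df["fx"]**2 + df["fy"]**2 + df["fz"]**2)

# Bin by force magnitude and keep at most `cap` samples per bin, so the
# training set is not dominated by the most common force range.
bins = pd.cut(magnitude, bins=20)
cap = 200
balanced = (
    df.groupby(bins, observed=True)
      .apply(lambda g: g.sample(min(len(g), cap), random_state=0))
      .reset_index(drop=True)
)

balanced.to_csv("calibration_sphere/calibration_balanced.csv", index=False)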

Once your calibration data is collected, you can train the model using:

python training_multi_head.py \
  --labels_train calibration_sphere/calibration.csv \
  --ckpt_dir $ckpt_dir \
  --data_basedir ./dataset/ \
  --lr_backbone 0.00001 \
  --lr_calibration 0.0001 \
  --tactile_backbone_training calibration

Citing this paper

@misc{shahidzadeh2024feelanyforceestimatingcontactforce,
      title={FeelAnyForce: Estimating Contact Force Feedback from Tactile Sensation for Vision-Based Tactile Sensors}, 
      author={Amir-Hossein Shahidzadeh and Gabriele Caddeo and Koushik Alapati and Lorenzo Natale and Cornelia Fermüller and Yiannis Aloimonos},
      year={2024},
      eprint={2410.02048},
      archivePrefix={arXiv},
      primaryClass={cs.RO},
      url={https://arxiv.org/abs/2410.02048}, 
}

🧔 Maintainers

This repository is maintained by:

@AmirShahid
@gabrielecaddeo

Acknowledgements

Thanks to these great repositories: DinoV2, Dino, Sim2Surf, gsrobotics
