Prefix-LoRA: Combining Prefix Tuning with LoRA


Prerequisites:

  • Python 3.8
  • PyTorch Lightning 0.9.0
  • PyTorch
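
One way to install the framework dependencies with pip is shown below; the repository does not pin a torch version, so an older 1.x release may be needed for compatibility with Lightning 0.9.0.

pip install pytorch-lightning==0.9.0 torch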

Setup:

cd transformer; pip install -e .

Then download the XSum dataset.
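
The repository does not say how the dataset should be fetched; one possible way, assuming the Hugging Face datasets package (pip install datasets) is available, is sketched below. Adjust the output location and format to whatever train_bart.py expects.

from datasets import load_dataset

# Illustrative only: pull XSum from the Hugging Face Hub. Depending on the
# installed datasets version, trust_remote_code=True may be required.
xsum = load_dataset("xsum")            # splits: train / validation / test
print(xsum["train"][0]["document"])    # source article
print(xsum["train"][0]["summary"])     # one-sentence reference summary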


Training with the BART model

Hybrid Method:

cd seq2seq; 

python train_bart.py --use_prefix_tuning yes --use_lora yes --mode xsum --preseqlen 200 --do_train yes --fp16 yes --bsz 2  --epoch 30  --gradient_accumulation_step 3 --learning_rate 0.00005  --mid_dim 800
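
Conceptually, the hybrid method trains two small sets of parameters on top of a frozen BART model: low-rank LoRA updates on the attention projections, and a prefix of preseqlen learned key/value vectors produced through a mid_dim bottleneck MLP. The PyTorch sketch below is only a minimal illustration of that idea, not the repository's implementation; the names LoRALinear and PrefixEncoder and all dimensions are hypothetical.

import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    # Frozen base projection plus a trainable low-rank update, scaled by alpha / r.
    def __init__(self, base: nn.Linear, r: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False            # pretrained weights stay frozen
        self.lora_a = nn.Linear(base.in_features, r, bias=False)
        self.lora_b = nn.Linear(r, base.out_features, bias=False)
        nn.init.zeros_(self.lora_b.weight)     # the low-rank update starts at zero
        self.scale = alpha / r

    def forward(self, x):
        return self.base(x) + self.scale * self.lora_b(self.lora_a(x))

class PrefixEncoder(nn.Module):
    # Learned prefix embeddings mapped through a mid_dim MLP to per-layer
    # key/value prefixes that are prepended inside each attention block.
    def __init__(self, preseqlen=200, mid_dim=800, n_layers=6, n_heads=12, d_model=768):
        super().__init__()
        self.prefix = nn.Parameter(torch.randn(preseqlen, d_model))
        self.mlp = nn.Sequential(
            nn.Linear(d_model, mid_dim),
            nn.Tanh(),
            nn.Linear(mid_dim, 2 * n_layers * d_model),
        )
        self.n_layers, self.n_heads = n_layers, n_heads
        self.head_dim = d_model // n_heads

    def forward(self, batch_size: int):
        preseqlen = self.prefix.size(0)
        kv = self.mlp(self.prefix)             # (preseqlen, 2 * n_layers * d_model)
        kv = kv.view(preseqlen, self.n_layers, 2, self.n_heads, self.head_dim)
        kv = kv.unsqueeze(0).expand(batch_size, -1, -1, -1, -1, -1)
        # One (key, value) pair per layer, each of shape (batch, preseqlen, n_heads, head_dim).
        return [(kv[:, :, i, 0], kv[:, :, i, 1]) for i in range(self.n_layers)]

# Usage sketch: wrap one projection and build prefixes for a batch of 2.
wrapped_proj = LoRALinear(nn.Linear(768, 768))
prefix_kv = PrefixEncoder()(batch_size=2)

In both methods the pretrained BART weights receive no gradients; only the LoRA factors and the prefix encoder are trained, which is what makes the individual methods and their combination parameter-efficient.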

Prefix-Tuning:

cd seq2seq; 

python train_bart.py --use_prefix_tuning yes --use_lora no --mode xsum --preseqlen 200 --do_train yes --fp16 yes --bsz 2  --epoch 30  --gradient_accumulation_step 3 --learning_rate 0.00005  --mid_dim 800

LoRA:

cd seq2seq; 

python train_bart.py --use_prefix_tuning no --use_lora yes --mode xsum --preseqlen 200 --do_train yes --fp16 yes --bsz 2  --epoch 30  --gradient_accumulation_step 3 --learning_rate 0.00005  --mid_dim 800

Solving NLU Problems

Run the same script with --do_train no and point --prefix_model_path at a trained checkpoint; the remaining flags must match the values used during training.

cd seq2seq; 

python train_bart.py --use_prefix_tuning {same as training} --use_lora {same as training} --mode xsum --do_train no --prefix_model_path {checkpoint_path} --preseqlen {same as training} --mid_dim {same as training}
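
For example, to decode with a checkpoint produced by the hybrid configuration above (the checkpoint path is purely illustrative):

cd seq2seq; 

python train_bart.py --use_prefix_tuning yes --use_lora yes --mode xsum --do_train no --prefix_model_path checkpoints/hybrid_xsum.ckpt --preseqlen 200 --mid_dim 800  # illustrative path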

Interfaces

Training interfaces are as follows:

run.png

train.png

Interfaces for solving NLU problems are as follows:

decode.png

About

Final project for the Natural Language Processing course.
