# llm-toolkit

## Easy to use

```bash
cd llm-toolkit
pip install .
```
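If you plan to modify the toolkit while using it, a standard pip editable install (a suggestion about workflow, not a documented requirement of llm-toolkit) keeps local changes visible without reinstalling:

```bash
# Editable install: changes to the source tree take effect immediately.
pip install -e .
```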

Check the examples to get started with our toolkit! They include minimal use cases for benchmarking, fine-tuning, and evaluation.
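As a rough illustration of the kind of fine-tuning workflow the examples cover, here is a minimal LoRA-FA-style sketch. It uses the Hugging Face `transformers` and `peft` libraries rather than llm-toolkit's own API; the model name, target modules, hyperparameters, and the `lora_A`-freezing step are assumptions for illustration only, not the toolkit's actual interface.

```python
# Minimal sketch of LoRA-FA-style fine-tuning with Hugging Face peft.
# NOTE: this is NOT llm-toolkit's API; it only illustrates the kind of
# workflow covered by the examples. The model name and hyperparameters
# are placeholders chosen for illustration.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

model_name = "facebook/opt-125m"  # small placeholder model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Attach low-rank adapters to the attention projections.
config = LoraConfig(
    r=16,
    lora_alpha=32,
    target_modules=["q_proj", "v_proj"],
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, config)

# LoRA-FA (the cited paper) keeps the projection-down matrix A frozen and
# trains only B, which reduces activation memory during fine-tuning.
for name, param in model.named_parameters():
    if "lora_A" in name:
        param.requires_grad = False

model.print_trainable_parameters()
```

From here, the adapted model can be passed to any standard training loop or `transformers.Trainer`; only the (much smaller) adapter parameters receive gradient updates.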

## Citation

llm-toolkit🛠️ is released under the Apache License 2.0. If you use llm-toolkit, please cite our work:

```bibtex
@misc{zhang2023lorafamemoryefficientlowrankadaptation,
      title={LoRA-FA: Memory-efficient Low-rank Adaptation for Large Language Models Fine-tuning},
      author={Longteng Zhang and Lin Zhang and Shaohuai Shi and Xiaowen Chu and Bo Li},
      year={2023},
      eprint={2308.03303},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2308.03303},
}
```