cd llm-toolkit
pip install .
The toolkit includes minimal use cases for benchmarking, fine-tuning, and evaluation.
llm-toolkit 🛠️ is under the Apache License 2.0. If you use llm-toolkit, please cite our work:
@misc{zhang2023lorafamemoryefficientlowrankadaptation,
  title={LoRA-FA: Memory-efficient Low-rank Adaptation for Large Language Models Fine-tuning},
  author={Longteng Zhang and Lin Zhang and Shaohuai Shi and Xiaowen Chu and Bo Li},
  year={2023},
  eprint={2308.03303},
  archivePrefix={arXiv},
  primaryClass={cs.CL},
  url={https://arxiv.org/abs/2308.03303},
}