
llm-toolkit

Easy to use

```shell
cd llm-toolkit
pip install .
```
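For development, an editable install may be more convenient. This is a minimal sketch assuming the repository lives at the GitHub path shown above (AaronZLT/llm-toolkit) and uses a standard pip-installable project layout:

```shell
# Clone the repository (URL inferred from the repo path AaronZLT/llm-toolkit)
git clone https://github.com/AaronZLT/llm-toolkit.git
cd llm-toolkit
# Editable install: local source changes take effect without reinstalling
pip install -e .
```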

Check the examples to get started with the toolkit. They cover minimal use cases for benchmarking, fine-tuning, and evaluation.

Citation

llm-toolkit🛠️ is released under the Apache License 2.0. If you use llm-toolkit, please cite our work:

@misc{zhang2023lorafamemoryefficientlowrankadaptation,
      title={LoRA-FA: Memory-efficient Low-rank Adaptation for Large Language Models Fine-tuning},
      author={Longteng Zhang and Lin Zhang and Shaohuai Shi and Xiaowen Chu and Bo Li},
      year={2023},
      eprint={2308.03303},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2308.03303},
}
