
Support Llama-api as an LLM provider #10451


Merged

Conversation

seyeong-han
Contributor

Add Llama API support

This PR adds support for the Llama API through a new LlamaAPIConfig model class.

Key features:

  • Built upon OpenAIGPTConfig class using Llama's OpenAI-compatible endpoint (https://api.llama.com/compat/v1/)
  • Supports chat completion and streaming
  • Implements structured output via the "json_schema" method

Current limitations:

  • Tool calling is not yet stable in the Llama API, so it's not supported in this implementation
  • Unit tests were modified to cover only the "json_schema" structured output format

We plan to add support for tool calling in a future update once the API stabilizes.
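Because the provider reuses Llama's OpenAI-compatible endpoint, a request is just a standard chat-completions payload with a `response_format` of type `json_schema`. The sketch below builds such a payload; the model id and schema are illustrative examples, not taken from this PR:

```python
# Sketch of the request body sent to an OpenAI-compatible chat endpoint
# such as https://api.llama.com/compat/v1/. Model name and schema are
# hypothetical placeholders, not values from this PR.

def build_chat_request(model: str, user_message: str, schema: dict) -> dict:
    """Build an OpenAI-style chat-completions payload with json_schema output."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
        "response_format": {
            "type": "json_schema",
            "json_schema": {"name": "answer", "schema": schema},
        },
    }

payload = build_chat_request(
    model="meta_llama/Llama-3.3-70B-Instruct",  # hypothetical model id
    user_message="Give me a city and its country.",
    schema={
        "type": "object",
        "properties": {"city": {"type": "string"}, "country": {"type": "string"}},
        "required": ["city", "country"],
    },
)
```

The provider-specific config class (here, `LlamaAPIConfig`) mainly has to point this payload at the right base URL and strip parameters the upstream API does not support, such as tool calls.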

Type

🆕 New Feature

vercel bot commented Apr 30, 2025

The latest updates on your projects:

litellm — ✅ Ready (updated May 3, 2025 10:01pm UTC)

@CLAassistant

CLAassistant commented Apr 30, 2025

CLA assistant check
All committers have signed the CLA.

@krrishdholakia
Contributor

krrishdholakia commented May 1, 2025

Hey @seyeong-han, can you please add a test for your integration, either inside test_main or test_meta_llama_chat_transformation.py inside tests/litellm — https://github.com/BerriAI/litellm/tree/be885e4a174203fb28fdb6683666b4657b1144d4/tests/litellm
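The core property such a test would check is that a `meta_llama/`-prefixed model routes to Llama's OpenAI-compatible base URL. A minimal standalone sketch of that check, where `resolve_provider` is a hypothetical helper and not litellm's actual API:

```python
# Hypothetical sketch of a provider-routing test: a "meta_llama/" model
# prefix should resolve to Llama's OpenAI-compatible base URL.
# resolve_provider is illustrative only; it is not litellm's real API.
LLAMA_COMPAT_BASE = "https://api.llama.com/compat/v1/"

def resolve_provider(model: str) -> tuple[str, str]:
    """Split 'provider/model' and map the provider prefix to a base URL."""
    provider, _, model_name = model.partition("/")
    if provider == "meta_llama":
        return model_name, LLAMA_COMPAT_BASE
    raise ValueError(f"unknown provider prefix: {provider}")

def test_meta_llama_routing():
    model_name, base_url = resolve_provider("meta_llama/Llama-3.3-70B-Instruct")
    assert base_url == LLAMA_COMPAT_BASE
    assert model_name == "Llama-3.3-70B-Instruct"

test_meta_llama_routing()
```

The real test in tests/litellm would exercise the actual config transformation instead of this standalone helper.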

@ishaan-jaff ishaan-jaff changed the base branch from main to litellm_meta_llama_api May 3, 2025 22:20
Contributor

@ishaan-jaff ishaan-jaff left a comment


LGTM

@ishaan-jaff ishaan-jaff merged commit 9dd1030 into BerriAI:litellm_meta_llama_api May 3, 2025
4 of 6 checks passed
ishaan-jaff added a commit that referenced this pull request May 3, 2025
* Support Llama-api as an LLM provider (#10451)

* init: support llama-api as a llm provider

* docs: fix endpoint url

* fix: rename meta dir to meta-llama

* docs: add meta-llama info

* fix: mv LlamaAPIConfig under chat directory

* feat: add LlamaAPIConfig in ProviderConfigManager

* fix: provider_config from ProviderConfigManager

* feat: add supports_tool_choice param

* fix: remove optional_params using model_info

* fix: rename meta-llama to meta_llama

* init: test for meta_llama

* fix: model names

---------

Co-authored-by: Krish Dholakia <[email protected]>

* fix file naming convention

* fix file naming convention for meta_llama

* docs meta llama api litellm

---------

Co-authored-by: Young Han <[email protected]>
Co-authored-by: Krish Dholakia <[email protected]>
S1LV3RJ1NX pushed a commit that referenced this pull request May 6, 2025