
Use models such as PaliGemma that are not instruction fine-tuned #1519

Open · AayGpt opened this issue Feb 24, 2025 · 3 comments

Comments


AayGpt commented Feb 24, 2025

Hi,

I want to use models such as PaliGemma-3B, which have not been explicitly instruction fine-tuned and hence do not support the OpenAI chat completions API.

Is there a way to run such models with BAML? Thanks!

Some related open issues on other repos that might be useful:

  1. [Bug]: PaliGemma2 not working with OpenAI Docker serve vllm-project/vllm#12052
  2. [Bug]: PaliGemma serving vllm-project/vllm#6644
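For context, chat-tuned endpoints take a `messages` array, while a base model like PaliGemma consumes a single raw prompt. A minimal sketch of the two request shapes (the payloads are illustrative, following OpenAI's chat/completions conventions; they are not BAML internals):

```python
# Chat-completions request body: role-tagged messages, which is what
# instruction-tuned models expect (and what the linked vLLM issues show
# failing for PaliGemma).
chat_request = {
    "model": "some-chat-model",  # hypothetical model name
    "messages": [{"role": "user", "content": "Describe the image."}],
}

# Plain-completions request body: one raw prompt string, which is closer to
# what a base model like PaliGemma actually consumes (it uses short task
# prefixes such as "caption en" rather than chat turns).
completion_request = {
    "model": "paligemma-3b",  # hypothetical served-model name
    "prompt": "caption en",
    "max_tokens": 64,
}
```

A new client integration would essentially need to emit the second shape (plus image data) instead of the first.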
linear bot commented Feb 24, 2025

sxlijin (Collaborator) commented Feb 25, 2025

If they don't support the chat API or completions API, we'd have to add a new client integration type for it. Do you have any pointers to documentation for how to communicate with these models?

AayGpt (Author) commented Mar 6, 2025

Hi. Would something like this help: https://huggingface.co/google/paligemma2-3b-pt-224
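The linked model card drives PaliGemma through `transformers` with short task-prefix prompts rather than a chat template. A minimal sketch of building such prompts (the helper is hypothetical; the prefix formats, e.g. `caption {lang}` and `answer {lang} {question}`, follow the model card):

```python
def build_paligemma_prompt(task, question="", lang="en"):
    """Build a PaliGemma task-prefix prompt (hypothetical helper).

    PaliGemma's model card documents plain prefixes such as
    "caption {lang}" and "answer {lang} {question}" instead of
    role-based chat messages.
    """
    if task == "caption":
        return f"caption {lang}"
    if task == "answer":
        return f"answer {lang} {question}"
    raise ValueError(f"unsupported task: {task}")


# With transformers, the prompt and image then go through the processor,
# roughly as the model card shows (sketch only; downloads model weights):
#
#   from transformers import PaliGemmaProcessor, PaliGemmaForConditionalGeneration
#   model = PaliGemmaForConditionalGeneration.from_pretrained("google/paligemma2-3b-pt-224")
#   processor = PaliGemmaProcessor.from_pretrained("google/paligemma2-3b-pt-224")
#   inputs = processor(text=build_paligemma_prompt("caption"), images=image, return_tensors="pt")
#   out = model.generate(**inputs, max_new_tokens=32)
```

So a client integration would map a BAML prompt onto one of these prefix strings plus an image, rather than onto a `messages` array.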
