Add a new chat_snowflake() provider #258
This commit adds `chat_snowflake()` for chatting with models hosted through Snowflake's Cortex LLM REST API.

On the backend it looks fairly similar to OpenAI, though it supports only basic text functionality, so many advanced ellmer features are not available. It also reuses much of the credential support and utilities from `chat_cortex()`, so this commit includes some minor refactoring of that provider.

Right now the default model for `chat_snowflake()` is Llama 3.1 70B, but we should change it to Claude 3.5 Sonnet when that gets rolled out more widely; it is only available to customers in the `us-west-1` Snowflake region at the moment.

Unit tests are included.
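For context, a minimal usage sketch (assuming Snowflake credentials are already configured in the environment; the explicit model id string below is illustrative, not a confirmed identifier):

```r
library(ellmer)

# Create a chat object backed by Snowflake's Cortex LLM REST API.
# With no arguments, this uses the provider's default model
# (currently Llama 3.1 70B).
chat <- chat_snowflake()
chat$chat("Tell me a joke about SQL.")

# The model can presumably be overridden, e.g. (id is an assumption):
# chat <- chat_snowflake(model = "llama3.1-70b")
```

Since this provider only supports basic text chat, features like tool calling or structured data extraction would error or be ignored.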
Part of #255.