feat: add mistral AI as LLM provider #2496
Merged
Description
This PR adds Mistral AI as a new LLM provider to mem0-ts, allowing users to leverage Mistral's language models as an alternative to OpenAI, Anthropic, and the other supported providers. The implementation follows the existing pattern for LLM providers and properly handles Mistral's response format, including its different content types.
Key changes:
- Added a MistralLLM class that implements the LLM interface (see the sketch below)
- Updated factory.ts to include Mistral as a provider option
- Updated index.ts
- Updated package.json with the Mistral SDK dependency

This addition gives mem0-ts users more flexibility in choosing LLM providers and aligns with the project's goal of supporting multiple AI services.
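As a rough illustration of the approach, here is a minimal sketch of what such a provider could look like, assuming version 1.x of the @mistralai/mistralai SDK and a simplified LLM interface with a single generateResponse method; the actual interface, config shape, and default model in mem0-ts may differ.

```typescript
// Hypothetical sketch -- the LLM interface, config shape, and default model
// below are assumptions, not the exact mem0-ts definitions.
import { Mistral } from "@mistralai/mistralai";

// Simplified stand-in for the provider interface used by mem0-ts.
interface LLM {
  generateResponse(prompt: string): Promise<string>;
}

interface MistralConfig {
  apiKey: string;
  model?: string;
}

export class MistralLLM implements LLM {
  private client: Mistral;
  private model: string;

  constructor(config: MistralConfig) {
    this.client = new Mistral({ apiKey: config.apiKey });
    this.model = config.model ?? "mistral-small-latest";
  }

  async generateResponse(prompt: string): Promise<string> {
    const response = await this.client.chat.complete({
      model: this.model,
      messages: [{ role: "user", content: prompt }],
    });

    const content = response.choices?.[0]?.message?.content ?? "";

    // Mistral can return the message content either as a plain string or as
    // an array of content chunks, so both shapes are normalized to a string.
    if (typeof content === "string") {
      return content;
    }
    return content
      .map((chunk) => (chunk.type === "text" ? chunk.text : ""))
      .join("");
  }
}
```

In factory.ts, the new provider would then be wired in by mapping a "mistral" provider string to this class, mirroring the existing branches for OpenAI, Anthropic, and the other providers.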
Type of change
How Has This Been Tested?
I created a test script at mem0-ts/src/oss/examples/llms/mistral-example.ts that tests both basic chat completion and tool calling functionality. The test validates that the Mistral API integration works correctly for different use cases. The test can be run with:
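Assuming ts-node is available and that the example reads a MISTRAL_API_KEY environment variable (both are assumptions about the local setup), a command along these lines would work:

```bash
# Assumes ts-node is installed and the example reads MISTRAL_API_KEY
# from the environment.
MISTRAL_API_KEY=your-key npx ts-node mem0-ts/src/oss/examples/llms/mistral-example.ts
```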
Checklist:
Maintainer Checklist