Added support for Anthropic prompt cache #1413
base: main
Conversation
Thanks, can you explain the comment "Chatmodel/AnthropicChatOptions not can set header beta to works cache", please?
Anthropic added prompt caching as a beta feature that requires a custom header. I couldn't find a way to send a custom header through the options in the Chatmodel/AnthropicChatOptions classes, which is why I added integration tests only in the AnthropicApiIT.java class. It would be nice if, in the future, there were an option to send a custom header through the ChatModel class for testing beta features, similar to langchain4j langchain4j/langchain4j#1591 (comment). With that kind of option it would be easier to add integration tests for beta features in the AnthropicChatModelIT.java class.
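For context, here is a minimal sketch of the raw request the beta feature required, calling the Anthropic Messages API directly rather than through Spring AI's ChatModel. The `anthropic-beta: prompt-caching-2024-07-31` header and the `cache_control` block follow Anthropic's prompt-caching documentation; the model name, the reusable system text, and the `ANTHROPIC_API_KEY` environment variable are assumptions for illustration only.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class PromptCacheBetaRequest {

    public static void main(String[] args) throws Exception {
        // The large, reusable context goes into a block marked with cache_control,
        // so Anthropic can cache that prefix across calls.
        String body = """
            {
              "model": "claude-3-5-sonnet-20241022",
              "max_tokens": 256,
              "system": [
                {
                  "type": "text",
                  "text": "You are a code reviewer. <large reusable context here>",
                  "cache_control": {"type": "ephemeral"}
                }
              ],
              "messages": [
                {"role": "user", "content": "Review this diff."}
              ]
            }
            """;

        HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create("https://api.anthropic.com/v1/messages"))
            .header("x-api-key", System.getenv("ANTHROPIC_API_KEY"))
            .header("anthropic-version", "2023-06-01")
            // The beta header that AnthropicChatOptions could not set at the time.
            .header("anthropic-beta", "prompt-caching-2024-07-31")
            .header("content-type", "application/json")
            .POST(HttpRequest.BodyPublishers.ofString(body))
            .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
            .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body());
    }
}
```

This is essentially what the AnthropicApiIT.java tests exercise at the API level, since the higher-level chat options offered no way to attach the extra header.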
Hi. Thanks for the clarification. I think it is a bit too much to get into M3, if only because it modifies the AbstractMessage class and I'd like to think through the implications. I guess that, much like function calling, this is going to be "table stakes" for leading model providers. I'll schedule it for M4, with the hope that we keep a 5-ish week release cadence on our march to GA. Thanks for the nice PR.
@Claudio-code @markpollack @tzolov any chance this could be merged sooner than March?
We are waiting for the prompt cache feature too; it should help us save tokens. Our app sends screenshots to Anthropic, and we chose the Spring AI framework to deal with LLMs.
I saw that the cache feature has left beta, so now I can add more complete tests to validate the creation of token caches and the reading of those caches.
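For reference, the Anthropic response reports cache activity in its `usage` block, so tests for cache creation and cache reads can hinge on two counters: `cache_creation_input_tokens` (first call that writes the cache) and `cache_read_input_tokens` (subsequent calls that hit it). The sketch below assumes Jackson for JSON parsing; the method name and the print-style check are illustrative, not the PR's actual test code.

```java
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class CacheUsageCheck {

    /**
     * Inspects the "usage" block of an Anthropic Messages API response.
     * On the first call with a cache_control block, cache_creation_input_tokens
     * should be > 0; on a repeated call with the same cached prefix,
     * cache_read_input_tokens should be > 0 instead.
     */
    static void printCacheUsage(String responseJson) throws Exception {
        JsonNode usage = new ObjectMapper().readTree(responseJson).path("usage");
        long created = usage.path("cache_creation_input_tokens").asLong(0);
        long read = usage.path("cache_read_input_tokens").asLong(0);
        System.out.printf("cache created=%d tokens, cache read=%d tokens%n", created, read);
    }
}
```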
Thank you for taking time to contribute this pull request!
You might have already read the [contributor guide][1], but as a reminder, please make sure to create your pull request against the `main` branch and squash your commits.

Issue #1403
I added support for using the prompt cache with user prompts. I added tests only at the API level, because Chatmodel/AnthropicChatOptions cannot set the beta header needed for the cache to work.
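To illustrate what caching a user prompt means at the API level: the `cache_control` marker sits on a content block inside the user message itself. The JSON shape below follows Anthropic's Messages API; the surrounding class, constant name, and example texts are illustrative and are not the PR's actual code.

```java
public class CachedUserPrompt {

    // A user message whose large document block is marked for caching.
    // The prefix up to and including the marked block is cached;
    // the trailing question stays dynamic between calls.
    static final String MESSAGES = """
        [
          {
            "role": "user",
            "content": [
              {
                "type": "text",
                "text": "<long document the model should keep reusing>",
                "cache_control": {"type": "ephemeral"}
              },
              {
                "type": "text",
                "text": "Summarize the section on error handling."
              }
            ]
          }
        ]
        """;
}
```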