✨ #172 Add Anthropic integration for chat streaming #182
base: develop
Conversation
	"glide/pkg/api/schemas"
	"glide/pkg/providers/clients"
)

type Client struct {
Go, somewhat uniquely compared to other programming languages, allows a struct's methods to be spread across multiple Go files within the same package. In the case of the Anthropic client, for example, it is spread across three files.
Specifically, the client is defined in https://github.com/EinStack/glide/blob/develop/pkg/providers/anthropic/client.go#L22
and it already pulls a lot of useful values like the API key (e.g. c.config.APIKey) and a bunch of other configuration. So those three files are really worth checking in order to reuse that existing client.
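A minimal sketch of the mechanism (hypothetical file and field names, not Glide's actual layout): methods declared in separate files of the same package all attach to the same struct, so new streaming code can reuse the client and config that already exist in client.go.

// client.go
package anthropic

type Config struct {
	APIKey string
}

type Client struct {
	config *Config
}

// chat_stream.go — same package, separate file; the method still has
// full access to the client's existing configuration.
func (c *Client) apiKey() string {
	return c.config.APIKey
}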
}

func (c *Client) ChatStream(ctx context.Context, chatReq *schemas.ChatRequest) (clients.ChatStream, error) {
	apiURL := "https://api.anthropic.com/v1/complete"
Would this existing chatURL field work here? https://github.com/EinStack/glide/blob/develop/pkg/providers/anthropic/client.go
It seems like it's one and the same completion endpoint for both the sync and streaming APIs in Anthropic, just like in OpenAI.
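If so, the hard-coded URL could simply give way to the field the client already carries (a sketch, assuming chatURL is populated for streaming the same way it is for the sync path):

- apiURL := "https://api.anthropic.com/v1/complete"
+ apiURL := c.chatURL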
func (c *Client) ChatStream(ctx context.Context, chatReq *schemas.ChatRequest) (clients.ChatStream, error) {
	apiURL := "https://api.anthropic.com/v1/complete"
	requestBody := map[string]interface{}{
We have this interesting method that combines the configs provided by the user in the Glide provider YAML config with the incoming request data and maps them into a real Anthropic request:
https://github.com/EinStack/glide/blob/develop/pkg/providers/anthropic/chat.go#L82-L88
It would be great to leverage it here.
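A hedged sketch of what leveraging it could look like inside ChatStream() — the helper name below is a placeholder, not the actual name of the method linked above:

	// Build the Anthropic request from the provider's YAML defaults plus the incoming
	// request, instead of assembling a map[string]interface{} by hand.
	chatRequest := c.createRequestSchema(chatReq) // placeholder for the linked helper
	rawPayload, err := json.Marshal(chatRequest)
	if err != nil {
		return nil, fmt.Errorf("unable to marshal chat request payload: %w", err)
	}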
	return true
}

func (c *Client) ChatStream(ctx context.Context, chatReq *schemas.ChatRequest) (clients.ChatStream, error) {
Over the weekend, I was finalizing the ChatStream interface and added this Open()
method where the request initialization is supposed to happen (here is an example from OpenAI). It would be good to move most of this code to AnthropicChatStream.Open()
so we can properly track the initial request latency.
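A rough sketch of how that split could look, with the initial HTTP call living in Open() so first-request latency can be measured there (the struct fields are assumptions, not the actual PR code):

func (s *AnthropicChatStream) Open() error {
	// Issue the initial request here so the caller can track first-byte latency.
	resp, err := s.client.Do(s.req) // s.client / s.req are assumed fields prepared by ChatStream()
	if err != nil {
		return err
	}
	s.response = resp
	return nil
}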
	responseBody io.ReadCloser
}

func (s *AnthropicChatStream) Receive() (string, error) {
Should be called Recv() to fulfil the ChatStream interface:

- func (s *AnthropicChatStream) Receive() (string, error) {
+ func (s *AnthropicChatStream) Recv() (string, error) {

It also should return (*schemas.ChatStreamChunk, error), but I think you will get to that:
https://github.com/EinStack/glide/blob/develop/pkg/providers/clients/stream.go#L9
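Putting the comments in this thread together, the interface to satisfy looks roughly like the following (see the linked stream.go for the authoritative definition; it may carry additional methods such as Close()):

type ChatStream interface {
	Open() error
	Recv() (*schemas.ChatStreamChunk, error)
}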
}

func (c *Client) ChatStream(_ context.Context, _ *schemas.ChatRequest) (clients.ChatStream, error) {
There has to be a ChatStream()
method that creates an instance of AnthropicChatStream in this case.
For example, in the OpenAI case, it looked this way:
https://github.com/EinStack/glide/blob/develop/pkg/providers/openai/chat_stream.go#L162-L177
Without that, there is nothing that would use the AnthropicChatStream struct.
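In other words, something along these lines is still needed on the client (a sketch; NewAnthropicChatStream is a hypothetical constructor, and the real wiring should mirror the linked OpenAI code):

func (c *Client) ChatStream(ctx context.Context, req *schemas.ChatRequest) (clients.ChatStream, error) {
	// Only construct the stream here; the actual HTTP request is issued later in Open().
	return NewAnthropicChatStream(c, req), nil
}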
}

	decoder := json.NewDecoder(s.response.Body)
	var chunk schemas.ChatStreamChunk
schemas.ChatStreamChunk
is Glide's unified schema for a stream chunk, but it's most likely not going to be directly useful for parsing Anthropic chunks. You need to define an Anthropic-specific chunk schema, use it to parse the incoming chunks, and then finally remap the useful fields onto an instance of schemas.ChatStreamChunk.
This is how it's done in the OpenAI case:
https://github.com/EinStack/glide/blob/develop/pkg/providers/openai/chat_stream.go#L115-L147
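A hedged sketch of that shape — the field set below is illustrative and should be adjusted to Anthropic's actual streaming payloads:

// ChatCompletionChunk is an Anthropic-specific view of one streamed event (illustrative fields).
type ChatCompletionChunk struct {
	Type  string `json:"type"`
	Delta struct {
		Text string `json:"text"`
	} `json:"delta"`
}

func toUnifiedChunk(raw []byte) (*schemas.ChatStreamChunk, error) {
	var chunk ChatCompletionChunk
	if err := json.Unmarshal(raw, &chunk); err != nil {
		return nil, err
	}

	// Remap the provider-specific fields onto Glide's unified schema here
	// (field names omitted; see the linked OpenAI code for the full mapping).
	unified := schemas.ChatStreamChunk{}

	return &unified, nil
}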
		return nil, fmt.Errorf("stream not opened")
	}

	decoder := json.NewDecoder(s.response.Body)
We are using just json.Unmarshal()
to unmarshal chunks with the default decoder config:
https://github.com/EinStack/glide/blob/develop/pkg/providers/openai/chat_stream.go#L115
so I feel like the same applies in the Anthropic case, too:
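A sketch of that swap, once each raw event has been extracted from the stream (rawEvent here stands for the bytes of a single chunk):

	var chunk ChatCompletionChunk // the Anthropic-specific type sketched above
	if err := json.Unmarshal(rawEvent, &chunk); err != nil {
		return nil, fmt.Errorf("failed to unmarshal chat stream chunk: %w", err)
	}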
// Recv listens for and decodes incoming messages from the chat stream into ChatStreamChunk objects.
func (s *AnthropicChatStream) Recv() (*schemas.ChatStreamChunk, error) {
	if s.response == nil {
Does Anthropic use server-sent events (SSE) for chat streaming? If so, you need to use a special parser to read that stream, just like OpenAI does:
https://github.com/EinStack/glide/blob/develop/pkg/providers/openai/chat_stream.go#L75-L95
SSE has a special format that has to be parsed before you can even unmarshal the real chunk from JSON into an Anthropic chat stream struct.
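For reference, a generic sketch of pulling the JSON payload out of SSE "data:" lines — the linked OpenAI implementation already has a proper event-stream reader, which is probably the better thing to reuse:

import (
	"bufio"
	"strings"
)

// nextSSEData scans forward to the next "data:" line and returns its payload,
// which is the raw JSON to unmarshal into the Anthropic-specific chunk struct.
func nextSSEData(scanner *bufio.Scanner) ([]byte, bool) {
	for scanner.Scan() {
		line := scanner.Text()
		if strings.HasPrefix(line, "data: ") {
			return []byte(strings.TrimPrefix(line, "data: ")), true
		}
	}
	return nil, false
}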
Pushing the code for the Anthropic integration at the following link:
#172