Increase timeout for non-streaming results (#247)
Fixes #213
hadley authored Jan 10, 2025
1 parent 67b3d30 commit 3e440c2
Showing 2 changed files with 8 additions and 0 deletions.
2 changes: 2 additions & 0 deletions NEWS.md
@@ -1,5 +1,7 @@
 # ellmer (development version)
 
+* `chat_openai()` should be less likely to timeout when not streaming chat results (#213).
+
 # ellmer 0.1.0
 
 * New `chat_vllm()` to chat with models served by vLLM (#140).
6 changes: 6 additions & 0 deletions R/provider-openai.R
@@ -107,6 +107,12 @@ method(chat_request, ProviderOpenAI) <- function(provider,
   req <- req_url_path_append(req, "/chat/completions")
   req <- req_auth_bearer_token(req, provider@api_key)
   req <- req_retry(req, max_tries = 2)
+
+  if (!stream) {
+    # Give extra time for non-streaming responses to complete
+    req <- req_timeout(req, 60)
+  }
+
   req <- req_error(req, body = function(resp) {
     if (resp_content_type(resp) == "application/json") {
       resp_body_json(resp)$error$message