Vercel Long Running task (edge function) #1151
In #1150 (comment), @nilooy mentions that it works when switching to |
Hi @nilooy! According to the Next.js docs, switching to |
Thanks for switching it to an issue. The error mentioned was gone after switching back to. So, as far as I understood, Quirrel calls the API back at the scheduled time (or as a background task); that API is hosted on Vercel and has a timeout limit of 60 s, but I need a long-running task. In the code block below, I want to keep this process running until OpenAI finishes its response stream. The main goal here is to bypass Vercel's 60 s timeout, which can be extended by streaming the response:

```js
export default TestQueue("api/test", async (params) => {
  const response = await openai.createChatCompletion({
    model: "gpt-3.5-turbo",
    stream: true,
    messages: [{ role: "user", content: "explain the next js" }],
  });
  const stream = OpenAIStream(response);
  // Respond with the stream
  return new StreamingTextResponse(stream);
});
```

As mentioned in the Vercel streaming docs: https://vercel.com/docs/concepts/functions/edge-functions/streaming
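To illustrate the streaming idea in isolation, here is a minimal, self-contained sketch (assuming Node 18+ with the Web Streams API; `fakeTokens` is a hypothetical stand-in for the OpenAI token stream, not a real API). It shows how a handler can start responding immediately and keep the connection open while chunks arrive, which is what lets a streamed response outlive a fixed time-to-first-byte limit:

```javascript
const encoder = new TextEncoder();

// Hypothetical stand-in for the OpenAI token stream.
async function* fakeTokens() {
  for (const token of ["Next.js ", "is ", "a ", "React ", "framework."]) {
    yield token;
  }
}

// Wrap an async token source in a ReadableStream, the shape a
// streaming HTTP response body expects.
function toReadableStream(tokens) {
  return new ReadableStream({
    async start(controller) {
      for await (const token of tokens) {
        controller.enqueue(encoder.encode(token));
      }
      controller.close();
    },
  });
}

// Drain the stream back into a string, the way a client would.
async function readAll(stream) {
  const decoder = new TextDecoder();
  let out = "";
  for await (const chunk of stream) {
    out += decoder.decode(chunk, { stream: true });
  }
  return out;
}

readAll(toReadableStream(fakeTokens())).then((text) => console.log(text));
```

In a real handler, the `ReadableStream` would be returned as the response body instead of being drained locally.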
I don't think that's currently possible with Quirrel. You're returning a (see line 720 in 139c74c).
Before we think about solving this, please elaborate on your use case. What's the reason you're accessing OpenAI from a Queue, where you can't send data to your frontend?
The main use case here: I'm making an OpenAI call with quite a large prompt and then parsing the data into JSON; in total the process can take up to 2 minutes. I can't let my users wait that long in the frontend, so I must take the fan-out-jobs path and let the user know when it's done, instead of keeping them on the page for 2 minutes. At the moment there are a few discussions but no valid solution except Defer or a custom one. Some info:
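The fan-out pattern described above can be sketched in a few lines: enqueue the slow OpenAI call as a background job, return a job id immediately, and let the frontend poll (or receive a webhook) for completion. All names here (`createJob`, `getJob`, `slowTask`) and the in-memory store are illustrative assumptions, not Quirrel APIs:

```javascript
const jobs = new Map();
let nextId = 0;

// Enqueue a task and return an id right away; the caller never waits.
function createJob(task) {
  const id = String(++nextId);
  jobs.set(id, { status: "pending", result: null });
  task()
    .then((result) => jobs.set(id, { status: "done", result }))
    .catch((error) => jobs.set(id, { status: "failed", result: String(error) }));
  return id;
}

// The frontend polls this (or an endpoint wrapping it) for completion.
function getJob(id) {
  return jobs.get(id);
}

// Stand-in for the ~2-minute OpenAI call plus JSON parsing.
async function slowTask() {
  return { parsed: true };
}

const id = createJob(slowTask);
console.log(getJob(id).status); // "pending" right after enqueueing
setTimeout(() => console.log(getJob(id).status), 10); // "done" once finished
```

A production version would persist job state in a database or Redis rather than process memory, since serverless instances are ephemeral.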
Makes sense, thank you! I don't think that Quirrel currently supports that, and I'd need to think a bit about the best way of implementing support for these long-running jobs. Have you looked into https://www.inngest.com/; would something like that solve your needs?
OK, perfect. Inngest has a very different way of solving this issue, but with it I'd need to change my entire Next.js API workflow and would end up in strong vendor lock-in. I have another approach I tested with a GCP function; I'll probably go with that, or build a Node.js server with BullMQ.
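For the "Node.js server with BullMQ" route, a minimal sketch could look like the following. The job handler is written as a plain async function so it can be exercised without Redis; the BullMQ wiring (commented out) assumes the `bullmq` package and a running Redis instance, and the queue name and data shape are illustrative:

```javascript
// Placeholder for the real job: call OpenAI with the large prompt and
// parse the streamed output into JSON.
async function handleCompletionJob(data) {
  return { prompt: data.prompt, parsed: true };
}

// Wiring sketch (assumes Redis at 127.0.0.1:6379 and `npm install bullmq`):
// const { Queue, Worker } = require("bullmq");
// const connection = { host: "127.0.0.1", port: 6379 };
// const queue = new Queue("completions", { connection });
// new Worker("completions", (job) => handleCompletionJob(job.data), { connection });
// await queue.add("complete", { prompt: "explain the next js" });

handleCompletionJob({ prompt: "explain the next js" }).then((r) =>
  console.log(r.parsed)
);
```

Because the worker runs on a long-lived Node.js server rather than a serverless function, the 60 s Vercel timeout no longer applies to the job itself.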
@Skn0tt can you please rename the title of the issue to
Discussed in #1150
Originally posted by nilooy June 25, 2023
Is it possible to use a Quirrel queue with a Vercel edge function? I was looking specifically to run this as a background job via Quirrel:
https://github.com/inngest/vercel-ai-sdk/blob/main/examples/next-openai/app/api/chat/route.ts
I tried the following approach
and ran it from another route,
which results in the following error while running: