Open AI - how to throttle usage? #1494
Unanswered
njproductionsus
asked this question in
Support / Q&A
Replies: 1 comment
-
@njproductionsus, unfortunately there are no docs for this use case at the moment (see #1428 (comment)). I'll tag @BeeBombshell in case there are any other workarounds available.
-
I was VERY excited to discover Rowy with Parallel GPT, which allows running the OpenAI API through Rowy with scripts.
The point seems to be running BATCH OpenAI requests. My issue is that I can't figure out how to THROTTLE the derivative scripts so that I don't exceed OpenAI usage limits (primarily tokens per minute for the Vision API, which is currently very low).
I have tried a great number of things using Firebase Firestore with a counter variable, a timestamp variable, etc. On Rowy's script side, I tried to increment the counter and then pause for a certain amount of time once it reaches a certain number, but a PAUSE feature doesn't seem to be available in Rowy, so I am stuck.
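For reference, derivative scripts run ordinary async JavaScript, so even without a built-in pause feature, a Promise-based delay can usually be written inline. This is only a sketch, and `sleep` is a helper name introduced here, not a Rowy API:

```javascript
// A minimal pause helper: resolves after the given number of milliseconds.
// Works in any async JavaScript environment (Node, Cloud Functions, etc.).
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// Hypothetical usage inside a derivative script body:
//   await sleep(60_000); // wait one minute before the next OpenAI call
```

This only pauses one script invocation; it does not by itself coordinate many rows firing at once.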
Is there ANY advice on how, if I have 100 rows added in a column, to set a delay controlling how quickly those per-row scripts run? I am 100% open to any solution in Rowy / Firebase that can create a controllable delay.
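One controllable-delay approach, assuming each row can read its own position (for example from a counter field like the one described above), is to stagger the starts by row index so only a bounded number of requests begin per minute. A sketch; `rowIndex` and the `work` callback are hypothetical names, and `MAX_PER_MINUTE` must be tuned to your own OpenAI budget:

```javascript
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// Allow at most this many requests to start in each one-minute window.
const MAX_PER_MINUTE = 10; // tune to your tokens-per-minute limit

// Delay a row's work based on its index: rows 0-9 start immediately,
// rows 10-19 start after one minute, rows 20-29 after two, and so on.
async function staggeredStart(rowIndex, work) {
  const batch = Math.floor(rowIndex / MAX_PER_MINUTE);
  await sleep(batch * 60_000);
  return work();
}
```

Note this assumes the index is assigned consistently across rows; with 100 rows and `MAX_PER_MINUTE = 10`, the last batch would start roughly nine minutes after the first.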
I've seen a similar thread here that suggests "a queue system that manages the requests and ensures that you stay within the rate limit":
#1428
Is that the only route, and is there any documentation or examples of implementing it?
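I haven't found official docs for that suggestion either, but the queue idea can be sketched in plain JavaScript: run jobs one at a time with a minimum spacing between them, so the request rate stays bounded. This is an assumption-laden sketch, not a Rowy feature, and it only throttles within a single process; coordinating across independent Cloud Function invocations would need something shared, such as a Firestore transaction counter or a task queue:

```javascript
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// A minimal in-process throttled queue: each enqueued job waits for the
// previous job to finish AND for `intervalMs` to elapse before running.
function createThrottledQueue(intervalMs) {
  let chain = Promise.resolve();
  return function enqueue(job) {
    const result = chain.then(job);
    // Advance the chain even if this job fails, then enforce the spacing.
    chain = result.catch(() => {}).then(() => sleep(intervalMs));
    return result;
  };
}

// Hypothetical usage: ~10 requests per minute.
//   const enqueue = createThrottledQueue(6_000);
//   await enqueue(() => callOpenAI(row)); // callOpenAI is a placeholder
```

Because spacing is per request rather than per token, this bounds requests per minute; a tokens-per-minute limit would additionally need the job to report its token usage and the interval adjusted accordingly.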