TL;DR: ChatGPT responses stream in real time when the app is accessed locally, but Cloudflare Tunnel appears to buffer the response, delivering it only after generation completes instead of progressively.
I'm hosting a SvelteKit app on my Raspberry Pi that interacts with OpenAI's ChatGPT API. When accessed locally via its IP, the API responses stream in real-time as expected. However, when accessed through my Cloudflare Tunnel, the responses only appear once the entire message has been generated, rather than progressively streaming.
How to recreate:
Set up a SvelteKit app that calls OpenAI’s ChatGPT API with streaming enabled.
Run the app locally and confirm that responses stream in real-time.
Expose the app using Cloudflare Tunnels.
Access the app through the tunnel URL and observe that responses are delayed until fully generated.
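For context on step 1, a minimal sketch of the kind of endpoint involved (names such as `forwardStream` and `upstream` are hypothetical, not from my actual app): a SvelteKit `+server.ts` handler returns a `Response` wrapping a `ReadableStream`, so chunks from the OpenAI API are flushed to the client as they arrive. Setting `Content-Type: text/event-stream` and `Cache-Control: no-cache` is my attempt to signal to intermediaries that the response should not be buffered.

```typescript
// Hypothetical helper for a SvelteKit +server.ts handler: forward an
// upstream streaming body (e.g. the OpenAI API response body) to the
// client unchanged, with headers that discourage intermediary buffering.
function forwardStream(upstream: ReadableStream<Uint8Array>): Response {
  return new Response(upstream, {
    headers: {
      // Marks the response as a server-sent event stream.
      'Content-Type': 'text/event-stream',
      'Cache-Control': 'no-cache',
    },
  });
}
```

Locally this streams token-by-token exactly as expected; the behavior only changes once the request goes through the tunnel.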
Is there a way to ensure real-time streaming works properly when routing through Cloudflare Tunnels?