
ChatGPT API Responses Not Streaming in Real-Time via Cloudflare Tunnel #1404

Open
cowboycodr opened this issue Jan 30, 2025 · 0 comments

TL;DR: ChatGPT responses stream in real-time when the app is accessed locally, but when it is accessed through a Cloudflare Tunnel the tunnel appears to buffer the stream, delivering the response only after it has been fully generated.

I'm hosting a SvelteKit app on my Raspberry Pi that interacts with OpenAI's ChatGPT API. When accessed locally via its IP, the API responses stream in real-time as expected. However, when accessed through my Cloudflare Tunnel, the responses only appear once the entire message has been generated, rather than progressively streaming.
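For reference, a minimal sketch of the kind of streaming endpoint involved (the `/api/chat` route path, the model name, and the use of the official `openai` npm package are placeholders here, not my exact code):

```ts
// src/routes/api/chat/+server.ts — simplified sketch of a streaming endpoint
import OpenAI from 'openai';
import type { RequestHandler } from './$types';

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

export const POST: RequestHandler = async ({ request }) => {
	const { messages } = await request.json();

	// Ask OpenAI for a streamed completion.
	const completion = await openai.chat.completions.create({
		model: 'gpt-4o-mini', // placeholder model
		messages,
		stream: true
	});

	// Re-emit each token as soon as it arrives from the API.
	const stream = new ReadableStream({
		async start(controller) {
			const encoder = new TextEncoder();
			for await (const chunk of completion) {
				const token = chunk.choices[0]?.delta?.content ?? '';
				if (token) controller.enqueue(encoder.encode(token));
			}
			controller.close();
		}
	});

	return new Response(stream, {
		headers: {
			'Content-Type': 'text/plain; charset=utf-8',
			// Intended to discourage intermediaries from caching or transforming the stream.
			'Cache-Control': 'no-cache, no-transform'
		}
	});
};
```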

How to reproduce:

  1. Set up a SvelteKit app that calls OpenAI’s ChatGPT API with streaming enabled.
  2. Run the app locally and confirm that responses stream in real-time.
  3. Expose the app using Cloudflare Tunnels.
  4. Access the app through the tunnel URL and observe that responses are delayed until fully generated (see the quick check after this list).
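A quick way to see the difference in steps 2 and 4 is to log each chunk with a timestamp from the browser console or a small script (the URL and request body below are placeholders):

```ts
// When run against the local address, chunks are logged incrementally as tokens arrive;
// when run against the tunnel hostname, everything is logged at once at the end.
const res = await fetch('/api/chat', {
	method: 'POST',
	headers: { 'Content-Type': 'application/json' },
	body: JSON.stringify({ messages: [{ role: 'user', content: 'Hello' }] })
});

const reader = res.body!.getReader();
const decoder = new TextDecoder();
for (;;) {
	const { done, value } = await reader.read();
	if (done) break;
	console.log(Date.now(), decoder.decode(value, { stream: true }));
}
```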

Is there a way to ensure real-time streaming works properly when routing through Cloudflare Tunnels?
