
[Bug]: Got exception from REDIS Connection closed by server #9024

Open
langgg0511 opened this issue Mar 6, 2025 · 1 comment
Labels
bug Something isn't working

Comments

@langgg0511

What happened?

```yaml
general_settings:
  disable_master_key_return: true
  disable_reset_budget: true
  disable_retry_on_max_parallel_request_limit_error: true
  master_key: "sk-123"
  alerting: ["slack"]
  alert_types: ["llm_exceptions", "llm_requests_hanging"]
  database_url: ""
  database_connection_pool_limit: 5
  database_connection_timeout: 60
  proxy_batch_write_at: 60
  store_model_in_db: true
litellm_settings:
  ssl_verify: false
  json_logs: true
  cache: true
  cache_params:
    type: redis
    namespace: "litellm_caching"
    host: "redis.db"
    port: 6379
    ttl: 86400
    supported_call_types: ["acompletion", "atext_completion", "aembedding", "atranscription"]
  drop_params: true
  num_retries: 0
  redact_user_api_key_info: true
```
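The error in the log below typically surfaces when the Redis server drops an idle connection between cache writes. As a hedged illustration of the failure and one mitigation pattern (the names here are hypothetical helpers, not LiteLLM internals — LiteLLM itself logs the cache error and continues), a minimal retry wrapper around an async cache write could look like:

```python
import asyncio


class ConnectionClosedError(ConnectionError):
    """Stand-in for redis-py's 'Connection closed by server.' error."""


async def set_cache_with_retry(write_fn, retries=2, backoff=0.1):
    """Retry an async cache write when the server drops the connection.

    write_fn: zero-argument coroutine function performing the write.
    Hypothetical helper for illustration only.
    """
    for attempt in range(retries + 1):
        try:
            return await write_fn()
        except ConnectionClosedError:
            if attempt == retries:
                raise
            # Exponential backoff before reconnecting and retrying.
            await asyncio.sleep(backoff * (2 ** attempt))


# Simulate a write that fails once (idle connection closed), then succeeds.
calls = {"n": 0}


async def flaky_write():
    calls["n"] += 1
    if calls["n"] == 1:
        raise ConnectionClosedError("Connection closed by server.")
    return "OK"


result = asyncio.run(set_cache_with_retry(flaky_write))
print(result)  # OK after one retry
```

Note that `num_retries: 0` in the config above governs LLM call retries, not cache writes, so it would not suppress this behavior either way.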

Relevant log output

```json
{"message": "LiteLLM Redis Caching: async set_cache_pipeline() - Got exception from REDIS Connection closed by server., Writing value=None", "level": "ERROR", "timestamp": "2025-03-06T09:31:49.040450"}
```
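redis-py raises "Connection closed by server." when the peer closes the socket mid-operation; a common server-side cause is the Redis `timeout` directive closing idle clients. As a sketch (assuming an idle-timeout cause, which this issue does not confirm), the relevant `redis.conf` settings are:

```
# redis.conf (assumption: server-side idle timeout is the cause)
timeout 0          # 0 = never close idle client connections
tcp-keepalive 300  # send TCP keepalives every 300 seconds
```

Checking the current value with `redis-cli CONFIG GET timeout` would help confirm or rule this out.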

Are you a ML Ops Team?

No

What LiteLLM version are you on ?

v1.61.3

Twitter / LinkedIn details

No response

@langgg0511 langgg0511 added the bug Something isn't working label Mar 6, 2025
@PRANJALRANA11

hey
