Add support for Amazon Bedrock from AWS #1249
Comments
This would help me enormously - specifically Claude 3 Sonnet via Bedrock (and later Opus).
+1 Please consider Bedrock support now that Opus is supported. It would help people who use company AWS infrastructure for everything.
+1
+1
+1 should be fairly straightforward since the Anthropic API is already integrated
+1
I would greatly appreciate it if you could support calling the Claude model using AWS IAM keys.
+1 to this! Would really appreciate Bedrock support.
+1 could be really useful to have this!
+1 would be useful for employees who are only allowed to share code artifacts with Bedrock's internal models.
I found https://continue.dev to be a good replacement for Cursor that works with Bedrock out of the box.
Disappointing that this still isn't implemented. May have to test out continue.dev in that case.
Much needed now that Sonnet is gaining popularity.
So GitHub Models came out yesterday. This is their inference platform, similar to Amazon Bedrock. Google has Model Garden. Please support all of these. But yes, yes, yes, please do add support for Amazon Bedrock first ;)
Much needed
I would also like to bump this. I'm working in a corporate environment where everything must go through our AWS landing zone. This would open up Cursor for corporate scenarios where strict backend control is required.
+1
subscribed +1
+1 As a workaround I suggest using this repository as a proxy: Then in Cursor, override the OpenAI base URL and set the custom API key.
@l4time I tried following your workaround and I've verified that the proxy is working. However, I wasn't able to use Cursor's chat function. I wonder if you figured out a way around this.
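For anyone attempting the proxy workaround above, this is the general shape of what "override the OpenAI base URL" means. A minimal sketch, assuming a hypothetical OpenAI-compatible proxy listening on localhost; the address, API key, and model name are placeholders, not values from this thread:

```python
import json
import urllib.request

# Hypothetical proxy address -- replace with wherever your
# OpenAI-compatible Bedrock proxy is actually listening.
PROXY_BASE_URL = "http://localhost:8000/v1"
API_KEY = "dummy-key"  # many proxies accept any non-empty key

def build_chat_request(prompt: str) -> urllib.request.Request:
    """Build (but do not send) an OpenAI-style chat completion request
    aimed at the proxy instead of api.openai.com."""
    payload = {
        # The proxy maps this name onto a Bedrock model behind the scenes.
        "model": "anthropic.claude-3-sonnet",
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{PROXY_BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
        method="POST",
    )

req = build_chat_request("hello")
```

This only builds the request object; actually sending it requires the proxy to be running. The point is that a client which lets you swap the base URL (as Cursor's OpenAI override does) needs no other changes to talk to a Bedrock-backed proxy.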
+1
+1
7 similar comments
+1
+1
+1
+1
+1
+1
+1
Really looking forward to this feature. It would be huge.
Is Cursor going to add this?
+1
3 similar comments
+1
+1
+1
+1 All the workarounds trying to adapt models to the OpenAI API feel too fragile.
+1
14 similar comments
+1
+1
+1
+1
+1
+1
+1
+1
+1
+1
+1
+1
+1
+1
AWS Bedrock was just added to Zed if you are looking for an alternative: zed-industries/zed#21092
+1
+1 would really appreciate this
+1 Necessary for company use
@truell20 @Sanger2000 PLEASE
+1
Using Continue so far, but it would be awesome if Cursor could support this. I work in a company with highly sensitive data, so we would want to encapsulate models within our network and have specific guardrails. We use the config below for chat, plus tab autocomplete and embeddings, all using the bedrock provider. It works as long as you are logged in with the AWS CLI on your local machine. `"models": [
I will try the approach mentioned by Sugi275 (https://docs.anthropic.com/en/api/claude-on-amazon-bedrock) and see if it works with Cursor until then.
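To make the Bedrock approach above concrete: Bedrock's Anthropic integration accepts a Messages-API-style JSON body with a Bedrock-specific `anthropic_version` field. The sketch below only builds that body; the version string and the model ID in the comment follow AWS's published Claude-on-Bedrock examples and should be verified against the current Bedrock docs:

```python
import json

def build_bedrock_claude_body(prompt: str, max_tokens: int = 1024) -> str:
    """Build the JSON body for a Bedrock InvokeModel call against a
    Claude model. Actually sending it requires an AWS SDK (e.g. boto3's
    bedrock-runtime client) and IAM credentials, which are omitted here."""
    body = {
        # Bedrock-specific version string from AWS's Claude examples.
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    }
    return json.dumps(body)

# With boto3 (not shown), this body would be passed roughly as:
#   client.invoke_model(
#       modelId="anthropic.claude-3-sonnet-20240229-v1:0",
#       body=build_bedrock_claude_body("hello"))
```

This is why commenters call the integration "fairly straightforward": the request shape is nearly identical to Anthropic's own Messages API, with IAM credentials replacing the Anthropic API key.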
+1
1 similar comment
+1
+10086
Is your feature request related to a problem? Please describe.
Currently, Cursor only supports OpenAI models. I would like to use AWS models from Amazon Bedrock, such as Amazon Titan, Anthropic Claude, or Meta Llama 2.
Describe the solution you'd like
In the Cursor Settings, I would like to be able to connect to my AWS account, and configure my Bedrock model.
Additional context
Why only OpenAI? :(
