
Add support for Amazon Bedrock from AWS #1249

Open
dgallitelli opened this issue Feb 21, 2024 · 158 comments

Comments

@dgallitelli

Is your feature request related to a problem? Please describe.
Currently, only OpenAI models are supported by Cursor. I would like to use AWS models from Amazon Bedrock, such as Amazon Titan, Anthropic Claude, or Meta Llama 2.

Describe the solution you'd like
In the Cursor Settings, I would like to be able to connect to my AWS account, and configure my Bedrock model.

Additional context

Why only OpenAI? :(
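(For context, a minimal sketch of what calling a Bedrock-hosted Claude model with plain AWS credentials looks like via boto3; the region and model ID below are only examples, not a suggestion of how Cursor would implement it.)

import json
import boto3

# Bedrock is authenticated with ordinary AWS credentials (profile, role, or keys)
# rather than a vendor API key; region and model ID here are illustrative.
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock.invoke_model(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",
    body=json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 256,
        "messages": [{"role": "user", "content": "Hello, world"}],
    }),
)
print(json.loads(response["body"].read())["content"][0]["text"])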

@forkfork

This would help me enormously - specifically Claude 3 Sonnet via Bedrock (and later Opus).

@stirredo

+1

Please consider Bedrock support now that Opus is supported. It would help people who use company AWS infrastructure for everything.

@arvehisa

+1
Would very much appreciate it if Cursor could support Claude 3 via Amazon Bedrock.

@josegtmonteiro

+1

@yeralin

yeralin commented May 29, 2024

+1. Should be fairly straightforward since the Anthropic API is already integrated.

@Sugi275

Sugi275 commented Jun 21, 2024

+1
As shown at the URL below, the Anthropic SDK can call Claude models through Amazon Bedrock.
https://docs.anthropic.com/en/api/claude-on-amazon-bedrock

from anthropic import AnthropicBedrock

client = AnthropicBedrock(
    # Authenticate by either providing the keys below or use the default AWS credential providers, such as
    # using ~/.aws/credentials or the "AWS_SECRET_ACCESS_KEY" and "AWS_ACCESS_KEY_ID" environment variables.
    aws_access_key="<access key>",
    aws_secret_key="<secret key>",
    # Temporary credentials can be used with aws_session_token.
    # Read more at https://docs.aws.amazon.com/IAM/latest/UserGuide/id_credentials_temp.html.
    aws_session_token="<session_token>",
    # aws_region changes the aws region to which the request is made. By default, we read AWS_REGION,
    # and if that's not present, we default to us-east-1. Note that we do not read ~/.aws/config for the region.
    aws_region="us-west-2",
)

message = client.messages.create(
    model="anthropic.claude-3-5-sonnet-20240620-v1:0",
    max_tokens=256,
    messages=[{"role": "user", "content": "Hello, world"}]
)
print(message.content)

I would greatly appreciate it if you could support calling Claude models using AWS IAM keys.

@wilsonhou

+1 to this! would really appreciate bedrock support.

@wooferclaw

+1 could be really useful to have this!

@tonyrusignuolo

+1 would be useful for employees since they are only allowed to share code artifacts with Bedrock's internal models.

@yeralin

yeralin commented Jul 21, 2024

@tonyrusignuolo

+1 would be useful for employees since they are only allowed to share code artifacts with Bedrock's internal models.

I found https://continue.dev a good replacement for Cursor that works with Bedrock out of the box.

@gwailoTr0n5000

Disappointing that this still isn't implemented. May have to test out continue.dev in that case.

@KaliCharan-V

Much needed now that Sonnet is gaining popularity.

@metaskills

So GitHub Models came out yesterday. This is their inference platform similar to Amazon Bedrock. Google has Model Garden. Please support all of these. But yes, yes, yes please do add support for Amazon Bedrock first ;)

@vinodvarma24

Much needed

@refactorthis

I would also like to bump this. Working in a corporate environment where everything must go through our AWS LZ. This would open up Cursor for corporate scenarios where strict backend control is required.

@jubinpyli

+1

@binarycrayon

subscribed +1

@l4time

l4time commented Aug 23, 2024

+1

As a workaround, I suggest using this repository as a proxy:
https://github.com/aws-samples/bedrock-access-gateway

Then, in Cursor, override the OpenAI base URL and set the custom API key.
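If it helps, here is a minimal sketch of how the gateway is consumed once deployed; the base URL below is a placeholder for wherever you host it, and the key is whatever you configured at deploy time. The same base URL and key are what you would paste into Cursor's OpenAI settings.

from openai import OpenAI

client = OpenAI(
    base_url="https://your-gateway.example.com/api/v1",  # placeholder for your deployed gateway endpoint
    api_key="<gateway api key>",                          # the key you set when deploying the gateway
)

response = client.chat.completions.create(
    model="anthropic.claude-3-5-sonnet-20240620-v1:0",  # a Bedrock model ID served through the gateway
    messages=[{"role": "user", "content": "Hello from Bedrock via the gateway"}],
)
print(response.choices[0].message.content)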

@kingstarfly

+1

As a workaround I suggest using this repository to work as a proxy: aws-samples/bedrock-access-gateway

Then, in Cursor, override the OpenAI base URL and set the custom API key.

@l4time I tried following your workaround and I've verified that the proxy is working.

However, I wasn't able to use Cursor's chat function. It says: "Seems like we are having an issue with your API key - please confirm the base URL is publicly accessible and the API key is correct in the settings. If this persists, please email us at [email protected]."

I wonder if you figured out a way around this.

@adityapapu

adityapapu commented Aug 24, 2024

+1

I would also like to bump this. Working in a corporate environment where everything must go through our AWS LZ. This would open up Cursor for corporate scenarios where strict backend control is required.

@Thandden

+1

7 similar comments
@dzhou1221

+1

@Eternaux

+1

@martin-nginio

+1

@ron137

ron137 commented Aug 26, 2024

+1

@orangewise

+1

@JurgenLangbroek

+1

@davidshtian

+1

@juan-abia

juan-abia commented Aug 29, 2024

Really looking forward to this feature. It would be huge.

@kevin-longe-unmind

Is Cursor going to add this?

@EthanChen39

+1

3 similar comments
@FloRul

FloRul commented Feb 9, 2025

+1

@fanfan2289

+1

@LeoLuo0115

+1

@tonisyvanen

+1, all the workarounds trying to adapt models to the OpenAI API feel too fragile.

@nickw-adonis

+1

14 similar comments
@ArmandoArias

+1

@arbiyanto

+1

@JakeSelby

+1

@dpkjnr

dpkjnr commented Feb 17, 2025

+1

@existeundelta

+1

@kjs-aws

kjs-aws commented Feb 19, 2025

+1

@miodrage

+1

@CrazyFunker

+1

@godott

godott commented Feb 25, 2025

+1

@HwangJohn

+1

@mameen-omar

+1

@pathetic

+1

@billykwok

+1

@Philipp-hinderberger

+1

@ChrisDryden

AWS Bedrock was just added to Zed if you are looking for an alternative: zed-industries/zed#21092

@qasimalitrilogy

+1

@abdul-050

+1, would really appreciate this.

@sevdesk-ryanp

+1 Necessary for company use

@fuhcq3

fuhcq3 commented Feb 27, 2025

@truell20 @Sanger2000 PLEASE

@miverbec

+1

@vvyas-arcus

vvyas-arcus commented Feb 27, 2025

Using Continue so far, but it would be awesome if Cursor could support this. I work at a company with highly sensitive data, so we want to keep models within our own network and apply specific guardrails.

We use the config below, plus tabAutocomplete and embeddings, all using the bedrock provider. It works as long as you are logged in with the AWS CLI on your local machine.

`"models": [
{

  "title": "Claude 3.7 Sonnet from AWS",
  "provider": "bedrock",
  "model": "us.anthropic.claude-3-7-sonnet-20250219-v1:0",
  "region": "us-east-1",
  "profile": "bedrock"
  
}`

Until then, I will try the approach mentioned by Sugi275 (https://docs.anthropic.com/en/api/claude-on-amazon-bedrock) and see if it works with Cursor.

@rudnypc

rudnypc commented Feb 28, 2025

+1

1 similar comment
@upman

upman commented Mar 1, 2025

+1

@darcula1993

+10086
