Add support for AzureOpenAI #1161
Question - I'm not super familiar with the Azure OpenAI client. Can you not use the normal OpenAI client and just change the API endpoint and auth params? How does the Azure OpenAI client differ?
@ivanleomk This is very simple:

```python
import instructor
from openai import AzureOpenAI

llm = AzureOpenAI(...)  # your params
client = instructor.from_openai(llm)
```

In fact, any LLM which is compatible with the OpenAI SDK can use this method.
Here is a more precise example:

```python
import instructor
from openai import AzureOpenAI
from pydantic import BaseModel


# Define your data model
class UserInfo(BaseModel):
    name: str
    age: int


# Initialize the Azure OpenAI client with your credentials
client = AzureOpenAI(
    api_key="your-azure-api-key",
    azure_endpoint="https://your-resource-name.openai.azure.com",  # your Azure endpoint
    api_version="2024-02-01",  # required by AzureOpenAI; use the version your resource supports
    # add your other params here
)

# Patch the client with instructor
client = instructor.from_openai(client)

# Use the patched client to extract structured data
user_info = client.chat.completions.create(
    model="your-deployed-model-name",  # the name you gave to your deployed model
    response_model=UserInfo,
    messages=[
        {"role": "user", "content": "John Doe is 30 years old."}
    ],
)

print(user_info.name)  # Output: John Doe
print(user_info.age)   # Output: 30
```
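The same pattern should carry over to async usage. This is a sketch assuming the `AsyncAzureOpenAI` class from the `openai` package and the same placeholder credentials and deployment name as above:

```python
import asyncio

import instructor
from openai import AsyncAzureOpenAI
from pydantic import BaseModel


class UserInfo(BaseModel):
    name: str
    age: int


# Same placeholder credentials as the sync example above.
client = instructor.from_openai(
    AsyncAzureOpenAI(
        api_key="your-azure-api-key",
        azure_endpoint="https://your-resource-name.openai.azure.com",
        api_version="2024-02-01",
    )
)


async def main() -> None:
    # The patched async client is awaited instead of called synchronously.
    user_info = await client.chat.completions.create(
        model="your-deployed-model-name",
        response_model=UserInfo,
        messages=[{"role": "user", "content": "John Doe is 30 years old."}],
    )
    print(user_info)


asyncio.run(main())
```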
Thank you so much.
Closing this issue since this is possible with the default OpenAI integration.