
Commit 0a352fa

Merge pull request #10 from paulyuk/main

Modernized for updated Langchain, Flex Consumption and MI

2 parents 7c091bb + 4251003


43 files changed: +899 −2240 lines

README.md

Lines changed: 36 additions & 10 deletions

````diff
@@ -26,23 +26,28 @@ This sample shows how to take a human prompt as HTTP Get or Post input, calculat
 ### Pre-reqs
 1) [Python 3.8+](https://www.python.org/)
 2) [Azure Functions Core Tools](https://learn.microsoft.com/en-us/azure/azure-functions/functions-run-local?tabs=v4%2Cmacos%2Ccsharp%2Cportal%2Cbash#install-the-azure-functions-core-tools)
-3) [Azure OpenAPI API key, endpoint, and deployment](https://portal.azure.com)
-4) Add this `local.settings.json` file to this folder to simplify local development and include Key from step 3
+3) [Azure Developer CLI](https://aka.ms/azd)
+4) Once you have your Azure subscription, run the following in a new terminal window to create Azure OpenAI and other resources needed:
+
+```bash
+azd provision
+```
+
+Take note of the value of `AZURE_OPENAI_ENDPOINT` which can be found in `./.azure/<env name from azd provision>/.env`. It will look something like:
+
+```bash
+AZURE_OPENAI_ENDPOINT="https://cog-<unique string>.openai.azure.com/"
+```
 
-`./local.settings.json`
+5) Add this `local.settings.json` file to the root of the repo folder to simplify local development. Replace `AZURE_OPENAI_ENDPOINT` with your value from step 4. Optionally you can choose a different model deployment in `AZURE_OPENAI_CHATGPT_DEPLOYMENT`. This file will be gitignored to protect secrets from committing to your repo; however, by default the sample uses Entra identity (user identity and managed identity) so it is secretless.
 ```json
 {
   "IsEncrypted": false,
   "Values": {
     "FUNCTIONS_WORKER_RUNTIME": "python",
     "AzureWebJobsFeatureFlags": "EnableWorkerIndexing",
-    "AzureWebJobsStorage": "",
-    "AZURE_OPENAI_KEY": "...",
-    "AZURE_OPENAI_ENDPOINT": "https://<service_name>.openai.azure.com/",
-    "AZURE_OPENAI_SERVICE": "...",
-    "AZURE_OPENAI_CHATGPT_DEPLOYMENT": "...",
-    "OPENAI_API_VERSION": "2023-05-15",
-    "USE_LANGCHAIN": "True"
+    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
+    "AZURE_OPENAI_ENDPOINT": "https://<your deployment>.openai.azure.com/",
+    "AZURE_OPENAI_CHATGPT_DEPLOYMENT": "chat",
+    "OPENAI_API_VERSION": "2023-05-15"
   }
 }
 ```
@@ -120,3 +125,24 @@ To provision and deploy:
 ```bash
 azd up
 ```
+
+## Source Code
+
+The key code that makes the prompting and completion work is as follows in [function_app.py](function_app.py). The `/api/ask` function and route expects a prompt to come in the POST body using a standard HTTP Trigger in Python. Then once the environment variables are set to configure OpenAI and LangChain frameworks, we can leverage favorite aspects of LangChain. In this simple example we take a prompt, build a better prompt from a template, and then invoke the LLM. By default the LLM deployment is `gpt-35-turbo` as defined in [./infra/main.parameters.json](./infra/main.parameters.json) but you can experiment with other models.
+
+```python
+llm = AzureChatOpenAI(
+    deployment_name=AZURE_OPENAI_CHATGPT_DEPLOYMENT,
+    temperature=0.3,
+    openai_api_key=AZURE_OPENAI_KEY
+)
+llm_prompt = PromptTemplate.from_template(
+    "The following is a conversation with an AI assistant. " +
+    "The assistant is helpful.\n\n" +
+    "A:How can I help you today?\nHuman: {human_prompt}?"
+)
+formatted_prompt = llm_prompt.format(human_prompt=prompt)
+
+response = llm.invoke(formatted_prompt)
+logging.info(response.content)
+```
````
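Once `func start` is serving the app locally, the `/api/ask` route described in the README can be exercised with a short client script. The sketch below uses only the Python standard library; the port (7071) is the Core Tools default and the URL is an assumption based on the route name above. Note that because the route's `auth_level` is now `function`, a *deployed* app also requires a function key (for example via an `x-functions-key` header), which local Core Tools does not enforce.

```python
import json
import urllib.request

# Core Tools serves functions on http://localhost:7071 by default; adjust if yours differs.
ASK_URL = "http://localhost:7071/api/ask"


def build_ask_request(prompt: str, url: str = ASK_URL) -> urllib.request.Request:
    # The /api/ask route reads the prompt from the JSON POST body.
    body = json.dumps({"prompt": prompt}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


def ask(prompt: str) -> str:
    # Performs the actual HTTP call; requires the Function App to be running.
    with urllib.request.urlopen(build_ask_request(prompt)) as resp:
        return resp.read().decode("utf-8")


request = build_ask_request("What is a good feature of Azure Functions?")
print(request.full_url, request.get_method())
```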

azure.yaml

Lines changed: 1 addition & 1 deletion

```diff
@@ -2,7 +2,7 @@
 
 name: langchain-py-ai-func
 metadata:
-  template: langchain-py-ai-func@0.0.1-beta
+  template: langchain-py-ai-func@1.0.0
 services:
   api:
     project: ./
```

function_app.py

Lines changed: 60 additions & 57 deletions

```diff
@@ -2,76 +2,79 @@
 import logging
 import os
 import openai
-from langchain.prompts import PromptTemplate
-from langchain.llms import OpenAI
-from langchain.llms.openai import AzureOpenAI
+from langchain_core.prompts import PromptTemplate
+from langchain_openai import AzureChatOpenAI
+from azure.identity import DefaultAzureCredential
 
 app = func.FunctionApp()
 
-@app.function_name(name='ask')
-@app.route(route='ask', auth_level='anonymous', methods=['POST'])
-def main(req):
 
-    prompt = req.params.get('prompt')
-    if not prompt:
-        try:
-            req_body = req.get_json()
-        except ValueError:
-            raise RuntimeError("prompt data must be set in POST.")
-        else:
-            prompt = req_body.get('prompt')
-            if not prompt:
-                raise RuntimeError("prompt data must be set in POST.")
+# Initializes Azure OpenAI environment
+def init():
+    global credential
+    global AZURE_OPENAI_ENDPOINT
+    global AZURE_OPENAI_KEY
+    global AZURE_OPENAI_CHATGPT_DEPLOYMENT
+    global OPENAI_API_VERSION
+
+    # Use the Entra Id DefaultAzureCredential to get the token
+    credential = DefaultAzureCredential()
+    # Set the API type to `azure_ad`
+    os.environ["OPENAI_API_TYPE"] = "azure_ad"
+    # Set the API_KEY to the token from the Azure credential
+    os.environ["OPENAI_API_KEY"] = credential.get_token(
+        "https://cognitiveservices.azure.com/.default"
+    ).token
 
-    # init OpenAI: Replace these with your own values, either in environment variables or directly here
-    USE_LANGCHAIN = os.getenv("USE_LANGCHAIN", 'True').lower() in ('true', '1', 't')
-    AZURE_OPENAI_KEY = os.environ.get("AZURE_OPENAI_KEY")
+    # Initialize Azure OpenAI environment
     AZURE_OPENAI_ENDPOINT = os.environ.get("AZURE_OPENAI_ENDPOINT")
-    AZURE_OPENAI_SERVICE = os.environ.get("AZURE_OPENAI_SERVICE") or "myopenai"
-    AZURE_OPENAI_GPT_DEPLOYMENT = os.environ.get("AZURE_OPENAI_GPT_DEPLOYMENT") or "davinci"
-    AZURE_OPENAI_CHATGPT_DEPLOYMENT = os.environ.get("AZURE_OPENAI_CHATGPT_DEPLOYMENT") or "chat" #GPT turbo
-    if 'AZURE_OPENAI_KEY' not in os.environ:
-        raise RuntimeError("No 'AZURE_OPENAI_KEY' env var set. Please see Readme.")
+    AZURE_OPENAI_KEY = credential.get_token(
+        "https://cognitiveservices.azure.com/.default"
+    ).token
+    AZURE_OPENAI_CHATGPT_DEPLOYMENT = os.environ.get(
+        "AZURE_OPENAI_CHATGPT_DEPLOYMENT") or "chat"
+    OPENAI_API_VERSION = os.environ.get(
+        "OPENAI_API_VERSION") or "2023-05-15"
 
-    # configure azure openai for langchain and/or llm
+    # Configure base OpenAI framework for LangChain and/or llm
     openai.api_key = AZURE_OPENAI_KEY
-    openai.api_base = AZURE_OPENAI_ENDPOINT # your endpoint should look like the following https://YOUR_RESOURCE_NAME.openai.azure.com/
-    openai.api_type = 'azure'
-    openai.api_version = '2023-05-15' # this may change in the future
-    # for langchain, set this version in environment variables using OPENAI_API_VERSION
+    openai.api_base = AZURE_OPENAI_ENDPOINT
+    openai.api_type = "azure"
+    openai.api_version = OPENAI_API_VERSION
 
-    if bool(USE_LANGCHAIN):
-        logging.info('Using Langchain')
 
-        llm = AzureOpenAI(deployment_name=AZURE_OPENAI_CHATGPT_DEPLOYMENT, temperature=0.3, openai_api_key=AZURE_OPENAI_KEY)
-        llm_prompt = PromptTemplate(
-            input_variables=["human_prompt"],
-            template="The following is a conversation with an AI assistant. The assistant is helpful.\n\nAI: I am an AI created by OpenAI. How can I help you today?\nHuman: {human_prompt}?",
-        )
-        from langchain.chains import LLMChain
-        chain = LLMChain(llm=llm, prompt=llm_prompt)
-        return chain.run(prompt)
-
-    else:
-        logging.info('Using ChatGPT LLM directly')
+# Initialize Azure OpenAI environment
+init()
 
-        completion = openai.Completion.create(
-            engine=AZURE_OPENAI_CHATGPT_DEPLOYMENT,
-            prompt=generate_prompt(prompt),
-            temperature=0.3,
-            max_tokens=200
-        )
-        return completion.choices[0].text
 
+# Function App entry point route for /api/ask
+@app.function_name(name="ask")
+@app.route(route="ask", auth_level="function", methods=["POST"])
+def main(req):
 
-def generate_prompt(prompt):
-    capitalized_prompt = prompt.capitalize()
+    try:
+        req_body = req.get_json()
+        prompt = req_body.get("prompt")
+    except ValueError:
+        raise RuntimeError("prompt data must be set in POST.")
+    else:
+        if not prompt:
+            raise RuntimeError("prompt data must be set in POST.")
 
-    # Chat
-    return f'The following is a conversation with an AI assistant. The assistant is helpful, creative, clever, and very friendly.\n\nHuman: Hello, who are you?\nAI: I am an AI created by OpenAI. How can I help you today?\nHuman: {capitalized_prompt}'
+    # LangChain user code goes here
+    llm = AzureChatOpenAI(
+        deployment_name=AZURE_OPENAI_CHATGPT_DEPLOYMENT,
+        temperature=0.3
+    )
+    llm_prompt = PromptTemplate.from_template(
+        "The following is a conversation with an AI assistant. " +
+        "The assistant is helpful.\n\n" +
+        "A:How can I help you today?\n" +
+        "Human: {human_prompt}?"
+    )
+    formatted_prompt = llm_prompt.format(human_prompt=prompt)
 
-    # Classification
-    #return 'The following is a list of companies and the categories they fall into:\n\nApple, Facebook, Fedex\n\nApple\nCategory: '
+    response = llm.invoke(formatted_prompt)
+    logging.info(response.content)
 
-    # Natural language to Python
-    #return '\"\"\"\n1. Create a list of first names\n2. Create a list of last names\n3. Combine them randomly into a list of 100 full names\n\"\"\"'
+    return func.HttpResponse(response.content)
```
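One caveat with the `init()` change above: `credential.get_token(...)` returns a bearer token with a finite lifetime (typically around an hour), and the new code captures it once at module load, so a long-lived worker can start failing authentication after the token lapses. The azure-identity package offers `get_bearer_token_provider` for exactly this, and `AzureChatOpenAI` can consume such a callable through its `azure_ad_token_provider` parameter. The stdlib-only sketch below illustrates the cache-and-refresh idea behind such a provider; the `fetch_token` callable and all names here are hypothetical, not part of this repo.

```python
import time
from typing import Callable, Tuple


def make_refreshing_token_provider(
    fetch_token: Callable[[], Tuple[str, float]],
    skew_seconds: float = 300.0,
    clock: Callable[[], float] = time.time,
) -> Callable[[], str]:
    """Return a zero-arg callable that caches a bearer token and re-fetches it
    shortly before expiry. fetch_token returns (token, expires_on_epoch_seconds)."""
    cached = {"token": None, "expires_on": 0.0}

    def provider() -> str:
        # Refresh when no token is cached, or when we are within the skew
        # window of the cached token's expiry time.
        if cached["token"] is None or clock() >= cached["expires_on"] - skew_seconds:
            cached["token"], cached["expires_on"] = fetch_token()
        return cached["token"]

    return provider
```

In a real app, `get_bearer_token_provider(DefaultAzureCredential(), "https://cognitiveservices.azure.com/.default")` plays the role of `provider` and refreshes tokens for you on each call.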

infra/app/ai-Cog-Service-Access.bicep

Lines changed: 21 additions & 0 deletions

```diff
@@ -0,0 +1,21 @@
+param principalID string
+param principalType string = 'ServicePrincipal' // Workaround for https://learn.microsoft.com/en-us/azure/role-based-access-control/role-assignments-template#new-service-principal
+param roleDefinitionID string
+param aiResourceName string
+
+resource cognitiveService 'Microsoft.CognitiveServices/accounts@2023-05-01' existing = {
+  name: aiResourceName
+}
+
+// Allow access from API to this resource using a managed identity and least priv role grants
+resource roleAssignment 'Microsoft.Authorization/roleAssignments@2022-04-01' = {
+  name: guid(cognitiveService.id, principalID, roleDefinitionID)
+  scope: cognitiveService
+  properties: {
+    roleDefinitionId: resourceId('Microsoft.Authorization/roleDefinitions', roleDefinitionID)
+    principalId: principalID
+    principalType: principalType
+  }
+}
+
+output ROLE_ASSIGNMENT_NAME string = roleAssignment.name
```

infra/app/ai.bicep

Lines changed: 0 additions & 38 deletions
This file was deleted.

infra/app/api.bicep

Lines changed: 28 additions & 25 deletions

```diff
@@ -1,47 +1,50 @@
 param name string
 param location string = resourceGroup().location
 param tags object = {}
-
-param allowedOrigins array = []
 param applicationInsightsName string = ''
 param appServicePlanId string
 param appSettings object = {}
-param keyVaultName string
+param runtimeName string
+param runtimeVersion string
 param serviceName string = 'api'
 param storageAccountName string
-param openAiAccountName string
-param openAiResourceGroupName string
+param deploymentStorageContainerName string
+param virtualNetworkSubnetId string = ''
+param instanceMemoryMB int = 2048
+param maximumInstanceCount int = 100
+param identityId string = ''
+param identityClientId string = ''
+param aiServiceUrl string = ''
+
+var applicationInsightsIdentity = 'ClientId=${identityClientId};Authorization=AAD'
 
-module api '../core/host/functions.bicep' = {
-  name: '${serviceName}-functions-python-module'
+module api '../core/host/functions-flexconsumption.bicep' = {
+  name: '${serviceName}-functions-module'
   params: {
     name: name
     location: location
    tags: union(tags, { 'azd-service-name': serviceName })
-    allowedOrigins: allowedOrigins
-    alwaysOn: false
-    appSettings: union(appSettings, {
-      AZURE_OPENAI_KEY: openai.listKeys().key1
+    identityType: 'UserAssigned'
+    identityId: identityId
+    appSettings: union(appSettings,
+      {
+        AzureWebJobsStorage__clientId : identityClientId
+        APPLICATIONINSIGHTS_AUTHENTICATION_STRING: applicationInsightsIdentity
+        AZURE_OPENAI_ENDPOINT: aiServiceUrl
+        AZURE_CLIENT_ID: identityClientId
     })
     applicationInsightsName: applicationInsightsName
     appServicePlanId: appServicePlanId
-    keyVaultName: keyVaultName
-    //py
-    numberOfWorkers: 1
-    minimumElasticInstanceCount: 0
-    //--py
-    runtimeName: 'python'
-    runtimeVersion: '3.9'
+    runtimeName: runtimeName
+    runtimeVersion: runtimeVersion
     storageAccountName: storageAccountName
-    scmDoBuildDuringDeployment: false
+    deploymentStorageContainerName: deploymentStorageContainerName
+    virtualNetworkSubnetId: virtualNetworkSubnetId
+    instanceMemoryMB: instanceMemoryMB
+    maximumInstanceCount: maximumInstanceCount
  }
 }
 
-resource openai 'Microsoft.CognitiveServices/accounts@2023-05-01' existing = {
-  name: openAiAccountName
-  scope: resourceGroup(openAiResourceGroupName)
-}
-
-output SERVICE_API_IDENTITY_PRINCIPAL_ID string = api.outputs.identityPrincipalId
 output SERVICE_API_NAME string = api.outputs.name
 output SERVICE_API_URI string = api.outputs.uri
+output SERVICE_API_IDENTITY_PRINCIPAL_ID string = api.outputs.identityPrincipalId
```

infra/app/db.bicep

Lines changed: 0 additions & 31 deletions
This file was deleted.
