Commit

more docs
jxnl committed Jul 9, 2023
1 parent b0c3c56 commit abe6ebc
Showing 7 changed files with 102 additions and 51 deletions.
2 changes: 1 addition & 1 deletion docs/chat-completion.md
@@ -1,4 +1,4 @@
# Using the Chatcompletion
# Using the Prompt Pipeline

To get started with this API we must first instantiate a `ChatCompletion` object and build the API call
by piping messages and functions to it.
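
A minimal sketch of what that piping can look like, assuming the `dsl` helpers used in the pipeline example (`ChatCompletion`, `SystemTask`, `TaggedMessage`) and that an `OpenAISchema` can be piped in as the output schema; this only builds the call.

```python
from pydantic import Field

from openai_function_call import OpenAISchema, dsl

class UserDetails(OpenAISchema):
    """Details of a user"""
    name: str = Field(..., description="user's full name")
    age: int

# Configuration lives on ChatCompletion; messages and the output schema are piped in.
task = (
    dsl.ChatCompletion(name="Extract user details", model="gpt-3.5-turbo-0613")
    | dsl.SystemTask(task="Extract user details from my requests")
    | dsl.TaggedMessage(content="My name is John Doe and I'm 30 years old.")
    | UserDetails
)
```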
3 changes: 0 additions & 3 deletions docs/help.md

This file was deleted.

80 changes: 46 additions & 34 deletions docs/index.md
@@ -1,6 +1,19 @@
# Welcome to OpenAI Function Call

We try to provide a powerful and efficient approach to output parsing when interacting with OpenAI's Function Call API, one that is framework agnostic and minimizes dependencies. It leverages the data validation capabilities of the Pydantic library to handle output parsing in a more structured and reliable manner. If you have any feedback, leave an issue or hit me up on [twitter](https://twitter.com/jxnlco).
We offer a minimally invasive extension of `Pydantic.BaseModel` named `OpenAISchema`. It only has two methods: one to generate the correct schema, and one to produce the class from the completion.

This library is mostly a list of examples and a helper class, so I'll keep the example as just structured extraction.

If the OpenAI SDK is a chef's knife of code, I hope to sell you a nice handle that comes with a little pamphlet of cutting techniques.

It leverages the data validation capabilities of the Pydantic library to handle output parsing in a more structured and reliable manner.

If you have any feedback, leave an issue or hit me up on [twitter](https://twitter.com/jxnlco).

If you're looking for something more batteries-included, I strongly recommend [MarvinAI](https://www.askmarvin.ai/), which offers a high-level API but does not provide as much access to prompting.

!!! tip "Just rip it out!"
If you don't want to install dependencies, I recommend literally ripping `function_calls.py` into your own codebase. [[source code]](https://github.com/jxnl/openai_function_call/blob/main/openai_function_call/function_calls.py)

## Installation

@@ -10,9 +23,39 @@ pip install openai_function_call

## Usage

This module simplifies the interaction with the OpenAI API, enabling more structured outputs. Below are examples showcasing the use of function calls and schemas with OpenAI and Pydantic. In later modules we'll go over a wide array of more creative uses.
Below are examples showcasing the use of function calls and schemas with OpenAI and Pydantic. In later docs we'll go over a wide array of more creative uses.

### Example 1: Function Calls
### Example 1: Extraction

!!! tip
Prompts are now sourced from docstrings and descriptions, so write clear and descriptive documentation!

```python
import openai
from openai_function_call import OpenAISchema

from pydantic import Field

class UserDetails(OpenAISchema):
    """Details of a user"""
    name: str = Field(..., description="user's full name")
    age: int

completion = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0613",
    functions=[UserDetails.openai_schema],
    function_call={"name": UserDetails.openai_schema["name"]},
    messages=[
        {"role": "system", "content": "Extract user details from my requests"},
        {"role": "user", "content": "My name is John Doe and I'm 30 years old."},
    ],
)

user_details = UserDetails.from_response(completion)
print(user_details)  # name="John Doe", age=30
```

### Example 2: Function Calls

```python
import openai
@@ -43,34 +86,3 @@ completion = openai.ChatCompletion.create(
result = sum.from_response(completion)
print(result) # 9
```
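
The collapsed lines above define the `sum` function whose call is being parsed. As a rough, self-contained sketch, assuming the library exposes an `openai_function` decorator that attaches `openai_schema` and `from_response` to a plain function (the decorator is not shown in the visible lines here):

```python
import openai

from openai_function_call import openai_function  # assumed decorator, not shown above

@openai_function
def sum(a: int, b: int) -> int:
    """Adds a and b together."""
    return a + b

completion = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0613",
    functions=[sum.openai_schema],
    messages=[
        {"role": "system", "content": "You must use the function calls to answer questions."},
        {"role": "user", "content": "What is 6 + 3? Use the `sum` function."},
    ],
)

# from_response parses the arguments out of the completion and calls the function.
result = sum.from_response(completion)
print(result)  # 9
```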

### Example 2: Schema Extraction

```python
import openai
from openai_function_call import OpenAISchema

from pydantic import Field

class UserDetails(OpenAISchema):
    """Details of a user"""
    name: str = Field(..., description="user's full name")
    age: int

completion = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0613",
    functions=[UserDetails.openai_schema],
    function_call={"name": UserDetails.openai_schema["name"]},
    messages=[
        {"role": "system", "content": "Extract user details from my requests"},
        {"role": "user", "content": "My name is John Doe and I'm 30 years old."},
    ],
)

user_details = UserDetails.from_response(completion)
print(user_details)  # name="John Doe", age=30
```

# Code

::: openai_function_call
15 changes: 14 additions & 1 deletion docs/multitask.md
@@ -1,5 +1,18 @@
# MultiTask

We define a helper function `MultiTask` that dynamically creates a new schema with a task attribute defined as a list of the task subclass; it includes some prebuilt prompts and saves us from writing extra code.
Defining a task and creating a list of classes is a common enough pattern that we define a helper function `MultiTask`. It dynamically creates a new schema with a task attribute defined as a list of the task subclass, includes some prebuilt prompts, and saves us from writing extra code.

!!! example "Extending user details"

Using the previous example of extracting `UserDetails`, we might want to extract multiple users rather than a single user; `MultiTask` makes this easy!

```python
class UserDetails(OpenAISchema):
    """Details of a user"""
    name: str = Field(..., description="user's full name")
    age: int

MultiUserDetails = MultiTask(UserDetails)
```
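
A short usage sketch of `MultiUserDetails`, assuming the schema produced by `MultiTask` exposes the same `openai_schema` and `from_response` interface as any other `OpenAISchema`:

```python
import openai

from pydantic import Field

from openai_function_call import OpenAISchema, dsl

class UserDetails(OpenAISchema):
    """Details of a user"""
    name: str = Field(..., description="user's full name")
    age: int

MultiUserDetails = dsl.MultiTask(subtask_class=UserDetails)

completion = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0613",
    functions=[MultiUserDetails.openai_schema],
    function_call={"name": MultiUserDetails.openai_schema["name"]},
    messages=[
        {"role": "system", "content": "Extract all user details from my requests"},
        {"role": "user", "content": "John Doe is 30 years old and Jane Doe is 28."},
    ],
)

# The result should hold a list of UserDetails rather than a single instance.
multi_user_details = MultiUserDetails.from_response(completion)
print(multi_user_details)
```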

::: openai_function_call.dsl.multitask
11 changes: 6 additions & 5 deletions docs/openai_schema.md
@@ -1,15 +1,16 @@
# OpenAI Schema

The most generic helper is `OpenAISchema`, a lightweight extension of Pydantic's `BaseModel`.
It has methods to help you produce the schema and parse the result of function calls.

This library is mostly a list of examples and a helper class, so I'll keep the example as just structured extraction.
We offer a minimally invasive extension of `Pydantic.BaseModel` named `OpenAISchema`. It only has two methods: one to generate the correct schema, and one to produce the class from the completion.

!!! note "Where does the prompt go?"
Instead of defining your prompts in the messages, the prompts you would usually use are now defined as part of the docstring of your class and the field descriptions. This is nice since it allows you to colocate the prompts with the class you use to represent the structure.
Our philosophy is that the prompt should live beside the code. Prompting is done via docstrings and field descriptions, which allows you to colocate prompts with your schema.

## Structured Extraction

You can use the class directly in your `openai` create calls by passing in the class's `openai_schema` and extracting the result with `from_response`.

With this style of usage you get as close to the API call as possible, giving you full control over configuration and prompting.

```python
import openai
from openai_function_call import OpenAISchema
36 changes: 32 additions & 4 deletions docs/pipeline-example.md
@@ -1,8 +1,12 @@
# Using the pipeline
# Using the ChatCompletion pipeline

The pipe API is some syntactic sugar to help build prompts in a readable way that avoids having to remember best practices around wording and structure. Examples include adding tips, tagging data with XML, or even including the chain-of-thought prompt as an assistant message.
The pipeline API is just syntactic sugar to help build prompts in a readable way that avoids having to remember best practices around wording and structure. Examples include adding tips, tagging data with XML, or even including the chain-of-thought prompt as an assistant message.

### Example Pipeline
## Example Pipeline

Here we'll define a task to segment queries and add some more instructions via the prompt pipeline API.

### Designing the schema

```python
from openai_function_call import OpenAISchema, dsl
@@ -19,10 +23,31 @@ class SearchQuery(OpenAISchema):
SearchResponse = dsl.MultiTask(
    subtask_class=SearchQuery,
)
```

!!! tip "MultiTask"
To learn more about what `MultiTask` does, check out the [MultiTask](multitask.md) documentation.


### Building our prompts

We don't deal with prompt templates; we treat chat, message, and output schema as first-class citizens and pipe them into a completion object.

!!! note "What's that?"
The pipe `|` is an overloaded operator that lets us cleanly compose our prompts.

`ChatCompletion` contains all the configuration for the model, while we use `|` to build our prompt.

We can then chain `|` together to add `Messages` or an `OpenAISchema`, and `ChatCompletion` will build out the query for us while giving us a readable block of code to look at.

To see what 'message templates' are available, check out our [docs](chat-completion.md).

```python
task = (
    dsl.ChatCompletion(name="Segmenting Search requests example")
    dsl.ChatCompletion(
        name="Segmenting Search requests example",
        model='gpt-3.5-turbo-0613',
        max_token=1000)
    | dsl.SystemTask(task="Segment search results")
    | dsl.TaggedMessage(
        content="can you send me the data about the video investment and the one about spot the dog?",
@@ -42,6 +67,9 @@ assert isinstance(search_request, SearchResponse)
print(search_request.json(indent=2))
```

!!! tip
If you want to see what's actually sent to OpenAI, scroll to the bottom of the page!

Output

```json
6 changes: 3 additions & 3 deletions mkdocs.yml
@@ -35,8 +35,8 @@ nav:
- Home: 'index.md'
- API Reference:
  - 'OpenAISchema': 'openai_schema.md'
  - "Helper: MultiTask": "multitask.md"
  - "Example: Pipeline API": "pipeline-example.md"
  - "Docs": "chat-completion.md"
  - "MultiTask Schema": "multitask.md"
  - "Introduction: Pipeline API": "pipeline-example.md"
  - "Message Templates": "chat-completion.md"
- Examples:
  - 'Segmented Search': 'examples/search.md'
