Merge pull request #14 from tsterbak/dev
Release 0.1.3
tsterbak authored Oct 3, 2024
2 parents 76bc960 + dac2f25 commit 61ae300
Showing 22 changed files with 980 additions and 29 deletions.
16 changes: 15 additions & 1 deletion README.md
@@ -62,7 +62,21 @@ To use promptmage, run the following command:
promptmage run <path-to-flow>
```

This will start the promptmage server and run the flow at the given path. You can now access the promptmage interface at `http://localhost:8000/gui/`.
This will start the local promptmage server and run the flow at the given path. You can now access the promptmage interface at `http://localhost:8000/gui/`.

To run the remote backend server, run the following command:

```bash
promptmage serve --port 8021
```

To make it work with your promptmage script, add the following lines:

```python
from promptmage import PromptMage

mage = PromptMage(remote="http://localhost:8021") # or the URL of your remote server
```

Have a look at the examples in the [examples](https://github.com/tsterbak/promptmage/tree/main/examples) folder to see how to use promptmage in your application or workflow.

23 changes: 23 additions & 0 deletions docs/getting-started.md
@@ -69,6 +69,29 @@ promptmage run flow.py

This will start the promptmage server and run the flow at the given path. You can now access the promptmage interface at `http://localhost:8000/gui/`.

## Usage with a remote backend server

For a production setup and collaborative usage with teams, you can run the promptmage server with a remote backend. To start the backend on a remote server, run the following command:

```bash
promptmage serve --port 8021
```

To connect your promptmage script to the remote backend, add the `remote` URL to the `PromptMage` instance in your script:

```python
mage = PromptMage(
name="example",
remote="http://localhost:8021" #(1)!
)
```

Now you can run your script and the promptmage server will use the remote backend to run the flow and store the results.

1. The `remote` parameter is used to specify the URL of the remote backend to use. If this is set, the `PromptMage` instance will use the remote backend instead of the local one.



## GUI walkthrough

The promptmage interface is divided into four main sections: the flow playground, the run history, the prompt repository, and the evaluation section.
89 changes: 89 additions & 0 deletions docs/reference.md
@@ -2,6 +2,92 @@

This page contains the API reference with the most important classes and methods of promptmage.


## PromptMage CLI

The `promptmage` CLI is the command-line interface for running the promptmage server and interacting with the promptmage backend.

### version
Show the installed promptmage version.

Usage:
```bash
promptmage version
```

### run
Run the flow at the given path. A flow is a Python script that defines the flow of a promptmage application.

Usage:
```bash
promptmage run <path-to-flow>
```

Available options:
- **`--port`** (`int`):
The port to run the server on. Default is `8000`.
- **`--host`** (`str`):
The host to run the server on. Default is `localhost`.

### serve
Start the promptmage backend server.

Usage:
```bash
promptmage serve
```

Available options:
- **`--port`** (`int`):
The port to run the server on. Default is `8021`.
- **`--host`** (`str`):
The host to run the server on. Default is `localhost`.

### export
Export the promptmage database to JSON.

Usage:
```bash
promptmage export --filename <filename>
```

Available options:
- **`--filename`** (`str`):
The filename to export the database to.
- **`--runs`** (`bool`):
Whether to export the runs as well. Default is `False`.
- **`--prompts`** (`bool`):
Whether to export the prompts as well. Default is `False`.
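
Since the export is plain JSON, it can be inspected with the standard library alone. This is only a sketch: the filename and the top-level keys shown are assumptions for illustration, not promptmage's actual export schema:

```python
import json
from pathlib import Path

def list_top_level_keys(path: str) -> list:
    """Load an exported promptmage database and list its top-level keys."""
    data = json.loads(Path(path).read_text(encoding="utf-8"))
    return sorted(data) if isinstance(data, dict) else []

# Write a stand-in export file (hypothetical schema) so the sketch runs end to end:
Path("promptmage_export.json").write_text(
    json.dumps({"prompts": [], "runs": []}), encoding="utf-8"
)
print(list_top_level_keys("promptmage_export.json"))
```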

### backup
Back up the promptmage database to a JSON file.

Usage:
```bash
promptmage backup --json_path <json_path>
```

Available options:
- **`--json_path`** (`str`):
The path to the JSON file to back up the database to.

### restore
Restore the promptmage database from a JSON file.

!!! warning

This will ask for confirmation before restoring and will overwrite the current database.

Usage:
```bash
promptmage restore --json_path <json_path>
```

Available options:
- **`--json_path`** (`str`):
The path to the JSON file to restore the database from.


## PromptMage `class`

The `PromptMage` class is the main class of promptmage. It is used to store all the information about the flow and to run it.
@@ -11,6 +97,9 @@ The `PromptMage` class is the main class of promptmage. It is used to store all the
- **name** (`str`):
The name of the `PromptMage` instance.

- **remote** (`str`):
The URL of the remote backend to use. If this is set, the `PromptMage` instance will use the remote backend instead of the local one.

- **available_models** (`List[str]`):
A list of available models to use for the flow.

10 changes: 5 additions & 5 deletions docs/roadmap.md
@@ -10,19 +10,19 @@

### September

- [ ] Implement a remote backend for PromptMage
- [ ] Improve error handling and reporting
- [ ] More complex use-case examples
- [x] Implement a remote backend for PromptMage
- [x] Improve error handling and reporting

### October
- [ ] More complex use-case examples
- [ ] Implement a robust task queue for LLM calls

### November

- [ ]
- [ ] Implement automatic evaluation with llm-as-a-judge

### December

- [ ]
- [ ] more to come!

## 2025
85 changes: 85 additions & 0 deletions examples/summarize_article_by_facts_remote.py
@@ -0,0 +1,85 @@
import json
from typing import List
from openai import OpenAI
from dotenv import load_dotenv

from promptmage import PromptMage, Prompt, MageResult

load_dotenv()


client = OpenAI()

# Create a new PromptMage instance
mage = PromptMage(
name="fact-extraction",
available_models=["gpt-4o", "gpt-4o-mini", "gpt-4-turbo", "gpt-4", "gpt-3.5-turbo"],
    remote="http://localhost:8021",
)


# Application code #


@mage.step(name="extract", prompt_name="extract_facts", initial=True)
def extract_facts(
article: str, focus: str | None, prompt: Prompt, model: str = "gpt-4o-mini"
) -> List[MageResult]:
"""Extract the facts as a bullet list from an article."""
response = client.chat.completions.create(
model=model,
messages=[
{"role": "system", "content": prompt.system},
{
"role": "user",
"content": prompt.user.format(article=article, focus=focus),
},
],
)
raw_facts = response.choices[0].message.content
raw_facts = raw_facts.replace("```json", "").strip("```").strip()
return [
MageResult(next_step="check_facts", fact=str(f)) for f in json.loads(raw_facts)
]


@mage.step(
name="check_facts",
prompt_name="check_facts",
)
def check_facts(fact: str, prompt: Prompt, model: str = "gpt-4o-mini") -> MageResult:
"""Check the extracted facts for accuracy."""
response = client.chat.completions.create(
        model=model,
messages=[
{"role": "system", "content": prompt.system},
{
"role": "user",
"content": prompt.user.format(fact=fact),
},
],
)
return MageResult(
next_step="summarize",
check_results=f"Fact: {fact}\n\nCheck result: {response.choices[0].message.content}",
)


@mage.step(
name="summarize",
prompt_name="summarize_facts",
many_to_one=True,
)
def summarize_facts(check_results: str, prompt: Prompt) -> MageResult:
"""Summarize the given facts as a single sentence."""
response = client.chat.completions.create(
model="gpt-4o-mini",
messages=[
{"role": "system", "content": prompt.system},
{
"role": "user",
"content": prompt.user.format(check_result=check_results),
},
],
)
return MageResult(result=response.choices[0].message.content)
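
The `extract_facts` step above strips a possible markdown code fence from the model output before parsing it as JSON. A self-contained sketch of that cleanup step (the sample output string is invented for illustration):

```python
import json
from typing import List

def parse_fact_list(raw: str) -> List[str]:
    """Strip an optional ```json markdown fence and parse the remaining JSON array."""
    cleaned = raw.replace("```json", "").strip("`").strip()
    return [str(fact) for fact in json.loads(cleaned)]

# Hypothetical model output, wrapped in a fence the way chat models often format JSON:
raw_output = '```json\n["The sky is blue.", "Water boils at 100 C."]\n```'
print(parse_fact_list(raw_output))
```

Note that `str.strip("`")` only removes backticks from the ends of the string, which is why the `"```json"` prefix is handled by `replace` first.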
4 changes: 4 additions & 0 deletions promptmage/api.py
@@ -129,6 +129,10 @@ async def list_steps():
response_model=EndpointResponse,
tags=[flow.name],
)
# add a websocket for the flow
app.add_websocket_route(
f"/api/{slugify(flow.name)}/ws", flow.websocket_handler
)

return app

69 changes: 68 additions & 1 deletion promptmage/cli.py
@@ -9,8 +9,10 @@
from promptmage import __version__, title
from promptmage.utils import get_flows
from promptmage.api import PromptMageAPI
from promptmage.remote import RemoteBackendAPI
from promptmage.frontend import PromptMageFrontend
from promptmage.storage import SQLiteDataBackend, SQLitePromptBackend
from promptmage.storage.utils import backup_db_to_json, restore_db_from_json


@click.group()
@@ -32,7 +34,7 @@ def version():
exists=True,
),
)
@click.option("--host", default="0.0.0.0", help="The host IP to run the server on.")
@click.option("--host", default="localhost", help="The host IP to run the server on.")
@click.option("--port", default=8000, type=int, help="The port to run the server on.")
@click.option(
"--browser",
@@ -117,9 +119,74 @@ def export(runs: bool = False, prompts: bool = False, filename: str = "promptmag
click.echo("Export complete.")


@click.command()
@click.option("--host", help="The host IP to run the server on.", default="localhost")
@click.option("--port", help="The port to run the server on.", default=8021)
def serve(host: str, port: int):
"""Serve the PromptMage collaborative backend and frontend."""
logger.info(f"\nWelcome to\n{title}")
logger.info(f"Running PromptMage backend version {__version__}")
# create the .promptmage directory to store all the data
dirPath = Path(".promptmage")
dirPath.mkdir(mode=0o777, parents=False, exist_ok=True)

# create the FastAPI app
backend = RemoteBackendAPI(
url=f"http://{host}:{port}",
data_backend=SQLiteDataBackend(),
prompt_backend=SQLitePromptBackend(),
)
app = backend.get_app()

# run the applications
uvicorn.run(app, host=host, port=port, log_level="info")


@click.command()
@click.option(
"--json_path",
    type=click.Path(),
help="The path to write the JSON file containing the database backup.",
required=True,
)
def backup(json_path: str):
"""Backup the database from the PromptMage instance to json."""
click.echo(f"Backing up the database to '{json_path}'...")
backup_db_to_json(db_path=".promptmage/promptmage.db", json_path=json_path)
click.echo("Backup complete.")


@click.command()
@click.option(
"--json_path",
type=click.Path(
exists=True,
),
help="The path to the JSON file containing the database backup.",
required=True,
)
def restore(json_path: str):
"""Restore the database from json to the PromptMage instance."""
click.echo(f"Restoring the database from the backup '{json_path}'...")
# check if the database already exists
if Path(".promptmage/promptmage.db").exists():
click.confirm(
"Are you sure you want to overwrite the current database?",
abort=True,
)
# restore the database
restore_db_from_json(db_path=".promptmage/promptmage.db", json_path=json_path)
click.echo("Database restored successfully.")


promptmage.add_command(version)
promptmage.add_command(run)
promptmage.add_command(export)
promptmage.add_command(serve)
promptmage.add_command(backup)
promptmage.add_command(restore)


if __name__ == "__main__":
10 changes: 10 additions & 0 deletions promptmage/frontend/components/runs_page.py
@@ -154,6 +154,16 @@ def display_comparison():
"flex-grow: 1; display: flex; flex-direction: column;"
):
ui.label(f"step_run_id: {run_data['step_run_id']}")
ui.label("Prompt:").classes("text-lg")
with ui.row().classes("w-full"):
with ui.column().classes("gap-0"):
ui.label("Version:").classes("text-sm text-gray-500")
ui.label("System Prompt:").classes("text-sm text-gray-500")
ui.label("User Prompt:").classes("text-sm text-gray-500")
with ui.column().classes("gap-0 w-1/2"):
ui.label(f"{run.prompt.version}")
ui.label(f"{run.prompt.system}")
ui.label(f"{run.prompt.user}")
ui.label("Output Data:").classes("text-lg")
try:
for key, value in run.output_data.items():