
Commit

Merge pull request #4 from tsterbak/dev
Release 0.1.1
tsterbak authored Aug 23, 2024
2 parents ecad0f6 + 2c8586a commit f559656
Showing 38 changed files with 1,379 additions and 601 deletions.
2 changes: 1 addition & 1 deletion README.md
@@ -28,7 +28,7 @@

The approach with "PromptMage" is to provide a pragmatic solution that bridges the current gap in LLM workflow management. We aim to empower developers, researchers, and organizations by making LLM technology more accessible and manageable, thereby supporting the next wave of AI innovations.

![PromptMage](docs/images/screenshots/playground-finished.png)
![PromptMage](https://github.com/tsterbak/promptmage/tree/main/docs/images/screenshots)

Take the [walkthrough](https://promptmage.io/walkthrough/) to see what you can do with PromptMage.

119 changes: 119 additions & 0 deletions docs/getting-started.md
@@ -0,0 +1,119 @@

# Getting Started

## Installation

To install promptmage, run the following command:

```bash
pip install promptmage
```

## Annotated Code Example

Here is an example of how to use promptmage in your application:

``` python
from promptmage import PromptMage, Prompt, MageResult

# Create a new promptmage instance
mage = PromptMage(#(1)!
name="example",#(2)!
)
```

1. The [`PromptMage`](/reference/#promptmage) class is the main class of promptmage. It is used to store all the information about the flow and to run it.
2. The `name` parameter gives the promptmage instance a unique name. This allows you to run multiple promptmage instances in parallel.

Steps are the building blocks of a flow. They define the different parts of the flow and connect them. A step is just a Python function with the [`@mage.step()`](/reference/#promptmagestep) decorator which returns a [`MageResult`](/reference/#mageresult). Here is an example of how to create a step:

``` python
@mage.step(
name="step1", #(1)!
prompt_name="prompt1", #(2)!
initial=True #(3)!
)
def step1(question: str, prompt: Prompt) -> MageResult: #(4)!
response = client.chat.completions.create( #(5)!
model="gpt-4o-mini",
messages=[
{"role": "system", "content": prompt.system},
{
"role": "user",
"content": prompt.user.format(question=question),
},
],
)
answer = response.choices[0].message.content
return MageResult(
next_step=None, #(6)!
result=answer
)
```

1. The `name` parameter is used to give the step a unique name.
2. The `prompt_name` parameter is used to specify the name of the prompt that should be used for this step.
3. The `initial` parameter is used to specify if this is the initial step of the flow.
4. The `step1` function is a step that takes a question and a prompt as input and returns a [`MageResult`](/reference/#mageresult) with the result of the step and the name of the next step to run. The prompt is managed by the promptmage instance and is automatically passed to the step.
5. The step uses the OpenAI API to generate a response to the question using the prompt.
6. The `next_step` parameter is used to specify the name of the next step to run. If `None` is returned, the flow will stop.
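
The chaining behaviour described above can be pictured with a minimal plain-Python sketch. This is an illustration of the `next_step` idea only, not PromptMage's actual runner; the step functions and dispatch loop below are hypothetical:

```python
def summarize(text: str):
    # Stand-in for an LLM call; names "review" as the next step to run.
    return "review", text.upper()

def review(text: str):
    # Terminal step: returning None as next_step stops the flow.
    return None, f"reviewed: {text}"

STEPS = {"summarize": summarize, "review": review}

def run_flow(initial: str, data: str) -> str:
    # Dispatch loop: keep calling the named step until one returns None.
    step = initial
    while step is not None:
        step, data = STEPS[step](data)
    return data

print(run_flow("summarize", "hello"))  # reviewed: HELLO
```

In PromptMage itself, the same idea is expressed by returning `MageResult(next_step="step2", ...)` from a decorated step.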


## Usage

Put the above code in a file called `flow.py` and set up the OpenAI client. To run the flow with promptmage, run the following command:

```bash
promptmage run flow.py
```

This will start the promptmage server and run the flow at the given path. You can now access the promptmage interface at `http://localhost:8000/gui/`.

## GUI walkthrough

The promptmage interface is divided into four main sections: the flow playground, the run history, the prompt repository, and the evaluation section.

### Flow playground

<figure markdown="span">
![Flow playground Example](images/screenshots/promptmage-example-flow-1.png){ width="70%" }
<figcaption>Initial flow playground for the example flow.</figcaption>
</figure>

<figure markdown="span">
![Flow playground Edit step prompt](images/screenshots/promptmage-example-flow-2.png){ width="70%" }
<figcaption>Edit the step prompt of step 1.</figcaption>
</figure>

<figure markdown="span">
![Flow playground run](images/screenshots/promptmage-example-flow-3.png){ width="70%" }
<figcaption>After the run you can see the execution graph and the results.</figcaption>
</figure>

### Run history

<figure markdown="span">
![Run history overview](images/screenshots/promptmage-example-flow-4.png){ width="70%" }
<figcaption>Here you can see all your runs and the results.</figcaption>
</figure>

<figure markdown="span">
![Detailed run history](images/screenshots/promptmage-example-flow-5.png){ width="70%" }
<figcaption>By clicking on a run, you can look at the details.</figcaption>
</figure>

### Prompt repository

<figure markdown="span">
![Prompt repository](images/screenshots/promptmage-example-flow-6.png){ width="70%" }
<figcaption>You can see all your prompts and versions in the prompts repository.</figcaption>
</figure>


## More examples

Have a look at the examples in the [examples](https://github.com/tsterbak/promptmage/tree/main/examples) folder to see how to use promptmage in your application or workflow.

### Use with Docker

You can find a usage example with Docker here: [Docker example](https://github.com/tsterbak/promptmage/tree/main/examples/docker).
Binary file added docs/images/screenshots/plaground-dark.png
121 changes: 77 additions & 44 deletions docs/index.md
@@ -1,65 +1,103 @@
<br />
<div align="center">
<a href="https://github.com/tsterbak/promptmage">
<img src="images/promptmage-logo.png" alt="PromptMage-Logo" width="120" height="120">
</a>
---
title: PromptMage
summary: PromptMage simplifies the process of creating and managing LLM workflows as a self-hosted solution.
date: 2024-08-23
authors:
- Tobias Sterbak
hide:
- navigation
extra:
class: hide-title
---

<div class="hero">
<div class="hero-image">
<img src="images/screenshots/plaground-dark.png" alt="PromptMage Playground">
</div>
<div class="hero-content">
<h1>Welcome to PromptMage</h1>
<p>
PromptMage is designed to offer an intuitive interface that simplifies the process of creating and managing complex LLM workflows.
</p>
<a href="getting-started" class="button">Get Started</a>
<a href="tutorial" class="button secondary">Learn More</a>
</div>
</div>

<h1 align="center">PromptMage</h1>
!!! warning "WARNING"

This application is currently in alpha state and under active development. Please be aware that the API and features may change at any time.

<p align="center">
simplifies the process of creating and managing LLM workflows as a self-hosted solution.
</p>
</div>
<center>
<div class="grid cards" markdown>

> [!WARNING]
> This application is currently in alpha state and under active development. Please be aware that the API and features may change at any time.
- :material-clock-fast:{ .lg .middle } __Set up in 5 minutes__

---

## About the Project
Get PromptMage up and running quickly with simple installation steps. Deploy locally or on your server with ease.

"PromptMage" is designed to offer an intuitive interface that simplifies the process of creating and managing LLM workflows as a self-hosted solution. It facilitates prompt testing and comparison, and it incorporates version control features to help users track the development of their prompts. Suitable for both small teams and large enterprises, "PromptMage" seeks to improve productivity and foster the practical use of LLM technology.
[:octicons-arrow-right-24: Getting started](getting-started)

The approach with "PromptMage" is to provide a pragmatic solution that bridges the current gap in LLM workflow management. We aim to empower developers, researchers, and organizations by making LLM technology more accessible and manageable, thereby supporting the next wave of AI innovations.
- :fontawesome-brands-github:{ .lg .middle } __Version Control Built-in__

![PromptMage](images/screenshots/playground-finished.png)
---

Take the [walkthrough](walkthrough.md) to see what you can do with PromptMage.
Track prompt development with integrated version control, making collaboration and iteration seamless.

## Philosophy
- Integrate the prompt playground into your workflow for fast iteration
- Prompts as first-class citizens with version control and collaboration features
- Manual and automatic testing and validation of prompts
- Easy sharing of results with domain experts and stakeholders
- Built-in, automatically created API with FastAPI for easy integration and deployment
- Type-hint everything for automatic inference and validation magic
[:octicons-arrow-right-24: Learn more](/getting-started/#prompt-repository)

- :material-play-box:{ .lg .middle } __Prompt Playground__

---

## Getting Started
Test, compare, and refine prompts in an intuitive interface designed for rapid iteration.

### Installation
[:octicons-arrow-right-24: Playground](/getting-started/#flow-playground)

To install promptmage, run the following command:
- :material-api:{ .lg .middle } __Auto-generated API__

```bash
pip install promptmage
```
---

## Usage
Leverage a FastAPI-powered, automatically created API for easy integration and deployment.

To use promptmage, run the following command:
[:octicons-arrow-right-24: API Documentation](#)

```bash
promptmage run <path-to-flow>
```
- :material-check-decagram:{ .lg .middle } __Evaluation Mode__

This will start the promptmage server and run the flow at the given path. You can now access the promptmage interface at `http://localhost:8000/gui/`.
---

Have a look at the examples in the [examples](https://github.com/tsterbak/promptmage/tree/main/examples) folder to see how to use promptmage in your application or workflow.
Assess prompt performance through manual and automatic testing, ensuring reliability before deployment.

[:octicons-arrow-right-24: Evaluation Guide](#)

## Use with Docker
- :material-update:{ .lg .middle } __More to Come__

You can find a usage example with Docker here: [Docker example](https://github.com/tsterbak/promptmage/tree/main/examples/docker).
---

Stay tuned for upcoming features and enhancements as we continue to evolve PromptMage.

[:octicons-arrow-right-24: Roadmap](roadmap)

</div>

</center>

## About the Project

"PromptMage" is designed to offer an intuitive interface that simplifies the process of creating and managing LLM workflows as a self-hosted solution. It facilitates prompt testing and comparison, and it incorporates version control features to help users track the development of their prompts. Suitable for both small teams and large enterprises, "PromptMage" seeks to improve productivity and foster the practical use of LLM technology.

The approach with "PromptMage" is to provide a pragmatic solution that bridges the current gap in LLM workflow management. We aim to empower developers, researchers, and organizations by making LLM technology more accessible and manageable, thereby supporting the next wave of AI innovations.

Take the [walkthrough](walkthrough.md) to see what you can do with PromptMage.

## Philosophy
- Integrate the prompt playground into your workflow for fast iteration
- Prompts as first-class citizens with version control and collaboration features
- Manual and automatic testing and validation of prompts
- Easy sharing of results with domain experts and stakeholders
- Built-in, automatically created API with FastAPI for easy integration and deployment
- Type-hint everything for automatic inference and validation magic

## Development

@@ -76,11 +114,6 @@ We welcome contributions from the community! If you're interested in improving P
* **Pull Requests**: Contributions via pull requests are highly appreciated. Please ensure your code adheres to the coding standards of the project, and submit a pull request with a clear description of your changes.


## License

This project is licensed under the MIT License - see the [LICENSE.md](https://github.com/tsterbak/promptmage/blob/main/LICENSE.md) file for details.
Original development by [Tobias Sterbak](https://tobiassterbak.com). Copyright (C) 2024.

## Contact
For any inquiries or further information, feel free to reach out at [[email protected]](mailto:[email protected]).

13 changes: 13 additions & 0 deletions docs/license.md
@@ -0,0 +1,13 @@
---
title: PromptMage - License
summary: PromptMage is licensed under the MIT License.
date: 2024-08-23
authors:
- Tobias Sterbak
hide:
- navigation
---
# License

This project is licensed under the MIT License - see the [LICENSE.md](https://github.com/tsterbak/promptmage/blob/main/LICENSE.md) file for details.
Original development by [Tobias Sterbak](https://tobiassterbak.com).
81 changes: 81 additions & 0 deletions docs/reference.md
@@ -0,0 +1,81 @@
# API Reference

This page contains the API reference with the most important classes and methods of promptmage.

## PromptMage `class`

The `PromptMage` class is the main class of promptmage. It is used to store all the information about the flow and to run it.

### Attributes

- **name** (`str`):
The name of the `PromptMage` instance.

- **available_models** (`List[str]`):
A list of available models to use for the flow.

!!! info

The available models are just strings that are passed to the step function to specify the model to use for the completion. You have to handle the model selection in the step function.
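
Since the model name arrives as a plain string, the step function has to route it itself. A hedged sketch of what such routing could look like (the `pick_client` helper and the prefixes it checks are hypothetical, not part of PromptMage):

```python
def pick_client(model: str) -> str:
    # Hypothetical routing: map a model-name string to a client backend.
    if model.startswith("gpt-"):
        return "openai"
    if model.startswith("claude-"):
        return "anthropic"
    return "local"

print(pick_client("gpt-4o-mini"))  # openai
```

Inside a real step you would then call the matching client's completion API with that model name.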

### Methods

#### `PromptMage.step()` `decorator`

Decorator to define a step in the flow.

!!! tip

A step is just a Python function with the `@mage.step()` decorator which returns a `MageResult`.

##### Arguments

- **name** (`str`):
The name of the step.

- **prompt_name** (`str`):
The name of the prompt to use for this step.

- **initial** (`bool`):
Whether this is the initial step of the flow.

- **one_to_many** (`bool`):
Whether this step should be run for each item in the input list.

- **many_to_one** (`bool`):
Whether this step should be run for each item in the input list and the results should be combined.
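
The intended fan-out/fan-in semantics of these two flags can be sketched in plain Python (an illustration only, not PromptMage's implementation):

```python
def one_to_many(items):
    # one_to_many: the step body runs once per item in the input list.
    return [item * 2 for item in items]

def many_to_one(results):
    # many_to_one: the per-item results are combined into a single result.
    return sum(results)

fanned = one_to_many([1, 2, 3])  # step executed for each item
combined = many_to_one(fanned)   # results merged back into one value
```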

---

## MageResult `class`

The `MageResult` class is used to return the result of a step.

### Attributes

- **next_step** (`str | None`):
The name of the next step to run.

- **error** (`str | None`):
An error message if the step failed.

- **\*\*kwargs** (`Any`):
All additional keyword arguments are stored as the result by name and can be used by the next step.
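
The kwargs-to-results behaviour can be mimicked with a small mock class (a hypothetical stand-in, not the real `MageResult`):

```python
class FakeMageResult:
    """Mock illustrating how extra kwargs become named results."""

    def __init__(self, next_step=None, error=None, **kwargs):
        self.next_step = next_step
        self.error = error
        # Every additional keyword argument is stored as a named result.
        self.results = dict(kwargs)

r = FakeMageResult(next_step="step2", answer="42", source="faq")
# r.results == {"answer": "42", "source": "faq"}
```

In a real flow, a following step could then declare parameters named `answer` and `source` to receive these values.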

---

## Prompt `class`

The `Prompt` class is used to store the prompt information.

!!! warning

This class should not be created by the user. It is automatically created by the `PromptMage` instance and only used to pass the prompt to the step functions and retrieve the prompts from the database.

### Attributes

- **system** (`str`):
The system prompt.

- **user** (`str`):
The user prompt.
3 changes: 3 additions & 0 deletions docs/roadmap.md
@@ -0,0 +1,3 @@
# Roadmap

Coming soon...
