
Merge pull request #1 from tsterbak/dev-v0.1.0
Release version 0.1.0
tsterbak authored Aug 16, 2024
2 parents 4bf3249 + 7f2c16e commit a69a01d
Showing 38 changed files with 2,096 additions and 862 deletions.
18 changes: 17 additions & 1 deletion README.md
Original file line number Diff line number Diff line change
@@ -9,14 +9,29 @@
<p align="center">
simplifies the process of creating and managing LLM workflows as a self-hosted solution.
</p>

[![License](https://img.shields.io/github/license/tsterbak/promptmage?color=green)](https://github.com/tsterbak/promptmage/blob/main/LICENSE)
[![Monthly downloads](https://img.shields.io/pypi/dm/promptmage
)](https://pypi.org/project/promptmage/)
[![PyPI version](https://img.shields.io/pypi/v/promptmage)](https://pypi.org/project/promptmage/)
[![GitHub issues](https://img.shields.io/github/issues/tsterbak/promptmage)](https://github.com/tsterbak/promptmage/issues)
[![GitHub stars](https://img.shields.io/github/stars/tsterbak/promptmage)](https://github.com/tsterbak/promptmage/stargazers)
</div>

> [!WARNING]
> This application is currently in alpha and under active development. Please be aware that the API and features may change at any time.

## About the Project

"PromptMage" is designed to offer an intuitive interface that simplifies the process of creating and managing LLM workflows as a self-hosted solution. It facilitates prompt testing and comparison, and it incorporates version control features to help users track the development of their prompts. Suitable for both small teams and large enterprises, "PromptMage" seeks to improve productivity and foster the practical use of LLM technology.

The approach with "PromptMage" is to provide a pragmatic solution that bridges the current gap in LLM workflow management. We aim to empower developers, researchers, and organizations by making LLM technology more accessible and manageable, thereby supporting the next wave of AI innovations.

![PromptMage](docs/images/screenshots/playground-finished.png)

Take the [walkthrough](https://promptmage.io/walkthrough/) to see what you can do with PromptMage.

## Philosophy
- Integrate the prompt playground into your workflow for fast iteration
- Prompts as first-class citizens with version control and collaboration features
@@ -64,13 +79,14 @@ Contributing
We welcome contributions from the community! If you're interested in improving PromptMage, you can contribute in the following ways:

* **Reporting Bugs**: Submit an issue in our repository, providing a detailed description of the problem and steps to reproduce it.
* **Feature Requests**: Have ideas on how to make FlowForge better? We'd love to hear from you! Please submit an issue, detailing your suggestions.
* **Feature Requests**: Have ideas on how to make PromptMage better? We'd love to hear from you! Please submit an issue, detailing your suggestions.
* **Pull Requests**: Contributions via pull requests are highly appreciated. Please ensure your code adheres to the coding standards of the project, and submit a pull request with a clear description of your changes.


## License

This project is licensed under the MIT License - see the [LICENSE.md](LICENSE.md) file for details.
Original development by [Tobias Sterbak](https://tobiassterbak.com). Copyright (C) 2024.

## Contact
For any inquiries or further information, feel free to reach out at [[email protected]](mailto:[email protected]).
Binary file added docs/images/screenshots/gui-overview.png
Binary file added docs/images/screenshots/playground-empty.png
Binary file added docs/images/screenshots/playground-finished.png
Binary file added docs/images/screenshots/playground-run.png
Binary file added docs/images/screenshots/playground-step.png
Binary file added docs/images/screenshots/prompts-page.png
Binary file added docs/images/screenshots/runs-page.png
22 changes: 18 additions & 4 deletions docs/index.md
@@ -11,12 +11,20 @@
</p>
</div>

> [!WARNING]
> This application is currently in alpha and under active development. Please be aware that the API and features may change at any time.

## About the Project

"PromptMage" is designed to offer an intuitive interface that simplifies the process of creating and managing LLM workflows as a self-hosted solution. It facilitates prompt testing and comparison, and it incorporates version control features to help users track the development of their prompts. Suitable for both small teams and large enterprises, "PromptMage" seeks to improve productivity and foster the practical use of LLM technology.

The approach with "PromptMage" is to provide a pragmatic solution that bridges the current gap in LLM workflow management. We aim to empower developers, researchers, and organizations by making LLM technology more accessible and manageable, thereby supporting the next wave of AI innovations.

![PromptMage](images/screenshots/playground-finished.png)

Take the [walkthrough](walkthrough.md) to see what you can do with PromptMage.

## Philosophy
- Integrate the prompt playground into your workflow for fast iteration
- Prompts as first-class citizens with version control and collaboration features
@@ -40,17 +48,22 @@ pip install promptmage
To use promptmage, run the following command:

```bash
promptmage run <path-to-flow>.py
promptmage run <path-to-flow>
```

This will start the promptmage server and run the flow at the given path. You can now access the promptmage interface at `http://localhost:8000/gui/`.

Have a look at the examples in the [examples](https://github.com/tsterbak/promptmage/tree/main/examples) folder to see how to use promptmage in your application or workflow.


## Use with Docker

You can find a usage example with Docker here: [Docker example](https://github.com/tsterbak/promptmage/tree/main/examples/docker).


## Development

To develop PromptMage, check out the [DEVELOPMENT.md](DEVELOPMENT.md) file.
To develop PromptMage, check out the [DEVELOPMENT.md](https://github.com/tsterbak/promptmage/blob/main/DEVELOPMENT.md) file.

## Contributing

@@ -59,13 +72,14 @@ Contributing
We welcome contributions from the community! If you're interested in improving PromptMage, you can contribute in the following ways:

* **Reporting Bugs**: Submit an issue in our repository, providing a detailed description of the problem and steps to reproduce it.
* **Feature Requests**: Have ideas on how to make FlowForge better? We'd love to hear from you! Please submit an issue, detailing your suggestions.
* **Feature Requests**: Have ideas on how to make PromptMage better? We'd love to hear from you! Please submit an issue, detailing your suggestions.
* **Pull Requests**: Contributions via pull requests are highly appreciated. Please ensure your code adheres to the coding standards of the project, and submit a pull request with a clear description of your changes.


## License

This project is licensed under the MIT License - see the [LICENSE.md](LICENSE.md) file for details.
This project is licensed under the MIT License - see the [LICENSE.md](https://github.com/tsterbak/promptmage/blob/main/LICENSE.md) file for details.
Original development by [Tobias Sterbak](https://tobiassterbak.com). Copyright (C) 2024.

## Contact
For any inquiries or further information, feel free to reach out at [[email protected]](mailto:[email protected]).
34 changes: 20 additions & 14 deletions docs/tutorial.md
@@ -21,36 +21,38 @@ First, we need to install PromptMage. You can install PromptMage using pip:
pip install promptmage
```

It is recommended to install PromptMage in a virtual environment to avoid conflicts with other packages.

## Step 2: Add PromptMage to your project

First, you need to add PromptMage to your project. You do that by adding the following to your `summarizer.py` file:

```python
data_store = DataStore(backend=SQLiteDataBackend())
prompt_store = PromptStore(backend=SQLitePromptBackend())
# Create a new PromptMage instance
mage = PromptMage(
name="fact-summarizer", prompt_store=prompt_store, data_store=data_store
)
mage = PromptMage(name="fact-summarizer")
```

Next, you need to define the prompts and dependencies between the steps. You can do that by adding the following code to the functions in the `summarizer.py` file:

```python
@mage.step(name="extract", prompt_name="extract_facts", depends_on=None)
@mage.step(name="extract", prompt_name="extract_facts", initial=True)
def extract_facts(article: str, prompt: Prompt) -> str:
...
return facts
# <your application code here>
return MageResult(facts=facts, next_step="summarize")
```


Since this is the first step of the flow, we set the `initial` parameter to `True`; it will be the first step executed when the application runs. Every step must return a `MageResult` object, which contains the output of the step and the name of the next step to execute. In this case, the next step is the `summarize` step. Note that you can also return a list of `MageResult` objects if you want to execute multiple steps in parallel.

```python
@mage.step(name="summarize", prompt_name="summarize_facts", depends_on="extract")
@mage.step(name="summarize", prompt_name="summarize_facts")
def summarize_facts(facts: str, prompt: Prompt) -> str:
...
return summary
# <your application code here>
return MageResult(summary=summary)
```

Now you can access the prompts within the functions using the `prompt` argument. The `prompt` argument is an instance of the `Prompt` class, which provides methods to interact with the prompt.
If `next_step` is not specified, the step is considered a terminal step and the flow stops after executing it.

Now you can access the prompts within the step functions using the `prompt` argument. The `prompt` argument is an instance of the `Prompt` class, which provides methods to interact with the prompt.
By default, a system and a user prompt are available via `prompt.system` and `prompt.user`, respectively. The prompts themselves are created later in the web UI.
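As a sketch of how a step might use these two fields, the example below uses a stand-in `Prompt` dataclass and a made-up template; in a real step, PromptMage injects the `prompt` instance and the template text comes from the web UI:

```python
from dataclasses import dataclass

# Stand-in for promptmage's Prompt class; in a real step, PromptMage
# injects the instance and the template text is edited in the web UI.
@dataclass
class Prompt:
    system: str
    user: str

prompt = Prompt(
    system="You are a careful fact extractor.",
    user="Extract the key facts from this article:\n\n{article}",
)

# Inside a step, the prompt fields typically become the system and user
# messages passed to an LLM client:
messages = [
    {"role": "system", "content": prompt.system},
    {"role": "user", "content": prompt.user.format(article="PromptMage is self-hosted.")},
]
print(messages[1]["content"].splitlines()[-1])  # → PromptMage is self-hosted.
```

The messages list follows the common system/user chat format; how you send it to a model (OpenAI client, local model, etc.) is up to your application code.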

You don't need to worry about saving the prompts and data; PromptMage takes care of that for you.
@@ -63,4 +65,8 @@ Now you can run the application by
promptmage run summarizer.py
```

This will start the PromptMage web UI, where you can interact with the prompts, run the steps, and see their output.
You can access the web UI at `http://localhost:8000/gui/`.


More examples can be found in the [examples](https://github.com/tsterbak/promptmage/tree/main/examples) folder.
64 changes: 64 additions & 0 deletions docs/walkthrough.md
@@ -0,0 +1,64 @@
# Walkthrough

## Launching the application

After you have installed promptmage and added it to your project by following the [tutorial](tutorial.md), you can run the application and interact with it in the web UI.

To run the application, you can use the following command:

```bash
promptmage run summarizer.py
```

This will start the promptmage server and run the application at the given path.

## Accessing the API

PromptMage automatically creates an API for your application using FastAPI. You can access the API at `http://localhost:8000/api/` and the Swagger documentation at `http://localhost:8000/docs/`.

![Swagger UI](images/screenshots/swagger.png)

You can use the API to interact with your application programmatically or integrate it into other services.
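For programmatic access, a request can be built with the standard library as sketched below. The exact endpoint path and payload shape are assumptions here; check the Swagger documentation at `http://localhost:8000/docs/` for the real routes your flow exposes:

```python
import json
from urllib import request

BASE_URL = "http://localhost:8000/api"

def build_flow_request(flow: str, payload: dict) -> request.Request:
    """Build a POST request for a flow endpoint.

    The endpoint path used here is an assumption; consult the Swagger
    docs at http://localhost:8000/docs/ for the actual routes.
    """
    return request.Request(
        f"{BASE_URL}/{flow}",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_flow_request("fact-summarizer", {"article": "Some article text."})
print(req.full_url)  # → http://localhost:8000/api/fact-summarizer

# To actually send it (with the server running):
# with request.urlopen(req) as resp:
#     print(json.load(resp))
```

The actual send is left commented out so the sketch works without a running server.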

## Interacting with the web UI

You can access the web UI at `http://localhost:8000/gui/`. Here you can interact with the prompts and see the output of the steps.

### Application Overview

The application overview shows all available flows.

![Application Overview](images/screenshots/gui-overview.png)

### Flow Overview

The flow overview shows all steps of the flow and their status, as well as an execution graph once the flow has been executed.

![Flow Overview](images/screenshots/playground-finished.png)

### Step interaction

You can interact with a step by clicking on it. This expands the step and shows its prompts and output.
You can also run the step manually and tweak its input and prompts.

![Step Interaction](images/screenshots/playground-step.png)


## Runs page

The runs page shows all runs of the application and allows you to see the output of the steps for each run.

![Runs Page](images/screenshots/runs-page.png)

You can also replay runs to see the output of the steps and the prompts that were used during the run.

## Prompt repository

The prompt repository allows you to manage your prompts. You can create new prompt versions, edit existing prompts, and delete prompts. You can also see the history of a prompt and see which runs used the prompt.

![Prompt Repository](images/screenshots/prompts-page.png)


## Conclusion

This concludes the walkthrough of PromptMage. You have seen how to install and use PromptMage, how to create a simple application, and how to interact with the web UI. You can now integrate PromptMage into your workflow and use it to build and test your applications faster and more efficiently.
2 changes: 2 additions & 0 deletions examples/README.md
@@ -8,6 +8,8 @@ This repository contains examples for using PromptMage in your application or wo

- Answer questions about YouTube videos: A multi-step LLM application that extracts information from a YouTube video and answers questions about the video. [View Example](https://github.com/tsterbak/promptmage/blob/main/examples/youtube_understanding.py)

- Multi-flow example: An example that demonstrates how to use multiple flows in a single application. [View Example](https://github.com/tsterbak/promptmage/blob/main/examples/multiflow.py)


## Getting Started

10 changes: 5 additions & 5 deletions examples/docker/summarize_article_by_facts.py
@@ -1,7 +1,7 @@
from dotenv import load_dotenv
from openai import OpenAI

from promptmage import PromptMage, Prompt
from promptmage import PromptMage, Prompt, MageResult
from promptmage.storage import (
SQLitePromptBackend,
SQLiteDataBackend,
@@ -29,7 +29,7 @@
# Application code


@mage.step(name="extract", prompt_name="extract_facts", depends_on=None)
@mage.step(name="extract", prompt_name="extract_facts", initial=True)
def extract_facts(article: str, prompt: Prompt) -> str:
"""Extract the facts as a bullet list from an article."""
response = client.chat.completions.create(
@@ -42,10 +42,10 @@ def extract_facts(article: str, prompt: Prompt) -> str:
},
],
)
return response.choices[0].message.content
return MageResult(next_step="summarize", facts=response.choices[0].message.content)


@mage.step(name="summarize", prompt_name="summarize_facts", depends_on="extract")
@mage.step(name="summarize", prompt_name="summarize_facts")
def summarize_facts(facts: str, prompt: Prompt) -> str:
"""Summarize the given facts as a single sentence."""
response = client.chat.completions.create(
@@ -58,4 +58,4 @@ def summarize_facts(facts: str, prompt: Prompt) -> str:
},
],
)
return response.choices[0].message.content
return MageResult(summary=response.choices[0].message.content)
6 changes: 6 additions & 0 deletions examples/multiflow.py
@@ -0,0 +1,6 @@
from summarize_article_by_facts import mage as flow1
from youtube_understanding import mage as flow2

flow1

flow2
