Note
This repository contains a demo of using CrewAI for content generation with human-in-the-loop capabilities, along with an API wrapper to serve the CrewAI application.
This project demonstrates how to build AI agent workflows using the CrewAI framework and expose them as a RESTful API. Key features include:
- Human-in-the-Loop (HITL) approval processes
- Asynchronous execution of AI agent workflows
- Webhook notifications for job status updates
- Support for multiple LLM providers (OpenRouter, Azure OpenAI)
Project structure:
crewai_demo/ # Root project directory
│
├── crewai_app/ # CrewAI application
│ ├── __init__.py # Package initialization
│ └── orchestrator.py # Main orchestrator for CrewAI using @CrewBase pattern
│
├── api_wrapper/ # API wrapper service
│ ├── __init__.py # Package initialization
│ ├── api_wrapper.py # Main API service
│ └── api_client.py # Client for the API
│
├── docs/ # Documentation
│ ├── api_wrapper_documentation.md
│ ├── azure_openai_integration.md
│ ├── HITL_WORKFLOW.md
│ ├── HITL_IMPLEMENTATION.md
│ └── curl_examples.md
│
├── scripts/ # Utility scripts
│ ├── run_api.sh # Script to run the API service
│ ├── check_jobs.sh # Script to check job status
│ ├── curl_workflow.sh # Example curl commands for workflows
│ ├── direct_mode_example.sh # Example for direct mode
│ ├── hitl_mode_example.sh # Example for HITL mode
│ └── webhook_receiver.py # Simple webhook receiver for testing
│
├── .env.sample # Environment variable template
├── .env.azure.sample # Azure-specific environment template
└── requirements.txt # Project dependencies
The `crewai_app` directory contains the core CrewAI application, which is responsible for content generation using AI agents. The main component is `orchestrator.py`: the main orchestrator that defines the CrewAI crew, agents, and tasks using the `@CrewBase` decorator pattern. All CrewAI functionality is consolidated in this file.
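For orientation, here is a minimal sketch of the `@CrewBase` pattern using CrewAI's `crewai.project` decorators; the agent and task names are illustrative and not necessarily those used in `orchestrator.py`.

```python
# Minimal sketch of the @CrewBase pattern; names are illustrative only.
from crewai import Agent, Crew, Process, Task
from crewai.project import CrewBase, agent, crew, task


@CrewBase
class ContentCreationCrew:
    """Content generation crew assembled from decorated agents and tasks."""

    @agent
    def writer(self) -> Agent:
        return Agent(
            role="Content Writer",
            goal="Write an article about {topic}",
            backstory="An experienced technical writer.",
        )

    @task
    def write_article(self) -> Task:
        return Task(
            description="Draft an article about {topic}.",
            expected_output="A short article in markdown.",
            agent=self.writer(),
        )

    @crew
    def crew(self) -> Crew:
        # The decorators collect self.agents and self.tasks automatically.
        return Crew(agents=self.agents, tasks=self.tasks, process=Process.sequential)
```

Placeholders such as `{topic}` are filled from the `inputs` dictionary passed when the crew is kicked off.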
The `api_wrapper` directory contains the API service that wraps the CrewAI application and exposes it via a REST API. The main components are:
- `api_wrapper.py`: The main FastAPI application that provides endpoints for interacting with the CrewAI application.
- `api_client.py`: A client that can be used to interact with the API programmatically.
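As a rough illustration of the asynchronous job model, the sketch below shows how a FastAPI wrapper of this kind can queue a crew run and report its status. Apart from the `/kickoff` route shown later in the setup steps, the route names, response fields, and in-memory job store are assumptions, not the actual implementation in `api_wrapper.py`.

```python
# Hedged sketch of an async job wrapper; not the repo's actual api_wrapper.py.
import uuid

from fastapi import BackgroundTasks, FastAPI

app = FastAPI()
jobs: dict[str, dict] = {}  # in-memory job store, for illustration only


def run_crew(job_id: str, crew: str, inputs: dict) -> None:
    # Placeholder for invoking the CrewAI orchestrator and storing its result.
    jobs[job_id]["status"] = "completed"


@app.post("/kickoff")
def kickoff(payload: dict, background_tasks: BackgroundTasks):
    job_id = str(uuid.uuid4())
    jobs[job_id] = {"status": "running"}
    background_tasks.add_task(
        run_crew, job_id, payload.get("crew"), payload.get("inputs", {})
    )
    return {"job_id": job_id, "status": "running"}


@app.get("/jobs/{job_id}")
def job_status(job_id: str):
    # Assumed status-lookup route; see check_jobs.sh for the real workflow.
    return jobs.get(job_id, {"status": "not_found"})
```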
The project includes comprehensive documentation in the `docs/` directory:
- API Wrapper Documentation - How to use the API wrapper
- Azure OpenAI Integration - Using Azure OpenAI with the application
- HITL Workflow - Human-in-the-Loop workflow guide
- HITL Implementation - Implementation details for HITL
- cURL Examples - Example API requests using curl
The `scripts` directory contains utility scripts for working with the application:
- `run_api.sh` - Script to start the API server
- `check_jobs.sh` - Script to check the status of jobs
- `curl_workflow.sh` - Example workflow using curl commands
- `direct_mode_example.sh` - Example of using direct mode
- `hitl_mode_example.sh` - Example of using HITL mode
- `webhook_receiver.py` - Simple webhook receiver for testing notifications (a sketch of such a receiver follows this list)
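To give a rough idea of what such a receiver looks like, here is a minimal FastAPI sketch; the actual `scripts/webhook_receiver.py` may be implemented differently, and the payload fields are assumptions.

```python
# Minimal webhook receiver sketch for local testing; payload shape is assumed.
from fastapi import FastAPI, Request

app = FastAPI()


@app.post("/webhook")
async def receive(request: Request):
    payload = await request.json()
    print("Received job update:", payload)  # e.g. job id and status
    return {"ok": True}

# Run with: uvicorn webhook_receiver_sketch:app --port 8000
```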
Setup instructions:

- Clone the repository

      git clone https://github.com/yourusername/crewai_demo.git
      cd crewai_demo

- Install dependencies

      pip install -r requirements.txt

- Configure environment variables

      cp .env.sample .env
      # Edit .env with your API keys and configuration

- Run the API wrapper

      bash scripts/run_api.sh
      # Or directly with uvicorn:
      # uvicorn api_wrapper.api_wrapper:app --host 0.0.0.0 --port 8888 --reload

- Make a test request

      curl -X POST http://localhost:8888/kickoff \
        -H "Content-Type: application/json" \
        -d '{
          "crew": "ContentCreationCrew",
          "inputs": {
            "topic": "Artificial Intelligence"
          }
        }'
For more examples, see the scripts in the `scripts/` directory or the cURL Examples documentation.
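The same request can also be issued from Python; a minimal example with the `requests` library is shown below, assuming the service is running locally on port 8888. The exact shape of the response (for example, a job identifier) depends on `api_wrapper.py`.

```python
# Kick off a crew run programmatically; response fields are an assumption
# based on the asynchronous job model described above.
import requests

resp = requests.post(
    "http://localhost:8888/kickoff",
    json={
        "crew": "ContentCreationCrew",
        "inputs": {"topic": "Artificial Intelligence"},
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json())  # e.g. a job id you can then check with check_jobs.sh
```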
Important
The application supports multiple LLM providers. Make sure to configure the appropriate environment variables.
OpenRouter Configuration (Default)
# In .env file
LLM_PROVIDER=openrouter
OPENAI_API_KEY=your_api_key_here
OPENAI_API_BASE=https://openrouter.ai/api/v1
OPENROUTER_MODEL=openai/gpt-4o-mini
Note: OpenRouter uses the `OPENAI_API_KEY` environment variable for authentication.
Azure OpenAI Configuration
# In .env file
LLM_PROVIDER=azure
AZURE_OPENAI_API_KEY=your_azure_api_key_here
AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com/
AZURE_OPENAI_API_VERSION=2023-05-15
AZURE_OPENAI_DEPLOYMENT_ID=gpt-35-turbo-0125
See the Azure OpenAI Integration documentation for more details.
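To tie the two configurations together, here is a hedged sketch of how a provider switch driven by `LLM_PROVIDER` might look, assuming CrewAI's `LLM` class and LiteLLM-style `azure/<deployment>` model names; the actual wiring in `orchestrator.py` may differ.

```python
# Hedged sketch of provider selection from the environment variables above;
# not necessarily how orchestrator.py configures its LLM.
import os

from crewai import LLM


def build_llm() -> LLM:
    provider = os.getenv("LLM_PROVIDER", "openrouter")
    if provider == "azure":
        return LLM(
            model=f"azure/{os.environ['AZURE_OPENAI_DEPLOYMENT_ID']}",
            api_key=os.environ["AZURE_OPENAI_API_KEY"],
            base_url=os.environ["AZURE_OPENAI_ENDPOINT"],
            api_version=os.getenv("AZURE_OPENAI_API_VERSION", "2023-05-15"),
        )
    # Default: OpenRouter through its OpenAI-compatible endpoint.
    return LLM(
        model=os.getenv("OPENROUTER_MODEL", "openai/gpt-4o-mini"),
        api_key=os.environ["OPENAI_API_KEY"],
        base_url=os.getenv("OPENAI_API_BASE", "https://openrouter.ai/api/v1"),
    )
```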