Implement Graph Memory in AI Assistant #1887

Open · wants to merge 1 commit into base: main
334 changes: 334 additions & 0 deletions cookbooks/mem0_ai_assistant_with_graph_memory.ipynb
@@ -0,0 +1,334 @@
{
"nbformat": 4,
"nbformat_minor": 0,
"metadata": {
"colab": {
"provenance": []
},
"kernelspec": {
"name": "python3",
"display_name": "Python 3"
},
"language_info": {
"name": "python"
}
},
"cells": [
{
"cell_type": "markdown",
"source": [
"# AI Companion with Memory\n",
"\n",
"This Jupyter notebook implements an AI companion that engages in conversations and remembers previous interactions using OpenAI's GPT-4 model and a memory system.\n",
"\n",
"## Overview\n",
"\n",
"The AI Companion utilizes the following components:\n",
"- OpenAI's GPT-4 for natural language processing\n",
"- mem0ai for memory management\n",
"- Qdrant for vector storage\n",
"- Neo4j for graph storage\n",
"\n",
"### Notes\n",
"\n",
"*1. Ensure all necessary credentials and API keys are properly set up before running the notebook. Do not share sensitive information when pushing to a public repository.*\n",
"\n",
"*2. This notebook is designed to run in Google Colab. Ensure that you have set up the necessary credentials in Google Colab's userdata before running the notebook.*"
],
"metadata": {
"id": "vKj2pyq-uOyD"
}
},
{
"cell_type": "markdown",
"source": [
"## Installing mem0ai and other dependencies"
],
"metadata": {
"id": "ynNCkRW15HHy"
}
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {
"id": "8mi0QJY44u4x"
},
"outputs": [],
"source": [
"!pip install mem0ai openai"
]
},
{
"cell_type": "markdown",
"source": [
"# Importing Libraries"
],
"metadata": {
"id": "EuKQ9vkcvbqB"
}
},
{
"cell_type": "code",
"source": [
"import os\n",
"import json\n",
"from google.colab import userdata\n",
"from openai import OpenAI\n",
"from mem0 import Memory\n",
"from typing import List, Dict"
],
"metadata": {
"id": "2HMT2YxuvbPd"
},
"execution_count": 11,
"outputs": []
},
{
"cell_type": "markdown",
"source": [
"## Setting the OpenAI API keys"
],
"metadata": {
"id": "KMlWWxEn5OCL"
}
},
{
"cell_type": "code",
"source": [
"os.environ['OPENAI_API_KEY'] = userdata.get('OPENAI_API_KEY')"
],
"metadata": {
"id": "OJYlo6hU437u"
},
"execution_count": 12,
"outputs": []
},
{
"cell_type": "markdown",
"source": [
"## Setting Credentials"
],
"metadata": {
"id": "YPFSKwur5RMO"
}
},
{
"cell_type": "code",
"source": [
"URL = userdata.get('URL')\n",
"USERNAME = userdata.get('USERNAME')\n",
"PASSWORD = userdata.get('PASSWORD')\n",
"QDRANT_COLLECTION = userdata.get('QDRANT_COLLECTION')\n",
"QDRANT_URL = userdata.get('QDRANT_URL')\n",
"QDRANT_API_KEY = userdata.get('QDRANT_API_KEY')\n",
"user_id=userdata.get('USER_ID')"
],
"metadata": {
"id": "4Xf4m4QT43-S"
},
"execution_count": 13,
"outputs": []
},
{
"cell_type": "markdown",
"source": [
"# Setting Configurations"
],
"metadata": {
"id": "A2d0hnwoitMy"
}
},
{
"cell_type": "code",
"source": [
"# Initialize the OpenAI client\n",
"client = OpenAI()\n",
"\n",
"config = {\n",
" \"llm\": {\n",
" \"provider\": \"openai\",\n",
" \"config\": {\n",
" \"model\": \"gpt-4o\",\n",
" \"temperature\": 0.2,\n",
" }\n",
" },\n",
" \"vector_store\": {\n",
" \"provider\": \"qdrant\",\n",
" \"config\": {\n",
" \"collection_name\": QDRANT_COLLECTION,\n",
" \"url\": QDRANT_URL,\n",
" \"api_key\": QDRANT_API_KEY\n",
" }\n",
" },\n",
" \"graph_store\": {\n",
" \"provider\": \"neo4j\",\n",
" \"config\": {\n",
" \"url\": URL,\n",
" \"username\": USERNAME,\n",
" \"password\": PASSWORD\n",
" },\n",
" },\n",
" \"version\": \"v1.1\"\n",
" }\n",
"\n",
"memory = Memory.from_config(config_dict=config)"
],
"metadata": {
"id": "qZZLUyF8HM24"
},
"execution_count": 14,
"outputs": []
},
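{
"cell_type": "markdown",
"source": [
"**Populating the Memory (illustrative)**\n",
"\n",
"A minimal sketch of writing to the hybrid store: `memory.add` extracts facts into the Qdrant vector store and, with the graph store configured, entity relationships into Neo4j. The sample sentence below is hypothetical, used only to seed the stores with something to query later."
],
"metadata": {}
},
{
"cell_type": "code",
"source": [
"# Illustrative only: store a sample interaction so the vector and\n",
"# graph stores have entities and relationships to work with.\n",
"memory.add(\"My name is James and I love hiking with my friend Alice.\", user_id=user_id)"
],
"metadata": {},
"execution_count": null,
"outputs": []
},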
{
"cell_type": "markdown",
"source": [
"**Checking the Memory**"
],
"metadata": {
"id": "pp0GSjqCrYAC"
}
},
{
"cell_type": "code",
"source": [
"memory.get_all(user_id=user_id)"
],
"metadata": {
"id": "oEMRFg5zXnVR"
},
"execution_count": null,
"outputs": []
},
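{
"cell_type": "markdown",
"source": [
"**Searching the Memory (illustrative)**\n",
"\n",
"A sketch of reading back from the store: with output version `v1.1`, `memory.search` returns a dict whose `results` key holds the matched memories; when a graph store is configured, the response may also carry a `relations` key with extracted entity relationships. The query string is a hypothetical example."
],
"metadata": {}
},
{
"cell_type": "code",
"source": [
"# Illustrative only: retrieve memories relevant to a sample query\n",
"hits = memory.search(\"What are my hobbies?\", user_id=user_id)\n",
"for mem in hits.get(\"results\", []):\n",
"    print(mem[\"memory\"])\n",
"\n",
"# Entity relationships from the graph store, if any were extracted\n",
"print(hits.get(\"relations\"))"
],
"metadata": {},
"execution_count": null,
"outputs": []
},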
{
"cell_type": "markdown",
"source": [
"# AI Assistant"
],
"metadata": {
"id": "EQzZtcG20mSz"
}
},
{
"cell_type": "code",
"source": [
"class Companion:\n",
" \"\"\"\n",
" A class representing an AI companion that can engage in conversations and remember previous interactions.\n",
" \"\"\"\n",
"\n",
" def __init__(self, client: OpenAI):\n",
" \"\"\"\n",
" Initialize the Companion.\n",
"\n",
" :param client: An instance of the OpenAI client for API interactions.\n",
" \"\"\"\n",
" self.client = client\n",
"\n",
" def ask(self, question: str, user_id: str) -> str:\n",
" \"\"\"\n",
" Process a user's question and generate a response.\n",
"\n",
" :param question: The user's input question.\n",
" :param user_id: The unique identifier for the user.\n",
" :return: The AI-generated response to the question.\n",
" \"\"\"\n",
" # Retrieve relevant previous memories\n",
" previous_memories = memory.search(question, user_id=user_id)\n",
" relevant_memories_text = \"\\n\".join(mem[\"memory\"] for mem in previous_memories['results'])\n",
"\n",
" # Construct the prompt with the question and relevant memories\n",
" prompt = f\"User input: {question}\\nPrevious memories: {relevant_memories_text}\"\n",
" messages = [\n",
" {\n",
" \"role\": \"system\",\n",
" \"content\": \"You are the user's companion. Use the user's input and previous memories to respond. Answer based on the context provided.\"\n",
" },\n",
" {\n",
" \"role\": \"user\",\n",
" \"content\": prompt\n",
" }\n",
" ]\n",
"\n",
" try:\n",
" # Generate a response using the OpenAI API\n",
" stream = self.client.chat.completions.create(\n",
" model=\"gpt-4\",\n",
" stream=True,\n",
" messages=messages\n",
" )\n",
"\n",
" answer = \"\"\n",
" for chunk in stream:\n",
" if chunk.choices[0].delta.content is not None:\n",
" content = chunk.choices[0].delta.content\n",
" print(content, end=\"\", flush=True)\n",
" answer += content\n",
"\n",
" # Add the new interaction to memory\n",
" memory.add(question, user_id=user_id)\n",
" return answer\n",
" except Exception as e:\n",
" print(f\"An error occurred: {e}\")\n",
" return \"\"\n",
"\n",
" def __enter__(self):\n",
" \"\"\"\n",
" Enter the runtime context for the Companion.\n",
"\n",
" :return: The Companion instance.\n",
" \"\"\"\n",
" return self\n",
"\n",
" def __exit__(self, exc_type, exc_val, exc_tb):\n",
" \"\"\"\n",
" Exit the runtime context for the Companion.\n",
"\n",
" :param exc_type: The exception type, if any.\n",
" :param exc_val: The exception value, if any.\n",
" :param exc_tb: The exception traceback, if any.\n",
" \"\"\"\n",
" pass\n",
"\n",
"def main():\n",
" \"\"\"\n",
" The main function to run the AI companion interaction loop.\n",
" \"\"\"\n",
" with Companion(client) as ai_companion:\n",
" while True:\n",
" text_input = input(\"\\nEnter text (or 'quit' to exit):\\t\")\n",
" if text_input.lower() == 'quit':\n",
" print(\"Goodbye :)\")\n",
" break\n",
" print(\"\\nAssistant:\")\n",
" ai_companion.ask(text_input, user_id)\n",
"\n",
"if __name__ == \"__main__\":\n",
" main()"
],
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "t_Y9wWL8sBth",
"outputId": "ee012152-c489-4a13-f815-bf28a86282e7"
},
"execution_count": 17,
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": [
"\n",
"Enter text (or 'quit' to exit):\tHi\n",
"\n",
"Assistant:\n",
"Hello James! How can I assist you today?\n",
"Enter text (or 'quit' to exit):\tquit\n",
"Goodbye :)\n"
]
}
]
}
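,
{
"cell_type": "markdown",
"source": [
"**Single-turn usage (illustrative)**\n",
"\n",
"Besides the interactive loop above, the companion can answer a single question programmatically; a sketch:"
],
"metadata": {}
},
{
"cell_type": "code",
"source": [
"# Illustrative only: one-off question outside the input() loop\n",
"with Companion(client) as companion:\n",
"    companion.ask(\"What do you know about me?\", user_id)"
],
"metadata": {},
"execution_count": null,
"outputs": []
}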
]
}