A full-stack Q&A chatbot built with React and Django, powered by LangChain and the Meta-Llama-3-8B-Instruct model accessed through Hugging Face.
- Real-time chatbot interface
- LangChain-powered natural language processing
- Uses Meta-Llama-3-8B-Instruct from Hugging Face
- Simple and clean React-based UI
- Option to clear chat history

Tech stack:

- Frontend: React
- Backend: Django + Django REST Framework
- LLM Integration: LangChain + Hugging Face (Meta-Llama-3-8B-Instruct)
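
The backend's LangChain logic is not reproduced in this README; as a rough illustration, the call into the Hugging Face Inference API for Meta-Llama-3-8B-Instruct might look like the sketch below. This is a minimal example under assumptions (the `langchain-huggingface` package and these parameter names reflect recent LangChain releases, not necessarily this repository's exact code):

```python
# Minimal sketch of calling Meta-Llama-3-8B-Instruct through LangChain.
# Assumes the `langchain-huggingface` package is installed and
# HUGGINGFACEHUB_API_TOKEN is available in the environment.
import os

from langchain_huggingface import ChatHuggingFace, HuggingFaceEndpoint

llm = HuggingFaceEndpoint(
    repo_id="meta-llama/Meta-Llama-3-8B-Instruct",
    task="text-generation",
    max_new_tokens=512,
    temperature=0.7,
    huggingfacehub_api_token=os.environ["HUGGINGFACEHUB_API_TOKEN"],
)
chat_model = ChatHuggingFace(llm=llm)

# Send a single user message and print the model's reply.
reply = chat_model.invoke("What is LangChain?")
print(reply.content)
```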
Clone the repository:

```bash
git clone https://github.com/your-username/LangChain-Q-A.git
cd LangChain-Q-A
```

Set up the backend:

```bash
cd backend
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
pip install -r requirements.txt
```
Create a `.env` file in the `backend` folder and add your Hugging Face token:

```
HUGGINGFACEHUB_API_TOKEN=your_token_here
```
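
How the backend reads this file depends on its settings; a common pattern, sketched here as an assumption rather than this project's confirmed approach, is to load it with `python-dotenv` at startup:

```python
# Sketch: load backend/.env so LangChain can pick up the token.
# Assumes python-dotenv is available (e.g. listed in requirements.txt).
import os

from dotenv import load_dotenv

load_dotenv()  # reads .env into the process environment

hf_token = os.getenv("HUGGINGFACEHUB_API_TOKEN")
if not hf_token:
    raise RuntimeError("HUGGINGFACEHUB_API_TOKEN is not set")
```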
Run the Django server:

```bash
python manage.py runserver
```
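
The backend's API surface is not documented here; purely as an illustration, a Django REST Framework view handling chat requests could look like the following (the module path, request shape, and `get_chat_model` helper are hypothetical):

```python
# backend/chat/views.py (illustrative only; actual module and URL may differ)
from rest_framework import status
from rest_framework.response import Response
from rest_framework.views import APIView

# Hypothetical helper that builds the ChatHuggingFace model shown earlier.
from .llm import get_chat_model


class ChatView(APIView):
    """Accepts {"message": "..."} and returns the model's answer."""

    def post(self, request):
        message = request.data.get("message", "").strip()
        if not message:
            return Response(
                {"error": "message is required"},
                status=status.HTTP_400_BAD_REQUEST,
            )
        answer = get_chat_model().invoke(message)
        return Response({"answer": answer.content})
```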
Set up and start the frontend:

```bash
cd ../frontend
npm install
npm start
```
The React app will open at http://localhost:3000.
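
To check the backend independently of the UI, you can post a question to the API directly; the snippet below assumes a hypothetical `/api/chat/` route wired to a view like the one sketched above, plus the default Django development port:

```python
# Quick smoke test of the backend (the /api/chat/ route is an assumption).
import requests

resp = requests.post(
    "http://localhost:8000/api/chat/",
    json={"message": "What is LangChain?"},
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["answer"])
```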
Project structure:

```
LangChain-Q-A/
├── backend/    # Django backend with LangChain logic
├── frontend/   # React frontend for the chat UI
└── README.md
```
This project is licensed under the MIT License.