This project is a real-time chat application built with Go, HTMX, and WebSockets, integrating with an AI language model for interactive conversations.
- Real-time chat interface
- AI-powered responses
- Markdown rendering for rich text formatting
- Code syntax highlighting
- Responsive design with Tailwind CSS
- Go 1.20 or later
- Docker (optional)
- Clone the repository:
git clone https://github.com/developersdigest/go-htmx-llm.git
cd go-htmx-llm
- Set up your environment variables: create a .env file in the project root and add your OpenAI API key (see the sketch after the run instructions for how the server reads it):
OPENAI_API_KEY=your_api_key_here
- Install dependencies:
go mod download
- Build and run the application:
go build -o main .
./main
- Build the Docker image:
docker build -t go-htmx-llm .
- Run the container:
docker run -p 8080:8080 --env-file .env go-htmx-llm
The application will be available at http://localhost:8080.
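The server expects OPENAI_API_KEY to be present in its environment (exported locally, or passed via --env-file when using Docker) and forwards each chat message to the model. Below is a minimal sketch of that call using only the standard library and the chat completions HTTP endpoint; askModel is a hypothetical helper name, and the repository may use a dedicated client library instead.

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
	"os"
)

// askModel is a hypothetical helper: it sends one user message to the
// chat completions endpoint and returns the assistant's reply as text.
func askModel(userMessage string) (string, error) {
	apiKey := os.Getenv("OPENAI_API_KEY") // taken from the environment / .env
	if apiKey == "" {
		return "", fmt.Errorf("OPENAI_API_KEY is not set")
	}

	body, err := json.Marshal(map[string]any{
		"model": "gpt-3.5-turbo", // assumed model; the app may use another
		"messages": []map[string]string{
			{"role": "user", "content": userMessage},
		},
	})
	if err != nil {
		return "", err
	}

	req, err := http.NewRequest(http.MethodPost,
		"https://api.openai.com/v1/chat/completions", bytes.NewReader(body))
	if err != nil {
		return "", err
	}
	req.Header.Set("Authorization", "Bearer "+apiKey)
	req.Header.Set("Content-Type", "application/json")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		return "", err
	}
	defer resp.Body.Close()

	// Decode only the fields we need from the response.
	var out struct {
		Choices []struct {
			Message struct {
				Content string `json:"content"`
			} `json:"message"`
		} `json:"choices"`
	}
	if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
		return "", err
	}
	if len(out.Choices) == 0 {
		return "", fmt.Errorf("no completion returned")
	}
	return out.Choices[0].Message.Content, nil
}

func main() {
	reply, err := askModel("Hello!")
	if err != nil {
		fmt.Println("error:", err)
		return
	}
	fmt.Println(reply)
}
```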
- main.go: Main application file containing the server setup and WebSocket handling (see the sketch below)
- static/index.html: Frontend HTML file with HTMX integration
- Dockerfile: Instructions for building the Docker image
- go.mod and go.sum: Go module files for dependency management
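For orientation, here is a minimal sketch of the kind of setup main.go performs, assuming Fiber with its WebSocket middleware (github.com/gofiber/websocket/v2). The handler and the stubbed askModel below are illustrative placeholders, not the repository's actual code:

```go
package main

import (
	"log"

	"github.com/gofiber/fiber/v2"
	"github.com/gofiber/websocket/v2"
)

// askModel stands in for the LLM call shown in the earlier sketch.
func askModel(prompt string) (string, error) {
	return "echo: " + prompt, nil // placeholder reply
}

func main() {
	app := fiber.New()

	// Serve the HTMX frontend from static/.
	app.Static("/", "./static")

	// Upgrade /ws connections and relay model replies back to the client.
	app.Get("/ws", websocket.New(func(c *websocket.Conn) {
		for {
			_, msg, err := c.ReadMessage()
			if err != nil {
				return // client disconnected
			}
			reply, err := askModel(string(msg))
			if err != nil {
				reply = "error: " + err.Error()
			}
			if err := c.WriteMessage(websocket.TextMessage, []byte(reply)); err != nil {
				return
			}
		}
	}))

	log.Fatal(app.Listen(":8080"))
}
```

This mirrors the run instructions above: the server listens on port 8080 and serves static/index.html at the root.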
- Go: Backend server
- Fiber: Web framework for Go
- HTMX: Frontend interactivity
- WebSockets: Real-time communication
- Tailwind CSS: Styling
- Marked: Markdown parsing
- Highlight.js: Code syntax highlighting
Contributions are welcome! Please feel free to submit a Pull Request.
This project is open source and available under the MIT License.