An open source wireframe-to-app generator, powered by Llama 3.2 Vision and Groq.
- Llama 3.2 Vision 90B from Meta for the vision model
- Llama 3.2 Text 90B from Meta for the LLM
- Groq for LLM inference (a minimal sketch of the vision call is shown after this list)
- Uploadthing for image storage
- Next.js app router with Tailwind
- Expo Snack SDK
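
The core flow sends the uploaded wireframe image to a Llama 3.2 Vision model through Groq's chat completions API and asks it for app code. Below is a minimal sketch of that call, assuming the `groq-sdk` package, a `GROQ_API_KEY` in the environment, and the `llama-3.2-90b-vision-preview` model ID; the exact model, prompt, and helper names in the repo may differ.

```ts
import Groq from "groq-sdk";

// Reads GROQ_API_KEY from the environment (see the setup steps below).
const groq = new Groq({ apiKey: process.env.GROQ_API_KEY });

// Illustrative helper: turn a wireframe image (hosted on Uploadthing) into app code.
// The model ID and prompt are assumptions, not necessarily what the repo ships.
async function wireframeToCode(imageUrl: string): Promise<string> {
  const completion = await groq.chat.completions.create({
    model: "llama-3.2-90b-vision-preview",
    messages: [
      {
        role: "user",
        content: [
          {
            type: "text",
            text: "Describe this wireframe and generate a React Native screen that implements it.",
          },
          { type: "image_url", image_url: { url: imageUrl } },
        ],
      },
    ],
  });
  return completion.choices[0]?.message?.content ?? "";
}
```

The generated code is presumably what gets handed to the Expo Snack SDK for a live preview.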
- Clone the repo: `git clone https://github.com/mundume/quickchat-ai`
- Create a `.env.local` file and add your Groq API key: `GROQ_API_KEY=`
- Create an Uploadthing account and add the credentials to your `.env.local` file. All required values are listed in the `.env.example` file. (A sketch of how the app wires up Uploadthing follows these steps.)
- Run `pnpm i` and `pnpm dev` to install dependencies and run the app locally.
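
The Uploadthing credentials feed the file router that stores the wireframe images before they are sent to the vision model. The sketch below shows a typical Next.js App Router setup with `uploadthing/next`; the names `ourFileRouter` and `imageUploader` are placeholders, so check the repo's Uploadthing route for the real configuration.

```ts
import { createUploadthing, type FileRouter } from "uploadthing/next";

const f = createUploadthing();

// A minimal image-upload route: the stored file's URL is what the vision model receives.
export const ourFileRouter = {
  imageUploader: f({ image: { maxFileSize: "4MB" } }).onUploadComplete(
    async ({ file }) => {
      console.log("wireframe stored at", file.url);
    }
  ),
} satisfies FileRouter;

export type OurFileRouter = typeof ourFileRouter;
```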
This project was inspired by Nutlope's amazing Napkins.dev.