
server : (web ui) Various improvements, now use vite as bundler #10599

Open · wants to merge 9 commits into base: master
Conversation

@ngxson (Collaborator) commented Nov 30, 2024

Motivation

The new web UI has received significantly more positive feedback than anticipated, prompting consideration for further enhancements.

Currently, we operate without a bundler, running code directly from index.html and third-party libraries from deps.sh. However, this approach has limitations, particularly with daisyui component compatibility. It also leaves a large binary with many redundant parts inside.

Given that many llama.cpp contributors have a lower-level programming background, here's a brief overview of the current UI tech stack and why we use each piece:

  • Tailwindcss: A CSS framework that simplifies styling, used by major platforms like ChatGPT, Claude, and Hugging Face
  • Daisyui: A tailwindcss-based component library offering ready-to-use elements like chat bubbles, buttons, and themes
  • Due to the large size of these libraries (up to 2MB for pre-compiled versions), we need vite as a bundler to eliminate unused code. This also enables compilation into a single .html file, eliminating runtime dependencies.
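The dead-code elimination described above relies on Tailwind's content scanning: it only emits utility classes that actually appear in the scanned source files. A minimal sketch of what such a config could look like (the file globs and plugin line here are illustrative assumptions, not the PR's actual configuration):

```javascript
// tailwind.config.js (illustrative sketch, not the PR's actual config)
// Tailwind scans the files matched by `content` and emits only the
// utility classes that actually appear in them, which is how a
// multi-megabyte precompiled framework shrinks to a small CSS bundle.
module.exports = {
  content: [
    './index.html',
    './src/**/*.{js,ts,jsx,tsx}', // paths are assumptions for illustration
  ],
  plugins: [
    require('daisyui'), // daisyui components; unused ones are purged too
  ],
};
```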

Improvements

Key updates in this PR:

  • Project relocated to server/webui using npm for dependency management (with deps.sh script removed)
  • Integration of the vite bundler (build instructions in server/README.md). The output index.html is just under 500kb, smaller than the old approach with deps.sh (Reminder: this new index.html contains everything needed)
  • Enhanced mobile compatibility (screenshots below)
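Compiling everything into a single .html file is the kind of thing a plugin such as vite-plugin-singlefile handles; a minimal sketch of a vite config that inlines all JS and CSS assets into one index.html (assuming that plugin, which may differ from this PR's actual setup):

```javascript
// vite.config.js (minimal sketch; assumes vite-plugin-singlefile,
// which may not be exactly what this PR uses)
import { defineConfig } from 'vite';
import { viteSingleFile } from 'vite-plugin-singlefile';

export default defineConfig({
  plugins: [
    // Inline every emitted JS and CSS chunk into the output index.html,
    // so the server can embed a single file with no runtime dependencies.
    viteSingleFile(),
  ],
});
```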

For binary size, this PR reduces the final compiled binary by 1MB compared to master:

# master
$ ls -lah llama-server
-rwxr-xr-x  1 ngxson  staff   5.5M Nov 30 15:02 llama-server

# PR
$ ls -lah llama-server
-rwxr-xr-x  1 ngxson  staff   4.5M Nov 30 14:52 llama-server
[Screenshots: IMG_1847, IMG_1846, IMG_1845]

@github-actions bot added the examples, devops (improvements to build systems and github actions), and server labels on Nov 30, 2024
@ngxson ngxson marked this pull request as ready for review November 30, 2024 13:57