From 5a1df10918082f1b2b36d811d3221a0ceaaee7bb Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?Dominik=20Weckm=C3=BCller?=
<47481567+do-me@users.noreply.github.com>
Date: Sat, 7 Sep 2024 15:06:57 +0200
Subject: [PATCH] Update README.md

---
 README.md | 8 ++++----
 1 file changed, 4 insertions(+), 4 deletions(-)

diff --git a/README.md b/README.md
index 94fda11..356b9ce 100644
--- a/README.md
+++ b/README.md
@@ -2,20 +2,20 @@
-Frontend-only live semantic search and chat-with-your-documents built on transformers.js
+Frontend-only live semantic search and chat-with-your-documents built on transformers.js. Supports WASM and WebGPU!

 ## [Try the web app](https://do-me.github.io/SemanticFinder/), [install the Chrome extension](#browser-extension) or read the [introduction blog post](https://geo.rocks/post/semanticfinder-semantic-search-frontend-only/).
-Semantic search right in your browser! Calculates the embeddings and cosine similarity client-side without server-side inferencing, using [transformers.js](https://xenova.github.io/transformers.js/) and latest SOTA embedding models from Huggingface.
+## 🔥 For best performance try the [WebGPU Version here!](https://do-me.github.io/SemanticFinder/webgpu/) 🔥
-## Upcoming: WebGPU support!
-Stay tuned for announcements
+Semantic search right in your browser! Calculates embeddings and cosine similarity client-side, without server-side inferencing, using [transformers.js](https://xenova.github.io/transformers.js/) and the latest SOTA embedding models from Hugging Face.
 ## Models
 All transformers.js-compatible feature-extraction models are supported. Here is a sortable list you can go through: [daily updated list](https://do-me.github.io/trending-huggingface-models/). Download the compatible models table as xlsx, csv, json, parquet, or html here: https://github.com/do-me/trending-huggingface-models/.
+Note that the WASM backend in transformers.js supports all of the models listed above. For the best performance, use a WebGPU-compatible model.
 ## Catalogue
 You can use super fast pre-indexed examples for *really* large books like the Bible or Les Misérables with hundreds of pages and search the content in less than 2 seconds 🚀. Try one of these and convince yourself: