Slow sentence encoding models #1020
ivanstepanovftw started this conversation in General
Replies: 1 comment · 10 replies
-
Did you test in release? I get:
-
I have exported sentence-transformers/all-MiniLM-L6-v2 (~60 MiB), and encoding is very slow. Encoding a single sentence takes about 3 seconds on bare metal and 10 seconds under WASM, while PyTorch takes about 0.03 seconds.

Bare metal results:
Source code: sentence-transformers.zip
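For reference, the ~0.03 second PyTorch figure can be reproduced with the sentence-transformers package; here is a minimal timing sketch (the sentence text is just an example):

```python
import time
from sentence_transformers import SentenceTransformer

# Load the model once; loading time is excluded from the measurement below.
model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")

sentence = "This is an example sentence."

# Warm-up run so one-time initialization does not distort the timing.
model.encode(sentence)

# Time a single encode call.
start = time.perf_counter()
embedding = model.encode(sentence)
print(f"encode took {time.perf_counter() - start:.3f} s, dim = {len(embedding)}")
```

The exact number depends on hardware and whether the model runs on CPU or GPU, but it gives a concrete target for the exported model.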