OSError: CUDA_HOME environment variable is not set. #291
Comments
I don't think it runs on Metal. Only AMD/Nvidia so far.
I don't know the situation around running CUDA on Macs, if that's even possible, but yes, if you're trying to run it on Metal you definitely won't get very far. As far as I know llama.cpp is the only option for that right now.
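As an aside, a quick way to see which accelerator backends a local PyTorch build exposes (a hedged sketch; the helper name `backend_report` is made up for illustration, and it degrades gracefully if PyTorch isn't installed):

```python
import importlib.util

def backend_report():
    """Report which GPU backends the local PyTorch build exposes, if any."""
    if importlib.util.find_spec("torch") is None:
        return {"torch_installed": False}
    import torch
    mps = getattr(torch.backends, "mps", None)  # Metal backend, Apple silicon only
    return {
        "torch_installed": True,
        "cuda": torch.cuda.is_available(),        # NVIDIA GPUs
        "mps": bool(mps and mps.is_available()),  # Apple Metal
    }

print(backend_report())
```

On an M1 Mac you would expect `cuda` to be `False` even when `mps` is `True`, which is consistent with exllama (a CUDA-only project) not running there.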
Okay, thanks for answering; I follow. Consider adding a note to the README that macOS without CUDA is currently unsupported; otherwise feel free to close this out.
I'm getting these errors related to exllama when trying to load a model in the Text generation web UI on a MacBook M1:

```
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
```
@MrOiseau look in that stack trace near the top, then look here: https://github.com/oobabooga/text-generation-webui#amd-metal-intel-arc-and-cpus-without-avx2

```
cd text-generation-webui
git clone https://github.com/turboderp/exllama repositories/exllama
```

In other words, I think you need to clone this repo to `repositories/exllama`. Also, technically your issue is outside the scope of this issue and this repo.
@jamesbraza Did you find any solution to this, or is it not possible to run Exllama on a Mac M1?
I think it's possible, just didn't have time to tackle it focused 😄
Can you share which implementation of llama you're using, if any?
I am getting this on my Mac M1 (Ventura 13.5.2) with Python 3.11.5:
My Mac doesn't have an NVIDIA GPU, so I don't have CUDA. How can I get past this error?
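For reference, the error comes from the CUDA-extension build step. A minimal sketch of the usual lookup logic (mirroring PyTorch-style extension helpers, not exllama's exact source) shows why it fires on a Mac with no CUDA toolkit:

```python
import os
import shutil

def find_cuda_home():
    """Locate the CUDA toolkit the way build helpers typically do (a sketch)."""
    # 1. Explicit environment variables take priority.
    cuda_home = os.environ.get("CUDA_HOME") or os.environ.get("CUDA_PATH")
    if cuda_home is None:
        # 2. Fall back to locating nvcc on PATH (e.g. /usr/local/cuda/bin/nvcc)
        #    and taking its grandparent directory as the toolkit root.
        nvcc = shutil.which("nvcc")
        if nvcc is not None:
            cuda_home = os.path.dirname(os.path.dirname(nvcc))
    return cuda_home

if find_cuda_home() is None:
    # With no NVIDIA toolkit installed (the normal state on Apple silicon),
    # real extension builds raise something like:
    #   OSError: CUDA_HOME environment variable is not set.
    print("CUDA toolkit not found")
```

Since neither the environment variables nor `nvcc` exist on Apple silicon, every branch fails, which is exactly the `OSError` in this issue's title.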