How to run an AQLM model #250

Open · werruww opened this issue Feb 16, 2025 · 19 comments

werruww commented Feb 16, 2025

How do I run an AQLM model?

werruww commented Feb 16, 2025

ISTA-DASLab/Llama-3.2-1B-AQLM-PV-2Bit-2x8
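
For reference, a minimal sketch of running this model directly with Hugging Face transformers (outside Transformer Lab), assuming `pip install aqlm[gpu]` and a recent `transformers` are already installed and a CUDA GPU is available:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ISTA-DASLab/Llama-3.2-1B-AQLM-PV-2Bit-2x8"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# The AQLM quantization config ships with the checkpoint, so a plain
# from_pretrained call is enough; device_map="auto" places weights on the GPU.
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",
    device_map="auto",
)

inputs = tokenizer("The capital of France is", return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```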

werruww commented Feb 17, 2025

Some models require installing extra packages, for example:
pip install aqlm[gpu,cpu]
How do I install that from inside the program?

werruww commented Feb 17, 2025

Any pointers? For example, I would also need:

pip install optimum

werruww commented Feb 17, 2025

How do I install packages inside the program?

werruww commented Feb 17, 2025

Is it possible to install specific packages inside the program?

werruww commented Feb 17, 2025

How do I get to the miniconda3 environment used by transformerlab-app?

dadmobile (Member) commented:

Depending on what you are trying to do, the main ways to extend the program are:

  • add a plugin to extend functionality. Plugins have a setup script that installs the required libraries and can then add functionality to various parts of the application (serving models, training, converting between formats...see the Plugins tab)
  • edit an existing plugin to add libraries and then update its code to use those libraries

If you just want to install libraries into the environment you can do so, but you need to be in the right conda environment. If you run conda info --envs you will see your list of conda environments; there should be one with a name like Users/username/.transformerlab/envs/transformerlab.

If you want to change core functionality in the engine that runs the app, you might want to actually clone the API. In that case I suggest following the "Manual Step-by-step install" on this page:
https://transformerlab.ai/docs/install/advanced-install
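
A quick way to confirm you are in the transformerlab environment mentioned above before installing anything (a minimal sketch; the exact path varies by machine):

```python
# Run inside the activated conda environment; the printed path should point
# somewhere under .../.transformerlab/envs/transformerlab if you are in the
# environment that Transformer Lab uses (path may differ on your machine).
import sys
print(sys.executable)
```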

werruww commented Feb 17, 2025

I searched but could not find it. Is there a way to add specific libraries?

dadmobile (Member) commented:

As I mentioned, if you only want to install a library, you would use conda on the command line: run conda info --envs to find the environment, activate it with conda activate <environment>, and then install the library there with pip.

But usually, if you are installing a library, you also want to add functionality that uses it. To do that you would create or edit a plugin.
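
As an illustration only (not part of Transformer Lab's own API), once the right environment is activated you can also install a package from Python itself, which guarantees it lands in the interpreter you are running:

```python
# Hypothetical helper: install a package into the currently running Python
# interpreter (i.e. the activated environment), avoiding any ambiguity about
# which `pip` happens to be first on PATH.
import subprocess
import sys

def pip_install(package: str) -> None:
    subprocess.check_call([sys.executable, "-m", "pip", "install", package])

pip_install("optimum")  # example package mentioned earlier in this thread
```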

werruww commented Feb 17, 2025

(base) C:\Windows\system32>conda info --envs

conda environments:

base * C:\ProgramData\anaconda3

(base) C:\Windows\system32>

werruww commented Feb 17, 2025

[screenshot attached]

werruww commented Feb 17, 2025

Windows 10

dadmobile (Member) commented:

If you are running on Windows, the engine is running in WSL. So in order to interact with the environment you will have to work inside of WSL.
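
If you are not sure whether a given terminal is inside WSL or plain Windows, here is a small sketch using a standard WSL detection trick (not specific to Transformer Lab):

```python
# Returns True when this Python is running under WSL (where the Transformer Lab
# engine lives on Windows), False on plain Windows or any other OS.
import platform
import sys

def in_wsl() -> bool:
    return sys.platform == "linux" and "microsoft" in platform.uname().release.lower()

print("Inside WSL" if in_wsl() else "Not inside WSL")
```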

werruww commented Feb 18, 2025

I am on Windows, without WSL.

werruww commented Feb 18, 2025

I have run unsloth/Phi-3-mini-4k-instruct-v0-bnb-4bit. Does the program support other compressed (quantized) models?

werruww commented Feb 18, 2025

[two screenshots attached]

werruww commented Feb 18, 2025

I use Anaconda, but the program uses Miniconda3 and I don't know where its path is.

werruww commented Feb 18, 2025

How do I install libraries manually?

werruww commented Feb 18, 2025

On Windows 10, not WSL.
