Is your feature request related to a problem? Please describe.
LocalAI's documentation states that "LocalAI is focused on making AI accessible to anyone." However, while the LocalAI Gallery (https://localai.io/gallery.html) is comprehensive, it lacks clear documentation on the minimum system requirements needed to run each model variation. Specifically, it does not specify the minimum amount of RAM, GPU memory, or other resource requirements. This omission can lead to wasted time troubleshooting errors when a model fails to load due to insufficient resources, rather than an issue with the LocalAI installation itself.
Describe the solution you'd like
Display Minimum Requirements: Update the LocalAI Gallery to clearly list the minimum resource requirements (e.g., RAM, GPU memory, etc.) for each model variation.
Metadata Integration: Add metadata to each gallery item so that LocalAI can automatically check whether the current system meets the minimum resource requirements and warn the user if it does not (see the sketch after this list).
Enhanced Error Messaging: Provide more informative error messages that alert users when a model cannot be loaded because the system does not meet the necessary resource requirements.
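As a rough illustration of the metadata check, a gallery entry could declare its minimum requirements and LocalAI could compare them against the host before attempting to load the model. The sketch below is only an assumption of how this might look: the GalleryRequirements struct and its field names are hypothetical and not part of LocalAI's current gallery schema, and it reads available memory from /proc/meminfo, so it is Linux-only. A GPU memory check is omitted because it would depend on the backend in use.

```go
// Hypothetical sketch: compare a gallery item's declared minimum RAM
// against the host's available memory before loading a model.
package main

import (
	"bufio"
	"fmt"
	"os"
	"strconv"
	"strings"
)

// GalleryRequirements is a hypothetical metadata block a gallery entry
// could carry; the field names are illustrative, not LocalAI's schema.
type GalleryRequirements struct {
	MinRAMMB  uint64 // minimum system RAM in MiB
	MinVRAMMB uint64 // minimum GPU memory in MiB (0 = CPU-only is fine)
}

// availableRAMMB reads MemAvailable from /proc/meminfo (Linux only).
func availableRAMMB() (uint64, error) {
	f, err := os.Open("/proc/meminfo")
	if err != nil {
		return 0, err
	}
	defer f.Close()

	scanner := bufio.NewScanner(f)
	for scanner.Scan() {
		fields := strings.Fields(scanner.Text())
		if len(fields) >= 2 && fields[0] == "MemAvailable:" {
			kb, err := strconv.ParseUint(fields[1], 10, 64)
			if err != nil {
				return 0, err
			}
			return kb / 1024, nil
		}
	}
	return 0, fmt.Errorf("MemAvailable not found in /proc/meminfo")
}

func main() {
	// Example: a 7B Q4 model might declare roughly 8 GiB as its minimum.
	req := GalleryRequirements{MinRAMMB: 8192}

	avail, err := availableRAMMB()
	if err != nil {
		fmt.Println("could not determine available RAM:", err)
		return
	}
	if avail < req.MinRAMMB {
		fmt.Printf("warning: model requires %d MiB RAM, only %d MiB available\n",
			req.MinRAMMB, avail)
		return
	}
	fmt.Println("system meets the declared minimum requirements")
}
```

The same comparison could drive the enhanced error message: instead of a generic load failure, the user would see which requirement was not met and by how much.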
Describe alternatives you've considered
The current workaround involves manually researching and verifying system requirements, which is not user-friendly, especially for those with limited technical knowledge. An alternative could be to create a separate documentation page outlining these requirements, but integrating this information directly into the gallery and providing system checks would offer a more streamlined and accessible solution.
Update:
JanAI has similar documentation and a built-in feature for this: https://jan.ai/docs/models/manage-models#local-model