When running the Llama 8B model, it claims that it only has the get_assistant_functions() tools available, not those from get_general_assistant_functions(). With GPT, both work well.
Is this an issue of model size? If so, should we make it possible to select which tools are prioritized?
For example, providing the UniProt information to support the model might be more important than enabling dimensionality reduction. A rough sketch of what such a selection step could look like is below.
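One possible shape for this, purely as a minimal sketch and not the project's actual API: keep a user-defined priority list and trim the tool schemas to a size the smaller model can handle before they are passed in. The tool names (`get_uniprot_info`, `run_dimensionality_reduction`, `plot_results`) and the `select_tools` helper below are hypothetical, chosen only to illustrate the idea.

```python
# Hypothetical sketch, not the project's real tool-selection API.
# Assumes each tool is an OpenAI-style function schema (a dict with a "name").

def select_tools(tools, priorities, max_tools):
    """Return at most `max_tools` schemas, with prioritized names kept first."""
    by_name = {t["name"]: t for t in tools}
    selected = [by_name[n] for n in priorities if n in by_name]
    for t in tools:
        if len(selected) >= max_tools:
            break
        if t not in selected:
            selected.append(t)
    return selected[:max_tools]


# Example: prefer the UniProt lookup over dimensionality reduction when a
# smaller model only handles a few tool schemas reliably.
all_tools = [
    {"name": "get_uniprot_info", "description": "Fetch UniProt annotations"},
    {"name": "run_dimensionality_reduction", "description": "Run UMAP/PCA"},
    {"name": "plot_results", "description": "Plot analysis results"},
]
print(select_tools(all_tools, priorities=["get_uniprot_info"], max_tools=2))
```

The same priority list could also be exposed as a user-facing setting, so that which tools get dropped for smaller models is an explicit choice rather than an implicit one.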