
Unable to use Groq API key #19

Closed
aniketqw opened this issue Feb 21, 2025 · 4 comments

Comments

@aniketqw

I have attached the code, the error message, and a picture illustrating how the environment variable has been declared. I am currently using Colab.

@kevinwitte

kevinwitte commented Feb 21, 2025

I have the same issue. I'm running it locally on a Mac. The strange thing is that the error occurs even when I have set all API keys.

@rlancemartin
Collaborator

rlancemartin commented Feb 21, 2025

Good callout. The problem is that the assistant expected ANTHROPIC_API_KEY to be set for the writer model, but I realize this was not clear from the notebook example. I've cut a new release and updated the notebook to make it clear that 1) you should supply a writer model and 2) it is now possible to use a Groq model as a writer. Note that I also found that R1 via Groq is not great at tool-calling, which we use to produce structured outputs for search queries. During section writing we make many tool calls for queries, and the Groq team recommended llama-3.3-70b-versatile. However, I found deepseek-r1-distill-llama-70b to still be OK during the planning phase because there are only 2 tool calls (one for query generation and one to produce the sections), but I added a callout about this in the README in case you want to use R1 even for planning.

# Fast config with DeepSeek-R1-Distill-Llama-70B
import uuid

thread = {
    "configurable": {
        "thread_id": str(uuid.uuid4()),
        "search_api": "tavily",
        "planner_provider": "groq",
        "planner_model": "deepseek-r1-distill-llama-70b",
        "writer_provider": "groq",
        "writer_model": "llama-3.3-70b-versatile",
        "max_search_depth": 1,
    }
}
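Since the original report was about a missing key, here is a minimal sketch of wiring up the environment variables the config above relies on. The key values are placeholders; in Colab you would set these the same way (or load them from Colab's secrets) before invoking the graph:

```python
import os

# The Groq provider and the Tavily search API each read their key from a
# provider-specific environment variable; set both before running.
# (The values below are placeholders, not real keys.)
os.environ.setdefault("GROQ_API_KEY", "gsk_your_key_here")
os.environ.setdefault("TAVILY_API_KEY", "tvly_your_key_here")

# Fail early with a clear message instead of a mid-run provider error.
assert os.environ.get("GROQ_API_KEY"), "GROQ_API_KEY must be set"
assert os.environ.get("TAVILY_API_KEY"), "TAVILY_API_KEY must be set"
```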

@aniketqw
Author

aniketqw commented Feb 21, 2025

So basically we can use all the models provided by LangChain's init_chat_model, since from what I can understand it is used under the hood. In that case, what should the API key name format be when calling it (since we will have to set it as an environment variable in Colab)? And how can I pass the API key when writing the configurable?

@rlancemartin
Collaborator

> all the models provided by LangChain's init_chat_model, since from what I can understand it is used under the hood.

Right. All we need to do is add different providers to the enum in the configuration:

3aa98d8

For example, it would be easy to add Gemini (see here), but you are correct that you would need to set the API key.
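For reference, each provider supported by init_chat_model reads its key from its own environment variable. A small illustrative lookup (the mapping and helper below are my own sketch, not part of the library's API; the env-var names follow each provider's LangChain integration):

```python
# Conventional API-key environment variables for a few common
# init_chat_model providers. This table and helper are illustrative
# only, not an API exposed by LangChain.
PROVIDER_ENV_VARS = {
    "openai": "OPENAI_API_KEY",
    "anthropic": "ANTHROPIC_API_KEY",
    "groq": "GROQ_API_KEY",
    "google_genai": "GOOGLE_API_KEY",  # Gemini
}

def required_env_var(provider: str) -> str:
    """Return the env var a given provider expects its API key in."""
    try:
        return PROVIDER_ENV_VARS[provider]
    except KeyError:
        raise ValueError(f"unknown provider: {provider}") from None

print(required_env_var("groq"))  # → GROQ_API_KEY
```

So for the config above, setting GROQ_API_KEY in the Colab environment is enough; no key needs to be passed inside the configurable itself.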
