
[Feature Request]: OpenAI - connect allow_parallel_tool_calls to parallel_tool_calls option #17961

Open
GICodeWarrior opened this issue Feb 28, 2025 · 1 comment
Labels
enhancement New feature or request triage Issue needs to be triaged/prioritized

Comments

@GICodeWarrior
Contributor

Feature Description

The existing allow_parallel_tool_calls option isn't sent to OpenAI; it is only used to filter the output in some flows.

Workaround:
Pass parallel_tool_calls=False via llm_kwargs or similar.
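A minimal sketch of what the workaround amounts to at the API level: the OpenAI chat completions API accepts a top-level `parallel_tool_calls` field, so the framework (or the user, via extra kwargs) just needs to forward it into the request payload. The helper name and payload shapes below are illustrative, not the framework's actual code:

```python
def build_chat_payload(model: str, messages: list, tools: list,
                       parallel_tool_calls: bool = False) -> dict:
    """Assemble an OpenAI chat.completions request body with parallel
    tool calls explicitly disabled (a real field in the OpenAI API)."""
    return {
        "model": model,
        "messages": messages,
        "tools": tools,
        "parallel_tool_calls": parallel_tool_calls,
    }

payload = build_chat_payload(
    "gpt-4o",
    [{"role": "user", "content": "What's the weather in Paris?"}],
    [{"type": "function", "function": {"name": "get_weather"}}],
)
```

Connecting `allow_parallel_tool_calls` to this field would make the existing option do what its name suggests.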

Reason

The lack of this feature can cause issues when expecting a single response (e.g. as_structured_llm). The LLM can split its response across multiple function calls, and then the framework drops all but the first.

This is made worse because the framework appears to support disabling parallel calls via allow_parallel_tool_calls, yet the option currently has no effect.
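An illustrative sketch (not the framework's actual code) of the failure mode above: a flow that expects exactly one tool call keeps only the first, so any content the model split into a second parallel call is silently lost.

```python
def extract_structured_output(tool_calls: list) -> dict:
    # A single-response flow that assumes one tool call keeps only
    # the first; fields the model put in a second call are dropped.
    return tool_calls[0]["arguments"]

# Hypothetical response where the model split its answer in two.
calls = [
    {"name": "Record", "arguments": {"title": "Part 1"}},
    {"name": "Record", "arguments": {"title": "Part 2"}},
]
result = extract_structured_output(calls)
# Only the first call's arguments survive; "Part 2" is lost.
```

Sending `parallel_tool_calls=False` to the API prevents the model from splitting its answer in the first place.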

Value of Feature

No response

@logan-markewich
Collaborator

I welcome a PR! If not, I'll probably get to it tomorrow.
