Support for max_completion_tokens in the o1 series #10523
-
It has been resolved in #9749, but I still get an error after updating to v1.37.0. Am I missing some setting?
Microsoft.SemanticKernel.HttpOperationException: Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead.
This exception was originally thrown at this call stack.
Inner exception 1: Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead.
Replies: 2 comments
-
Tagging @RogerBarreto for visibility.
-
@aeras3637 Thanks for raising this. I was able to identify and reproduce the issue, and we have merged a fix that will be available in the next release. After the fix, to use the new `max_completion_tokens` parameter with Azure, you need to update your code to something like:

```csharp
var result = await service.GetChatMessageContentAsync("my prompt", new AzureOpenAIPromptExecutionSettings
{
    SetNewMaxCompletionTokensEnabled = true,
    MaxTokens = 1000,
});
```
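For anyone wondering what the flag changes under the hood: the error in this thread shows that o1-series models reject the legacy `max_tokens` request field and require `max_completion_tokens` instead. A minimal Python sketch (hypothetical helper, not actual Semantic Kernel code) of how such a toggle can map a single token-limit setting onto the correct request field:

```python
def build_chat_payload(prompt: str, max_tokens: int,
                       use_max_completion_tokens: bool = False) -> dict:
    """Sketch: map one token-limit setting onto the right API field.

    o1-series models reject 'max_tokens' and require 'max_completion_tokens',
    so a flag (analogous to SetNewMaxCompletionTokensEnabled) picks the key.
    """
    payload = {"messages": [{"role": "user", "content": prompt}]}
    key = "max_completion_tokens" if use_max_completion_tokens else "max_tokens"
    payload[key] = max_tokens
    return payload
```

With the flag enabled, the same `MaxTokens` value is sent as `max_completion_tokens`, which is why existing code only needs the one extra setting rather than a new property.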