Checked other resources
- I searched the LangChain documentation with the integrated search.
- I used the GitHub search to find a similar question and didn't find it.
- I am sure that this is a bug in LangChain rather than my code.
- The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).
Example Code
from typing import Literal

from pydantic import BaseModel, Field


def QuestionRouter(state):
    class RouteQuery(BaseModel):
        """Route a user query to the most relevant datasource."""

        route: Literal["search", "ordinary"] = Field(
            ...,
            description="Given a user question, choose to route it to a tool or an ordinary question.",
        )

    print("\nInside Question Router")
    structured_llm_router = llm.with_structured_output(RouteQuery)
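The snippet never shows where llm comes from; a minimal sketch of the assumed surrounding setup (the model name and the empty state are assumptions, not part of the report):

from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-4o")  # assumption: how llm is created is not shown in the report

QuestionRouter(state={})  # hypothetical call; prints the marker and builds the structured router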
Error Message and Stack Trace (if applicable)
Failed to use model_dump to serialize <class 'pydantic._internal._model_construction.ModelMetaclass'> to JSON: TypeError("BaseModel.model_dump() missing 1 required positional argument: 'self'")
Failed to use dict to serialize <class 'pydantic._internal._model_construction.ModelMetaclass'> to JSON: TypeError("BaseModel.dict() missing 1 required positional argument: 'self'")
Hello! How can I assist you with insurance broking services
Failed to use model_dump to serialize <class 'pydantic._internal._model_construction.ModelMetaclass'> to JSON: TypeError("BaseModel.model_dump() missing 1 required positional argument: 'self'")
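Note the type in the message: pydantic._internal._model_construction.ModelMetaclass is the type of the RouteQuery class itself, so something is trying to JSON-serialize the class rather than an instance of it. A minimal illustration of why that raises this exact TypeError:

from typing import Literal

from pydantic import BaseModel, Field


class RouteQuery(BaseModel):
    """Route a user query to the most relevant datasource."""

    route: Literal["search", "ordinary"] = Field(...)


RouteQuery(route="search").model_dump()  # {'route': 'search'} -- works on an instance
RouteQuery.model_dump()  # TypeError: model_dump() missing 1 required positional argument: 'self'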
Description
Using LangChain + LangGraph.
Works perfectly with Anthropic, Groq, etc.
The issue is only visible with OpenAI models.
System Info
Python 3.12.6
On Friday langchain-openai==0.3.0 was released, which changed the default method for with_structured_output. In case that is causing your issue, you can call

llm.with_structured_output(RouteQuery, method="function_calling")

to restore the previous behavior. See release notes (https://github.com/langchain-ai/langchain/releases/tag/langchain-openai%3D%3D0.3.0) for details.
If that is not related to your problem, can you provide a minimal reproducible example and share what versions of langchain packages + pydantic you are using? Here is my attempt, I am unable to reproduce the issue:
from typing import Literal

from langchain_openai import ChatOpenAI
from pydantic import BaseModel, Field


class RouteQuery(BaseModel):
    """Route a user query to the most relevant datasource."""

    route: Literal["search", "ordinary"] = Field(
        ...,
        description="Given a user question choose to route it to a tool or a ordinary question.",
    )


llm = ChatOpenAI(model="gpt-4o-mini")
structured_llm_router = llm.with_structured_output(RouteQuery)
structured_llm_router.invoke("how are you?")
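For anyone skimming: per the release notes linked above, 0.3.0 changed the default method of ChatOpenAI.with_structured_output from "function_calling" to "json_schema", so these two calls are no longer equivalent:

structured_llm_router = llm.with_structured_output(RouteQuery)  # method="json_schema" since 0.3.0
structured_llm_router = llm.with_structured_output(
    RouteQuery, method="function_calling"  # pre-0.3.0 default; the workaround above
)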
Thank you so much.
You were right, it is the breaking change in 0.3.0 that is causing this issue.
The problem occurs when you use a Pydantic class to define your output.
- Works once I pass function_calling as the method for gpt-4o (sketch below).
- With gpt-4o-mini, it looks like I need to change my prompt; the model behaves differently.
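A sketch of that fix applied to the original QuestionRouter (same RouteQuery class as in the example code; the rest of the function body is unchanged and elided):

def QuestionRouter(state):
    print("\nInside Question Router")
    # method="function_calling" restores the pre-0.3.0 structured-output behavior
    structured_llm_router = llm.with_structured_output(RouteQuery, method="function_calling")
    ...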