
with_structured_output gives an error when using OpenAI models but not Anthropic or others #29177

Open
5 tasks done
simpliatanu opened this issue Jan 13, 2025 · 2 comments
Assignees
Labels
Ɑ: core (Related to langchain-core) · investigate (Flagged for investigation)

Comments

@simpliatanu

Checked other resources

  • I added a very descriptive title to this issue.
  • I searched the LangChain documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).

Example Code

from typing import Literal

from pydantic import BaseModel, Field

# llm is assumed to be a chat model defined elsewhere in the application


def QuestionRouter(state):
    class RouteQuery(BaseModel):
        """Route a user query to the most relevant datasource."""

        route: Literal["search", "ordinary"] = Field(
            ...,
            description="Given a user question, choose to route it to a tool or an ordinary question.",
        )

    print("\n Inside Question Router")
    structured_llm_router = llm.with_structured_output(RouteQuery)

Error Message and Stack Trace (if applicable)

Failed to use model_dump to serialize <class 'pydantic._internal._model_construction.ModelMetaclass'> to JSON: TypeError("BaseModel.model_dump() missing 1 required positional argument: 'self'")
Failed to use dict to serialize <class 'pydantic._internal._model_construction.ModelMetaclass'> to JSON: TypeError("BaseModel.dict() missing 1 required positional argument: 'self'")
Hello! How can I assist you with insurance broking services
Failed to use model_dump to serialize <class 'pydantic._internal._model_construction.ModelMetaclass'> to JSON: TypeError("BaseModel.model_dump() missing 1 required positional argument: 'self'")
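For context, here is a minimal sketch (not from the original report) of how this TypeError arises in Pydantic v2: model_dump is an instance method, so the error in the logs above appears when something invokes it on the model class itself rather than on an instance.

from pydantic import BaseModel


class RouteQuery(BaseModel):
    route: str


# Calling model_dump on an instance works as expected.
RouteQuery(route="search").model_dump()  # {'route': 'search'}

# Calling it on the class reproduces the TypeError from the logs above.
try:
    RouteQuery.model_dump()
except TypeError as err:
    print(err)  # BaseModel.model_dump() missing 1 required positional argument: 'self'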

Description

Using LangChain + LangGraph (see the sketch below).
Works perfectly with Anthropic, Groq, etc.
The issue appears only with OpenAI models.
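As a point of reference, here is a hedged sketch of how such a router node might be wired into a LangGraph graph. The state shape (GraphState), node name, and model name are assumptions for illustration, not the reporter's actual code.

from typing import Literal, TypedDict

from langchain_openai import ChatOpenAI
from langgraph.graph import END, START, StateGraph
from pydantic import BaseModel, Field


class RouteQuery(BaseModel):
    """Route a user query to the most relevant datasource."""

    route: Literal["search", "ordinary"] = Field(
        ...,
        description="Given a user question, choose to route it to a tool or an ordinary question.",
    )


class GraphState(TypedDict):
    question: str
    route: str


llm = ChatOpenAI(model="gpt-4o-mini")


def question_router(state: GraphState) -> dict:
    # Bind the structured-output schema and route the incoming question.
    structured_llm_router = llm.with_structured_output(RouteQuery)
    result = structured_llm_router.invoke(state["question"])
    return {"route": result.route}


graph = StateGraph(GraphState)
graph.add_node("router", question_router)
graph.add_edge(START, "router")
graph.add_edge("router", END)
app = graph.compile()

print(app.invoke({"question": "how are you?"}))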

System Info

python 3.12.6

@langcarl langcarl bot added the investigate Flagged for investigation. label Jan 13, 2025
@dosubot dosubot bot added the Ɑ: core Related to langchain-core label Jan 13, 2025
@ccurme ccurme self-assigned this Jan 13, 2025
@ccurme
Collaborator

ccurme commented Jan 13, 2025

On Friday langchain-openai==0.3.0 was released, which changed the default method for with_structured_output.

In case that is causing your issue, you can call

llm.with_structured_output(RouteQuery, method="function_calling")

to restore the previous behavior. See release notes for details.

If that is not related to your problem, can you provide a minimal reproducible example and share which versions of the langchain packages and pydantic you are using? Here is my attempt; I am unable to reproduce the issue:

from typing import Literal

from langchain_openai import ChatOpenAI
from pydantic import BaseModel, Field


class RouteQuery(BaseModel):
    """Route a user query to the most relevant datasource."""
    route: Literal["search", "ordinary"] = Field(
        ...,
        description="Given a user question choose to route it to a tool or a ordinary question.",
    )


llm = ChatOpenAI(model="gpt-4o-mini")
structured_llm_router = llm.with_structured_output(RouteQuery)
structured_llm_router.invoke("how are you?")

@simpliatanu
Author

simpliatanu commented Jan 14, 2025 via email
