
ChatOpenAI Calls Misplaced in Langsmith Tracing When Using a Lambda Wrapper #3975

Open · 4 tasks done
ahmadajal opened this issue Mar 21, 2025 · 1 comment
Checked other resources

  • This is a bug, not a usage question. For questions, please use GitHub Discussions.
  • I added a clear and detailed title that summarizes the issue.
  • I read what a minimal reproducible example is (https://stackoverflow.com/help/minimal-reproducible-example).
  • I included a self-contained, minimal example that demonstrates the issue, INCLUDING all the relevant imports. The code runs AS IS to reproduce the issue.

Example Code

from datetime import datetime
import operator
import os
from typing import Annotated, Sequence, TypedDict

from dotenv import load_dotenv
from langchain_core.messages import AIMessage, BaseMessage, HumanMessage
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.runnables import RunnableLambda
from langchain.tools import Tool
from langchain_openai import ChatOpenAI
from langfuse.callback import CallbackHandler
from langgraph.graph import END, START, StateGraph
from langgraph.prebuilt import ToolNode

# Your env file should contain your OPENAI_API_KEY plus your LangSmith/Langfuse credentials
load_dotenv('PATH-to-env-file')

openai_key = os.getenv('OPENAI_API_KEY')

class AgentState(TypedDict):
    messages: Annotated[Sequence[BaseMessage], operator.add]
    next: str

# Define a new tool that returns the current datetime
datetime_tool = Tool(
    name="Datetime",
    func=lambda x: datetime.now().isoformat(),
    description="Returns the current datetime",
)

def to_continue(state: AgentState):
    """End graph if last message has no tool calls."""
    last_msg = state['messages'][-1]
    if isinstance(last_msg, AIMessage) and last_msg.tool_calls:
        return 'tools'
    return END

system_prompt = """
You are an agent tasked with answering all user queries by indicating the
today's date and time first. For instance if user asks "what is the capital
of France?", you respond:
"Today is 21 March 2025, the time is 15:10."
"The capital of France is Paris."

You get the today's date and time from the Datetime tool.
"""

llm = ChatOpenAI(model='gpt-4o', api_key=openai_key)

agent_node = (
    ChatPromptTemplate.from_messages([
        ('system', system_prompt),
        ('placeholder', '{messages}'),
        ('system', "Show today's date and time and then respond to the user query"),
    ])
    | llm.bind_tools([datetime_tool])
    | RunnableLambda(lambda x: {'messages': [x]})
)

tool_node = ToolNode([datetime_tool])

# Define the agent graph
workflow = StateGraph(AgentState)
workflow.add_node('agent', agent_node)
workflow.add_node('tools', tool_node)

workflow.add_edge(START, 'agent')
workflow.add_conditional_edges('agent', to_continue)
workflow.add_edge('tools', 'agent')
graph = workflow.compile(debug=False)

Agent = graph | (lambda x: {'output': x['messages'][-1].content, **x})

langfuse_handler = CallbackHandler()

# Invoking the CompiledGraph
res1 = graph.invoke({'messages': [HumanMessage(content='What is the capital of Switzerland?')]},
                    config={'callbacks': [langfuse_handler]})

# Invoking the RunnableSequence
res2 = Agent.invoke({'messages': [HumanMessage(content='What is the capital of Switzerland?')]},
                    config={'callbacks': [langfuse_handler]})

print(res1['messages'][-1].content)
print(res2.get('output'))

Error Message and Stack Trace (if applicable)

Description

I have a simple agent implemented as a LangGraph, consisting of two nodes:
1. Agent Node – Handles user input and generates a response.
2. Datetime Tool Node – Provides the current date and time.

The agent is designed to first retrieve the current date and time from the tool and then generate its response accordingly.

Expected Behavior

When I invoke the CompiledGraph directly, the Langsmith tracing correctly associates each ChatOpenAI call (which includes system prompts and user/AI messages) with the corresponding execution of the agent node in the graph. (See the first image below.)

Image

However, if I wrap the compiled graph inside a lambda function (e.g., to return the result as a dictionary with an "output" key), the tracing behavior changes. In this case, all ChatOpenAI calls appear at the end of the trace, rather than being correctly nested within the agent node’s execution. (See the second image below.)

Image

The same thing also happens with Langfuse.

System Info

System Information

OS: Darwin
OS Version: Darwin Kernel Version 24.3.0: Thu Jan 2 20:24:16 PST 2025; root:xnu-11215.81.4~3/RELEASE_ARM64_T6000
Python Version: 3.12.7 | packaged by Anaconda, Inc. | (main, Oct 4 2024, 08:22:19) [Clang 14.0.6 ]

Package Information

langchain_core: 0.3.45
langchain: 0.3.21
langchain_community: 0.3.20
langsmith: 0.3.17
langchain_openai: 0.3.9
langchain_text_splitters: 0.3.7
langgraph_sdk: 0.1.57

@hinthornw self-assigned this Mar 21, 2025

@hinthornw (Contributor)

Can reproduce - delayed in getting the fix out.
