I included a self-contained, minimal example that demonstrates the issue, INCLUDING all the relevant imports. The code runs AS IS to reproduce the issue.
Example Code
from datetime import datetime
import operator
import os
from typing import Annotated, Sequence, TypedDict

from dotenv import load_dotenv
from langchain_core.messages import AIMessage, BaseMessage, HumanMessage
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.runnables import RunnableLambda
from langchain.tools import Tool
from langchain_openai import ChatOpenAI
from langfuse.callback import CallbackHandler
from langgraph.graph import END, START, StateGraph
from langgraph.prebuilt import ToolNode

# Your env file should contain your OPENAI_API_KEY + your Langsmith credentials
load_dotenv('PATH-to-env-file')
openai_key = os.getenv('OPENAI_API_KEY')


class AgentState(TypedDict):
    messages: Annotated[Sequence[BaseMessage], operator.add]
    next: str


# Define a new tool that returns the current datetime
datetime_tool = Tool(
    name="Datetime",
    func=lambda x: datetime.now().isoformat(),
    description="Returns the current datetime",
)


def to_continue(state: AgentState):
    """End graph if last message has no tool calls."""
    last_msg = state['messages'][-1]
    if isinstance(last_msg, AIMessage) and last_msg.tool_calls:
        return 'tools'
    return END


system_prompt = """You are an agent tasked with answering all user queries by indicating the
today's date and time first. For instance if user asks "what is the capital
of France?", you respond:
"Today is 21 March 2025, the time is 15:10."
"The capital of France is Paris."
You get the today's date and time from the Datetime tool."""

llm = ChatOpenAI(model='gpt-4o', api_key=openai_key)

agent_node = (
    ChatPromptTemplate.from_messages([
        ('system', system_prompt),
        ('placeholder', '{messages}'),
        ('system', 'Show the today date and time and then respond to the user query')])
    | llm.bind_tools([datetime_tool])
    | RunnableLambda(lambda x: {'messages': [x]}))

tool_node = ToolNode([datetime_tool])

# Define the agent graph
workflow = StateGraph(AgentState)
workflow.add_node('agent', agent_node)
workflow.add_node('tools', tool_node)
workflow.add_edge(START, 'agent')
workflow.add_conditional_edges('agent', to_continue)
workflow.add_edge('tools', 'agent')
graph = workflow.compile(debug=False)

Agent = graph | (lambda x: {'output': x['messages'][-1].content, **x})

langfuse_handler = CallbackHandler()

# Invoking the CompiledGraph
res1 = graph.invoke({'messages': [HumanMessage(content='What is the capital of Switzerland?')]},
                    config={'callbacks': [langfuse_handler]})

# Invoking the RunnableSequence
res2 = Agent.invoke({'messages': [HumanMessage(content='What is the capital of Switzerland?')]},
                    config={'callbacks': [langfuse_handler]})

print(res1['messages'][-1].content)
print(res2.get('output'))
Error Message and Stack Trace (if applicable)
Description
I have a simple agent implemented as a LangGraph, consisting of two nodes:
1. Agent Node – Handles user input and generates a response.
2. Datetime Tool Node – Provides the current date and time.
The agent is designed to first retrieve the current date and time from the tool and then generate its response accordingly.
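The `Annotated[Sequence[BaseMessage], operator.add]` annotation in the state schema tells LangGraph to merge each node's returned messages into the existing state by concatenation, which is how the agent and tool turns accumulate. A minimal pure-Python sketch of that reducer semantics (the `merge_state` helper is hypothetical, for illustration only; it is not a LangGraph API):

```python
import operator

def merge_state(old, update, reducers):
    """Hypothetical illustration of state merging: keys with a reducer
    are combined (here, list concatenation via operator.add); other
    keys are simply overwritten by the node's update."""
    merged = dict(old)
    for key, value in update.items():
        if key in reducers and key in old:
            merged[key] = reducers[key](old[key], value)
        else:
            merged[key] = value
    return merged

state = {'messages': ['human: What is the capital of Switzerland?']}
update = {'messages': ['ai: Today is ... The capital of Switzerland is Bern.']}
state = merge_state(state, update, {'messages': operator.add})
# state['messages'] now holds both messages, in order
```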
Expected Behavior
When I invoke the CompiledGraph directly, LangSmith tracing correctly associates each ChatOpenAI call (which includes system prompts and user/AI messages) with the corresponding execution of the agent node in the graph. (See the first image below.)
However, if I wrap the compiled graph inside a lambda function (e.g., to return the result as a dictionary with an "output" key), the tracing behavior changes. In this case, all ChatOpenAI calls appear at the end of the trace, rather than being correctly nested within the agent node’s execution. (See the second image below.)
The same thing also happens with Langfuse.
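As a workaround sketch (my suggestion, not from the issue, and not verified against LangSmith or Langfuse): invoke the compiled graph directly so the ChatOpenAI calls stay nested under their agent-node runs, and build the `{'output': ...}` dict in plain Python outside any runnable chain. The `run_agent` helper and the fake graph below are hypothetical, for illustration only:

```python
def run_agent(graph_invoke, inputs, config=None):
    """Invoke the compiled graph directly (tracing sees only the graph),
    then attach the 'output' key as ordinary post-processing."""
    res = graph_invoke(inputs, config=config)
    return {'output': res['messages'][-1].content, **res}


# Hypothetical stand-in for graph.invoke, to show the calling convention
# without needing API keys or a real graph.
class FakeMessage:
    def __init__(self, content):
        self.content = content

def fake_graph_invoke(inputs, config=None):
    reply = FakeMessage('The capital of Switzerland is Bern.')
    return {'messages': list(inputs['messages']) + [reply]}

result = run_agent(fake_graph_invoke,
                   {'messages': [FakeMessage('What is the capital of Switzerland?')]})
```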
System Info
System Information
OS: Darwin
OS Version: Darwin Kernel Version 24.3.0: Thu Jan 2 20:24:16 PST 2025; root:xnu-11215.81.4~3/RELEASE_ARM64_T6000
Python Version: 3.12.7 | packaged by Anaconda, Inc. | (main, Oct 4 2024, 08:22:19) [Clang 14.0.6 ]