
usage_metadata return None in Langgraph Studio #3936

Open
4 tasks done
DmitryKatson opened this issue Mar 20, 2025 · 7 comments
Comments

@DmitryKatson

Checked other resources

  • This is a bug, not a usage question. For questions, please use GitHub Discussions.
  • I added a clear and detailed title that summarizes the issue.
  • I read what a minimal reproducible example is (https://stackoverflow.com/help/minimal-reproducible-example).
  • I included a self-contained, minimal example that demonstrates the issue INCLUDING all the relevant imports. The code runs AS IS to reproduce the issue.

Example Code

from typing import Dict, Any
from langchain_core.messages import SystemMessage
from langchain_openai import AzureChatOpenAI
from langgraph.graph import START, StateGraph, MessagesState
import os

llm = AzureChatOpenAI(
    azure_deployment=os.getenv("AZURE_OPENAI_DEPLOYMENT_NAME"),
    openai_api_version=os.getenv("OPENAI_API_VERSION"),
    azure_endpoint=os.getenv("AZURE_OPENAI_ENDPOINT"),
    temperature=0
)

# System message
system_message = SystemMessage(content="You are a helpful assistant.")

# Define the basic assistant node
def assistant(state: MessagesState) -> Dict[str, Any]:
    """Basic assistant node that processes messages."""
    result = llm.invoke([system_message] + state["messages"])
    print("Usage metadata: ", result.usage_metadata)
    return {"messages": [result]}

# Build graph
builder = StateGraph(MessagesState)

# Add nodes
builder.add_node("assistant", assistant)

# Set entry and finish points
builder.set_entry_point("assistant")
builder.set_finish_point("assistant")

# Compile graph
graph = builder.compile()

Error Message and Stack Trace (if applicable)

result.usage_metadata is None

Description

When trying to access token usage inside the graph, usage_metadata always returns None.

result = llm.invoke("hello")
print("Usage metadata: ", result.usage_metadata)

Usage metadata: None.

However, if I run exactly the same code outside of LangGraph (in a test.py file):

from langchain_openai import AzureChatOpenAI
import os

llm = AzureChatOpenAI(
    azure_deployment=os.getenv("AZURE_OPENAI_DEPLOYMENT_NAME"),
    openai_api_version=os.getenv("OPENAI_API_VERSION"),
    azure_endpoint=os.getenv("AZURE_OPENAI_ENDPOINT"),
    temperature=0
)

result = llm.invoke("hello")
print(result.usage_metadata)

{'input_tokens': 8, 'output_tokens': 11, 'total_tokens': 19, 'input_token_details': {'audio': 0, 'cache_read': 0}, 'output_token_details': {'audio': 0, 'reasoning': 0}}
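As an aside, once usage_metadata is populated like this, per-call counts can be summed across a multi-node run with plain dict arithmetic. A minimal sketch (total_usage is a hypothetical helper, not part of LangChain), using dicts shaped like the output above:

```python
# Hypothetical helper (not part of LangChain): sum the usage_metadata
# dicts collected from several model results during a graph run.
def total_usage(usages):
    totals = {"input_tokens": 0, "output_tokens": 0, "total_tokens": 0}
    for usage in usages:
        if usage is None:  # usage_metadata can be None, as in this issue
            continue
        for key in totals:
            totals[key] += usage.get(key, 0)
    return totals

# Two usage_metadata dicts shaped like the output shown above:
print(total_usage([
    {"input_tokens": 8, "output_tokens": 11, "total_tokens": 19},
    {"input_tokens": 20, "output_tokens": 5, "total_tokens": 25},
    None,
]))
# → {'input_tokens': 28, 'output_tokens': 16, 'total_tokens': 44}
```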

The same happens with llm.with_structured_output(myPydanticClass, include_raw=True): inside LangGraph, token usage information is not returned, but when I run it outside, everything works as expected.

Am I doing something wrong?

System Info

System Information

OS: Darwin
OS Version: Darwin Kernel Version 24.0.0: Mon Aug 12 20:49:48 PDT 2024; root:xnu-11215.1.10~2/RELEASE_ARM64_T8103
Python Version: 3.12.6 (v3.12.6:a4a2d2b0d85, Sep 6 2024, 16:08:03) [Clang 13.0.0 (clang-1300.0.29.30)]

Package Information

langchain_core: 0.3.46
langchain: 0.3.21
langchain_community: 0.3.20
langsmith: 0.3.18
langchain_openai: 0.3.9
langchain_text_splitters: 0.3.7
langgraph_api: 0.0.31
langgraph_cli: 0.1.77
langgraph_license: Installed. No version info available.
langgraph_sdk: 0.1.58
langgraph_storage: Installed. No version info available.

Optional packages not installed

langserve

Other Dependencies

aiohttp<4.0.0,>=3.8.3: Installed. No version info available.
async-timeout<5.0.0,>=4.0.0;: Installed. No version info available.
click: 8.1.8
cryptography: 43.0.3
dataclasses-json<0.7,>=0.5.7: Installed. No version info available.
httpx: 0.28.1
httpx-sse<1.0.0,>=0.4.0: Installed. No version info available.
jsonpatch<2.0,>=1.33: Installed. No version info available.
jsonschema-rs: 0.29.1
langchain-anthropic;: Installed. No version info available.
langchain-aws;: Installed. No version info available.
langchain-azure-ai;: Installed. No version info available.
langchain-cohere;: Installed. No version info available.
langchain-community;: Installed. No version info available.
langchain-core<1.0.0,>=0.3.45: Installed. No version info available.
langchain-deepseek;: Installed. No version info available.
langchain-fireworks;: Installed. No version info available.
langchain-google-genai;: Installed. No version info available.
langchain-google-vertexai;: Installed. No version info available.
langchain-groq;: Installed. No version info available.
langchain-huggingface;: Installed. No version info available.
langchain-mistralai;: Installed. No version info available.
langchain-ollama;: Installed. No version info available.
langchain-openai;: Installed. No version info available.
langchain-text-splitters<1.0.0,>=0.3.7: Installed. No version info available.
langchain-together;: Installed. No version info available.
langchain-xai;: Installed. No version info available.
langchain<1.0.0,>=0.3.21: Installed. No version info available.
langgraph: 0.3.18
langgraph-checkpoint: 2.0.21
langsmith-pyo3: Installed. No version info available.
langsmith<0.4,>=0.1.125: Installed. No version info available.
langsmith<0.4,>=0.1.17: Installed. No version info available.
numpy<3,>=1.26.2: Installed. No version info available.
openai-agents: Installed. No version info available.
openai<2.0.0,>=1.66.3: Installed. No version info available.
opentelemetry-api: Installed. No version info available.
opentelemetry-exporter-otlp-proto-http: Installed. No version info available.
opentelemetry-sdk: Installed. No version info available.
orjson: 3.10.15
packaging: 24.2
packaging<25,>=23.2: Installed. No version info available.
pydantic: 2.10.6
pydantic-settings<3.0.0,>=2.4.0: Installed. No version info available.
pydantic<3.0.0,>=2.5.2;: Installed. No version info available.
pydantic<3.0.0,>=2.7.4: Installed. No version info available.
pydantic<3.0.0,>=2.7.4;: Installed. No version info available.
pyjwt: 2.10.1
pytest: Installed. No version info available.
python-dotenv: 1.0.1
PyYAML>=5.3: Installed. No version info available.
requests: 2.32.3
requests-toolbelt: 1.0.0
requests<3,>=2: Installed. No version info available.
rich: Installed. No version info available.
SQLAlchemy<3,>=1.4: Installed. No version info available.
sse-starlette: 2.1.3
starlette: 0.46.1
structlog: 25.2.0
tenacity: 9.0.0
tenacity!=8.4.0,<10,>=8.1.0: Installed. No version info available.
tenacity!=8.4.0,<10.0.0,>=8.1.0: Installed. No version info available.
tiktoken<1,>=0.7: Installed. No version info available.
typing-extensions>=4.7: Installed. No version info available.
uvicorn: 0.34.0
watchfiles: 1.0.4
zstandard: 0.23.0

@hinthornw
Contributor

hinthornw commented Mar 20, 2025

Are you sure you're running from the same Python environment? LangGraph has no concept of LLMs, etc., so I'm not sure how the two would interact in this way.

@gbaian10
Contributor

I'm not sure about the situation with AzureChatOpenAI, but the output from ChatOpenAI seems normal?

from typing import Any

from dotenv import load_dotenv
from langchain_core.messages import SystemMessage
from langchain_openai import ChatOpenAI
from langgraph.graph import MessagesState, StateGraph
from rich import get_console

load_dotenv()
llm = ChatOpenAI(model="gpt-4o-mini")

system_message = SystemMessage(content="You are a helpful assistant.")


def assistant(state: MessagesState) -> dict[str, Any]:
    result = llm.invoke([system_message] + state["messages"])
    print("Usage metadata: ", result.usage_metadata)
    return {"messages": [result]}


builder = StateGraph(MessagesState)
builder.add_node("assistant", assistant)
builder.set_entry_point("assistant")
builder.set_finish_point("assistant")

graph = builder.compile()
result = graph.invoke({"messages": [("human", "hello")]})
get_console().print(result)

@DmitryKatson
Author

DmitryKatson commented Mar 20, 2025 via email

@DmitryKatson
Author

DmitryKatson commented Mar 21, 2025

I'm not sure about the situation with AzureChatOpenAI, but the output from ChatOpenAI seems normal?

from typing import Any

from dotenv import load_dotenv
from langchain_core.messages import SystemMessage
from langchain_openai import ChatOpenAI
from langgraph.graph import MessagesState, StateGraph
from rich import get_console

load_dotenv()
llm = ChatOpenAI(model="gpt-4o-mini")

system_message = SystemMessage(content="You are a helpful assistant.")

def assistant(state: MessagesState) -> dict[str, Any]:
    result = llm.invoke([system_message] + state["messages"])
    print("Usage metadata: ", result.usage_metadata)
    return {"messages": [result]}

builder = StateGraph(MessagesState)
builder.add_node("assistant", assistant)
builder.set_entry_point("assistant")
builder.set_finish_point("assistant")

graph = builder.compile()
result = graph.invoke({"messages": [("human", "hello")]})
get_console().print(result)

Interesting.
When I run this code directly, I get the expected result:

[screenshot]

But when I use LangGraph Studio, I get None:

langgraph dev

[screenshot]

@eyurtsev
Collaborator

eyurtsev commented Mar 21, 2025

When you're running with langgraph dev, maybe it's using a different env?

You can use from langchain_core.sys_info import print_sys_info; print_sys_info() to get env information printed out about all packages.

@DmitryKatson
Author

When you're running using langgraph dev maybe it's using a different env?

You can use from langchain_core.sys_info import print_sys_info; print_sys_info() to get env information printed out about all packages.

Yes, absolutely identical. I reproduced this on a brand-new, fresh VM with everything installed from scratch. LangGraph Studio and direct Python execution use the same environment.

[screenshot]

Same issue

[screenshot]

@DmitryKatson
Author

DmitryKatson commented Mar 22, 2025

Interestingly, when I start LangGraph with langgraph dev using this code, it runs the graph and I get token usage:

[screenshot]

and I see this in the LangSmith traces as well:

[screenshot]

but when I invoke the question through the Studio, it doesn't return the usage_metadata.

[screenshots]

If I call this graph through the API, I get usage_metadata as well:

[screenshots]

So, for some reason, the problem is specific to calls made from the Studio.

@DmitryKatson changed the title from "usage_metadata return None in Langgraph" to "usage_metadata return None in Langgraph Studio" on Mar 22, 2025