ToolRuntime isn't injected into BaseTool subclass's _run #34558

@sammyne

Checked other resources

  • This is a bug, not a usage question.
  • I added a clear and descriptive title that summarizes this issue.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).
  • This is not related to the langchain-community package.
  • I posted a self-contained, minimal, reproducible example. A maintainer can copy it and run it AS IS.

Package (Required)

  • langchain
  • langchain-openai
  • langchain-anthropic
  • langchain-classic
  • langchain-core
  • langchain-cli
  • langchain-model-profiles
  • langchain-tests
  • langchain-text-splitters
  • langchain-chroma
  • langchain-deepseek
  • langchain-exa
  • langchain-fireworks
  • langchain-groq
  • langchain-huggingface
  • langchain-mistralai
  • langchain-nomic
  • langchain-ollama
  • langchain-perplexity
  • langchain-prompty
  • langchain-qdrant
  • langchain-xai
  • Other / not sure / general

Example Code (Python)

from typing import Type

from langchain.tools import ToolRuntime
from langchain_core.tools import BaseTool
from pydantic import BaseModel, Field


class MultiplyInput(BaseModel):
    a: int = Field(description="first number")
    b: int = Field(description="second number")


class Multipler(BaseTool):
    name: str = "Multiplier"
    description: str = "a tool for multiplying two numbers"
    args_schema: Type[BaseModel] = MultiplyInput
    return_direct: bool = False

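    # `runtime` is expected to be injected by the agent's tool-calling machinery,
    # not supplied by the model as a tool argument.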
    def _run(self, a: int, b: int, runtime: ToolRuntime) -> int:
        print(f"runtime = {runtime}")

        return a * b


if __name__ == "__main__":
    import os

    import dotenv
    from langchain.agents import create_agent
    from langchain_openai import ChatOpenAI

    # load OPENAI_API_KEY, OPENAI_API_BASE_URL and OPENAI_MODEL from .env
    dotenv.load_dotenv()

    # model = ChatOpenAI(
    #     api_key=os.environ["OPENAI_API_KEY"],
    #     base_url=os.environ["OPENAI_API_BASE_URL"],
    #     model=os.environ["OPENAI_MODEL"],
    # )
    model = ChatOpenAI(model="gpt-4o")

    agent = create_agent(
        model, tools=[Multipler()], system_prompt="You are a helpful assistant."
    )

    r = agent.invoke({"messages": [{"role": "user", "content": "calculate 2 * 3"}]})

    for v in r["messages"]:
        v.pretty_print()

Error Message and Stack Trace (if applicable)

Traceback (most recent call last):
  File "/Users/sammyne/github.com/sammyne/langmem0/./examples/hi.py", line 46, in <module>
    r = agent.invoke({"messages": [{"role": "user", "content": "calculate 2 * 3"}]})
  File "/Users/sammyne/github.com/sammyne/langmem0/.venv/lib/python3.13/site-packages/langgraph/pregel/main.py", line 3068, in invoke
    for chunk in self.stream(
                 ~~~~~~~~~~~^
        input,
        ^^^^^^
    ...<10 lines>...
        **kwargs,
        ^^^^^^^^^
    ):
    ^
  File "/Users/sammyne/github.com/sammyne/langmem0/.venv/lib/python3.13/site-packages/langgraph/pregel/main.py", line 2643, in stream
    for _ in runner.tick(
             ~~~~~~~~~~~^
        [t for t in loop.tasks.values() if not t.writes],
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    ...<2 lines>...
        schedule_task=loop.accept_push,
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    ):
    ^
  File "/Users/sammyne/github.com/sammyne/langmem0/.venv/lib/python3.13/site-packages/langgraph/pregel/_runner.py", line 167, in tick
    run_with_retry(
    ~~~~~~~~~~~~~~^
        t,
        ^^
    ...<10 lines>...
        },
        ^^
    )
    ^
  File "/Users/sammyne/github.com/sammyne/langmem0/.venv/lib/python3.13/site-packages/langgraph/pregel/_retry.py", line 42, in run_with_retry
    return task.proc.invoke(task.input, config)
           ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^
  File "/Users/sammyne/github.com/sammyne/langmem0/.venv/lib/python3.13/site-packages/langgraph/_internal/_runnable.py", line 656, in invoke
    input = context.run(step.invoke, input, config, **kwargs)
  File "/Users/sammyne/github.com/sammyne/langmem0/.venv/lib/python3.13/site-packages/langgraph/_internal/_runnable.py", line 400, in invoke
    ret = self.func(*args, **kwargs)
  File "/Users/sammyne/github.com/sammyne/langmem0/.venv/lib/python3.13/site-packages/langgraph/prebuilt/tool_node.py", line 799, in _func
    outputs = list(
        executor.map(self._run_one, tool_calls, input_types, tool_runtimes)
    )
  File "/Users/xiangminli/.local/share/uv/python/cpython-3.13.11-macos-aarch64-none/lib/python3.13/concurrent/futures/_base.py", line 619, in result_iterator
    yield _result_or_cancel(fs.pop())
          ~~~~~~~~~~~~~~~~~^^^^^^^^^^
  File "/Users/xiangminli/.local/share/uv/python/cpython-3.13.11-macos-aarch64-none/lib/python3.13/concurrent/futures/_base.py", line 317, in _result_or_cancel
    return fut.result(timeout)
           ~~~~~~~~~~^^^^^^^^^
  File "/Users/xiangminli/.local/share/uv/python/cpython-3.13.11-macos-aarch64-none/lib/python3.13/concurrent/futures/_base.py", line 449, in result
    return self.__get_result()
           ~~~~~~~~~~~~~~~~~^^
  File "/Users/xiangminli/.local/share/uv/python/cpython-3.13.11-macos-aarch64-none/lib/python3.13/concurrent/futures/_base.py", line 401, in __get_result
    raise self._exception
  File "/Users/xiangminli/.local/share/uv/python/cpython-3.13.11-macos-aarch64-none/lib/python3.13/concurrent/futures/thread.py", line 59, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/Users/sammyne/github.com/sammyne/langmem0/.venv/lib/python3.13/site-packages/langchain_core/runnables/config.py", line 551, in _wrapped_fn
    return contexts.pop().run(fn, *args)
           ~~~~~~~~~~~~~~~~~~^^^^^^^^^^^
  File "/Users/sammyne/github.com/sammyne/langmem0/.venv/lib/python3.13/site-packages/langgraph/prebuilt/tool_node.py", line 1010, in _run_one
    return self._execute_tool_sync(tool_request, input_type, config)
           ~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/sammyne/github.com/sammyne/langmem0/.venv/lib/python3.13/site-packages/langgraph/prebuilt/tool_node.py", line 959, in _execute_tool_sync
    content = _handle_tool_error(e, flag=self._handle_tool_errors)
  File "/Users/sammyne/github.com/sammyne/langmem0/.venv/lib/python3.13/site-packages/langgraph/prebuilt/tool_node.py", line 424, in _handle_tool_error
    content = flag(e)  # type: ignore [assignment, call-arg]
  File "/Users/sammyne/github.com/sammyne/langmem0/.venv/lib/python3.13/site-packages/langgraph/prebuilt/tool_node.py", line 381, in _default_handle_tool_errors
    raise e
  File "/Users/sammyne/github.com/sammyne/langmem0/.venv/lib/python3.13/site-packages/langgraph/prebuilt/tool_node.py", line 916, in _execute_tool_sync
    response = tool.invoke(call_args, config)
  File "/Users/sammyne/github.com/sammyne/langmem0/.venv/lib/python3.13/site-packages/langchain_core/tools/base.py", line 629, in invoke
    return self.run(tool_input, **kwargs)
           ~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/sammyne/github.com/sammyne/langmem0/.venv/lib/python3.13/site-packages/langchain_core/tools/base.py", line 981, in run
    raise error_to_raise
  File "/Users/sammyne/github.com/sammyne/langmem0/.venv/lib/python3.13/site-packages/langchain_core/tools/base.py", line 947, in run
    response = context.run(self._run, *tool_args, **tool_kwargs)
TypeError: Multipler._run() missing 1 required positional argument: 'runtime'
During task with name 'tools' and id '4967121b-b0b3-7ca9-cc4e-818167697ccb'

Description

  • I'm trying to build a custom tool by subclassing BaseTool and want to access a runtime argument of type ToolRuntime inside the overridden _run method.
  • Expected: the runtime argument is injected and ToolRuntime is accessible from _run.
  • Actual: _run is invoked without the runtime argument and raises TypeError: Multipler._run() missing 1 required positional argument: 'runtime'. A decorator-based sketch for comparison follows this list.
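
For comparison, below is a minimal sketch of the same tool written with the @tool decorator instead of a BaseTool subclass. It assumes the decorator recognizes the ToolRuntime annotation and injects the runtime at call time (my reading of the langchain v1 docs); the decorator path is not part of the original reproduction, so treat it as a workaround to verify rather than confirmed behavior.

from langchain.tools import ToolRuntime, tool


@tool
def multiply(a: int, b: int, runtime: ToolRuntime) -> int:
    """A tool for multiplying two numbers."""
    # `runtime` should be filled in by the tool-calling machinery, not by the model.
    print(f"runtime = {runtime}")
    return a * b

If this form does receive the runtime when passed to create_agent via tools=[multiply], the problem is specific to the BaseTool subclass path.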

System Info

System Information

OS: Darwin
OS Version: Darwin Kernel Version 24.6.0: Mon Jul 14 11:30:30 PDT 2025; root:xnu-11417.140.69~1/RELEASE_ARM64_T6020
Python Version: 3.13.11 (main, Dec 17 2025, 20:55:16) [Clang 21.1.4 ]

Package Information

langchain_core: 1.2.5
langchain: 1.2.0
langsmith: 0.5.2
langchain_openai: 1.1.6
langgraph_sdk: 0.3.1

Optional packages not installed

langserve

Other Dependencies

httpx: 0.28.1
jsonpatch: 1.33
langgraph: 1.0.5
openai: 2.14.0
orjson: 3.11.5
packaging: 25.0
pydantic: 2.12.5
pyyaml: 6.0.3
requests: 2.32.5
requests-toolbelt: 1.0.0
tenacity: 9.1.2
tiktoken: 0.12.0
typing-extensions: 4.15.0
uuid-utils: 0.12.0
zstandard: 0.25.0


Labels

  • bug: Related to a bug, vulnerability, unexpected error with an existing feature
  • langchain: `langchain` package issues & PRs
  • openai: `langchain-openai` package issues & PRs
