# Human-in-the-loop

To review, edit, and approve tool calls in an agent, you can use LangGraph's built-in human-in-the-loop (HIL) features, in particular the [interrupt()](https://langgraph.com.cn/reference/types/index.html#langgraph.types.interrupt) primitive.

LangGraph lets you pause execution indefinitely (for minutes, hours, or even days) until human input is received.

This is possible because the agent's state is checkpointed to a database, which allows the system to persist the execution context and resume the workflow later, picking up right where it left off.
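For long-lived pauses in production, you would typically swap the in-memory checkpointer used in the examples below for a persistent one, so that an interrupted thread survives a process restart. A minimal sketch, assuming the optional `langgraph-checkpoint-sqlite` package is installed:

```python
# Sketch only: persist checkpoints to SQLite instead of keeping them in memory.
# Assumes the optional `langgraph-checkpoint-sqlite` package is installed.
from langgraph.checkpoint.sqlite import SqliteSaver

# `from_conn_string` is used as a context manager in recent versions.
with SqliteSaver.from_conn_string("checkpoints.db") as checkpointer:
    # Pass `checkpointer` to `create_react_agent(...)` exactly as shown below;
    # interrupted threads can then be resumed even after the process restarts.
    ...
```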

For a deeper dive into human-in-the-loop concepts, see the conceptual guide.


A human can review and edit the agent's output before execution continues. This is especially critical in applications where tool calls may be sensitive or require human oversight.

## Add a human approval step to a tool

  1. Use `interrupt()` in the tool to pause execution.
  2. Resume with `Command(resume=...)` to continue based on human input.

API reference: InMemorySaver | interrupt | create_react_agent

<code tabindex="0"><span id="__span-0-1">from langgraph.checkpoint.memory import InMemorySaver
<span id="__span-0-2">from langgraph.types import interrupt
<span id="__span-0-3">from langgraph.prebuilt import create_react_agent
<span id="__span-0-4">
<span id="__span-0-5"># An example of a sensitive tool that requires human review / approval
<span id="__span-0-6">def book_hotel(hotel_name: str):
<span id="__span-0-7">    """Book a hotel"""
<span id="__span-0-8">    response = interrupt(  
<span id="__span-0-9">        f"Trying to call `book_hotel` with args {{'hotel_name': {hotel_name}}}. "
<span id="__span-0-10">        "Please approve or suggest edits."
<span id="__span-0-11">    )
<span id="__span-0-12">    if response["type"] == "accept":
<span id="__span-0-13">        pass
<span id="__span-0-14">    elif response["type"] == "edit":
<span id="__span-0-15">        hotel_name = response["args"]["hotel_name"]
<span id="__span-0-16">    else:
<span id="__span-0-17">        raise ValueError(f"Unknown response type: {response['type']}")
<span id="__span-0-18">    return f"Successfully booked a stay at {hotel_name}."
<span id="__span-0-19">
<span id="__span-0-20">checkpointer = InMemorySaver() 
<span id="__span-0-21">
<span id="__span-0-22">agent = create_react_agent(
<span id="__span-0-23">    model="anthropic:claude-3-5-sonnet-latest",
<span id="__span-0-24">    tools=[book_hotel],
<span id="__span-0-25">    checkpointer=checkpointer, 
<span id="__span-0-26">)

Run the agent with the `stream()` method, passing a `config` object that specifies a thread ID. This allows the agent to resume the same conversation on future invocations.

<code tabindex="0"><span id="__span-1-1">config = {
<span id="__span-1-2">   "configurable": {
<span id="__span-1-3">      "thread_id": "1"
<span id="__span-1-4">   }
<span id="__span-1-5">}
<span id="__span-1-6">
<span id="__span-1-7">for chunk in agent.stream(
<span id="__span-1-8">    {"messages": [{"role": "user", "content": "book a stay at McKittrick hotel"}]},
<span id="__span-1-9">    config
<span id="__span-1-10">):
<span id="__span-1-11">    print(chunk)
<span id="__span-1-12">    print("\n")

You will see that the agent runs until it reaches the `interrupt()` call, at which point it pauses and waits for human input.
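If you want to see exactly what the agent is waiting for, you can read the paused thread's state. A minimal sketch; the exact snapshot layout may vary between LangGraph versions:

```python
# Sketch only: inspect the pending interrupt on the paused thread.
# The snapshot layout may differ slightly between LangGraph versions.
state = agent.get_state(config)

for task in state.tasks:
    for intr in task.interrupts:
        # `intr.value` is the payload passed to `interrupt()` inside the tool.
        print(intr.value)
```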

Resume the agent with `Command(resume=...)` based on the human's input.

API reference: Command

<code tabindex="0"><span id="__span-2-1">from langgraph.types import Command
<span id="__span-2-2">
<span id="__span-2-3">for chunk in agent.stream(
<span id="__span-2-4">    Command(resume={"type": "accept"}),  
<span id="__span-2-5">    # Command(resume={"type": "edit", "args": {"hotel_name": "McKittrick Hotel"}}),
<span id="__span-2-6">    config
<span id="__span-2-7">):
<span id="__span-2-8">    print(chunk)
<span id="__span-2-9">    print("\n")

## Usage with Agent Inbox

You can create a wrapper that adds an interrupt to _any_ tool.

The example below provides a reference implementation compatible with the Agent Inbox UI and the Agent Chat UI.

### Wrapper that adds human-in-the-loop to any tool

<code tabindex="0"><span id="__span-3-1">from typing import Callable
<span id="__span-3-2">from langchain_core.tools import BaseTool, tool as create_tool
<span id="__span-3-3">from langchain_core.runnables import RunnableConfig
<span id="__span-3-4">from langgraph.types import interrupt 
<span id="__span-3-5">from langgraph.prebuilt.interrupt import HumanInterruptConfig, HumanInterrupt
<span id="__span-3-6">
<span id="__span-3-7">def add_human_in_the_loop(
<span id="__span-3-8">    tool: Callable | BaseTool,
<span id="__span-3-9">    *,
<span id="__span-3-10">    interrupt_config: HumanInterruptConfig = None,
<span id="__span-3-11">) -&gt; BaseTool:
<span id="__span-3-12">    """Wrap a tool to support human-in-the-loop review.""" 
<span id="__span-3-13">    if not isinstance(tool, BaseTool):
<span id="__span-3-14">        tool = create_tool(tool)
<span id="__span-3-15">
<span id="__span-3-16">    if interrupt_config is None:
<span id="__span-3-17">        interrupt_config = {
<span id="__span-3-18">            "allow_accept": True,
<span id="__span-3-19">            "allow_edit": True,
<span id="__span-3-20">            "allow_respond": True,
<span id="__span-3-21">        }
<span id="__span-3-22">
<span id="__span-3-23">    @create_tool(  
<span id="__span-3-24">        tool.name,
<span id="__span-3-25">        description=tool.description,
<span id="__span-3-26">        args_schema=tool.args_schema
<span id="__span-3-27">    )
<span id="__span-3-28">    def call_tool_with_interrupt(config: RunnableConfig, **tool_input):
<span id="__span-3-29">        request: HumanInterrupt = {
<span id="__span-3-30">            "action_request": {
<span id="__span-3-31">                "action": tool.name,
<span id="__span-3-32">                "args": tool_input
<span id="__span-3-33">            },
<span id="__span-3-34">            "config": interrupt_config,
<span id="__span-3-35">            "description": "Please review the tool call"
<span id="__span-3-36">        }
<span id="__span-3-37">        response = interrupt([request])[0]  
<span id="__span-3-38">        # approve the tool call
<span id="__span-3-39">        if response["type"] == "accept":
<span id="__span-3-40">            tool_response = tool.invoke(tool_input, config)
<span id="__span-3-41">        # update tool call args
<span id="__span-3-42">        elif response["type"] == "edit":
<span id="__span-3-43">            tool_input = response["args"]["args"]
<span id="__span-3-44">            tool_response = tool.invoke(tool_input, config)
<span id="__span-3-45">        # respond to the LLM with user feedback
<span id="__span-3-46">        elif response["type"] == "response":
<span id="__span-3-47">            user_feedback = response["args"]
<span id="__span-3-48">            tool_response = user_feedback
<span id="__span-3-49">        else:
<span id="__span-3-50">            raise ValueError(f"Unsupported interrupt response type: {response['type']}")
<span id="__span-3-51">
<span id="__span-3-52">        return tool_response
<span id="__span-3-53">
<span id="__span-3-54">    return call_tool_with_interrupt

You can use the `add_human_in_the_loop` wrapper to add `interrupt()` to any tool without having to add it _inside_ the tool.

API reference: InMemorySaver | create_react_agent

<code tabindex="0"><span id="__span-4-1">from langgraph.checkpoint.memory import InMemorySaver
<span id="__span-4-2">from langgraph.prebuilt import create_react_agent
<span id="__span-4-3">
<span id="__span-4-4">checkpointer = InMemorySaver()
<span id="__span-4-5">
<span id="__span-4-6">def book_hotel(hotel_name: str):
<span id="__span-4-7">   """Book a hotel"""
<span id="__span-4-8">   return f"Successfully booked a stay at {hotel_name}."
<span id="__span-4-9">
<span id="__span-4-10">
<span id="__span-4-11">agent = create_react_agent(
<span id="__span-4-12">    model="anthropic:claude-3-5-sonnet-latest",
<span id="__span-4-13">    tools=[
<span id="__span-4-14">        add_human_in_the_loop(book_hotel), 
<span id="__span-4-15">    ],
<span id="__span-4-16">    checkpointer=checkpointer,
<span id="__span-4-17">)
<span id="__span-4-18">
<span id="__span-4-19">config = {"configurable": {"thread_id": "1"}}
<span id="__span-4-20">
<span id="__span-4-21"># Run the agent
<span id="__span-4-22">for chunk in agent.stream(
<span id="__span-4-23">    {"messages": [{"role": "user", "content": "book a stay at McKittrick hotel"}]},
<span id="__span-4-24">    config
<span id="__span-4-25">):
<span id="__span-4-26">    print(chunk)
<span id="__span-4-27">    print("\n")

You will see that the agent runs until it reaches the `interrupt()` call, at which point it pauses and waits for human input.

Resume the agent with `Command(resume=...)` based on the human's input. Note that because the wrapper calls `interrupt([request])` with a list of requests, the resume value must also be a list.

API reference: Command

<code tabindex="0"><span id="__span-5-1">from langgraph.types import Command 
<span id="__span-5-2">
<span id="__span-5-3">for chunk in agent.stream(
<span id="__span-5-4">    Command(resume=[{"type": "accept"}]),
<span id="__span-5-5">    # Command(resume=[{"type": "edit", "args": {"args": {"hotel_name": "McKittrick Hotel"}}}]),
<span id="__span-5-6">    config
<span id="__span-5-7">):
<span id="__span-5-8">    print(chunk)
<span id="__span-5-9">    print("\n")

## Additional resources