LangGraph Integration

Connecting LangGraph with the A2A protocol

LangGraph is a library for building stateful, multi-actor applications with LLMs. This guide shows how to integrate LangGraph with the A2A protocol.

Overview

LangGraph extends LangChain's capabilities by adding a flexible approach to building multi-actor systems with state management. Integrating LangGraph with the A2A protocol allows you to expose these complex agent networks through a standardized interface.
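
In practice, this means any A2A-capable client can talk to the agent over HTTP without knowing anything about the graph behind it. As a rough sketch (illustrative, not part of the sample below), assuming the server exposes A2A's JSON-RPC transport at its root path and supports the message/send method; exact method and field names vary by protocol version:

import requests
 
# Hypothetical request to the weather agent built later in this guide
payload = {
    "jsonrpc": "2.0",
    "id": "1",
    "method": "message/send",
    "params": {
        "message": {
            "role": "user",
            "parts": [{"kind": "text", "text": "What's the weather in Paris?"}],
            "messageId": "msg-1",
        }
    },
}
print(requests.post("http://localhost:8000/", json=payload).json())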

Sample Implementation

Here's a simplified example of connecting a LangGraph agent to the A2A protocol:

from typing import Annotated, TypedDict

from langchain_core.messages import HumanMessage
from langchain_core.tools import tool
from langchain_openai import ChatOpenAI
from langgraph.graph import StateGraph, START, END
from langgraph.graph.message import add_messages
from langgraph.prebuilt import ToolNode
from a2a.server import A2AServer
 
# Define the state structure; add_messages appends each step's
# output to the running conversation
class AgentState(TypedDict):
    messages: Annotated[list, add_messages]
 
# Define a simple weather tool; the docstring doubles as the tool description
@tool
def get_weather(location: str) -> str:
    """Get the current weather for a location."""
    # Simplified weather info
    return f"The weather in {location} is sunny with a high of 75°F."
 
# Initialize the language model and bind the tool to it
llm = ChatOpenAI(model="gpt-4").bind_tools([get_weather])
 
# Create nodes for the graph
# Assistant node (calls the LLM with the conversation so far)
def assistant(state: AgentState):
    return {"messages": [llm.invoke(state["messages"])]}
 
# Tool execution node (runs any tool calls the LLM requested)
tools = ToolNode([get_weather])
 
# Build the graph
graph = StateGraph(AgentState)
graph.add_node("assistant", assistant)
graph.add_node("tools", tools)
 
# Add edges
graph.add_edge(START, "assistant")
graph.add_conditional_edges(
    "assistant",
    # If the LLM requested tool calls, route to tools; otherwise end
    lambda state: "tools" if state["messages"][-1].tool_calls else END,
)
graph.add_edge("tools", "assistant")
 
# Compile the graph
agent_executor = graph.compile()
 
# Function to handle A2A requests
def process_with_langgraph(user_input: str) -> str:
    # Seed the state with the user's message
    initial_state = {"messages": [HumanMessage(content=user_input)]}
 
    # Execute the graph
    result = agent_executor.invoke(initial_state)
 
    # The last message in the final state is the assistant's reply
    return result["messages"][-1].content
 
# Create A2A server with LangGraph integration
a2a_server = A2AServer(
    agent=process_with_langgraph,
    agent_card={
        "name": "LangGraph Weather Agent",
        "description": "An agent that provides weather information using LangGraph",
        "skills": [
            {
                "id": "weather-info",
                "name": "Weather Information",
                "description": "Get current weather conditions for any location"
            }
        ]
    }
)
 
# Run the server
if __name__ == "__main__":
    a2a_server.run(host="0.0.0.0", port=8000)
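
Before exposing the agent over A2A, you can sanity-check the graph by calling the handler directly; the exact wording of the reply will vary with the model:

# Quick local check, bypassing the A2A server
print(process_with_langgraph("What's the weather in Boston?"))
# e.g. "The weather in Boston is sunny with a high of 75°F."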

Key Features

  • Stateful execution: Manage conversation and agent state (see the checkpointing sketch after this list)
  • Graph-based workflow: Create complex decision flows between agents and tools
  • Conditional routing: Direct execution based on agent outputs
  • A2A compatibility: Expose LangGraph agents through the A2A protocol
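
The stateful-execution point becomes concrete with LangGraph's checkpointing. A minimal sketch, assuming the in-memory MemorySaver checkpointer: compiling with it persists conversation state across invocations that share a thread_id.

from langgraph.checkpoint.memory import MemorySaver
 
# Compile with a checkpointer so conversation state survives across calls
agent_executor = graph.compile(checkpointer=MemorySaver())
 
# Invocations that share a thread_id continue the same conversation
config = {"configurable": {"thread_id": "user-42"}}
agent_executor.invoke(
    {"messages": [HumanMessage(content="What's the weather in Oslo?")]}, config
)
agent_executor.invoke(
    {"messages": [HumanMessage(content="And in Bergen?")]}, config  # sees prior turns
)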

Where to Find the Code

The complete sample implementation is available in the A2A GitHub repository.
