# LangChain

AP3 includes a LangChain integration and an optional A2A executor bridge:

- Agent factory: `ap3.integrations.langchain.LangChainIntegration`
- A2A bridge: `ap3.integrations.langchain.LangChainExecutor`
## Install

```bash
pip install langchain langchain-core langchain-openai
```
## Create a LangChain agent executor with AP3

Use `LangChainIntegration` to produce a LangChain `AgentExecutor` (e.g. ReAct style), then invoke it like any other LangChain agent.
```python
from ap3.integrations.langchain import LangChainIntegration
from langchain_openai import ChatOpenAI

# Your LangChain tools can call AP3 SDK operations, or any other tooling you need.
tools = []

integration = LangChainIntegration(
    name="my_agent",
    description="A helpful agent",
    instruction="Use tools when appropriate. Be concise.",
    tools=tools,
)

llm = ChatOpenAI(model="gpt-4", temperature=0)
agent_executor = integration.create_agent(llm=llm, verbose=True)

result = agent_executor.invoke({"input": "Hello!"})
print(result)
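The `tools` list above is empty; in practice each entry is a LangChain tool whose body can call AP3 SDK operations. A minimal sketch of what one tool body might look like, written as plain Python so it stays self-contained (the `get_balance` name and the stubbed lookup are illustrative assumptions; in a real agent you would decorate it with `langchain_core.tools.tool` and call the AP3 SDK instead of the stub):

```python
# Hypothetical tool body. In a real agent, decorate this with
# langchain_core.tools.tool so the AgentExecutor can invoke it.
def get_balance(account_id: str) -> str:
    """Return a human-readable balance for an account (illustrative)."""
    # Stand-in for an AP3 SDK call; replace with your real operation.
    balances = {"acct-1": "42.00 USD"}
    return balances.get(account_id, "unknown account")

print(get_balance("acct-1"))  # → 42.00 USD
```

The docstring matters: LangChain passes it to the model as the tool description, so make it state what the tool does and what the argument means.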
## Expose the LangChain agent over A2A (optional)

If you run an A2A server, `LangChainExecutor` adapts a LangChain `AgentExecutor` to the A2A task lifecycle (extracting text parts, running the agent, and returning the output as artifacts).
```python
from ap3.integrations.langchain import LangChainExecutor

# agent_card = ...  (your A2A AgentCard)
# agent_executor = ...  (from LangChainIntegration, or built directly)

executor = LangChainExecutor(
    agent_executor=agent_executor,
    card=agent_card,
)
```
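Conceptually, the bridge performs the three steps named above: extract text parts, run the agent, return the output as an artifact. A rough sketch of that flow, with stand-in dict shapes and an `EchoAgent` stub that are assumptions for illustration, not the actual AP3 or A2A types:

```python
# Illustrative sketch of the adaptation LangChainExecutor performs.
# The part/artifact dict shapes and EchoAgent are assumptions.
def run_task(agent_executor, message_parts: list[dict]) -> list[dict]:
    # 1. Extract the text parts from the incoming A2A message.
    text = " ".join(p["text"] for p in message_parts if p.get("kind") == "text")
    # 2. Run the LangChain agent on the combined text.
    result = agent_executor.invoke({"input": text})
    # 3. Return the agent's output as a single text artifact.
    return [{"kind": "text", "text": result["output"]}]

class EchoAgent:
    """Stand-in for a LangChain AgentExecutor (uppercases its input)."""
    def invoke(self, inputs: dict) -> dict:
        return {"output": inputs["input"].upper()}

print(run_task(EchoAgent(), [{"kind": "text", "text": "hello"}]))
# → [{'kind': 'text', 'text': 'HELLO'}]
```

The real executor additionally drives the A2A task state transitions; this sketch only shows the data flow in and out of the agent.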