Support Knowledge Agent
This cookbook builds an agentic RAG system: an interactive CLI agent that triages support issues by pulling context from Notion docs, Datadog monitors, and GitHub issues. It uses scoped sessions, multi-turn chat with streaming, and a structured system prompt.
Prerequisites
- Python 3.10+
- uv
- Composio API key
- OpenAI API key
Project setup
Create a new project and install dependencies:
```shell
mkdir composio-support-agent && cd composio-support-agent
uv init && uv add composio composio-openai-agents openai-agents
```

Add your API keys to a .env file:

```
COMPOSIO_API_KEY=your_composio_api_key
OPENAI_API_KEY=your_openai_api_key
```

Setting up the client
The Composio client is initialized with an OpenAIAgentsProvider so that tools come back in the format the OpenAI Agents SDK expects. We also import the streaming event types we'll need for real-time output.
```python
import asyncio

from agents import Agent, Runner
from agents.stream_events import RawResponsesStreamEvent
from composio import Composio
from composio_openai_agents import OpenAIAgentsProvider
from openai.types.responses import ResponseTextDeltaEvent

composio = Composio(provider=OpenAIAgentsProvider())
```

Defining the agent
The system prompt tells the agent what tools it has and how to behave. It knows about Datadog, Notion, and GitHub, and decides on its own which to use based on the question.
```python
SYSTEM_PROMPT = """You are a Support Knowledge Agent. Use your tools to help the user triage issues, find documentation, and manage incidents. Call tools first, then respond with what you found. Be concise."""


def create_agent(tools) -> Agent:
    return Agent(
        name="Support Knowledge Agent",
        model="gpt-5.2",
        instructions=SYSTEM_PROMPT,
        tools=tools,
    )
```

Chat loop with streaming
The chat loop creates a session scoped to three toolkits: datadog, notion, and github. The agent only sees tools from these services. Runner.run_streamed streams tokens as they arrive so you see the response in real time. Message history is tracked in a list for multi-turn context.
```python
async def main():
    user_id = "default"
    session = composio.create(
        user_id=user_id,
        toolkits=["datadog", "notion", "github"],
    )
    tools = session.tools()
    agent = create_agent(tools)

    messages = []
    print("Support Knowledge Agent (type 'quit' to exit)")
    print("-" * 50)

    while True:
        user_input = input("\nYou: ").strip()
        if not user_input or user_input.lower() == "quit":
            break

        messages.append({"role": "user", "content": user_input})
        print("\nAgent: ", end="", flush=True)

        result = Runner.run_streamed(starting_agent=agent, input=messages, max_turns=30)
        async for event in result.stream_events():
            if isinstance(event, RawResponsesStreamEvent) and isinstance(
                event.data, ResponseTextDeltaEvent
            ):
                print(event.data.delta, end="", flush=True)
        print()

        messages.append({"role": "assistant", "content": result.final_output})


asyncio.run(main())
```

If a toolkit isn't connected yet, the agent will automatically return an authentication link in its response. The user can complete OAuth and then retry.
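Because every turn appends to the messages list, long conversations grow the model's input without bound. As a sketch, a small helper (hypothetical; trim_history and its cap are not part of this cookbook) could keep only the most recent entries before each run:

```python
def trim_history(messages: list[dict], max_messages: int = 20) -> list[dict]:
    """Keep only the most recent messages to bound context size.

    Drops the oldest entries first; assumes the plain role/content
    dicts that the chat loop appends.
    """
    if len(messages) <= max_messages:
        return messages
    return messages[-max_messages:]


history = [{"role": "user", "content": f"msg {i}"} for i in range(30)]
trimmed = trim_history(history, max_messages=20)
print(len(trimmed))           # 20
print(trimmed[0]["content"])  # msg 10
```

A production agent might instead summarize older turns rather than drop them, so earlier context isn't lost entirely.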
Complete script
Here's everything together:
```python
# region setup
import asyncio

from agents import Agent, Runner
from agents.stream_events import RawResponsesStreamEvent
from composio import Composio
from composio_openai_agents import OpenAIAgentsProvider
from openai.types.responses import ResponseTextDeltaEvent

composio = Composio(provider=OpenAIAgentsProvider())
# endregion setup


# region agent
SYSTEM_PROMPT = """You are a Support Knowledge Agent. Use your tools to help the user triage issues, find documentation, and manage incidents. Call tools first, then respond with what you found. Be concise."""


def create_agent(tools) -> Agent:
    return Agent(
        name="Support Knowledge Agent",
        model="gpt-5.2",
        instructions=SYSTEM_PROMPT,
        tools=tools,
    )
# endregion agent


# region chat
async def main():
    user_id = "default"
    session = composio.create(
        user_id=user_id,
        toolkits=["datadog", "notion", "github"],
    )
    tools = session.tools()
    agent = create_agent(tools)

    messages = []
    print("Support Knowledge Agent (type 'quit' to exit)")
    print("-" * 50)

    while True:
        user_input = input("\nYou: ").strip()
        if not user_input or user_input.lower() == "quit":
            break

        messages.append({"role": "user", "content": user_input})
        print("\nAgent: ", end="", flush=True)

        result = Runner.run_streamed(starting_agent=agent, input=messages, max_turns=30)
        async for event in result.stream_events():
            if isinstance(event, RawResponsesStreamEvent) and isinstance(
                event.data, ResponseTextDeltaEvent
            ):
                print(event.data.delta, end="", flush=True)
        print()

        messages.append({"role": "assistant", "content": result.final_output})


asyncio.run(main())
# endregion chat
```

Running the agent
```shell
uv run --env-file .env python main.py
```

The agent starts an interactive chat. Type a message and watch the response stream in. Type quit to exit.
```
Support Knowledge Agent (type 'quit' to exit)
--------------------------------------------------

You: The payments service is returning 500 errors. Can you check what's going on?

Agent: I checked Datadog and found an active alert on the payments-api monitor...
```