Introduction

Learn about tool calling with Composio

Tool calling as a concept was introduced because LLMs, on their own, cannot interact with data or influence external systems. Previously, you could ask an LLM to write you a nice email, but you would have to send it manually. With tool calling, you can provide an LLM with a valid tool, for example GMAIL_SEND_EMAIL, so it can accomplish the task autonomously.
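
To make this concrete, here is a rough sketch of what a tool definition looks like in the OpenAI function-calling format. The exact schema Composio generates for GMAIL_SEND_EMAIL may differ; the parameter names below are purely illustrative.

# Sketch of a tool definition in the OpenAI function-calling format.
# The field names under "properties" are illustrative, not Composio's actual schema.
send_email_tool = {
    "type": "function",
    "function": {
        "name": "GMAIL_SEND_EMAIL",
        "description": "Send an email from the user's Gmail account.",
        "parameters": {
            "type": "object",
            "properties": {
                "recipient_email": {"type": "string"},
                "subject": {"type": "string"},
                "body": {"type": "string"},
            },
            "required": ["recipient_email", "subject", "body"],
        },
    },
}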

Composio extends this by providing a platform to connect your AI agents to external tools like Gmail, GitHub, Salesforce, etc. It’s like a bridge between your AI and the tools it needs to get work done.

Tool Calling with Composio

Here’s a typical flow when your agent uses a tool via Composio:

Essentially: Your app gets tool definitions from Composio, the LLM decides which to use, your app tells Composio to run it (handle_tool_calls), and Composio securely executes the real API call.

Example: Using a Composio Tool with OpenAI

Let’s see this in action. We’ll ask an OpenAI model to fetch a GitHub username using a pre-built Composio tool.

(Assumes you’ve completed the Setup steps: installed SDKs, run composio login, and composio add github)

1. Initialize Clients & Toolset

Get your LLM client and Composio toolset ready.

from composio_openai import ComposioToolSet, App, Action
from openai import OpenAI
# Assumes .env file with API keys is loaded

client = OpenAI()
toolset = ComposioToolSet()  # Uses default entity_id
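
If your app serves multiple users, the toolset can be scoped to a specific entity so the right connected account is used. A minimal sketch, assuming you identify users with your own IDs (the value "user-123" below is made up):

# Sketch: scope the toolset to one user's connected accounts.
# "user-123" is a placeholder; use whatever identifier you assign your users.
user_toolset = ComposioToolSet(entity_id="user-123")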

2. Get the Composio Tool

Fetch the specific tool definition from Composio, formatted for your LLM.

# Fetch the tool for getting the authenticated user's GitHub info
tools = toolset.get_tools(actions=[Action.GITHUB_GET_THE_AUTHENTICATED_USER])
print(f"Fetched {len(tools)} tool(s) for the LLM.")

3. Send Request to LLM

Provide the user's task and the Composio tools to the LLM.

1task = "What is my GitHub username?"
2messages = [{"role": "user", "content": task}]
3
4print(f"Sending task to LLM: '{task}'")
5response = client.chat.completions.create(
6 model="gpt-4o-mini",
7 messages=messages,
8 tools=tools,
9 tool_choice="auto" # Instruct LLM to choose if a tool is needed
10)
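
Before handing the response to Composio, you can inspect what the model actually asked for. A small sketch using the standard OpenAI chat completions response shape:

# Sketch: inspect the tool call(s) the LLM requested, if any.
message = response.choices[0].message
for tool_call in message.tool_calls or []:
    print("Tool requested:", tool_call.function.name)
    print("Arguments (JSON string):", tool_call.function.arguments)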

4. Handle Tool Call via Composio

If the LLM decided to use a tool, pass the response to handle_tool_calls. Composio takes care of the execution.

execution_result = None
response_message = response.choices[0].message

if response_message.tool_calls:
    print("LLM requested tool use. Executing via Composio...")
    # Composio handles auth, API call execution, and returns the result
    execution_result = toolset.handle_tool_calls(response)
    print("Execution Result from Composio:", execution_result)
else:
    print("LLM responded directly (no tool used):", response_message.content)

# Now 'execution_result' holds the data returned by the GitHub API call
# You could parse it or feed it back to the LLM for a final summary.
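
Building on that last comment, here is one way you might feed the result back to the LLM for a final, plain-language answer. This is a sketch: it assumes a single tool call was made and that execution_result is JSON-serializable (falling back to str() otherwise):

import json

# Sketch: return the tool output to the LLM so it can answer in natural language.
# Assumes exactly one tool call; adjust if the LLM requested several.
if response_message.tool_calls:
    messages.append(response_message)  # the assistant message that requested the tool
    messages.append({
        "role": "tool",
        "tool_call_id": response_message.tool_calls[0].id,
        "content": json.dumps(execution_result, default=str),
    })
    final_response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=messages,
    )
    print("Final answer:", final_response.choices[0].message.content)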

This example shows how Composio integrates with the LLM's tool-calling mechanism, handling authentication and the actual API execution for you.