LlamaIndex Provider

The LlamaIndex Provider transforms Composio tools into a format compatible with LlamaIndex’s function calling capabilities.
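Conceptually, that transformation takes each Composio tool's JSON-schema description and wraps it as a plain callable plus metadata, which is the shape LlamaIndex's function-calling agents expect. The sketch below is a simplified illustration of that idea, not the provider's actual implementation; the `WrappedTool` class, `wrap_tool` helper, and the inline tool spec are all hypothetical stand-ins.

```python
# Conceptual sketch only: how a schema-described tool can be wrapped as a
# callable-with-metadata for a function-calling agent framework.
from dataclasses import dataclass
from typing import Any, Callable


@dataclass
class WrappedTool:
    name: str
    description: str
    parameters: dict           # JSON schema describing the tool's arguments
    fn: Callable[..., Any]     # callable the agent invokes with parsed arguments


def wrap_tool(spec: dict, executor: Callable[[str, dict], Any]) -> WrappedTool:
    """Wrap a tool spec so an agent can call it like an ordinary function."""
    def fn(**kwargs: Any) -> Any:
        # Delegate execution back to the tool runtime (Composio's servers,
        # in the real provider; a stub executor here).
        return executor(spec["name"], kwargs)

    return WrappedTool(
        name=spec["name"],
        description=spec.get("description", ""),
        parameters=spec.get("parameters", {}),
        fn=fn,
    )


# Hypothetical spec mirroring the GitHub star tool used later on this page
spec = {
    "name": "GITHUB_STAR_A_REPOSITORY_FOR_THE_AUTHENTICATED_USER",
    "description": "Star a repository for the authenticated user.",
    "parameters": {
        "type": "object",
        "properties": {"owner": {"type": "string"}, "repo": {"type": "string"}},
    },
}

tool = wrap_tool(spec, executor=lambda name, args: {"tool": name, "args": args})
print(tool.fn(owner="composiohq", repo="composio")["args"]["repo"])  # composio
```

In the real provider this wrapping happens for you: `composio.tools.get(...)` returns objects that `FunctionAgent` accepts directly, as the usage example below shows.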

Setup

Python
pip install composio_llamaindex==0.8.0 llama-index

Usage

Python
import asyncio
import dotenv
from composio_llamaindex import LlamaIndexProvider
from llama_index.core.agent.workflow import FunctionAgent
from llama_index.llms.openai import OpenAI

from composio import Composio

# Load environment variables from .env
dotenv.load_dotenv()

# Setup client
llm = OpenAI(model="gpt-5")
composio = Composio(provider=LlamaIndexProvider())

tools = composio.tools.get(
    user_id="user@acme.com",
    tools=["GITHUB_STAR_A_REPOSITORY_FOR_THE_AUTHENTICATED_USER"],
)

workflow = FunctionAgent(
    tools=tools,
    llm=llm,
    system_prompt="You are an agent that performs github actions.",
)


async def main():
    result = await workflow.run(
        user_msg="Hello! I would like to star a repo composiohq/composio on GitHub"
    )
    print(result)


if __name__ == "__main__":
    asyncio.run(main())