LlamaIndex Provider
The LlamaIndex Provider transforms Composio tools into a format compatible with LlamaIndex’s function calling capabilities.
Setup
Python
$ pip install composio_llamaindex==0.8.0 llama-index

Usage

Python
import asyncio
import dotenv
from composio_llamaindex import LlamaIndexProvider
from llama_index.core.agent.workflow import FunctionAgent
from llama_index.llms.openai import OpenAI

from composio import Composio

# Load environment variables from .env
dotenv.load_dotenv()

# Setup client
llm = OpenAI(model="gpt-5")
composio = Composio(provider=LlamaIndexProvider())

tools = composio.tools.get(
    user_id="user@acme.com",
    tools=["GITHUB_STAR_A_REPOSITORY_FOR_THE_AUTHENTICATED_USER"],
)

workflow = FunctionAgent(
    tools=tools,
    llm=llm,
    system_prompt="You are an agent that performs github actions.",
)


async def main():
    result = await workflow.run(
        user_msg="Hello! I would like to star a repo composiohq/composio on GitHub"
    )
    print(result)


if __name__ == "__main__":
    asyncio.run(main())
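The example above loads credentials from a .env file via dotenv.load_dotenv(). A minimal sketch of that file, assuming the conventional COMPOSIO_API_KEY and OPENAI_API_KEY variable names read by the Composio and OpenAI clients (the exact names are not shown on this page, so treat them as assumptions):

# .env (hypothetical contents; variable names are assumptions)
COMPOSIO_API_KEY=your-composio-api-key
OPENAI_API_KEY=your-openai-api-key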