Databricks

Learn how to use Databricks with Composio

Overview

SLUG: DATABRICKS

Description

Databricks is the lakehouse company, helping data teams solve the world's toughest problems with a unified analytics platform for big data and AI.

Authentication Details

client_id
stringRequired
client_secret
stringRequired
oauth_redirect_uri
stringDefaults to https://backend.composio.dev/api/v1/auth-apps/add
scopes
stringDefaults to all-apis,openid,email,profile,offline_access
full
stringRequired
generic_api_key
stringRequired

Connecting to Databricks

Create an auth config

Use the dashboard to create an auth config for the Databricks toolkit. This allows you to connect multiple Databricks accounts to Composio for agents to use.

1

Select App

Navigate to Databricks.

2

Configure Auth Config Settings

Select among the supported auth schemes and configure them here.

3

Create and Get auth config ID

Click “Create Databricks Auth Config”. After creation, copy the displayed ID starting with ac_. This is your auth config ID. This is not a sensitive ID — you can save it in environment variables or a database. This ID will be used to create connections to the toolkit for a given user.

Connect Your Account

Using OAuth2

from composio import Composio

# Replace these with your actual values
databricks_auth_config_id = "ac_YOUR_DATABRICKS_CONFIG_ID"  # Auth config ID created above
user_id = "0000-0000-0000"  # UUID from database/application

composio = Composio()


def authenticate_toolkit(user_id: str, auth_config_id: str):
    connection_request = composio.connected_accounts.initiate(
        user_id=user_id,
        auth_config_id=auth_config_id,
    )

    print(
        f"Visit this URL to authenticate Databricks: {connection_request.redirect_url}"
    )

    # This will wait for the auth flow to be completed
    connection_request.wait_for_connection(timeout=15)
    return connection_request.id


connection_id = authenticate_toolkit(user_id, databricks_auth_config_id)

# You can also verify the connection status using:
connected_account = composio.connected_accounts.get(connection_id)
print(f"Connected account: {connected_account}")

Using API Key

from composio import Composio

# Replace these with your actual values
databricks_auth_config_id = "ac_YOUR_DATABRICKS_CONFIG_ID"  # Auth config ID created above
user_id = "0000-0000-0000"  # UUID from database/app

composio = Composio()


def authenticate_toolkit(user_id: str, auth_config_id: str):
    # Replace this with a method to retrieve an API key from the user.
    # Or supply your own.
    user_api_key = input("[!] Enter API key")

    connection_request = composio.connected_accounts.initiate(
        user_id=user_id,
        auth_config_id=auth_config_id,
        config={"auth_scheme": "API_KEY", "val": {"generic_api_key": user_api_key}},
    )

    # API Key authentication is immediate - no redirect needed
    print(f"Successfully connected Databricks for user {user_id}")
    print(f"Connection status: {connection_request.status}")

    return connection_request.id


connection_id = authenticate_toolkit(user_id, databricks_auth_config_id)

# You can verify the connection using:
connected_account = composio.connected_accounts.get(connection_id)
print(f"Connected account: {connected_account}")

Tools

Executing tools

To prototype, you can execute tools on the Databricks toolkit's playground and inspect their responses.

For code examples, see the Tool calling guide and Provider examples.
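Every tool listed below returns the same response envelope: data (object), error (string), and successful (boolean). A minimal sketch of handling that envelope (the unwrap helper is illustrative, not part of the SDK):

```python
def unwrap(response: dict) -> dict:
    """Return response["data"], or raise with the tool's error message."""
    if not response.get("successful"):
        raise RuntimeError(response.get("error") or "tool execution failed")
    return response["data"]


# Example with a response shaped like the Action Response tables below:
ok = {"data": {"name": "main"}, "error": None, "successful": True}
print(unwrap(ok))
```

Centralizing this check keeps per-tool code from silently ignoring failed executions.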

Tool List

Tool Name: Add Member to Security Group

Description

Tool to add a user or group as a member to a Databricks security group. Use when you need to grant group membership for access control.

Action Parameters

group_name
string
parent_name
stringRequired
user_name
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete Custom LLM Agent

Description

Tool to delete a Custom LLM agent created through Agent Bricks. Use when you need to remove a custom LLM and all associated data. This operation is irreversible and deletes all data including temporary transformations, model checkpoints, and internal metadata.

Action Parameters

id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Create Databricks App

Description

Tool to create a new Databricks app with specified configuration. Use when you need to create apps hosted on Databricks serverless platform to deploy secure data and AI applications. The app name must be unique within the workspace, contain only lowercase alphanumeric characters and hyphens, and cannot be changed after creation.

Action Parameters

budget_policy_id
string
compute_size
string
description
string
name
stringRequired
permissions
array
resources
array
source_code_path
string
user_api_scopes
array

Action Response

data
objectRequired
error
string
successful
booleanRequired
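Since the app name cannot be changed after creation, it can help to validate it locally before calling the tool. A sketch of the documented character constraint (lowercase alphanumerics and hyphens only); uniqueness within the workspace can only be checked server-side:

```python
import re

# Per the tool description: lowercase alphanumeric characters and hyphens.
APP_NAME_RE = re.compile(r"^[a-z0-9-]+$")


def is_valid_app_name(name: str) -> bool:
    return bool(APP_NAME_RE.fullmatch(name))
```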

Tool Name: Delete Databricks App

Description

Tool to delete a Databricks app from the workspace. Use when you need to remove an app and its associated service principal. When an app is deleted, Databricks automatically deletes the provisioned service principal.

Action Parameters

name
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Deploy Databricks App

Description

Tool to create a deployment for a Databricks app. Use when you need to deploy an app with source code from a workspace path. The deployment process provisions compute resources and uploads the source code. Deployments can be in states: IN_PROGRESS, SUCCEEDED, FAILED, or CANCELLED.

Action Parameters

app_name
stringRequired
mode
string
source_code_path
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Databricks App Details

Description

Tool to retrieve details about a specific Databricks app by name. Use when you need to get comprehensive information about an app including configuration, deployment status, compute resources, and metadata.

Action Parameters

name
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Databricks App Permission Levels

Description

Tool to retrieve available permission levels for a Databricks app. Use when you need to understand what permission levels can be assigned to users or groups for a specific app. Returns permission levels like CAN_USE and CAN_MANAGE with their descriptions.

Action Parameters

id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Databricks App Permissions

Description

Tool to retrieve permissions for a Databricks app. Use when you need to check who has access to an app and their permission levels. Returns the access control list including inherited permissions from parent or root objects.

Action Parameters

name
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get App Deployment Update

Description

Tool to retrieve information about a specific app deployment update. Use when you need to track the status and details of app deployment updates, including whether the update succeeded, failed, is in progress, or was not updated.

Action Parameters

name
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Set Databricks App Permissions

Description

Tool to set permissions for a Databricks app, replacing all existing permissions. Use when you need to configure access control for an app. This operation replaces ALL existing permissions; for incremental updates, use the update permissions endpoint instead. Admin permissions cannot be removed.

Action Parameters

access_control_list
arrayRequired
name
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Start Databricks App

Description

Tool to start the last active deployment of a Databricks app. Use when you need to start a stopped app, which transitions it to the ACTIVE state. The start operation is asynchronous and the app will transition to the ACTIVE state after the operation completes.

Action Parameters

name
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Stop Databricks App

Description

Tool to stop the active deployment of a Databricks app. Use when you need to stop a running app, which transitions it to the STOPPED state. The stop operation is asynchronous and the app will transition to the STOPPED state after the operation completes.

Action Parameters

name
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update Databricks App

Description

Tool to update an existing Databricks app configuration. Use when you need to modify app settings such as description, resources, compute size, budget policy, or API scopes. This is a partial update operation - only fields provided in the request will be updated, other fields retain their current values.

Action Parameters

budget_policy_id
string
compute_size
string
default_source_code_path
string
description
string
name
stringRequired
resources
array
usage_policy_id
string
user_api_scopes
array

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update Databricks App Permissions

Description

Tool to incrementally update permissions for a Databricks app. Use when you need to modify specific permissions without replacing the entire permission set. This PATCH operation updates only the specified permissions, preserving existing permissions not included in the request. For replacing all permissions, use SetPermissions instead.

Action Parameters

access_control_list
arrayRequired
name
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Cancel Databricks Job Run

Description

Tool to cancel a Databricks job run asynchronously. Use when you need to terminate a running job. The run will be terminated shortly after the request completes. If the run is already in a terminal state (TERMINATED, SKIPPED, or INTERNAL_ERROR), this is a no-op.

Action Parameters

run_id
integerRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Catalog Artifact Allowlist

Description

Tool to retrieve artifact allowlist configuration for a specified artifact type in Unity Catalog. Use when you need to check which artifacts are permitted for use in your Databricks environment. Requires metastore admin privileges or MANAGE ALLOWLIST privilege on the metastore.

Action Parameters

artifact_type
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete Catalog

Description

Tool to delete a catalog from Unity Catalog metastore. Use when you need to permanently remove a catalog and optionally its contents. By default, the catalog must be empty (except for information_schema). Use force=true to delete non-empty catalogs. Do not delete the main catalog as it can break existing data operations.

Action Parameters

force
boolean
name
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Catalog Details

Description

Tool to retrieve details of a specific catalog in Unity Catalog. Use when you need to get information about a catalog including its metadata, owner, properties, and configuration. Requires metastore admin privileges, catalog ownership, or USE_CATALOG privilege.

Action Parameters

include_browse
boolean
name
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Create Catalog Connection

Description

Tool to create a new Unity Catalog connection to external data sources. Use when you need to establish connections to databases and services such as MySQL, PostgreSQL, Snowflake, etc. Requires metastore admin privileges or CREATE CONNECTION privilege on the metastore.

Action Parameters

comment
string
connection_type
stringRequired
name
stringRequired
options
objectRequired
properties
object
read_only
boolean

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete Catalog Connection

Description

Tool to delete a Unity Catalog connection to external data sources. Use when you need to remove connections to databases and services. Deleting a connection removes the abstraction used to connect from Databricks Compute to external data sources.

Action Parameters

name
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Catalog Connection

Description

Tool to retrieve detailed information about a specific Unity Catalog connection. Use when you need to get connection metadata, configuration, and properties for external data source connections.

Action Parameters

name
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update Catalog Connection

Description

Tool to update an existing Unity Catalog connection configuration. Use when you need to modify connection properties, credentials, ownership, or metadata for external data sources.

Action Parameters

comment
string
name_or_id
stringRequired
new_name
string
options
objectRequired
owner
string
properties
object

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Create Catalog Credential

Description

Tool to create a new credential for Unity Catalog access to cloud services. Use when you need to establish authentication for STORAGE (cloud storage) or SERVICE (external services like AWS Glue) purposes. Requires metastore admin or CREATE_STORAGE_CREDENTIAL/CREATE_SERVICE_CREDENTIAL privileges. Exactly one cloud credential type must be provided.

Action Parameters

aws_iam_role
object
azure_managed_identity
object
azure_service_principal
object
comment
string
databricks_gcp_service_account
object
name
stringRequired
purpose
stringRequired
read_only
boolean
skip_validation
boolean

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete Catalog Credential

Description

Tool to delete a Unity Catalog credential for cloud storage or service access. Use when you need to remove credentials that authenticate access to cloud resources. By default, deletion will fail if the credential has dependent resources. Use force=true to delete credentials with dependencies.

Action Parameters

credential_name_or_id
stringRequired
force
boolean

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Generate Temporary Service Credential

Description

Tool to generate temporary credentials from a service credential with admin access. Use when you need short-lived, scoped credentials for accessing cloud resources. The caller must be a metastore admin or have the ACCESS privilege on the service credential.

Action Parameters

azure_options
object
credential_name
stringRequired
gcp_options
object

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Catalog Credential

Description

Tool to retrieve detailed information about a specific Unity Catalog credential. Use when you need to get credential metadata, configuration, and cloud provider details for storage or service credentials.

Action Parameters

name
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update Catalog Credential

Description

Tool to update an existing Unity Catalog credential with new properties. Use when you need to modify credential configuration, ownership, or cloud provider settings. The caller must be the owner of the credential, a metastore admin, or have MANAGE permission on the credential. If the caller is a metastore admin, only the owner field can be changed.

Action Parameters

aws_iam_role
object
azure_managed_identity
object
azure_service_principal
object
cloudflare_api_token
object
comment
string
databricks_gcp_service_account
object
force
boolean
isolation_mode
string
name_or_id
stringRequired
new_name
string
owner
string
read_only
boolean
skip_validation
boolean

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Validate Catalog Credential

Description

Tool to validate a Unity Catalog credential for external access. Use when you need to verify that a credential can successfully perform its intended operations. For SERVICE credentials, validates cloud service access. For STORAGE credentials, tests READ, WRITE, DELETE, LIST operations on the specified location.

Action Parameters

aws_iam_role
object
azure_managed_identity
object
azure_service_principal
object
cloudflare_api_token
object
credential_name
string
databricks_gcp_service_account
object
external_location_name
string
purpose
string
read_only
boolean
url
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Entity Tag Assignment

Description

Tool to retrieve a specific tag assignment for a Unity Catalog entity by tag key. Use when you need to get details about a tag assigned to catalogs, schemas, tables, columns, or volumes. Requires USE CATALOG and USE SCHEMA permissions on parent resources, and ASSIGN or MANAGE permissions on the tag policy for governed tags.

Action Parameters

entity_name
stringRequired
entity_type
stringRequired
tag_key
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Create External Location

Description

Tool to create a new Unity Catalog external location combining a cloud storage path with a storage credential. Use when you need to establish access to cloud storage in Azure Data Lake Storage, AWS S3, or Cloudflare R2. Requires metastore admin or CREATE_EXTERNAL_LOCATION privilege on both the metastore and the associated storage credential.

Action Parameters

comment
string
credential_name
stringRequired
enable_file_events
boolean
encryption_details
object
fallback
boolean
file_event_queue
object
name
stringRequired
read_only
boolean
skip_validation
boolean
url
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete External Location

Description

Tool to delete an external location from Unity Catalog metastore. Use when you need to remove an external location that combines a cloud storage path with a storage credential. The caller must be the owner of the external location. Use force=true to delete even if there are dependent external tables or mounts.

Action Parameters

force
boolean
name
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get External Location Details

Description

Tool to retrieve details of a specific Unity Catalog external location. Use when you need to get information about an external location including its URL, storage credential, and configuration. Requires metastore admin privileges, external location ownership, or appropriate privileges on the external location.

Action Parameters

include_browse
boolean
name
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update External Location

Description

Tool to update an existing Unity Catalog external location properties. Use when you need to modify the cloud storage path, credentials, ownership, or configuration of an external location. The caller must be the owner of the external location or a metastore admin. Use force parameter to update even if URL changes invalidate dependencies.

Action Parameters

comment
string
credential_name
string
enable_file_events
boolean
encryption_details
object
fallback
boolean
file_event_queue
object
force
boolean
isolation_mode
string
name
stringRequired
new_name
string
owner
string
read_only
boolean
skip_validation
boolean
url
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update External Metadata

Description

Tool to update an external metadata object in Unity Catalog. Use when you need to modify metadata about external systems registered within Unity Catalog. The user must have metastore admin status, own the object, or possess the MODIFY privilege. Note that changing ownership requires the MANAGE privilege, and callers cannot update both the owner and other metadata in a single request.

Action Parameters

columns
array
description
string
entity_type
string
id
stringRequired
name
string
owner
string
properties
object
system_type
string
update_mask
stringRequired
url
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update Catalog Function

Description

Tool to update function owner in Unity Catalog. Use when you need to change the ownership of a catalog function. Only the owner of the function can be updated via this endpoint. The caller must be a metastore admin, the owner of the function's parent catalog, the owner of the parent schema with USE_CATALOG privilege, or the owner of the function with both USE_CATALOG and USE_SCHEMA privileges.

Action Parameters

name
stringRequired
owner
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Catalog Grants

Description

Tool to get permissions (grants) for a securable in Unity Catalog without inherited permissions. Use when you need to see direct privilege assignments on a catalog or other securable object. Returns only privileges directly assigned to principals, excluding inherited permissions from parent securables. For inherited permissions, use the get-effective endpoint instead.

Action Parameters

full_name
stringRequired
max_results
integer
page_token
string
principal
string
securable_type
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Effective Catalog Permissions

Description

Tool to get effective permissions for a securable in Unity Catalog, including inherited permissions from parent securables. Use when you need to understand what privileges are granted to principals through direct assignments or inheritance. Returns privileges conveyed to each principal through the Unity Catalog hierarchy (metastore → catalog → schema → table/view/volume).

Action Parameters

full_name
stringRequired
max_results
integer
page_token
string
principal
string
securable_type
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update Catalog Grants

Description

Tool to update permissions for Unity Catalog securables by adding or removing privileges for principals. Use when you need to grant or revoke permissions on catalogs, schemas, tables, or other Unity Catalog objects. Only metastore admins, object owners, users with MANAGE privilege, or parent catalog/schema owners can update permissions.

Action Parameters

changes
arrayRequired
full_name
stringRequired
securable_type
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired
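The changes array pairs each principal with privileges to add or remove. A sketch of building the arguments, assuming entries of the form {principal, add, remove} as in the Unity Catalog permissions API (the exact field names are an assumption):

```python
def grant_change(principal: str, add=None, remove=None) -> dict:
    """One entry for the `changes` array: privileges to add/remove for a principal."""
    change = {"principal": principal}
    if add:
        change["add"] = list(add)
    if remove:
        change["remove"] = list(remove)
    return change


# Grant catalog usage and read access to a hypothetical group:
arguments = {
    "securable_type": "catalog",
    "full_name": "main",
    "changes": [grant_change("data-engineers", add=["USE_CATALOG", "SELECT"])],
}
```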

Tool Name: Assign Metastore to Workspace

Description

Tool to assign a Unity Catalog metastore to a workspace. Use when you need to link a workspace to a Unity Catalog metastore, enabling shared data access with consistent governance policies. Requires account admin privileges. If an assignment for the same workspace_id exists, it will be overwritten by the new metastore_id and default_catalog_name.

Action Parameters

default_catalog_name
string
metastore_id
stringRequired
workspace_id
integerRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Create Metastore

Description

Tool to create a new Unity Catalog metastore. Use when you need to establish a top-level container for data in Unity Catalog, registering metadata about securable objects (tables, volumes, external locations, shares) and access permissions. Requires account admin privileges. By default, the owner is the user calling the API; setting owner to empty string assigns ownership to System User.

Action Parameters

cloud
string
default_data_access_config_id
string
delta_sharing_organization_name
string
delta_sharing_recipient_token_lifetime_in_seconds
integer
delta_sharing_scope
string
name
stringRequired
owner
string
region
string
storage_root
string
storage_root_credential_id
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Current Metastore Assignment

Description

Tool to retrieve the current metastore assignment for the workspace being accessed. Use when you need to determine which metastore is assigned to the current workspace context.

Action Parameters

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete Metastore

Description

Tool to delete a Unity Catalog metastore. Use when you need to permanently remove a metastore and its managed data. Before deletion, you must delete or unlink any workspaces using the metastore. All objects managed by the metastore will become inaccessible. Requires metastore admin privileges.

Action Parameters

force
boolean
id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Metastore Details

Description

Tool to retrieve detailed information about a specific Unity Catalog metastore by its ID. Use when you need to get metastore configuration, ownership, storage settings, and Delta Sharing details. Requires metastore admin permissions.

Action Parameters

id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Metastore Summary

Description

Tool to retrieve summary information about the metastore associated with the current workspace. Use when you need metastore configuration overview including cloud vendor, region, storage, and Delta Sharing details.

Action Parameters

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Unassign Metastore from Workspace

Description

Tool to unassign a Unity Catalog metastore from a workspace. Use when you need to remove the association between a workspace and its assigned metastore, leaving the workspace with no metastore. The metastore itself is not deleted, only the workspace assignment is removed. Requires account admin privileges.

Action Parameters

metastore_id
stringRequired
workspace_id
integerRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update Metastore

Description

Tool to update configuration settings for an existing Unity Catalog metastore. Use when you need to modify metastore properties like name, owner, Delta Sharing settings, or storage credentials. Requires metastore admin permissions.

Action Parameters

delta_sharing_organization_name
string
delta_sharing_recipient_token_lifetime_in_seconds
integer
delta_sharing_scope
string
external_access_enabled
boolean
id
stringRequired
new_name
string
owner
string
privilege_model_version
string
storage_root_credential_id
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update Metastore Assignment

Description

Tool to update a metastore assignment for a workspace. Use when you need to update the metastore_id or default_catalog_name for a workspace that already has a metastore assigned. Account admin privileges are required to update metastore_id, while workspace admin can update default_catalog_name.

Action Parameters

default_catalog_name
string
metastore_id
string
workspace_id
integerRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Model Version

Description

Tool to retrieve detailed information about a specific version of a registered model in Unity Catalog. Use when you need to get metadata, status, source location, and configuration of a model version. Requires metastore admin privileges, model ownership, or EXECUTE privilege on the registered model with appropriate catalog and schema privileges.

Action Parameters

full_name
stringRequired
include_aliases
boolean
include_browse
boolean
version
integerRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update Model Version

Description

Tool to update a Unity Catalog model version. Use when you need to modify the comment of a specific model version. Currently only the comment field can be updated. The caller must be a metastore admin or owner of the parent registered model with appropriate catalog and schema privileges.

Action Parameters

comment
string
full_name
stringRequired
version
integerRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete Online Table

Description

Tool to delete an online table by name. Use when you need to permanently remove an online table and stop data synchronization. This operation deletes all data in the online table permanently and releases all resources. Note: online tables are deprecated and will not be accessible after January 15, 2026.

Action Parameters

name
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Quality Monitor

Description

Tool to retrieve quality monitor configuration for a Unity Catalog table. Use when you need to get monitor status, metrics tables, custom metrics, notifications, scheduling, and monitoring configuration details. Requires catalog and schema privileges plus SELECT on the table.

Action Parameters

table_name
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: List Quality Monitor Refreshes

Description

Tool to retrieve the refresh history for a quality monitor on a Unity Catalog table. Use when you need to check the status and history of monitor refresh operations. Returns up to 25 most recent refreshes including their state, timing, and status messages.

Action Parameters

table_name
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Registered Model

Description

Tool to retrieve detailed information about a registered model in Unity Catalog. Use when you need to get metadata, owner, storage location, and configuration of a registered model. Requires metastore admin privileges, model ownership, or EXECUTE privilege on the registered model with appropriate catalog and schema privileges.

Action Parameters

full_name
stringRequired
include_aliases
boolean
include_browse
boolean

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Resource Quota Information

Description

Tool to retrieve usage information for a Unity Catalog resource quota defined by a child-parent pair. Use when you need to check quota usage for a specific resource type (tables per metastore, schemas per catalog, etc.). The API also triggers an asynchronous refresh if the count is out of date. Requires account admin authentication with OAuth.

Action Parameters

parent_full_name
stringRequired
parent_securable_type
stringRequired
quota_name
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Batch Create Access Requests

Description

Tool to batch create access requests for Unity Catalog permissions. Use when you need to request access to catalogs, schemas, tables, or other Unity Catalog securables. Maximum 30 requests per API call, and maximum 30 securables per principal per call.

Action Parameters

requests
arrayRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired
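Given the limit of 30 requests per API call, larger request lists need to be split before submission. A minimal sketch:

```python
def batch(requests: list, size: int = 30):
    """Yield access-request lists no larger than the 30-per-call API limit."""
    for i in range(0, len(requests), size):
        yield requests[i:i + size]
```

Each yielded chunk would then be passed as the `requests` parameter of a separate call.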

Tool Name: Get Access Request Destinations

Description

Tool to retrieve access request destinations for a Unity Catalog securable. Use when you need to find where notifications are sent when users request access to catalogs, schemas, tables, or other securables. Any caller can see URL destinations or destinations on the metastore. For other securables, only those with BROWSE permissions can see destinations.

Action Parameters

full_name
stringRequired
securable_type
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update Access Request Destinations

Description

Tool to update access request notification destinations for Unity Catalog securables. Use when you need to configure where access request notifications are sent for catalogs, schemas, or external locations. Requires metastore admin, owner privileges, or MANAGE permission on the securable. Maximum 5 emails and 5 external destinations allowed per securable.

Action Parameters

destinations
arrayRequired
securable
objectRequired
update_mask
stringDefaults to destinations

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Catalog Schema

Description

Tool to retrieve details of a specific schema from Unity Catalog metastore. Use when you need to get schema metadata, ownership, storage configuration, and properties. Requires metastore admin privileges, schema ownership, or USE_SCHEMA privilege.

Action Parameters

full_name
stringRequired
include_browse
boolean

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Create Storage Credential

Description

Tool to create a new storage credential in Unity Catalog for cloud data access. Use when you need to establish authentication for accessing cloud storage paths. Requires metastore admin or CREATE_STORAGE_CREDENTIAL privilege on the metastore. Exactly one cloud credential type must be provided.

Action Parameters

aws_iam_role
object
azure_managed_identity
object
azure_service_principal
object
cloudflare_api_token
object
comment
string
databricks_gcp_service_account
object
name
stringRequired
read_only
boolean
skip_validation
boolean

Action Response

data
objectRequired
error
string
successful
booleanRequired
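
Since exactly one cloud credential type must be provided, it can help to validate the arguments before calling the tool. A sketch under that constraint — the nested `role_arn` key is an assumption about the `aws_iam_role` object's shape, which is not detailed above:

```python
CLOUD_CREDENTIAL_FIELDS = (
    "aws_iam_role", "azure_managed_identity", "azure_service_principal",
    "cloudflare_api_token", "databricks_gcp_service_account",
)

def validate_storage_credential_args(args):
    """Enforce the 'exactly one cloud credential type' rule client-side."""
    provided = [f for f in CLOUD_CREDENTIAL_FIELDS if args.get(f) is not None]
    if len(provided) != 1:
        raise ValueError(f"exactly one credential type required, got: {provided or 'none'}")
    return args

args = validate_storage_credential_args({
    "name": "prod-s3-credential",
    # Hypothetical AWS role payload
    "aws_iam_role": {"role_arn": "arn:aws:iam::123456789012:role/databricks-access"},
    "read_only": True,
})
```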

Tool Name: Delete Storage Credential

Description

Tool to delete a storage credential from the Unity Catalog metastore. Use when you need to remove storage credentials that provide authentication to cloud storage. The caller must be the owner of the storage credential. Use force=true to delete even if there are dependent external locations, tables, or services.

Action Parameters

force
boolean
name
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Storage Credential

Description

Tool to retrieve storage credential details from Unity Catalog metastore by name. Use when you need to get information about a storage credential's configuration and properties. Requires metastore admin privileges, credential ownership, or appropriate permissions on the storage credential.

Action Parameters

name
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update Storage Credential

Description

Tool to update an existing storage credential in Unity Catalog. Use when you need to modify credential properties, cloud provider configuration, or ownership. The caller must be the owner of the storage credential or a metastore admin. Metastore admins can only modify the owner field.

Action Parameters

aws_iam_role
object
azure_managed_identity
object
azure_service_principal
object
cloudflare_api_token
object
comment
string
databricks_gcp_service_account
object
force
boolean
isolation_mode
string
name
stringRequired
new_name
string
owner
string
read_only
boolean
skip_validation
boolean

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Validate Storage Credential

Description

Tool to validate a storage credential configuration for Unity Catalog. Use when you need to verify that a storage credential can successfully access a cloud storage location. Requires metastore admin, storage credential owner, or CREATE_EXTERNAL_LOCATION privilege.

Action Parameters

aws_iam_role
object
azure_managed_identity
object
azure_service_principal
object
cloudflare_api_token
object
databricks_gcp_service_account
object
external_location_name
string
read_only
boolean
storage_credential_name
string
url
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Disable System Schema

Description

Tool to disable a system schema in Unity Catalog metastore. Use when you need to remove a system schema from the system catalog. System schemas store information about customer usage patterns such as audit logs, billing information, and lineage data. Requires account admin or metastore admin privileges.

Action Parameters

metastore_id
stringRequired
schema_name
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Enable System Schema

Description

Tool to enable a system schema in Unity Catalog metastore. Use when you need to activate a system schema to track customer usage patterns. System schemas store information about audit logs, billing, compute usage, storage, lineage, and marketplace data. Requires account admin or metastore admin privileges.

Action Parameters

catalog_name
string
metastore_id
stringRequired
schema_name
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete Catalog Table

Description

Tool to delete a table from Unity Catalog. Use when you need to permanently remove a table from its parent catalog and schema. The operation requires appropriate permissions on the parent catalog, schema, and table.

Action Parameters

full_name
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Check Table Exists

Description

Tool to check if a table exists in Unity Catalog metastore. Use when you need to verify table existence before performing operations. Requires metastore admin privileges, table ownership with SELECT privilege, or USE_CATALOG and USE_SCHEMA privileges on parent objects.

Action Parameters

full_name
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Catalog Table Details

Description

Tool to retrieve comprehensive metadata about a table from Unity Catalog metastore. Use when you need detailed table information including columns, type, storage, constraints, and governance metadata. Requires metastore admin privileges, table ownership, or SELECT privilege on the table, plus USE_CATALOG and USE_SCHEMA privileges on parent objects.

Action Parameters

full_name
stringRequired
include_browse
boolean
include_delta_metadata
boolean
include_manifest_capabilities
boolean

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update Catalog Table

Description

Tool to update Unity Catalog table properties. Use when you need to change the owner or comment of a table. The caller must be the owner of the parent catalog, have the USE_CATALOG privilege on the parent catalog and be the owner of the parent schema, or be the owner of the table and have the USE_CATALOG privilege on the parent catalog and the USE_SCHEMA privilege on the parent schema.

Action Parameters

comment
string
full_name
stringRequired
owner
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Generate Temporary Path Credentials

Description

Tool to generate short-lived, scoped temporary credentials for accessing external storage locations registered in Unity Catalog. Use when you need temporary access to cloud storage paths with specific read/write permissions. The credentials inherit the privileges of the requesting principal and are valid for a limited time. The requesting principal must have EXTERNAL USE LOCATION privilege on the external location.

Action Parameters

dry_run
boolean
operation
stringRequired
url
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Catalog Volume Details

Description

Tool to retrieve detailed information about a specific Unity Catalog volume. Use when you need to get volume metadata including type, storage location, owner, and timestamps. Requires metastore admin privileges or volume ownership with appropriate USE_CATALOG and USE_SCHEMA privileges on parent objects.

Action Parameters

include_browse
boolean
name
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update Catalog Workspace Bindings

Description

Tool to update workspace bindings for a Unity Catalog securable (catalog). Use when you need to control which workspaces can access a catalog. Allows adding or removing workspace bindings with read-write or read-only access. Caller must be a metastore admin or owner of the catalog.

Action Parameters

add
array
name
stringRequired
remove
array

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Clean Room Asset

Description

Tool to retrieve detailed information about a specific asset within a Databricks Clean Room. Use when you need to get metadata and configuration for clean room assets such as tables, views, notebooks, volumes, or foreign tables.

Action Parameters

asset_name
stringRequired
asset_type
stringRequired
clean_room_name
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Create Clean Room Auto-Approval Rule

Description

Tool to create a new auto-approval rule for a Databricks Clean Room. Use when you need to automatically approve notebooks shared by other collaborators that meet specific criteria. In 2-person clean rooms, auto-approve notebooks from the other collaborator using author_collaborator_alias. In multi-collaborator clean rooms, use author_scope=ANY_AUTHOR to auto-approve from any author.

Action Parameters

author_collaborator_alias
string
author_scope
string
clean_room_name
stringRequired
rule_owner_collaborator_alias
string
runner_collaborator_alias
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Create Clean Room

Description

Tool to create a new Databricks Clean Room for secure data collaboration with specified collaborators. Use when you need to establish a collaborative environment for multi-party data analysis. This is an asynchronous operation; the clean room starts in PROVISIONING state and becomes ACTIVE when ready. Requires metastore admin privileges or CREATE_CLEAN_ROOM privilege on the metastore.

Action Parameters

comment
string
name
stringRequired
remote_detailed_info
objectRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Create Compute Cluster Policy

Description

Tool to create a new cluster policy with prescribed settings for controlling cluster creation. Use when you need to establish policies that govern cluster configurations. Only admin users can create cluster policies.

Action Parameters

definition
stringRequired
description
string
libraries
array
max_clusters_per_user
integer
name
stringRequired
policy_family_definition_overrides
string
policy_family_id
string

Action Response

data
objectRequired
error
string
successful
booleanRequired
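
The `definition` parameter is a policy document serialized as a JSON string rather than a nested object. A hypothetical sketch — the rule keys shown (`fixed`, `range`, `allowlist`) follow Databricks cluster-policy syntax, and the specific values are illustrative:

```python
import json

# Policy rules are authored as a dict, then serialized to a string.
definition = json.dumps({
    "spark_version": {"type": "fixed", "value": "14.3.x-scala2.12"},
    "autotermination_minutes": {"type": "range", "maxValue": 120},
    "node_type_id": {"type": "allowlist", "values": ["i3.xlarge", "i3.2xlarge"]},
})

args = {
    "name": "cost-controlled-clusters",  # required
    "definition": definition,            # required, JSON string
    "max_clusters_per_user": 5,
}
```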

Tool Name: Delete Compute Cluster Policy

Description

Tool to delete a cluster policy. Use when you need to remove a cluster policy from the workspace. Clusters governed by this policy can still run, but cannot be edited. Only workspace admin users can delete policies. This operation is permanent and cannot be undone.

Action Parameters

policy_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Edit Compute Cluster Policy

Description

Tool to update an existing Databricks cluster policy. Use when you need to modify policy settings like name, definition, or restrictions. Note that this operation may make some clusters governed by the previous policy invalid.

Action Parameters

definition
string
description
string
libraries
array
max_clusters_per_user
integer
name
string
policy_family_definition_overrides
string
policy_family_id
string
policy_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Compute Cluster Policy

Description

Tool to retrieve detailed information about a specific cluster policy by its ID. Use when you need to view the configuration and settings of an existing cluster policy.

Action Parameters

policy_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Compute Cluster Policy Permission Levels

Description

Tool to retrieve available permission levels for a Databricks cluster policy. Use when you need to understand what permission levels can be assigned to users, groups, or service principals for a specific cluster policy. Returns permission levels like CAN_USE with their descriptions.

Action Parameters

cluster_policy_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Compute Cluster Policy Permissions

Description

Tool to retrieve permissions for a Databricks cluster policy. Use when you need to check who has access to a specific cluster policy and their permission levels. Returns the access control list with user, group, and service principal permissions, including inherited permissions from parent objects.

Action Parameters

cluster_policy_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Set Compute Cluster Policy Permissions

Description

Tool to set permissions for a Databricks cluster policy, replacing all existing permissions. Use when you need to configure access control for a cluster policy. This operation replaces ALL existing permissions; non-admin users must be granted permissions to access the policy. Workspace admins always have permissions on all policies.

Action Parameters

access_control_list
arrayRequired
cluster_policy_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update Cluster Policy Permissions

Description

Tool to incrementally update permissions on a Databricks cluster policy. Use when you need to modify permissions without replacing the entire permission set. This PATCH operation updates only the specified permissions, preserving existing permissions not included in the request.

Action Parameters

access_control_list
arrayRequired
cluster_policy_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired
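
The practical difference between the Set tool (replaces all permissions) and this Update tool (merges) can be simulated locally. A sketch — the entry fields (`user_name`, `group_name`, `permission_level`) follow the Databricks permissions API's usual shape but should be treated as assumptions here:

```python
def patch_acl(existing, updates):
    """Simulate PATCH semantics: entries in `updates` override matching
    principals; other existing entries are preserved."""
    def key(entry):
        return entry.get("user_name") or entry.get("group_name") or entry.get("service_principal_name")
    merged = {key(e): e for e in existing}
    for e in updates:
        merged[key(e)] = e
    return list(merged.values())

existing = [{"user_name": "a@example.com", "permission_level": "CAN_USE"},
            {"group_name": "admins", "permission_level": "CAN_USE"}]
updates = [{"user_name": "a@example.com", "permission_level": "CAN_USE"},
           {"user_name": "b@example.com", "permission_level": "CAN_USE"}]

merged = patch_acl(existing, updates)  # PATCH keeps 'admins': 3 entries total
replaced = updates                     # PUT (Set tool) would leave only these 2
```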

Tool Name: Delete Compute Cluster

Description

Tool to terminate a Databricks compute cluster. Use when you need to stop and delete a cluster. The cluster configuration is retained for 30 days after termination, after which it is permanently deleted.

Action Parameters

cluster_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Edit Compute Cluster

Description

Tool to update the configuration of a Databricks cluster. Use when you need to modify cluster settings like size, Spark version, or cloud-specific attributes. The cluster must be in RUNNING or TERMINATED state. Running clusters will restart to apply changes.

Action Parameters

apply_policy_default_values
boolean
autoscale
object
autotermination_minutes
integer
aws_attributes
object
azure_attributes
object
cluster_id
stringRequired
cluster_log_conf
object
cluster_name
string
custom_tags
object
data_security_mode
string
docker_image
object
driver_instance_pool_id
string
driver_node_type_id
string
enable_elastic_disk
boolean
enable_local_disk_encryption
boolean
gcp_attributes
object
idempotency_token
string
init_scripts
array
instance_pool_id
string
node_type_id
string
num_workers
integer
policy_id
string
runtime_engine
string
single_user_name
string
spark_conf
object
spark_env_vars
object
spark_version
stringRequired
ssh_public_keys
array
workload_type
object

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Compute Cluster Permission Levels

Description

Tool to retrieve available permission levels for a Databricks compute cluster. Use when you need to understand what permission levels can be assigned to users, groups, or service principals for a specific cluster. Returns permission levels like CAN_ATTACH_TO, CAN_RESTART, and CAN_MANAGE with their descriptions.

Action Parameters

cluster_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: List Compute Cluster Node Types

Description

Tool to list all supported Spark node types available for cluster launch in the workspace region. Use when you need to determine which instance types are available for creating or configuring clusters. Returns detailed specifications including compute resources, storage capabilities, and cloud-specific attributes for each node type.

Action Parameters

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: List Compute Cluster Availability Zones

Description

Tool to list availability zones where Databricks clusters can be created. Use when you need to determine available zones for cluster deployment or planning redundancy. Returns the default zone and a list of all zones available in the workspace's cloud region. This endpoint is available only for AWS workspaces.

Action Parameters

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Permanently Delete Compute Cluster

Description

Tool to permanently delete a Databricks compute cluster. Use when you need to irreversibly remove a cluster and its resources. After permanent deletion, the cluster will no longer appear in the cluster list and cannot be recovered.

Action Parameters

cluster_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Pin Compute Cluster

Description

Tool to pin a Databricks compute cluster configuration. Use when you need to preserve a cluster's configuration beyond the standard 30-day retention period. This operation is idempotent - pinning an already-pinned cluster has no effect. Requires workspace administrator privileges.

Action Parameters

cluster_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: List Compute Cluster Spark Versions

Description

Tool to list all available Databricks Runtime Spark versions for cluster creation. Use when you need to determine which Spark versions are available for creating or configuring clusters. The 'key' field from the response should be used as the 'spark_version' parameter when creating clusters.

Action Parameters

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Start Compute Cluster

Description

Tool to start a terminated Databricks compute cluster asynchronously. Use when you need to restart a stopped cluster. The cluster transitions through PENDING state before reaching RUNNING. Poll cluster status to verify when fully started.

Action Parameters

cluster_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Unpin Compute Cluster

Description

Tool to unpin a Databricks compute cluster configuration. Use when you need to allow a cluster's configuration to be removed after termination. This operation is idempotent - unpinning an already-unpinned cluster has no effect. Requires workspace administrator privileges.

Action Parameters

cluster_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update Compute Cluster

Description

Tool to partially update a Databricks compute cluster configuration using field masks. Use when you need to update specific cluster attributes without providing a full configuration. The update_mask specifies which fields to modify. Running clusters restart to apply changes; terminated clusters apply changes on next startup.

Action Parameters

apply_policy_default_values
boolean
autoscale
object
autotermination_minutes
integer
aws_attributes
object
azure_attributes
object
cluster_id
stringRequired
cluster_log_conf
object
cluster_name
string
custom_tags
object
data_security_mode
string
docker_image
object
driver_instance_pool_id
string
driver_node_type_id
string
enable_elastic_disk
boolean
enable_local_disk_encryption
boolean
gcp_attributes
object
init_scripts
array
instance_pool_id
string
node_type_id
string
num_workers
integer
policy_id
string
runtime_engine
string
single_user_name
string
spark_conf
object
spark_env_vars
object
spark_version
string
ssh_public_keys
array
update_mask
stringRequired
workload_type
object

Action Response

data
objectRequired
error
string
successful
booleanRequired
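
The `update_mask` names which fields the partial update should touch; a comma-separated list of field paths is assumed here, in the usual FieldMask style. A sketch with a client-side consistency check:

```python
args = {
    "cluster_id": "0123-456789-abcde000",  # hypothetical cluster ID
    "autotermination_minutes": 60,
    "num_workers": 4,
    "update_mask": "autotermination_minutes,num_workers",
}

# Sanity check: every field being sent (besides identifiers) appears in the mask.
mask_fields = set(args["update_mask"].split(","))
payload_fields = set(args) - {"cluster_id", "update_mask"}
assert payload_fields <= mask_fields
```

Fields omitted from the mask are left untouched, which is what distinguishes this tool from Edit Compute Cluster above, where `spark_version` and a full configuration are required.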

Tool Name: Create Global Init Script

Description

Tool to create a new global initialization script in Databricks workspace. Use when you need to run scripts on every node in every cluster. Global init scripts run on all cluster nodes and only workspace admins can create them. Scripts execute in position order and clusters must restart to apply changes. The script cannot exceed 64KB when decoded.

Action Parameters

enabled
boolean
name
stringRequired
position
integer
script
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired
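
The 64KB limit applies to the decoded script, and the Get Global Init Script tool below describes script content as Base64-encoded, so encoding the script before submission is assumed here. A sketch:

```python
import base64

shell_script = "#!/bin/bash\necho 'configuring node' >> /tmp/init.log\n"
encoded = base64.b64encode(shell_script.encode()).decode()

# The decoded script cannot exceed 64KB.
assert len(base64.b64decode(encoded)) <= 64 * 1024

args = {
    "name": "node-setup",   # required
    "script": encoded,      # required
    "enabled": True,
    "position": 0,          # scripts execute in position order
}
```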

Tool Name: Delete Global Init Script

Description

Tool to delete a global initialization script from Databricks workspace. Use when you need to remove a script that runs on every cluster node. Requires workspace administrator privileges. Clusters must restart to reflect the removal of the script.

Action Parameters

script_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Global Init Script

Description

Tool to retrieve complete details of a global initialization script in Databricks workspace. Use when you need to view script configuration, Base64-encoded content, or metadata. Returns all script details including creation/update timestamps and whether the script is enabled.

Action Parameters

script_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update Global Init Script

Description

Tool to update a global initialization script in Databricks workspace. Use when you need to modify script content, name, enabled status, or execution order. All fields are optional; unspecified fields retain their current value. Existing clusters must be restarted to pick up changes.

Action Parameters

enabled
boolean
name
string
position
integer
script
string
script_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Create Compute Instance Pool

Description

Tool to create a new Databricks instance pool with specified configuration. Use when you need to set up a pool that reduces cluster start and auto-scaling times by maintaining idle, ready-to-use cloud instances. When attached to a pool, a cluster allocates driver and worker nodes from the pool.

Action Parameters

aws_attributes
object
azure_attributes
object
custom_tags
object
disk_spec
object
enable_elastic_disk
boolean
gcp_attributes
object
idle_instance_autotermination_minutes
integerRequired
instance_pool_fleet_attributes
object
instance_pool_name
stringRequired
max_capacity
integer
min_idle_instances
integer
node_type_id
stringRequired
preloaded_docker_images
array
preloaded_spark_versions
array
remote_disk_throughput
integer
total_initial_remote_disk_size
integer

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete Compute Instance Pool

Description

Tool to delete a Databricks compute instance pool. Use when you need to permanently remove an instance pool. The idle instances in the pool are terminated asynchronously after deletion.

Action Parameters

instance_pool_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Edit Compute Instance Pool

Description

Tool to modify the configuration of an existing Databricks instance pool. Use when you need to update pool settings like capacity, termination minutes, or preloaded images. Note that the pool's node type cannot be changed after creation, though it must still be provided with the same value.

Action Parameters

aws_attributes
object
azure_attributes
object
custom_tags
object
disk_spec
object
enable_elastic_disk
boolean
gcp_attributes
object
idle_instance_autotermination_minutes
integer
instance_pool_fleet_attributes
object
instance_pool_id
stringRequired
instance_pool_name
stringRequired
max_capacity
integer
min_idle_instances
integer
node_type_id
stringRequired
preloaded_docker_images
array
preloaded_spark_versions
array

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Instance Pool Details

Description

Tool to retrieve detailed information about a Databricks instance pool by its ID. Use when you need to get instance pool configuration, capacity settings, preloaded images, and usage statistics. Instance pools reduce cluster start and auto-scaling times by maintaining idle, ready-to-use cloud instances.

Action Parameters

instance_pool_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Instance Pool Permission Levels

Description

Tool to retrieve available permission levels for a Databricks instance pool. Use when you need to understand what permission levels can be assigned to users, groups, or service principals for a specific instance pool. Returns permission levels like CAN_ATTACH_TO and CAN_MANAGE with their descriptions.

Action Parameters

instance_pool_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Instance Pool Permissions

Description

Tool to retrieve permissions for a Databricks instance pool. Use when you need to check who has access to a specific instance pool and their permission levels. Returns the access control list with user, group, and service principal permissions, including inherited permissions from parent objects.

Action Parameters

instance_pool_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Set Compute Instance Pool Permissions

Description

Tool to set permissions for a Databricks instance pool, replacing all existing permissions. Use when you need to configure access control for an instance pool. This operation replaces ALL existing permissions. You must have CAN_MANAGE permission on a pool to configure its permissions.

Action Parameters

access_control_list
arrayRequired
instance_pool_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update Instance Pool Permissions

Description

Tool to incrementally update permissions on a Databricks instance pool. Use when you need to modify permissions without replacing the entire permission set. This PATCH operation updates only the specified permissions, preserving existing permissions not included in the request.

Action Parameters

access_control_list
arrayRequired
instance_pool_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Add Compute Instance Profile

Description

Tool to register an instance profile in Databricks for cluster launches. Use when administrators need to grant users permission to launch clusters using that profile. Requires admin access. Successfully registered profiles enable clusters to use the associated IAM role.

Action Parameters

iam_role_arn
string
instance_profile_arn
stringRequired
is_meta_instance_profile
boolean
skip_validation
boolean

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Edit Compute Instance Profile

Description

Tool to modify an existing AWS EC2 instance profile registered with Databricks. Use when you need to update the IAM role ARN associated with an instance profile. This operation is only available to admin users. The IAM role ARN is required if both of the following are true: your role name and instance profile name do not match, and you want to use the instance profile with Databricks SQL Serverless.

Action Parameters

iam_role_arn
string
instance_profile_arn
stringRequired
is_meta_instance_profile
boolean

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Remove Compute Instance Profile

Description

Tool to remove an instance profile from Databricks. Use when you need to unregister an AWS instance profile ARN from Databricks. This operation is only accessible to admin users. Existing clusters with this instance profile will continue to function normally after removal.

Action Parameters

instance_profile_arn
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Enforce Cluster Policy Compliance

Description

Tool to update a cluster to be compliant with the current version of its policy. Use when you need to enforce policy compliance on a cluster. The cluster can be updated if it is in a RUNNING or TERMINATED state. Note: Clusters created by Databricks Jobs, DLT, or Models cannot be enforced by this API.

Action Parameters

cluster_id
stringRequired
validate_only
boolean

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Cluster Policy Compliance

Description

Tool to retrieve policy compliance status for a specific cluster. Use when you need to check whether a cluster meets the requirements of its assigned policy and identify any policy violations. Clusters could be out of compliance if their policy was updated after the cluster was last edited.

Action Parameters

cluster_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Compute Policy Families

Description

Tool to retrieve information for a policy family by identifier and optional version. Use when you need to view Databricks-provided templates for configuring clusters for a particular use case. Policy families cannot be created, edited, or deleted by users.

Action Parameters

policy_family_id
stringRequired
version
integer

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Create Databricks Cluster

Description

Tool to create a new Databricks Spark cluster with specified configuration. Use when you need to provision compute resources for data processing. This is an asynchronous operation that returns a cluster_id immediately with the cluster in PENDING state. The cluster transitions through states until reaching RUNNING.

Action Parameters

apply_policy_default_values
boolean
autoscale
object
autotermination_minutes
integer
aws_attributes
object
azure_attributes
object
clone_from
string
cluster_log_conf
object
cluster_name
string
cluster_source
string
custom_tags
object
data_security_mode
string
docker_image
object
driver_instance_pool_id
string
driver_node_type_id
string
enable_elastic_disk
boolean
enable_local_disk_encryption
boolean
gcp_attributes
object
idempotency_token
string
init_scripts
array
instance_pool_id
string
node_type_id
string
num_workers
integer
policy_id
string
runtime_engine
string
single_user_name
string
spark_conf
object
spark_env_vars
object
spark_version
stringRequired
ssh_public_keys
array
workload_type
object

Action Response

data
objectRequired
error
string
successful
booleanRequired
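
The cluster-creation parameters above can be assembled into a minimal arguments dict. A sketch, with one assumption to note: only spark_version is strictly required, and the runtime label, node type, and other values shown here are illustrative placeholders, not recommendations.

```python
def build_cluster_args(spark_version: str, **optional) -> dict:
    """Assemble a Create Databricks Cluster arguments dict,
    dropping any optional fields left as None."""
    args = {"spark_version": spark_version}
    args.update({k: v for k, v in optional.items() if v is not None})
    return args

cluster_args = build_cluster_args(
    "14.3.x-scala2.12",           # assumed runtime version label
    cluster_name="etl-cluster",   # hypothetical name
    node_type_id="i3.xlarge",     # assumed AWS node type
    num_workers=2,
    autotermination_minutes=60,
)
```

Because creation is asynchronous, the returned cluster_id should then be polled (e.g. via Get Cluster Information) until the cluster leaves the PENDING state.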

Tool Name: Create Genie Message

Description

Tool to create a message in a Genie conversation and get AI-generated responses. Use when you need to ask questions or send messages to Genie for data analysis. The response initially has status 'IN_PROGRESS' and should be polled every 1-5 seconds until it reaches COMPLETED, FAILED, or CANCELLED status. Subject to a 5-queries-per-minute rate limit during Public Preview.

Action Parameters

content
stringRequired
conversation_id
stringRequired
space_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired
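
The polling guidance above can be sketched as a small helper. Here `get_message` is any callable that returns the message as a dict (for example, a wrapper around the Get Genie Message tool); the stubbed responses below are placeholders, not real API output.

```python
import time

TERMINAL_STATUSES = {"COMPLETED", "FAILED", "CANCELLED"}

def poll_message(get_message, interval_s=2.0, max_attempts=30):
    """Poll a Genie message until it leaves IN_PROGRESS or we give up."""
    for _ in range(max_attempts):
        msg = get_message()
        if msg.get("status") in TERMINAL_STATUSES:
            return msg
        time.sleep(interval_s)  # 1-5 s between polls, per the guidance above
    raise TimeoutError("Genie message did not reach a terminal status")

# Stub standing in for repeated Get Genie Message calls:
_responses = iter([{"status": "IN_PROGRESS"}, {"status": "COMPLETED"}])
final = poll_message(lambda: next(_responses), interval_s=0.0)
```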

Tool Name: Create Genie Space

Description

Tool to create a new Genie space from a serialized payload for programmatic space management. Use when you need to create a Genie workspace for AI-powered data analysis with predefined sample questions and data sources. The space requires a SQL warehouse ID and a serialized configuration that includes sample questions, instructions, and data source tables.

Action Parameters

description
string
display_name
string
parent_path
string
serialized_space
stringRequired
warehouse_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete Genie Conversation

Description

Tool to delete a conversation from a Genie space programmatically. Use when you need to remove conversations to stay within the Genie space limit (10,000 conversations per space). Useful for deleting older or test conversations that are no longer needed.

Action Parameters

conversation_id
stringRequired
space_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete Genie Conversation Message

Description

Tool to delete a specific message from a Genie conversation. Use when you need to remove individual messages from conversations. This operation permanently deletes the message and cannot be undone.

Action Parameters

conversation_id
stringRequired
message_id
stringRequired
space_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Execute Message Attachment Query

Description

Tool to execute SQL query for an expired message attachment in a Genie space. Use when a query attachment has expired and needs to be re-executed to retrieve fresh results. Returns SQL statement execution results with schema, metadata, and data.

Action Parameters

attachment_id
stringRequired
conversation_id
stringRequired
message_id
stringRequired
space_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Execute Genie Message Query

Description

Tool to execute the SQL query associated with a Genie message. Use when you need to run the query generated by Genie and retrieve result data. Note: This endpoint is deprecated in favor of ExecuteMessageAttachmentQuery.

Action Parameters

conversation_id
stringRequired
message_id
stringRequired
space_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Genie Message

Description

Tool to retrieve details of a specific message from a Genie conversation. Use when you need to get message content, status, attachments, or check processing status of a previously created message.

Action Parameters

conversation_id
stringRequired
message_id
stringRequired
space_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Genie Message Attachment Query Result

Description

Tool to retrieve SQL query results from a Genie message attachment. Use when the message status is EXECUTING_QUERY or COMPLETED and you need to fetch the actual query execution results. Returns statement execution details including query data, schema, and metadata with a maximum of 5000 rows.

Action Parameters

attachment_id
stringRequired
conversation_id
stringRequired
message_id
stringRequired
space_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Genie Message Query Result

Description

Tool to retrieve SQL query execution results for a Genie message (up to 5000 rows). Use when message status is EXECUTING_QUERY or COMPLETED and the message has a query attachment. Returns query results with schema, metadata, and data in inline or external link format.

Action Parameters

conversation_id
stringRequired
message_id
stringRequired
space_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Genie Message Query Result

Description

Tool to retrieve SQL query execution results for a message attachment in a Genie space conversation. Use when you need to fetch query results from a Genie conversation message. Note: This endpoint is deprecated; consider using GetMessageAttachmentQueryResult instead. Returns results only when message status is EXECUTING_QUERY or COMPLETED. Maximum 5,000 rows per result.

Action Parameters

attachment_id
stringRequired
conversation_id
stringRequired
message_id
stringRequired
space_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Genie Space Details

Description

Tool to retrieve detailed information about a specific Databricks Genie space by ID. Use when you need to get configuration details, metadata, and optionally the serialized space content for backup or promotion across workspaces. Requires at least CAN EDIT permission to retrieve the serialized space content.

Action Parameters

include_serialized_space
boolean
space_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: List Genie Conversation Messages

Description

Tool to retrieve all messages from a specific conversation thread in a Genie space. Use when you need to view the complete message history of a conversation including user queries and AI responses. Supports pagination for conversations with many messages.

Action Parameters

conversation_id
stringRequired
page_size
integer
page_token
string
space_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired
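
Since this listing is paginated, a small drain loop covers the page_token flow. A sketch under one assumption: the response dict is taken to expose the items under 'messages' and the cursor under 'next_page_token'; `list_page` stands in for the actual tool call.

```python
def list_all_messages(list_page):
    """Collect every message across pages. list_page(page_token) is a
    stand-in for a List Genie Conversation Messages call."""
    items, token = [], None
    while True:
        page = list_page(token)
        items.extend(page.get("messages", []))
        token = page.get("next_page_token")
        if not token:
            return items

# Stub pages standing in for real responses:
_pages = {
    None: {"messages": ["m1", "m2"], "next_page_token": "t1"},
    "t1": {"messages": ["m3"]},
}
msgs = list_all_messages(lambda tok: _pages[tok])
```

The same loop shape applies to List Genie Conversations and List Genie Spaces, which take the same page_size/page_token parameters.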

Tool Name: List Genie Conversations

Description

Tool to retrieve all existing conversation threads within a Genie space. Use when you need to view conversations in a Genie space, either for the current user or all users if you have CAN MANAGE permission. Supports pagination for spaces with many conversations.

Action Parameters

include_all
boolean
page_size
integer
page_token
string
space_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: List Genie Spaces

Description

Tool to retrieve all Genie spaces in the workspace that the authenticated user has access to. Use when you need to list available Genie spaces, their metadata, and warehouse associations. Supports pagination for workspaces with many spaces.

Action Parameters

page_size
integer
page_token
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Send Genie Message Feedback

Description

Tool to send feedback for a Genie message. Use when you need to provide positive, negative, or no feedback rating for AI-generated messages in Genie conversations. Positive feedback on responses that join tables or use SQL expressions can prompt Genie to suggest new SQL snippets to space managers for review and approval.

Action Parameters

conversation_id
stringRequired
message_id
stringRequired
rating
stringRequired
space_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Start Genie Conversation

Description

Tool to start a new Genie conversation in a Databricks space for natural language data queries. Use when you need to ask questions about data using natural language. The message processes asynchronously, so initial status will be IN_PROGRESS. Poll the message status to get the completed response with query results.

Action Parameters

content
stringRequired
space_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Trash Genie Space

Description

Tool to move a Genie space to trash instead of permanently deleting it. Use when you need to remove a Genie space while retaining recovery options. Trashed spaces follow standard Databricks trash behavior with 30-day retention before permanent deletion. Requires CAN MANAGE permission on the space.

Action Parameters

space_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update Genie Space

Description

Tool to update an existing Genie space configuration. Use when you need to modify a Genie space's title, description, warehouse assignment, or complete serialized configuration. Supports partial updates (only provide fields you want to change) or full replacement via serialized_space. Useful for CI/CD pipelines, version control, and automated space management.

Action Parameters

description
string
id
stringRequired
serialized_space
string
title
string
warehouse_id
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Create Lakeview Dashboard

Description

Tool to create a new Lakeview dashboard in Databricks. Use when you need to create AI/BI dashboards for data visualization and analytics. Only the display_name parameter is required to create a blank dashboard, or you can provide serialized_dashboard to duplicate an existing dashboard.

Action Parameters

display_name
stringRequired
parent_path
string
serialized_dashboard
string
warehouse_id
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete Lakeview Dashboard Schedule

Description

Tool to delete a dashboard schedule from a Lakeview dashboard. Use when you need to remove scheduled refreshes or updates for a dashboard. Provide the etag parameter to ensure the schedule hasn't been modified since last retrieval (optimistic concurrency control).

Action Parameters

dashboard_id
stringRequired
etag
string
schedule_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Published Dashboard Token Info

Description

Tool to retrieve authorization info for generating downscoped tokens to access published Lakeview dashboards. Use when you need to generate OAuth tokens for dashboard embedding for external users, ensuring tokens are properly scoped to prevent leaking privileged access.

Action Parameters

dashboard_id
stringRequired
external_value
stringRequired
external_viewer_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Lakeview Dashboard Details

Description

Tool to retrieve details about a draft AI/BI Lakeview dashboard from the workspace. Use when you need to get comprehensive information about a dashboard including metadata, configuration, state, and serialized dashboard content.

Action Parameters

dashboard_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Published Lakeview Dashboard

Description

Tool to retrieve the current published version of a Lakeview dashboard. Use when you need to get information about the published dashboard including its display name, embedded credentials status, warehouse configuration, and last revision timestamp.

Action Parameters

dashboard_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Lakeview Dashboard Schedule

Description

Tool to retrieve a specific schedule for a Databricks AI/BI Lakeview dashboard. Use when you need to get schedule details including cron expressions, pause status, warehouse configuration, and subscription information. Each dashboard can have up to 10 schedules, with each schedule supporting up to 100 subscriptions.

Action Parameters

dashboard_id
stringRequired
schedule_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Publish Lakeview Dashboard

Description

Tool to publish an AI/BI Lakeview dashboard, making it accessible via a public link. Use when you need to publish a draft dashboard with embedded credentials and assign a warehouse for query execution. After successful publication, the dashboard becomes accessible at https://<deployment-url>/dashboardsv3/<resource_id>/published.

Action Parameters

dashboard_id
stringRequired
embed_credentials
booleanDefaults to True
warehouse_id
string

Action Response

data
objectRequired
error
string
successful
booleanRequired
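
The published-URL pattern quoted above can be captured in a tiny helper; the deployment host and resource ID below are hypothetical values for illustration.

```python
def published_url(deployment_url: str, resource_id: str) -> str:
    """URL where a published Lakeview dashboard becomes accessible,
    following the pattern given in the description above."""
    return f"https://{deployment_url}/dashboardsv3/{resource_id}/published"

url = published_url(
    "adb-1234567890123456.7.azuredatabricks.net",  # hypothetical workspace host
    "01ef1234abcd",                                # hypothetical resource ID
)
```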

Tool Name: Trash Lakeview Dashboard

Description

Tool to move a Lakeview dashboard to trash instead of permanently deleting it. Use when you need to remove a dashboard while retaining recovery options. Trashed dashboards can be recovered within 30 days before permanent deletion.

Action Parameters

dashboard_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Unpublish Lakeview Dashboard

Description

Tool to unpublish an AI/BI Lakeview dashboard while preserving its draft version. Use when you need to remove the published version of a dashboard. The draft version remains available and can be republished later if needed.

Action Parameters

dashboard_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update Lakeview Dashboard

Description

Tool to update a draft Lakeview dashboard configuration and metadata. Use when you need to modify dashboard properties such as display name, warehouse, location, or content. This is a partial update operation - only provided fields will be updated. The etag field can be used for optimistic concurrency control to prevent conflicts from concurrent modifications.

Action Parameters

dashboard_id
stringRequired
dataset_catalog
string
dataset_schema
string
display_name
string
etag
string
parent_path
string
serialized_dashboard
string
warehouse_id
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Create Database Catalog

Description

Tool to create a new database catalog in Databricks. Use when you need to establish a catalog for organizing database objects within a specific database instance. Requires appropriate database permissions.

Action Parameters

database_name
stringRequired
instance_name
stringRequired
name
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Create Database Instance

Description

Tool to create a Lakebase database instance with specified configuration. Use when you need to provision a new database instance in Databricks; the creator is assigned the database owner and superuser roles, granting full administrative capabilities on the instance.

Action Parameters

capacity
stringRequired
custom_tags
array
enable_pg_native_login
boolean
name
stringRequired
retention_window_in_days
integer
usage_policy_id
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete Database Instance

Description

Tool to delete a Lakebase Postgres database instance. Use when you need to permanently remove a database instance and all associated data. The instance should be stopped before deletion, and users must have CAN MANAGE permissions. This operation cannot be undone.

Action Parameters

force
boolean
id
stringRequired
purge
booleanRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete Synced Database Table

Description

Tool to delete a synced table from Unity Catalog and stop data refreshes. Use when you need to deregister a synced table connection between Unity Catalog and a database instance. Note: The underlying Postgres table remains and must be manually dropped to free space.

Action Parameters

synced_table_name
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Find Database Instance By UID

Description

Tool to find a database instance by its unique identifier (UID). Use when you need to retrieve instance details using the immutable UUID instead of the instance name.

Action Parameters

uid
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Generate Database Credential

Description

Tool to generate OAuth token for database instance authentication. Use when you need to authenticate to Databricks database instances. The generated token is workspace-scoped and expires after one hour, though open connections remain active past expiration.

Action Parameters

claims
string
instance_names
arrayRequired
request_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired
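
A minimal sketch of the credential-request arguments. One assumption to flag: request_id is treated here as a unique per-request identifier, for which a fresh UUID is a reasonable choice; the instance name is hypothetical.

```python
import uuid

def build_credential_args(instance_names):
    """Arguments for Generate Database Credential. request_id is assumed
    to be a unique identifier per request; a fresh UUID works well."""
    return {
        "instance_names": list(instance_names),
        "request_id": str(uuid.uuid4()),
    }

cred_args = build_credential_args(["my-lakebase-instance"])  # hypothetical name
```

Remember the resulting OAuth token is workspace-scoped and expires after one hour, so long-running services should refresh it, even though already-open connections stay active.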

Tool Name: Get Database Instance

Description

Tool to retrieve detailed information about a specific database instance by its name identifier. Use when you need to get comprehensive configuration details including capacity, state, retention settings, and connection endpoints for a PostgreSQL database instance managed by Databricks Lakebase.

Action Parameters

name
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Create Data Quality Monitor

Description

Tool to create a data quality monitor for a Unity Catalog Delta table. Use when you need to set up monitoring for table quality, track data drift, or monitor ML model inference logs. Supports snapshot, time series, and inference log monitoring types. Only one monitor can be created per table.

Action Parameters

assets_dir
stringRequired
baseline_table_name
string
custom_metrics
array
data_classification_config
object
inference_log
object
notifications
object
output_schema_name
stringRequired
schedule
object
skip_builtin_dashboard
boolean
slicing_exprs
array
snapshot
object
table_name
stringRequired
time_series
object
warehouse_id
string

Action Response

data
objectRequired
error
string
successful
booleanRequired
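
A sketch of the required arguments for a snapshot-type monitor. The table, directory, and schema names are hypothetical; passing an (empty) snapshot object is taken here to select snapshot monitoring, as opposed to supplying time_series or inference_log.

```python
monitor_args = {
    "table_name": "main.sales.orders",                             # hypothetical UC table
    "assets_dir": "/Workspace/Users/someone@example.com/monitors/orders",  # hypothetical path
    "output_schema_name": "main.monitoring",                       # hypothetical schema
    "snapshot": {},  # choose snapshot-type monitoring (assumption)
}
```

Since only one monitor is allowed per table, creating a second monitor on the same table is expected to fail.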

Tool Name: List DBFS Directory Contents

Description

Tool to list the contents of a directory or get details of a file in DBFS. Use when you need to browse DBFS directories or check file details. Note: Recommended for directories with fewer than 10,000 files due to a ~60-second timeout limitation. Throws a RESOURCE_DOES_NOT_EXIST error if the path doesn't exist.

Action Parameters

path
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete Databricks Cluster

Description

Tool to terminate a Databricks Spark cluster asynchronously. Use when you need to stop and remove a cluster. The cluster is terminated asynchronously and removed after completion. Cluster configuration is retained for 30 days after termination.

Action Parameters

cluster_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete Databricks Job Run

Description

Tool to delete a non-active Databricks job run from the system. Use when you need to manually remove completed runs before the 60-day auto-deletion. Only non-active runs can be deleted; the call returns an error if the run is still active.

Action Parameters

run_id
integerRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete User by ID

Description

Tool to delete a user from the Databricks workspace by their ID. Use when you need to remove a user resource from the workspace. A user that does not own or belong to a workspace is automatically purged after 30 days. Only workspace admins can deactivate users at the workspace level.

Action Parameters

user_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Edit Databricks Cluster

Description

Tool to edit an existing Databricks cluster configuration. Use when you need to modify cluster settings such as size, Spark version, node types, or cloud-specific attributes. The cluster must be in RUNNING or TERMINATED state. If updated while RUNNING, it will restart to apply changes.

Action Parameters

apply_policy_default_values
boolean
autoscale
object
autotermination_minutes
integer
aws_attributes
object
azure_attributes
object
cluster_id
stringRequired
cluster_log_conf
object
cluster_mount_infos
array
cluster_name
string
custom_tags
object
data_security_mode
string
docker_image
object
driver_instance_pool_id
string
driver_node_type_id
string
enable_elastic_disk
boolean
enable_local_disk_encryption
boolean
gcp_attributes
object
idempotency_token
string
init_scripts
array
instance_pool_id
string
is_single_node
boolean
node_type_id
string
num_workers
integer
policy_id
string
runtime_engine
string
single_user_name
string
spark_conf
object
spark_env_vars
object
spark_version
stringRequired
ssh_public_keys
array
workload_type
object

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Add Block to DBFS Stream

Description

Tool to append a block of data to an open DBFS stream. Use when uploading large files in chunks as part of the DBFS streaming upload workflow: 1) create a stream handle, 2) add blocks, 3) close the stream. Each block is limited to 1 MB of base64-encoded data.

Action Parameters

data
stringRequired
handle
integerRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Create DBFS File Stream

Description

Tool to open a stream for writing to a DBFS file and return a handle. Use when uploading files to DBFS using the streaming workflow: 1) create a stream handle, 2) add blocks of data, 3) close the stream. The returned handle has a 10-minute idle timeout and must be used within that period.

Action Parameters

overwrite
boolean
path
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired
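
Since each Add Block call is capped at 1 MB of base64-encoded data, raw bytes must be chunked so that the encoded form fits under the limit. A sketch of that chunking step only (the create-handle, add-block, and close calls themselves are not shown):

```python
import base64

MAX_BLOCK_B64 = 1024 * 1024          # 1 MB limit per Add Block call
RAW_CHUNK = MAX_BLOCK_B64 * 3 // 4   # base64 expands data by a factor of 4/3

def to_blocks(payload: bytes):
    """Yield base64-encoded blocks that each fit the 1 MB limit."""
    for i in range(0, len(payload), RAW_CHUNK):
        yield base64.b64encode(payload[i:i + RAW_CHUNK]).decode("ascii")

blocks = list(to_blocks(b"x" * 2_000_000))  # would become three Add Block calls
```

Each block is a complete base64 string on its own, so the server can decode the blocks independently and append the results in order.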

Tool Name: Delete DBFS File or Directory

Description

Tool to delete a file or directory from DBFS. Use when you need to remove files or directories from the Databricks File System. For large deletions (>10K files), use dbutils.fs in a cluster context instead of the REST API. The operation may return 503 PARTIAL_DELETE for large deletions and should be re-invoked until it completes.

Action Parameters

path
stringRequired
recursive
boolean

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get DBFS File Status

Description

Tool to get the information of a file or directory in DBFS. Use when you need to check if a file or directory exists, retrieve its size, type, or last modification time. Throws RESOURCE_DOES_NOT_EXIST exception if the file or directory does not exist.

Action Parameters

path
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Move DBFS File or Directory

Description

Tool to move a file or directory from one location to another within DBFS. Use when you need to relocate files or directories in the Databricks File System. Recursively moves all files if the source is a directory. Not recommended for large-scale operations (>10k files) as it may time out after ~60 seconds.

Action Parameters

destination_path
stringRequired
source_path
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Read DBFS File Contents

Description

Tool to read the contents of a file from DBFS. Returns base64-encoded file data with a maximum read size of 1 MB per request. Use when you need to retrieve file contents from the Databricks File System. Throws RESOURCE_DOES_NOT_EXIST if the file does not exist, INVALID_PARAMETER_VALUE if the path is a directory, and MAX_READ_SIZE_EXCEEDED if the read length exceeds 1 MB.

Action Parameters

length
integer
offset
integer
path
stringRequired

Action Response

bytes_read
integer
data
string
error
string
successful
booleanRequired
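
Because each read returns at most 1 MB, larger files are fetched by paging with offset/length until bytes_read drops to zero. A sketch where `read_chunk` is any callable standing in for the read tool:

```python
import base64

def read_all(read_chunk, length=1024 * 1024):
    """Read a DBFS file in <=1 MB pages via offset/length.

    read_chunk(offset, length) stands in for the read tool and is assumed
    to return a dict with base64-encoded 'data' and 'bytes_read'."""
    out, offset = bytearray(), 0
    while True:
        resp = read_chunk(offset, length)
        if resp["bytes_read"] == 0:
            return bytes(out)
        out += base64.b64decode(resp["data"])
        offset += resp["bytes_read"]

# Stub standing in for the real tool call:
blob = b"hello world" * 1000
def fake_read(offset, length):
    chunk = blob[offset:offset + length]
    return {"data": base64.b64encode(chunk).decode("ascii"),
            "bytes_read": len(chunk)}

content = read_all(fake_read, length=4096)
```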

Tool Name: Get All Library Statuses

Description

Tool to retrieve status of all libraries across all Databricks clusters. Use when you need to check library installation status on all clusters, including libraries set to be installed on all clusters via the API or libraries UI. Returns detailed status information for each library on each cluster.

Action Parameters

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Cluster Information

Description

Tool to retrieve comprehensive metadata and configuration details for a Databricks cluster by its unique identifier. Use when you need to check cluster state, configuration, resources, or operational details. Returns cluster information including state, compute configuration, cloud-specific settings, and resource allocations.

Action Parameters

cluster_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Group by ID

Description

Tool to retrieve information for a specific group in Databricks workspace by its ID. Use when you need to get complete group details including members, roles, entitlements, and metadata. Implements the SCIM 2.0 protocol standard for retrieving Group resources.

Action Parameters

attributes
string
excludedAttributes
string
id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get User by ID

Description

Tool to retrieve information for a specific user in Databricks workspace by their ID. Use when you need to get complete user details including identity, contact information, group memberships, roles, and entitlements. Implements the SCIM 2.0 protocol standard for retrieving User resources.

Action Parameters

attributes
string
excludedAttributes
string
user_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update IAM Account Access Control Rule Set

Description

Tool to update account-level access control rule set for service principals, groups, or budget policies. Use when you need to replace the entire set of access control rules for a resource. This is a PUT operation that replaces all existing roles - to preserve existing roles, they must be included in the grant_rules array.

Action Parameters

name
stringRequired
rule_set
objectRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get IAM Account Group V2

Description

Tool to retrieve a specific group resource by its unique identifier from a Databricks account using SCIM v2 protocol. Use when you need to get complete group details including members, roles, and entitlements.

Action Parameters

account_id
stringRequired
id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Current User Information

Description

Tool to retrieve details about the currently authenticated user or service principal making the API request. Use when you need to get information about the current user's identity, groups, roles, and entitlements within the Databricks workspace.

Action Parameters

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Create IAM Group V2

Description

Tool to create a new group in Databricks workspace using SCIM v2 protocol. Use when you need to create a new security group with a unique display name, optionally with initial members, entitlements, and roles.

Action Parameters

displayName
stringRequired
entitlements
array
externalId
string
groups
array
members
array
roles
array

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete IAM Group V2

Description

Tool to delete a group from Databricks workspace using SCIM v2 protocol. Use when you need to permanently remove a security group. Requires appropriate permissions to delete the group.

Action Parameters

id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Workspace IAM Group V2

Description

Tool to retrieve details of a specific group by ID from Databricks workspace using SCIM v2 protocol. Use when you need to get complete group information including members, roles, entitlements, and metadata.

Action Parameters

id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Patch IAM Group V2

Description

Tool to partially update a Databricks workspace group using SCIM 2.0 PATCH operations. Use when you need to modify group attributes like displayName, add/remove members, or update entitlements/roles. All operations in a single request are atomic.

Action Parameters

Operations
arrayRequired
id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired
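
A sketch of the Operations array for a common case, adding a member to a group, following the SCIM 2.0 PatchOp shape. The group and user IDs below are hypothetical.

```python
def add_member_op(user_id: str) -> dict:
    """One SCIM 2.0 PatchOp entry that adds a single member to a group."""
    return {"op": "add", "path": "members", "value": [{"value": user_id}]}

patch_args = {
    "id": "123456789",                       # hypothetical group ID
    "Operations": [add_member_op("987654321")],  # hypothetical user ID
}
```

Since all operations in one request are atomic, removals and attribute updates can be batched into the same Operations list and either all apply or none do.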

Tool Name: Update IAM Group V2

Description

Tool to update an existing group in Databricks workspace using SCIM v2 protocol. This performs a complete replacement of the group resource. Use when you need to update group properties, members, entitlements, or roles. For partial updates, consider using PATCH instead.

Action Parameters

displayName
stringRequired
entitlements
array
externalId
string
groups
array
id
stringRequired
members
array
roles
array

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Migrate Permissions

Description

Tool to migrate ACL permissions from workspace groups to account groups. Use when adopting Unity Catalog and migrating permissions from workspace-level groups to account-level groups. Primarily used by the Unity Catalog Migration (UCX) tool. Supports batch processing with configurable size limits.

Action Parameters

from_workspace_group_name
stringRequired
size
integer
to_account_group_name
stringRequired
workspace_id
integerRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get IAM Permissions

Description

Tool to retrieve IAM permissions for a Databricks workspace object. Use when you need to check who has access to a specific resource and their permission levels. Returns the access control list (ACL) including user, group, and service principal permissions with inheritance information.

Action Parameters

object_id
stringRequired
object_type
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get IAM Permission Levels

Description

Tool to retrieve available permission levels for a Databricks workspace object. Use when you need to understand what permission levels can be assigned to users or groups for a specific object type. Returns permission levels like CAN_READ, CAN_RUN, CAN_EDIT, CAN_MANAGE with their descriptions. Available levels vary by object type.

Action Parameters

object_id
stringRequired
object_type
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Set IAM Permissions

Description

Tool to set IAM permissions for a Databricks workspace object, replacing all existing permissions. Use when you need to configure complete access control for a resource. This operation replaces the entire access control list - existing permissions are overwritten. Admin permissions on the admins group cannot be removed.

Action Parameters

access_control_list
arrayRequired
object_id
stringRequired
object_type
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update IAM Permissions

Description

Tool to incrementally update permissions on Databricks workspace objects including dashboards, jobs, clusters, warehouses, notebooks, and more. Use when you need to modify specific permissions without replacing the entire permission set. This PATCH operation updates only the specified permissions, preserving existing permissions not included in the request.

Action Parameters

access_control_list
arrayRequired
object_id
stringRequired
object_type
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Create IAM Service Principal V2

Description

Tool to create a new service principal in the Databricks workspace using the SCIM v2 protocol. Use when you need to add to the workspace a service principal that already exists at the Databricks account level. In identity-federated workspaces you must specify a valid UUID applicationId.

Action Parameters

active
boolean
applicationId
stringRequired
displayName
string
entitlements
array
externalId
string
groups
array
roles
array

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete IAM Service Principal V2

Description

Tool to delete a service principal from Databricks workspace using SCIM v2 protocol. Use when you need to permanently remove a service principal and revoke its access to the workspace. The operation is idempotent - subsequent DELETE requests to the same ID will return 404 Not Found.

Action Parameters

id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get IAM Service Principal V2

Description

Tool to retrieve details of a specific service principal by ID from Databricks workspace using SCIM v2 protocol. Use when you need to get complete service principal information including groups, roles, entitlements, and metadata.

Action Parameters

attributes
string
excludedAttributes
string
id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Patch IAM Service Principal V2

Description

Tool to partially update a service principal using SCIM 2.0 PATCH operations. Use when you need to modify service principal attributes like active status, displayName, groups, entitlements, or roles without replacing the entire resource. All operations in a single request are atomic.

Action Parameters

Operations
arrayRequired
id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired
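The `Operations` array follows the SCIM 2.0 PatchOp format (RFC 7644): each entry carries `op`, an optional `path`, and `value`, and all operations in one request apply atomically. A sketch of deactivating a service principal and updating its display name in a single request (the ID and names are placeholders):

```python
# SCIM 2.0 PatchOp body: both operations succeed or fail together.
patch_args = {
    "id": "1234567890",  # placeholder service principal ID
    "Operations": [
        {"op": "replace", "path": "active", "value": False},
        {"op": "replace", "path": "displayName", "value": "etl-runner (disabled)"},
    ],
}
```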

Tool Name: Update IAM Service Principal V2

Description

Tool to update an existing service principal in Databricks workspace using SCIM v2 protocol. This performs a complete replacement of the service principal resource (PUT operation). Use when you need to update service principal properties, group memberships, entitlements, or roles. Note: applicationId and id are immutable fields.

Action Parameters

active
boolean
applicationId
stringRequired
displayName
stringRequired
entitlements
array
externalId
string
groups
array
id
stringRequired
roles
array

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Create IAM User V2

Description

Tool to create a new user in Databricks workspace using SCIM v2 protocol. Use when you need to provision a new user account with a unique userName (email), optionally with display name, activation status, group memberships, entitlements, and roles.

Action Parameters

active
boolean
displayName
string
emails
array
entitlements
array
externalId
string
groups
array
name
object
roles
array
userName
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired
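A sketch of a user-provisioning payload. Only `userName` is required, and Databricks expects the user's email address there; the other fields, including the `workspace-access` entitlement value, mirror common SCIM usage and should be treated as illustrative:

```python
# Illustrative SCIM user payload; userName must be unique in the workspace.
new_user = {
    "userName": "new.analyst@example.com",
    "displayName": "New Analyst",
    "active": True,
    "emails": [{"value": "new.analyst@example.com", "primary": True}],
    "entitlements": [{"value": "workspace-access"}],  # assumed entitlement name
}
```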

Tool Name: Delete IAM User V2

Description

Tool to delete a user from the Databricks workspace using the SCIM v2 protocol. Use when you need to deactivate a user and revoke their access to the workspace. Note that users are automatically purged 30 days after deletion if they do not own or belong to any workspace. Applications or scripts using tokens generated by the deleted user will no longer be able to access Databricks APIs.

Action Parameters

id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get IAM User V2

Description

Tool to retrieve detailed information for a specific user by ID from Databricks workspace using SCIM v2 protocol. Use when you need to get complete user information including name, email, groups, roles, entitlements, and metadata.

Action Parameters

attributes
string
excludedAttributes
string
id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get IAM Users V2 Permissions

Description

Tool to retrieve permissions for password-based authentication. Use when you need to check who has access to password authentication and their permission levels. Note: Password authentication was deprecated on July 10, 2024, and is no longer supported.

Action Parameters

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Patch IAM User V2

Description

Tool to partially update a user using SCIM 2.0 PATCH operations. Use when you need to modify user attributes like active status, displayName, userName, name fields, emails, groups, entitlements, or roles without replacing the entire resource. All operations in a single request are atomic.

Action Parameters

Operations
arrayRequired
id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update IAM User V2

Description

Tool to update a user in Databricks workspace using SCIM v2 protocol. This performs a complete replacement of the user resource. Use when you need to update user properties including userName, displayName, active status, groups, entitlements, or roles.

Action Parameters

active
boolean
displayName
string
emails
array
entitlements
array
externalId
string
groups
array
id
stringRequired
name
object
roles
array
userName
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Workspace Access Detail Local

Description

Tool to retrieve detailed workspace access information for a specific identity in Databricks. Use when you need to check workspace access details including permissions, principal information, and access metadata.

Action Parameters

id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: List Job Compliance for Policy

Description

Tool to retrieve policy compliance status of all jobs using a given cluster policy. Use when you need to identify jobs that are out of compliance because the policy was updated after the job was last edited. Jobs are non-compliant when their job clusters no longer meet the requirements of the updated policy.

Action Parameters

page_size
integer
page_token
string
policy_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Job Permission Levels

Description

Tool to retrieve available permission levels for a Databricks job. Use when you need to understand what permission levels can be assigned to users, groups, or service principals for a specific job. Returns permission levels like CAN_VIEW, CAN_MANAGE_RUN, and CAN_MANAGE with their descriptions.

Action Parameters

job_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update Job Permissions

Description

Tool to incrementally update permissions for a Databricks job. Use when you need to modify specific job permissions without replacing the entire permission set. This PATCH operation updates only the specified permissions, preserving existing permissions not included in the request.

Action Parameters

access_control_list
arrayRequired
job_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Job Run By ID

Description

Tool to retrieve metadata of a single Databricks job run by ID. Use when you need to get detailed information about a specific job run including state, timing, and cluster configuration. Runs are automatically removed after 60 days.

Action Parameters

run_id
integerRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: List Databricks Job Runs

Description

Tool to list Databricks job runs in descending order by start time. Use when you need to retrieve a paginated list of job runs with optional filtering by job ID, run status, time range, and other criteria. Supports pagination via offset/limit or page_token. All runs are automatically removed after 60 days.

Action Parameters

active_only
boolean
completed_only
boolean
expand_tasks
boolean
job_id
integer
limit
integer
offset
integer
page_token
string
run_type
string
start_time_from
integer
start_time_to
integer

Action Response

data
objectRequired
error
string
successful
booleanRequired
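Since this tool supports `page_token` pagination, retrieving every run means looping until no token comes back. A sketch of that loop; `fetch_page` stands in for whatever executes the tool, and the `runs`/`next_page_token` response field names mirror the Jobs API but are assumptions here:

```python
def list_all_runs(fetch_page, job_id):
    """Collect every run for a job by following next_page_token until absent.

    fetch_page(arguments) -> response data dict; stands in for the tool call.
    """
    runs, token = [], None
    while True:
        args = {"job_id": job_id, "completed_only": True}
        if token:
            args["page_token"] = token
        page = fetch_page(args)
        runs.extend(page.get("runs", []))
        token = page.get("next_page_token")
        if not token:
            return runs

# Stub fetcher simulating two pages, to show the loop shape.
pages = {
    None: {"runs": [{"run_id": 1}], "next_page_token": "p2"},
    "p2": {"runs": [{"run_id": 2}]},
}
all_runs = list_all_runs(lambda a: pages[a.get("page_token")], job_id=42)
```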

Tool Name: Cancel All Databricks Job Runs

Description

Tool to cancel all active runs of a Databricks job asynchronously. Use when you need to terminate all running instances of a job. The cancellation happens asynchronously without preventing new runs. When all_queued_runs=true without a job_id, it cancels all queued runs across the workspace.

Action Parameters

all_queued_runs
boolean
job_id
integer

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Cancel Databricks Job Run

Description

Tool to cancel a Databricks job run asynchronously. Use when you need to terminate a running job. The run will be terminated shortly after the request completes. If the run is already in a terminal state, this is a no-op.

Action Parameters

run_id
integerRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete Databricks Job Run

Description

Tool to delete a non-active Databricks job run. Use when you need to remove a job run from the workspace. The run must be in a non-active state; attempting to delete an active run will return an error. Runs are automatically removed after 60 days.

Action Parameters

run_id
integerRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Databricks Job Details

Description

Tool to retrieve detailed information about a single Databricks job. Use when you need to get comprehensive job configuration including tasks, schedules, notifications, and cluster settings. For jobs with more than 100 tasks or job clusters, use the page_token parameter to paginate through results.

Action Parameters

job_id
integerRequired
page_token
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Job Permission Levels

Description

Tool to retrieve available permission levels for a Databricks job. Use when you need to understand what permission levels can be assigned to users, groups, or service principals for a specific job. Returns permission levels like CAN_VIEW, CAN_MANAGE_RUN, CAN_MANAGE, and IS_OWNER with their descriptions.

Action Parameters

job_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Databricks Job Run Details

Description

Tool to retrieve complete metadata for a single Databricks job run. Use when you need to get detailed information about a specific job run including its state, timing, cluster configuration, and task details. Note that runs are automatically removed after 60 days. This endpoint does not return the run's output; use the getRunOutput method separately to retrieve output.

Action Parameters

run_id
integerRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Databricks Job Run Output

Description

Tool to retrieve output and metadata of a single Databricks task run. Use when you need to get the output value from dbutils.notebook.exit() or check task execution results. IMPORTANT: This only works on task-level run IDs, not top-level job run IDs for multi-task jobs. API returns first 5 MB of output; for larger results use cloud storage. Runs are auto-removed after 60 days.

Action Parameters

run_id
integerRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Set Databricks Job Permissions

Description

Tool to set permissions for a Databricks job, completely replacing all existing permissions. Use when you need to configure access control for a job. This operation replaces ALL existing permissions; if no access_control_list is provided, all direct permissions are deleted. The job must have exactly one owner (cannot be a group).

Action Parameters

access_control_list
array
job_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired
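Because the replacement ACL must leave the job with exactly one owner, and that owner cannot be a group, the payload can be sanity-checked before sending. A sketch under the assumption that ACL entries use the same `user_name`/`group_name`/`permission_level` shape as elsewhere in this reference:

```python
def validate_job_acl(acl):
    """Check the 'exactly one owner, never a group' rule before a full replace."""
    owners = [e for e in acl if e.get("permission_level") == "IS_OWNER"]
    if len(owners) != 1:
        raise ValueError("job must have exactly one IS_OWNER entry")
    if "group_name" in owners[0]:
        raise ValueError("a group cannot own a job")
    return True

acl = [
    {"user_name": "owner@example.com", "permission_level": "IS_OWNER"},
    {"group_name": "data-engineers", "permission_level": "CAN_MANAGE_RUN"},
]
validate_job_acl(acl)  # passes: one owner, and the owner is a user
```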

Tool Name: Submit Databricks Job Run

Description

Tool to submit a one-time Databricks job run without creating a persistent job. Use when you need to execute a workload directly without defining a reusable job. The job run is submitted immediately and executes the specified tasks. You can track the run using the returned run_id with the jobs/runs/get endpoint.

Action Parameters

access_control_list
array
budget_policy_id
string
email_notifications
object
environments
array
git_source
object
health
object
idempotency_token
string
notification_settings
object
queue
object
run_as
object
run_name
string
tasks
array
timeout_seconds
integer
usage_policy_id
string
webhook_notifications
object

Action Response

data
objectRequired
error
string
successful
booleanRequired
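A minimal one-time submission carries little more than a `run_name` and a `tasks` array, where each task names a `task_key` and a workload. The task fields below mirror the Jobs API task schema, and the notebook path, cluster ID, and token values are placeholders:

```python
# Illustrative one-time run payload: a single notebook task on an existing cluster.
submit_args = {
    "run_name": "ad-hoc-backfill",
    "idempotency_token": "backfill-2024-06-01",  # guards against duplicate submits
    "tasks": [
        {
            "task_key": "backfill",
            "existing_cluster_id": "0000-000000-placeholder",
            "notebook_task": {"notebook_path": "/Workspace/jobs/backfill"},
        }
    ],
    "timeout_seconds": 3600,
}
# The returned run_id can then be polled with the jobs/runs/get endpoint.
```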

Tool Name: List Databricks Jobs

Description

Tool to retrieve a paginated list of all jobs in the Databricks workspace. Use when you need to discover available jobs, filter by name, or iterate through all jobs. Returns jobs in descending order by start time and supports task expansion for detailed task information.

Action Parameters

expand_tasks
boolean
limit
integer
name
string
offset
integer
page_token
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Job Policy Compliance

Description

Tool to retrieve policy compliance status for a specific job. Use when you need to check whether a job meets the requirements of its assigned policies and identify any policy violations. A job can fall out of compliance if a policy it uses was updated after the job was last edited, leaving some of its job clusters no longer compliant with the updated policy.

Action Parameters

job_id
integerRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: List Unity Catalogs

Description

Tool to retrieve a list of all catalogs in the Unity Catalog metastore. Use when you need to discover available catalogs based on user permissions. If the caller is the metastore admin, all catalogs will be retrieved. Otherwise, only catalogs owned by the caller or for which the caller has the USE_CATALOG privilege will be retrieved.

Action Parameters

include_browse
boolean
include_unbound
boolean
max_results
integer
page_token
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: List Clusters

Description

Tool to list all pinned, active, and recently terminated Databricks clusters. Use when you need to retrieve cluster information, monitor cluster status, or get an overview of available compute resources. Returns clusters terminated within the last 30 days along with currently active clusters. Supports filtering by state, source, and policy, with pagination for large result sets.

Action Parameters

filter_by
object
page_size
integer
page_token
string
sort_by
object

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: List Workspace Groups

Description

Tool to list all groups in the Databricks workspace using SCIM v2 protocol. Use when you need to retrieve all groups or search for specific groups using filters and pagination.

Action Parameters

attributes
string
count
integer
excludedAttributes
string
filter
string
sortBy
string
sortOrder
string
startIndex
integer

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: List Instance Pools

Description

Tool to retrieve a list of all active instance pools in the Databricks workspace with their statistics and configuration. Use when you need to get an overview of all available instance pools.

Action Parameters

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: List All Databricks Jobs (API 2.0)

Description

Tool to list all jobs in the Databricks workspace using API 2.0. Use when you need to retrieve all jobs without pagination. Note: API 2.0 does not support pagination or filtering. For pagination support, use the API 2.2 endpoint instead.

Action Parameters

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: List Members of a Security Group

Description

Tool to retrieve all members (users and nested groups) of a Databricks security group. Use when you need to see who belongs to a specific group for access control auditing or management. This method is non-recursive and does not expand nested group memberships.

Action Parameters

group_name
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: List Model Serving Endpoints

Description

Tool to retrieve all serving endpoints for model serving in the workspace. Use when you need to list all available model serving endpoints and their configurations. Returns information about each endpoint including its state, configuration, served models, and traffic routing.

Action Parameters

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: List Node Types

Description

Tool to list all supported node types available for cluster launch in the workspace. Use when you need to determine which instance types are available for creating or configuring Databricks clusters.

Action Parameters

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: List Delta Live Tables Pipelines

Description

Tool to list Delta Live Tables pipelines in the workspace. Use when you need to retrieve a paginated list of pipelines with summary information. The pipeline specification field is not returned by this endpoint - only summary information is provided. For complete pipeline details, use the get pipeline endpoint.

Action Parameters

filter
string
max_results
integer
order_by
array
page_token
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: List Repos

Description

Tool to list Git repos that the calling user has Manage permissions on. Use when you need to retrieve all available repos in the workspace. Supports pagination and filtering by path prefix.

Action Parameters

next_page_token
string
path_prefix
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: List Databricks Job Runs

Description

Tool to list Databricks job runs in descending order by start time. Use when you need to retrieve job runs with optional filtering by job ID, run status, and type. Supports pagination via offset and limit parameters. Runs are automatically removed after 60 days.

Action Parameters

active_only
boolean
completed_only
boolean
job_id
integer
limit
integer
offset
integer
run_type
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: List Catalog Schemas

Description

Tool to retrieve all schemas in a specified catalog from Unity Catalog. Use when you need to discover available schemas within a catalog based on user permissions. If the caller is the metastore admin or owner of the parent catalog, all schemas will be retrieved. Otherwise, only schemas owned by the caller or for which the caller has the USE_SCHEMA privilege will be retrieved.

Action Parameters

catalog_name
stringRequired
include_browse
boolean
max_results
integer
page_token
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: List Secrets

Description

Tool to list all secret keys stored in a Databricks secret scope. Use when you need to retrieve metadata about secrets in a scope (does not return secret values). Requires READ permission on the scope.

Action Parameters

scope
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: List Secret Scopes

Description

Tool to list all secret scopes available in the Databricks workspace. Use when you need to retrieve all secret scopes including their names, backend types (DATABRICKS or AZURE_KEYVAULT), and Key Vault metadata for Azure-backed scopes.

Action Parameters

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: List Security Groups

Description

Tool to list all security groups in the Databricks workspace using SCIM v2 protocol. Use when you need to retrieve all groups with their identifiers and display names for access control management.

Action Parameters

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: List SQL Warehouses

Description

Tool to list all SQL warehouses in the Databricks workspace. Use when you need to retrieve information about available SQL compute resources for running SQL commands. Returns the full list of SQL warehouses the user has access to, including their configuration, state, and connection details.

Action Parameters

page_size
integer
page_token
string
run_as_user_id
integer

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: List Catalog Tables

Description

Tool to list all tables in a Unity Catalog schema with pagination support. Use when you need to retrieve tables from a specific catalog and schema combination. The API is paginated by default - continue reading pages using next_page_token until it's absent to ensure all results are retrieved.

Action Parameters

catalog_name
stringRequired
include_browse
boolean
include_manifest_capabilities
boolean
max_results
integer
omit_columns
boolean
omit_properties
boolean
omit_username
boolean
page_token
string
schema_name
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired
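Unity Catalog names are three-level (`catalog.schema.table`), so a full inventory walks catalogs, then schemas, then tables using the three list tools above. A sketch of that traversal with stub callables standing in for the tool calls; pagination is omitted for brevity (each level would use the same `page_token` loop):

```python
def list_tables_everywhere(list_catalogs, list_schemas, list_tables):
    """Yield fully qualified catalog.schema.table names.

    Each argument is a callable standing in for one of the list tools.
    """
    for cat in list_catalogs():
        for sch in list_schemas(cat):
            for tbl in list_tables(cat, sch):
                yield f"{cat}.{sch}.{tbl}"

# Stub data to show the traversal shape.
names = list(list_tables_everywhere(
    lambda: ["main"],
    lambda c: ["default"],
    lambda c, s: ["events", "users"],
))
```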

Tool Name: List Tokens

Description

Tool to list all valid personal access tokens (PATs) for a user-workspace pair. Use when you need to retrieve all tokens associated with the authenticated user in the current workspace. Note that each PAT is valid for only one workspace, and Databricks automatically revokes PATs that haven't been used for 90 days.

Action Parameters

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: List Users

Description

Tool to list all users in a Databricks workspace using SCIM 2.0 protocol. Use when you need to retrieve user identities and their attributes. Supports filtering, pagination, and sorting.

Action Parameters

attributes
string
count
integer
excludedAttributes
string
filter
string
sortBy
string
sortOrder
string
startIndex
integer

Action Response

data
objectRequired
error
string
successful
booleanRequired
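The `filter` parameter takes a SCIM 2.0 filter expression, and paging is 1-based via `startIndex` and `count`. A sketch of a filtered, projected, sorted listing request (the email is a placeholder):

```python
# SCIM list parameters: filter narrows results, attributes projects fields,
# startIndex is 1-based, and count caps the page size.
list_args = {
    "filter": 'userName eq "analyst@example.com"',
    "attributes": "id,userName,active",  # return only these fields
    "startIndex": 1,
    "count": 50,
    "sortBy": "userName",
    "sortOrder": "ascending",
}
```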

Tool Name: List Vector Search Endpoints

Description

Tool to list all vector search endpoints in the Databricks workspace. Use when you need to retrieve information about vector search endpoints which represent compute resources hosting vector search indexes. Supports pagination for handling large result sets.

Action Parameters

page_token
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Create Marketplace Consumer Installation

Description

Tool to create a marketplace consumer installation for Databricks Marketplace listings. Use when you need to install data products, datasets, notebooks, models, or other marketplace offerings into a workspace. Requires acceptance of consumer terms and the listing ID to proceed with installation.

Action Parameters

accepted_consumer_terms
object
catalog_name
string
id
stringRequired
recipient_type
string
repo_detail
object
share_name
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete Marketplace Consumer Installation

Description

Tool to uninstall a Databricks Marketplace installation. Use when you need to remove an installed data product from your workspace. When an installation is deleted, the shared catalog is removed from the workspace. Requires CREATE CATALOG and USE PROVIDER permissions on the Unity Catalog metastore, or metastore admin role.

Action Parameters

installation_id
stringRequired
listing_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update Marketplace Consumer Installation

Description

Tool to update marketplace consumer installation fields and rotate tokens for marketplace listings. Use when you need to modify installation attributes or refresh access credentials. The token will be rotated if the rotate_token flag is true.

Action Parameters

catalog_name
string
error_message
string
id
string
installation_id
stringRequired
installed_on
integer
listing_id
stringRequired
listing_name
string
recipient_type
string
repo_name
string
repo_path
string
rotate_token
boolean
share_name
string
status
string
token_detail
object
tokens
array

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Batch Get Marketplace Consumer Listings

Description

Tool to batch get published listings from the Databricks Marketplace. Use when you need to retrieve multiple listing details in a single API call. Maximum limit of 50 listing IDs per request.

Action Parameters

ids
array

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Marketplace Consumer Listing

Description

Tool to retrieve a published listing from Databricks Marketplace that consumer has access to. Use when you need to get detailed information about a specific marketplace listing by its ID. Requires Unity Catalog permissions to access marketplace assets.

Action Parameters

id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Marketplace Consumer Personalization Requests

Description

Tool to retrieve personalization requests for a specific marketplace listing. Use when you need to check the status of customization or commercial transaction requests for a listing. Each consumer can make at most one personalization request per listing.

Action Parameters

listing_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Batch Get Marketplace Consumer Providers

Description

Tool to batch get providers from the Databricks Marketplace with visible listings. Use when you need to retrieve multiple provider details in a single API call. Maximum limit of 50 provider IDs per request.

Action Parameters

ids
array

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Marketplace Consumer Provider

Description

Tool to retrieve information about a specific provider in the Databricks Marketplace with visible listings. Use when you need to get provider details including contact information, description, and metadata.

Action Parameters

id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete Listing From Exchange

Description

Tool to remove the association between a marketplace exchange and a listing. Use when you need to disassociate an exchange from a provider listing. This removes the listing from the private exchange, and it will no longer be shared with the curated set of customers in that exchange.

Action Parameters

id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Create Marketplace Provider Listing

Description

Tool to create a new listing in Databricks Marketplace for data providers. Use when you need to publish data products, datasets, models, or notebooks to the marketplace. Requires a listing object with summary information (name and listing_type). For free and instantly available data products, a share must be included during creation.

Action Parameters

listing
objectRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Marketplace Provider Listing

Description

Tool to retrieve a specific marketplace provider listing by its identifier. Use when you need to get detailed information about a published or draft listing including metadata, configuration, and assets.

Action Parameters

id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Create Provider Analytics Dashboard

Description

Tool to create a provider analytics dashboard for monitoring Databricks Marketplace listing metrics. Use when you need to establish analytics tracking for listing views, requests, installs, conversion rates, and consumer information. Requires Marketplace admin role and system tables to be enabled in the metastore.

Action Parameters

version
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Provider Analytics Dashboard

Description

Tool to retrieve provider analytics dashboard information for monitoring consumer usage metrics. Use when you need to access the dashboard ID to view marketplace listing performance including views, requests, installs, and conversion rates.

Action Parameters

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Latest Provider Analytics Dashboard Version

Description

Tool to retrieve the latest logical version of the provider analytics dashboard template. Use when you need to get the current dashboard template version for monitoring consumer usage metrics including listing views, requests, and installs.

Action Parameters

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Create ML Experiment

Description

Tool to create a new MLflow experiment for tracking machine learning runs and models. Use when you need to organize and track ML experiments within Databricks. Returns a RESOURCE_ALREADY_EXISTS error if an experiment with the same name already exists.

Action Parameters

artifact_location
string
name
stringRequired
tags
array

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Create Logged Model

Description

Tool to create a new logged model in MLflow that ties together model metadata, parameters, metrics, and artifacts. Use when you need to create a LoggedModel object as part of the unified 'log + register' workflow introduced in MLflow 2.8. LoggedModel objects persist throughout a model's lifecycle and provide a centralized way to track model information.

Action Parameters

experiment_id
stringRequired
model_type
string
name
string
params
array
source_run_id
string
tags
array

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Create MLflow Experiment Run

Description

Tool to create a new MLflow run within an experiment for tracking machine learning execution. Use when starting a new ML training run, experiment execution, or data pipeline that needs parameter and metric tracking. Returns the created run with a unique run_id for subsequent metric and parameter logging.

Action Parameters

experiment_id
string
run_name
string
start_time
integer
tags
array
user_id
string

Action Response

data
objectRequired
error
string
successful
booleanRequired
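`start_time` is milliseconds since the Unix epoch and `tags` are key/value pairs. A sketch of a run-creation payload; the experiment ID and tag values are placeholders:

```python
import time

# Illustrative run-creation payload; MLflow expects epoch milliseconds.
run_args = {
    "experiment_id": "123456789",  # placeholder
    "run_name": "baseline-xgb",
    "start_time": int(time.time() * 1000),
    "tags": [
        {"key": "git_sha", "value": "abc1234"},
        {"key": "stage", "value": "dev"},
    ],
}
# The response's run_id is what subsequent metric/parameter logging calls use.
```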

Tool Name: Delete ML Experiment

Description

Tool to delete an MLflow experiment and associated metadata, runs, metrics, params, and tags. Use when you need to remove an experiment from Databricks. If the experiment uses FileStore, artifacts associated with the experiment are also deleted.

Action Parameters

experiment_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete Logged Model

Description

Tool to delete a logged model from MLflow tracking. Use when you need to permanently remove a LoggedModel from the tracking server. The deletion is permanent and cannot be undone. LoggedModels track a model's lifecycle across different training and evaluation runs.

Action Parameters

model_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete Logged Model Tag

Description

Tool to delete a tag from a logged model in MLflow. Use when you need to remove metadata from a LoggedModel object. This operation is irreversible and permanently removes the tag from the logged model. Part of MLflow 3's logged model management capabilities.

Action Parameters

key
stringRequired
model_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete ML Experiment Run

Description

Tool to mark an MLflow run for deletion in ML experiments. Use when you need to remove a specific run from Databricks. This is a soft delete; the run is marked for deletion rather than immediately removed, and it can be restored unless it is permanently deleted.

Action Parameters

run_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete ML Experiment Runs

Description

Tool to bulk delete runs in an ML experiment created before a specified timestamp. Use when you need to clean up old experiment runs. Only runs created prior to or at the specified timestamp are deleted. The maximum number of runs that can be deleted in one operation is 10000.

Action Parameters

experiment_id
stringRequired
max_runs
integer
max_timestamp_millis
integerRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired
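A small sketch of computing the required cutoff for this tool: max_timestamp_millis is a Unix timestamp in milliseconds, and the helper below (a hypothetical name, not part of any SDK) also clamps max_runs to the 10000-runs-per-call limit noted above.

```python
import time

def build_bulk_delete_args(experiment_id: str, older_than_days: int,
                           max_runs: int = 10000) -> dict:
    """Arguments for bulk-deleting runs created before a cutoff.

    Runs created at or before max_timestamp_millis (Unix millis) are
    deleted; max_runs is clamped to the API's 10000-run cap.
    """
    cutoff_ms = int((time.time() - older_than_days * 86400) * 1000)
    return {
        "experiment_id": experiment_id,
        "max_timestamp_millis": cutoff_ms,
        "max_runs": min(max_runs, 10000),
    }

args = build_bulk_delete_args("12345", older_than_days=30)
```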

Tool Name: Delete ML Experiment Run Tag

Description

Tool to delete a tag from an MLflow experiment run. Use when you need to remove run metadata. This operation is irreversible and permanently removes the tag from the run.

Action Parameters

key
stringRequired
run_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Finalize Logged Model

Description

Tool to finalize a logged model in MLflow by updating its status to READY or FAILED. Use when custom model preparation logic is complete and you need to mark the model as ready for use or indicate that upload failed. This is part of the logged models feature introduced in MLflow 3.

Action Parameters

model_id
stringRequired
status
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get ML Experiment By Name

Description

Tool to retrieve MLflow experiment metadata by name. Use when you need to get experiment details using the experiment name. Returns deleted experiments but prefers active ones if both exist with the same name. Throws a RESOURCE_DOES_NOT_EXIST error if no matching experiment exists.

Action Parameters

experiment_name
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get ML Experiment

Description

Tool to retrieve metadata for an MLflow experiment by ID. Use when you need to get experiment details including name, artifact location, lifecycle stage, and tags. Works on both active and deleted experiments.

Action Parameters

experiment_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Logged Model

Description

Tool to fetch logged model metadata by unique ID. Use when you need to retrieve a LoggedModel object representing a model logged to an MLflow Experiment. Returns comprehensive model information including metrics, parameters, tags, and artifact details.

Action Parameters

id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get ML Experiment Permission Levels

Description

Tool to retrieve available permission levels for a Databricks ML experiment. Use when you need to understand what permission levels can be assigned to users, groups, or service principals for a specific experiment. Returns permission levels like CAN_READ, CAN_EDIT, CAN_MANAGE, and IS_OWNER with their descriptions.

Action Parameters

experiment_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get ML Experiment Permissions

Description

Tool to retrieve permissions for an MLflow experiment. Use when you need to check who has access to an experiment and their permission levels. Note that notebook experiments inherit permissions from their corresponding notebook, while workspace experiments have independent permissions.

Action Parameters

experiment_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get MLflow Run

Description

Tool to retrieve complete information about a specific MLflow run including metadata, metrics, parameters, tags, inputs, and outputs. Use when you need to get details of a run by its run_id. Returns the most recent metric values when multiple metrics with the same key exist.

Action Parameters

run_id
stringRequired
run_uuid
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Log Batch MLflow Data

Description

Tool to log a batch of metrics, parameters, and tags for an MLflow run in a single request. Use when you need to efficiently log multiple metrics, params, or tags simultaneously. Items within each type are processed sequentially in the order specified. The combined total of all items across metrics, params, and tags cannot exceed 1000.

Action Parameters

metrics
array
params
array
run_id
stringRequired
tags
array

Action Response

data
objectRequired
error
string
successful
booleanRequired
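Since the combined total of metrics, params, and tags per request cannot exceed 1000, larger logging jobs need to be split. A sketch of a client-side chunker (the helper and item shapes are illustrative assumptions):

```python
def chunk_log_batch(run_id: str, metrics=(), params=(), tags=(), limit: int = 1000):
    """Split metrics/params/tags into Log Batch payloads that respect the
    combined 1000-items-per-request cap, preserving item order."""
    tagged = ([("metrics", m) for m in metrics]
              + [("params", p) for p in params]
              + [("tags", t) for t in tags])
    payloads = []
    for i in range(0, len(tagged), limit):
        payload = {"run_id": run_id, "metrics": [], "params": [], "tags": []}
        for kind, item in tagged[i:i + limit]:
            payload[kind].append(item)
        payloads.append(payload)
    return payloads

# 1500 metric points need two requests under the 1000-item cap.
metrics = [{"key": "loss", "value": 0.1, "timestamp": 0, "step": s} for s in range(1500)]
payloads = chunk_log_batch("run-1", metrics=metrics)
```

Each payload would then be passed to the tool as its arguments, one request per chunk.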

Tool Name: Log MLflow Dataset Inputs

Description

Tool to log dataset inputs to an MLflow run for tracking data sources used during model development. Use when you need to track metadata about datasets used in ML experiment runs, including information about the dataset source, schema, and tags. Enables logging of dataset inputs to a run, allowing you to track data sources throughout the ML lifecycle.

Action Parameters

datasets
array
run_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Log Logged Model Parameters

Description

Tool to log parameters for a logged model in MLflow. Use when you need to attach hyperparameters or metadata to a LoggedModel object. A param can be logged only once for a logged model, and attempting to overwrite an existing param will result in an error. Available in MLflow 3+.

Action Parameters

id
stringRequired
params
arrayRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Log MLflow Metric

Description

Tool to log a metric for an MLflow run with timestamp. Use when you need to record ML model performance metrics like accuracy, loss, or custom evaluation metrics. Metrics can be logged multiple times with different timestamps, and values are never overwritten; each log appends to the metric history for that key.

Action Parameters

key
stringRequired
run_id
stringRequired
step
integer
timestamp
integerRequired
value
numberRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Log MLflow Model

Description

Tool to log a model artifact for an MLflow run (Experimental API). Use when you need to record model metadata including artifact paths, flavors, and versioning information for a training run. The model_json parameter should contain a complete MLmodel specification in JSON string format.

Action Parameters

model_json
stringRequired
run_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Log MLflow Dataset Outputs

Description

Tool to log dataset outputs from an MLflow run for tracking data generated during model development. Use when you need to track metadata about datasets produced by ML experiment runs, including information about predictions, model outputs, or generated data. Enables logging of dataset outputs to a run, allowing you to track generated data throughout the ML lifecycle.

Action Parameters

datasets
array
run_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Log MLflow Parameter

Description

Tool to log a parameter for an MLflow run as a key-value pair. Use when you need to record hyperparameters or constant values for ML model training or ETL pipelines. Parameters can only be logged once per run and cannot be changed after logging. Logging identical parameters is idempotent.

Action Parameters

key
stringRequired
run_id
stringRequired
value
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Restore ML Experiment

Description

Tool to restore a deleted MLflow experiment and its associated metadata, runs, metrics, params, and tags. Use when you need to recover a previously deleted experiment from Databricks. If the experiment uses FileStore, underlying artifacts are also restored.

Action Parameters

experiment_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Restore ML Experiment Run

Description

Tool to restore a deleted MLflow run and its associated metadata, metrics, params, and tags. Use when you need to recover a previously deleted run from Databricks ML experiments. The operation cannot restore runs that were permanently deleted.

Action Parameters

run_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Restore ML Experiment Runs

Description

Tool to bulk restore runs in an ML experiment that were deleted at or after a specified timestamp. Use when you need to recover multiple deleted experiment runs. Only runs deleted at or after the specified timestamp are restored. The maximum number of runs that can be restored in one operation is 10000.

Action Parameters

experiment_id
stringRequired
max_runs
integer
min_timestamp_millis
integerRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Search Logged Models

Description

Tool to search for logged models in MLflow experiments based on various criteria. Use when you need to find models that match specific metrics, parameters, tags, or attributes using SQL-like filter expressions. Supports pagination, ordering results, and filtering by datasets.

Action Parameters

datasets
array
experiment_ids
array
filter
string
max_results
integer
order_by
array
page_token
string

Action Response

data
objectRequired
error
string
successful
booleanRequired
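A sketch of composing the SQL-like filter expression this tool accepts. The metrics./tags. prefixes follow MLflow's search grammar; the helper itself is a hypothetical convenience, not part of any SDK.

```python
def build_model_filter(metric: str, threshold: float, tag=None) -> str:
    """Compose a SQL-like filter for Search Logged Models.

    Produces clauses such as "metrics.accuracy >= 0.9" and
    "tags.stage = 'candidate'", joined with AND.
    """
    clauses = [f"metrics.{metric} >= {threshold}"]
    if tag is not None:
        key, value = tag
        clauses.append(f"tags.{key} = '{value}'")
    return " AND ".join(clauses)

filter_str = build_model_filter("accuracy", 0.9, ("stage", "candidate"))
```

The resulting string goes into the tool's `filter` parameter alongside `experiment_ids`, with `page_token` used for pagination.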

Tool Name: Set ML Experiment Tag

Description

Tool to set a tag on an MLflow experiment. Use when you need to add or update experiment metadata. Experiment tags are metadata that can be updated at any time.

Action Parameters

experiment_id
stringRequired
key
stringRequired
value
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Set Logged Model Tags

Description

Tool to set tags on a logged model in MLflow. Use when you need to add or update metadata tags on a LoggedModel object for organization and tracking. Tags are key-value pairs that can be used to search and filter logged models. Part of MLflow 3's logged model management capabilities.

Action Parameters

model_id
stringRequired
tags
arrayRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Set ML Experiment Permissions

Description

Tool to set permissions for an MLflow experiment, replacing all existing permissions. Use when you need to configure access control for an experiment. This operation replaces ALL existing permissions; for incremental updates, use the update permissions endpoint instead.

Action Parameters

access_control_list
arrayRequired
experiment_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Set MLflow Run Tag

Description

Tool to set a tag on an MLflow run. Use when you need to add custom metadata to runs for filtering, searching, and organizing experiments. Tags with the same key can be overwritten by successive writes. Logging the same tag (key, value) is idempotent.

Action Parameters

key
stringRequired
run_id
stringRequired
value
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update ML Experiment

Description

Tool to update MLflow experiment metadata, primarily for renaming experiments. Use when you need to rename an existing experiment. The new experiment name must be unique across all experiments in the workspace.

Action Parameters

experiment_id
stringRequired
new_name
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update ML Experiment Permissions

Description

Tool to incrementally update permissions for an MLflow experiment. Use when you need to modify specific permissions without replacing the entire permission set. This PATCH operation updates only the specified permissions, preserving existing permissions not included in the request.

Action Parameters

access_control_list
arrayRequired
experiment_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired
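A sketch of the access_control_list payload this tool expects. The entry field names (user_name / group_name / service_principal_name plus permission_level) follow the usual Databricks Permissions API convention and are an assumption here, not taken from the Composio schema.

```python
def acl_entry(principal_type: str, principal: str, level: str) -> dict:
    """One access_control_list entry; the field name depends on the
    principal kind (assumed Databricks Permissions API convention)."""
    field = {
        "user": "user_name",
        "group": "group_name",
        "service_principal": "service_principal_name",
    }[principal_type]
    return {field: principal, "permission_level": level}

args = {
    "experiment_id": "12345",
    "access_control_list": [
        acl_entry("user", "ada@example.com", "CAN_MANAGE"),
        acl_entry("group", "ml-readers", "CAN_READ"),
    ],
}
```

Because this is the incremental PATCH variant, principals not listed keep their existing permissions.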

Tool Name: Update ML Experiment Run

Description

Tool to update MLflow run metadata including status, end time, and run name. Use when a run's status changes outside normal execution flow or when you need to rename a run. This endpoint allows you to modify a run's metadata after it has been created.

Action Parameters

end_time
integer
run_id
stringRequired
run_name
string
run_uuid
string
status
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete ML Feature Engineering Kafka Config

Description

Tool to delete a Kafka configuration from ML Feature Engineering. Use when you need to remove Kafka streaming source configurations. The deletion is permanent and cannot be undone. Kafka configurations define how features are streamed from Kafka sources.

Action Parameters

id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Create ML Feature Store Online Store

Description

Tool to create a Databricks Online Feature Store for real-time feature serving. Use when you need to establish serverless infrastructure for low-latency access to feature data at scale. Requires Databricks Runtime 16.4 LTS ML or above, or serverless compute.

Action Parameters

capacity
stringRequired
name
stringRequired
read_replica_count
integer

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete ML Feature Store Online Store

Description

Tool to delete an online store from ML Feature Store. Use when you need to remove online stores that provide low-latency feature serving infrastructure. The deletion is permanent and cannot be undone. Online stores are used for real-time feature retrieval in production ML serving.

Action Parameters

id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete ML Feature Store Online Table

Description

Tool to delete an online table from ML Feature Store. Use when you need to permanently remove an online table and stop data synchronization. This operation deletes all data in the online table permanently and releases all resources.

Action Parameters

name
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Create ML Forecasting Experiment

Description

Tool to create a new AutoML forecasting experiment for time series prediction. Use when you need to automatically train and optimize forecasting models on time series data. The experiment will train multiple models and select the best one based on the primary metric.

Action Parameters

data_dir
string
dataset
stringRequired
experiment_name
string
frequency
stringRequired
horizon
integerRequired
identity_col
array
primary_metric
string
target_col
stringRequired
time_col
stringRequired
timeout_minutes
integer

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete ML Feature Tag

Description

Tool to delete a tag from a feature in a feature table in ML Feature Store. Use when you need to remove metadata tags from specific features. This operation removes the tag association from the feature but does not affect the feature data itself.

Action Parameters

feature_name
stringRequired
feature_table_id
stringRequired
tag_name
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get ML Feature Tag

Description

Tool to retrieve a specific tag from a feature in a feature table in ML Feature Store. Use when you need to get metadata tag details from specific features. This operation returns the tag name and value associated with the feature.

Action Parameters

feature_name
stringRequired
feature_table_id
stringRequired
tag_name
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Set or Update ML Feature Tag

Description

Tool to set or update a tag on a feature in a feature table in ML Feature Store. Use when you need to add or modify metadata tags on specific features. If the tag already exists, it will be updated with the new value. If the tag doesn't exist, it will be created automatically. This operation is idempotent and can be used to ensure a tag has a specific value.

Action Parameters

feature_name
stringRequired
feature_table_id
stringRequired
tag_name
stringRequired
value
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get ML Model Registry Permission Levels

Description

Tool to retrieve available permission levels for a Databricks ML registered model. Use when you need to understand what permission levels can be assigned to users or groups for a specific model. Returns permission levels like CAN_READ, CAN_EDIT, CAN_MANAGE, CAN_MANAGE_PRODUCTION_VERSIONS, CAN_MANAGE_STAGING_VERSIONS, and IS_OWNER with their descriptions.

Action Parameters

registered_model_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete OAuth2 Service Principal Secret

Description

Tool to delete an OAuth secret from a service principal at the account level. Use when you need to revoke OAuth credentials for service principal authentication. Once deleted, applications or scripts using tokens generated from that secret will no longer be able to access Databricks APIs.

Action Parameters

secret_id
stringRequired
service_principal_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Create OAuth Service Principal Secret

Description

Tool to create an OAuth secret for service principal authentication. Use when you need to obtain OAuth access tokens for accessing Databricks Accounts and Workspace APIs. A service principal can have up to five OAuth secrets, each valid for up to two years (730 days). The secret value is only shown once upon creation.

Action Parameters

id
stringRequired
lifetime
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete OAuth2 Service Principal Secret Proxy

Description

Tool to delete an OAuth secret from a service principal. Use when you need to revoke OAuth credentials for service principal authentication. Once deleted, applications using tokens generated from that secret will no longer be able to access Databricks APIs.

Action Parameters

secret_id
stringRequired
service_principal_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete Databricks Pipeline

Description

Tool to delete a Databricks Delta Live Tables pipeline permanently and stop any active updates. Use when you need to remove a pipeline completely. If the pipeline publishes to Unity Catalog, deletion will cascade to all pipeline tables. This action cannot be undone without assistance from Databricks support.

Action Parameters

pipeline_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Pipeline Permission Levels

Description

Tool to retrieve available permission levels for a Databricks Delta Live Tables pipeline. Use when you need to understand what permission levels can be assigned to users or groups for a specific pipeline. Returns permission levels like CAN_VIEW, CAN_RUN, CAN_MANAGE, and IS_OWNER with their descriptions.

Action Parameters

pipeline_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Pipeline Permissions

Description

Tool to retrieve permissions for a Databricks Delta Live Tables pipeline. Use when you need to check who has access to a pipeline and their permission levels. Returns the complete permissions information including access control lists with user, group, and service principal permissions.

Action Parameters

pipeline_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: List Pipeline Updates

Description

Tool to retrieve a paginated list of updates for a Databricks Delta Live Tables pipeline. Use when you need to view the update history for a specific pipeline. Returns information about each update including state, creation time, and configuration details such as full refresh and table selection.

Action Parameters

max_results
integer
page_token
string
pipeline_id
stringRequired
until_update_id
string

Action Response

data
objectRequired
error
string
successful
booleanRequired
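Paginated listings like this one are drained by following the page token. A sketch under stated assumptions: fetch_page stands in for executing the tool with the given arguments dict, and the "updates" / "next_page_token" response keys are assumed from the Databricks pipelines API.

```python
def list_all_pipeline_updates(fetch_page, pipeline_id: str, max_results: int = 25) -> list:
    """Collect every update for a pipeline by following next_page_token.

    fetch_page(args) must perform the actual tool call and return the
    response data; the loop stops when no further page token is returned.
    """
    updates, token = [], None
    while True:
        args = {"pipeline_id": pipeline_id, "max_results": max_results}
        if token:
            args["page_token"] = token
        page = fetch_page(args)
        updates.extend(page.get("updates", []))
        token = page.get("next_page_token")
        if not token:
            return updates
```

In practice fetch_page would wrap the Composio execute call for this tool and return its `data` field.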

Tool Name: Update Pipeline Permissions

Description

Tool to incrementally update permissions on a Databricks pipeline. Use when you need to modify specific permissions without replacing the entire permission set. This PATCH operation updates only the specified permissions, preserving existing permissions not included in the request. Pipelines can inherit permissions from their root object.

Action Parameters

access_control_list
arrayRequired
pipeline_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Create Quality Monitor V2

Description

Tool to create a quality monitor for Unity Catalog table. Use when you need to set up monitoring for data quality metrics, track drift over time, or monitor ML inference logs. Monitor creation is asynchronous; dashboard and metric tables take 8-20 minutes to complete. Exactly one monitor type (snapshot, time_series, or inference_log) must be specified.

Action Parameters

assets_dir
stringRequired
baseline_table_name
string
custom_metrics
array
data_classification_config
object
inference_log
object
notifications
object
output_schema_name
stringRequired
schedule
object
skip_builtin_dashboard
boolean
slicing_exprs
array
snapshot
object
table_name
stringRequired
time_series
object
warehouse_id
string

Action Response

data
objectRequired
error
string
successful
booleanRequired
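Because exactly one monitor type must be specified, it can help to validate the arguments client-side before calling the tool. A hypothetical helper sketching that check (names and the empty-object monitor configs are illustrative):

```python
def build_monitor_args(table_name: str, assets_dir: str, output_schema_name: str,
                       snapshot=None, time_series=None, inference_log=None) -> dict:
    """Enforce the exactly-one-monitor-type rule before calling
    Create Quality Monitor V2; raises ValueError otherwise."""
    chosen = {k: v for k, v in
              {"snapshot": snapshot, "time_series": time_series,
               "inference_log": inference_log}.items() if v is not None}
    if len(chosen) != 1:
        raise ValueError("specify exactly one of snapshot, time_series, inference_log")
    return {"table_name": table_name, "assets_dir": assets_dir,
            "output_schema_name": output_schema_name, **chosen}

args = build_monitor_args("main.sales.orders", "/Workspace/monitors",
                          "main.monitoring", snapshot={})
```

Remember that creation is asynchronous; the dashboard and metric tables appear 8-20 minutes later.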

Tool Name: Get Databricks Job Run Output

Description

Tool to retrieve output and metadata of a single Databricks task run. Use when you need to get the output value from dbutils.notebook.exit() or check task execution results. IMPORTANT: this only works on task-level run IDs, not top-level job run IDs for multi-task jobs. The API returns the first 5 MB of output; for larger results, use cloud storage. Runs are auto-removed after 60 days.

Action Parameters

run_id
integerRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Search MLflow Experiments

Description

Tool to search for MLflow experiments with filtering, ordering, and pagination support. Use when you need to find experiments based on name patterns, tags, or other criteria. Supports SQL-like filtering expressions and ordering by experiment attributes.

Action Parameters

filter
string
max_results
integer
order_by
array
page_token
string
view_type
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Search MLflow Runs

Description

Tool to search for MLflow runs with filtering, ordering, and pagination support. Use when you need to find runs based on metrics, parameters, tags, or other criteria. Supports complex filter expressions with operators like =, !=, >, >=, <, <= for metrics, params, and tags.

Action Parameters

experiment_ids
arrayRequired
filter
string
max_results
integer
order_by
array
page_token
string
run_view_type
string

Action Response

data
objectRequired
error
string
successful
booleanRequired
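A sketch of a complete arguments dict for this tool, combining a metric clause and a param clause with ordering. The metrics./params. prefixes and ACTIVE_ONLY view type follow MLflow's search conventions; the helper name is hypothetical.

```python
def search_runs_args(experiment_ids, min_accuracy: float, optimizer: str) -> dict:
    """Arguments for Search MLflow Runs: filter on a metric and a param,
    ordered by the metric descending, active runs only."""
    return {
        "experiment_ids": list(experiment_ids),
        "filter": (f"metrics.accuracy > {min_accuracy} "
                   f"AND params.optimizer = '{optimizer}'"),
        "order_by": ["metrics.accuracy DESC"],
        "max_results": 100,
        "run_view_type": "ACTIVE_ONLY",
    }

args = search_runs_args(["12345"], 0.85, "adam")
```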

Tool Name: Create Provisioned Throughput Endpoint

Description

Tool to create a provisioned throughput serving endpoint for AI models in Databricks. Use when you need to provision model units for production GenAI applications with guaranteed throughput. The endpoint name must be unique across the workspace and can consist of alphanumeric characters, dashes, and underscores. Returns a long-running operation that completes when the endpoint is ready.

Action Parameters

ai_gateway
object
budget_policy_id
string
config
objectRequired
email_notifications
object
name
stringRequired
tags
array

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete Serving Endpoint

Description

Tool to delete a model serving endpoint and all associated data. Use when you need to permanently remove an endpoint. Deletion is permanent and cannot be undone. This operation disables usage and deletes all data associated with the endpoint.

Action Parameters

name
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Serving Endpoint Details

Description

Tool to retrieve detailed information about a specific serving endpoint by name. Use when you need to get comprehensive information about a serving endpoint including its configuration, state, served entities, traffic routing, and metadata.

Action Parameters

name
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Serving Endpoint OpenAPI Spec

Description

Tool to retrieve the OpenAPI 3.1.0 specification for a serving endpoint. Use when you need to understand the endpoint's schema, generate client code, or visualize the API structure. The endpoint must be in a READY state and the served model must have a model signature logged.

Action Parameters

name
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Serving Endpoint Permission Levels

Description

Tool to retrieve available permission levels for a Databricks serving endpoint. Use when you need to understand what permission levels can be assigned to users or groups for access control. Returns permission levels like CAN_MANAGE, CAN_QUERY, and CAN_VIEW with their descriptions.

Action Parameters

id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update Serving Endpoint Rate Limits

Description

Tool to update rate limits for a Databricks serving endpoint. Use when you need to control the number of API calls allowed within a time period. Note: This endpoint is deprecated; consider using AI Gateway for rate limit management instead.

Action Parameters

name
stringRequired
rate_limits
array

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update Serving Endpoint AI Gateway

Description

Tool to update AI Gateway configuration of a Databricks serving endpoint. Use when you need to configure traffic fallback, AI guardrails, payload logging, rate limits, or usage tracking. Supports external model, provisioned throughput, and pay-per-token endpoints; agent endpoints currently only support inference tables.

Action Parameters

fallback_config
object
guardrails
object
id
stringRequired
inference_table_config
object
rate_limits
array
usage_tracking_config
object

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete AI/BI Dashboard Embedding Access Policy

Description

Tool to delete AI/BI dashboard embedding access policy, reverting to default. Use when you need to remove the workspace-level policy for AI/BI published dashboard embedding. Upon deletion, the workspace reverts to the default setting (ALLOW_APPROVED_DOMAINS), conditionally permitting AI/BI dashboards to be embedded on approved domains.

Action Parameters

etag
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get AI/BI Dashboard Embedding Access Policy

Description

Tool to retrieve workspace AI/BI dashboard embedding access policy setting. Use when you need to check whether AI/BI published dashboard embedding is enabled, conditionally enabled, or disabled. The default setting is ALLOW_APPROVED_DOMAINS which permits AI/BI dashboards to be embedded on approved domains.

Action Parameters

etag
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update AI/BI Dashboard Embedding Access Policy

Description

Tool to update the AI/BI dashboard embedding access policy at the workspace level. Use when you need to control whether AI/BI published dashboard embedding is enabled, conditionally enabled, or disabled. Follows a read-modify-write workflow with etag-based optimistic concurrency control to prevent race conditions.

Action Parameters

allow_missing
booleanDefaults to True
field_mask
stringRequired
setting
objectRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete AI/BI Dashboard Embedding Approved Domains

Description

Tool to delete the list of approved domains for AI/BI dashboard embedding, reverting to default. Use when you need to remove the workspace-level approved domains list for hosting embedded AI/BI dashboards. Upon deletion, the workspace reverts to an empty approved domains list. The approved domains list cannot be modified when the current access policy is not configured to ALLOW_APPROVED_DOMAINS.

Action Parameters

etag
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get AI/BI Dashboard Embedding Approved Domains

Description

Tool to retrieve the list of domains approved to host embedded AI/BI dashboards. Use when you need to check which external domains are permitted to embed AI/BI dashboards. The approved domains list cannot be modified unless the workspace access policy is set to ALLOW_APPROVED_DOMAINS.

Action Parameters

etag
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update AI/BI Dashboard Embedding Approved Domains

Description

Tool to update the list of domains approved to host embedded AI/BI dashboards at the workspace level. Use when you need to modify the approved domains list. The approved domains list can only be modified when the current access policy is set to ALLOW_APPROVED_DOMAINS.

Action Parameters

allow_missing
booleanDefaults to True
field_mask
stringRequired
setting
objectRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Automatic Cluster Update Setting

Description

Tool to retrieve automatic cluster update setting for the workspace. Use when you need to check whether automatic cluster updates are enabled, view maintenance window configuration, or get restart behavior settings. This setting controls whether clusters automatically update during maintenance windows. Currently in Public Preview.

Action Parameters

etag
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update Automatic Cluster Update Setting

Description

Tool to update workspace automatic cluster update configuration with etag-based concurrency control. Use when you need to enable/disable automatic cluster updates, configure maintenance windows, or adjust restart behavior. Requires Premium pricing tier and admin access. If the setting is updated concurrently, the PATCH request fails with HTTP 409 requiring retry with fresh etag.

Action Parameters

allow_missing
booleanDefaults to True
field_mask
stringRequired
setting
objectRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired
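The description above notes that a concurrent update makes the PATCH fail with HTTP 409, requiring a retry with a fresh etag. A runnable sketch of that retry loop, with `get_setting`/`patch_setting` as simulated stand-ins for the Get/Update tools (not real Composio calls):

```python
class Conflict(Exception):
    """Stands in for an HTTP 409 response."""

# Simulated server-side state for the workspace setting
store = {"etag": "v1", "enabled": False}

def get_setting() -> dict:
    return dict(store)

def patch_setting(setting: dict) -> dict:
    if setting["etag"] != store["etag"]:
        raise Conflict()  # stale etag -> 409
    store.update(enabled=setting["enabled"], etag="v2")
    return dict(store)

def update_with_retry(enabled: bool, max_attempts: int = 3) -> dict:
    setting = get_setting()  # read first to capture the current etag
    for _ in range(max_attempts):
        setting["enabled"] = enabled
        try:
            return patch_setting(setting)
        except Conflict:
            setting = get_setting()  # re-read for a fresh etag, then retry
    raise RuntimeError("gave up after repeated 409 conflicts")

result = update_with_retry(True)
```

The same read-then-patch-with-retry shape applies to every etag-controlled Update tool in this toolkit.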

Tool Name: Get Compliance Security Profile Setting

Description

Tool to retrieve workspace compliance security profile setting. Use when you need to check whether CSP is enabled or view configured compliance standards. The CSP enables additional monitoring, enforced instance types for inter-node encryption, hardened compute images, and other security controls. Once enabled, this setting represents a permanent workspace change that cannot be disabled.

Action Parameters

etag
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete Dashboard Email Subscriptions Setting

Description

Tool to delete the dashboard email subscriptions setting, reverting to default value. Use when you need to revert the workspace setting that controls whether schedules or workload tasks for refreshing AI/BI Dashboards can send subscription emails. Upon deletion, the setting reverts to its default value (enabled/true). This is a workspace-level setting.

Action Parameters

etag
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Dashboard Email Subscriptions Setting

Description

Tool to retrieve dashboard email subscriptions setting for the workspace. Use when you need to check whether schedules or workload tasks for refreshing AI/BI Dashboards can send subscription emails. By default, this setting is enabled.

Action Parameters

etag
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update Dashboard Email Subscriptions Setting

Description

Tool to update the Dashboard Email Subscriptions setting for the workspace with etag-based concurrency control. Use when you need to enable or disable whether dashboard schedules can send subscription emails. If the setting is updated concurrently, the PATCH request fails with HTTP 409 requiring retry with fresh etag.

Action Parameters

allow_missing
booleanDefaults to True
field_mask
stringRequired
setting
objectRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete Default Namespace Setting

Description

Tool to delete the default namespace setting for the workspace, removing the default catalog configuration. Use when you need to remove the default catalog used for queries without fully qualified names. Strongly recommended to use etag in a read-delete pattern to prevent concurrent modification conflicts (HTTP 409).

Action Parameters

etag
string

Action Response

data
objectRequired
error
string
successful
booleanRequired
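Several Delete tools here recommend a "read-delete pattern". The idea: GET first to capture the current etag, then pass that etag to DELETE so a concurrent modification surfaces as a 409 instead of racing silently. A minimal sketch with simulated stand-ins for the Get/Delete tools:

```python
# Simulated server-side state for the setting being deleted
store = {"etag": "v7", "present": True}

def get_setting() -> dict:
    return {"etag": store["etag"]}

def delete_setting(etag: str) -> dict:
    if etag != store["etag"]:
        raise RuntimeError("409: stale etag, re-read and retry")
    store["present"] = False
    return {"deleted": True}

current = get_setting()                   # 1. read -> capture etag
result = delete_setting(current["etag"])  # 2. delete using that etag
```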

Tool Name: Get Default Namespace Setting

Description

Tool to retrieve the default catalog namespace setting for the workspace. Use when you need to check which catalog is used for unqualified table references in Unity Catalog-enabled compute. Changes to this setting require restart of clusters and SQL warehouses to take effect.

Action Parameters

etag
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update Default Namespace Setting

Description

Tool to update the default catalog namespace configuration for workspace queries with etag-based concurrency control. Use when you need to configure the default catalog used for queries without fully qualified three-level names. Requires a restart of clusters and SQL warehouses to take effect. Only applies to Unity Catalog-enabled compute. If concurrent updates occur, the request fails with 409 status requiring retry with fresh etag.

Action Parameters

allow_missing
booleanDefaults to True
field_mask
stringRequired
setting
objectRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired
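As an illustration of the parameters above, a payload that sets the default catalog to `main`. The `field_mask` path `namespace.value` and the nested `namespace` field mirror the Databricks default-namespace setting shape and are assumptions here, so check the tool schema before relying on them:

```python
def build_default_namespace_update(catalog: str, etag: str) -> dict:
    """Build Action Parameters for the default-namespace update tool."""
    return {
        "allow_missing": True,
        "field_mask": "namespace.value",  # assumed path into `setting`
        "setting": {
            "etag": etag,                      # from a prior GET
            "namespace": {"value": catalog},   # e.g. a Unity Catalog name
        },
    }

args = build_default_namespace_update("main", etag="etag-abc")
```

Remember that, per the description, clusters and SQL warehouses must be restarted before the new default takes effect.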

Tool Name: Delete Default Warehouse ID Setting

Description

Tool to delete the default warehouse ID setting for the workspace, reverting to default state. Use when you need to remove the default SQL warehouse configuration. Strongly recommended to use etag in a read-delete pattern to prevent concurrent modification conflicts (HTTP 409).

Action Parameters

etag
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Default Warehouse ID Setting

Description

Tool to retrieve the default SQL warehouse ID setting for the workspace. Use when you need to check which warehouse is configured as the default for SQL authoring surfaces, AI/BI dashboards, Genie, Alerts, and Catalog Explorer.

Action Parameters

etag
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update Default Warehouse ID Setting

Description

Tool to update the default SQL warehouse configuration for the workspace with etag-based concurrency control. Use when you need to configure which warehouse is used as the default for SQL operations and queries in the workspace. If concurrent updates occur, the request fails with 409 status requiring retry with fresh etag.

Action Parameters

allow_missing
booleanDefaults to True
field_mask
stringRequired
setting
objectRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete Disable Legacy Access Setting

Description

Tool to delete the disable legacy access workspace setting, re-enabling legacy features. Use when you need to revert to allowing legacy Databricks features. Strongly recommended to use etag in a read-delete pattern to prevent concurrent modification conflicts (HTTP 409). Changes take up to 5 minutes and require cluster/warehouse restart.

Action Parameters

etag
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Disable Legacy Access Setting

Description

Tool to retrieve the disable legacy access workspace setting. Use when you need to check whether legacy feature access is disabled, including direct Hive Metastore access, Fallback Mode on external locations, and Databricks Runtime versions prior to 13.3 LTS.

Action Parameters

etag
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update Disable Legacy Access Setting

Description

Tool to update workspace disable legacy access setting with etag-based concurrency control. Use when you need to enable or disable legacy access features including direct Hive Metastore access, external location fallback mode, and Databricks Runtime versions prior to 13.3 LTS. If concurrent updates occur, the request fails with HTTP 409 requiring retry with fresh etag from the error response.

Action Parameters

allow_missing
booleanDefaults to True
field_mask
stringRequired
setting
objectRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete Disable Legacy DBFS Setting

Description

Tool to delete the disable legacy DBFS workspace setting, reverting to default DBFS access behavior. Use when you need to re-enable access to legacy DBFS root and mounts. Strongly recommended to use etag in a read-delete pattern to prevent concurrent modification conflicts (HTTP 409).

Action Parameters

etag
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Disable Legacy DBFS Setting

Description

Tool to retrieve the disable legacy DBFS workspace setting. Use when you need to check whether legacy DBFS root and mount access is disabled across all interfaces (UI, APIs, CLI, FUSE). When enabled, this setting also disables Databricks Runtime versions prior to 13.3 LTS and requires manual restart of compute clusters and SQL warehouses to take effect.

Action Parameters

etag
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update Disable Legacy DBFS Setting

Description

Tool to update workspace disable legacy DBFS setting with etag-based concurrency control. Use when you need to enable or disable legacy DBFS features including DBFS root access, mounts, and legacy Databricks Runtime versions prior to 13.3 LTS. Changes take up to 20 minutes to take effect and require manual restart of compute clusters and SQL warehouses.

Action Parameters

allow_missing
booleanDefaults to True
field_mask
stringRequired
setting
objectRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Enable Export Notebook Setting

Description

Tool to retrieve workspace setting controlling notebook export functionality. Use when you need to check whether users can export notebooks and files from the Workspace UI. Administrators use this setting to manage data exfiltration controls.

Action Parameters

etag
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update Enable Export Notebook

Description

Tool to update workspace notebook and file export setting. Use when you need to enable or disable users' ability to export notebooks and files from the Workspace UI. Requires admin access.

Action Parameters

allow_missing
booleanDefaults to True
field_mask
stringRequired
setting
objectRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Enable Notebook Table Clipboard Setting

Description

Tool to retrieve notebook table clipboard setting for the workspace. Use when you need to check whether notebook table clipboard functionality is enabled. This setting controls whether users can copy data from tables in notebooks to their clipboard.

Action Parameters

etag
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update Enable Notebook Table Clipboard

Description

Tool to update workspace setting for notebook table clipboard. Use when you need to enable or disable users' ability to copy tabular data from notebook result tables to clipboard. Requires workspace admin privileges.

Action Parameters

allow_missing
booleanDefaults to True
field_mask
stringRequired
setting
objectRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Enable Results Downloading Setting

Description

Tool to retrieve workspace setting controlling notebook results download functionality. Use when you need to check whether users can download notebook query results. Requires workspace administrator privileges to access.

Action Parameters

etag
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update Enable Results Downloading

Description

Tool to update workspace notebook results download setting. Use when you need to enable or disable users' ability to download notebook results. Requires admin access.

Action Parameters

allow_missing
booleanDefaults to True
field_mask
stringRequired
setting
objectRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Enhanced Security Monitoring Setting

Description

Tool to retrieve enhanced security monitoring workspace setting. Use when you need to check whether Enhanced Security Monitoring is enabled for the workspace. Enhanced Security Monitoring provides a hardened disk image and additional security monitoring agents. It is automatically enabled when compliance security profile is active, and can be manually toggled when compliance security profile is disabled.

Action Parameters

etag
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update Enhanced Security Monitoring

Description

Tool to update enhanced security monitoring workspace settings. Use when you need to enable or disable Enhanced Security Monitoring (ESM) for the workspace. Requires the etag from a previous GET request for optimistic concurrency control.

Action Parameters

allow_missing
booleanRequired
field_mask
stringRequired
setting
objectRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Create IP Access List

Description

Tool to create a new IP access list for workspace access control. Use when you need to allow or block specific IP addresses and CIDR ranges from accessing the Databricks workspace. The API will reject creation if the resulting list would block the caller's current IP address. Changes may take a few minutes to take effect.

Action Parameters

enabled
boolean
ip_addresses
array
label
stringRequired
list_type
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired
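Because the API rejects any list that would block the caller's own IP, a client-side preflight with the stdlib `ipaddress` module can catch a self-lockout before the request is sent. The entries below mirror the `ip_addresses` and `list_type` parameters above:

```python
import ipaddress

def caller_still_allowed(caller_ip: str, list_type: str, entries: list[str]) -> bool:
    """Return True if applying this list would not lock the caller out."""
    ip = ipaddress.ip_address(caller_ip)
    in_list = any(ip in ipaddress.ip_network(e, strict=False) for e in entries)
    # A BLOCK list must not contain the caller; an ALLOW list must.
    return not in_list if list_type == "BLOCK" else in_list

ok = caller_still_allowed("10.0.0.5", "ALLOW", ["10.0.0.0/24", "192.168.1.0/24"])
locked_out_check = caller_still_allowed("10.0.0.5", "BLOCK", ["10.0.0.0/24"])
```

Here `ok` is True (the caller falls inside an allowed CIDR) and `locked_out_check` is False (the BLOCK list would cut the caller off, so the create call should not be made).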

Tool Name: Get IP Access List

Description

Tool to retrieve details of a specific IP access list by its ID. Use when you need to view the configuration of allowed or blocked IP addresses and subnets for accessing the workspace or workspace-level APIs. Requires workspace admin privileges.

Action Parameters

ip_access_list_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete LLM Proxy Partner Powered Setting

Description

Tool to delete (revert to default) the partner-powered AI features workspace setting. Use when you need to revert the workspace to default configuration for AI features powered by partner providers. By default, this setting is enabled for workspaces without a compliance security profile. Strongly recommended to use etag in a read-delete pattern to prevent concurrent modification conflicts (HTTP 409).

Action Parameters

etag
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get LLM Proxy Partner Powered Setting

Description

Tool to retrieve workspace-level setting that controls whether partner-powered AI features are enabled. Use when you need to check if features like Databricks Assistant, Genie, and Data Science Agent can use models hosted by partner providers (Azure OpenAI or Anthropic). By default, this setting is enabled for non-CSP workspaces.

Action Parameters

etag
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update LLM Proxy Partner Powered Setting

Description

Tool to update workspace-level setting controlling whether AI features are powered by partner-hosted models with etag-based concurrency control. Use when you need to enable or disable partner-powered AI features (Azure OpenAI or Anthropic). When disabled, Databricks-hosted models are used. If concurrent updates occur, the request fails with 409 status requiring retry with fresh etag.

Action Parameters

allow_missing
booleanDefaults to True
field_mask
stringRequired
setting
objectRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Create Notification Destination

Description

Tool to create a notification destination for alerts and jobs. Use when you need to set up destinations for sending notifications outside of Databricks (email, Slack, PagerDuty, Microsoft Teams, or webhooks). Only workspace admins can create notification destinations. Requires HTTPS for webhooks with SSL certificates signed by a trusted certificate authority.

Action Parameters

config
objectRequired
display_name
string

Action Response

data
objectRequired
error
string
successful
booleanRequired
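Illustrative `config` payloads for two destination types. The inner key names (`slack`, `email`, and their fields) follow the shape of the Databricks notification-destinations API and are assumptions here; confirm against the tool schema. Note the HTTPS requirement above applies to webhook-style URLs:

```python
def slack_destination(display_name: str, webhook_url: str) -> dict:
    """Action Parameters for a Slack notification destination (assumed shape)."""
    return {
        "display_name": display_name,
        "config": {"slack": {"url": webhook_url}},
    }

def email_destination(display_name: str, addresses: list[str]) -> dict:
    """Action Parameters for an email notification destination (assumed shape)."""
    return {
        "display_name": display_name,
        "config": {"email": {"addresses": addresses}},
    }

slack_args = slack_destination("oncall-slack", "https://hooks.slack.com/services/T000/B000/XXX")
email_args = email_destination("data-team", ["alerts@example.com"])
```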

Tool Name: Delete Notification Destination

Description

Tool to delete a notification destination from the Databricks workspace. Use when you need to permanently remove a notification destination. Only workspace administrators have permission to perform this delete operation.

Action Parameters

id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Notification Destination

Description

Tool to retrieve details of a notification destination by its UUID identifier. Use when you need to get configuration details, display name, and type information for a specific notification destination. Only users with workspace admin permissions will see the full configuration details.

Action Parameters

id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update Notification Destination

Description

Tool to update an existing notification destination configuration. Use when you need to modify display name or configuration settings for email, Slack, PagerDuty, Microsoft Teams, or webhook destinations. Requires workspace admin permissions. At least one field (display_name or config) must be provided.

Action Parameters

config
object
display_name
string
id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete Restrict Workspace Admins Setting

Description

Tool to delete/revert the restrict workspace admins setting to its default state. Use when you need to restore default workspace administrator capabilities for service principal token creation and job ownership settings. Strongly recommended to use etag in a read-delete pattern to prevent concurrent modification conflicts (HTTP 409).

Action Parameters

etag
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Restrict Workspace Admins Setting

Description

Tool to retrieve the restrict workspace admins setting for the workspace. Use when you need to check whether workspace administrators are restricted in their ability to create service principal tokens, change job owners, or modify job run_as settings. This setting controls security boundaries for admin privileges.

Action Parameters

etag
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update Restrict Workspace Admins Setting

Description

Tool to update the restrict workspace admins setting with etag-based concurrency control. Use when you need to modify workspace administrator capabilities for service principal token creation and job ownership/run-as settings. Requires account admin permissions and workspace membership. If concurrent updates occur, the request fails with HTTP 409 requiring retry with fresh etag.

Action Parameters

allow_missing
booleanDefaults to True
field_mask
stringRequired
setting
objectRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete SQL Results Download Setting

Description

Tool to delete SQL results download workspace setting, reverting to default state where users are permitted to download results. Use when you need to restore the factory default configuration. Strongly recommended to use etag in a read-delete pattern to prevent concurrent modification conflicts (HTTP 409).

Action Parameters

etag
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get SQL Results Download Setting

Description

Tool to retrieve SQL results download workspace setting. Use when you need to check whether users within the workspace are allowed to download results from the SQL Editor and AI/BI Dashboards UIs. By default, this setting is enabled (set to true). Returns etag for use in subsequent update/delete operations.

Action Parameters

etag
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update SQL Results Download Setting

Description

Tool to update workspace SQL results download setting controlling whether users can download results from SQL Editor and AI/BI Dashboards. Use when you need to enable or disable SQL query results download capability. Requires workspace admin access and uses etag-based optimistic concurrency control to prevent conflicting updates.

Action Parameters

allow_missing
booleanDefaults to True
field_mask
stringRequired
setting
objectRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete Token via Token Management

Description

Tool to delete a token specified by ID via token management. Use when you need to revoke or remove access tokens. Admins can delete tokens for any user.

Action Parameters

token_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Token Information

Description

Tool to retrieve detailed information about a specific token by its ID from the token management system. Use when you need to get token metadata including creation time, expiry, owner, and usage information. Requires appropriate permissions to access token information.

Action Parameters

token_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Token Management Permission Levels

Description

Tool to retrieve available permission levels for personal access token management. Use when you need to understand what permission levels can be assigned for managing tokens in the workspace. Returns permission levels like CAN_USE and CAN_MANAGE with their descriptions.

Action Parameters

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Token Management Permissions

Description

Tool to retrieve permissions for workspace token management. Use when you need to check which users, groups, and service principals have permissions to create and manage personal access tokens. Requires workspace admin privileges and is available only in Databricks Premium plan.

Action Parameters

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Set Token Management Permissions

Description

Tool to set permissions for personal access token management, replacing all existing permissions. Use when configuring which users, groups, and service principals can create and use tokens. This operation replaces ALL existing permissions; if you need to add or modify permissions without replacing existing ones, use the update_permissions method instead. Workspace admins always retain CAN_MANAGE permissions.

Action Parameters

access_control_list
arrayRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update Token Management Permissions

Description

Tool to incrementally update permissions for personal access token management. Use when you need to modify who can create and use personal access tokens. This PATCH operation updates only the specified permissions, preserving existing permissions not included in the request.

Action Parameters

access_control_list
arrayRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Create Personal Access Token

Description

Tool to create a personal access token (PAT) for Databricks API authentication. Use when you need to generate a new token for REST API requests. Each PAT is valid for only one workspace. Users can create up to 600 PATs per workspace. Databricks automatically revokes PATs that haven't been used for 90 days.

Action Parameters

comment
string
lifetime_seconds
integer

Action Response

data
objectRequired
error
string
successful
booleanRequired
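`lifetime_seconds` is a raw integer; computing it from a `timedelta` keeps the intent readable. Omitting it typically yields a token with no fixed expiry, still subject to the 90-day inactivity revocation noted above:

```python
from datetime import timedelta

def pat_args(comment: str, days: int) -> dict:
    """Build Action Parameters for a PAT valid for `days` days."""
    return {
        "comment": comment,
        "lifetime_seconds": int(timedelta(days=days).total_seconds()),
    }

args = pat_args("ci-pipeline token", days=30)  # 30 days -> 2,592,000 seconds
```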

Tool Name: Get Public Workspace Setting

Description

Tool to retrieve workspace-level settings by setting ID. Use when you need to get the current value of a specific workspace setting along with its version (etag) for subsequent updates. Returns setting value with etag for optimistic concurrency control.

Action Parameters

etag
string
id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Set Workspace Configuration Status

Description

Tool to set workspace configuration settings for a Databricks workspace. Use when you need to enable/disable workspace features or update configuration values. Requires workspace admin permissions. Invalid configuration keys will cause the entire request to fail.

Action Parameters

configurations
objectRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Create Sharing Provider

Description

Tool to create a new authentication provider in Unity Catalog for Delta Sharing. Use when establishing a provider object for receiving data from external sources that aren't Unity Catalog-enabled. Requires metastore admin privileges or CREATE_PROVIDER permission on the metastore. Most recipients should not need to create provider objects manually as they are typically auto-created during Delta Sharing.

Action Parameters

authentication_type
stringRequired
comment
string
name
stringRequired
recipient_profile_str
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Sharing Provider

Description

Tool to retrieve information about a specific Delta Sharing provider in Unity Catalog. Use when you need to get provider details including authentication type, ownership, and connection information. Requires metastore admin privileges or provider ownership.

Action Parameters

name
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update Sharing Provider

Description

Tool to update an existing Delta Sharing authentication provider in Unity Catalog. Use when you need to modify provider properties like comment, owner, or name. The caller must be either a metastore admin or the owner of the provider. To rename the provider, the caller must be BOTH a metastore admin AND the owner.

Action Parameters

comment
string
name
stringRequired
new_name
string
owner
string
recipient_profile_str
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Create Sharing Recipient

Description

Tool to create a Delta Sharing recipient in Unity Catalog metastore. Use when you need to create a recipient object representing an identity who will consume shared data. Recipients can be configured for Databricks-to-Databricks sharing or open sharing with token authentication. Requires metastore admin or CREATE_RECIPIENT privilege.

Action Parameters

authentication_type
stringRequired
comment
string
data_recipient_global_metastore_id
string
expiration_time
integer
ip_access_list
object
name
stringRequired
owner
string
properties_kvpairs
object
sharing_code
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete Sharing Recipient

Description

Tool to delete a Delta Sharing recipient from Unity Catalog metastore. Use when you need to permanently remove a recipient object. Deletion invalidates all access tokens and immediately revokes access to shared data for users represented by the recipient. Requires recipient owner privileges.

Action Parameters

name
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Sharing Recipient

Description

Tool to retrieve a Delta Sharing recipient from Unity Catalog metastore by name. Use when you need to get information about a recipient object representing an entity that receives shared data. Requires recipient ownership or metastore admin privileges.

Action Parameters

name
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Create Share

Description

Tool to create a new share for data objects in Unity Catalog. Use when you need to establish a share for distributing data assets via Delta Sharing protocol. Data objects can be added after creation with update. Requires metastore admin or CREATE_SHARE privilege on the metastore.

Action Parameters

comment
string
name
stringRequired
storage_root
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete Share

Description

Tool to delete a Unity Catalog share from the metastore. Use when you need to permanently remove a share object. Deletion immediately revokes recipient access to the shared data. This operation is permanent and requires share owner privileges.

Action Parameters

name
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Share Details

Description

Tool to retrieve details of a specific share from Unity Catalog. Use when you need to get information about a share including its metadata, owner, and optionally the list of shared data objects. Requires metastore admin privileges or share ownership.

Action Parameters

include_shared_data
boolean
name
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Share Permissions

Description

Tool to retrieve permissions for a Delta Sharing share from Unity Catalog. Use when you need to check which principals have been granted privileges on a share. Requires metastore admin privileges or share ownership.

Action Parameters

max_results
integer
name
stringRequired
page_token
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update Share

Description

Tool to update an existing share in Unity Catalog with changes to metadata or data objects. Use when you need to modify share properties (comment, owner, name) or manage shared data objects (add, remove, or update tables/views/volumes). The caller must be a metastore admin or the owner of the share. For table additions, the owner must have SELECT privilege on the table.

Action Parameters

comment
string
name
stringRequired
new_name
string
owner
string
storage_root
string
updates
array

Action Response

data
objectRequired
error
string
successful
booleanRequired
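A sketch of the `updates` array for adding and removing data objects on a share. The `action` values and `data_object` fields mirror the Unity Catalog shares API shape; treat the exact field names as assumptions and verify against the tool schema:

```python
def add_table(full_name: str) -> dict:
    """An updates[] entry adding a table (assumed field names)."""
    return {
        "action": "ADD",
        "data_object": {"name": full_name, "data_object_type": "TABLE"},
    }

def remove_table(full_name: str) -> dict:
    """An updates[] entry removing a table (assumed field names)."""
    return {
        "action": "REMOVE",
        "data_object": {"name": full_name, "data_object_type": "TABLE"},
    }

args = {
    "name": "quarterly_metrics",
    "updates": [
        add_table("main.finance.revenue"),
        remove_table("main.finance.old_revenue"),
    ],
}
```

Recall from the description that whoever owns the share must hold SELECT on each table being added.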

Tool Name: Get Spark Versions

Description

Tool to retrieve all available Databricks Runtime and Spark versions for cluster creation. Use when you need to determine which Spark versions are available for creating or configuring clusters.

Action Parameters

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Create SQL Alert

Description

Tool to create a new Databricks SQL alert for query monitoring. Use when you need to set up alerts that monitor query results and trigger notifications when specified conditions are met. The alert will evaluate the query results and send notifications when the condition threshold is crossed.

Action Parameters

alert
objectRequired
auto_resolve_display_name
boolean

Action Response

data
objectRequired
error
string
successful
booleanRequired
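An illustrative `alert` object for the create tool: trigger when a query's `error_count` column exceeds a threshold. The condition structure (`op` / `operand` / `threshold`) follows the Databricks SQL alerts API shape, and both it and the `auto_resolve_display_name` semantics are assumptions here; verify against the tool schema:

```python
def build_alert(display_name: str, query_id: str, column: str, threshold: float) -> dict:
    """Build Action Parameters for the SQL alert create tool (assumed shape)."""
    return {
        "alert": {
            "display_name": display_name,
            "query_id": query_id,
            "condition": {
                "op": "GREATER_THAN",
                "operand": {"column": {"name": column}},
                "threshold": {"value": {"double_value": threshold}},
            },
        },
        # Assumed: rename on display-name collision instead of failing
        "auto_resolve_display_name": True,
    }

args = build_alert("ETL error spike", "q-123", "error_count", 100.0)
```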

Tool Name: Delete SQL Alert

Description

Tool to delete a Databricks SQL alert (soft delete to trash). Use when you need to remove an alert from active monitoring. The alert is moved to trash and can be restored through the UI within 30 days, after which it is permanently deleted.

Action Parameters

id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get SQL Alert Details

Description

Tool to retrieve details of a specific Databricks SQL alert by its UUID. Use when you need to get information about an alert including its configuration, trigger conditions, state, and notification settings.

Action Parameters

id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Create Legacy SQL Alert

Description

Tool to create a legacy SQL alert that periodically runs a query and notifies when conditions are met. Use when you need to create alerts using the legacy API endpoint. Note: This is a legacy endpoint that has been replaced by /api/2.0/sql/alerts and is deprecated.

Action Parameters

name
stringRequired
options
objectRequired
parent
string
query_id
stringRequired
rearm
integer

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete Legacy SQL Alert

Description

Tool to permanently delete a legacy SQL alert. Use when you need to permanently remove an alert using the legacy API endpoint. Note: Unlike the newer /api/2.0/sql/alerts endpoint, alerts deleted through this legacy endpoint cannot be restored from trash.

Action Parameters

id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Legacy SQL Alert

Description

Tool to retrieve details of a specific legacy SQL alert by its ID. Use when you need to get information about a legacy alert including its configuration, state, query details, and notification settings. Note: This is a legacy endpoint (/api/2.0/preview/sql/alerts) that is deprecated and being replaced by /api/2.0/sql/alerts.

Action Parameters

id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: List Legacy SQL Alerts

Description

Tool to list all legacy SQL alerts accessible to the authenticated user. Use when you need to retrieve a list of all legacy alerts in the workspace. Note: This is a legacy endpoint (/api/2.0/preview/sql/alerts) that is deprecated and being replaced by /api/2.0/sql/alerts.

Action Parameters

Action Response

data
arrayRequired
error
string
successful
booleanRequired

Tool Name: Update Legacy SQL Alert

Description

Tool to update a legacy SQL alert configuration including name, query reference, trigger conditions, and notification settings. Use when you need to modify existing alerts using the legacy API endpoint. Note: This is a legacy endpoint that has been replaced by /api/2.0/sql/alerts and is deprecated.

Action Parameters

id
stringRequired
name
string
options
objectRequired
parent
string
query_id
stringRequired
rearm
integer

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update SQL Alert

Description

Tool to update an existing Databricks SQL alert using partial update with field mask. Use when you need to modify alert properties including display name, query reference, trigger conditions, notification settings, or ownership.

Action Parameters

alert
object
auto_resolve_display_name
boolean
id
stringRequired
update_mask
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired
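Because this is a partial update, `update_mask` must name exactly the fields being changed. One way to keep the mask and the `alert` body in sync is to derive the mask from the change dict itself (the specific alert field names below are illustrative):

```python
# Derive `update_mask` from the fields actually being changed, so the
# PATCH touches only those properties.

def build_update(alert_id: str, changes: dict) -> dict:
    return {
        "id": alert_id,
        "alert": changes,
        "update_mask": ",".join(sorted(changes)),
    }

req = build_update("a-42", {"display_name": "New name", "seconds_to_retrigger": 600})
print(req["update_mask"])  # display_name,seconds_to_retrigger
```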

Tool Name: Delete SQL Dashboard

Description

Tool to delete a legacy Databricks SQL dashboard by moving it to trash (soft delete). Use when you need to remove a dashboard from active use. The dashboard is moved to trash and can be restored later through the UI. Trashed dashboards do not appear in searches and cannot be shared.

Action Parameters

id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get SQL Dashboard

Description

Tool to retrieve complete legacy dashboard definition with metadata, widgets, and queries. Use when you need to get detailed information about a SQL dashboard. Note: Legacy dashboards API deprecated as of January 12, 2026. Databricks recommends using AI/BI dashboards (Lakeview API) for new implementations.

Action Parameters

id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update SQL Dashboard

Description

Tool to update legacy Databricks SQL dashboard attributes (name, run_as_role, tags). Use when you need to modify dashboard metadata. Note: This operation only affects dashboard object attributes and does NOT add, modify, or remove widgets.

Action Parameters

id
stringRequired
name
string
run_as_role
string
tags
array

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get SQL Object Permissions

Description

Tool to retrieve the access control list for a specified SQL object (alerts, dashboards, queries, or data_sources). Use when you need to check who has access to a SQL object and their permission levels. Note: This API is deprecated; use the Workspace API for new implementations.

Action Parameters

object_id
stringRequired
object_type
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Set SQL Object Permissions

Description

Tool to set access control list for SQL objects (alerts, dashboards, queries, or data_sources). Use when you need to configure permissions for a SQL object. IMPORTANT: This operation REPLACES ALL existing permissions. To retain existing permissions, include them in the access_control_list. Note: This is a legacy/deprecated API; Databricks recommends using the Workspace API instead.

Action Parameters

access_control_list
arrayRequired
object_id
stringRequired
object_type
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired
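Since this call replaces the full ACL, a safe "grant one more principal" flow reads the current list first, merges the new entry, and sends the union. A sketch of the merge step, assuming the usual Databricks ACL entry shape (`user_name`/`group_name`/`service_principal_name` plus `permission_level`):

```python
# Merge a new entry into an existing ACL without dropping current grants.
# Entry shape mirrors the common Databricks ACL format (an assumption).

def merged_acl(existing: list, new_entry: dict) -> list:
    keys = ("user_name", "group_name", "service_principal_name")

    def ident(entry):  # identity of a principal entry
        return tuple(entry.get(k) for k in keys)

    out = [e for e in existing if ident(e) != ident(new_entry)]
    out.append(new_entry)
    return out

current = [{"user_name": "alice@example.com", "permission_level": "CAN_MANAGE"}]
acl = merged_acl(current, {"user_name": "bob@example.com", "permission_level": "CAN_VIEW"})
print(len(acl))  # 2
```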

Tool Name: Create SQL Query

Description

Tool to create a saved SQL query object in Databricks. Use when you need to create a new saved query definition that includes the target SQL warehouse, query text, name, description, tags, and parameters. Note: This creates a saved query object, not an immediate execution. Use Statement Execution API for immediate query execution.

Action Parameters

auto_resolve_display_name
boolean
query
object

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete SQL Query

Description

Tool to delete a Databricks SQL query (soft delete to trash). Use when you need to remove a query from searches and list views. The query is moved to trash and can be restored through the UI within 30 days, after which it is permanently deleted.

Action Parameters

id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get SQL Query Details

Description

Tool to retrieve detailed information about a specific SQL query by its UUID. Use when you need to get query configuration including SQL text, warehouse ID, parameters, ownership, and metadata.

Action Parameters

id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Create Legacy SQL Query

Description

Tool to create a new SQL query definition using the legacy API. Use when you need to create queries with the legacy /preview/sql/queries endpoint that uses data_source_id. Note: This is a legacy endpoint. The API has been replaced by /api/2.0/sql/queries which uses warehouse_id instead of data_source_id.

Action Parameters

data_source_id
stringRequired
description
string
is_archived
boolean
is_draft
boolean
name
string
options
object
query
stringRequired
schedule
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete Legacy SQL Query

Description

Tool to delete a legacy SQL query (soft delete to trash). Use when you need to remove a legacy query from searches and list views. The query is moved to trash and permanently deleted after 30 days. Note: This is a deprecated legacy API that will be phased out; use the non-legacy endpoint instead.

Action Parameters

id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Legacy SQL Query

Description

Tool to retrieve details of a specific legacy SQL query by its UUID. Use when you need to get information about a legacy query including its SQL text, parameters, configuration, and metadata. Note: This is a legacy endpoint (/api/2.0/preview/sql/queries) that has been replaced by /api/2.0/sql/queries and will be supported for six months to allow migration time.

Action Parameters

id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Restore SQL Query (Legacy)

Description

Tool to restore a trashed SQL query to active state. Use when you need to recover a deleted query within 30 days of deletion. Once restored, the query reappears in list views and searches and can be used for alerts again. This is a legacy/deprecated API endpoint.

Action Parameters

id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update Legacy SQL Query

Description

Tool to update an existing SQL query definition using the legacy API. Use when you need to modify queries with the legacy /preview/sql/queries endpoint. Note: This is a legacy/deprecated endpoint. The newer API uses PATCH /api/2.0/sql/queries/{id} instead.

Action Parameters

data_source_id
string
description
string
id
stringRequired
name
string
options
object
parent
string
query
string
run_as_role
string
tags
array

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update SQL Query

Description

Tool to update a saved SQL query object in Databricks using partial field updates. Use when you need to modify specific fields of an existing query without replacing the entire object. Requires update_mask parameter to specify which fields to update. Supports updating query text, configuration, parameters, and metadata.

Action Parameters

auto_resolve_display_name
boolean
id
stringRequired
query
object
update_mask
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: List SQL Query History

Description

Tool to retrieve the history of SQL queries executed against SQL warehouses and serverless compute. Use when you need to list queries by time range, status, user, or warehouse. Returns most recently started queries first (up to max_results). Supports filtering and pagination.

Action Parameters

filter_by
object
include_metrics
boolean
max_results
integer
page_token
string

Action Response

data
objectRequired
error
string
successful
booleanRequired
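A sketch of a time-bounded, per-warehouse `filter_by` payload. The top-level parameters come from the list above; the nested filter field names follow the Databricks Query History API and should be treated as assumptions.

```python
import time

# Build a query-history request for the last N hours on one warehouse.
# Nested filter field names are illustrative; verify against the API.

def history_request(warehouse_id: str, hours_back: int, max_results: int = 100) -> dict:
    now_ms = int(time.time() * 1000)
    return {
        "filter_by": {
            "warehouse_ids": [warehouse_id],
            "query_start_time_range": {
                "start_time_ms": now_ms - hours_back * 3600 * 1000,
                "end_time_ms": now_ms,
            },
        },
        "include_metrics": True,
        "max_results": max_results,
    }

req = history_request("wh-1", 24)
print(req["filter_by"]["warehouse_ids"])  # ['wh-1']
```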

Tool Name: Create SQL Query Visualization

Description

Tool to create a new visualization for a Databricks SQL query. Use when you need to add a visual representation (table, chart, counter, funnel, or pivot table) to an existing saved query. The visualization will be attached to the specified query and can be added to dashboards.

Action Parameters

visualization
objectRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Create Legacy SQL Query Visualization

Description

Tool to create a visualization in a SQL query using the legacy API. Use when you need to add a visual representation (table, chart, counter, pivot, etc.) to an existing saved query. Note: This is a deprecated endpoint; users should migrate to the current /api/2.0/sql/visualizations API. Databricks does not recommend modifying visualization settings in JSON.

Action Parameters

description
string
name
string
options
objectRequired
query_id
stringRequired
type
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete Legacy SQL Query Visualization

Description

Tool to permanently delete a legacy SQL query visualization. Use when you need to remove a visualization from a SQL query using the legacy API endpoint. Note: This is a deprecated legacy endpoint. Databricks recommends migrating to /api/2.0/sql/visualizations/{id} instead.

Action Parameters

id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update Legacy SQL Query Visualization

Description

Tool to update a visualization in a SQL query using the legacy API. Use when you need to modify visualization properties such as name, description, type, and options. Note: This is a deprecated endpoint; users should migrate to the current queryvisualizations/update method. Databricks does not recommend modifying visualization settings in JSON.

Action Parameters

description
string
id
stringRequired
name
string
options
object
type
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update SQL Query Visualization

Description

Tool to update an existing Databricks SQL query visualization using partial update with field mask. Use when you need to modify visualization properties such as display name, description, type, or query attachment.

Action Parameters

id
stringRequired
update_mask
stringRequired
visualization
object

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Redash V2 Config

Description

Tool to retrieve workspace configuration for Redash V2 in Databricks SQL. Use when you need to get Redash configuration settings for the current workspace.

Action Parameters

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Cancel SQL Statement Execution

Description

Tool to cancel an executing SQL statement on a Databricks warehouse. Use when you need to terminate a running SQL query. The response indicates successful receipt of the cancel request, but does not guarantee cancellation. Callers must poll the statement status to confirm the terminal state (CANCELED, SUCCEEDED, FAILED, or CLOSED).

Action Parameters

statement_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired
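Because cancellation is only a request, callers poll until one of the four terminal states is reached. A sketch of that loop, with a hypothetical `get_status` callable standing in for the statement-status endpoint:

```python
# Poll a statement until it reaches a terminal state after a cancel request.
# `get_status` is a stand-in for a real status call (hypothetical helper).

TERMINAL = {"CANCELED", "SUCCEEDED", "FAILED", "CLOSED"}

def wait_until_terminal(get_status, statement_id: str, attempts: int = 10) -> str:
    for _ in range(attempts):
        state = get_status(statement_id)
        if state in TERMINAL:
            return state
    raise TimeoutError(f"statement {statement_id} still running")

# Simulated status sequence standing in for real status calls:
states = iter(["RUNNING", "RUNNING", "CANCELED"])
final = wait_until_terminal(lambda _id: next(states), "stmt-1")
print(final)  # CANCELED
```

In production the loop would sleep between polls rather than spin.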

Tool Name: Delete SQL Warehouse

Description

Tool to delete a SQL warehouse from the Databricks workspace. Use when you need to permanently remove a SQL compute resource. Deleted warehouses may be restored within 14 days by contacting Databricks support.

Action Parameters

id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Edit SQL Warehouse

Description

Tool to update the configuration of an existing SQL warehouse. Use when you need to modify warehouse settings like cluster size, scaling parameters, auto-stop behavior, or enable features like Photon acceleration and serverless compute. The warehouse is identified by its ID, and you can update various properties including resource allocation and performance optimizations.

Action Parameters

auto_stop_mins
integer
channel
object
cluster_size
string
creator_name
string
enable_photon
boolean
enable_serverless_compute
boolean
id
stringRequired
instance_profile_arn
string
max_num_clusters
integer
min_num_clusters
integer
name
string
spot_instance_policy
string
tags
object
warehouse_type
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get SQL Warehouse Details

Description

Tool to retrieve detailed information about a specific SQL warehouse by its ID. Use when you need to get configuration, state, connection details, and resource allocation for a SQL warehouse. Returns comprehensive warehouse information including cluster settings, JDBC/ODBC connection strings, and health status.

Action Parameters

id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get SQL Warehouse Permission Levels

Description

Tool to retrieve available permission levels for a Databricks SQL warehouse. Use when you need to understand what permission levels can be assigned to users, groups, or service principals for a specific SQL warehouse. Returns permission levels like CAN_USE, CAN_MANAGE, IS_OWNER, CAN_VIEW, and CAN_MONITOR with their descriptions.

Action Parameters

warehouse_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get SQL Warehouse Permissions

Description

Tool to retrieve permissions for a Databricks SQL warehouse. Use when you need to check who has access to a specific SQL warehouse and their permission levels. Returns the access control list with user, group, and service principal permissions, including inherited permissions from parent objects.

Action Parameters

warehouse_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Workspace Warehouse Config

Description

Tool to retrieve workspace-level SQL warehouse configuration settings. Use when you need to check security policies, serverless compute settings, channel versions, or warehouse type restrictions that apply to all SQL warehouses in the workspace.

Action Parameters

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Set SQL Warehouse Permissions

Description

Tool to set permissions for a Databricks SQL warehouse, replacing all existing permissions. Use when you need to configure access control for a SQL warehouse. This operation is authoritative and overwrites all existing permissions. Exactly one IS_OWNER must be specified. Groups cannot have IS_OWNER permission.

Action Parameters

access_control_list
arrayRequired
warehouse_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired
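The two constraints stated above (exactly one IS_OWNER; no group owner) are easy to violate when assembling the list programmatically, so a pre-flight check is worth having. A sketch, assuming the common ACL entry shape:

```python
# Validate an access_control_list before calling Set SQL Warehouse
# Permissions: exactly one IS_OWNER, and it must not be a group.

def validate_warehouse_acl(acl: list) -> None:
    owners = [e for e in acl if e.get("permission_level") == "IS_OWNER"]
    if len(owners) != 1:
        raise ValueError("exactly one IS_OWNER entry is required")
    if "group_name" in owners[0]:
        raise ValueError("groups cannot hold IS_OWNER")

acl = [
    {"user_name": "alice@example.com", "permission_level": "IS_OWNER"},
    {"group_name": "analysts", "permission_level": "CAN_USE"},
]
validate_warehouse_acl(acl)  # passes silently
```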

Tool Name: Set Workspace Warehouse Config

Description

Tool to configure workspace-level SQL warehouse settings shared by all SQL warehouses. Use when you need to set security policies, enable serverless compute, configure channel versions, or manage warehouse type restrictions across the workspace.

Action Parameters

channel
object
data_access_config
array
enable_serverless_compute
boolean
enabled_warehouse_types
array
google_service_account
string
instance_profile_arn
string
security_policy
string
sql_configuration_parameters
object

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Start SQL Warehouse

Description

Tool to start a stopped Databricks SQL warehouse asynchronously. Use when you need to restart a stopped warehouse. The warehouse transitions through STARTING state before reaching RUNNING. Requires CAN_MONITOR permission or higher.

Action Parameters

id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update SQL Warehouse Permissions

Description

Tool to incrementally update permissions for a Databricks SQL warehouse. Use when you need to modify specific permissions without replacing the entire permission set. This PATCH operation updates only the specified permissions, preserving existing permissions not included in the request. For replacing all permissions, use SetPermissions instead.

Action Parameters

access_control_list
arrayRequired
warehouse_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Submit One-Time Run

Description

Tool to submit a one-time run without creating a job. Use when you need to execute a task directly without saving it as a job definition. After submission, use the jobs/runs/get API with the returned run_id to check the run state and monitor progress.

Action Parameters

environments
array
existing_cluster_id
string
idempotency_token
string
libraries
array
new_cluster
object
notification_settings
object
run_name
string
tasks
arrayRequired
timeout_seconds
integer
webhook_notifications
object

Action Response

data
objectRequired
error
string
successful
booleanRequired
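A minimal `tasks` payload for a one-time notebook run on an existing cluster. The task shape follows the Jobs API 2.1 (`task_key`, `notebook_task`), which is an assumption to verify; the `idempotency_token` guards against duplicate submissions when the call is retried.

```python
import uuid

# Build a one-time run request: one notebook task on an existing cluster.
# Task shape assumes Jobs API 2.1 conventions.

def one_time_run(cluster_id: str, notebook_path: str) -> dict:
    return {
        "run_name": "ad-hoc run",
        "idempotency_token": str(uuid.uuid4()),
        "tasks": [
            {
                "task_key": "main",
                "existing_cluster_id": cluster_id,
                "notebook_task": {"notebook_path": notebook_path},
            }
        ],
        "timeout_seconds": 3600,
    }

req = one_time_run("0101-123456-abc", "/Repos/team/etl")
print(req["tasks"][0]["task_key"])  # main
```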

Tool Name: Create Tag Policy

Description

Tool to create a new tag policy (governed tag) in Databricks with built-in rules for consistency and control. Use when you need to establish governed tags with restricted values and define who can assign them. Maximum of 1,000 governed tags per account. Each governed tag can have up to 50 allowed values. Requires appropriate account-level permissions.

Action Parameters

description
string
tag_key
stringRequired
values
array

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete Tag Policy

Description

Tool to delete a tag policy by its key, making the tag ungoverned. Use when you need to remove governance from a tag without deleting the tag itself. Requires MANAGE permission on the governed tag. System governed tags cannot be deleted.

Action Parameters

tag_key
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Tag Policy

Description

Tool to retrieve a specific tag policy by its associated governed tag's key. Use when you need to get details about tag governance policies including allowed values and metadata.

Action Parameters

tag_key
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update Tag Policy

Description

Tool to update an existing tag policy (governed tag) with specified fields. Use when you need to modify tag policy properties like description, tag key, or allowed values. Users must have MANAGE permission on the governed tag to edit it.

Action Parameters

description
string
tag_key
stringRequired
update_mask
stringRequired
values
array

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update Group

Description

Tool to update a Databricks group using SCIM 2.0 PATCH operations. Use when you need to modify group properties like displayName, add/remove members, or update roles.

Action Parameters

Operations
arrayRequired
id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired
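SCIM 2.0 PATCH expresses changes as an `Operations` array, so renaming a group and adding a member can go in one request. A sketch (the member id and the exact `path`/`value` shapes are illustrative; verify against the SCIM spec):

```python
# Build a SCIM 2.0 PATCH body that renames a group and adds one member
# in a single request. Path/value shapes assume standard SCIM conventions.

def group_patch(group_id: str, new_name: str, member_id: str) -> dict:
    return {
        "id": group_id,
        "Operations": [
            {"op": "replace", "path": "displayName", "value": new_name},
            {"op": "add", "path": "members", "value": [{"value": member_id}]},
        ],
    }

req = group_patch("g-7", "data-platform", "u-99")
print([op["op"] for op in req["Operations"]])  # ['replace', 'add']
```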

Tool Name: Delete Group Copy

Description

Tool to delete a group from Databricks workspace using SCIM v2 protocol. Use when you need to remove a group resource. Users in the group are not removed when the group is deleted.

Action Parameters

id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update Databricks Job By ID

Description

Tool to completely reset all settings for a Databricks job. Use when you need to overwrite all job configuration at once. Changes to timeout_seconds apply immediately to active runs; other changes apply to future runs only. Consider using the update endpoint for partial updates instead of reset to minimize disruption.

Action Parameters

job_id
integerRequired
new_settings
objectRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update User by ID (PATCH)

Description

Tool to update a Databricks user by applying SCIM 2.0 PATCH operations on specific user attributes. Use when you need to modify user properties like active status, displayName, entitlements, roles, or other user attributes.

Action Parameters

Operations
arrayRequired
id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Create Vector Search Endpoint

Description

Tool to create a new vector search endpoint to host indexes in Databricks Mosaic AI Vector Search. Use when you need to provision compute resources for hosting vector search indexes. The endpoint will be in PROVISIONING state initially and transition to ONLINE when ready.

Action Parameters

budget_policy_id
string
endpoint_type
stringRequired
name
stringRequired
usage_policy_id
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete Vector Search Index

Description

Tool to delete a vector search index from Databricks workspace. Use when you need to remove unused or obsolete vector search indexes. When an index is deleted, any associated writeback tables are automatically removed. This operation is irreversible.

Action Parameters

index_name
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Query Vector Search Index

Description

Tool to query vector search index to find similar vectors and return associated documents. Use when performing similarity search, hybrid keyword-similarity search, or full-text search on Databricks Vector Search indexes. Supports filtering, reranking, and returns configurable columns from matched documents with similarity scores. Must provide either query_vector or query_text.

Action Parameters

columns
arrayRequired
columns_to_rerank
array
debug_level
integer
filters
object
filters_json
string
index_name
stringRequired
num_results
integerRequired
query_text
string
query_type
string
query_vector
array
reranker
object
score_threshold
number

Action Response

data
objectRequired
error
string
successful
booleanRequired
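Since the tool requires either `query_vector` or `query_text`, it helps to enforce that constraint before the call. A sketch that builds a request for text-based search (the `query_type` value is illustrative; check the Vector Search API for the accepted values):

```python
# Build a vector search request, enforcing the query_text-or-query_vector
# requirement up front. query_type value is an assumption.

def vector_query(index_name, columns, query_text=None, query_vector=None,
                 num_results=10):
    if query_text is None and query_vector is None:
        raise ValueError("provide query_text or query_vector")
    req = {"index_name": index_name, "columns": columns, "num_results": num_results}
    if query_text is not None:
        req["query_text"] = query_text
        req["query_type"] = "HYBRID"  # keyword + similarity search
    else:
        req["query_vector"] = query_vector
    return req

req = vector_query("main.default.docs_idx", ["id", "text"], query_text="billing errors")
print(req["query_type"])  # HYBRID
```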

Tool Name: Upsert Data Vector Index

Description

Tool to upsert (insert or update) data into a Direct Vector Access Index. Use when you need to manually add or update vectors in a Databricks vector search index. The index must be a Direct Vector Access Index type where updates are managed via REST API or SDK calls. Data structure must match the schema defined when the index was created, including the primary key field.

Action Parameters

index_name
stringRequired
vectors
arrayRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired
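Rows upserted into a Direct Vector Access Index must match the index schema, including the primary key, so a schema check before building the request avoids partial failures. A sketch with illustrative field names:

```python
# Validate that every row carries the primary key before building the
# upsert request. Field names (id/text/embedding) are illustrative.

def upsert_request(index_name: str, rows: list, pk: str = "id") -> dict:
    for row in rows:
        if pk not in row:
            raise ValueError(f"row missing primary key '{pk}': {row}")
    return {"index_name": index_name, "vectors": rows}

rows = [{"id": "doc-1", "text": "hello", "embedding": [0.1, 0.2, 0.3]}]
req = upsert_request("main.default.docs_idx", rows)
print(len(req["vectors"]))  # 1
```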

Tool Name: Create Workspace Git Credentials

Description

Tool to create Git credentials for authenticating with remote Git repositories in Databricks. Use when you need to set up Git integration for version control operations. Only one credential per user is supported; attempting to create a second when one already exists will fail.

Action Parameters

git_email
string
git_provider
stringRequired
git_username
string
is_default_for_provider
boolean
name
string
personal_access_token
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete Workspace Git Credentials

Description

Tool to delete Git credentials for remote repository authentication in Databricks. Use when you need to remove a Git credential entry from the workspace. Only one Git credential per user is supported in Databricks, making this useful for credential lifecycle management when credentials need to be revoked or replaced.

Action Parameters

credential_id
integerRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Workspace Git Credentials

Description

Tool to retrieve Git credentials for authenticating with remote Git repositories in Databricks. Use when you need to get details of existing Git integration credentials by credential ID.

Action Parameters

credential_id
integerRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update Workspace Git Credentials

Description

Tool to update existing Git credentials for authenticating with remote Git repositories in Databricks. Use when you need to modify Git provider credentials, email, username, or access tokens. Note that the git_provider field cannot be changed after initial creation. For azureDevOpsServicesAad provider, do not specify personal_access_token or git_username.

Action Parameters

credential_id
integerRequired
git_email
string
git_provider
stringRequired
git_username
string
is_default_for_provider
boolean
name
string
personal_access_token
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: List Workspace Directory

Description

Tool to list the contents of a directory in Databricks workspace. Use when you need to view notebooks, files, directories, libraries, or repos at a specific path. Returns object information including paths, types, and metadata. Use object_id for setting permissions via the Permissions API.

Action Parameters

notebooks_modified_after
integer
path
stringRequired
recursive
boolean

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Create Workspace Repo

Description

Tool to create and optionally checkout a Databricks Repo linking a Git repository to the workspace. Use when you need to connect a Git repository to Databricks for collaborative development. Can optionally specify branch or tag to checkout after creation and configure sparse checkout for performance.

Action Parameters

branch
string
path
string
provider
string
sparse_checkout
object
tag
string
url
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete Workspace Repo

Description

Tool to delete a Git repository from Databricks workspace. Use when you need to permanently remove a repository. The repository cannot be recovered after deletion completes successfully.

Action Parameters

repo_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Workspace Repo Permission Levels

Description

Tool to retrieve available permission levels for a Databricks workspace repository. Use when you need to understand what permission levels can be assigned to users, groups, or service principals for a specific Git repository. Returns permission levels like CAN_READ, CAN_RUN, CAN_EDIT, and CAN_MANAGE with their descriptions.

Action Parameters

repo_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Set Workspace Repo Permissions

Description

Tool to set permissions for a workspace repository, replacing all existing permissions. Use when you need to configure access control for a workspace repo. This operation replaces ALL existing permissions; admin users cannot have their permissions lowered. Repos can inherit permissions from their root object.

Action Parameters

access_control_list
array
repo_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update Workspace Repo

Description

Tool to update a workspace repo to a different branch or tag. Use when you need to switch branches, pull latest changes, or update sparse checkout settings. When updating to a tag, the repo enters a detached HEAD state and must be switched back to a branch before committing.

Action Parameters

branch
string
id
integerRequired
sparse_checkout
object
tag
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Update Workspace Repo Permissions

Description

Tool to incrementally update permissions on a Databricks workspace repository. Use when you need to modify specific permissions for users, groups, or service principals without replacing the entire permission set. This PATCH operation only modifies the permissions specified while leaving other existing permissions intact. Repos can inherit permissions from their root object.

Action Parameters

access_control_list
arrayRequired
repo_id
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Create Secret Scope

Description

Tool to create a new secret scope in Databricks workspace. Use when you need to establish a secure location to store credentials and sensitive information. Scope names must be unique, case-insensitive, and cannot exceed 128 characters. By default, the scope is Databricks-backed with MANAGE permission for the creator.

Action Parameters

backend_azure_keyvault
object
initial_manage_principal
string
scope
stringRequired
scope_backend_type
string

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete Secrets ACL

Description

Tool to delete an access control list from a Databricks secret scope. Use when you need to revoke permissions for a principal on a secret scope. Requires MANAGE permission on the scope. Fails if the ACL does not exist.

Action Parameters

principal
stringRequired
scope
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete Secret Scope

Description

Tool to delete a secret scope and all associated secrets and ACLs. Use when you need to permanently remove a secret scope. This operation cannot be undone. The API throws errors if the scope does not exist or the user lacks authorization.

Action Parameters

scope
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Delete Workspace Secret

Description

Tool to delete a secret from a Databricks secret scope. Use when you need to remove a secret stored in a scope. Requires WRITE or MANAGE permission on the scope. Not supported for Azure KeyVault-backed scopes.

Action Parameters

key
stringRequired
scope
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Get Secrets ACL

Description

Tool to retrieve ACL details for a principal on a Databricks secret scope. Use when you need to check the permission level granted to a specific user, service principal, or group. Requires MANAGE permission on the scope. Permission levels are hierarchical: WRITE includes READ, and MANAGE includes both WRITE and READ.

Action Parameters

principal
stringRequired
scope
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired
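The hierarchy described above (MANAGE ⊃ WRITE ⊃ READ) can be expressed as a simple client-side check, useful when deciding whether a returned ACL already covers an operation you plan to perform:

```python
# Numeric ranks encode the hierarchy: a higher level implies all lower ones.
_ACL_LEVELS = {"READ": 1, "WRITE": 2, "MANAGE": 3}

def permission_implies(granted: str, required: str) -> bool:
    """Return True if the granted ACL level covers the required one."""
    return _ACL_LEVELS[granted] >= _ACL_LEVELS[required]
```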

Tool Name: Get Secret Value

Description

Tool to get a secret value from a Databricks secret scope. Use when you need to retrieve the actual value of a secret stored in a scope. Important: This API can only be called from the DBUtils interface (from within a cluster/notebook). There is no API to read the actual secret value outside of a cluster. Requires READ permission on the scope.

Action Parameters

key
stringRequired
scope
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Put Secrets ACL

Description

Tool to create or overwrite the access control list for a principal on a Databricks secret scope. Use when you need to grant or modify permissions for a user, group, or service principal on a secret scope. Requires MANAGE permission on the scope. Overwrites any existing permission level for the principal.

Action Parameters

permission
stringRequired
principal
stringRequired
scope
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Put Secret in Scope

Description

Tool to insert or update a secret in a Databricks secret scope. Use when you need to store sensitive information like passwords, API keys, or credentials. Overwrites existing secrets with the same key. Requires WRITE or MANAGE permission on the scope. Each scope holds at most 1,000 secrets, with a 128 KB limit per secret.

Action Parameters

bytes_value
string
key
stringRequired
scope
stringRequired
string_value
string

Action Response

data
objectRequired
error
string
successful
booleanRequired
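Exactly one of `string_value` or `bytes_value` should be supplied; the underlying Databricks Secrets API expects binary payloads to be base64-encoded. A sketch that builds the arguments and enforces the 128 KB limit noted above (the helper name is illustrative):

```python
import base64

MAX_SECRET_BYTES = 128 * 1024  # 128 KB per secret, per the description above

def build_put_secret_args(scope: str, key: str, value) -> dict:
    """Build arguments for Put Secret in Scope.

    Strings go in string_value; bytes are base64-encoded into
    bytes_value. Raises ValueError if the payload exceeds 128 KB.
    """
    raw = value.encode() if isinstance(value, str) else value
    if len(raw) > MAX_SECRET_BYTES:
        raise ValueError("secret exceeds the 128 KB limit")
    args = {"scope": scope, "key": key}
    if isinstance(value, str):
        args["string_value"] = value
    else:
        args["bytes_value"] = base64.b64encode(raw).decode()
    return args
```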

Tool Name: Delete Workspace Object

Description

Tool to permanently delete a workspace object or directory. Use when you need to remove notebooks, files, or directories from the workspace. This is a hard delete operation that cannot be undone. Recursive deletion of non-empty directories is not atomic and may partially complete if it fails.

Action Parameters

path
stringRequired
recursive
boolean

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Export Workspace Object

Description

Tool to export a workspace object (notebook, dashboard, or file) as file content or base64-encoded string. Use when you need to retrieve the content of workspace objects for backup, migration, or analysis. By default, returns base64-encoded content with file type information. Set direct_download=true to get raw file content directly.

Action Parameters

direct_download
boolean
format
string
path
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired
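When `direct_download` is false, the exported object arrives base64-encoded. A minimal sketch of recovering the raw bytes, assuming the response's `data` object carries a `content` field as in the Databricks workspace export API:

```python
import base64

def decode_export(response_data: dict) -> bytes:
    """Decode the base64-encoded 'content' field of an export response.

    The field name follows the Databricks workspace export API; verify
    it against the actual tool response shape.
    """
    return base64.b64decode(response_data["content"])
```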

Tool Name: Get Workspace Object Status

Description

Tool to retrieve status and metadata for any workspace object including notebooks, directories, dashboards, and files. Use when you need to get object type, path, identifier, and additional metadata fields. Returns error with code RESOURCE_DOES_NOT_EXIST if the specified path does not exist.

Action Parameters

path
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired

Tool Name: Import Workspace Object

Description

Tool to import a notebook or file into the Databricks workspace. Use when you need to import base64-encoded content as notebooks, files, or directories. The content must be base64-encoded and can be imported in various formats including SOURCE, HTML, JUPYTER, DBC, and R_MARKDOWN. Maximum content size is 10 MB.

Action Parameters

content
string
format
string
language
string
overwrite
boolean
path
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired
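Since `content` must be base64-encoded and capped at 10 MB, the encoding step is easy to get wrong. A hedged sketch of preparing the arguments for a SOURCE-format import (the helper and its defaults are illustrative, not part of the tool):

```python
import base64

MAX_IMPORT_BYTES = 10 * 1024 * 1024  # 10 MB content limit noted above

def build_import_args(path: str, source: str,
                      language: str = "PYTHON",
                      overwrite: bool = False) -> dict:
    """Build arguments for Import Workspace Object in SOURCE format.

    Encodes the notebook source as base64 and rejects payloads over
    the 10 MB limit before any API call is made.
    """
    raw = source.encode()
    if len(raw) > MAX_IMPORT_BYTES:
        raise ValueError("content exceeds the 10 MB import limit")
    return {
        "path": path,
        "format": "SOURCE",
        "language": language,
        "overwrite": overwrite,
        "content": base64.b64encode(raw).decode(),
    }
```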

Tool Name: Create Workspace Directory

Description

Tool to create a directory and any necessary parent directories in the workspace. Use when you need to create new directories. The operation is idempotent: if the directory already exists, the command succeeds without action. Returns error RESOURCE_ALREADY_EXISTS if a file (not a directory) exists at any prefix of the path.

Action Parameters

path
stringRequired

Action Response

data
objectRequired
error
string
successful
booleanRequired