Use external MCP servers in agents

Important

This feature is in Public Preview.

After you install an external MCP server in your workspace, connect to it from your agent code through the Azure Databricks-managed proxy URL. The proxy makes external servers behave like managed MCP servers by handling authentication and token management for you.

To install an external MCP server, see Install an external MCP server.

Connect to an external MCP server

The proxy URL pattern is https://<workspace-hostname>/api/2.0/mcp/external/<connection-name>, where <connection-name> is the Unity Catalog connection you registered for the external service.
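Constructing the proxy URL is plain string interpolation over the workspace hostname and connection name. In this sketch, both values are hypothetical placeholders; substitute your own workspace hostname and the name of your Unity Catalog connection:

```python
# Sketch only: the hostname and connection name below are hypothetical placeholders.
workspace_hostname = "adb-1234567890123456.7.azuredatabricks.net"
connection_name = "my_slack_connection"

# Proxy URL pattern for an external MCP server:
proxy_url = f"https://{workspace_hostname}/api/2.0/mcp/external/{connection_name}"
print(proxy_url)
# https://adb-1234567890123456.7.azuredatabricks.net/api/2.0/mcp/external/my_slack_connection
```

The examples below derive the host from the `WorkspaceClient` configuration instead of hardcoding it, which keeps the code portable across workspaces.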

Choose the example that matches your agent framework and deployment target:

OpenAI Agents SDK (Apps)

from agents import Agent, Runner
from databricks.sdk import WorkspaceClient
from databricks_openai.agents import McpServer

workspace_client = WorkspaceClient()
host = workspace_client.config.host

async with McpServer(
    url=f"{host}/api/2.0/mcp/external/<connection-name>",
    name="external-service",
    workspace_client=workspace_client,
) as external_server:
    agent = Agent(
        name="Connected agent",
        instructions="You are a helpful assistant with access to external services.",
        model="databricks-claude-sonnet-4-5",
        mcp_servers=[external_server],
    )
    result = await Runner.run(agent, "Send a Slack message to the team about the deployment")
    print(result.final_output)

Grant the app access to the Unity Catalog connection in databricks.yml:

resources:
  apps:
    my_agent_app:
      resources:
        - name: 'my_connection'
          uc_securable:
            securable_full_name: '<connection-name>'
            securable_type: 'CONNECTION'
            permission: 'USE_CONNECTION'

LangGraph (Apps)

from databricks.sdk import WorkspaceClient
from databricks_langchain import ChatDatabricks, DatabricksMCPServer, DatabricksMultiServerMCPClient
from langgraph.prebuilt import create_react_agent

workspace_client = WorkspaceClient()
host = workspace_client.config.host

mcp_client = DatabricksMultiServerMCPClient([
    DatabricksMCPServer(
        name="external-service",
        url=f"{host}/api/2.0/mcp/external/<connection-name>",
        workspace_client=workspace_client,
    ),
])

async with mcp_client:
    tools = await mcp_client.get_tools()
    agent = create_react_agent(
        ChatDatabricks(endpoint="databricks-claude-sonnet-4-5"),
        tools=tools,
    )
    result = await agent.ainvoke(
        {"messages": [{"role": "user", "content": "Send a Slack message to the team about the deployment"}]}
    )
    print(result["messages"][-1].content)

Grant the app access to the Unity Catalog connection in databricks.yml:

resources:
  apps:
    my_agent_app:
      resources:
        - name: 'my_connection'
          uc_securable:
            securable_full_name: '<connection-name>'
            securable_type: 'CONNECTION'
            permission: 'USE_CONNECTION'

Model Serving

from databricks.sdk import WorkspaceClient
from databricks_mcp import DatabricksMCPClient
import mlflow

workspace_client = WorkspaceClient()
host = workspace_client.config.host

mcp_client = DatabricksMCPClient(
    server_url=f"{host}/api/2.0/mcp/external/<connection-name>",
    workspace_client=workspace_client,
)

tools = mcp_client.list_tools()

mlflow.pyfunc.log_model(
    "agent",
    python_model=my_agent,  # your agent implementation
    resources=mcp_client.get_databricks_resources(),
)

To deploy the agent, see Deploy an agent for generative AI applications (Model Serving). For details on logging agents with MCP resources, see Use Databricks managed MCP servers.

Lower-level: use the MCP SDK directly

If you're not using a Databricks-provided framework helper, call the proxy URL with the standard MCP SDK. This pattern is framework-agnostic — use it for custom orchestration or when you need direct control over the session.

%pip install -U databricks-sdk databricks-mcp tabulate databricks-ai-bridge
%restart_python

import json
from databricks.sdk import WorkspaceClient
from databricks_mcp import DatabricksOAuthClientProvider
from mcp.client.streamable_http import streamablehttp_client as connect
from mcp import ClientSession


async def main():
    app_url = "https://<workspace-hostname>/api/2.0/mcp/external/<connection-name>"
    client = WorkspaceClient()

    async with connect(app_url, auth=DatabricksOAuthClientProvider(client)) as (
        read_stream,
        write_stream,
        _,
    ):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()
            tools = await session.list_tools()
            response = await session.call_tool(name="<tool-name>", arguments={...})
            print(response.content[0].text)

# Top-level await is supported in Databricks notebooks; in a standalone
# script, run with asyncio.run(main()) instead.
await main()

For a synchronous alternative, use DatabricksMCPClient directly:

from databricks.sdk import WorkspaceClient
from databricks_mcp import DatabricksMCPClient

workspace_client = WorkspaceClient()
host = workspace_client.config.host

mcp_client = DatabricksMCPClient(
    server_url=f"{host}/api/2.0/mcp/external/<connection-name>",
    workspace_client=workspace_client,
)

tools = mcp_client.list_tools()
response = mcp_client.call_tool("<tool-name>", {...})
print(response.content[0].text)
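For custom orchestration, you can adapt MCP tools into plain Python callables with a thin wrapper over the synchronous client. This sketch assumes only the `call_tool(name, arguments)` signature used in the example above; `make_tool_fn` is a hypothetical helper, not part of `databricks_mcp`:

```python
# Hypothetical helper: adapt an MCP tool into a plain Python callable.
# Assumes only the call_tool(name, arguments) API shown above.
def make_tool_fn(mcp_client, tool_name):
    def tool_fn(**kwargs):
        response = mcp_client.call_tool(tool_name, kwargs)
        # Return the first text block of the tool response.
        return response.content[0].text
    return tool_fn

# Usage (with the mcp_client from the previous snippet):
# send_message = make_tool_fn(mcp_client, "send_message")
# print(send_message(channel="#deployments", text="Deploy complete"))
```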

Example notebooks: Build an agent with Databricks MCP servers

The following notebooks show how to author LangGraph and OpenAI agents that call MCP tools, including external MCP servers accessed through Databricks proxy endpoints.

LangGraph MCP tool-calling agent

Get notebook

OpenAI MCP tool-calling agent

Get notebook

Next steps