Fabric MCP Server (local) overview

Fabric MCP Server (local) is an open-source implementation of the Model Context Protocol (MCP) that runs on your development machine. It gives AI agents access to Fabric API documentation, OneLake data operations, and item creation tools — designed for building and extending Fabric solutions.

Note

This article provides an overview. For complete documentation, see the GitHub repository.

What is Fabric MCP Server (local)?

Fabric MCP Server runs as a local subprocess on your development machine, providing AI agents with Fabric context and OneLake data access. It's designed for development workflows where you need:

  • Local execution — Runs on your machine with no cloud dependency for documentation tools
  • API documentation — Offline access to Fabric API specs, schemas, and best practices
  • OneLake access — Read and write files, manage directories, and query tables in OneLake
  • Extensibility — Add custom tools and workflows for your needs

Key features

  • Local subprocess architecture — Starts and stops with your AI agent session
  • Three tool categories — API documentation, OneLake data operations, and core Fabric operations
  • Offline capable — Documentation and best practice tools work without a Fabric connection
  • Open source — Extend and customize on GitHub
  • Multiple install methods — VS Code extension (recommended), npm/npx, or .NET source build
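As a sketch of the npm/npx route, an MCP client configuration entry like the following (for example, in VS Code's `mcp.json`) would launch the server as a local stdio subprocess. The server name and the package placeholder are illustrative; check the GitHub repository for the actual package name and install instructions.

```json
{
  "servers": {
    "fabric-mcp": {
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "<fabric-mcp-package>"]
    }
  }
}
```

With an entry like this, the client starts the subprocess when the session begins and tears it down when the session ends, matching the lifecycle described below.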

When to use the local server

Choose Fabric MCP Server (local) when you need to:

  • Explore Fabric APIs — Browse API specs, item schemas, and best practices offline
  • Build Fabric integrations — Generate code from OpenAPI specs with AI assistance
  • Work with OneLake data — List, download, upload, and manage files in OneLake
  • Create Fabric items — Scaffold lakehouses, notebooks, and other items
  • Develop custom tooling — Extend the server with tools specific to your workflow

Architecture

Diagram showing the local Fabric MCP Server architecture with three tool categories: API Documentation and Best Practices (offline), OneLake data operations, and Fabric REST APIs, plus local file system access.

The server:

  1. Runs as a subprocess started by your AI agent.
  2. Serves API documentation and best practices from bundled resources (offline).
  3. Authenticates using locally configured credentials for OneLake and Fabric API calls.
  4. Returns results through the MCP protocol.
  5. Stops when the AI agent session ends.
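The request/response flow in the steps above is plain JSON-RPC 2.0 over the subprocess's stdin/stdout. As a minimal sketch, assuming only the MCP wire format (the tool name `list_onelake_files` below is hypothetical; real names come from the server's `tools/list` reply), here are helpers that build the messages an agent would send:

```python
import itertools
import json

# Monotonically increasing JSON-RPC request IDs for one session.
_ids = itertools.count(1)

def jsonrpc_request(method: str, params: dict) -> dict:
    """Build a JSON-RPC 2.0 request as used by the MCP stdio transport."""
    return {"jsonrpc": "2.0", "id": next(_ids), "method": method, "params": params}

def initialize_request() -> dict:
    # First message of an MCP session: the client announces its protocol
    # version and capabilities before making any tool calls.
    return jsonrpc_request("initialize", {
        "protocolVersion": "2024-11-05",
        "capabilities": {},
        "clientInfo": {"name": "example-agent", "version": "0.1"},
    })

def tool_call_request(tool: str, arguments: dict) -> dict:
    # "tools/call" invokes a named server tool with JSON arguments; the
    # tool name used below is a hypothetical example, not a documented one.
    return jsonrpc_request("tools/call", {"name": tool, "arguments": arguments})

# Each message goes to the subprocess's stdin as one JSON object per line.
msg = tool_call_request("list_onelake_files", {"path": "/lakehouse/Files"})
line = json.dumps(msg) + "\n"
```

The agent reads the matching response (same `id`) from the subprocess's stdout; in practice an MCP SDK handles this framing for you.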

Getting started