Fabric MCP Server (local) is an open-source implementation of the Model Context Protocol (MCP) that runs on your development machine. It gives AI agents access to Fabric API documentation, OneLake data operations, and item creation tools — designed for building and extending Fabric solutions.
Note
This article provides an overview. For complete documentation, see the GitHub repository.
What is Fabric MCP Server (local)?
Fabric MCP Server runs as a local subprocess on your development machine, providing AI agents with Fabric context and OneLake data access. It's designed for development workflows where you need:
- Local execution — Runs on your machine with no cloud dependency for documentation tools
- API documentation — Offline access to Fabric API specs, schemas, and best practices
- OneLake access — Read and write files, manage directories, and query tables in OneLake
- Extensibility — Add custom tools and workflows for your needs
Key features
- Local subprocess architecture — Starts and stops with your AI agent session
- Three tool categories — API documentation, OneLake data operations, and core Fabric operations
- Offline-capable — Documentation and best practice tools work without a Fabric connection
- Open source — Extend and customize on GitHub
- Multiple install methods — VS Code extension (recommended), npm/npx, or .NET source build
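For the npm/npx route, an MCP host such as VS Code launches the server from a workspace config file. A minimal sketch of a `.vscode/mcp.json` entry, assuming a hypothetical package name (check the GitHub repository for the actual package and arguments):

```json
{
  "servers": {
    "fabric-local": {
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "@microsoft/fabric-mcp"]
    }
  }
}
```

With this in place, the host starts the server as a subprocess and communicates with it over stdin/stdout for the duration of the session.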
When to use the local server
Choose Fabric MCP Server (local) when you need to:
- Explore Fabric APIs — Browse API specs, item schemas, and best practices offline
- Build Fabric integrations — Generate code from OpenAPI specs with AI assistance
- Work with OneLake data — List, download, upload, and manage files in OneLake
- Create Fabric items — Scaffold lakehouses, notebooks, and other items
- Develop custom tooling — Extend the server with tools specific to your workflow
Architecture
The server:
- Runs as a subprocess started by your AI agent.
- Serves API documentation and best practices from bundled resources (offline).
- Authenticates using locally configured credentials for OneLake and Fabric API calls.
- Returns results through the MCP protocol.
- Stops when the AI agent session ends.
Getting started
Related content
- Fabric MCP Servers overview — Compare Core and local servers
- Get started with Fabric MCP Server (local) — Installation and setup
- Tools reference — Complete list of available tools
- Model Context Protocol specification — MCP standard
- Microsoft Fabric REST API — Fabric APIs