In our last article, we covered how to use the MCP plugin to let Dify act as a client, tapping into external MCP server tools like Zapier’s 7,000+ integrations. But thanks to Dify’s modular design and flexible plugin system, Dify isn’t limited to being a client. It can also easily function as an MCP Server, allowing you to share the AI applications you’ve built with other compatible systems for a wider range of uses.
MCP Server Plugin: Connecting Dify to MCP Clients
The mcp-server plugin, contributed by the Dify community, is an Extension-type plugin. Once installed, it lets you turn any Dify app into an MCP-compliant server endpoint that external MCP clients can directly access. Here’s what it does:
Exposes Dify as an MCP tool: Turns your Dify app into a single MCP tool that clients like Cursor, Claude Desktop, Cline, Windsurf, or even other Dify instances can call.
Leverages Dify Endpoints: Creating an app endpoint gives you a unique URL that MCP clients use to connect.
Runs MCP services: The plugin automatically starts an HTTP service in Dify’s plugin environment, handling requests from MCP clients via HTTP and SSE protocols. This covers everything from protocol handshakes to capability discovery and tool execution.
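To make the handshake step concrete, here is a minimal sketch of the JSON-RPC 2.0 `initialize` request an MCP client sends when it first connects to an endpoint like this one. The exact field values (protocol version string, client info) are illustrative assumptions; consult the MCP specification for the authoritative schema.

```python
import json

def build_initialize_request(client_name: str, request_id: int = 1) -> str:
    """Build the JSON-RPC 2.0 "initialize" message an MCP client sends
    to open the protocol handshake with a server endpoint."""
    request = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            # Protocol version and clientInfo values here are examples.
            "protocolVersion": "2024-11-05",
            "capabilities": {},
            "clientInfo": {"name": client_name, "version": "1.0.0"},
        },
    }
    return json.dumps(request)

print(build_initialize_request("cursor"))
```

After this handshake, the client moves on to capability discovery and tool calls over the same HTTP/SSE channel.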
How to Set Up a Dify App as an MCP Server
Install the Plugin
Head to the Dify Marketplace, then download and install the mcp-server plugin.

Pick Your Dify App
Let’s use the “Deep Research” app as an example. This app takes a user’s question, runs multiple rounds of searches using the Tavily plugin (with the number of searches set by a depth parameter), and then uses an LLM to compile the results into a research report.

Configure the MCP Server Endpoint
In the mcp-server plugin’s settings, fill out:
Endpoint Name: Name your endpoint.
App: Select the Dify app you want to publish as an MCP server.
App Type: Choose whether it’s a Chat or Workflow app.
App Input Schema: Define the app’s input parameters in JSON to help external systems understand how to interact with it.

Here’s a sample JSON for the deep_research app:
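A schema along these lines would fit the deep_research app described above (the parameter names `query` and `depth` match that app’s inputs; adjust them to whatever inputs your own app actually defines):

```json
{
  "type": "object",
  "description": "Runs multiple rounds of web search on a question and compiles the results into a research report.",
  "properties": {
    "query": {
      "type": "string",
      "description": "The research question to investigate"
    },
    "depth": {
      "type": "integer",
      "description": "Number of search rounds to run"
    }
  },
  "required": ["query"]
}
```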
Schema breakdown:
properties: Lists all parameters the app accepts and their types.
description: Explains the app’s main function to MCP clients, helping them discover and use it.
required: Specifies must-have parameters. For chat-based apps (Agent/Chatflow), query is typically required.
Get Your Endpoint URL
Once you save the config, the plugin generates a unique Endpoint URL (your MCP server address). This URL supports HTTP and SSE protocols, making it easy for MCP clients to connect and interact.

Add Dify MCP Server to Cursor
Now you can add your Dify MCP server to any MCP-compatible client. For example, in Cursor IDE, update the MCP server settings with something like this (swap in your actual Endpoint URL):
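A minimal `mcp.json` entry might look like the following (the server name and URL below are placeholders; paste the Endpoint URL generated by the plugin):

```json
{
  "mcpServers": {
    "dify-deep-research": {
      "url": "https://your-dify-host/e/your-endpoint-id/sse"
    }
  }
}
```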

Once set up, you’re good to go. The screenshot below shows how Cursor can use the Deep Research tool in Agent mode to run multi-step research on Dify plugin types, pulling in deeper insights to boost coding efficiency and quality.

More Ways to Use It
Beyond dev tools, the Dify MCP Server is great for embedding AI into internal workflows. Think tasks like auto-classifying customer requests, summarizing reports, or extracting key info from documents, all built in Dify workflows and shared as MCP services via the plugin.
Unlike REST APIs, MCP is tailor-made for AI scenarios, making it easier for AI agents to discover and dynamically call tools. Agents can figure out how to use Dify services on their own, no hardcoding or manual setup required, which keeps things flexible and efficient.
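To illustrate that dynamic discovery, here is a hedged sketch of how an agent might read a `tools/list` response from an MCP server and learn a tool’s parameters at runtime. The response body is a hand-written example shaped like the deep_research schema above, not output captured from a real server.

```python
def pick_tool(tools_list_response: dict, name: str) -> dict:
    """Find a tool by name in a JSON-RPC "tools/list" result."""
    tools = tools_list_response["result"]["tools"]
    return next(t for t in tools if t["name"] == name)

# Illustrative response an MCP server could return for "tools/list".
sample_response = {
    "jsonrpc": "2.0",
    "id": 2,
    "result": {
        "tools": [
            {
                "name": "deep_research",
                "description": "Multi-round web research compiled into a report.",
                "inputSchema": {
                    "type": "object",
                    "properties": {
                        "query": {"type": "string"},
                        "depth": {"type": "integer"},
                    },
                    "required": ["query"],
                },
            }
        ]
    },
}

tool = pick_tool(sample_response, "deep_research")
# The agent reads required parameters from the schema instead of
# relying on anything hardcoded at build time.
print(tool["inputSchema"]["required"])
```

This is the key contrast with a REST API: the client never needs out-of-band documentation, because the tool’s name, purpose, and input schema all arrive over the protocol itself.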
Heads-up: For security, we recommend running the MCP Server plugin only in private network environments.
Wrapping Up
With the community-built MCP Server plugin, you can easily turn Dify apps into MCP-compliant services for external systems to tap into, boosting reuse and integration. We’re also working on native MCP support for Dify. Future updates will let you connect to external MCP servers and publish Dify apps as MCP servers with a single click, making it even easier to weave AI into all kinds of scenarios. Stay tuned!