AI applications are quickly moving beyond simple conversation. To act effectively, an agent must reach external data, APIs, calendars, and code bases. Until now that meant writing large amounts of custom glue code, which was costly to build and hard to scale.
The Model Context Protocol (MCP) standardizes how AI agents discover and use external tools and data sources. Previously, Dify could invoke MCP tools only through a plugin. In the latest release, MCP support is built in and works in both directions: your apps can call external MCP servers as tools, and any Dify agent or workflow can itself be published as an MCP server.
The result is faster, more reliable integration and easier feature expansion.
Three Ways to Use MCP in Dify
Configure an MCP Server as a Tool
On the Tools page, select MCP and add a server such as Linear or Notion (native MCP apps) or Zapier or Composio (integration platforms). One Zapier configuration unlocks more than 8,000 authorized apps.
Note: Dify supports HTTP-based MCP servers only, using protocol version 2025-03-26.
Linear setup:
Tools > MCP > Add MCP Server
Enter the Linear MCP URL, a display name, and a server identifier.

Complete authorization. You now have 22 Linear tools for creating, updating, and querying projects, issues, comments, documents, teams, and users.
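
Under the hood, adding a server like this comes down to speaking MCP's JSON-RPC protocol over HTTP. The sketch below shows roughly what that exchange looks like; the URL is a placeholder, the handshake is simplified, and it assumes the server answers with plain JSON rather than a streamed response (real servers such as Linear also require the OAuth step Dify walks you through).

```python
import requests

MCP_URL = "https://example.com/mcp"  # placeholder; use the server URL you register in Dify

def rpc(method: str, params: dict | None = None, request_id: int | None = 1,
        session_id: str | None = None) -> requests.Response:
    """Send one JSON-RPC 2.0 message to an HTTP-based MCP server."""
    headers = {"Accept": "application/json, text/event-stream"}
    if session_id:
        headers["Mcp-Session-Id"] = session_id
    payload: dict = {"jsonrpc": "2.0", "method": method}
    if params is not None:
        payload["params"] = params
    if request_id is not None:
        payload["id"] = request_id  # notifications carry no id
    return requests.post(MCP_URL, json=payload, headers=headers, timeout=30)

# 1. Initialize the session (protocol version 2025-03-26, as noted above).
init = rpc("initialize", {
    "protocolVersion": "2025-03-26",
    "capabilities": {},
    "clientInfo": {"name": "demo-client", "version": "0.1"},
})
session = init.headers.get("Mcp-Session-Id")  # some servers issue a session id

# 2. Acknowledge initialization (a notification, so no id).
rpc("notifications/initialized", request_id=None, session_id=session)

# 3. Discover the tools the server exposes -- roughly what Dify does
#    when you add a server on the Tools page.
tools = rpc("tools/list", request_id=2, session_id=session).json()
for tool in tools["result"]["tools"]:
    print(tool["name"], "-", tool.get("description", ""))
```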

Let an Agent Call MCP Tools Intelligently
Define the agent’s role in the prompt and attach the Linear server:
“You are an agent connected to Linear with 22 API tools. Use them as needed to manage issues, projects, and documents, and to query team, user, and cycle information.”

When a user asks to create an issue for the R&D team, the agent selects get_team, get_user, and create_issue, then creates and assigns the task automatically.
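
Behind the scenes, each selection becomes a tools/call request against the Linear MCP server. Here is a rough sketch of that chain; the endpoint is a placeholder, session setup and OAuth are omitted (Dify handles them), the argument shapes are assumptions rather than Linear's documented schemas, and in practice the LLM inspects each result before choosing the next call.

```python
import requests

LINEAR_MCP_URL = "https://example.com/linear-mcp"  # placeholder for the Linear MCP endpoint

def call_tool(name: str, arguments: dict, request_id: int = 1) -> dict:
    """Invoke one MCP tool with a JSON-RPC `tools/call` request.
    Session setup and authorization are omitted; Dify performs them for you."""
    resp = requests.post(
        LINEAR_MCP_URL,
        json={"jsonrpc": "2.0", "id": request_id, "method": "tools/call",
              "params": {"name": name, "arguments": arguments}},
        headers={"Accept": "application/json, text/event-stream"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["result"]

# The chain the agent assembles for "create an issue for the R&D team".
# Argument names below are illustrative assumptions, not Linear's schemas.
team = call_tool("get_team", {"query": "R&D"}, request_id=1)
user = call_tool("get_user", {"query": "alice"}, request_id=2)   # hypothetical assignee
issue = call_tool("create_issue", {
    "title": "Investigate login timeout reported by QA",         # hypothetical issue
    "team": "R&D",
    "assignee": "alice",
}, request_id=3)
```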
Orchestrate MCP Tools in a Workflow
Dynamic Path: Agent Node
Insert the “Linear Assistant” agent node into a workflow. The agent chooses the right Linear tools at runtime, which makes it ideal for complex, varied tasks, such as triaging user feedback and routing it to three specialized agents:
Positive Feedback Agent → forwards highlights to Marketing.
Technical Issue Agent → creates bug tasks for Support.
Product Suggestion Agent → generates structured requirement docs for Product.
When a feedback item reaches its dedicated agent, the agent instantly sorts the content, sets its priority, and uses Linear’s MCP APIs to open the right issue in the correct team project. Work that once meant slow, manual triage now happens in seconds, so every department can respond to users almost in real time.
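
Stripped of the canvas, the triage step amounts to a routing table like this minimal sketch; the category labels and the route_feedback helper are illustrative placeholders, not Dify APIs.

```python
# Illustrative only: category labels and the helper below are not Dify APIs.
ROUTES = {
    "positive": "Positive Feedback Agent",             # highlights -> Marketing
    "technical_issue": "Technical Issue Agent",        # bug tasks -> Support
    "product_suggestion": "Product Suggestion Agent",  # requirement docs -> Product
}

def route_feedback(category: str, feedback: str) -> str:
    """Pick the agent node a piece of classified feedback is dispatched to."""
    agent = ROUTES.get(category, "Technical Issue Agent")  # fallback choice is an assumption
    return f"{agent}: {feedback}"

print(route_feedback("technical_issue", "Login times out after 30 seconds on mobile"))
```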

Precise Path: Standalone MCP Nodes
Add individual MCP tools as separate workflow nodes. You decide the call order, with no LLM decisions involved. This approach suits:
Standardized business processes
Strictly ordered task chains
Situations with tight latency or cost constraints
You can enrich workflows by adding knowledge bases, notification plugins, or extra MCP servers to enable collaboration across platforms.
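
For a sense of the difference, a standalone-node chain boils down to a fixed list of calls executed in order, as in this sketch; it reuses the hypothetical call_tool helper from the agent example above, and the tool names and argument shapes remain assumptions rather than Linear's schemas.

```python
# Fixed-order chain: every run makes the same calls, in the same order,
# with templated arguments -- no LLM routing decisions in between.
# call_tool() is the hypothetical helper sketched earlier.
PIPELINE = [
    ("get_team",     {"query": "Support"}),
    ("create_issue", {"team": "Support", "title": "Weekly feedback digest"}),
]

for tool_name, arguments in PIPELINE:
    result = call_tool(tool_name, arguments)
    print(tool_name, "->", result)
```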

Publish Your AI as an MCP Server
Any Dify agent or workflow can be exposed as a standard MCP endpoint:
Service description: State concisely what the workflow does, so external LLMs know when to invoke it.
Parameter description: Document every input on the Start node to ensure clients pass the correct values.

After you complete these two fields, Dify issues a server URL. From that address, your workflow becomes a reusable, MCP-standard server that Claude, Cursor, or any other MCP client can call directly.
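
Those two fields map onto the tool metadata your endpoint advertises. When an MCP client lists tools at the server URL, it sees roughly the entry below; the field names follow the MCP tool schema, while the workflow name and inputs are a hypothetical example.

```python
# Roughly what an MCP client sees in the tools/list response for a
# published Dify workflow. Field names follow the MCP tool schema; the
# concrete workflow ("triage_feedback") and its inputs are hypothetical.
published_tool = {
    "name": "triage_feedback",
    "description": "Classifies a piece of user feedback and opens a Linear issue "
                   "for the right team.",            # your service description
    "inputSchema": {                                  # built from the Start node inputs
        "type": "object",
        "properties": {
            "feedback_text": {
                "type": "string",
                "description": "Raw user feedback to triage.",  # your parameter description
            },
        },
        "required": ["feedback_text"],
    },
}
```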

Ready for a Connected Future
Native MCP integration is more than a feature. It is a commitment to open standards. The applications you build today are already prepared for tomorrow’s interconnected AI ecosystem. Explore the new capabilities in Dify v1.6.0 and start building.