The Practice of Integrating stdio MCP Servers for One-Click Access from Cherry Studio and Dify

Author: Ren Yun

May 14, 2025

Introduction

As AI applications become increasingly complex, the integration of MCP tools faces three major pain points:

  • Low deployment efficiency: only one tool can be deployed at a time

  • Fragmented invocation: tools cannot be called through OpenAPI and MCP at the same time, limiting platform compatibility

  • Difficult dynamic management: adding or removing tools requires repeated deployments

To overcome these limitations, we propose a one-stop solution based on Alibaba Cloud Computing Nest: bulk deployment of multiple MCP tools through standardized protocols, higher resource utilization on the cloud, and dual-channel invocation over both OpenAPI and MCP, so that mainstream AI assistants such as Dify, Cherry Studio, and Open WebUI can be integrated seamlessly.

Background

MCP (Model Context Protocol) is an open-source communication standard released by Anthropic in November 2024. Its core goal is to establish a unified interaction paradigm that eliminates integration barriers between large language models (LLMs) and heterogeneous data sources and tools. By addressing the data-silo problem in the AI domain, the protocol lets local data and internet data interconnect over MCP, covering scenarios that include, but are not limited to, data and file systems, operating any resource on Alibaba Cloud, and browser automation.

This technological breakthrough allows AI applications to truly realize the "Internet of Everything"—from document processing on personal devices to enterprise-level cloud resource scheduling, all can be completed through a unified protocol.

Figure 1: Official architecture diagram of the MCP protocol

As an AI-native gateway, Higress was the first to provide automated conversion from REST APIs to MCP Servers. Its API-to-MCP capability can quickly wrap REST APIs such as Gaode Map into MCP tools through declarative configuration files (YAML format), with no coding required. For example, by defining request templates and response transformation rules, developers can complete interface adaptation in minutes while enjoying enterprise-grade capabilities such as unified authentication and fine-grained rate limiting.

However, the existing solution has a critical limitation: Higress only supports MCP Server hosting and cannot connect to the many stdio-based MCP Servers in the open-source ecosystem (e.g., the time service, the Fetch tool), which typically run as command-line processes. To solve this problem, our solution integrates the MCPO open-source project with Higress, closing this gap and providing the first one-click bulk deployment solution for multiple MCP tools based on Computing Nest and Higress.

Comparison

| | Traditional MCP Market | Computing Nest-Higress Solution |
| --- | --- | --- |
| Deployment method | Only a single MCP tool can be selected and deployed at a time | Multiple MCP tools can be selected and deployed together, reusing ECS resources and improving resource utilization |
| Invocation compatibility | Tools cannot be invoked through both OpenAPI and MCP, so only some platforms are supported | Tools can be invoked through both OpenAPI and MCP, compatible with AI assistants such as Dify, Cherry Studio, and Open WebUI, and with platforms such as Bailian |
| Tool management | Adding or removing MCP tools is cumbersome | MCP tools can be added or removed dynamically through configuration changes |
| Hosting model | Hosted and private deployment are not both well supported | This solution covers private deployment, while the Higress platform covers hosting |

Principle Analysis

Figure 2: Computing Nest MCP service architecture

The core of this solution relies on the open-source projects mcpo and Higress. The overall workflow is as follows:

1. Tool Selection and Configuration Generation

Users select multiple MCP tools (such as the time service or Gaode Map), and the system converts the selections into a standardized JSON configuration. Each tool's configuration includes the following key fields:

  • args: array of tool startup arguments

  • serverCode: unique service identifier

  • env: environment variable configuration (including API keys and other sensitive values)

  • command: type of execution command (such as uvx or npx)

The example configuration below shows the different parameter structures and environment requirements of the time service and the Gaode Map service.

[
  {
    "args": [
      "mcp-server-time"
    ],
    "serverCode": "time",
    "env": {},
    "command": "uvx"
  },
  {
    "args": [
      "-y",
      "@amap/amap-maps-mcp-server"
    ],
    "serverCode": "amap-maps",
    "env": {
      "AMAP_MAPS_API_KEY": "dadasdaad"
    },
    "command": "npx"
  }
]

2. Start the MCPO Service

As the core provider of MCP tool functionality, the open-source MCPO service parses the JSON configuration from step 1 and exposes standardized OpenAPI interfaces in the local environment, facilitating service discovery and invocation.
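
For reference, the following is a minimal local sketch of starting an MCPO-style service and inspecting what it exposes. It assumes mcpo is installed via uv, that a compatible configuration has been written to /root/config.json, and that the port, API key, and the time sub-path are placeholders; the Computing Nest template performs these steps automatically, and flag names may differ between mcpo versions.

# Start mcpo with the generated configuration, protected by an API key
# (placeholder values; the Computing Nest deployment does this for you).
uvx mcpo --config /root/config.json --port 8000 --api-key "your-api-key" &

# Each configured tool is exposed as its own OpenAPI sub-application,
# so its generated specification and docs can be inspected directly.
curl -s -H "Authorization: Bearer your-api-key" \
  http://127.0.0.1:8000/time/openapi.json | head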

3. Protocol Conversion Stage

The OpenApi-To-Mcp automation tool executes critical protocol conversion tasks:

  • Extracting the original OpenAPI specification documents (usually in Swagger/OpenAPI format)

  • Analyzing API endpoints, parameter structures, and response patterns

  • Generating configuration files that comply with MCP protocol specifications

Note: The protocol configuration conversion involved in this step is lossless and will not affect the final invocation of the MCP tools.

4. Service Registration and Routing Configuration

The tools are then automatically registered with the Higress gateway, which completes:

  • Service metadata registration (including service source, health check endpoint, etc.)

  • Dynamic routing rule configuration

  • MCP configuration injection

  • Establishment and maintenance of service dependencies

5. Unified Service Exposure and Security Control

The final architecture exposes services through a dual-channel approach, illustrated by the curl sketch after this list:

  • The Higress gateway exposes the SSE (Server-Sent Events) link of MCP tools, supporting real-time data push

  • The MCPO service exposes standardized OpenAPI interfaces, supporting request-response mode interactions

  • Overall service access is authenticated through a unified ApiKey to ensure security
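
As an illustration, both channels can be exercised with plain HTTP calls once an instance is running. This is a hedged sketch: the instance addresses, paths, tool name, and API key below are placeholders, and the real values are the SSE and OpenAPI addresses shown on your Computing Nest instance page.

# Channel 1: MCP over SSE through the Higress gateway.
# The real endpoint ends in /sse (see the instance page); auth uses a Bearer API key.
curl -N -H "Authorization: Bearer your-api-key" \
  "http://<mcp-server-address>/sse"

# Channel 2: request/response over the OpenAPI interfaces exposed by MCPO.
# Tool path and parameters are illustrative only.
curl -s -X POST "http://<openapi-address>/time/get_current_time" \
  -H "Authorization: Bearer your-api-key" \
  -H "Content-Type: application/json" \
  -d '{"timezone": "Asia/Shanghai"}'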

Deployment and Usage in Practice

Deployment Process

Deployment Steps

  1. Click on the deployment link to enter the service instance deployment interface, choose the region you want to deploy in, and fill in the parameters according to the interface prompts to complete the deployment.

  2. Select multiple MCP tools you wish to use.

  3. There are two types of MCP tools here. One type requires no environment variables, such as the time service; these can be checked directly, skipping the parameter configuration step.

  4. The other type requires environment variables, such as Gaode Map; these must be configured, otherwise the MCP Server will fail to deploy.

  5. If you are unsure how to use a given MCP tool, click its "help document" to learn more.

  6. The system automatically generates an API key to protect the MCP tools you are about to deploy. You can modify this parameter manually.

  7. Configure your ECS instance specification; 2 vCPUs and 4 GB of memory or higher is recommended. Then configure the ECS login password.

  8. Configure the availability zone and network. It is recommended to choose any availability zone and create the network directly.

  9. Click Create Now and wait for the deployment to succeed; this generally takes about 3 minutes, and the duration may vary with the number of tools you select.

  10. Visit the interface of the successfully deployed instance to see the addresses and API keys of your deployed exclusive MCP tools.

Deploy Custom MCP Tools

  1. Click to add a new custom MCP tool. Note: multiple custom MCP tools can be added.

  2. Fill in the name and ID of the custom tool; note that the ID must be unique.

  3. Select the startup method for the MCP tool you wish to use. If you choose npx or uvx, fill in the startup command as an array, for example:

["mcp-server-time", "--local-timezone=America/New_York"]
  4. If the command requires environment variables at startup, fill them in below, for example key: GITHUB_PERSONAL_ACCESS_TOKEN, value: xxx.

  5. If you choose the SSE startup method, fill in the URL instead. Note: if authentication is needed, the required authentication key must be included in this URL, for example "https://mcp-xxx-22b0-4037.api-inference.modelscope.cn/sse". You can sanity-check such an endpoint with a quick curl call, as sketched after this list.

  6. Click to select the current custom tool.
    Note: The custom tools you add will be re-rendered when you change the configuration, allowing you to add or remove the MCP tools in use. After configuring your custom MCP tools, continue from deployment step 6 above.
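
For the SSE case, the curl check below is a minimal sketch for confirming that an endpoint answers before wiring it into the deployment; replace the URL with your own, including any authentication key it requires.

# Stream the SSE endpoint for a few seconds to confirm it responds
# (the URL is the placeholder example from above; use your own).
curl -N --max-time 5 -H "Accept: text/event-stream" \
  "https://mcp-xxx-22b0-4037.api-inference.modelscope.cn/sse"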

AI Assistant Usage Example

Cherry Studio Usage Example

  1. Go to the Computing Nest instance interface; you will need to use the "MCP Server access address" in the following operations.

  2. Open your Cherry Studio assistant; if you do not have it, install it from the official release, then create a new MCP server according to the example in the picture below.

  3. The "name" and description can be filled in arbitrarily; select "Server-Sent Events (sse)" as the type.

  4. Fill in the URL shown on the instance page. Note that the HTTP protocol is used here and the path ends with sse.

  5. Add the authentication parameter to the request headers: xxx. Note that you need to change ":" to "=" in your input, for example Authorization=Bearer 123.

  6. Click the enable button in the upper right corner first, and then click the save button.

  7. Go to the chat interface and select the MCP tool you want to use.

  8. Select an appropriate model and chat with the AI, for example: "I'm at the Great Wall in Beijing, please tell me some restaurants I can walk to in less than 40 minutes." The AI will then call the MCP tool to find suitable restaurants for you.

Dify Usage Example

  1. Same as Cherry Studio's first step; in the following operations, you will need the "MCP Server access address".

  2. Open your Dify; if you do not have it, you can deploy it via the Alibaba Cloud quick deployment link, then install "SSE discovery and invocation of MCP tools" according to the example in the picture below.

  3. If there are issues in subsequent use, you may lower the version of this tool to 0.0.10.

  4. Click the "Authorize" button to configure the SSE tool. You can directly paste the MCP Server access address from Step 1 here.

  5. Create an Agent and enter.

  6. According to the example in the picture below, enable the invocation of the MCP tool, fill in appropriate prompts, and select the appropriate model, such as QWEN-max.

  7. Engage in conversation, which will allow the invocation of the MCP tool.

Open WebUI Usage Example

  1. Access the Computing Nest instance interface and find the OpenAPI access address for the tool.

  2. Open your Open WebUI client; if you do not have one, deploy it via the Alibaba Cloud quick deployment link, then paste the address and API key into it.

  3. Create a new conversation and enable the MCP tool.

  4. Verify that the AI is using your MCP tool!

Dynamic Addition and Removal of MCP Tools in Use

If you want to modify the MCP tools in use, please refer to the following operations:

  1. In the Computing Nest console, click on "My Instances", select the previously deployed MCP Server instance, and click "Modify Configuration" in the upper right corner.

  2. Click the third option from the top, then click "Next Step".

  3. Select the MCP tools you want to add, for example, I have added the Fetch tool here. (Note: Previously selected tools will be re-rendered here.)

  4. If the MCP tool here involves environment variables, you must set them according to the documentation.

  5. Click OK to initiate the tool modification request.

  6. Wait for the instance status to change.

  7. Add the newly added MCP tool, using the addresses from the instance output, to your AI chat client. If you want to confirm the new tool is live first, see the curl sketch below.
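
The following is a hedged sketch for verifying that a newly added tool (Fetch, in the example above) is reachable before updating your client; the address, API key, and the fetch sub-path are placeholders based on the tool's serverCode, so substitute the values from your own instance output.

# After the configuration change completes, the new tool should appear
# as its own sub-application under the OpenAPI access address.
curl -s -H "Authorization: Bearer <your-api-key>" \
  "http://<openapi-address>/fetch/openapi.json" | head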

Troubleshooting

If you find that the instance has not been successfully deployed, there is a 90% chance that it is due to incorrect environment variable configuration; please refer to the following steps for troubleshooting:

  1. Log into the ECS instance through session management.

  2. Input the following command to confirm whether the environment variables are correct.

cat /root/config.json
  3. Correct the configuration, then restart the Docker Compose application (if problems persist, see the additional checks sketched after the restart command).

sudo systemctl restart quickstart-mcp
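
If the restart alone does not resolve the issue, a few standard checks can help narrow it down. This is a hedged sketch: the service name quickstart-mcp and the config path come from the steps above, while the JSON validation and journalctl inspection are generic suggestions rather than part of the official documentation.

# Verify the configuration file is valid JSON (a stray quote or comma in an
# environment variable is a common cause of startup failures).
python3 -m json.tool /root/config.json > /dev/null && echo "config.json is valid JSON"

# Check whether the service came back up after the restart.
sudo systemctl status quickstart-mcp --no-pager

# Inspect recent service logs for errors from individual MCP tools.
sudo journalctl -u quickstart-mcp -n 100 --no-pager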

For other issues, please refer to the official MCP user documentation: User Documentation

Conclusion

The MCP protocol is reshaping the connection between artificial intelligence and the real world as a standardized interface. It is not only the "universal language" among AI tools but also a bridge connecting digital systems and the physical world. By defining a unified interaction specification, MCP enables AI models to autonomously invoke meteorological satellite data to predict monsoon trajectories, manipulate mechanical arms to carve chips at the nanometer scale, and even analyze the mineral color compositions of Dunhuang murals and the rhythmic structures of the "Iliad", revealing the cognitive laws hidden in human civilization!

Contact

Follow and engage with us through the following channels to stay updated on the latest developments from higress.ai.
