Tutorial: Deploy an AI Agent with MCP on Your Server in 20 Minutes
The Model Context Protocol (MCP) is an open standard for connecting AI to real-world tools. This step-by-step guide shows you how to create an MCP server, connect it to Claude, and automate complex tasks.

William Aklamavo
March 5, 2026
The Model Context Protocol (MCP) is becoming the universal standard for connecting AI to your tools, databases, and services. Adopted by Anthropic, OpenAI, Google, and most major AI agent frameworks, MCP is the missing piece that turns a simple chatbot into an assistant capable of acting in the real world.
In this tutorial, we'll build an MCP server that gives Claude (or any LLM) the ability to read and modify your files, query your database, and trigger actions in your systems.
What is MCP?
Think of MCP as a universal adapter for AI:
- Without MCP: Your AI can only read text and generate text. It's isolated from the world.
- With MCP: Your AI can read your database, create files, send emails, trigger deployments, modify Jira tickets, etc.
MCP Architecture in 30 Seconds
[User] → [MCP Client (e.g., Claude)] → [MCP Server] → [Your tools/APIs/DB]
The MCP Server exposes a collection of tools that the LLM can call. The process:
- Discovery: The MCP client connects to the server and discovers available tools.
- Decision: The LLM analyzes the user request and chooses which tool to use.
- Invocation: The MCP client executes the tool with parameters generated by the LLM.
- Execution: The MCP server processes the request (DB access, API, files...).
- Response: The result is returned to the LLM, which integrates it into its response.
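Under the hood, each of these steps travels as a JSON-RPC 2.0 message over the transport (stdio or HTTP). As a sketch, an invocation of the list_files tool we build below would be carried by a tools/call request shaped like this (the envelope comes from the MCP spec; the directory value is just an example):

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "list_files",
    "arguments": { "directory": "/home/user/projects" }
  }
}
```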
Prerequisites
- Node.js 20+ installed
- npm or pnpm
- A code editor (VS Code recommended)
- (Optional) A Claude API key to test full integration
Step 1: Initialize the Project
mkdir my-mcp-server
cd my-mcp-server
npm init -y
npm install @modelcontextprotocol/sdk zod
npm install -D typescript @types/node tsx
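The server we'll write in the next step uses ES module imports and top-level await, so your package.json must declare the project as an ES module. A minimal package.json might look like this (name and version are just the defaults from npm init):

```json
{
  "name": "my-mcp-server",
  "version": "1.0.0",
  "type": "module",
  "scripts": {
    "start": "tsx src/server.ts"
  }
}
```

Without "type": "module", the top-level await at the end of the server file will fail.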
Step 2: Create the MCP Server
Create src/server.ts:
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";
import * as fs from "fs/promises";
import * as path from "path";

const server = new McpServer({
  name: "my-mcp-server",
  version: "1.0.0",
});

// Tool 1: List files in a directory
server.tool(
  "list_files",
  "Lists files in a given directory",
  {
    directory: z.string().describe("Path of the directory to list"),
  },
  async ({ directory }) => {
    const files = await fs.readdir(directory, { withFileTypes: true });
    const result = files.map((f) => ({
      name: f.name,
      type: f.isDirectory() ? "directory" : "file",
    }));
    return {
      content: [{ type: "text", text: JSON.stringify(result, null, 2) }],
    };
  }
);

// Tool 2: Read file content
server.tool(
  "read_file",
  "Reads and returns the content of a file",
  {
    filepath: z.string().describe("Full path of the file"),
  },
  async ({ filepath }) => {
    const content = await fs.readFile(filepath, "utf-8");
    return {
      content: [{ type: "text", text: content }],
    };
  }
);

// Tool 3: Write to a file
server.tool(
  "write_file",
  "Writes content to a file (creates or overwrites)",
  {
    filepath: z.string().describe("File path"),
    content: z.string().describe("Content to write"),
  },
  async ({ filepath, content }) => {
    await fs.mkdir(path.dirname(filepath), { recursive: true });
    await fs.writeFile(filepath, content, "utf-8");
    return {
      content: [{ type: "text", text: "File written successfully." }],
    };
  }
);

// Expose the server over stdio so an MCP client can spawn it as a subprocess
const transport = new StdioServerTransport();
await server.connect(transport);
Step 3: Test Locally
npx tsx src/server.ts
The server speaks MCP over stdio, so it starts and then waits silently for a client to connect; no output is normal. To exercise the tools interactively, you can also run it through the official MCP Inspector:
npx @modelcontextprotocol/inspector npx tsx src/server.ts
Step 4: Integrate with Claude Desktop
Add your server to the Claude Desktop configuration file, claude_desktop_config.json (on macOS in ~/Library/Application Support/Claude/, on Windows in %APPDATA%\Claude\):
{
  "mcpServers": {
    "my-server": {
      "command": "npx",
      "args": ["tsx", "/path/to/my-mcp-server/src/server.ts"]
    }
  }
}
Restart Claude Desktop. You'll now see a 🔧 icon indicating your MCP tools are available.
Step 5: Integrate with n8n
If you use n8n 2.0+, MCP support is native:
- In n8n, add an "MCP Client" node
- Point to your MCP server (via stdio or HTTP)
- The AI agent in n8n can now use your MCP tools as native tools
This is the ultimate combination: n8n for orchestration + MCP for tool access + LLM for reasoning.
Security: Golden Rules
- Principle of Least Privilege: Only grant access to strictly necessary folders and APIs
- Input Validation: Use Zod to rigorously validate all inputs
- Sandbox: Run MCP servers in Docker containers in production
- Audit Logs: Record every tool call with parameters and results
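As a concrete sketch of the first two rules, the file tools from Step 2 could refuse any path outside a single allowed root before touching the filesystem. ALLOWED_ROOT and isPathAllowed are illustrative names, not part of the MCP SDK:

```typescript
import * as path from "path";

// Illustrative: confine every file tool to one allowed root directory.
const ALLOWED_ROOT = path.resolve("/srv/mcp-workspace");

// Returns true only if the resolved path stays inside ALLOWED_ROOT,
// which blocks "../" traversal and absolute paths outside the root.
function isPathAllowed(filepath: string): boolean {
  const resolved = path.resolve(ALLOWED_ROOT, filepath);
  return (
    resolved === ALLOWED_ROOT ||
    resolved.startsWith(ALLOWED_ROOT + path.sep)
  );
}

console.log(isPathAllowed("notes/todo.txt"));   // inside the root
console.log(isPathAllowed("../../etc/passwd")); // traversal attempt, rejected
```

Each tool handler would call isPathAllowed first and return an error result instead of reading or writing outside the root. Resolving the path before checking it is the important part: checking the raw string would miss "../" sequences.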
Conclusion
MCP is the missing piece that transforms generative AI into agentic AI. In 20 minutes, you've created a server that gives Claude the ability to read your files, write code, and interact with your APIs.
Want a custom MCP server for your business? At BOVO Digital, we build personalized MCP servers that connect your AI to all your internal systems. Automation + Intelligence = Productivity.