# MCP Todo Server

A distributed task-management server built on the Model Context Protocol (MCP). Written in TypeScript, with Redis for shared state and OpenAI for smart task analysis.
## Features
- MCP protocol implementation with 6 custom tools
- Multi-node setup with load balancing via Caddy
- Redis for distributed state (works across nodes)
- AI task prioritization using OpenAI
- Docker Compose for easy deployment
- Health monitoring and graceful shutdown
## Available Tools

- `todo_add` - Add new tasks with priority levels
- `todo_list` - List todos with status filtering
- `todo_remove` - Remove specific tasks
- `todo_clear` - Clear all tasks
- `todo_mark_done` - Mark tasks as completed
- `todo_analyze` - Get AI-powered task prioritization
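The tools above all operate on one shared todo list. As a rough illustration of the state they manage, here is a minimal in-memory sketch in TypeScript. It is illustrative only: the real server keeps this state in Redis so every node sees the same list, and the field names used here (`text`, `priority`, `done`) are assumptions, not the server's actual schema.

```typescript
// Illustrative sketch: a Map stands in for Redis. The real server stores
// todos in Redis so all nodes share state; field names are assumptions.
type Priority = "low" | "medium" | "high";

interface Todo {
  id: number;
  text: string;
  priority: Priority;
  done: boolean;
}

class TodoStore {
  private todos = new Map<number, Todo>();
  private nextId = 1;

  // todo_add: create a task with an optional priority
  add(text: string, priority: Priority = "medium"): Todo {
    const todo: Todo = { id: this.nextId++, text, priority, done: false };
    this.todos.set(todo.id, todo);
    return todo;
  }

  // todo_list: list all tasks, optionally filtered by status
  list(status?: "open" | "done"): Todo[] {
    const all = Array.from(this.todos.values());
    if (status === undefined) return all;
    return all.filter((t) => t.done === (status === "done"));
  }

  // todo_mark_done: returns false if the id is unknown
  markDone(id: number): boolean {
    const todo = this.todos.get(id);
    if (!todo) return false;
    todo.done = true;
    return true;
  }

  // todo_remove: delete a single task by id
  remove(id: number): boolean {
    return this.todos.delete(id);
  }

  // todo_clear: drop everything
  clear(): void {
    this.todos.clear();
  }
}
```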
## Architecture

```
┌─────────────┐
│    Caddy    │  Load Balancer
│  Port 3000  │
└──────┬──────┘
       │
       ├────────────┬────────────┐
       ▼            ▼            ▼
  ┌──────────┐ ┌──────────┐ ┌──────────┐
  │  Node 1  │ │  Node 2  │ │  Node N  │
  │ Port 3001│ │ Port 3002│ │ Port 300N│
  └────┬─────┘ └────┬─────┘ └────┬─────┘
       │            │            │
       └────────────┴────────────┘
                    │
              ┌─────▼──────┐
              │   Redis    │  Shared State
              │  Port 6379 │
              └────────────┘
```
## Getting Started

### Prerequisites

- Docker and Docker Compose
- Node.js 18+ (for local development)
- OpenAI API key

### Installation
1. Clone the repository:

   ```bash
   git clone https://github.com/yourusername/mcp-todo-server.git
   cd mcp-todo-server
   ```

2. Set up environment variables:

   ```bash
   cp .env.example .env
   # Edit .env and add your OPENAI_API_KEY
   ```

3. Start with Docker Compose:

   ```bash
   docker compose up --build
   ```

   This starts:

   - Redis on port 6379
   - MCP Server Node 1 on port 3001
   - MCP Server Node 2 on port 3002
   - Caddy load balancer on port 3000

4. Verify the deployment:

   ```bash
   curl http://localhost:3000/health
   ```
## API Endpoints

### Health Check

```
GET http://localhost:3000/health
```

### MCP Endpoint

```
GET/POST http://localhost:3000/mcp
```

Uses Server-Sent Events (SSE) for communication.
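With SSE, each MCP message arrives as a `data:` line in the event stream. Below is a minimal sketch of pulling JSON payloads out of a raw SSE chunk, purely to illustrate the wire format; a real client should use an MCP SDK transport rather than parsing events by hand.

```typescript
// Illustrative only: extract JSON payloads from one raw SSE chunk.
// Events are separated by a blank line; each event carries one or more
// "data:" lines whose contents are joined before JSON parsing.
function parseSseChunk(chunk: string): unknown[] {
  const payloads: unknown[] = [];
  for (const event of chunk.split("\n\n")) {
    const data = event
      .split("\n")
      .filter((line) => line.startsWith("data:"))
      .map((line) => line.slice(5).trim())
      .join("\n");
    if (data) payloads.push(JSON.parse(data));
  }
  return payloads;
}
```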
## Development

### Running Locally (without Docker)

```bash
# Install dependencies
npm install

# Start Redis
docker run -d -p 6379:6379 redis:7-alpine

# Set environment variables
export REDIS_URL=redis://localhost:6379
export OPENAI_API_KEY=your_key_here
export NODE_ID=dev-node

# Run development server
npm run dev
```
### Build for Production

```bash
npm run build
npm start
```
## Testing

Run the test suite:

```bash
./test.sh
```

Or try the example client:

```bash
npm run test:client
```
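MCP is built on JSON-RPC 2.0, so a client like the one above ultimately sends requests shaped as sketched below. The `tools/call` method name comes from the MCP specification; the argument names (`text`, `priority`) are assumptions about this server's `todo_add` schema, not confirmed field names.

```typescript
// Sketch of a JSON-RPC 2.0 request body for invoking an MCP tool.
// "tools/call" is the MCP spec's method for tool invocation; the
// todo_add argument names here are assumptions.
interface ToolCallRequest {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: { name: string; arguments: Record<string, unknown> };
}

function buildToolCall(
  id: number,
  name: string,
  args: Record<string, unknown>
): ToolCallRequest {
  return {
    jsonrpc: "2.0",
    id,
    method: "tools/call",
    params: { name, arguments: args },
  };
}

// A client would POST a body like this to http://localhost:3000/mcp.
const req = buildToolCall(1, "todo_add", { text: "write docs", priority: "high" });
```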
## Using with MCP Clients

### VS Code / Cursor

Add to your MCP config:

```json
{
  "mcpServers": {
    "todo": {
      "url": "http://localhost:3000/mcp",
      "transport": "sse"
    }
  }
}
```
### Claude Desktop

Add to `~/Library/Application Support/Claude/claude_desktop_config.json`:

```json
{
  "mcpServers": {
    "todo": {
      "command": "node",
      "args": ["/path/to/mcp-todo-server/dist/main.js"],
      "env": {
        "REDIS_URL": "redis://localhost:6379",
        "OPENAI_API_KEY": "your-key-here"
      }
    }
  }
}
```
## Project Structure

```
.
├── src/
│   ├── main.ts          # Express server & MCP transport
│   ├── mcp-tools.ts     # MCP tool implementations
│   ├── redis-client.ts  # State management
│   └── ai-service.ts    # OpenAI integration
├── docker-compose.yml   # Multi-node orchestration
├── Dockerfile           # Container definition
├── package.json         # Dependencies
├── tsconfig.json        # TypeScript config
├── test.sh              # Automated testing
└── example-client.js    # MCP client example
```
## Security Notes

For production use, add:

- Authentication
- HTTPS
- A Redis password
- Input validation
- Rate limiting
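The rate-limiting item above can be sketched as a simple per-client token bucket. This is a hypothetical helper, not code from this repository; in practice you might also reach for an existing middleware package.

```typescript
// Hypothetical token bucket for the "Rate limiting" note above; not part
// of this repository. Each client gets `capacity` tokens that refill at
// `refillPerSec`; a request is allowed only if a whole token is available.
class TokenBucket {
  private tokens: number;
  private last: number;

  constructor(
    private capacity: number,
    private refillPerSec: number,
    now: number = Date.now()
  ) {
    this.tokens = capacity;
    this.last = now;
  }

  allow(now: number = Date.now()): boolean {
    // Refill proportionally to elapsed time, capped at capacity.
    const elapsed = (now - this.last) / 1000;
    this.tokens = Math.min(this.capacity, this.tokens + elapsed * this.refillPerSec);
    this.last = now;
    if (this.tokens >= 1) {
      this.tokens -= 1;
      return true;
    }
    return false;
  }
}
```

In a multi-node deployment like this one, a purely in-process bucket limits per node, not per cluster; a shared counter in Redis would be the natural cluster-wide equivalent.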
## License

MIT
