Monotype MCP Server & Chat Application
A complete system consisting of:
- MCP Server - Plugin-ready server for Monotype API integration
- Backend - Ollama-powered bridge between chat UI and MCP server
- Frontend - React-based chat interface
Architecture
┌─────────────┐      ┌──────────────┐      ┌─────────────┐      ┌──────────────┐
│  Frontend   │─────▶│   Backend    │─────▶│ MCP Server  │─────▶│ Monotype API │
│  (React)    │      │  (Ollama)    │      │  (Plugin)   │      │              │
└─────────────┘      └──────────────┘      └─────────────┘      └──────────────┘
Project Structure
NextGenAgenticAI/
├── src/                   # MCP Server (can be used as plugin)
│   ├── server.js          # Main MCP server
│   ├── api-client.js      # Monotype API client
│   ├── auth.js            # Authentication service
│   ├── token-decryptor.js # Token decryption utilities
│   └── ...
├── backend/               # Backend server
│   ├── server.js          # Express server with Ollama integration
│   └── package.json
├── frontend/              # React chat UI
│   ├── src/
│   │   ├── App.jsx        # Main chat component
│   │   └── ...
│   └── package.json
└── README.md
Quick Start
1. MCP Server (Plugin)
The MCP server can be used independently as a plugin with any chat agent.
Setup:
cd src
npm install
Configuration: Add to your MCP client config:
{
  "mcpServers": {
    "monotype-mcp": {
      "command": "node",
      "args": ["/path/to/src/server.js"],
      "env": {
        "MONOTYPE_TOKEN": "your-token-here"
      }
    }
  }
}
2. Backend Server
Prerequisites:
- Install Ollama: https://ollama.ai
- Pull llama3 model:
ollama pull llama3
Setup:
cd backend
npm install
npm start
Server runs on http://localhost:3001
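The backend's tool detection (see Features below) hinges on a single call to Ollama's /api/generate endpoint. A minimal sketch of the request body it might send — the prompt wording and function name are illustrative assumptions, not taken from the actual backend:

```javascript
// Build the non-streaming request body for Ollama's /api/generate endpoint.
// The model name matches the "ollama pull llama3" step above; the prompt
// wording is an assumption about how tool detection could be phrased.
function buildOllamaRequest(userMessage, toolNames) {
  return {
    model: "llama3",
    stream: false, // one JSON response instead of a token stream
    prompt:
      `Available tools: ${toolNames.join(", ")}.\n` +
      `Which tool best answers: "${userMessage}"? Reply with the tool name only.`,
  };
}

// Example: POST this body to http://localhost:11434/api/generate
const body = buildOllamaRequest("Show me all teams", [
  "invite_user_for_customer",
  "get_teams_for_customer",
  "get_roles_for_customer",
]);
```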
3. Frontend
Setup:
cd frontend
npm install
npm run dev
Frontend runs on http://localhost:3000
Features
MCP Server Tools
- invite_user_for_customer - Invite users to your company
- get_teams_for_customer - Get all teams
- get_roles_for_customer - Get all roles
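An MCP-compatible client invokes one of these tools with a JSON-RPC 2.0 "tools/call" request. A sketch of that message shape — the argument name ("email") is an illustrative assumption; the tool's input schema defines the real parameter names:

```javascript
// JSON-RPC 2.0 request an MCP client sends to invoke a tool by name.
// The "arguments" keys here are assumed for illustration only.
const callToolRequest = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "invite_user_for_customer",
    arguments: { email: "user@example.com" }, // assumed parameter name
  },
};
```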
Backend Intelligence
- Uses Ollama (llama3) to detect which tool to call
- Extracts parameters from natural language
- Falls back to keyword matching when Ollama is unavailable
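The keyword fallback can be sketched as a simple substring match over the tool names listed above — the exact keywords and function name are assumptions, not the backend's real implementation:

```javascript
// Assumed keyword table: each entry maps trigger words to an MCP tool name.
const KEYWORD_TOOLS = [
  { keywords: ["invite"], tool: "invite_user_for_customer" },
  { keywords: ["team"], tool: "get_teams_for_customer" },
  { keywords: ["role"], tool: "get_roles_for_customer" },
];

// Pick the first tool whose keyword appears in the message; null if none.
function fallbackDetectTool(message) {
  const text = message.toLowerCase();
  const match = KEYWORD_TOOLS.find(({ keywords }) =>
    keywords.some((kw) => text.includes(kw))
  );
  return match ? match.tool : null;
}
```

Because it is a plain first-match scan, ordering matters: more specific trigger words should come before generic ones.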
Frontend
- Secure token input
- Modern chat interface
- Real-time responses
- Tool usage indicators
Usage Examples
Via Chat UI
1. Start the backend and frontend
2. Enter your token
3. Try these commands:
   - "What roles are in my company?"
   - "Invite user@example.com to my company"
   - "Show me all teams"
Via MCP Plugin
Use the MCP server directly with any MCP-compatible chat agent, such as Cursor or Claude Desktop.
Development
Running All Services
Terminal 1 - Backend:
cd backend
npm run dev
Terminal 2 - Frontend:
cd frontend
npm run dev
Terminal 3 - MCP Server (if testing standalone):
cd src
npm start
Environment Variables
Backend
- MCP_SERVER_PATH - Path to the MCP server script (default: ../src/server.js)
- OLLAMA_API_URL - Ollama API URL (default: http://localhost:11434)
MCP Server
- MONOTYPE_TOKEN - Your Monotype authentication token (optional; can also be set in the MCP client config)
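A sketch of how the backend might resolve these variables against their documented defaults (the function name is illustrative, not from the source):

```javascript
// Resolve backend configuration from environment variables, falling back
// to the defaults documented above.
function resolveConfig(env = process.env) {
  return {
    mcpServerPath: env.MCP_SERVER_PATH || "../src/server.js",
    ollamaApiUrl: env.OLLAMA_API_URL || "http://localhost:11434",
  };
}
```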
License
MIT
