🧠 MCP Server with LangChain and AI Tools
This project demonstrates how to build a multi-tool AI assistant using the Model Context Protocol (MCP), LangChain, and Groq’s Qwen model. It includes:
- 📐 A local Math MCP Server
- 🌤️ A simulated Weather MCP Server
- 🤖 A conversational AI agent (MCP client) that talks to both
🧰 Features
- Uses LangChain MCP Adapters to connect tools
- Powered by Groq's Qwen LLM
- Handles local and remote tool servers via MCP
- Interactive CLI chat with tool usage detection
🏁 Prerequisites
- Python >= 3.11
- uv for project/environment management (https://github.com/astral-sh/uv)
- Internet connection for calling the Groq-hosted LLM
⚙️ Setup Instructions
1. Create Project
mkdir mcp_project
cd mcp_project
uv init
Set Python version in .python-version and pyproject.toml to >=3.11
2. Create Virtual Environment
uv venv
On macOS/Linux:
source .venv/bin/activate
On Windows (Git Bash):
source .venv/Scripts/activate
3. Add Dependencies
Create a requirements.txt file:
langchain-mcp-adapters
langchain-groq
langgraph
mcp
Install them:
uv add -r requirements.txt
Project Structure
mcp_project/
│
├── math_server.py      # MCP server for math tools
├── weather_server.py   # MCP server for weather API simulation
├── client.py           # MCP client with AI agent
├── requirements.txt
├── .python-version
└── .env                # For storing Groq API key (GROQ_API_KEY)
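For orientation, math_server.py might look like the sketch below, using FastMCP from the mcp package. The tool names (add, multiply) and the server label are illustrative assumptions, not taken from this repo's actual code:

```python
# math_server.py — illustrative sketch of a math MCP server using FastMCP.
# Tool names and docstrings here are assumptions, not this repo's exact code.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("Math")

@mcp.tool()
def add(a: float, b: float) -> float:
    """Add two numbers."""
    return a + b

@mcp.tool()
def multiply(a: float, b: float) -> float:
    """Multiply two numbers."""
    return a * b

if __name__ == "__main__":
    # stdio transport lets the client spawn this server as a subprocess
    mcp.run(transport="stdio")
```

Running over stdio means the client owns this server's lifecycle, which is why you never start math_server.py by hand.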
How to Run
1. Run the Weather Server
python weather_server.py
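A simulated weather_server.py could be as small as the sketch below; the transport name, port, and hard-coded response are assumptions for a stand-in server, not this repo's exact code:

```python
# weather_server.py — illustrative sketch of a simulated weather MCP server.
# The transport and canned response are assumptions, not this repo's code.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("Weather")

@mcp.tool()
async def get_weather(location: str) -> str:
    """Return simulated weather for a location (always sunny here)."""
    return f"The current weather in {location} is sunny."

if __name__ == "__main__":
    # Runs as a standalone HTTP server so the client connects to it remotely
    mcp.run(transport="streamable-http")
```

Because this server runs as its own process over HTTP, it must be started before the client, unlike the stdio-based math server.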
2. Run the Client (automatically starts the math server as a subprocess)
python client.py
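The client wires both servers into a LangGraph agent. A minimal sketch, following the langchain-mcp-adapters docs, is shown below; the server paths, weather URL, and Groq model name are assumptions to adjust for your setup, and the adapter API may differ slightly between versions:

```python
# client.py — illustrative sketch of the MCP client/agent. Paths, the weather
# URL, and the model name are assumptions; adjust them to your environment.
import asyncio

from langchain_groq import ChatGroq
from langchain_mcp_adapters.client import MultiServerMCPClient
from langgraph.prebuilt import create_react_agent

async def main():
    client = MultiServerMCPClient(
        {
            "math": {  # spawned locally as a subprocess over stdio
                "command": "python",
                "args": ["math_server.py"],
                "transport": "stdio",
            },
            "weather": {  # assumes weather_server.py is already running
                "url": "http://localhost:8000/mcp",
                "transport": "streamable_http",
            },
        }
    )
    tools = await client.get_tools()

    # ChatGroq reads GROQ_API_KEY from the environment (see .env)
    agent = create_react_agent(ChatGroq(model="qwen-qwq-32b"), tools)

    response = await agent.ainvoke(
        {"messages": [{"role": "user", "content": "What is 2*3/(4-2)?"}]}
    )
    print(response["messages"][-1].content)

if __name__ == "__main__":
    asyncio.run(main())
```

The agent inspects each user message and decides whether to answer directly or route the request to a math or weather tool.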
Example Conversation
You: What is the output of 2*3/(4-2)
AI: The result is 3.0
You: What is the weather in New York?
AI: The current weather in New York is sunny.
You: thanks
AI: You're welcome! 😊
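Under the hood, the first answer comes from chaining math tool calls. Assuming hypothetical multiply, subtract, and divide tools on the math server (illustrative names, not this repo's actual tool set), the agent's decomposition of 2*3/(4-2) reduces to ordinary arithmetic:

```python
# Hypothetical math-tool implementations mirroring the calls the agent would
# make for "2*3/(4-2)"; names are illustrative, not this repo's actual tools.
def multiply(a: float, b: float) -> float:
    return a * b

def subtract(a: float, b: float) -> float:
    return a - b

def divide(a: float, b: float) -> float:
    if b == 0:
        raise ValueError("division by zero")
    return a / b

# The agent decomposes 2*3/(4-2) into three tool calls:
result = divide(multiply(2, 3), subtract(4, 2))
print(result)  # 3.0 — matching the AI's answer above
```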
