> [!WARNING]
> This project has been relocated to the Astronomer agents monorepo.

# Airflow MCP Server
A Model Context Protocol (MCP) server for Apache Airflow that provides AI assistants with access to Airflow's REST API. Built with FastMCP.
## Quickstart

### IDEs

#### Manual configuration
Add to your MCP settings (Cursor: `~/.cursor/mcp.json`, VS Code: `.vscode/mcp.json`):
```json
{
  "mcpServers": {
    "airflow": {
      "command": "uvx",
      "args": ["astro-airflow-mcp", "--transport", "stdio"]
    }
  }
}
```
### CLI Tools

#### Claude Code

```shell
claude mcp add airflow -- uvx astro-airflow-mcp --transport stdio
```

#### Gemini CLI

```shell
gemini mcp add airflow -- uvx astro-airflow-mcp --transport stdio
```

#### Codex CLI

```shell
codex mcp add airflow -- uvx astro-airflow-mcp --transport stdio
```
### Desktop Apps

#### Claude Desktop

Add to `~/Library/Application Support/Claude/claude_desktop_config.json` (macOS) or `%APPDATA%\Claude\claude_desktop_config.json` (Windows):
```json
{
  "mcpServers": {
    "airflow": {
      "command": "uvx",
      "args": ["astro-airflow-mcp", "--transport", "stdio"]
    }
  }
}
```
### Other MCP Clients

#### Manual JSON Configuration

Add to your MCP configuration file:
```json
{
  "mcpServers": {
    "airflow": {
      "command": "uvx",
      "args": ["astro-airflow-mcp", "--transport", "stdio"]
    }
  }
}
```
Or connect to a running HTTP server: `"url": "http://localhost:8000/mcp"`
> **Note:** No installation required; `uvx` runs directly from PyPI. The `--transport stdio` flag is required because the server defaults to HTTP mode.
## Configuration

By default, the server connects to `http://localhost:8080` (the Astro CLI default). Set environment variables to target a custom Airflow instance:
| Variable | Description |
|---|---|
| `AIRFLOW_API_URL` | Airflow webserver URL |
| `AIRFLOW_USERNAME` | Username (Airflow 3.x uses OAuth2 token exchange) |
| `AIRFLOW_PASSWORD` | Password |
| `AIRFLOW_AUTH_TOKEN` | Bearer token (alternative to username/password) |
Example with auth (Claude Code):

```shell
claude mcp add airflow \
  -e AIRFLOW_API_URL=https://your-airflow.example.com \
  -e AIRFLOW_USERNAME=admin \
  -e AIRFLOW_PASSWORD=admin \
  -- uvx astro-airflow-mcp --transport stdio
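The same lookup can be sketched in a few lines of Python. This is an illustrative helper, not the server's actual code; only the variable names and the `http://localhost:8080` fallback come from the table above:

```python
import os


def resolve_airflow_config(env=None):
    """Resolve Airflow connection settings from environment variables.

    Falls back to the Astro CLI default URL when AIRFLOW_API_URL is unset.
    (Hypothetical helper; the server performs an equivalent lookup internally.)
    """
    env = os.environ if env is None else env
    return {
        "url": env.get("AIRFLOW_API_URL", "http://localhost:8080"),
        "username": env.get("AIRFLOW_USERNAME"),
        "password": env.get("AIRFLOW_PASSWORD"),
        "token": env.get("AIRFLOW_AUTH_TOKEN"),
    }


config = resolve_airflow_config({
    "AIRFLOW_API_URL": "https://your-airflow.example.com",
    "AIRFLOW_USERNAME": "admin",
})
```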
## Features

- **Airflow 2.x and 3.x Support**: Automatic version detection with an adapter pattern
- **MCP Tools** for accessing Airflow data:
  - DAG management (list, get details, get source code, stats, warnings, import errors, trigger, pause/unpause)
  - Task management (list, get details, get task instances, get logs)
  - Pool management (list, get details)
  - Variable management (list, get specific variables)
  - Connection management (list connections with credentials excluded)
  - Asset/Dataset management (unified naming across versions, data lineage)
  - Plugin and provider information
  - Configuration and version details
- **Consolidated Tools** for agent workflows:
  - `explore_dag`: Get comprehensive DAG information in one call
  - `diagnose_dag_run`: Debug failed DAG runs with task instance details
  - `get_system_health`: System overview with health, errors, and warnings
- **MCP Resources**: Static Airflow info exposed as resources (version, providers, plugins, config)
- **MCP Prompts**: Guided workflows for common tasks (troubleshooting, health checks, onboarding)
- **Dual deployment modes**:
  - Standalone server: Run as an independent MCP server
  - Airflow plugin: Integrate directly into the Airflow 3.x webserver
- **Flexible Authentication**:
  - Bearer token (Airflow 2.x and 3.x)
  - Username/password with automatic OAuth2 token exchange (Airflow 3.x)
  - Basic auth (Airflow 2.x)
## Available Tools

### Consolidated Tools (Agent-Optimized)
| Tool | Description |
|---|---|
| `explore_dag` | Get comprehensive DAG info: metadata, tasks, recent runs, source code |
| `diagnose_dag_run` | Debug a DAG run: run details, failed task instances, logs |
| `get_system_health` | System overview: health status, import errors, warnings, DAG stats |
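A consolidated tool is essentially a fan-out over several core calls with the results merged into one response. A rough sketch of the idea behind `explore_dag`; the stub adapter and the exact response fields are illustrative, not the server's actual implementation:

```python
class StubAdapter:
    """Stand-in for a version-specific Airflow adapter (illustrative only)."""

    def get_dag_details(self, dag_id):
        return {"dag_id": dag_id, "is_paused": False}

    def list_tasks(self, dag_id):
        return [{"task_id": "extract"}, {"task_id": "load"}]

    def list_dag_runs(self, dag_id, limit=5):
        return [{"run_id": "manual__2024-01-01", "state": "success"}]

    def get_dag_source(self, dag_id):
        return "# DAG source code here"


def explore_dag(adapter, dag_id):
    """Bundle metadata, tasks, recent runs, and source into a single call."""
    return {
        "dag": adapter.get_dag_details(dag_id),
        "tasks": adapter.list_tasks(dag_id),
        "recent_runs": adapter.list_dag_runs(dag_id, limit=5),
        "source": adapter.get_dag_source(dag_id),
    }


result = explore_dag(StubAdapter(), "example_dag")
```

The payoff for agents is fewer round trips: one tool call instead of four.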
### Core Tools
| Tool | Description |
|---|---|
| `list_dags` | Get all DAGs and their metadata |
| `get_dag_details` | Get detailed info about a specific DAG |
| `get_dag_source` | Get the source code of a DAG |
| `get_dag_stats` | Get DAG run statistics (Airflow 3.x only) |
| `list_dag_warnings` | Get DAG import warnings |
| `list_import_errors` | Get import errors from DAG files that failed to parse |
| `list_dag_runs` | Get DAG run history |
| `get_dag_run` | Get specific DAG run details |
| `trigger_dag` | Trigger a new DAG run (start a workflow execution) |
| `pause_dag` | Pause a DAG to prevent new scheduled runs |
| `unpause_dag` | Unpause a DAG to resume scheduled runs |
| `list_tasks` | Get all tasks in a DAG |
| `get_task` | Get details about a specific task |
| `get_task_instance` | Get task instance execution details |
| `get_task_logs` | Get logs for a specific task instance execution |
| `list_pools` | Get all resource pools |
| `get_pool` | Get details about a specific pool |
| `list_variables` | Get all Airflow variables |
| `get_variable` | Get a specific variable by key |
| `list_connections` | Get all connections (credentials excluded for security) |
| `list_assets` | Get assets/datasets (unified naming across versions) |
| `list_plugins` | Get installed Airflow plugins |
| `list_providers` | Get installed provider packages |
| `get_airflow_config` | Get Airflow configuration |
| `get_airflow_version` | Get Airflow version information |
### MCP Resources
| Resource URI | Description |
|---|---|
| `airflow://version` | Airflow version information |
| `airflow://providers` | Installed provider packages |
| `airflow://plugins` | Installed Airflow plugins |
| `airflow://config` | Airflow configuration |
### MCP Prompts
| Prompt | Description |
|---|---|
| `troubleshoot_failed_dag` | Guided workflow for diagnosing DAG failures |
| `daily_health_check` | Morning health check routine |
| `onboard_new_dag` | Guide for understanding a new DAG |
## Advanced Usage

### Running as Standalone Server

For HTTP-based integrations, or to connect multiple clients to one server:

```shell
# Run server (HTTP mode is the default)
uvx astro-airflow-mcp --airflow-url https://my-airflow.example.com --username admin --password admin
```

Connect MCP clients to: `http://localhost:8000/mcp`
### Airflow Plugin Mode

Install into your Airflow 3.x environment to expose MCP at `http://your-airflow:8080/mcp/v1`:

```shell
# Add to your Astro project
echo astro-airflow-mcp >> requirements.txt
```
### CLI Options
| Flag | Environment Variable | Default | Description |
|---|---|---|---|
| `--transport` | `MCP_TRANSPORT` | `stdio` | Transport mode (`stdio` or `http`) |
| `--host` | `MCP_HOST` | `localhost` | Host to bind to (HTTP mode only) |
| `--port` | `MCP_PORT` | `8000` | Port to bind to (HTTP mode only) |
| `--airflow-url` | `AIRFLOW_API_URL` | Auto-discovered or `http://localhost:8080` | Airflow webserver URL |
| `--airflow-project-dir` | `AIRFLOW_PROJECT_DIR` | `$PWD` | Astro project directory for auto-discovering the Airflow URL from `.astro/config.yaml` |
| `--auth-token` | `AIRFLOW_AUTH_TOKEN` | None | Bearer token for authentication |
| `--username` | `AIRFLOW_USERNAME` | None | Username for authentication (Airflow 3.x uses OAuth2 token exchange) |
| `--password` | `AIRFLOW_PASSWORD` | None | Password for authentication |
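Since every flag has an environment-variable counterpart and a default, resolution can be modeled as a small precedence chain. A minimal sketch; the flag-beats-env-beats-default ordering is an assumption about the CLI's behavior, not documented above:

```python
import argparse
import os


def resolve(flag_value, env_name, default, env=None):
    """Resolve one setting: CLI flag, then environment variable, then default.

    (The precedence order here is an assumption, not taken from the docs.)
    """
    env = os.environ if env is None else env
    if flag_value is not None:
        return flag_value
    if env_name in env:
        return env[env_name]
    return default


parser = argparse.ArgumentParser()
parser.add_argument("--port", type=int, default=None)
args = parser.parse_args(["--port", "9000"])  # simulated command line

# The explicit flag wins even though MCP_PORT is set in the environment.
port = resolve(args.port, "MCP_PORT", 8000, env={"MCP_PORT": "8500"})
```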
## Architecture

The server is built using FastMCP with an adapter pattern for Airflow version compatibility:

### Core Components
- **Adapters** (`adapters/`): Version-specific API implementations
  - `AirflowAdapter` (base): Abstract interface for all Airflow API operations
  - `AirflowV2Adapter`: Airflow 2.x API (`/api/v1`) with basic auth
  - `AirflowV3Adapter`: Airflow 3.x API (`/api/v2`) with OAuth2 token exchange
- **Version Detection**: Automatic detection at startup by probing API endpoints
- **Models** (`models.py`): Pydantic models for type-safe API responses
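The adapter pattern above can be sketched as follows. The class bodies are simplified stand-ins for the real implementations in `adapters/`, showing only how one interface hides the two API versions:

```python
from abc import ABC, abstractmethod


class AirflowAdapter(ABC):
    """Abstract interface all version-specific adapters implement."""

    api_base: str

    @abstractmethod
    def list_dags(self) -> str: ...


class AirflowV2Adapter(AirflowAdapter):
    api_base = "/api/v1"  # Airflow 2.x REST API (basic auth)

    def list_dags(self) -> str:
        return f"GET {self.api_base}/dags"


class AirflowV3Adapter(AirflowAdapter):
    api_base = "/api/v2"  # Airflow 3.x REST API (OAuth2 token exchange)

    def list_dags(self) -> str:
        return f"GET {self.api_base}/dags"


def pick_adapter(major_version: int) -> AirflowAdapter:
    """Select an adapter once the Airflow version has been probed at startup."""
    return AirflowV3Adapter() if major_version >= 3 else AirflowV2Adapter()
```

Tools then call the shared interface and never branch on the Airflow version themselves.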
### Version Handling Strategy
- **Major versions** (2.x vs 3.x): Adapter pattern with runtime version detection
- **Minor versions** (3.1 vs 3.2): Runtime feature detection with graceful fallbacks
- **New API parameters**: Pass-through `**kwargs` for forward compatibility
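The `**kwargs` pass-through means query parameters added in future Airflow releases can be forwarded without code changes. A minimal sketch (the function and parameter names are illustrative):

```python
def list_dag_runs(dag_id, limit=25, **kwargs):
    """Known parameters are explicit; unknown future ones pass through untouched."""
    params = {"limit": limit, **kwargs}
    return f"/dags/{dag_id}/dagRuns", params


# `state` and `order_by` are forwarded as-is, even if this wrapper
# was written before the API grew those parameters.
path, params = list_dag_runs("etl", state="failed", order_by="-start_date")
```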
### Deployment Modes
- Standalone: Independent ASGI application with HTTP/SSE transport
- Plugin: Mounted into Airflow 3.x FastAPI webserver
## Development

```shell
# Setup development environment
make install-dev

# Run tests
make test

# Run all checks
make check

# Local testing with Astro CLI
astro dev start  # Start Airflow
make run         # Run MCP server (connects to localhost:8080)
```
## Contributing

Contributions welcome! Please ensure:

- All tests pass (`make test`)
- Code passes linting (`make check`)
- prek hooks pass (`make prek`)
