MCP server for Azerbaijan’s Open Data Portal (opendata.az); search datasets, read metadata, get download links via your favourite AI app
Opendata.az MCP Server
Model Context Protocol (MCP) server that allows AI chatbots (Claude, ChatGPT, Cursor, etc.) to search, explore, and get download links for datasets from opendata.az, the Republic of Azerbaijan's Open Data Portal, directly through conversation.
Note: This server uses STDIO transport only (no public HTTP endpoint). You run it locally; your chatbot starts the server process and talks to it via standard input/output. Replace PATH_TO_OPENDATA_AZ_MCP in the configs below with the full path to your cloned repo.
Connect your chatbot to the MCP server
Configuration depends on your client. Use the format that matches your tool. Every config runs the server with uv and stdio; the only variable is the path to the project.
Quick reference: Command is uv, arguments are --directory, PATH_TO_OPENDATA_AZ_MCP, run, main.py.
Claude Desktop | Cursor | Claude Code | VS Code | Windsurf | AnythingLLM | ChatGPT | Gemini CLI | HuggingChat | IBM Bob | Kiro CLI | Kiro IDE | Le Chat (Mistral) | Mistral Vibe
Claude Desktop
Add this to your Claude Desktop config file:
- Linux: ~/.config/Claude/claude_desktop_config.json
- macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
- Windows: %APPDATA%\Claude\claude_desktop_config.json
```json
{
  "mcpServers": {
    "opendata-az": {
      "command": "uv",
      "args": [
        "--directory",
        "PATH_TO_OPENDATA_AZ_MCP",
        "run",
        "main.py"
      ]
    }
  }
}
```
Replace PATH_TO_OPENDATA_AZ_MCP with the full path to your cloned repo. Restart Claude Desktop; the tools will appear (hammer icon).
Cursor
- Open Cursor Settings.
- Search for MCP or Model Context Protocol.
- Add a server with stdio transport and:
  - Command: uv
  - Arguments: --directory PATH_TO_OPENDATA_AZ_MCP run main.py

(Use your actual path for PATH_TO_OPENDATA_AZ_MCP.)
Claude Code
If your Claude Code setup supports stdio MCP servers, add the server with command uv and args --directory, PATH_TO_OPENDATA_AZ_MCP, run, main.py.
VS Code
Add to your VS Code MCP config (run MCP: Open User Configuration from the Command Palette to open it):
- Linux: ~/.config/Code/User/mcp.json
- macOS: ~/Library/Application Support/Code/User/mcp.json
- Windows: %APPDATA%\Code\User\mcp.json
```json
{
  "servers": {
    "opendata-az": {
      "command": "uv",
      "args": [
        "--directory",
        "PATH_TO_OPENDATA_AZ_MCP",
        "run",
        "main.py"
      ]
    }
  }
}
```
Windsurf
Add to ~/.codeium/windsurf/mcp_config.json (Windows: %USERPROFILE%\.codeium\windsurf\mcp_config.json):
```json
{
  "mcpServers": {
    "opendata-az": {
      "command": "uv",
      "args": [
        "--directory",
        "PATH_TO_OPENDATA_AZ_MCP",
        "run",
        "main.py"
      ]
    }
  }
}
```
AnythingLLM
If AnythingLLM supports stdio (command + args), use command uv and args: --directory, PATH_TO_OPENDATA_AZ_MCP, run, main.py. Check AnythingLLM MCP documentation for the exact schema.
ChatGPT
Custom connectors are available on paid ChatGPT plans (Plus, Pro, Team, Enterprise).
ChatGPT connectors typically use a URL. This server does not expose an HTTP endpoint by default. To use it with ChatGPT you would need to run the server with HTTP transport (see "Run locally" below for code) and expose it at a URL, or use a bridge that runs the stdio server and exposes HTTP.
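One way such a bridge works is by spawning the stdio server and relaying newline-delimited JSON-RPC messages between it and an HTTP endpoint. The sketch below shows only the relay step, against a stand-in echo process rather than the real server (a real bridge keeps the process alive and completes the MCP initialize handshake before calling any tools):

```python
import json
import subprocess
import sys

def bridge_call(server_cmd, request):
    """Send one newline-delimited JSON-RPC message to a stdio process's
    stdin and read one message back from its stdout. Illustrative only:
    a real bridge keeps the process alive across requests and performs
    the MCP initialize handshake first."""
    proc = subprocess.Popen(
        server_cmd, stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True
    )
    proc.stdin.write(json.dumps(request) + "\n")
    proc.stdin.flush()
    reply = json.loads(proc.stdout.readline())
    proc.terminate()
    return reply

# Demo with a stand-in process that echoes one JSON line back; for the
# real server you would pass
# ["uv", "--directory", "PATH_TO_OPENDATA_AZ_MCP", "run", "main.py"].
echo_cmd = [sys.executable, "-c",
            "import sys; sys.stdout.write(sys.stdin.readline())"]
reply = bridge_call(echo_cmd, {"jsonrpc": "2.0", "id": 1, "method": "ping"})
```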
Gemini CLI
Add to ~/.gemini/settings.json (Windows: %USERPROFILE%\.gemini\settings.json). If Gemini supports stdio:
```json
{
  "mcpServers": {
    "opendata-az": {
      "command": "uv",
      "args": [
        "--directory",
        "PATH_TO_OPENDATA_AZ_MCP",
        "run",
        "main.py"
      ]
    }
  }
}
```
HuggingChat
If HuggingChat allows adding an MCP server by command + args, use command uv and args --directory, PATH_TO_OPENDATA_AZ_MCP, run, main.py. If it only accepts a URL, this server would need to be run with HTTP (see "Run locally").
IBM Bob
Edit global or project MCP config and add:
```json
{
  "mcpServers": {
    "opendata-az": {
      "command": "uv",
      "args": [
        "--directory",
        "PATH_TO_OPENDATA_AZ_MCP",
        "run",
        "main.py"
      ]
    }
  }
}
```
Kiro CLI
Add to ~/.kiro/settings/mcp.json (Windows: %USERPROFILE%\.kiro\settings\mcp.json):
```json
{
  "mcpServers": {
    "opendata-az": {
      "command": "uv",
      "args": [
        "--directory",
        "PATH_TO_OPENDATA_AZ_MCP",
        "run",
        "main.py"
      ]
    }
  }
}
```
Kiro IDE
Use .kiro/settings/mcp.json in your workspace or the global Kiro config; same structure as Kiro CLI above.
Le Chat (Mistral)
If Le Chat supports custom MCP with command/args, use command uv and args --directory, PATH_TO_OPENDATA_AZ_MCP, run, main.py. If it only supports a connector URL, you would need an HTTP endpoint (see "Run locally").
Mistral Vibe CLI
Edit your Vibe config (e.g. ~/.vibe/config.toml, Windows: %USERPROFILE%\.vibe\config.toml). If Vibe supports stdio:
```toml
[[mcp_servers]]
name = "opendata-az"
command = "uv"
args = ["--directory", "PATH_TO_OPENDATA_AZ_MCP", "run", "main.py"]
```
See the Mistral Vibe documentation for MCP configuration details.
Summary: All configs use uv as the command and --directory PATH_TO_OPENDATA_AZ_MCP run main.py as the arguments. No API key is required; the server only exposes read-only tools.
Run locally
Prerequisites
- uv (recommended) or Python 3.13+ with pip
1. Clone and install
```shell
git clone https://github.com/your-org/opendata-az-mcp.git
cd opendata-az-mcp
uv sync
```
2. Optional environment
Copy the example env and adjust if needed:
```shell
cp .env.example .env
```
Supported variable:
- LOG_LEVEL – Python logging level (default: INFO). Values: DEBUG, INFO, WARNING, ERROR, CRITICAL.
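A typical way such a variable is applied is to map the string to a logging constant and fall back to INFO on anything unrecognized. This is a sketch of that pattern, not necessarily the exact wiring inside main.py:

```python
import logging
import os

# Accepted level names, matching the values documented above.
_VALID = {"DEBUG", "INFO", "WARNING", "ERROR", "CRITICAL"}

def resolve_log_level(raw):
    """Map a LOG_LEVEL string to a logging constant; default to INFO
    when the variable is unset or misspelled."""
    name = (raw or "INFO").upper()
    return getattr(logging, name) if name in _VALID else logging.INFO

# Apply the environment variable at startup.
logging.basicConfig(level=resolve_log_level(os.getenv("LOG_LEVEL")))
```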
3. Run the server
The server runs over stdio and waits for MCP client messages. In a terminal:
```shell
uv run main.py
```
Leave this running; connect with MCP Inspector or your chatbot using the configs above (with the path set to this directory).
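To make "waits for MCP client messages" concrete: over stdio, a client writes newline-delimited JSON-RPC to the server's stdin, starting with an initialize request. The protocol version and client name below are illustrative values, not something this project mandates:

```python
import json

# The first message an MCP client writes to the server's stdin.
initialize = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}

# stdio MCP messages are newline-delimited JSON.
wire = json.dumps(initialize) + "\n"
print(wire, end="")
```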
4. Docker (optional)
If you prefer Docker, the image runs the same stdio server. Note: most MCP clients expect to start the process themselves, so Docker is mainly for consistency or CI. To build and run:
```shell
docker compose up -d
```
The server process runs inside the container; for local chatbot use, the stdio config pointing at a local clone (with uv run main.py) is usually simpler.
Transport support
This server uses the official Python SDK for MCP and supports STDIO transport only. There is no HTTP or SSE endpoint unless you add one yourself.
Available tools
The server exposes three read-only tools for the opendata.az CKAN API.
Datasets
- search_datasets – Search the opendata.az catalog by keyword. Returns dataset titles, IDs, short descriptions, organization, and tags. Parameters: query (required), limit (optional, default: 10)
- get_dataset_info – Get detailed metadata for a dataset: organization, full description, tags, and the list of attached resources (files) with their IDs and formats. Parameters: dataset_id (required)
- get_resource_info – Get format, size, and the direct download URL for a specific resource (file). Use this so the user or LLM can download the file; do not load large files into context. Parameters: resource_id (required)
Suggested workflow: 1) search_datasets to find datasets, 2) get_dataset_info to see resources, 3) get_resource_info to get the download URL for a file.
Architecture note
This project is a thin middleware between the LLM and the opendata.az CKAN API. It uses only the public API at https://opendata.az/api/3/action/ (package_search, package_show, resource_show). No heavy data parsing, no analytics or tracking—just async HTTP with a 15-second timeout.
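Since the server is a thin layer over the CKAN action API, the URLs it requests are easy to sketch. The helper below builds such URLs using standard CKAN parameter names (q, rows, id); it is an illustration of the API shape, not the server's actual client code:

```python
from urllib.parse import urlencode

BASE = "https://opendata.az/api/3/action/"

def ckan_url(action, **params):
    """Build a CKAN action-API URL of the kind this server calls
    (package_search, package_show, resource_show)."""
    return BASE + action + ("?" + urlencode(params) if params else "")

# Search the catalog for five datasets matching "education".
print(ckan_url("package_search", q="education", rows="5"))
# -> https://opendata.az/api/3/action/package_search?q=education&rows=5
```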
Tests
Automated tests (pytest)
```shell
uv run pytest
uv run pytest -v
```
Interactive testing (MCP Inspector)
- Install Node.js and ensure npx is available.
- Start the MCP server in one terminal: uv run main.py.
- In another terminal, start the Inspector and connect via stdio with:
  - Command: uv
  - Arguments: --directory <path-to-this-repo> run main.py
Or, if your Inspector can spawn the process: npx @modelcontextprotocol/inspector and configure it to run uv --directory <path> run main.py.
Contributing
Contributions are welcome. Please:
- Keep changes small (one feature or fix per PR).
- Run the tests and linter locally before submitting.
Linting and formatting
This project uses Ruff for linting and formatting:
```shell
uv run ruff check --fix && uv run ruff format
```
Optional type checking with ty:
```shell
uv run ty check
```
License
This project is licensed under the MIT License — see the LICENSE file for details.