Civitai MCP Server

MCP server by Yi-luo-hua · Created 2/25/2026
A Model Context Protocol (MCP) server that empowers AI assistants (like Claude, Cursor, Cline) to natively interact with the Civitai API.

With this server, your AI can search for models, browse tags, read model details, get preview images, and even generate ready-to-run download commands for your local ComfyUI or WebUI setup—directly from the chat interface!

Key Capabilities

  • Smart Search & Browse: Full-text search with automatic fallbacks. Filter by creator, type (LORA, Checkpoint, etc.), base model, and tags.
  • Deep Model Info: Access comprehensive model details including available versions, trigger words, download URLs, and file hashes.
  • Image Prompts: Retrieve example images generated by a model, complete with their positive/negative prompts and generation parameters.
  • One-Click Downloads: Get ready-to-use curl or PowerShell download commands that automatically place models into their correct ComfyUI directories.
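
As an illustration of the last capability, here is a minimal sketch of how such a download command could be assembled. The URL pattern and `token` parameter follow Civitai's public download endpoint; the function name, destination path, and filename are illustrative assumptions, not the server's actual code.

```typescript
// Hypothetical sketch (not the server's actual implementation): build a curl
// command for Civitai's public download endpoint, routing the file into a
// caller-supplied target directory.
function buildCurlCommand(versionId: number, destDir: string, filename: string): string {
  return [
    "curl -L",
    `"https://civitai.com/api/download/models/${versionId}?token=$CIVITAI_API_KEY"`,
    `-o "${destDir}/${filename}"`,
  ].join(" ");
}

// Example: a LoRA version routed into ComfyUI's loras/ folder.
console.log(buildCurlCommand(67890, "ComfyUI/models/loras", "model.safetensors"));
```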

Available Tools for AI

| Tool | Action | Example Use Case |
|------|--------|------------------|
| civitai_search_models | Search & browse models | "Find anime LoRAs for SDXL" or "Show me trending checkpoints" |
| civitai_get_model | Get model details by ID | "What are the trigger words for model 12345?" |
| civitai_get_model_version | Get specific version info | "Get details for version 67890 of this model" or "Show me versions for model 12345" |
| civitai_browse_tags | List available tags | "What are the most popular tags right now?" |
| civitai_get_model_images | Get example images | "Show me examples and prompts for model 12345" |
| civitai_get_download_info | Generate download scripts | "Give me the download command for this LoRA into my ComfyUI" |
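
To make the first tool concrete, here is a hypothetical argument object for a civitai_search_models call. The field names are guesses based on the filters this README describes (creator, type, base model, full-text query), not a documented schema.

```typescript
// Hypothetical argument shape for civitai_search_models; the field names are
// assumptions drawn from the filters described above, not a documented schema.
const searchArgs = {
  query: "anime",           // full-text search term
  types: ["LORA"],          // model type filter
  baseModel: "SDXL",        // base model filter
  username: "civitai_user", // creator filter
};

console.log(JSON.stringify(searchArgs));
```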

How to ask the AI (Usage Examples)

Once installed, you can simply ask your AI assistant:

  • "Find me some good anime SDXL LoRAs by the creator 'civitai_user'."
  • "What are the trigger words and recommended prompts for model ID 123456?"
  • "Give me the PowerShell command to download model 123456 directly into my ComfyUI."
  • "Show me some example images and their generation prompts for the latest Flux checkpoint."


Important Tips for Users & AI

Due to upstream Civitai API text-search limitations:

  • Search by username (creator name) is the most reliable way to find specific models.
  • Model names with special characters (like |) often fail in exact text search. The server employs fallback strategies, but simple keywords work best.
  • If you know a model's ID (from its URL, e.g., civitai.com/models/12345), passing the ID directly to civitai_get_model is the fastest and most accurate method.
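
Since the ID-based path is the most reliable, the ID can be pulled straight out of a model URL before calling civitai_get_model. A minimal sketch (this helper is illustrative and not part of the server):

```typescript
// Illustrative helper (not part of the server): extract the numeric model ID
// from a Civitai model URL so it can be passed to civitai_get_model.
function extractModelId(url: string): number | null {
  const match = url.match(/civitai\.com\/models\/(\d+)/);
  return match ? parseInt(match[1], 10) : null;
}

console.log(extractModelId("https://civitai.com/models/12345")); // 12345
```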

Setup & Installation

1. Install Dependencies & Build

cd mcp/civitai-mcp-server
npm install
npm run build

2. Configure your MCP Client

Add the following to your MCP client's configuration file:

{
  "mcpServers": {
    "civitai": {
      "command": "node",
      "args": ["path/to/mcp/civitai-mcp-server/dist/index.js"],
      "env": {
        "CIVITAI_API_KEY": "your_civitai_api_key_here",
        "COMFYUI_MODELS_PATH": "C:\\path\\to\\ComfyUI\\models"
      }
    }
  }
}

Environment Variables

| Variable | Required | Description |
|----------|----------|-------------|
| CIVITAI_API_KEY | Yes | Your Civitai API key (needed for downloading and NSFW access). Get it from your Civitai Account Settings. |
| COMFYUI_MODELS_PATH | No | Base path where your ComfyUI reads models. The server will automatically map downloads to checkpoints/, loras/, etc. |
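
A minimal sketch of how these variables might be consumed at startup. The variable names come from the table above; the validation and defaulting behavior shown here is an assumption, not the server's actual code.

```typescript
// Illustrative config loader: CIVITAI_API_KEY is required per the table above,
// COMFYUI_MODELS_PATH is optional. Validation behavior is an assumption.
function loadConfig(env: Record<string, string | undefined>) {
  const apiKey = env.CIVITAI_API_KEY;
  if (!apiKey) throw new Error("CIVITAI_API_KEY is required");
  // An empty models path simply leaves download commands without a base directory.
  return { apiKey, modelsPath: env.COMFYUI_MODELS_PATH ?? "" };
}

console.log(loadConfig({ CIVITAI_API_KEY: "demo-key" }));
```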

ComfyUI Folder Auto-Mapping

When generating download commands, the server intelligently routes files based on their type:

| Model Type | Target Subfolder |
|-----------|---------------|
| Checkpoint | checkpoints/ |
| LORA / LoCon | loras/ |
| TextualInversion | embeddings/ |
| Hypernetwork | hypernetworks/ |
| Controlnet | controlnet/ |
| Poses | poses/ |
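
The routing above amounts to a simple lookup. This sketch mirrors the table; the function name and the fallback for unknown types are assumptions, not the server's actual code.

```typescript
// Sketch of the type-to-subfolder routing described in the table above
// (names and fallback behavior are assumptions, not the server's code).
const FOLDER_MAP: Record<string, string> = {
  Checkpoint: "checkpoints/",
  LORA: "loras/",
  LoCon: "loras/",
  TextualInversion: "embeddings/",
  Hypernetwork: "hypernetworks/",
  Controlnet: "controlnet/",
  Poses: "poses/",
};

function targetSubfolder(modelType: string): string {
  // Unknown types fall back to the models root.
  return FOLDER_MAP[modelType] ?? "";
}

console.log(targetSubfolder("LORA")); // "loras/"
```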

Architecture

src/
├── index.ts        # Server entry, tool registration
├── constants.ts    # API config, enums, paths
├── types.ts        # TypeScript interfaces
├── formatters.ts   # Markdown output formatters
└── services/
    └── api.ts      # HTTP client, sanitization logic

License

MIT


