# Nano Banana MCP Server 🍌
A production-ready Model Context Protocol (MCP) server for AI-powered image generation using Google's Gemini image models — fast iteration, high-resolution output, character consistency, Google Search grounding, and thinking mode.
## ✨ Features

### Three-Tier Model Support
| Model | ID | Speed | 1K / 2K / 4K | Best For |
|-------|----|-------|--------------|----------|
| 🍌 Nano Banana 2 | gemini-3.1-flash-image-preview | 4–6s | $0.067 / $0.101 / $0.151 | Default — object fidelity, grounding, thinking, 14 ARs |
| 💎 Nano Banana Pro | gemini-3-pro-image-preview | 10–20s | $0.067 / $0.134 / $0.240 | Character consistency, max lighting/texture fidelity, 4K |
| ⚡ Nano Banana | gemini-2.5-flash-image | 10–15s | $0.039 flat | Budget drafts, high-volume workflows |
0.5K draft tier available on Nano Banana 2 at $0.045/image. 50% batch discount available on all models.
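The per-image prices above compose simply; here is a quick cost-estimation sketch with the prices hard-coded from the table (they are not fetched from any API, so update the dictionary if pricing changes):

```python
# Per-image prices (USD) transcribed from the tier table above.
PRICES = {
    "nano-banana-2": {"0.5K": 0.045, "1K": 0.067, "2K": 0.101, "4K": 0.151},
    "nano-banana-pro": {"1K": 0.067, "2K": 0.134, "4K": 0.240},
    "nano-banana": {"1K": 0.039, "2K": 0.039, "4K": 0.039},  # flat rate
}

def estimate_cost(model: str, resolution: str, count: int, batch: bool = False) -> float:
    """Estimate total USD cost; batch jobs get the 50% discount."""
    per_image = PRICES[model][resolution]
    discount = 0.5 if batch else 1.0
    return round(count * per_image * discount, 2)

# Example: 100 storyboard frames at 2K on the default model, batched.
print(estimate_cost("nano-banana-2", "2K", 100, batch=True))  # 5.05
```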
### Nano Banana 2 — Default Model (`gemini-3.1-flash-image-preview`)

The right choice for most tasks: 3–5× faster than Pro, with exclusive features:
- 🎁 Object Fidelity — reproduce products, logos, and brand assets with precision across scenes; up to 10 object reference images
- 🖼️ Photo Editing with Identity Preservation — change background, outfit, or setting while keeping a person's exact features
- 🔍 Image Search Grounding — retrieves real-world visual references from Google during generation; improves accuracy for landmarks, real people, and brand logos (NB2-exclusive)
- 🧠 Thinking Mode — `thinking_level="low"` or `"high"` for complex layouts, text-heavy images, and multi-element compositions (NB2-exclusive)
- 📝 Text in Images — strong short-text rendering; use JSON-structured prompts for precise infographic copy and labels
- 🌍 Multilingual Text — generate and translate text within images across 8+ languages
- 📐 14 Aspect Ratios — including extreme formats: 4:1 banners, 1:4 strips, 8:1 ultra-wide, 1:8 ultra-tall
- 📏 0.5K → 4K Resolution — four output tiers from fast drafts to print quality
### Nano Banana Pro — When to Use It

Pro's strengths are character consistency across a cast of people or characters plus maximum lighting/texture fidelity:
- 👤 Character Consistency — up to 5 characters maintained across a series with higher fidelity; preferred for protagonist-driven storyboards and campaign mascots
- 💡 Complex lighting and materials — intricate shadow relationships, reflective surfaces, photorealistic skin
- 🖨️ 4K print-quality output — highest resolution for physical production
- 14 total references (5 character + 10 object) combined
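The reference limits differ per model: NB2 takes up to 10 object references, while Pro takes 5 character plus 10 object references, capped at 14 combined. A small illustrative validator with the limits transcribed from this README (not from the server's code; in particular, treating NB2 as object-references-only is an assumption based on the counts quoted above):

```python
# Reference-image limits per model, as documented in this README.
LIMITS = {
    "gemini-3.1-flash-image-preview": {"character": 0, "object": 10, "total": 10},
    "gemini-3-pro-image-preview": {"character": 5, "object": 10, "total": 14},
}

def check_references(model: str, character_refs: int, object_refs: int) -> list[str]:
    """Return a list of limit violations (empty list means the request fits)."""
    lim = LIMITS[model]
    errors = []
    if character_refs > lim["character"]:
        errors.append(f"too many character refs: {character_refs} > {lim['character']}")
    if object_refs > lim["object"]:
        errors.append(f"too many object refs: {object_refs} > {lim['object']}")
    if character_refs + object_refs > lim["total"]:
        errors.append(f"too many refs overall: {character_refs + object_refs} > {lim['total']}")
    return errors
```

Note that the combined cap binds before the per-type caps do: 5 character + 10 object is 15 references, one over Pro's 14-reference ceiling.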
### Core Features
- 🎨 Generate or Edit — create from scratch or modify existing images with natural language
- 📁 Named References — configure brand assets once (`REFERENCE_REGISTRY`), reference by name forever
- 🛡️ Production Ready — asset deduplication, error handling, structured logging
- ⚡ MCP Native — works with Claude.ai, Claude iOS, Claude Desktop, Claude Code, Cursor, and any MCP client
## 🚀 Quick Start
📚 New to nano banana? Check out the Complete Usage Guide with detailed examples and best practices!
⚡ Need quick reference? See the Quick Reference Card for common patterns.
📋 Want code examples? Browse Practical Examples for copy-paste ready code.
### Prerequisites
- Google Gemini API Key - Get one free here
- Python 3.11+ (for development only)
### Installation

**Option 1: From MCP Registry (Recommended)**

This server is available in the Model Context Protocol Registry. Install it using your MCP client.

```
mcp-name: io.github.zhongweili/nanobanana-mcp-server
```
**Option 2: Using uvx**

```bash
uvx nanobanana-mcp-server@latest
```
**Option 3: Using pip**

```bash
pip install nanobanana-mcp-server
```
## 🔧 Configuration

### Claude Desktop

Add to your `claude_desktop_config.json`:

```json
{
  "mcpServers": {
    "nanobanana": {
      "command": "uvx",
      "args": ["nanobanana-mcp-server@latest"],
      "env": {
        "GEMINI_API_KEY": "your-gemini-api-key-here"
      }
    }
  }
}
```
Configuration file locations:

- macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- Windows: `%APPDATA%\Claude\claude_desktop_config.json`
### Claude Code (VS Code Extension)

Install and configure in VS Code:

- Install the Claude Code extension
- Open Command Palette (`Cmd/Ctrl + Shift + P`)
- Run "Claude Code: Add MCP Server"
- Configure:

```json
{
  "name": "nanobanana",
  "command": "uvx",
  "args": ["nanobanana-mcp-server@latest"],
  "env": {
    "GEMINI_API_KEY": "your-gemini-api-key-here"
  }
}
```
### Cursor

Add to Cursor's MCP configuration:

```json
{
  "mcpServers": {
    "nanobanana": {
      "command": "uvx",
      "args": ["nanobanana-mcp-server@latest"],
      "env": {
        "GEMINI_API_KEY": "your-gemini-api-key-here"
      }
    }
  }
}
```
### Continue.dev (VS Code/JetBrains)

Add to your `config.json`:

```json
{
  "mcpServers": [
    {
      "name": "nanobanana",
      "command": "uvx",
      "args": ["nanobanana-mcp-server@latest"],
      "env": {
        "GEMINI_API_KEY": "your-gemini-api-key-here"
      }
    }
  ]
}
```
### Open WebUI

Configure in Open WebUI settings:

```json
{
  "mcp_servers": {
    "nanobanana": {
      "command": ["uvx", "nanobanana-mcp-server@latest"],
      "env": {
        "GEMINI_API_KEY": "your-gemini-api-key-here"
      }
    }
  }
}
```
### Gemini CLI / Generic MCP Client

```bash
# Set environment variable
export GEMINI_API_KEY="your-gemini-api-key-here"

# Run server in stdio mode
uvx nanobanana-mcp-server@latest

# Or with pip installation
python -m nanobanana_mcp_server.server
```
## 🌐 Self-Hosting with Cloudflare Tunnel (Claude.ai + Claude iOS)
Want to run Nano Banana on your home computer and access it from Claude.ai (web) and Claude iOS — from anywhere?
Use a free Cloudflare Tunnel — no port forwarding, no static IP, no VPS required.
```
Claude.ai / Claude iOS
        │ HTTPS
        ▼
Cloudflare Tunnel (free)
        │ localhost:9000
        ▼
Your Home Computer
```
Quick setup:

```bash
# 1. Start server in HTTP mode
FASTMCP_TRANSPORT=http uvicorn asgi:app --host 0.0.0.0 --port 9000

# 2. Install cloudflared and create tunnel
brew install cloudflare/cloudflare/cloudflared  # macOS
cloudflared tunnel login
cloudflared tunnel create nanobanana
cloudflared tunnel route dns nanobanana mcp.yourdomain.com
cloudflared tunnel run nanobanana

# 3. Add to Claude.ai: Settings → Integrations → Add Integration
#    URL: https://mcp.yourdomain.com/mcp
```
Full step-by-step guide: docs/CLOUDFLARE_TUNNEL.md
Covers: macOS launchd, Linux systemd, Windows NSSM, Cloudflare Access auth, and troubleshooting.
## 🛠️ Claude Code Skills
If you use Claude Code, install the included workflow skills to get smarter, context-aware image generation — no need to explain the server every session.
```bash
# Install all skills
for skill in image-generation-workflow ai-image-model-selector image-reference-workflow; do
  mkdir -p ~/.claude/skills/$skill
  cp skills/${skill}.md ~/.claude/skills/$skill/SKILL.md
done
```
| Skill | Triggers |
|-------|---------|
| image-generation-workflow | "generate image", "create picture" |
| ai-image-model-selector | "which model", "best for character consistency" |
| image-reference-workflow | "use this photo", "reference image" |
See skills/README.md for customization instructions.
## ⚙️ Environment Variables

Configuration is via environment variables; only the API key is required:

```bash
# Required
GEMINI_API_KEY=your-gemini-api-key-here

# Optional
IMAGE_OUTPUT_DIR=/path/to/image/directory  # Default: ~/nanobanana-images
FASTMCP_TRANSPORT=http                     # "stdio" (local) or "http" (remote/tunnel)
FASTMCP_HOST=0.0.0.0                       # Bind address for HTTP mode
FASTMCP_PORT=9000                          # Port for HTTP mode
LOG_LEVEL=INFO                             # DEBUG, INFO, WARNING, ERROR
LOG_FORMAT=standard                        # standard, json, detailed
```
See .env.example for all options with descriptions.
## 🐛 Troubleshooting

### Common Issues
"GEMINI_API_KEY not set"
- Add your API key to the MCP server configuration in your client
- Get a free API key at Google AI Studio
"Server failed to start"
- Ensure you're using the latest version:
uvx nanobanana-mcp-server@latest - Check that your client supports MCP (Claude Desktop 0.10.0+)
"Permission denied" errors
- The server creates images in
~/nanobanana-imagesby default - Ensure write permissions to your home directory
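To diagnose the permission error above, a quick illustrative check that the output directory can be created and written to (this is a standalone diagnostic, not part of the server):

```python
import tempfile
from pathlib import Path

def check_output_dir(path: Path) -> bool:
    """Return True if we can create the directory and write a file inside it."""
    try:
        path.mkdir(parents=True, exist_ok=True)
        with tempfile.NamedTemporaryFile(dir=path):
            pass  # creating and deleting a temp file proves writability
        return True
    except OSError:
        return False

print(check_output_dir(Path.home() / "nanobanana-images"))
```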
### Development Setup

For local development:

```bash
# Clone repository
git clone https://github.com/ryaker/nanobanana-mcp-server.git
cd nanobanana-mcp-server

# Install with uv
uv sync

# Set environment
export GEMINI_API_KEY=your-api-key-here

# Run locally
uv run python -m nanobanana_mcp_server.server
```
## 📄 License
MIT License - see LICENSE for details.
## 🆘 Support
- Issues: GitHub Issues
- Discussions: GitHub Discussions