
MCP server by Durannd

Created 4/29/2026

Vertex AI MCP Server

License: MIT

A lightweight, zero-config MCP server to offload massive repository analysis from your local IDE to Google Cloud's Vertex AI (Gemini).

This server acts as a bridge, allowing any MCP-compatible client (like Gemini CLI, Cursor, Antigravity, etc.) to securely use your enterprise Google Cloud account for heavy-lifting AI tasks, powered by Gemini models.

💡 Key Feature: Zero-Token Offloading

The primary goal of this project is to enable "Zero-Token Offloading". Instead of your local tools reading and processing entire repositories—consuming your personal API keys and hitting local context window limits—this server does the work:

  1. It reads the repository from your local disk, respecting .gitignore and other ignore files.
  2. It intelligently packages the code along with your prompt.
  3. It sends the entire workload to Vertex AI, using your secure, enterprise-grade gcloud credentials.

This allows you to leverage the massive context windows of models like Gemini 3.1 Pro (2M+ tokens) and use your company's Google Cloud credits instead of paying out-of-pocket.


🛠️ Prerequisites

  1. Node.js (v18 or higher)
  2. Google Cloud CLI (gcloud) installed on your machine.
  3. A Google Cloud Project with the Vertex AI API enabled.

🚀 Configuration (Choose your method)

This server is designed to be Zero-Config. It will try multiple ways to find your Google Cloud Project ID.

1. Automatic (Recommended)

Simply authenticate your terminal using the Google Cloud CLI. The server will automatically detect your project ID from these credentials.

gcloud auth application-default login

2. Global Config (For multiple projects)

If you work across multiple projects or want a persistent setup, create a file at ~/.vertex-mcp.json (in your user's home directory):

{
  "GOOGLE_CLOUD_PROJECT_ID": "your-project-id",
  "GOOGLE_CLOUD_LOCATION": "us-central1"
}

3. Environment Variables

You can also set the project ID directly in your MCP client's settings (e.g., in Gemini CLI's settings.json or Cursor's configuration).
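For example, most MCP clients accept an env block alongside the server command. The exact schema depends on your client; this snippet follows the common mcpServers shape, and the path shown is a placeholder:

```json
{
  "mcpServers": {
    "vertex-ai": {
      "command": "node",
      "args": ["/absolute/path/to/vertex-ai-mcp/dist/index.js"],
      "env": {
        "GOOGLE_CLOUD_PROJECT_ID": "your-project-id",
        "GOOGLE_CLOUD_LOCATION": "us-central1"
      }
    }
  }
}
```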


📦 Installation & Build

  1. Install dependencies:
    npm install
    
  2. Build the server:
    npm run build
    
    This will compile the TypeScript code into the dist/ folder.

🔌 Connecting to a Client

This MCP server communicates using Stdio (Standard I/O), not a network port. Your client application will run the server as a background process.

Example: Connecting the Gemini CLI

To add this server to the Gemini CLI, run the following command from the root of this project directory. This is the recommended and most robust method.

gemini mcp add vertex-ai "node C:\Users\nicae\OneDrive\Documentos\Projetos\vertex-ai-mcp\dist\index.js"

This command tells Gemini to use "vertex-ai" as the name for this server and specifies how to start it.

Example: Connecting Other Clients (e.g., Antigravity)

Some clients may require an absolute path to the server's entry point.

  1. Get the absolute path to the compiled index.js file inside the dist folder.

  2. Add the server configuration to your client. The example below is for Antigravity's mcp_config.json:

    {
      "mcpServers": {
        "vertex-ai": {
          "command": "node",
          "args": [
        "C:\\path\\to\\your\\project\\vertex-ai-mcp\\dist\\index.js"
          ]
        }
      }
    }
    
  3. Restart your client completely for the changes to take effect.


🐳 Docker Support

You can also run the server inside a Docker container.

1. Build the Image

docker build -t vertex-ai-mcp .

2. Run with GCP Credentials

To allow the container to access your Google Cloud credentials, mount your local ADC file as a volume.

Windows (PowerShell):

docker run -i --rm `
  -v "${env:APPDATA}\gcloud\application_default_credentials.json:/.config/gcloud/application_default_credentials.json" `
  vertex-ai-mcp

Linux/macOS:

docker run -i --rm \
  -v "$HOME/.config/gcloud/application_default_credentials.json:/.config/gcloud/application_default_credentials.json" \
  vertex-ai-mcp

🛠️ Available Tools

Once connected, your client will have access to the following tools:

  • ask_vertex_agent: Sends a generic prompt to a Vertex AI model.
  • vertex_analyze_repo: Reads an entire local directory, packages its content, and asks Vertex AI to analyze it based on your prompt.
  • vertex_analyze_ui_screenshots: Reads a local folder of images and asks Vertex AI to perform a visual analysis.
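Under the hood, a client invokes these tools with an MCP `tools/call` request over stdio. The request below is illustrative: the method name comes from the MCP specification, but the argument names (`path`, `prompt`) are assumptions about this server's tool schema.

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "vertex_analyze_repo",
    "arguments": {
      "path": "/absolute/path/to/your/repo",
      "prompt": "Summarize the architecture of this codebase."
    }
  }
}
```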