MCP Servers

A comprehensive directory of Model Context Protocol servers, frameworks, SDKs, and templates.

Azure AI Travel Agents

The AI Travel Agents is a robust enterprise application (hosted on Azure Container Apps) that leverages MCP and multiple LlamaIndex AI agents to enhance travel agency operations.

Created on 3/14/2025
Updated 15 days ago

Azure AI Travel Agents with LlamaIndex.TS and MCP

Join the Azure AI Foundry Community Discord • Join the Azure AI Foundry Developer Forum • Announcement blog post

:star: To stay updated and get notified about changes, star this repo on GitHub!


Overview • Architecture • Features • Preview the application locally • Cost estimation • Join the Community

Animation showing the chat app in action

Overview

The AI Travel Agents is a robust enterprise application that leverages multiple AI agents to enhance travel agency operations. The application demonstrates how LlamaIndex.TS orchestrates multiple AI agents to assist employees in handling customer queries, providing destination recommendations, and planning itineraries. Multiple MCP (Model Context Protocol) servers, built with Python, Node.js, Java and .NET, are used to provide various tools and services to the agents, enabling them to work together seamlessly.

| Agent Name                   | Purpose                                                                                                       |
| ---------------------------- | ------------------------------------------------------------------------------------------------------------- |
| Customer Query Understanding | Extracts key preferences from customer inquiries.                                                             |
| Destination Recommendation   | Suggests destinations based on customer preferences.                                                          |
| Itinerary Planning           | Creates a detailed itinerary and travel plan.                                                                 |
| Code Evaluation              | Executes custom logic and scripts when needed.                                                                |
| Model Inference              | Runs a custom LLM using ONNX and vLLM on Azure Container Apps' serverless GPU for high-performance inference. |
| Web Search                   | Uses Grounding with Bing Search to fetch live travel data.                                                    |
| Echo Ping                    | Echoes back any received input (used as an MCP server example).                                               |
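To make the orchestration idea concrete, here is a deliberately simplified sketch of how a workflow might route a customer query to one of the agents listed above. The keyword rules are invented purely for illustration; the actual sample uses LlamaIndex.TS agents, not this logic.

```typescript
// Illustrative only: a toy router that matches a customer query to one of
// the agents named in the table above. The keyword heuristics are made up
// for this example and are NOT taken from the repository's code.

type AgentName =
  | "Customer Query Understanding"
  | "Destination Recommendation"
  | "Itinerary Planning"
  | "Web Search";

function routeQuery(query: string): AgentName {
  const q = query.toLowerCase();
  if (q.includes("itinerary") || q.includes("plan")) return "Itinerary Planning";
  if (q.includes("where") || q.includes("recommend")) return "Destination Recommendation";
  if (q.includes("current") || q.includes("latest")) return "Web Search";
  // Default: extract the customer's preferences first, then hand off downstream.
  return "Customer Query Understanding";
}
```

In the real application this dispatch is handled by the LlamaIndex.TS agent workflow, with each agent calling out to its MCP server for tools.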

High-Level Architecture

The architecture of the AI Travel Agents application is designed to be modular and scalable:

  • All components are containerized using Docker so that they can be easily deployed and managed by Azure Container Apps.
  • All agent tools are available as MCP (Model Context Protocol) servers and are called by the MCP clients.
  • MCP servers are implemented independently using various technologies, such as Python, Node.js, Java, and .NET.
  • The Agent Workflow Service orchestrates the interaction between the agents and MCP clients, allowing them to work together seamlessly.
  • The Aspire Dashboard is used to monitor the application, providing insights into the performance and behavior of the agents (through the OpenTelemetry integration).
Application architecture
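Because every agent tool is exposed as an MCP server, clients and servers exchange JSON-RPC 2.0 messages. The sketch below shows the `tools/call` request/response shape for an echo tool like the echo-ping example server; the tool name `echo` and the argument key `message` are illustrative assumptions, not taken from the repository's code.

```typescript
// Sketch of the JSON-RPC 2.0 messages exchanged when an MCP client invokes
// a tool on a server. Message shapes follow the MCP specification; the tool
// name "echo" and argument key "message" are assumptions for illustration.

interface ToolCallRequest {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: { name: string; arguments: Record<string, unknown> };
}

interface ToolCallResult {
  jsonrpc: "2.0";
  id: number;
  result: { content: { type: "text"; text: string }[] };
}

// What an MCP client would send to invoke the echo tool.
function buildEchoRequest(id: number, message: string): ToolCallRequest {
  return {
    jsonrpc: "2.0",
    id,
    method: "tools/call",
    params: { name: "echo", arguments: { message } },
  };
}

// What an echo server would send back: the input wrapped as text content.
function echoResponse(req: ToolCallRequest): ToolCallResult {
  return {
    jsonrpc: "2.0",
    id: req.id,
    result: {
      content: [{ type: "text", text: String(req.params.arguments["message"]) }],
    },
  };
}
```

The transport (stdio, HTTP/SSE, etc.) varies per server, but the message shapes stay the same, which is what lets Python, Node.js, Java, and .NET servers interoperate with the same clients.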

[!NOTE] New to the Model Context Protocol (MCP)? Check out our free MCP for Beginners guide.

Features

Prerequisites

Ensure you have the following installed before running the application:

Preview the application locally

To run and preview the application locally, follow these steps:

  1. Clone the repository:

     Using HTTPS:

     ```shell
     git clone https://github.com/Azure-Samples/azure-ai-travel-agents.git
     ```

     Using SSH:

     ```shell
     git clone git@github.com:Azure-Samples/azure-ai-travel-agents.git
     ```

     Using GitHub CLI:

     ```shell
     gh repo clone Azure-Samples/azure-ai-travel-agents
     ```

  2. Navigate to the cloned repository:

     ```shell
     cd azure-ai-travel-agents
     ```

  3. Log in to your Azure account:

     ```shell
     azd auth login
     ```

     For GitHub Codespaces users: if the previous command fails, try:

     ```shell
     azd auth login --use-device-code
     ```

  4. Provision the Azure resources:

     ```shell
     azd provision
     ```

     When asked, enter a name that will be used for the resource group. Depending on the region you choose and the available resources and quotas, you may encounter provisioning errors. If this happens, please read our troubleshooting guide in the Advanced Setup documentation.

  5. Open a new terminal and run the following command to start the API:

     ```shell
     npm start --prefix=src/api
     ```

  6. Open a new terminal and run the following command to start the UI:

     ```shell
     npm start --prefix=src/ui
     ```

  7. Once all services are up and running, you can:
     • Access the UI at http://localhost:4200.
     • View the traces via the Aspire Dashboard at http://localhost:18888.
       • On the Structured tab you'll see the logging messages from the tool-echo-ping and api services. The Traces tab will show the traces across the services, such as the call from api to echo-agent.

UI Screenshot

[!IMPORTANT] If you encounter issues when starting either the API or the UI, try running azd hooks run postprovision to force the post-provisioning hooks to run. This works around an issue where, in some cases, azd provision does not execute the post-provisioning hooks automatically the first time you run it.

Use GitHub Codespaces

You can run this project directly in your browser by using GitHub Codespaces, which will open a web-based VS Code:

Open in GitHub Codespaces

Use a VSCode dev container

A similar option to Codespaces is VS Code Dev Containers, which opens the project in your local VS Code instance using the Dev Containers extension.

You will also need to have Docker installed on your machine to run the container.

Open in Dev Containers

Cost estimation

Pricing varies per region and usage, so it isn't possible to predict exact costs for your usage. However, you can use the Azure pricing calculator for the resources below to get an estimate.

  • Azure Container Apps: Consumption plan, Free for the first 2M executions. Pricing per execution and memory used. Pricing
  • Azure Container Registry: Free for the first 2GB of storage. Pricing per GB stored and per GB data transferred. Pricing
  • Azure OpenAI: Standard tier, GPT model. Pricing per 1K tokens used, and at least 1K tokens are used per query. Pricing
  • Azure Monitor: Free for the first 5GB of data ingested. Pricing per GB ingested after that. Pricing
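As a back-of-the-envelope illustration of the Azure OpenAI line item above, the sketch below computes a monthly token cost from usage figures. The per-1K-token price and the usage numbers are placeholder assumptions, not real Azure rates; use the Azure pricing calculator for actual numbers.

```typescript
// Back-of-the-envelope estimate for the Azure OpenAI cost line above.
// All prices and usage figures here are PLACEHOLDER ASSUMPTIONS, not
// actual Azure rates -- consult the Azure pricing calculator instead.

function estimateMonthlyTokenCost(
  queriesPerDay: number,
  avgTokensPerQuery: number, // the list above notes at least 1K tokens per query
  pricePer1kTokens: number,  // USD per 1K tokens, assumed
): number {
  const tokensPerMonth = queriesPerDay * 30 * avgTokensPerQuery;
  return (tokensPerMonth / 1000) * pricePer1kTokens;
}

// e.g. 200 queries/day at 1,500 tokens each with an assumed $0.002 per 1K tokens
// works out to 9M tokens/month, i.e. $18/month under these assumptions.
const exampleMonthlyCost = estimateMonthlyTokenCost(200, 1500, 0.002);
```

Remember that Container Apps, Container Registry, and Monitor costs accrue on top of this, per the list above.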

⚠️ To avoid unnecessary costs, remember to take down your app if it's no longer in use, either by deleting the resource group in the Portal or running azd down --purge (see Clean up).

Deploy the sample

  1. Open a terminal and navigate to the root of the project.
  2. Authenticate with Azure by running azd auth login.
  3. Run azd up to deploy the application to Azure. This will provision Azure resources, build and deploy all the containers, and set up the necessary configurations.
    • You will be prompted to select a base location for the resources. If you're unsure of which location to choose, select swedencentral.
    • By default, the OpenAI resource will be deployed to swedencentral. You can set a different location with azd env set AZURE_LOCATION <location>. Currently only a short list of locations is accepted. That location list is based on the OpenAI model availability table and may become outdated as availability changes.

The deployment process will take a few minutes. Once it's done, you'll see the URL of the web app in the terminal.

Screenshot of the azd up command result

You can now open the web app in your browser and start chatting with the bot.

Clean up

To clean up all the Azure resources created by this sample:

  1. Run azd down --purge
  2. When asked if you are sure you want to continue, enter y

The resource group and all the resources will be deleted.

Advanced Setup

To run the application in a more advanced local setup or deploy to Azure, please refer to the troubleshooting guide in the Advanced Setup documentation. This includes setting up the Azure Container Apps environment, using local LLM providers, configuring the services, and deploying the application to Azure.

Contributing

We welcome contributions to the AI Travel Agents project! If you have suggestions, bug fixes, or new features, please feel free to submit a pull request. For more information on contributing, please refer to the CONTRIBUTING.md file.

Join the Community

We encourage you to join our Azure AI Foundry Developer Community to share your experiences, ask questions, and get support:

Join us on Discord