Ajeet Singh Raina is a former Docker Captain, Community Leader and Distinguished Arm Ambassador. He is the founder of the Collabnix blogging site and has authored more than 700 blogs on Docker, Kubernetes and cloud-native technology. He runs a community Slack of 9,800+ members and a Discord server of close to 2,600 members. You can follow him on Twitter (@ajeetsraina).

Integration of Model Context Protocol and Docker AI Agent under Docker Desktop


I recently had the opportunity to collaborate with Raveendiran RR, a Docker Community Speaker and Generative AI enthusiast, to present on this exciting topic at Cloud-Native LLMOps Day in Bengaluru. Together, we explored the transformative potential of Model Context Protocol in modern AI development workflows, sharing insights with the vibrant tech community. This blog post expands on the key concepts we discussed during our presentation.


In today’s rapidly evolving AI landscape, developers face numerous challenges when integrating AI capabilities into their applications. One of the most promising solutions to address these challenges is the Model Context Protocol (MCP), which is gaining significant traction, especially when paired with Docker’s containerization technology. In this blog post, we’ll explore what MCP is, why it matters, and how Docker’s AI Agent (Gordon) leverages this protocol to provide a seamless AI development experience.

What is Model Context Protocol (MCP)?


The Model Context Protocol (MCP) is a standardized communication protocol that facilitates seamless integration between Large Language Models (LLMs) and external tools. As AI applications become increasingly sophisticated, they need to interact with various data sources, APIs, and software components. MCP provides a consistent way for these systems to communicate, eliminating the fragmentation that previously existed across different AI platforms.


At its core, MCP addresses a fundamental challenge: each LLM provider (like OpenAI, Google’s Gemini, Anthropic’s Claude, etc.) has developed its own approach to function and tool calling. This fragmentation creates unnecessary complexity for developers who want to leverage AI capabilities across different models. MCP serves as a unifying standard, allowing developers to write integrations once and deploy them across multiple AI platforms.


The Evolution of Generative AI


The journey toward MCP reflects the broader evolution of generative AI:

  • Initial Emergence: The advent of generative AI introduced powerful language models capable of understanding and generating human-like text.
  • RAG Implementation: Retrieval Augmented Generation (RAG) significantly improved AI responses by allowing models to reference external knowledge bases, using vector and graph databases alongside frameworks like LangChain.
  • AI Agents with MCP: The latest evolution combines AI agents with the Model Context Protocol, enabling standardized interactions between AI models and external tools.

Understanding AI Agents


Before diving deeper into MCP, it’s important to understand what AI agents are. An AI agent is an autonomous system with several key capabilities:

  • Environment Perception: Ability to collect and process input data through various mechanisms
  • Information Processing: Analysis and decision-making using machine learning, logic, and rules
  • Goal Achievement: Optimization, self-learning, and adaptation to achieve specified objectives
  • Action Execution: The ability to perform tasks using APIs, software, or other tools
  • Autonomy: Operating independently with minimal human intervention

This agent-based approach represents a shift from passive AI assistants to active AI systems that can take initiative in solving problems.

Agent Design Patterns


AI agents can be organized in several patterns:

  • Sequential Agents: A linear chain of AI agents working in sequence
  • Hierarchical Agents: A tree structure with manager agents overseeing specialized worker agents
  • Hybrid Agents: A combination of sequential and hierarchical patterns for complex workflows

Agents become significantly more powerful when equipped with tools that extend their capabilities, similar to how Batman or Superman leverage their tools and abilities to solve problems.
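The three patterns above can be sketched as plain Python callables, where each "agent" is just a function that transforms a piece of text. This is a toy illustration, not a real agent framework; the agent names and behaviours are made up.

```python
# Toy sketch of agent design patterns: each "agent" is a function
# that transforms text (no real LLM calls involved).

def uppercase_agent(text):
    return text.upper()

def exclaim_agent(text):
    return text + "!"

def sequential(agents, text):
    # Sequential pattern: each agent's output feeds the next agent.
    for agent in agents:
        text = agent(text)
    return text

def hierarchical(manager, workers, text):
    # Hierarchical pattern: a manager fans the task out to
    # specialized workers and merges their results.
    results = [worker(text) for worker in workers]
    return manager(results)

print(sequential([uppercase_agent, exclaim_agent], "deploy"))      # DEPLOY!
print(hierarchical(" | ".join, [uppercase_agent, exclaim_agent], "ok"))  # OK | ok!
```

A hybrid pattern would simply nest the two: a sequential chain in which one step is itself a hierarchical manager.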

The Need for a Standard Protocol


As AI tools proliferated, each major LLM provider (OpenAI, Google, Anthropic, etc.) developed their own methods for function and tool calling. This fragmentation created a complex ecosystem where developers needed to adapt their code for each platform. MCP emerged as a solution, providing a standard protocol that defines how tools should be used across different AI platforms.

What Makes MCP Unique?


MCP offers several key advantages:

  • Instant Integration: Provides immediate and easy integration between LLMs and external tools
  • Platform Independence: Allows freedom to switch between LLM providers without rewriting code
  • Secure Data Handling: Keeps sensitive data within your infrastructure rather than sending it to the LLM provider

The protocol operates through a client-server architecture:

  • MCP Clients: Function as connectors, establishing communication links between the host and MCP servers
  • MCP Servers: Core components that execute specific tasks or functions using the protocol to expose defined capabilities

MCP Message Types and Communication Flow


MCP uses three primary message types for communication:

  • Requests: From clients to servers to initiate actions
  • Responses: From servers back to clients with results
  • Notifications: Event updates from servers to clients

The communication workflow typically involves:

  • Client initialization
  • Session establishment with capability negotiation
  • Request/response cycles for tool interactions
  • Notifications for status updates
  • Session termination
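Under the hood, MCP messages follow JSON-RPC 2.0, which is how the three message types are distinguished: requests carry an `id`, responses echo that `id` back, and notifications have no `id` at all. The sketch below shows the shape of each; the method names `tools/call` and `notifications/progress` come from the MCP specification, but the tool name and arguments are invented for illustration.

```python
import json

# MCP messages are JSON-RPC 2.0. Minimal examples of the three
# message types; the "get_time" tool and its arguments are made up.

request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "get_time", "arguments": {"timezone": "Asia/Kolkata"}},
}

response = {
    "jsonrpc": "2.0",
    "id": 1,  # matches the id of the request it answers
    "result": {"content": [{"type": "text", "text": "2025-01-01T10:00:00+05:30"}]},
}

notification = {
    "jsonrpc": "2.0",
    "method": "notifications/progress",  # no "id": nothing to reply to
    "params": {"progress": 50, "total": 100},
}

for msg in (request, response, notification):
    print(json.dumps(msg))
```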

MCP Client Requirements

For effective operation, MCP clients must handle several core functions:

  • Prompts: Managing input to the LLM
  • Tools: Defining available capabilities
  • Resources: Managing external data sources
  • Sampling: Controlling LLM output generation
  • Roots: Managing conversation context

Current Challenges with MCP Servers


Despite its advantages, implementing MCP servers presents several challenges:

  • Environment Conflicts: MCP servers often require specific versions of Node.js, Python, and other dependencies
  • Host Isolation Concerns: Current MCP servers run directly on the host, potentially creating security issues
  • Complex Setup: The installation process can be complicated, inhibiting adoption
  • Cross-Platform Compatibility: Ensuring consistent operation across different operating systems and architectures
  • Dependency Management: Safely distributing and managing server-specific runtime dependencies

How Docker Addresses These Challenges

This is where Docker’s containerization technology provides significant value:

  • Docker Desktop: Offers a development platform to build, test, and run MCP servers
  • Docker Hub: Provides a centralized repository for distributing containerized MCP servers
  • Docker Scout: Ensures images remain secure and free of vulnerabilities
  • Docker Build Cloud: Facilitates faster and more reliable cross-platform image building

By containerizing MCP servers, Docker addresses the environmental conflicts, simplifies setup, improves isolation, and enhances cross-platform compatibility.

Introducing Docker AI Agent (Project Gordon)

Taking this integration a step further, Docker has developed an AI assistant called Gordon that is integrated directly into Docker Desktop and CLI. Gordon provides:

  • Context-Aware Assistance: Real-time guidance for container operations
  • Workflow Integration: Eliminates context-switching during development

Key features include:

  • Dockerfile optimization and rating
  • Smart container running assistance
  • Context-aware troubleshooting
  • Project containerization guidance
  • GitHub Actions integration
  • Contextual container management

Gordon’s workflow follows an agentic pipeline:

  • Understanding user input to determine required actions
  • Gathering necessary context (working directory, Dockerfiles, running containers)
  • Preparing prompts with gathered context
  • Generating responses using LLMs
  • Evaluating output quality
  • Performing actions on behalf of the user when appropriate
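The stages of that pipeline can be mirrored in a toy Python loop. Gordon's internals are not public, so this is only a sketch of the listed stages with a faked LLM call; all function names and return values here are hypothetical.

```python
# Toy sketch of an agentic pipeline: understand -> gather context ->
# build prompt -> generate -> evaluate -> act. The LLM call is faked.

def gather_context():
    # Stage 2: collect working directory, Dockerfiles, containers, etc.
    return {"cwd": "/app", "dockerfile": "FROM node:21"}

def build_prompt(user_input, context):
    # Stage 3: fold the gathered context into the prompt.
    return f"User asked: {user_input}\nContext: {context}"

def call_llm(prompt):
    # Stage 4: stand-in for a real model call.
    return "Suggested fix: use node:21-alpine as the base image"

def evaluate(output):
    # Stage 5: a trivial quality gate on the generated output.
    return bool(output.strip())

def agent(user_input):
    context = gather_context()
    prompt = build_prompt(user_input, context)
    output = call_llm(prompt)
    if evaluate(output):  # only act when the output passes the gate
        return output
    return "Could not produce a confident answer."

print(agent("optimise my Dockerfile"))
```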


Getting Started with MCP in Docker


To begin using MCP with Docker, follow these steps:

  • Install Docker Desktop: Download and install the latest version
  • Enable Docker AI Agent: Activate the “Ask Gordon” feature in Docker Desktop settings
  • Add MCP Servers: Create a configuration file (e.g., gordon-mcp.yml) defining your MCP servers
  • Use Docker AI with MCP: Start interacting with the system via the Docker AI interface

A sample MCP configuration might look like:

services:
  time:
    image: mcp/time

  postgres:
    image: mcp/postgres
    command: postgresql://postgres:dev@host.docker.internal:5433/postgres

  git:
    image: mcp/git
    volumes:
      - /Users/username:/Users/username

  gh:
    image: mcp/github
    environment:
      GITHUB_PERSONAL_ACCESS_TOKEN: ${GITHUB_PERSONAL_ACCESS_TOKEN}

  fetch:
    image: mcp/fetch

This configuration makes time services, PostgreSQL database access, Git repository management, GitHub API access, and web fetching capabilities available to your AI agent.

Getting Started

  • Install Docker Desktop 4.38+
  • Enable the Docker AI Agent ("Ask Gordon") in Docker Desktop settings

Greeting Gordon

how are you doing?

Listing all the containers

Prompt #1

list all the containers running on my system in a tabular format

Prompt #2

docker ai list all the containers running on my system in a tabular format and highlight ones that are consuming the maximum space

Dockerfile Optimisation

Clone the repo

git clone https://github.com/ajeetraina/todo-list/
cd todo-list/build

Build the image with name “huge”

docker build -t huge .

Note the size of the Docker image: 1.8 GB.

Let's ask Gordon to optimise this image.

Prompt

docker ai please optimise this Docker image

It creates a new Dockerfile and keeps a backup of the old one as Dockerfile.bak.

docker ai can you optimise my Dockerfile

The RUN command for npm install now includes --mount=type=cache,target=/root/.npm. This uses Docker's BuildKit feature to cache the npm dependencies in the /root/.npm directory.
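For reference, a cache-mounted npm install in a Dockerfile looks like the sketch below. This is a hypothetical reconstruction of the optimised Dockerfile based on the diff that follows, not Gordon's exact output:

```dockerfile
# syntax=docker/dockerfile:1
FROM node:21-alpine
WORKDIR /app
COPY package*.json ./
# BuildKit cache mount: npm's download cache in /root/.npm persists
# across builds, so unchanged dependencies are not re-downloaded.
RUN --mount=type=cache,target=/root/.npm npm install --production
COPY . .
EXPOSE 3000
CMD ["node", "src/index.js"]
```

The cache mount exists only at build time and is not baked into the image, so it speeds up rebuilds without adding to the image size.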

diff Dockerfile Dockerfile.bak
1c1
< FROM node:21-alpine
---
> FROM node:21
4d3
< 
6,7c5
< RUN npm install --production
< 
---
> RUN npm install
9d6
< 
11d7
< 
12a9
>

Let’s rebuild it again with name “small”

docker build -t small .
docker images 
REPOSITORY                                  TAG                                        IMAGE ID       CREATED          SIZE
small                                       latest                                     052adc5729e8   7 minutes ago    377MB
huge                                        latest                                     6bcd991ba3e2   30 minutes ago   1.83GB

You can see that Gordon optimised the size.

Optimisation using Multi-stage Build

docker ai can you optimise using Multi-stage build

It creates the following Dockerfile.

FROM node:21-alpine AS builder

WORKDIR /app

COPY package*.json ./
RUN npm install --production

COPY . .

FROM node:21-alpine

WORKDIR /app

COPY --from=builder /app /app

EXPOSE 3000

CMD ["node", "src/index.js"]

Let’s build it with name “extra-small”

docker build -t extra-small .
docker images
REPOSITORY                                  TAG                                        IMAGE ID       CREATED          SIZE
extra-small                                 latest                                     41868a6e197f   3 minutes ago    235MB
small                                       latest                                     052adc5729e8   18 minutes ago   377MB
huge                                        latest                                     6bcd991ba3e2   41 minutes ago   1.83GB

Gordon and MCP

Assuming that you have cloned a repo containing a gordon-mcp.yml file with the following content:

services:
  time:
    image: mcp/time

  postgres:
    image: mcp/postgres
    command: postgresql://postgres:dev@host.docker.internal:5433/postgres

  git:
    image: mcp/git
    volumes:
      - /Users/ajeetsraina:/Users/ajeetsraina


  github:
    image: mcp/github
    environment:
      - GITHUB_PERSONAL_ACCESS_TOKEN=${GITHUB_PERSONAL_ACCESS_TOKEN}

  fetch:
    image: mcp/fetch

  fs:
    image: mcp/filesystem
    command:
      - /rootfs
    volumes:
      - .:/rootfs

List all the MCP Tools

docker ai mcp
Initializing time
Initializing fs
Initializing postgres
Initializing fetch
Initializing git
Initializing github
Initialized fs
Initialized postgres
Initialized github
...
...

GitHub MCP Server

Ensure that you add your PAT to ~/.zshrc like:

export GITHUB_PERSONAL_ACCESS_TOKEN='XXX'

Next, source the shell

source ~/.zshrc

Prompt

Creating a new GitHub Repo

docker ai can you create a github repo called sensor-analytics, add a README file with random sensor values that includes temp, pressure and humidity in a tabular format

Prompt

$ docker ai can you fetch dockerlabs.collabnix.com and write the summary to a file tests.txt
    • Calling fetch ✔️
    • Calling write_file ✔️
    • Calling list_allowed_directories ✔️
    • Calling write_file ✔️

  The summary of DockerLabs has been successfully written to the file /rootfs/tests.txt. Let me know if you need further assistance!

Validating

cat tests.txt
DockerLabs is a comprehensive learning platform for Docker enthusiasts, offering resources for beginners, intermediate, and advanced users. It features over 500 interactive tutorials and guides, accessible via Docker Desktop or browser. Key highlights include community engagement through Slack and Discord, a GitHub repository for contributions, and a variety of blog posts and articles on Docker-related topics. The platform also provides hands-on labs covering Docker core concepts, advanced features, and industry use cases. Additionally, it offers workshops for beginners, tutorials on Dockerfile creation, and guidance on managing Docker containers and volumes.

Using Postgres

  • Start three Postgres containers
docker run -d --name postgres1 -e POSTGRES_PASSWORD=dev -p 5432:5432 postgres:latest
docker run -d --name postgres2 -e POSTGRES_PASSWORD=dev -p 5433:5432 postgres:13
docker run -d --name postgres3 -e POSTGRES_PASSWORD=dev -p 5434:5432 postgres:12
  • Create dummy tables
-- Create a table for Users
CREATE TABLE users (
    id SERIAL PRIMARY KEY,
    name VARCHAR(100) NOT NULL,
    email VARCHAR(100) UNIQUE NOT NULL,
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

-- Create a table for Orders
CREATE TABLE orders (
    id SERIAL PRIMARY KEY,
    user_id INT REFERENCES users(id) ON DELETE CASCADE,
    product_name VARCHAR(100) NOT NULL,
    price DECIMAL(10,2) NOT NULL,
    order_date TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

-- Create a table for Products
CREATE TABLE products (
    id SERIAL PRIMARY KEY,
    name VARCHAR(100) NOT NULL,
    description TEXT,
    price DECIMAL(10,2) NOT NULL,
    stock INT NOT NULL DEFAULT 0
);

Query the list of tables

SELECT table_name 
FROM information_schema.tables 
WHERE table_schema = 'public';

Using Ask Gordon

docker ai list of all tables in the postgres database running in a postgres container named postgres2

Conclusion


The Model Context Protocol represents a significant step forward in standardizing AI tool integration. When combined with Docker’s containerization technology and AI agent capabilities, it creates a powerful, flexible foundation for developing AI-enhanced applications.

By addressing the challenges of environment conflicts, cross-platform compatibility, and setup complexity, Docker makes MCP more accessible to developers. The integration of Docker’s AI Agent (Gordon) with MCP servers further streamlines the development experience, providing context-aware assistance throughout the container lifecycle.

As the AI landscape continues to evolve, standards like MCP will become increasingly important for ensuring interoperability and reducing fragmentation. Docker’s support for this protocol positions it as a key player in the emerging AI development ecosystem.

Resources

Have Queries? Join https://launchpass.com/collabnix
