10 Essential Steps to Mastering Custom MCP Catalogs & Profiles for Enterprise AI
Managing AI tooling at scale is a challenge every forward-thinking organization faces. With the general availability of Custom MCP Catalogs and Profiles, teams now have a powerful duo to curate, distribute, and run Model Context Protocol (MCP) servers with unprecedented control. This listicle walks through ten key aspects—from understanding the core concepts to building your own catalogs and profiles. Whether you're an enterprise architect or a developer, these steps will help you streamline AI tool discovery, enforce governance, and boost productivity.
1. What Are Custom MCP Catalogs and Profiles?
Custom MCP Catalogs are curated collections of approved MCP servers that organizations can publish and maintain. They replace the chaotic hunt for servers across the open internet with a single, trusted source. MCP Profiles, on the other hand, are portable, named groupings of MCP servers. They allow individual developers to define and share their tool configurations across projects and teams. Together, they form a complete lifecycle: catalogs provide the approved inventory, while profiles enable runtime customization and sharing. This separation of concerns is critical for enterprise adoption—IT manages the catalog, developers compose profiles, and both stay in sync.

2. Why Enterprises Need Curated Catalogs
Without a central curation mechanism, teams waste hours evaluating and integrating MCP servers from unknown sources. Security risks, version conflicts, and duplication are common. Custom Catalogs solve this by acting as a single source of truth. They allow organizations to define which MCP servers are trusted, tested, and compliant with internal policies. By limiting discovery to approved servers, enterprises can enforce governance without stifling innovation. Developers can instantly find and use vetted tools—from internal services like logging APIs to external integrations like weather data—all within a consistent, auditable framework.
3. How Custom Catalogs Work Under the Hood
A custom catalog is essentially a metadata file (YAML or JSON) that references MCP server images and their configuration details. These servers can come from the Docker MCP Catalog, community repositories, or internally built images. The catalog file specifies server names, titles, images, descriptions, and types. Once created, it is published to a registry (like Docker Hub) and can be imported by any Docker Desktop instance. The magic is in the automation: when a developer activates a catalog, all approved servers appear in their MCP interface, ready to be used. This eliminates manual setup and ensures everyone uses the same, updated versions.
4. Building Your First Custom MCP Server
To create a custom server, start with a simple example: a dice-rolling MCP server that communicates over stdio. Write a Python or Node.js script that listens for MCP requests and responds. Package it into a Docker image and push it to a registry such as Docker Hub; for instance, name the image yourhub/mcp-dice. This server becomes the building block for your catalog. The official documentation provides a reference implementation (roll-dice) on GitHub. Once built, capture its metadata—name, title, image, description—in a YAML file (mcp-dice.yaml). This metadata is what the catalog will reference. The entire process is designed to be repeatable and automated through CI/CD pipelines.
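To make the stdio idea concrete, here is a minimal sketch of such a dice server in Python. It speaks raw JSON-RPC over stdin/stdout instead of using an official MCP SDK, and the `roll_dice` tool shape and its `sides`/`count` arguments are illustrative assumptions, not the reference implementation's API:

```python
import json
import random
import sys

def roll_dice(sides: int = 6, count: int = 1) -> list[int]:
    """Roll `count` dice, each with `sides` faces."""
    return [random.randint(1, sides) for _ in range(count)]

def handle_request(request: dict) -> dict:
    """Dispatch one JSON-RPC request to the dice tool."""
    if request.get("method") == "tools/call":
        args = request.get("params", {}).get("arguments", {})
        rolls = roll_dice(args.get("sides", 6), args.get("count", 1))
        result = {"content": [{"type": "text", "text": json.dumps(rolls)}]}
        return {"jsonrpc": "2.0", "id": request.get("id"), "result": result}
    return {"jsonrpc": "2.0", "id": request.get("id"),
            "error": {"code": -32601, "message": "method not found"}}

def main() -> None:
    # stdio transport: one JSON-RPC message per line in, one per line out.
    for line in sys.stdin:
        if line.strip():
            print(json.dumps(handle_request(json.loads(line))), flush=True)

if __name__ == "__main__":
    main()
```

Because the protocol handling lives in `handle_request`, the server logic can be unit-tested without spawning a process, which fits the CI/CD automation mentioned above.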
5. Creating a Custom Catalog YAML File
With your server image ready, create a catalog YAML file that lists all approved servers. Include both external servers from the Docker MCP Catalog and your own. For example:
```yaml
servers:
  - name: roll-dice
    title: Roll Dice
    type: server
    image: yourhub/mcp-dice:latest
    description: An MCP server that can roll dice
  - name: fetch-web
    title: Fetch Web
    type: server
    image: dockerhub/mcp-fetch:latest
    description: Fetches web pages
```
This catalog can reference servers from multiple sources—mix public, community, and internal. The key is that the catalog file is static, version-controlled, and signed if necessary. Publish it to a repository that Docker Desktop can access (e.g., a GitHub repo or an OCI-compliant registry). Your team then simply points Docker Desktop to this file, and all servers become available instantly.
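Because the catalog file is static and version-controlled, it is also easy to lint in CI before publishing. The sketch below validates an already-parsed catalog (load the YAML with any parser first); the required-field list is inferred from the example above, not an official schema:

```python
REQUIRED_FIELDS = ("name", "title", "type", "image", "description")

def validate_catalog(catalog: dict) -> list[str]:
    """Return human-readable problems; an empty list means the catalog looks sane."""
    servers = catalog.get("servers")
    if not isinstance(servers, list) or not servers:
        return ["catalog must contain a non-empty 'servers' list"]
    problems, seen = [], set()
    for i, server in enumerate(servers):
        for field in REQUIRED_FIELDS:
            if not server.get(field):
                problems.append(f"servers[{i}] is missing '{field}'")
        name = server.get("name")
        if name in seen:
            problems.append(f"servers[{i}] duplicates name '{name}'")
        seen.add(name)
        if server.get("image", "").endswith(":latest"):
            problems.append(f"servers[{i}] uses ':latest'; prefer a pinned tag or digest")
    return problems

# Mirrors one entry of the YAML catalog shown above, with a pinned tag.
catalog = {"servers": [{"name": "roll-dice", "title": "Roll Dice", "type": "server",
                        "image": "yourhub/mcp-dice:1.0.0",
                        "description": "An MCP server that can roll dice"}]}
print(validate_catalog(catalog))  # → []
```

The `:latest` warning reflects the versioning best practice discussed later; drop it if your team deliberately tracks moving tags.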
6. Importing Catalogs via Docker Desktop
Docker Desktop makes it trivial to import custom catalogs. Go to the MCP section in Docker Desktop, click on "Manage Catalogs," and add the URL to your published catalog YAML file. Docker Desktop will fetch, parse, and display the approved servers in a dedicated interface. Developers can then browse and activate servers with a single click. This import process respects all metadata—descriptions, titles, and images—so team members know exactly what each server does and who made it. For power users, the CLI offers equivalent commands (docker mcp catalog add), enabling automation in scripts and CI pipelines.
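In a CI script, that import might look like the sketch below. The subcommand name follows the `docker mcp catalog add` command mentioned above, but the catalog name and URL are placeholders and exact arguments vary by Docker Desktop version, so verify against `docker mcp catalog --help`:

```shell
# Register the published catalog file with this Docker instance
# (catalog name and URL are placeholders for your own).
docker mcp catalog add team-catalog https://example.com/mcp/catalog.yaml

# Confirm the import by listing configured catalogs.
docker mcp catalog ls
```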

7. Using MCP Profiles for Portability
While catalogs define what's available, Profiles define what's active. A profile is a named set of MCP server configurations that a developer can create, save, and share. For instance, a "frontend-dev" profile might include servers for linting, component generation, and API mocking. A "data-science" profile could add servers for database queries, Jupyter integration, and visualization. Profiles can be versioned and exchanged via JSON or YAML files. When a developer switches profiles, Docker Desktop automatically activates the corresponding servers, adjusting the AI toolset in seconds. This is especially powerful in monorepos or microservices environments where different tasks demand different tools.
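Mechanically, switching profiles reduces to a set difference between the old and new server lists. A sketch, assuming profiles are exchanged as simple JSON documents (the "frontend-dev" and "data-science" schemas here are hypothetical):

```python
import json

def diff_profiles(current: dict, target: dict) -> tuple[set[str], set[str]]:
    """Return (servers to stop, servers to start) when switching profiles."""
    old = set(current.get("servers", []))
    new = set(target.get("servers", []))
    return old - new, new - old

frontend = json.loads('{"name": "frontend-dev", "servers": ["lint", "component-gen", "api-mock"]}')
data_sci = json.loads('{"name": "data-science", "servers": ["db-query", "jupyter", "api-mock"]}')

stop, start = diff_profiles(frontend, data_sci)
print(sorted(stop), sorted(start))  # → ['component-gen', 'lint'] ['db-query', 'jupyter']
```

Note that "api-mock" appears in both profiles, so it keeps running across the switch; only the delta is stopped and started, which is what makes switching fast.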
8. Profiles and Catalogs: A Symbiotic Relationship
Catalogs and profiles are designed to complement each other. Think of catalogs as the library—the complete set of approved books. Profiles are your personalized reading lists. An organization curates the catalog once; developers mix and match profiles as needed. This separation ensures that IT maintains control over approved sources while developers retain the agility to choose exactly what they need. In practice, a developer first imports the corporate catalog, then creates a profile that selects only the servers relevant to their current sprint. If a required server isn't in the catalog, they request it through the established governance process rather than downloading unvetted code.
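That governance rule, that every server a profile selects must exist in the catalog, is easy to enforce mechanically. A sketch, reusing the catalog field names from the earlier example (the profile shape is an assumption):

```python
def unapproved_servers(profile: dict, catalog: dict) -> set[str]:
    """Servers a profile references that are absent from the approved catalog."""
    approved = {s["name"] for s in catalog.get("servers", [])}
    return set(profile.get("servers", [])) - approved

catalog = {"servers": [{"name": "roll-dice"}, {"name": "fetch-web"}]}
profile = {"name": "sprint-42", "servers": ["roll-dice", "shady-tool"]}
print(unapproved_servers(profile, catalog))  # → {'shady-tool'}
```

A non-empty result is exactly the signal to route the developer into the catalog request process rather than letting unvetted code through.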
9. Real-World Enterprise Use Cases
The combination unlocks numerous practical scenarios. A financial institution can curate a catalog of MCP servers for regulatory compliance, data anonymization, and reporting—then developers build profiles for risk analysis, fraud detection, or client onboarding. A SaaS company might maintain a catalog of internal API wrappers, monitoring tools, and deployment helpers, with separate profiles for backend, frontend, and DevOps teams. Even within the same team, a developer could have a "debugging" profile with verbose logging servers and a "production" profile with performance-optimized ones. The flexibility ensures that AI tooling adapts to the workflow, not the other way around.
10. Future Directions and Best Practices
The MCP ecosystem is evolving rapidly. Expect catalogs to support version ranges, dependency resolution, and automatic updates. Profiles may gain inheritance (profiles based on other profiles) and conditional activation based on project context. For now, best practices include: always version your catalogs and profiles, sign them with cryptographic keys, and establish a review process for new server additions. Encourage developers to create and share profiles as part of their onboarding documentation. Monitor usage analytics to retire unused servers. By investing in these primitives now, your organization will be ready for the next wave of AI tooling—where MCP becomes the standard interface for all AI interactions.
Custom MCP Catalogs and Profiles are more than a feature; they are the foundation for safe, scalable, and developer-friendly AI adoption. By following these ten steps, you can transform your enterprise's approach to MCP management—from chaotic discovery to governed, portable tooling. Start small with one server and one catalog, then iterate. Your developers (and your compliance team) will thank you.