Applied Intelligence
Module 10: MCP and Tool Integration

Understanding Model Context Protocol

The integration problem

Every enterprise has tools. Databases, issue trackers, documentation systems, monitoring platforms, internal APIs. Dozens of systems, sometimes hundreds.

AI coding agents need access to these tools. A bug fix might require querying a database to understand the data model. A feature implementation might need to check Jira for acceptance criteria. A deployment might involve updating a Confluence page with release notes.

Without standardization, each agent-tool combination requires custom integration code. Ten AI applications connecting to fifty tools means 500 unique integrations. Each integration requires maintenance, security review, and testing. The math gets ugly fast.

This is the M×N problem. M models multiplied by N tools equals an unsustainable number of integrations.

The MCP solution

Model Context Protocol solves M×N with standardization.

Instead of building custom integrations for every combination, MCP defines a common interface. Tools implement the server side once. AI applications implement the client side once. Any client can talk to any server.

Ten clients plus fifty servers means sixty implementations, not five hundred. New tools automatically work with existing clients. New clients automatically work with existing servers.
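The arithmetic is easy to verify. A minimal sketch, with client and tool counts as placeholder numbers:

```python
# Without a shared protocol, every client-tool pairing needs its own
# adapter, so the total number of integrations grows multiplicatively.
def integrations_without_standard(clients: int, tools: int) -> int:
    return clients * tools

# With a shared protocol like MCP, each side implements the standard
# once, so the total grows additively.
def integrations_with_standard(clients: int, tools: int) -> int:
    return clients + tools

print(integrations_without_standard(10, 50))  # 500 custom integrations
print(integrations_with_standard(10, 50))     # 60 implementations
```

The gap widens as either side grows: adding an eleventh client costs fifty new integrations in the first model and one in the second.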

Anthropic launched MCP in November 2024. By April 2025, downloads had grown from 100,000 to over 8 million. The ecosystem includes more than 5,800 servers and 300 clients. OpenAI adopted MCP in March 2025. Google announced Gemini support in April. Microsoft joined the steering committee in May.

In December 2025, Anthropic donated MCP to the Agentic AI Foundation under the Linux Foundation. Founding members include Amazon, Anthropic, Block, Bloomberg, Cloudflare, Google, Microsoft, and OpenAI. When competitors agree to share infrastructure, it usually means the problem space has matured.

Gartner predicts 75% of API gateway vendors and 50% of iPaaS vendors will have MCP features by end of 2026.

The USB-C analogy

Before USB, every peripheral had its own connector. Printers used parallel ports. Keyboards used PS/2. Cameras used proprietary cables. New devices meant new ports, new drivers, new incompatibilities.

USB, and eventually USB-C, solved this with one universal standard. Any device, any peripheral, one connector.

MCP is USB-C for AI. Before MCP, connecting Claude to a database required custom code. Connecting Cursor to Jira required different custom code. Every combination was its own project.

With MCP, a PostgreSQL server works with Claude Code, Cursor, Windsurf, and any other MCP client. A GitHub server works the same way. Write the server once, connect to anything.
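In practice, "connect to anything" means a short configuration entry rather than an integration project. A sketch, assuming the common `mcpServers` JSON layout used by clients such as Claude Code (the server package name and connection string here are illustrative):

```json
{
  "mcpServers": {
    "postgres": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-postgres",
        "postgresql://localhost/mydb"
      ]
    }
  }
}
```

The same entry, pointed at the same server, works in any client that reads this configuration format.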

The analogy extends further. USB-C carries power, data, and video over the same connector. MCP carries tools (executable actions), resources (data access), and prompts (reusable templates) over the same protocol. One standard handles multiple concerns.

Protocol versus API

MCP is a protocol, not an API. The distinction matters.

Traditional REST APIs are stateless. Each HTTP request is independent. The server has no memory of previous requests. Maintaining state requires explicit session management: tokens, cookies, database lookups.

MCP is stateful. Connections maintain sessions. Context accumulates over the lifetime of a connection. The server knows what happened earlier in the conversation.

APIs have fixed endpoints. Documentation defines available routes. Adding functionality means adding endpoints. Clients must be updated to use new endpoints.

MCP has dynamic discovery. Clients query servers for capabilities at runtime. If a server adds a new tool, clients learn about it automatically. No code changes required on the client side.

APIs use request-response. Client sends request. Server sends response. Communication flows one direction at a time.

MCP supports bidirectional messaging. Clients can request from servers. Servers can send notifications to clients. Servers can even request that clients perform LLM completions. Communication flows both ways.

Aspect    | Traditional API                          | MCP
State     | Stateless (explicit session management)  | Stateful (session context maintained)
Discovery | Static endpoints (Swagger/OpenAPI docs)  | Dynamic capability queries at runtime
Updates   | Polling or webhooks                      | Real-time notifications
Direction | Request-response                         | Bidirectional messaging

The protocol abstraction enables behaviors that stateless APIs cannot support. A server can notify clients when available tools change. A server can ask the client's LLM to generate content. State persists without explicit management.
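On the wire, MCP messages are JSON-RPC 2.0. A minimal sketch of the two message shapes discussed above, using method names from the MCP specification (transport framing omitted):

```python
import json

# Dynamic discovery: the client asks the server what tools it offers.
# A request carries an "id", so the server must send a response.
discovery_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list",
}

# Bidirectional messaging: the server pushes a notification when its
# tool set changes. Notifications carry no "id" and expect no response;
# the client reacts by calling tools/list again.
tools_changed = {
    "jsonrpc": "2.0",
    "method": "notifications/tools/list_changed",
}

print(json.dumps(discovery_request))
print(json.dumps(tools_changed))
```

The absence of an `id` field is what distinguishes a fire-and-forget notification from a request that demands a reply.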

Why protocol design matters for ASD

The protocol versus API distinction has practical implications for agentic work.

Dynamic discovery reduces configuration. When agents can query servers for capabilities, developers do not need to maintain tool inventories. The agent learns what is available by asking.

Stateful sessions reduce token waste. With stateless APIs, every request must include context: previous conversation history, authentication details, relevant state. With MCP, this context persists. Follow-up requests can reference earlier interactions without repeating information.

Bidirectional messaging enables complex workflows. A database server can notify the agent when a query completes. A file server can stream updates as files change. A code execution server can send output as it becomes available. The agent does not need to poll.

Server-initiated sampling enables augmented capabilities. MCP servers can request that the client's LLM perform completions. A documentation server could ask the LLM to summarize a page before returning it. A code analysis server could ask the LLM to classify a code pattern. The server gains access to LLM capabilities it does not host itself.
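A sampling request travels over the same JSON-RPC channel, just in the reverse direction: the server sends it, the client's LLM fulfills it. A sketch using the `sampling/createMessage` method from the MCP specification (the payload content is illustrative):

```python
import json

# Server-initiated sampling: the server asks the client's LLM to run a
# completion on its behalf. The server never holds model credentials;
# the client mediates the request and can apply its own policies.
sampling_request = {
    "jsonrpc": "2.0",
    "id": 7,
    "method": "sampling/createMessage",
    "params": {
        "messages": [
            {
                "role": "user",
                "content": {"type": "text", "text": "Summarize this page."},
            }
        ],
        "maxTokens": 200,
    },
}

print(json.dumps(sampling_request, indent=2))
```

Because the client sits in the middle, it can require user approval before any server-requested completion runs.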

When evaluating whether to build custom integrations or use MCP, consider the protocol's built-in capabilities. Dynamic discovery, persistent state, and bidirectional messaging often eliminate custom code that stateless APIs would require.

MCP in context of earlier modules

This course has mentioned MCP before. Module 1 introduced it as a way to extend Claude Code and Codex capabilities. Module 3 discussed MCP as part of the context hierarchy: configuration files define static context, while MCP servers provide dynamic context retrieved at runtime. Module 4 noted MCP's token cost, approximately 16.5% of context for tool definitions.

This module goes deeper. The pages ahead cover architecture and primitives, transport mechanisms, the server ecosystem, configuration and security, and building custom servers.

Database queries, API integrations, file system access, browser automation, issue tracking: all become accessible through MCP. The practical difference between "possible with custom code" and "available through configuration" determines whether most teams will bother.

The enterprise adoption case

Enterprise environments have particular requirements that MCP addresses.

Centralized governance. Security teams can audit MCP server configurations. Permission boundaries exist at the protocol level. Managed configurations can enforce organizational policies.

Credential management. OAuth 2.1 with PKCE is the standard authentication mechanism. Existing identity providers integrate naturally. No custom authentication schemes to review.
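PKCE itself needs nothing beyond a standard library. A sketch of the verifier/challenge pair per RFC 7636, which OAuth 2.1 mandates: the client keeps a random verifier secret and sends only its SHA-256 hash in the authorization request.

```python
import base64
import hashlib
import secrets

def make_pkce_pair() -> tuple[str, str]:
    # code_verifier: high-entropy random string, base64url without padding.
    verifier = base64.urlsafe_b64encode(secrets.token_bytes(32)).rstrip(b"=").decode()
    # code_challenge: base64url(SHA-256(verifier)), also without padding.
    digest = hashlib.sha256(verifier.encode("ascii")).digest()
    challenge = base64.urlsafe_b64encode(digest).rstrip(b"=").decode()
    return verifier, challenge

verifier, challenge = make_pkce_pair()
print(len(verifier), len(challenge))  # 43 43
```

An intercepted authorization code is useless without the verifier, which only the legitimate client holds. This is why the standard fits enterprise review: it is a known, auditable mechanism rather than a bespoke scheme.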

Audit trails. The protocol includes standardized logging patterns. Compliance teams can verify what agents accessed and when.

Gradual adoption. Servers can be added incrementally. Teams can start with one integration and expand. Nothing forces an all-or-nothing deployment.

Bloomberg uses MCP as an organization-wide standard. Block, Amazon, and hundreds of Fortune 500 companies have deployed MCP servers. These are production deployments, not pilots.

What comes next

The following pages examine MCP's technical foundations: architecture and primitives (tools, resources, prompts, sampling), transport mechanisms (stdio for local, HTTP for remote), the ecosystem of available servers, configuration and authentication, building custom servers, and security considerations.
