The Inktelligence - May 12, 2025
One Protocol to Rule Them All: Connecting AI to Your Favorite Apps

This week we’re diving into the world of the Model Context Protocol (MCP) and how it works with LLMs. Think of it as APIs for LLMs.
MCP is a simple set of rules that lets AI tools talk safely and directly to other apps, files, or databases so they always have the right background information. Think of MCP like a USB-C port for AI: it plugs AI chatbots into your tools (like Notion, HubSpot, or your local files) so they give better, more accurate answers every time.
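To make that concrete, here is a minimal sketch of what an MCP server can look like using the official Python SDK’s FastMCP helper. The note-searching tool is hypothetical; a real server would wrap whatever app or data source you want the model to reach.

```python
# A minimal MCP server sketch using the official Python SDK's FastMCP helper.
# The "notes" example is made up for illustration -- swap in whatever data
# source (Notion, HubSpot, local files) you actually want to expose.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("notes-server")

@mcp.tool()
def search_notes(query: str) -> str:
    """Search local notes and return matching lines."""
    # Placeholder logic: a real server would query a file, database, or API.
    notes = ["MCP launch recap", "Newsletter draft for May 12"]
    matches = [n for n in notes if query.lower() in n.lower()]
    return "\n".join(matches) or "No matching notes found."

if __name__ == "__main__":
    # Serves over stdio by default, which is how desktop clients
    # typically launch and talk to local MCP servers.
    mcp.run()
```

A client like Claude Desktop can launch this script and call search_notes whenever the model decides it needs that background information.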
Origin of the name
The name Model Context Protocol was chosen to emphasize what each of its three words stands for:
- Model: MCP is designed specifically for large language models (LLMs), positioning them as the primary “consumer” of context.
- Context: The protocol standardizes how background information (data from databases, file systems, APIs, and so on) is packaged and delivered into a model’s context window.
- Protocol: As an open, formal specification (like HTTP or USB), MCP defines a consistent set of rules and message formats for communication between LLMs and external systems; a sample message is sketched below.
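Under the hood, MCP messages ride on JSON-RPC 2.0. As a rough illustration, a client asking a server to run a tool sends a request shaped like the one below. The tool name and arguments are hypothetical; tools/call is the method the spec defines for invoking a server-side tool.

```python
# Sketch of an MCP tool-invocation request as it appears on the wire.
# MCP uses JSON-RPC 2.0, so the envelope fields (jsonrpc, id, method, params)
# are standard; the tool name and arguments here are illustrative only.
import json

request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "search_notes",               # a tool the server exposes
        "arguments": {"query": "newsletter"},  # inputs defined by its schema
    },
}

print(json.dumps(request, indent=2))
```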
Anthropic’s announcement spells it out plainly: “MCP provides a universal, open standard for connecting AI systems with data sources, replacing fragmented integrations with a single protocol.”