You’ve heard the buzzwords, you’ve seen the tweets. Probably gotten sick of them too! But just exactly what the hell is MCP?
In essence, the Model Context Protocol (MCP) is the definitive way to upgrade an LLM from a regular chatbot into an agent.
Practically speaking, an agent boils down to an LLM interacting with an environment through its tools, but a standard LLM has no inherent ability to interact with tools at all.
Up until now, that has meant building tooling mechanisms from scratch, or reaching for external frameworks and SDKs like LangChain. These approaches work, but they come with their own set of issues. Most importantly, LLMs would simply lose track of what tools were available to them in the first place: not calling tools when required, using the wrong calling format, failing to follow instructions, and so on.
Traditional tooling means manually wiring tools into an LLM, or using agentic frameworks that do something similar. Too cumbersome.
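To make "cumbersome" concrete: with a typical function-calling API you hand-write a JSON schema for every tool and dispatch every call yourself. Here's a rough sketch in the style of the OpenAI Python SDK (the add_to_notes tool is made up for illustration):

```python
import json
from openai import OpenAI

client = OpenAI()

# Every tool needs a hand-written JSON schema...
tools = [{
    "type": "function",
    "function": {
        "name": "add_to_notes",  # hypothetical tool for this example
        "description": "Append a line to the user's notes",
        "parameters": {
            "type": "object",
            "properties": {"text": {"type": "string"}},
            "required": ["text"],
        },
    },
}]

messages = [{"role": "user", "content": "add potatoes to my shopping list"}]
response = client.chat.completions.create(
    model="gpt-4o", messages=messages, tools=tools
)

# ...and you dispatch every tool call yourself, feed the result back to
# the model, and repeat all of this glue for every app you want to connect.
for call in response.choices[0].message.tool_calls or []:
    if call.function.name == "add_to_notes":
        args = json.loads(call.function.arguments)
        print(f"Would write to notes: {args['text']}")
```

Multiply that by every tool and every app, and it gets old fast.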
So, how do we solve this?
Enter MCP. By definition, it is the standardised way to provide context to LLMs. The word "context" is deliberate: agents don't just call tools, they may also need access to resources and pre-existing prompt templates.
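To give a flavour of what that looks like in code, here's a minimal sketch of an MCP server built with the official Python SDK's FastMCP helper; the server name and the notes-related functions are made up for illustration:

```python
from mcp.server.fastmcp import FastMCP

# A hypothetical MCP server wrapping a plain-text notes file.
mcp = FastMCP("notes")

@mcp.tool()
def add_note(text: str) -> str:
    """Append a line to the user's notes."""
    with open("notes.txt", "a") as f:
        f.write(text + "\n")
    return f"Added: {text}"

@mcp.resource("notes://all")
def read_notes() -> str:
    """Expose the current notes as a readable resource."""
    with open("notes.txt") as f:
        return f.read()

@mcp.prompt()
def summarise_notes() -> str:
    """A reusable prompt template the client can surface to the user."""
    return "Summarise my notes in three bullet points."

if __name__ == "__main__":
    mcp.run()  # serves the tools, resources and prompts over stdio by default
```

Any MCP-aware client can now discover the tool, read the resource and use the prompt, no custom glue code required.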
Why should I care? What use is it to me?
Here’s where it gets really fun.
Think of any task you do day-to-day that requires you to open a specific app and manually dig through or add information.
This can be anything from your note-taking app to today’s stock market performance, from reading and writing emails to reviewing pull requests on a repository.
What if, instead of having to open 10 different apps, you could just ask any LLM of your choice to “add potatoes to my shopping list”, and it did that for you without you opening a single app? That’s exactly what MCP aims to do. The LLM knows it has access to your note-taking app, and behind the scenes it calls predefined tools to fetch or write the data.
Here’s what it might look like in action:
Watch someone integrate it with Obsidian, a popular note-taking app:
You wanna know the kicker?
MCP is designed so that if someone writes code for a bunch of tools, you can plug those tools directly into your existing chatbot of choice without having to worry about the integration work. By standardising the way programs provide context to LLMs, it unlocks full modularity across tools, letting you add and remove them on the fly and do some really cool stuff along the way.
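As a rough sketch of what that modularity looks like from the client side (again using the official Python SDK; notes_server.py is the hypothetical notes server from earlier):

```python
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Point the client at any MCP server; swapping servers is just a one-line change.
server_params = StdioServerParameters(command="python", args=["notes_server.py"])

async def main():
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # The client discovers whatever the server offers; nothing is hardcoded.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])
            # Call a discovered tool by name with structured arguments.
            await session.call_tool("add_note", arguments={"text": "potatoes"})

asyncio.run(main())
```

The host application (Claude Desktop, your own chatbot, whatever) just lists the servers it wants to talk to; the tools themselves stay completely decoupled from it.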
Some stuff people have built with MCP:
- Blender MCP, possibly the coolest one yet. Interact directly with Blender and prompt scenes into existence!
- Ableton MCP. Ableton is music production software used by professionals and hobbyists alike, known for its intuitive controls.
- Obsidian MCP, to interact with Obsidian directly from applications that support MCP (like the Claude desktop app), without ever opening its UI.
- Godot MCP, for the Godot game engine.
Conclusion
This is just the start. MCP has only just begun to take off, and a lot more is coming our way this year. 2025 is the year of agents, and MCP is here to make them infinitely more accessible and available to everyone. What are your ideas for using MCP? Let us know in the comments :)