Let’s talk about Model Context Protocol (MCP)

First, let’s talk about LLVM, the modular and reusable compiler toolchain:

[Image: the LLVM logo alongside key attributes: Standards-based Abstraction, Decoupled Components, Highly Extensible, Cross Platform]

With LLVM as a compiler backend, we write code in the language of our choice, and compile it to any number of supported backend architectures.

“develop a frontend for any programming language and a backend for any instruction set architecture.”

[Image: programming-language frontends, including C, C++, and Rust, listed alongside architecture backends, including ARM, x86, and PowerPC]

LLVM’s extensible architecture lets us compile code (using Clang, Rustc, etc.) from any supported language to an intermediate layer. That intermediate can then take care of optimization and compilation to any supported target platform.

[Image: diagram of language frontends (C, C++, Rust) connected to instruction set architectures (ARM, x86, PowerPC) with and without LLVM]

Now that we have a deep understanding of LLVM, let’s talk about MCP (for real this time). MCP is just like LLVM, except that it’s totally different. Introduced by Anthropic in 2024, MCP exposes data sources and tool calling to AI assistants over HTTP (or STDIO).
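Under the hood, MCP messages ride on JSON-RPC 2.0. A minimal sketch of the opening handshake a client sends, using only the standard library (the client name and version here are illustrative):

```python
import json

# Sketch of the JSON-RPC 2.0 envelope MCP messages use.
# "2024-11-05" is the initial MCP protocol revision; clientInfo
# values are made up for illustration.
initialize_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}

# Over STDIO the client writes one JSON message per line;
# over HTTP the same payload travels in the request body.
wire = json.dumps(initialize_request)
print(wire)
```

The same envelope carries every subsequent request, from listing tools to calling them.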

[Image: the MCP and LLVM logos alongside the same key attributes: Standards-based Abstraction, Decoupled Components, Highly Extensible, Cross Platform]

With MCP, we connect any tool or data source to any supported LLM, like Claude, Llama, or OpenAI.

“develop a frontend for any tool or data source and a backend for any LLM.”

Pretty much anything can be a data source for MCP. You just have to write the corresponding MCP integration for that data source. Anthropic has some nice examples.
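To make that concrete, here is a deliberately simplified, dependency-free sketch of what such an integration does: it answers `tools/list` and `tools/call` requests for a hypothetical `get_stock` tool backed by an in-memory stand-in data source. (In practice you would build on Anthropic’s official MCP SDKs rather than hand-rolling the protocol.)

```python
import json
import sys

# Hypothetical data source: a tiny in-memory "database" standing in for
# whatever real system the MCP server wraps.
INVENTORY = {"widgets": 42, "gadgets": 7}

def handle_request(request: dict) -> dict:
    """Dispatch a parsed JSON-RPC request to the matching MCP method."""
    if request["method"] == "tools/list":
        # Advertise the tools this server offers, with a JSON Schema
        # describing each tool's expected arguments.
        result = {
            "tools": [{
                "name": "get_stock",
                "description": "Look up the stock count for an item",
                "inputSchema": {
                    "type": "object",
                    "properties": {"item": {"type": "string"}},
                    "required": ["item"],
                },
            }]
        }
    elif request["method"] == "tools/call":
        # Run the requested tool and wrap its output in content blocks.
        item = request["params"]["arguments"]["item"]
        count = INVENTORY.get(item, 0)
        result = {"content": [{"type": "text", "text": f"{item}: {count}"}]}
    else:
        result = {}
    return {"jsonrpc": "2.0", "id": request["id"], "result": result}

def serve_stdio() -> None:
    """Read one JSON-RPC message per line from stdin, answer on stdout."""
    for line in sys.stdin:
        response = handle_request(json.loads(line))
        print(json.dumps(response), flush=True)
```

Swap the dictionary lookup for a database query or an API call and the protocol-facing half stays the same; that separation is exactly what makes MCP integrations reusable.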

[Image: data sources such as databases, files, and web services, listed alongside large language models such as Claude, Llama, and OpenAI]

MCP’s extensible architecture lets us fetch data or run tools on any supported platform through a well-known intermediate layer. That intermediate layer can then be invoked by any supported LLM platform.
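From the LLM host’s side, invoking that intermediate layer is just another JSON-RPC exchange. A sketch, using a hypothetical `get_stock` tool and a hard-coded reply standing in for a live server:

```python
import json

# The request an MCP host (the LLM-side client) sends when the model
# decides to use a tool. Tool name and arguments are illustrative.
tool_call = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {"name": "get_stock", "arguments": {"item": "widgets"}},
}

# A canned server reply; tool output arrives as content blocks the host
# can hand straight back to the model as context.
reply = json.loads(
    '{"jsonrpc": "2.0", "id": 2, "result":'
    ' {"content": [{"type": "text", "text": "widgets: 42"}]}}'
)
for block in reply["result"]["content"]:
    if block["type"] == "text":
        print(block["text"])  # prints "widgets: 42"
```

Note that nothing here is Claude-specific: any host that speaks the protocol can send the same request to the same server.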

[Image: diagram of connections between Claude, Llama, and OpenAI and data sources, including Oracle Database, with and without MCP]

Again, fewer lines on the diagram is better: fewer bespoke integrations, more plug-and-play functionality.

There’s an MCP server for Oracle Database?

Curious? Try it out. It works with Oracle Database Free, supporting versions 19c, 21c, and 23ai.

Or, watch a demo to see how MCP works in action (with Oracle Database).
