MCP Part I - Core Concepts, Past, Present, and Future of Agentic Systems

This article, the first in a three-part series, introduces the Model Context Protocol (MCP), a standard designed to enable AI agents to interact with diverse digital environments beyond simple chat windows. It defines key concepts like agents, environments, and autonomy, highlighting the need for agents to access digital tools via a structured communication protocol. MCP facilitates this by defining MCP Servers (tool providers), Hosts (applications running LLMs), and MCP Clients, enabling agents to utilize resources, tools, and prompts exposed by servers. The article argues that MCP is a key enabler of the next digital revolution, where AI assistants perform complex cognitive tasks, and it sets the stage for future articles that will delve into practical implementations using Google’s VertexAI and a custom cybersecurity server.

MCP Part II - Implementation: Custom Host with VertexAI and Gemini

This article details my journey in building a custom chat host for AI agents, moving away from existing solutions to gain a deeper understanding of the underlying technologies. I implement a chat engine using Google’s Vertex AI and Go, focusing on compatibility with the OpenAI API to integrate with tools like Big-AGI. The article covers the core architecture, including my use of ChatSession and GenerativeModel from the Vertex AI SDK. It delves into the implementation of the /v1/chat/completions endpoint, highlighting the challenges of streaming responses and integrating function calls. I also describe a workaround for handling function calls in a streaming context and introduce the concept of a callable interface to prepare for implementing the Model Context Protocol (MCP) in future work. The goal is to move tools outside the agent, which is detailed in the final part of this series.
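For context on the streaming side, the sketch below shows the server-sent-events wire format that OpenAI-compatible clients such as Big-AGI expect from a streaming /v1/chat/completions endpoint: each chunk is a `data:` line of JSON, terminated by a literal `data: [DONE]`. The struct names are illustrative, not taken from the article's code.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Chunk mirrors the OpenAI streaming response shape: each event carries
// a delta (a fragment of the assistant's message) rather than full text.
type Chunk struct {
	Object  string   `json:"object"`
	Choices []Choice `json:"choices"`
}

type Choice struct {
	Delta Delta `json:"delta"`
}

type Delta struct {
	Content string `json:"content,omitempty"`
}

// sseLine encodes one chunk as a server-sent event.
func sseLine(c Chunk) string {
	b, _ := json.Marshal(c)
	return "data: " + string(b) + "\n\n"
}

func main() {
	fmt.Print(sseLine(Chunk{Object: "chat.completion.chunk",
		Choices: []Choice{{Delta: Delta{Content: "Hello"}}}}))
	fmt.Print("data: [DONE]\n\n")
}
```

The difficulty with function calls in this mode is that a call is not text to stream: the host must detect it mid-stream, execute the tool, and resume generation, which is what the workaround mentioned above addresses.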

MCP Part III - Application: Custom Server for a Specific Use Case

This final article in a three-part series explores decoupling tools from the host using the Model Context Protocol (MCP) for flexibility and reusability. The author builds an MCP-based tool in Go to execute SQL queries via DuckDB, enabling seamless chatbot interaction while preserving privacy. The implementation covers JSON-RPC handling, tool encapsulation, and integration. The project validates MCP’s effectiveness, with future plans to replace VertexAI with Ollama and add multi-session support.
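As a rough illustration of the tool-encapsulation idea, here is a minimal Go sketch of how a server might route an MCP `tools/call` request to a named tool. The DuckDB step is replaced by a stub, and the `CallParams`/`dispatch` names are mine; in the real implementation the SQL runs against a local DuckDB database, which is what keeps the data private.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// CallParams is the payload of a "tools/call" request: which tool to
// run and its arguments.
type CallParams struct {
	Name      string            `json:"name"`
	Arguments map[string]string `json:"arguments"`
}

// runQuery stands in for the DuckDB execution step; the real server
// executes the SQL locally and returns the resulting rows.
func runQuery(sql string) string {
	return "ok: " + sql
}

// dispatch routes a tool call by name. This is the decoupling point:
// the host only knows tool names and schemas, never the implementation.
func dispatch(raw []byte) (string, error) {
	var p CallParams
	if err := json.Unmarshal(raw, &p); err != nil {
		return "", err
	}
	switch p.Name {
	case "query":
		return runQuery(p.Arguments["sql"]), nil
	default:
		return "", fmt.Errorf("unknown tool %q", p.Name)
	}
}

func main() {
	out, err := dispatch([]byte(`{"name":"query","arguments":{"sql":"SELECT 1"}}`))
	fmt.Println(out, err)
}
```

Because the host only sees the tool's name and result, swapping the model backend (for example, VertexAI for Ollama) leaves the server untouched.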

How to Activate the Value Flywheel Effect with Your Data

In today’s hyper-competitive world, businesses no longer rely solely on intuition; they depend on data-driven insights to stay agile and make fast, smart decisions. However, data alone isn’t the answer; it is the enabler that creates momentum on a business and technology flywheel: a model where data drives decisions, decisions drive actions, and those actions drive value, propelling the business forward in a self-reinforcing cycle. In a previous post, I used a model to explain how data could cross the borders of applications and domains to bring increasing value at the organizational level.

The Future of Data Management: An Enabler of AI Development? A Basic Illustration with RAG, Open Standards, and Data Contracts

Context

In a recent meetup I organized in my hometown of Lille, I had the pleasure of hosting Jean-Georges Perrin, who provided a comprehensive introduction to data contracts. As a geek, I felt compelled to test this concept to fully grasp its practical implications. The goal of this article is to demonstrate how data contracts can be applied to and add value within a small ecosystem facing cross-domain challenges. To illustrate, I will use my personal experience in the fields I work in, which can be categorized into two separate domains:

Exploring exaptations in engineering practices within a RAG-based application

In this article, I delve into the concept of RAG, aiming to build one nearly from scratch so I can treat it as a pure engineering problem. Learning by doing from scratch should help me discover the kind of exaptation that can guide my decisions as an engineer and clarify points of confusion in my understanding of the system. I based my work on an existing article and implemented it in Go, a language I am fluent in. I describe a step-by-step method for creating a simple (though neither efficient nor effective) RAG, noting discoveries that may be useful for my work as a consultant and engineer.
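The heart of retrieval in such a from-scratch RAG is a similarity search over embedding vectors. The sketch below shows that single step in Go, with tiny hand-written vectors standing in for real embeddings; the function and document names are illustrative only.

```go
package main

import (
	"fmt"
	"math"
)

// cosine returns the cosine similarity between two embedding vectors:
// the dot product divided by the product of their magnitudes.
func cosine(a, b []float64) float64 {
	var dot, na, nb float64
	for i := range a {
		dot += a[i] * b[i]
		na += a[i] * a[i]
		nb += b[i] * b[i]
	}
	return dot / (math.Sqrt(na) * math.Sqrt(nb))
}

func main() {
	// Toy 2-d "embeddings"; a real RAG would get these from a model.
	query := []float64{1, 0}
	docs := map[string][]float64{
		"doc-a": {0.9, 0.1},
		"doc-b": {0.1, 0.9},
	}
	// Retrieval step: pick the document closest to the query embedding,
	// whose text would then be stuffed into the LLM prompt as context.
	best, bestScore := "", -1.0
	for name, v := range docs {
		if s := cosine(query, v); s > bestScore {
			best, bestScore = name, s
		}
	}
	fmt.Println(best)
}
```

Everything else in a RAG pipeline (chunking, prompt assembly, generation) is plumbing around this one ranking operation, which is what makes it approachable as an engineering exercise.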

Data-as-a-Product and Data-Contract: An evolutionary approach to data maturity

Using Simon Wardley’s evolution model, I propose a framework for visualizing the maturity of data within a business context, emphasizing the importance of treating data as a product and implementing data contracts to facilitate integration and ensure trust. Ultimately, I suggest that starting with a focus on data-as-a-product is crucial for organizations embarking on their data mesh journey, paving the way for a comprehensive and agile transformation.