Introducing the MCP Server 1.0 for EventSourcingDB
AI-powered agents are no longer a future promise. They are here, embedded in development workflows, automating decisions, analyzing data, and helping teams move faster. Tools like Claude, ChatGPT, and Gemini have become part of the daily toolkit for developers and architects alike. But until now, these agents had no way to talk to your event store. They could reason about code, summarize documents, and generate queries, but they could not read your events, explore your subjects, or run an EventQL query against live data. The most valuable data source in an event-sourced system was invisible to AI.
Today, we are changing that. We are releasing the MCP Server 1.0 for EventSourcingDB, an extension that connects large language models directly to a running EventSourcingDB instance. It implements the Model Context Protocol, an open standard that defines how AI models discover and invoke external tools. With this release, your AI agents can read events, write events, browse subjects, inspect event types, register schemas, and run EventQL queries, all through natural language.
Events Know More Than State
Why does connecting AI to an event store matter more than connecting it to a traditional database? Because events capture the full story, not just the latest snapshot. A traditional database tells you that a customer's subscription is cancelled. An event store tells you that the customer signed up six months ago, upgraded twice, contacted support three times, received a discount offer, and then cancelled after a billing dispute. That narrative is exactly the kind of rich context that makes AI agents genuinely useful.
We explored this idea in depth in Training AI Without the Data You Don't Have: events encode behavior over time, not just state at a point in time. When an AI agent has access to the full event history, it can answer questions that would be impossible with snapshots alone. "What happened before this customer churned?" is a fundamentally different question from "Is this customer active?" and it requires a fundamentally different data source to answer.
Data Is the New Gold, Here's How to Mine It made the case that the value of data depends on how accessible it is. The MCP Server is our answer to making event data accessible to an entirely new class of consumers: AI agents. Instead of building custom integrations for every LLM, the MCP Server provides a single, standardized interface that any MCP-compatible client can use.
What the Model Context Protocol Does
The Model Context Protocol (MCP) is an open standard that defines how AI models discover and invoke external tools. Think of it as a contract between an LLM and the outside world. The model connects to an MCP server, receives a list of available tools with descriptions of what each tool does, and then decides which tools to call based on your natural language prompts.
The MCP Server for EventSourcingDB sits between your LLM and your database. When the LLM connects, it discovers tools for every core EventSourcingDB operation. From that point on, you can interact with your event store through conversation. The MCP Server translates the LLM's tool calls into HTTP API requests against EventSourcingDB and returns the results back through the same chain.
The transport protocol is Streamable HTTP, which most MCP-compatible clients support out of the box. The MCP Server exposes its endpoint at /mcp, so if it runs on http://localhost:3000, the full URL is http://localhost:3000/mcp. Authentication is optional: you can secure the MCP Server with its own API token, separate from the EventSourcingDB API token, to control who can connect.
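To make the handshake concrete, here is a minimal sketch in Python of the two JSON-RPC 2.0 messages an MCP client sends first: `initialize` and `tools/list`. The message shapes follow the Model Context Protocol specification; the `protocolVersion` string and the client name are illustrative placeholders, and in practice your MCP client constructs and POSTs these messages to the `/mcp` endpoint for you.

```python
import json

# The first message a client POSTs to http://localhost:3000/mcp over
# Streamable HTTP: a JSON-RPC 2.0 "initialize" request. The protocolVersion
# and clientInfo values below are placeholders.
initialize = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2025-03-26",
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}

# After initialization, the client asks the server which tools it offers.
# The server's response is what lets the LLM decide which tools to call.
list_tools = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/list",
}

print(json.dumps(list_tools))
```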
From Natural Language to Event Queries
The MCP Server exposes ten tools that cover the full range of EventSourcingDB operations. Rather than listing them in a table, here is what they enable you to do in practice.
Ask your LLM to explore the event store. "What subjects exist under /orders?" calls the subjects tool. "What event types have been recorded?" calls the event types tool. "Show me the schema for order-placed events" calls the event type detail tool. Within seconds, you have a map of your data without writing a single API call.
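Under the hood, each of those questions becomes a JSON-RPC `tools/call` request. The sketch below shows what such a call might look like for "What subjects exist under /orders?"; the tool name `read-subjects` and its argument shape are assumptions for illustration, so consult the MCP Server documentation for the actual tool names.

```python
import json

# Hypothetical wire format of the tool call the LLM issues when you ask
# "What subjects exist under /orders?". The tool name and arguments are
# assumptions; only the "tools/call" envelope is defined by MCP itself.
call_subjects = {
    "jsonrpc": "2.0",
    "id": 3,
    "method": "tools/call",
    "params": {
        "name": "read-subjects",
        "arguments": {"baseSubject": "/orders"},
    },
}

print(json.dumps(call_subjects, indent=2))
```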
Read and inspect events. "Show me the last ten events for /customers/42" retrieves events for a specific subject. "What happened to this order between Monday and Wednesday?" filters by time. The LLM formats the results into a readable summary, highlighting the information that matters for your question.
Write events when prototyping. "Record a book-acquired event for /books/84 with title 'Rendezvous with Rama'" constructs the event candidate and writes it. This is particularly useful during design sessions when you want to test how your event model feels before committing to code.
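The sentence above hides an event candidate that the LLM constructs for you. A sketch of what it might contain, assuming the usual EventSourcingDB event shape of source, subject, type, and data; the source URI and the reverse-DNS type name are illustrative:

```python
# A sketch of the event candidate behind "Record a book-acquired event for
# /books/84 with title 'Rendezvous with Rama'". The source and type values
# are illustrative placeholders.
event_candidate = {
    "source": "https://example.com/library",      # who emitted the event (illustrative)
    "subject": "/books/84",                       # which entity the event belongs to
    "type": "com.example.library.book-acquired",  # event type, reverse-DNS style (illustrative)
    "data": {"title": "Rendezvous with Rama"},    # the payload
}

# In a production write path you would construct this via a Client SDK or
# the HTTP API; during a design session, the LLM builds it from your prompt.
```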
Run EventQL queries. This is where the MCP Server becomes truly powerful. "How many orders were placed this week?" or "Which customers have more than five returns?" triggers an EventQL query. The LLM can even ask for the EventQL documentation through a dedicated tool, learn the syntax on the fly, and write queries that would take you minutes to compose manually. If you are curious about the design decisions behind EventQL, Designing EventQL, an Event Query Language tells the full story.
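For a feel of what the LLM generates, here is roughly the shape of an EventQL query it might write to fetch this week's order events before counting them. The exact syntax is defined in the EventQL documentation, and the event type name is an illustrative placeholder:

```python
# Roughly the kind of EventQL query the LLM might generate for
# "How many orders were placed this week?" before counting the results.
# The event type name is illustrative; see the EventQL docs for syntax.
query = """
FROM e IN events
WHERE e.type == 'com.example.shop.order-placed'
PROJECT INTO e
"""

# The MCP Server forwards the query string to EventSourcingDB and returns
# the matching events to the LLM, which then summarizes or counts them.
print(query.strip())
```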
Validate and enforce schemas. "Register a JSON Schema for payment-received events requiring amount and currency" calls the schema registration tool. From that point on, EventSourcingDB validates every incoming event of that type against the schema. Your LLM just helped you set up data quality guardrails in a single sentence.
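The schema from that example sentence is plain JSON Schema. A sketch of what the LLM would register for `payment-received` events, requiring `amount` and `currency`; how the schema is handed to the registration tool is up to the LLM and the tool's parameters:

```python
import json

# A standard JSON Schema for payment-received events: every event's data
# must carry a numeric amount and a string currency, and nothing else.
payment_received_schema = {
    "type": "object",
    "properties": {
        "amount": {"type": "number"},
        "currency": {"type": "string"},
    },
    "required": ["amount", "currency"],
    "additionalProperties": False,
}

print(json.dumps(payment_received_schema))
```

Once registered, EventSourcingDB rejects any `payment-received` event whose data does not conform, which is exactly the guardrail the sentence above describes.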
Getting Started in Three Commands
The MCP Server is distributed as a Docker image. Pull it, run it, and point your LLM at it. If you have followed our Local Development Setup guide before, the pattern will feel familiar.
First, pull the image:

docker pull thenativeweb/eventsourcingdb-mcp
Then start the MCP Server, pointing it at your running EventSourcingDB instance:
docker run -it -p 3001:3000 \
thenativeweb/eventsourcingdb-mcp run \
--esdb-url http://host.docker.internal:3000 \
--esdb-api-token secret \
--http-enabled \
--https-enabled=false
This starts the MCP Server on port 3001 of your host machine, connecting to an EventSourcingDB instance on port 3000. In production, you would configure HTTPS with a certificate and private key instead of disabling it.
Finally, configure your LLM client to connect to http://localhost:3001/mcp using Streamable HTTP as the transport protocol. The exact configuration depends on your client. Once connected, the LLM discovers all available tools automatically.
That is it. Three commands, and your AI agent can talk to your event store.
Where It Shines
The MCP Server is built for interactive, exploratory workflows. The scenarios where it adds the most value are the ones where speed of understanding matters more than raw throughput.
Onboarding new team members. Instead of reading through documentation and tracing code paths, a new developer can ask the LLM: "What event types does this system use? Show me a typical event sequence for an order." The LLM queries the live event store and returns real examples, not abstract documentation.
Prototyping and design. When you are modeling a new feature, you want fast feedback. Write a few test events through the LLM, run queries against them, adjust your model, repeat. No boilerplate, no test harness, just conversation.
Debugging. Something went wrong in production. You need to understand the sequence of events that led to the current state. Ask the LLM to read the event history for the affected subject, filter by time range, and summarize what happened. The answer comes in seconds, not after minutes of assembling API calls.
Learning EventQL. If you are new to EventQL, the MCP Server is the fastest way to learn. Describe what you want to know in plain English, and the LLM writes the query for you. Read the query, understand the syntax, and learn by example rather than by reading reference documentation.
Where It Does Not Belong
We want to be clear about the boundaries. The MCP Server is not intended for production write paths. Writing events in a real application should go through a Client SDK or the HTTP API directly, where you have full control over preconditions, error handling, and transactional guarantees. The MCP Server adds a layer of interpretation that is valuable for human interaction but unnecessary and inappropriate for automated systems that need deterministic behavior.
Similarly, high-throughput pipelines that process large volumes of events are better served by direct API access. The MCP Server is designed for conversational interaction, not for streaming millions of events through an LLM. Use it where human understanding is the goal, and use the direct APIs where machine efficiency is the goal.
This is not a limitation. It is a deliberate design decision. The MCP Server and the Client SDKs serve different purposes, and trying to use one where the other belongs leads to frustration. Use the right tool for the right job.
Your Events, Your Agents
The MCP Server 1.0 opens EventSourcingDB to the world of AI-powered workflows. Whether you are exploring your event store, onboarding a colleague, prototyping a new feature, or debugging a production issue, your LLM can now work directly with your events.
To get started, head to the MCP Server documentation for the full setup guide, configuration options, and detailed tool descriptions. If you want to explore how AI agents and Event Sourcing work together beyond the MCP Server, eventsourcing.ai is a good place to start.
We would love to hear what you build with it. Reach out at hello@thenativeweb.io.