Mkdnsite is a runtime-only Markdown web server that uses HTTP content negotiation to serve HTML to human browsers and raw Markdown to AI agents. Released in late March 2026 after just three weeks of development, the project represents a fundamental shift in how we think about content delivery for the agentic era. Unlike traditional static site generators that compile Markdown to HTML at build time, Mkdnsite renders content on demand using Bun.markdown, eliminating build steps entirely while solving the circular inefficiency of converting Markdown to HTML only to have AI agents scrape it back to Markdown. Built almost entirely by OpenClaw autonomous agents collaborating through Slack, Mkdnsite includes built-in MCP server support, automatic llms.txt generation, and GitHub repository integration, making it possibly the easiest way to deploy agent-readable documentation in 2026.
What is Mkdnsite and why does it matter for AI agents?
Mkdnsite (pronounced “mark-down-site”) is an open-source web server that treats Markdown as a first-class citizen rather than a source format to be compiled away. You write content in GitHub-Flavored Markdown, drop the files in a directory, and point the server at them. When a human visits your site, they get rendered HTML with syntax highlighting, Mermaid diagrams, KaTeX math, and Chart.js visualizations. When an AI agent requests the same URL, it receives the raw Markdown source with frontmatter intact. This approach is central to the project’s philosophy of “agent-native content.”
This matters because the current web forces agents to waste tokens parsing HTML DOM trees to extract content that started as Markdown. Cloudflare’s recent “Markdown for Agents” announcement highlighted this problem but proposed converting HTML back to Markdown at the edge. Mkdnsite skips both conversion steps. You maintain one source of truth in Markdown, and the server serves the appropriate format based on who is asking. For OpenClaw developers building agent ecosystems, this means your documentation, API specifications, and knowledge bases remain in a format agents can consume directly, without HTML parsing overhead.
How does content negotiation work between humans and agents?
Mkdnsite implements standard HTTP content negotiation to determine which format to serve. When your browser requests a page, it sends an `Accept` header that typically looks like `text/html,application/xhtml+xml,application/xml;q=0.9`. Mkdnsite detects this preference and renders the Markdown to HTML using Bun.markdown, applying your custom theme and including the runtime JavaScript for diagrams and math rendering.
When an AI agent such as an OpenClaw instance or Claude Code requests the same endpoint, it either sends `Accept: text/markdown` or includes an identifying header such as `User-Agent: OpenClawBot/1.0`. Mkdnsite responds with the raw Markdown file, preserving frontmatter metadata and skipping the token-wasting HTML wrapper. You can test this behavior yourself with curl:
```shell
# Human request - returns HTML
curl -H "Accept: text/html" https://your-site.com/docs/api.md

# Agent request - returns raw Markdown
curl -H "Accept: text/markdown" https://your-site.com/docs/api.md
```
This dual-mode serving happens at runtime without separate build pipelines or duplicate file generation. The server maintains an in-memory cache of rendered HTML for performance, but agents always receive fresh Markdown directly from disk or your configured GitHub repository, guaranteeing they always have the most up-to-date information.
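The negotiation logic itself is straightforward. Below is a minimal sketch, assuming a simplified `Accept` parser and an illustrative agent `User-Agent` pattern; it is not Mkdnsite's actual implementation:

```typescript
// Decide whether a request should receive rendered HTML or raw Markdown.
// Parsing follows the RFC 9110 Accept semantics (media ranges with optional
// q-values); the agent User-Agent regex is an illustrative assumption.

type Format = "html" | "markdown";

function parseAccept(accept: string): Map<string, number> {
  const prefs = new Map<string, number>();
  for (const part of accept.split(",")) {
    const [type, ...params] = part.trim().split(";");
    let q = 1.0;
    for (const p of params) {
      const [k, v = ""] = p.trim().split("=");
      if (k === "q") q = parseFloat(v) || 0;
    }
    prefs.set(type.trim().toLowerCase(), q);
  }
  return prefs;
}

function negotiateFormat(accept: string, userAgent = ""): Format {
  // Hypothetical agent detection by User-Agent substring.
  if (/bot|agent|claude/i.test(userAgent)) return "markdown";
  const prefs = parseAccept(accept);
  const mdQ = prefs.get("text/markdown") ?? 0;
  const htmlQ = Math.max(
    prefs.get("text/html") ?? 0,
    prefs.get("application/xhtml+xml") ?? 0,
  );
  // Ties (including "*/*") fall through to HTML, the safer default.
  return mdQ > htmlQ ? "markdown" : "html";
}
```

A browser's default header resolves to `"html"`, while `Accept: text/markdown` or an agent-looking `User-Agent` resolves to `"markdown"`.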
Why did the builder reject Cloudflare’s Markdown-for-Agents approach?
The creator of Mkdnsite started the project after reading Cloudflare’s February 2026 announcement about “Markdown for Agents.” The core insight was recognizing the absurdity of the modern content pipeline: you write documentation in Markdown, run a static site generator to convert it to HTML, deploy that HTML to a CDN, and then have Cloudflare convert it back to Markdown at the edge so AI agents can read it. This Markdown-to-HTML-to-Markdown roundtrip wastes compute, bandwidth, and valuable LLM tokens.
Instead of adding another conversion layer at the edge, Mkdnsite keeps content in Markdown at rest and renders to HTML only when necessary for human consumption. Agents bypass the rendering entirely, accessing the source material directly. This approach eliminates the “impedance mismatch” between human-facing websites and agent-facing APIs. You do not need to maintain separate documentation sites or JSON APIs for agents. The same URLs work for everyone, but the content adapts to the consumer. For builders running OpenClaw agents on local machines, this means your agents can pull documentation directly from your Mkdnsite instance without parsing HTML or wrestling with complex DOM selectors, streamlining their information retrieval processes.
What runtime features come bundled with Mkdnsite?
Mkdnsite ships with a “batteries included” philosophy, particularly suited for Bun’s runtime environment. You get GitHub-Flavored Markdown rendering out of the box, including tables, strikethrough, and task lists. The server automatically handles Mermaid diagram rendering for flowcharts and sequence diagrams, KaTeX for mathematical expressions, Chart.js for data visualizations, and syntax highlighting for over 100 programming languages, ensuring a rich visual experience for human readers.
For human users, Mkdnsite provides full-text search functionality that indexes your Markdown content at runtime, allowing for quick information retrieval. For agents, it exposes an MCP (Model Context Protocol) server that allows OpenClaw and other compatible agents to query your site’s structure, search content programmatically, and pull specific documents into their context windows. The theming system supports automatic light and dark mode detection based on user preferences, and you can customize the CSS without rebuilding the entire site. Unlike static generators that require plugin installation and build configuration, these features work immediately when you run the server, simplifying deployment and maintenance.
How does Mkdnsite compare to traditional static site generators?
Static site generators like Astro, Hugo, and Jekyll have dominated documentation and blog hosting for years, but they operate on a fundamentally different paradigm than Mkdnsite. While static generators excel at delivering pre-built HTML for high-traffic scenarios, Mkdnsite optimizes for the emerging pattern where a significant portion of traffic originates from AI agents that benefit from raw Markdown. The following table breaks down the key differences.
| Feature | Mkdnsite | Astro/Hugo/Jekyll |
|---|---|---|
| Build Step | None (runtime only) | Required before deployment |
| Agent Support | Native (raw Markdown via negotiation) | Requires plugins, custom parsers, or scraping |
| Content Updates | Instant (filesystem/GitHub polling) | Requires rebuild and redeploy |
| MCP Integration | Built-in server | Not available, requires custom API |
| Deployment | Single binary/Docker container | Build artifact + web server (e.g., Nginx) |
| Token Efficiency | Optimal for agents (raw Markdown) | Agents must parse verbose HTML |
| Primary Audience | Humans AND AI Agents | Primarily Humans |
| Dynamic Features | Runtime rendering, search, MCP | Limited to client-side JS, pre-rendered HTML |
| Complexity | Minimal setup, direct content mapping | Configuration, templating, build pipelines |
Mkdnsite trades the build-time optimization of static generators for runtime flexibility and agent-native content delivery. You can run it on a small VPS or even in edge functions, pulling content directly from GitHub repositories without ever running a local git clone or build command.
How did OpenClaw agents build Mkdnsite in just three weeks?
The development timeline of Mkdnsite demonstrates the acceleration possible when using autonomous AI agents for software development. The project started on March 7, 2026, when the creator worked with Claude to refine requirements and scaffold the initial codebase. The following Friday, they configured OpenClaw agents on their personal machine and integrated them with Slack for asynchronous communication, mirroring a human development team structure.
From that point, the development workflow operated like a distributed team. The creator logged feature ideas and bugs as GitHub issues, then discussed priorities with a “team lead” agent via Slack. The OpenClaw agents picked up tasks, researched solutions, implemented features, and submitted code via pull requests. Individual Claude Code sessions handled specific refactoring and complex logic challenges, providing specialized expertise. By March 16, v1.0.0 was released, offering core functionality. By March 28, v1.4.1 shipped with crucial features like MCP support and GitHub integration.
Almost every line of code in the repository was written by AI. The human served as product manager, architect, and quality assurance, while OpenClaw agents handled the implementation details. This represents a new model of software development where a single developer with a cluster of agents can ship production-ready infrastructure in weeks rather than months, significantly compressing development cycles.
What MCP tools does Mkdnsite expose for agent integration?
Mkdnsite implements a Model Context Protocol (MCP) server that allows AI agents to interact with your content programmatically. When you connect an OpenClaw agent to your Mkdnsite instance, it gains access to tools like `search_content`, `get_page_metadata`, `list_directory_structure`, and `fetch_raw_markdown`. These tools give agents structured access to information, eliminating the need for heuristic-based web scraping.
These tools enable agents to discover documentation without the overhead of HTML parsing. For example, an agent can call `search_content` with a query like “authentication endpoints” and receive structured JSON containing matching Markdown files with relevance scores, summaries, and direct links to the raw content. The `get_page_metadata` tool extracts frontmatter from your Markdown files, allowing agents to understand content hierarchy, authorship, and relationships without guessing based on HTML structure.
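The exact response schema is not documented here, but a search tool of this shape can be sketched with a naive term-match scorer. The field names and scoring below are illustrative assumptions, not Mkdnsite's actual API:

```typescript
// Hypothetical shape of a search_content result, plus a naive scorer over an
// in-memory page index. Field names are illustrative, not a documented schema.

interface SearchResult {
  path: string;    // e.g. "/docs/auth.md"
  title: string;   // taken from frontmatter
  score: number;   // relevance, higher is better
  summary: string; // first line containing a query term
}

function searchContent(
  pages: { path: string; title: string; body: string }[],
  query: string,
): SearchResult[] {
  const terms = query.toLowerCase().split(/\s+/).filter(Boolean);
  const results: SearchResult[] = [];
  for (const page of pages) {
    const text = page.body.toLowerCase();
    // Score = number of distinct query terms present anywhere in the body.
    const score = terms.filter((t) => text.includes(t)).length;
    if (score === 0) continue;
    const line =
      page.body
        .split("\n")
        .find((l) => terms.some((t) => l.toLowerCase().includes(t))) ?? "";
    results.push({ path: page.path, title: page.title, score, summary: line.trim() });
  }
  return results.sort((a, b) => b.score - a.score);
}
```

A real implementation would use a proper full-text index, but the contract (query in, ranked structured results out) is what matters to the agent.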
You configure the MCP server in your OpenClaw settings by pointing to your Mkdnsite URL. The agent then treats your documentation as a queryable knowledge base rather than a website to be scraped. This integration works particularly well with the hosted service at mkdn.io, where agents can access public documentation repositories with zero configuration, enabling seamless knowledge transfer.
How do you deploy Mkdnsite in production?
You have several deployment options depending on your infrastructure preferences and existing tooling. If you use Bun, you can compile Mkdnsite to a standalone executable that runs without requiring Bun or Node.js to be installed on the target system, offering a highly portable deployment artifact:
```shell
bun build --compile ./src/index.ts --outfile mkdnsite
./mkdnsite --port 3000 --content ./docs
```
For Docker environments, the project provides a container image that accepts environment variables for configuration, making it easy to integrate into container orchestration platforms:
```shell
docker run -p 3000:3000 -v $(pwd)/docs:/content mkdnsite:latest \
  --github-repo owner/name \
  --theme dark
```
Mkdnsite also runs on standard Node.js or Deno if you prefer those runtimes, though you lose the single-binary convenience of Bun’s compilation. The GitHub repository integration allows you to point the server at a remote repo without cloning it locally, making it ideal for ephemeral environments or serverless deployments. The server pulls Markdown files on demand and caches them in memory, so you can deploy a documentation site in seconds without setting up complex build pipelines or CI/CD for content updates. This flexibility makes Mkdnsite adaptable to various hosting scenarios.
Why is runtime rendering superior to build-time for agent-facing sites?
Static site generators optimize for the assumption that content changes infrequently and human readers dominate traffic. They trade flexibility for speed by pre-rendering HTML. However, this assumption breaks down in the agentic era where your documentation serves both humans and AI agents, and where agents consume raw Markdown more efficiently than HTML. Runtime rendering directly addresses these new priorities.
Runtime rendering eliminates the build step entirely. When you update a Markdown file in your repository, agents see the change immediately without waiting for a rebuild and redeploy cycle. This matters for OpenClaw workflows where agents might be reading documentation to complete tasks, and stale content leads to errors and inefficient operations. Runtime rendering also means you do not need to maintain separate build pipelines for different output formats. The same Markdown file serves both audiences from a single source, simplifying content management.
The performance cost of runtime rendering is minimal with modern JavaScript runtimes like Bun. Markdown conversion is computationally cheap compared to HTML parsing for agents. You can always add a caching layer in front of Mkdnsite if you need to serve massive traffic, but for most documentation and blog use cases, the runtime overhead is imperceptible while the flexibility gains are substantial. This shift in paradigm prioritizes agility and agent efficiency over traditional static site benefits.
How does Mkdnsite handle theming for human users?
While agents receive raw Markdown, humans get a fully themed browsing experience designed for optimal readability and aesthetics. Mkdnsite includes a default UI that automatically detects system light or dark mode preferences and adjusts the color scheme accordingly, providing a personalized experience. You can customize the CSS by providing a custom stylesheet path or embedding styles directly in the configuration, allowing for complete brand alignment.
The theming system supports syntax highlighting themes for code blocks, allowing you to match your brand colors and maintain visual consistency across your site. Mermaid diagrams inherit the current theme colors, ensuring they do not clash with your design, and KaTeX math renders with proper font stacks and sizing for clear mathematical notation.
Because theming happens at runtime, you can change the appearance of your site without rebuilding content. This is particularly useful for white-label documentation where different clients might need different branding, or for A/B testing UI changes. You simply restart the server with new theme parameters, and all pages reflect the new design immediately. Agents ignore theming entirely, so they do not waste tokens processing CSS or JavaScript, further enhancing their efficiency.
What is llms.txt and why does Mkdnsite automate it?
The llms.txt standard is emerging as a way for websites to declare their agent-friendly endpoints and content structure. Similar to robots.txt but specifically tailored for AI consumption, this file tells agents where to find raw content, what the site contains, and how to navigate it in a structured manner. It acts as a machine-readable sitemap for the agentic web.
Mkdnsite automatically generates llms.txt based on your Markdown file structure and frontmatter. When you add a new documentation page, the server dynamically updates the llms.txt endpoint to include the new path, title, and description. Agents can fetch this file once to understand your entire site architecture, then request specific Markdown files as needed, optimizing their content discovery process.
This automation saves you from manually maintaining separate sitemaps or API documentation specifically for agents. The llms.txt file includes metadata about which files are available, their last modified dates, and their content summaries extracted from frontmatter. For OpenClaw developers, this means your agents can discover and index your documentation without human intervention or custom scraping logic, integrating seamlessly into their operational workflows.
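As a rough sketch, generating such a file from page metadata might look like the following. The output layout loosely follows the draft llms.txt proposal (an H1 title, a blockquote summary, then a list of links with descriptions); the function name and fields are hypothetical:

```typescript
// Generate an llms.txt body from page metadata. The layout loosely follows
// the draft llms.txt proposal; the exact output Mkdnsite produces may differ.

interface PageMeta {
  path: string;         // e.g. "/docs/api.md"
  title: string;        // from frontmatter
  description?: string; // from frontmatter, if present
}

function generateLlmsTxt(
  siteName: string,
  summary: string,
  pages: PageMeta[],
): string {
  const lines = [`# ${siteName}`, "", `> ${summary}`, "", "## Docs", ""];
  for (const p of pages) {
    const desc = p.description ? `: ${p.description}` : "";
    lines.push(`- [${p.title}](${p.path})${desc}`);
  }
  return lines.join("\n") + "\n";
}
```

Because the file is generated from the same frontmatter that drives the site, adding a page updates the agent-facing index automatically.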
How does the GitHub repository integration work?
Mkdnsite can operate in “remote mode”, pulling Markdown files directly from a GitHub repository rather than the local filesystem. You pass a repository identifier like `owner/repo` and optional branch or tag parameters:
```shell
mkdnsite --github-repo openclaw/docs --branch main --port 3000
```
The server uses the GitHub API to fetch file listings and raw content. It caches these responses in memory with a configurable time-to-live (TTL), so repeated requests for the same file do not unnecessarily hit GitHub’s API rate limits. When you push updates to your repository, Mkdnsite fetches the new content on the next request or cache invalidation, ensuring content freshness.
This mode is ideal for documentation sites where you want the content to live in GitHub but do not want to set up webhooks, CI/CD pipelines, or static hosting infrastructure. You can run Mkdnsite on a small Virtual Private Server (VPS) or even locally, and it serves the latest content from GitHub without any local git repository. Agents accessing your site get the same fresh content, making this a zero-maintenance solution for agent-readable documentation, simplifying content deployment significantly.
What are the performance implications of runtime Markdown rendering?
You might initially worry that rendering Markdown at runtime introduces latency compared to serving pre-built static HTML files. In practice, Bun.markdown is highly optimized and fast enough to handle hundreds of requests per second on modest hardware. The server maintains an efficient Least Recently Used (LRU) cache of rendered HTML, so repeat human visitors experience near-instant responses without the need for re-rendering, mitigating potential performance concerns.
For agents receiving raw Markdown, the overhead is essentially zero, as the server merely streams the file content directly from disk or GitHub. No rendering occurs for agent requests, ensuring maximum efficiency for AI consumers. Memory usage scales with the number of unique pages cached, but you can configure cache limits and TTL values to fit your specific hardware constraints and traffic patterns.
If you need to serve thousands of requests per second, you can place a Content Delivery Network (CDN) or a reverse proxy like Varnish in front of Mkdnsite. The inherently static nature of the content (even when rendered dynamically for humans) makes it highly cacheable at the edge. The key insight is that you only pay the rendering cost for human traffic, while agent traffic bypasses the rendering pipeline entirely, optimizing resource allocation based on the consumer.
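The rendered-HTML LRU cache mentioned above can be sketched with JavaScript's `Map`, whose insertion-order iteration makes the least recently used key the first one returned. Again, this is an illustrative sketch rather than Mkdnsite's implementation:

```typescript
// LRU cache via Map insertion order: re-inserting a key on access marks it as
// most recently used, so the first key in iteration order is the eviction
// victim once capacity is exceeded.

class LruCache<V> {
  private map = new Map<string, V>();

  constructor(private capacity: number) {}

  get(key: string): V | undefined {
    const value = this.map.get(key);
    if (value === undefined) return undefined;
    this.map.delete(key); // refresh recency by re-inserting at the end
    this.map.set(key, value);
    return value;
  }

  set(key: string, value: V): void {
    if (this.map.has(key)) this.map.delete(key);
    this.map.set(key, value);
    if (this.map.size > this.capacity) {
      // Evict the least recently used entry (first in iteration order).
      const oldest = this.map.keys().next().value as string;
      this.map.delete(oldest);
    }
  }
}
```

Keyed by URL path and bounded by a configurable capacity, a cache like this keeps repeat human page loads off the Markdown renderer entirely.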
How does Mkdnsite change the economics of agent communication?
Token costs are a significant factor when you run AI agents at scale. HTML is notoriously verbose: a simple paragraph wrapped in `<div class="content"><p>...</p></div>` consumes tokens that convey no semantic value to the agent. When agents scrape traditional websites, they often waste 30-50% of their context window on extraneous HTML tags, CSS classes, and JavaScript snippets.
By serving raw Markdown, Mkdnsite reduces the token count for agent consumption by 60-80% compared to HTML scraping. This translates directly to substantial cost savings on API calls to LLMs, making agent operations more economically viable. It also reduces latency, since agents process the content faster and can make fewer recursive requests to follow links (they can parse Markdown links more easily and reliably than HTML href attributes).
For OpenClaw developers running local models or paying per-token for cloud APIs, this efficiency matters greatly. Your agents can fit more documentation into their context windows, leading to better task completion rates, more comprehensive understanding, and fewer hallucinations caused by truncated or poorly parsed content. This fundamental shift improves the overall effectiveness and cost-efficiency of agentic systems.
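To make the overhead concrete, here is an illustrative comparison of the same paragraph as typical rendered HTML versus its Markdown source. Character counts stand in for tokens, and real savings depend on the tokenizer and on how much page chrome surrounds the content, so treat the numbers as a sketch rather than a benchmark:

```typescript
// The same content in both formats. Every tag, class, and id in the HTML
// version is payload the agent must tokenize but gains nothing from.

const markdown = `## Authentication

Send a \`POST\` request to **/v1/login** with your API key.`;

const html = `<div class="content"><h2 id="authentication">Authentication</h2>
<p>Send a <code>POST</code> request to <strong>/v1/login</strong> with your API key.</p></div>`;

const savings = 1 - markdown.length / html.length;
console.log(`Markdown: ${markdown.length} chars, HTML: ${html.length} chars`);
console.log(`Markdown is ${(savings * 100).toFixed(0)}% smaller`);
```

On full pages with navigation, scripts, and styling, the gap widens further, which is where the larger savings figures come from.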
What security considerations exist for agent-facing endpoints?
Serving content to autonomous agents introduces different security considerations than serving solely to humans. Agents may request content at much higher rates, making rate limiting a critical component to prevent accidental or malicious denial-of-service attacks. Mkdnsite supports configuration for request throttling based on IP address or user-agent patterns, allowing you to manage access effectively.
You should also validate that sensitive Markdown files are not inadvertently exposed to agents. Mkdnsite respects frontmatter flags like `agent_visible: false` to exclude specific pages from agent responses while still serving them to humans. This allows you to keep internal documentation in the same repository as public documents without exposing proprietary implementation details to AI agents.
MCP connections should be authenticated if your documentation contains proprietary or sensitive information. Mkdnsite supports API key validation for MCP endpoints, ensuring only your authorized OpenClaw agents can query the content programmatically. Unlike public HTML scraping, MCP access can be restricted, monitored, and audited, giving you granular visibility and control over what agents are reading and accessing within your knowledge base.
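The `agent_visible` flag check can be sketched as a small frontmatter filter. This minimal parser handles only simple `key: value` lines and is an assumption about the mechanics, not Mkdnsite's parser (a real server would use a proper YAML library):

```typescript
// Return false when a page's frontmatter opts out of agent visibility with
// `agent_visible: false`. Pages without frontmatter, or without the flag,
// are visible by default.

function isAgentVisible(markdown: string): boolean {
  const match = markdown.match(/^---\n([\s\S]*?)\n---/);
  if (!match) return true; // no frontmatter: visible by default
  for (const line of match[1].split("\n")) {
    const [key, ...rest] = line.split(":");
    if (key.trim() === "agent_visible") {
      return rest.join(":").trim() !== "false";
    }
  }
  return true;
}
```

The server would apply this filter only on the agent-facing branch of content negotiation, so humans still see the page rendered as HTML.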
How can OpenClaw developers integrate Mkdnsite into their agent workflows?
If you are building with OpenClaw, Mkdnsite serves as an ideal documentation and knowledge base backend, providing a native way for your agents to consume information. You can deploy an instance pointing at your OpenClaw skills repository, giving your agents immediate access to skill documentation without the complexities of HTML parsing. This streamlines the process of agents understanding and utilizing new capabilities.
Configure your OpenClaw agents to use the Mkdnsite MCP server by adding the endpoint to your `clawconfig.json` file. This tells your agents where to find the structured knowledge base:
```json
{
  "mcpServers": {
    "docs": {
      "command": "curl",
      "args": ["https://docs.yoursite.com/mcp"]
    }
  }
}
```
Your agents can then query documentation using natural language through the MCP tools, retrieving relevant information directly. When you update your skills documentation in Markdown, the agents see the changes immediately, ensuring they always operate with the most current information. This creates a tight feedback loop where your documentation stays in sync with your agents’ capabilities, leading to more reliable and effective agent performance.
You can also use Mkdnsite to serve agent-readable API specifications. Instead of maintaining separate OpenAPI JSON files and human-readable documentation, write your API documentation in Markdown with structured frontmatter. Agents consume the Markdown for context and direct interaction, while humans view the beautifully rendered HTML with examples and explanations, providing a unified source of truth.
What limitations should builders know about before adopting Mkdnsite?
Mkdnsite is a specialized tool and is not intended as a replacement for every web development use case. If you need complex dynamic functionality like user authentication, sophisticated database queries, or server-side rendering of interactive JavaScript frameworks (e.g., React components), you will still require a full-stack framework or a more traditional web application architecture. Mkdnsite is purpose-built for content sites: documentation, blogs, knowledge bases, and API references, where content is primarily text-based.
The runtime rendering model means you need a running server process. Unlike static files that can be thrown on object storage like S3 or GitHub Pages for free, Mkdnsite requires dedicated compute resources. However, the resource requirements are generally minimal; a modest $5 VPS can typically handle substantial traffic for most documentation sites.
Currently, Mkdnsite works best with Bun for the full feature set, including standalone executable compilation and optimal performance. While Node.js and Deno are supported, you may encounter subtle differences or edge cases with specific Markdown extensions or runtime behaviors. The ecosystem is also young; you will not find the hundreds of themes and plugins available for more established static site generators like Hugo or Gatsby. You may need to write custom CSS or JavaScript for specialized layouts or advanced interactive elements.
What comes next for Mkdnsite and the agentic web?
The existence of a hosted service at mkdn.io suggests a Software-as-a-Service (SaaS) offering is likely in the future, potentially providing managed Mkdnsite instances with automatic SSL, custom domains, and team collaboration features. Expect to see deeper integrations with the OpenClaw ecosystem, possibly including native plugins that allow OpenClaw agents to deploy and manage Mkdnsite instances automatically as part of broader agent workflows, further automating infrastructure.
The creator is actively seeking feedback on the Hacker News thread, indicating a commitment to rapid iteration and community-driven development. Expect improvements to the MCP server specification compliance, additional runtime targets beyond Bun, and possibly a more robust plugin architecture for custom Markdown processing and extensibility.
For the broader agentic web, Mkdnsite represents a new category: agent-native infrastructure. We will likely see similar tools emerge for other content types, creating a bifurcated web where humans browse richly rendered HTML and agents consume structured raw formats. Mkdnsite is an early pioneer of this pattern, and it establishes a template others can follow. Builders should watch whether standards like llms.txt gain widespread traction and whether major search engines begin prioritizing sites that offer native agent endpoints.
Frequently Asked Questions
What is Mkdnsite and how does it differ from a static site generator?
Mkdnsite is a runtime-only web server that serves Markdown files directly without a build step. Unlike static site generators like Astro or Hugo that compile Markdown to HTML at build time, Mkdnsite uses HTTP content negotiation to serve HTML to browsers and raw Markdown to AI agents on demand. This eliminates build pipelines while keeping content in its native format for agent consumption.
How does Mkdnsite decide whether to serve HTML or Markdown?
Mkdnsite inspects the HTTP `Accept` header and `User-Agent` signatures. Browsers requesting `text/html` receive rendered HTML with CSS theming, while agents sending `Accept: text/markdown` or specific AI agent headers receive the raw Markdown source. This happens at runtime via Bun.markdown, ensuring both humans and agents get optimal formats without conversion overhead.
Can Mkdnsite run on Node.js or Deno, or is it Bun-only?
While Mkdnsite leverages Bun.markdown for optimal performance and can compile to standalone executables using `bun build --compile`, it also supports Node.js and Deno runtimes. You can deploy it as a Docker container, OS-specific binary, or directly via Bun, Node, or Deno depending on your infrastructure requirements.
How did AI agents contribute to building Mkdnsite?
The creator used OpenClaw agents configured to communicate via Slack as a development team. Starting March 7, 2026, these agents handled requirements gathering, scaffolding, feature implementation, and bug fixes. By March 28, they produced v1.4.1 with almost every line of code written by AI through autonomous OpenClaw sessions or Claude Code interactions.
Is Mkdnsite suitable for production documentation sites?
Yes, Mkdnsite includes production-ready features like full-text search for humans, MCP server integration for agents, GitHub repository pulling, syntax highlighting, Mermaid diagrams, and KaTeX math rendering. However, for extremely high-traffic sites, you should implement caching layers (e.g., a CDN or reverse proxy) since it renders Markdown at runtime rather than serving pre-built static files, to ensure optimal performance.