The internet was made for people. That feels like a strange thing to have to say, but it’s worth saying because it’s becoming less true by the day.
When the World Wide Web first took shape in the early ’90s, the whole point was human communication. Web pages. Emails. Hyperlinks you clicked because you were curious. The consumer of the internet was a person sitting at a computer, reading text rendered in a browser. Everything about the web (its markup, its visual design, its information architecture) was oriented around a human on the other end.
Sure, there were nerds writing scripts to make machines talk to each other. Some rudimentary automations, some bots crawling pages. But that was a footnote. The overwhelming majority of the internet’s traffic and purpose was human-to-human. People publishing for people. People reading what other people wrote.
That was era one.
The Invisible Web
Fast forward to the last two decades, and something shifted underneath us. A massive portion of the internet became machine-to-machine. Servers talking to servers. APIs exchanging data. Microservices calling microservices. System-to-system communication that no human ever directly sees or touches.
This is the internet that powers your food delivery app, your bank’s fraud detection, your streaming recommendations. It’s working constantly, at enormous scale, but it exists entirely behind the scenes. You never visit these endpoints in a browser. You can’t. They weren’t made for you.
And that was fine. Because this invisible web was still in service of human needs. The APIs existed so that the apps you used could function. The machines talked to each other so you didn’t have to. The infrastructure was hidden, but the purpose was clear: make things work for people.
That was era two.
The New Consumer
Now something else is happening. AI services (LLMs, search agents, training pipelines) are consuming the web at a scale and in a way that doesn’t fit neatly into either of those first two eras.
They’re not humans browsing. They’re not APIs serving an app. They’re machines reading content that was written for humans, extracting meaning from it, and using it for purposes the original authors may never have intended. Training data. Search responses. Summarization. Knowledge retrieval.
This has been happening for a while now. Every major AI lab has scraped vast portions of the public web to build their models. But what’s changed recently is the volume and the permanence of this pattern. AI isn’t just training on the web once and moving on. AI search tools like ChatGPT, Perplexity, and Google’s AI Overviews are actively crawling pages in real time to answer user queries. AI agents are browsing on behalf of users. The web isn’t just a training set anymore. It’s a live data source being queried continuously by machines.
This traffic is becoming a substantial share of what moves across the internet. And it’s beginning to reshape what the internet is.
The Accommodation
This is where it gets interesting.
Until very recently, most web pages were designed with a single consumer in mind: a human with a browser. HTML, CSS, JavaScript, all of it exists to render a visual, interactive experience for a person looking at a screen. When an AI bot crawls that same page, it has to wade through all of that presentational markup to get to the actual content. Navigation bars, footers, ad containers, tracking scripts, cookie banners. None of it is useful to an AI trying to understand what the page is about.
This is wasteful. HTML is verbose. A page that’s 16,000 tokens in raw HTML might contain 3,000 tokens of actual content. That difference matters when you’re working within context windows that have hard limits.
So in the last year or two, I’ve noticed something: developer-oriented websites have started offering their content as clean markdown files alongside their regular HTML pages. Documentation sites, technical blogs, reference pages, places where the audience already skews technical, began providing a machine-friendly format. Not as a replacement for the human version, but as an addition. A parallel track.
Third-party services popped up to do this conversion too. Tools that strip HTML down to markdown so AI agents can consume pages more efficiently. It was a workaround, useful but fragmented.
Some people saw this coming early. Developers are the heaviest users of AI agents, which is why developer-facing sites (docs platforms, API references, dev tools) adapted first: the AI coding agents their users rely on needed to read docs without choking on HTML.
I did this myself, actually. I added llms.txt and robots.txt files to my own sites and sprinkled JSON-LD structured data into head sections, basically making it easier for AI systems to find and understand information without parsing the whole site. But all of this was manual, niche, and done one site at a time.
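For context, JSON-LD is just a small JSON object, using schema.org vocabulary, embedded in the page head where crawlers know to look. A minimal sketch in Python; the headline, author, and date here are placeholders, not from any real site:

```python
import json

# A minimal schema.org "Article" object. All field values below are
# placeholders for illustration, not data from a real page.
article_jsonld = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Example Article Title",
    "author": {"@type": "Person", "name": "Jane Doe"},
    "datePublished": "2026-02-01",
}

# Serialized, this is the text that would sit inside a
# <script type="application/ld+json"> tag in the page's <head>.
snippet = json.dumps(article_jsonld, indent=2)
print(snippet)
```

The point is that an AI system can read this one block and know the page's type, author, and date without parsing any of the surrounding markup.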
And then Cloudflare stepped in.
The Toggle
In February 2026, Cloudflare launched a feature called “Markdown for Agents.” It does exactly what it sounds like.
The thing about Cloudflare is that they sit in front of a staggering portion of the internet. Over 20% of all global web traffic flows through their network. More than 20 million websites. When Cloudflare adds a feature, it doesn’t affect a niche, it reshapes infrastructure.
The feature works through standard HTTP content negotiation. When an AI bot sends a request with the header Accept: text/markdown, Cloudflare’s edge network intercepts it, converts the HTML to markdown on the fly, and returns the clean version. No separate endpoint needed. No changes to your origin server. Just a toggle in the dashboard.
The efficiency gains are significant. Cloudflare’s own example showed a page going from 16,180 HTML tokens to 3,150 markdown tokens, an 80% reduction. Something as simple as <h2 class="section-title" id="about">About Us</h2> costs 12-15 tokens in HTML. The markdown equivalent, ## About Us, costs about 3.
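The savings are visible even at the level of a single element. Here is a rough sketch using a crude word-and-punctuation split as a stand-in for a real tokenizer; the absolute counts won’t match an LLM tokenizer, but the ratio between the two formats is the point:

```python
import re

html = '<h2 class="section-title" id="about">About Us</h2>'
markdown = "## About Us"

def rough_tokens(text: str) -> int:
    # Crude proxy for a tokenizer: count word runs and individual
    # punctuation characters. Real LLM tokenizers differ, but the
    # HTML-to-markdown ratio comes out similarly lopsided.
    return len(re.findall(r"\w+|[^\w\s]", text))

print(rough_tokens(html), rough_tokens(markdown))
```

Every attribute, quote mark, and closing tag in the HTML version is a cost the model pays for content it doesn’t need.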
Site owners can choose which AI bots get access. They can limit which paths are exposed. And the response includes a token count header so AI agents can manage their context budgets.
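The negotiation described above, including the per-bot access control and the token-count header, can be sketched as a toy server-side handler. The bot allowlist, the conversion function, and the x-markdown-tokens header name are illustrative assumptions, not Cloudflare’s actual implementation:

```python
# Toy sketch of edge-side markdown negotiation. The allowlist, the
# converter, and the token-count header name are assumptions made for
# illustration, not Cloudflare's real behavior.

ALLOWED_BOTS = {"ExampleBot", "SearchAgent"}  # hypothetical bot names

def html_to_markdown(html: str) -> str:
    # Stand-in for a real HTML-to-markdown converter.
    return html.replace("<h2>", "## ").replace("</h2>", "")

def handle_request(user_agent: str, accept: str, origin_html: str) -> dict:
    # Standard content negotiation: honor Accept: text/markdown,
    # but only for bots the site owner has opted in.
    if "text/markdown" in accept and user_agent in ALLOWED_BOTS:
        body = html_to_markdown(origin_html)
        return {
            "content-type": "text/markdown",
            "x-markdown-tokens": str(len(body.split())),  # hypothetical header
            "body": body,
        }
    # Everyone else, humans included, gets the normal HTML page.
    return {"content-type": "text/html", "body": origin_html}

resp = handle_request("ExampleBot", "text/markdown", "<h2>About Us</h2>")
print(resp["content-type"], resp["body"])
```

The origin server never sees any of this; the negotiation happens entirely at the edge, which is why it costs the site owner nothing but a toggle.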
This isn’t a hacky workaround from a startup. This is the company that operates a fifth of the internet saying: here’s a switch. Flip it, and your website speaks machine.
One toggle. Over 20 million websites can now speak machine.
The Shift in Dynamic
Before this, the dynamic was clear: content existed for humans, and AI came along and took it. Scraped it, parsed it, cleaned it, repurposed it. The creator had no role in that process.
Now that’s flipping. Site owners are choosing to present their content to AI directly, serving a clean, machine-optimized version alongside the human one. AI isn’t just a consumer taking what it can get; it’s becoming a first-class audience that content is prepared for.
It’s the difference between someone photocopying your book without asking and you handing them a formatted digital copy. The content may be the same. The relationship is completely different.
What This Means
I’m not making a judgment call about whether this is good or bad. I’m just observing what’s happening and where it leads.
The trend was already underway, the internet moving toward what I’ve written about before as The Zero-Click Internet: When AI Becomes the Only Door to Information. Your AI interface browses for you. It reads the pages, synthesizes the answers, delivers the result. You never visit the source. The website gets consumed but never seen.
Cloudflare’s feature doesn’t create that trend. But it accelerates it enormously. It removes the friction. Before, AI bots had to do the messy work of parsing HTML. Now, with a single toggle, over 20 million websites can serve pre-digested content directly to AI. The plumbing is done.
And this is the part I find genuinely fascinating, and a little disorienting. The logical consequence of all this is that the internet starts being made for AI.
It’s already happening. Web content is increasingly being written not just for human readers, but for AI consumption. SEO, which for two decades meant optimizing for Google’s algorithm, is mutating into something new. How do you rank higher in an AI-generated response? What patterns does ChatGPT favor when it synthesizes search results? How do you ensure your content gets surfaced by Perplexity or Claude when a user asks a question?
These aren’t hypothetical questions. People are already asking them. Businesses are already optimizing for them. Content is being written with AI, for AI, based on AI search patterns. The audience for a web page is no longer just the person who might visit it. It’s the model that might ingest it.
Three Eras
So here’s the arc, as I see it.
Era one: the human web. Pages made for people, by people. Emails, forums, hyperlinks. The internet as a communication tool between humans.
Era two: the API web. Machines talking to machines in the background. Invisible infrastructure serving human needs through apps and services. The internet as plumbing.
Era three, the one we’re entering now: the AI web. Machines consuming content that was originally made for humans, and that content increasingly being reshaped to serve machines first. The internet as a data source for AI, with humans interacting through an AI intermediary rather than directly.
Each era didn’t replace the previous one. The human web still exists. APIs are more prevalent than ever. But each new era shifts the center of gravity: the thing that most of the internet’s design, content, and infrastructure orients itself around.
And right now, that center of gravity is moving fast toward AI as the primary consumer of web content. Cloudflare putting a toggle on it just made the shift official.
That’s the reality we’re in.
This article was written by me, a human. I used an LLM-powered grammar checker for final review.