WebMCP Is the Semantic Web Reborn
Depends on: agent-traffic-thesis
Twenty years ago, Tim Berners-Lee proposed a vision: make the web machine-readable. RDF, OWL, SPARQL — an entire stack of specifications designed to let software understand web content at a semantic level.
It failed. Not because the technology was bad, but because it was born into the wrong era.
The Missing Consumer
In 2004, there were no machine consumers of the web. Search engine crawlers wanted keywords, not ontologies. No software agent was going to navigate a SPARQL endpoint to book a flight. The ROI of annotating your website with RDF triples was approximately zero — all the cost fell on publishers, with no demand-side pull.
This is the classic two-sided market cold start problem. You need both sides to show up, and neither will without the other.
What Changed
AI agents changed everything. By 2025, autonomous agents are browsing the web, making purchases, filing reports, and consuming APIs. They are real, paying consumers of web content. The demand side of the market now exists.
WebMCP (Chrome 146, February 2026) lets websites declare structured tool interfaces — what actions are available, what parameters they accept, what they return. The browser acts as the protocol mediator between website and agent.
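A site-side tool declaration might look roughly like the sketch below. This is illustrative, not the final specified API: the `registerTool` method name, the option shape, and the `search_flights` tool are all assumptions layered on the `navigator.modelContext` entry point mentioned above.

```javascript
// Hypothetical WebMCP tool registration. The registerTool name and
// option shape are illustrative assumptions, not the final spec.
const searchFlightsTool = {
  name: "search_flights",
  description: "Search available flights by route and date",
  // Parameters are described with plain JSON Schema.
  inputSchema: {
    type: "object",
    properties: {
      from: { type: "string", description: "IATA origin code" },
      to:   { type: "string", description: "IATA destination code" },
      date: { type: "string", format: "date" },
    },
    required: ["from", "to", "date"],
  },
  // The browser would invoke this when an agent calls the tool.
  async execute({ from, to, date }) {
    // A real site would hit the same backend its own UI uses;
    // this stub just echoes the request as a fake result.
    return { flights: [{ from, to, date, flight: "XX123" }] };
  },
};

// Feature-detect before registering, since the API is new.
if (typeof navigator !== "undefined" && navigator.modelContext?.registerTool) {
  navigator.modelContext.registerTool(searchFlightsTool);
}
```

The point of the shape is the thesis above in miniature: the schema is plain JSON Schema, the handler is an ordinary function, and nothing here requires a separate semantic-web-style toolchain.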
The critical differences from the Semantic Web:
- JSON Schema instead of RDF/OWL — developers already know JSON Schema. Zero learning curve.
- Browser-native API — `navigator.modelContext` is a standard Web API, not a separate technology stack.
- Read-write-execute, not read-only — agents can search, create, and purchase, not just observe.
- Incremental adoption — add a `toolname` attribute to an existing HTML form. Five minutes of work.
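The last bullet's declarative path might look like this markup sketch. The `toolname` attribute is the annotation named in the text above; everything else is an ordinary HTML form, and the `/search` endpoint and field names are made up for illustration.

```html
<!-- An ordinary search form, exposed to agents by annotating it.
     toolname is the hypothetical WebMCP-style attribute from the
     bullet above; the rest is plain HTML that already works. -->
<form action="/search" method="get" toolname="search_products">
  <label>Query <input name="q" required></label>
  <button type="submit">Search</button>
</form>
```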
The Timing Thesis
The same technology in a different era produces completely different outcomes. RDF in 2004 had no consumers. WebMCP in 2026 has millions of agents ready to use it.
This isn't a new insight — it's one of the most reliable patterns in technology: the market has to be ready for the solution. What killed the Semantic Web wasn't bad technology. It was premature deployment into a market that didn't need it yet.
Implications
If this analysis is correct, we should expect rapid WebMCP adoption once agent traffic becomes a meaningful fraction of web traffic. The tipping point isn't a technical milestone — it's an economic one: when websites see enough agent-driven revenue to justify the annotation effort.
The smart move for any web property right now is to instrument early, before the land grab. The first sites with good WebMCP tool coverage will capture disproportionate agent traffic, just as the first sites with good SEO captured disproportionate search traffic in the early 2000s.