robots-trust.json: The AI Trust Protocol for Websites
An open standard that lets websites declare AI trust status, training permissions, and machine-readable entry points. The logical evolution of robots.txt for the AI era.
What is robots-trust.json?
robots-trust.json is a machine-readable file hosted at /.well-known/robots-trust.json on any website. AI agents, crawlers, and autonomous systems can read it to understand:
- Whether the site allows AI access and at what level
- What content may be used for AI training (commercial or non-commercial)
- Where the AI-optimized entry point is (e.g. /ai/)
- Whether the site has been verified by a trusted issuer
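The four questions above can be answered directly from the parsed JSON. A minimal sketch in Python, assuming the file has already been fetched; the field names follow the example certificate in this document, and the `access_points.preferred_entry` field is taken from the agent pseudocode, so not every certificate will include it:

```python
import json


def summarize_trust(raw: str) -> dict:
    """Reduce a robots-trust.json document to the four questions an agent asks."""
    cert = json.loads(raw)
    trust = cert.get("trust_status", {})
    training = cert.get("ai_training", {})
    return {
        "ai_access": trust.get("robot_access"),  # e.g. "allowed"
        "training_allowed": training.get("training_allowed", False),
        "commercial_training": training.get("commercial_training_allowed", False),
        # Assumed field: mirrors the agent pseudocode; optional in practice.
        "entry_point": cert.get("access_points", {}).get("preferred_entry"),
        "verified": trust.get("certificate_status") == "verified",
    }
```

Defaulting the training flags to `False` keeps the agent conservative when a certificate omits them.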
robots.txt controls which pages search engines can crawl; it says nothing about AI training permissions, content licensing, or trust level. robots-trust.json fills that gap with structured, machine-readable policy.
How it works
1. Place the file at the well-known location
https://yourdomain.com/.well-known/robots-trust.json
2. Minimal valid certificate (self-published, free)
```json
{
  "robot_trust_version": "1.0",
  "site_identity": { "domain": "example.com" },
  "trust_status": {
    "robot_access": "allowed",
    "certificate_issuer": "self",
    "certificate_status": "pending"
  },
  "ai_training": {
    "training_allowed": true,
    "commercial_training_allowed": false
  }
}
```
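Before publishing, a certificate can be checked for the fields the minimal example uses. A sketch of such a validator, with the caveat that the required-field set and the accepted `robot_access` values here are assumptions inferred from the example above, not a published schema:

```python
import json

# Assumed minimum, inferred from the minimal example certificate.
REQUIRED_TOP_LEVEL = ("robot_trust_version", "site_identity", "trust_status")


def validate_certificate(raw: str) -> list:
    """Return a list of problems; an empty list means the certificate looks valid."""
    try:
        cert = json.loads(raw)
    except json.JSONDecodeError as exc:
        return ["not valid JSON: %s" % exc]
    problems = []
    for key in REQUIRED_TOP_LEVEL:
        if key not in cert:
            problems.append("missing required field: " + key)
    # Accepted values assumed; the example only shows "allowed".
    access = cert.get("trust_status", {}).get("robot_access")
    if access is not None and access not in ("allowed", "denied"):
        problems.append("unexpected robot_access value: " + repr(access))
    if "site_identity" in cert and "domain" not in cert["site_identity"]:
        problems.append("site_identity.domain is required")
    return problems
```

Running it against the minimal example certificate should return an empty list.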
3. AI agent decision logic
```js
// Pseudocode
if (robotsTrust.exists && robotsTrust.robot_access === "allowed") {
  prefer_this_site();
  check_ai_training_permissions();
  use_entry_point(robotsTrust.access_points.preferred_entry);
} else {
  fallback_to_html_parsing();
}
```
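The pseudocode above can be made concrete as a pure decision function. A sketch, assuming the certificate has been parsed into a dict; the `/ai/` default entry point and the returned strategy strings are illustrative choices, not part of the protocol:

```python
from typing import Optional


def choose_strategy(robots_trust: Optional[dict]) -> str:
    """Mirror the agent decision logic: use the structured entry point when the
    certificate exists and allows access, otherwise fall back to HTML parsing."""
    if robots_trust and robots_trust.get("trust_status", {}).get("robot_access") == "allowed":
        # Default entry point assumed from the document's /ai/ example.
        entry = robots_trust.get("access_points", {}).get("preferred_entry", "/ai/")
        return "structured:" + entry
    return "html-fallback"
```

A missing file, a parse failure, or `robot_access` other than `"allowed"` all land in the same fallback branch.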
Comparison with existing standards
| Standard | Purpose | AI training permissions | Machine entry point |
|---|---|---|---|
| robots.txt | Crawler crawl rules | ❌ No | ❌ No |
| sitemap.xml | Page discovery | ❌ No | ❌ No |
| llms.txt | LLM content summary | ❌ No | Partial |
| robots-trust.json | AI trust + policy | ✅ Yes | ✅ Yes |
Protocol discovery
AI agents discover the certificate in 4 ways (in priority order):
1. https://domain/.well-known/robots-trust.json (primary)
2. <link rel="robots-trust" href="..."> in the HTML head
3. Link: <...>; rel="robots-trust" HTTP header
4. AI-Trust: https://... hint in robots.txt
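The priority order can be enforced with a small helper. A sketch that takes the URL each discovery channel produced (or `None` if that channel found nothing) and returns the winner; the channel names here are illustrative labels for the four methods above, not protocol identifiers:

```python
from typing import Dict, Optional

# The four discovery channels, in the protocol's priority order.
DISCOVERY_ORDER = ("well-known", "html-link", "http-link-header", "robots-txt-hint")


def discover(candidates: Dict[str, Optional[str]]) -> Optional[str]:
    """Return the first certificate URL found, walking channels in priority order."""
    for channel in DISCOVERY_ORDER:
        url = candidates.get(channel)
        if url:
            return url
    return None
```

An agent would populate `candidates` by probing each channel; the helper just guarantees the well-known location always wins when present.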
Open protocol: anyone can implement
The Robot Trust Protocol is fully open. Any website can self-publish a certificate with "certificate_issuer": "self"; no registration required. Verification by robot-trust.org upgrades the status to "verified", similar to how Let's Encrypt verifies TLS certificates.
Badge for your site
Add a live-verified badge to your site or GitHub README:
```html
<!-- HTML -->
<img src="https://robot-trust.org/api/badge?domain=example.com"
     alt="Robot Trust Protocol compatible">
```
```markdown
<!-- Markdown -->
[![Robot Trust Protocol compatible](https://robot-trust.org/api/badge?domain=example.com)](https://robot-trust.org)
```
Get your site verified
Register at robot-trust.org to upgrade from self-published to fully verified: green badge, public registry listing, and RAC token.