How It Works
Place a robots-trust.json file at the well-known URI. AI agents fetch it before interacting with your site to understand your AI interaction policy.
1. Place the file
Host the file at:
https://yourdomain.com/.well-known/robots-trust.json
2. AI agents read it
Any AI crawler or agent checks this URL to get your policy before crawling, scraping, or training on your content.
async function checkTrust(domain) {
  const url = `https://${domain}/.well-known/robots-trust.json`;
  const r = await fetch(url);
  if (!r.ok) return null;
  return await r.json();
}
Minimal Valid Certificate
Only robot_trust_version is strictly required, but a file without an ai_policy block carries no policy signal. The simplest useful file:
{
  "robot_trust_version": "1.0",
  "ai_policy": {
    "agents": "allowed",
    "training": "restricted",
    "scraping": "limited"
  }
}
A fuller certificate adds the optional protocol link plus the trust_status and access_points blocks:
{
  "robot_trust_version": "1.0",
  "protocol": "https://robot-trust.org/protocol",
  "ai_policy": {
    "agents": "allowed",
    "training": "restricted",
    "scraping": "limited"
  },
  "trust_status": {
    "certificate_status": "verified",
    "certificate_issuer": "robot-trust.org",
    "trust_level": "pro"
  },
  "access_points": {
    "preferred_entry": "/ai/",
    "rate_limit_policy": "ai-friendly"
  }
}
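Since only robot_trust_version is required, a consumer has to tolerate missing blocks. A minimal sketch of a tolerant reader (validateCertificate is a hypothetical helper, not part of the protocol):

```javascript
// Hypothetical helper: checks the one required field and normalizes
// the optional blocks so downstream code can read them safely.
function validateCertificate(doc) {
  if (!doc || typeof doc !== "object") return null;
  if (doc.robot_trust_version !== "1.0") return null; // only required field
  return {
    version: doc.robot_trust_version,
    ai_policy: doc.ai_policy || {},       // optional policy block
    trust_status: doc.trust_status || {}, // optional verification block
  };
}

const cert = validateCertificate({
  robot_trust_version: "1.0",
  ai_policy: { agents: "allowed", training: "restricted", scraping: "limited" },
});
console.log(cert.ai_policy.training); // "restricted"
```

Normalizing up front means the rest of the agent never has to null-check optional blocks.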
ai_policy block
The ai_policy object is the core policy signal. AI agents read this first.
| Field | Values | Meaning |
|---|---|---|
| agents | "allowed" · "restricted" · "denied" | Whether AI agents may interact with this site |
| training | "allowed" · "restricted" · "denied" | Whether content may be used for AI model training |
| scraping | "allowed" · "limited" · "denied" | Whether automated scraping is permitted |
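The value sets in the table above can be checked mechanically. A sketch of such a check, using a hypothetical invalidPolicyFields helper:

```javascript
// Allowed values per field, taken from the ai_policy table.
const AI_POLICY_VALUES = {
  agents:   ["allowed", "restricted", "denied"],
  training: ["allowed", "restricted", "denied"],
  scraping: ["allowed", "limited", "denied"],
};

// Hypothetical helper: returns the field names whose values fall
// outside the allowed set. Absent fields are simply skipped.
function invalidPolicyFields(aiPolicy) {
  return Object.keys(AI_POLICY_VALUES).filter((field) => {
    const value = aiPolicy[field];
    return value !== undefined && !AI_POLICY_VALUES[field].includes(value);
  });
}

// "maybe" is not a defined scraping value, so it is flagged.
console.log(invalidPolicyFields({ agents: "allowed", scraping: "maybe" }));
```

An agent could treat any flagged field as if it were absent and fall back to its default behavior.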
trust_status block
Optional when self-declaring a policy; required for certificates verified by robot-trust.org.
| Field | Type | Description |
|---|---|---|
| certificate_status | required | "verified" · "pending" · "revoked" |
| certificate_issuer | optional | Who issued the certificate (e.g. "robot-trust.org") |
| trust_level | optional | "basic" · "pro" · "enterprise" |
| issued | optional | ISO 8601 date (YYYY-MM-DD) |
| expires | optional | ISO 8601 date (YYYY-MM-DD) |
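An agent that consumes verified certificates will typically want to check the status and the validity dates together. A sketch under that assumption (isCertificateCurrent is a hypothetical helper):

```javascript
// Hypothetical helper: a certificate is current only if its status is
// "verified" and today falls inside [issued, expires]. Dates are
// ISO 8601 (YYYY-MM-DD); absent dates are simply not checked.
function isCertificateCurrent(trustStatus, today = new Date()) {
  if (!trustStatus || trustStatus.certificate_status !== "verified") return false;
  if (trustStatus.issued && today < new Date(trustStatus.issued)) return false;
  if (trustStatus.expires && today > new Date(trustStatus.expires)) return false;
  return true;
}

const ok = isCertificateCurrent(
  { certificate_status: "verified", issued: "2024-01-01", expires: "2030-01-01" },
  new Date("2025-06-15")
);
console.log(ok); // true
```

A "pending" or "revoked" status fails immediately, so expiry only matters for verified certificates.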
Top-level fields
| Field | Type | Description |
|---|---|---|
| robot_trust_version | required | Always "1.0" |
| protocol | optional | Link to this page so agents can find the spec |
| schema | optional | Link to the JSON Schema for validation |
Example Implementations
Basic site (allow all)
{
  "robot_trust_version": "1.0",
  "ai_policy": {
    "agents": "allowed",
    "training": "allowed",
    "scraping": "allowed"
  }
}
No training (content protected)
{
  "robot_trust_version": "1.0",
  "ai_policy": {
    "agents": "allowed",
    "training": "denied",
    "scraping": "limited"
  }
}
Private / restricted
{
  "robot_trust_version": "1.0",
  "ai_policy": {
    "agents": "restricted",
    "training": "denied",
    "scraping": "denied"
  }
}
Agent Decision Logic
Here is reference logic for how an AI agent should implement support for this protocol:
async function applyRobotTrustPolicy(domain) {
  const url = `https://${domain}/.well-known/robots-trust.json`;
  let policy;
  try {
    const r = await fetch(url, { signal: AbortSignal.timeout(5000) });
    if (!r.ok) return; // No policy → proceed with default behavior
    policy = await r.json();
  } catch {
    return; // Timeout or error → proceed with default behavior
  }
  const ap = policy.ai_policy || {};
  // 1. Agent access
  if (ap.agents === "denied") throw new Error("AI agents not permitted on this site");
  if (ap.agents === "restricted") console.warn("Restricted access → check site rules");
  // 2. Training
  // markAsNoTraining, markAsRestrictedTraining, and applyPoliteDelay are
  // agent-specific hooks to be supplied by the implementer.
  if (ap.training === "denied") markAsNoTraining(domain);
  if (ap.training === "restricted") markAsRestrictedTraining(domain);
  // 3. Scraping
  if (ap.scraping === "denied") throw new Error("Scraping not permitted");
  if (ap.scraping === "limited") applyPoliteDelay(domain);
}
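Separating the policy interpretation from the network fetch makes the logic above easy to unit-test. A sketch of that split, with decideActions as a hypothetical pure helper mirroring the reference logic:

```javascript
// Hypothetical pure helper: maps an ai_policy block to a set of
// agent-side decisions, with permissive defaults for absent fields.
function decideActions(aiPolicy = {}) {
  return {
    mayInteract: aiPolicy.agents !== "denied",
    mayTrain: aiPolicy.training === "allowed",
    mayScrape: aiPolicy.scraping !== "denied",
    throttle: aiPolicy.scraping === "limited", // apply a polite delay
  };
}

const actions = decideActions({
  agents: "allowed",
  training: "denied",
  scraping: "limited",
});
console.log(actions.mayTrain, actions.throttle); // false true
```

The fetch-and-parse step then only has to hand the ai_policy object to this function, keeping the network error handling and the policy semantics independently testable.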
Further Reading
Full Specification
Complete protocol documentation including all fields, versioning, and certificate lifecycle.
JSON Schema
Machine-readable schema for validating robots-trust.json files. Use with any JSON Schema validator.
Online Validator
Enter any domain to check whether it has a valid robots-trust.json certificate.
Get a Certificate
Generate a robots-trust.json for your domain in under two minutes. The Basic plan is free.