Use three tools in sequence: Google’s Rich Results Test to confirm all schema types parse with zero errors, the Schema.org validator to check semantic correctness, and a manual View Source check to confirm JSON-LD appears in static HTML before any JavaScript runs. That third step is the most important. AI crawlers don’t execute JavaScript, so schema that only appears after rendering is invisible to them.[1]
Run every new page through the three-step validation sequence before publishing: Rich Results Test → Schema.org validator → View Source check. Build this into your publishing workflow, not your post-publishing audit.
Each tool catches different types of errors. Rich Results Test catches structural and eligibility issues. Schema.org validator catches semantic vocabulary errors. View Source catches the critical JavaScript-injection problem that the other two miss.
Right now, open one of your existing published pages in a new tab. Press Ctrl+U (or Cmd+U on Mac) to view source. Search for "application/ld+json". If it's not there, your schema is either missing or JavaScript-injected. Both are serious issues to fix.
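The manual Ctrl+F check above can also be scripted. This is a minimal sketch, assuming you already have the page's raw HTML as a string (fetched without executing JavaScript); the sample HTML snippets are illustrative, not from any real site.

```python
def has_static_jsonld(raw_html: str) -> bool:
    """Same check as View Source + Ctrl+F: is JSON-LD present in the
    raw HTML, before any JavaScript runs?"""
    return "application/ld+json" in raw_html

# Schema written directly into static HTML -- visible to AI crawlers:
static_page = '<head><script type="application/ld+json">{"@type": "BlogPosting"}</script></head>'

# A JavaScript-rendered app shell -- the schema only appears after hydration,
# so a non-JavaScript crawler sees nothing:
spa_shell = '<head></head><body><div id="root"></div><script src="/bundle.js"></script></body>'

print(has_static_jsonld(static_page))  # True
print(has_static_jsonld(spa_shell))    # False
```

A simple substring match is enough here because the question is presence, not correctness; the later validation steps handle correctness.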
Google's Rich Results Test is available at search.google.com/test/rich-results. It accepts either a live URL or raw HTML code, and it renders your page (including JavaScript) before parsing the schema. This makes it excellent for catching structural errors. But it will show JavaScript-injected schema as working, which is why it is only Step 1 of 3.[1]
How to use it: go to search.google.com/test/rich-results, enter your page's live URL (or switch to the Code tab and paste raw HTML), run the test, and review the list of detected structured data types.
What a passing result looks like: You should see "FAQ" (or "Frequently Asked Questions"), "Article" or "BlogPosting", and "Breadcrumb" all listed as detected schema types with green or neutral status indicators. If any type shows a red error, click it to see which property is causing the problem: usually a missing required field, a wrong value type, or a URL that doesn't match the canonical URL of the page.
The Schema.org Validator is available at validator.schema.org. While Google's Rich Results Test checks against Google's structured data requirements, the Schema.org validator checks against Schema.org's own vocabulary specification, which is the authoritative source for the standard itself.
How to use it: go to validator.schema.org, paste the same URL or code you gave the Rich Results Test, run the check, and review any properties the validator flags against the Schema.org vocabulary.
A common Schema.org-specific issue: using a property like datePublished without an ISO 8601 formatted date value. Google's Rich Results Test may be lenient about date formats; Schema.org's validator is stricter. ISO 8601 format is always correct: YYYY-MM-DD.[3]
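The date-format pitfall is easy to catch programmatically. A minimal sketch, using Python's standard library `date.fromisoformat` as a strictness proxy for the validator's behavior (the exact leniency of each tool may differ):

```python
from datetime import date

def is_iso8601_date(value: str) -> bool:
    """Return True only for YYYY-MM-DD values, the format that is
    always safe for datePublished."""
    try:
        date.fromisoformat(value)
        return True
    except ValueError:
        return False

print(is_iso8601_date("2024-03-15"))      # True: ISO 8601
print(is_iso8601_date("03/15/2024"))      # False: US-style date
print(is_iso8601_date("March 15, 2024"))  # False: human-readable, not machine-readable
```

Running every `datePublished` and `dateModified` value through a check like this before publishing removes an entire class of validator warnings.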
This is the most critical step for AI visibility, and the one most people skip because the other tools already showed green. View Source is how you check what AI crawlers actually see.
How to do it: open the published page, press Ctrl+U (Cmd+U on Mac) to view the raw HTML source, and search for application/ld+json.
If the schema is JavaScript-injected, you will find it in Chrome DevTools (right-click → Inspect, then search in the Elements panel) but not in View Source. DevTools shows the rendered DOM after JavaScript runs; View Source shows the raw HTML before JavaScript runs. AI crawlers like GPTBot and Claude-Web see what View Source shows, not what DevTools shows.[2]
Both the Rich Results Test and Schema.org validator distinguish between errors and warnings. Understanding the difference helps you prioritize what to fix.
Errors indicate that a required property is missing, a value is the wrong type, or the JSON structure is broken (mismatched brackets, missing commas). Errors prevent a schema type from functioning as intended. They block rich results eligibility and reduce the reliability of the signal you're sending. Fix all errors before publishing.
Warnings indicate that an optional but recommended property is absent. Common warnings on BlogPosting include missing image properties. Common warnings on FAQPage include answer text that contains HTML tags (which is technically permitted but not ideal). Warnings don't block functionality but are worth addressing when time allows.
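The error/warning distinction above maps cleanly onto required versus recommended properties. Here is a sketch of that prioritization logic; the property lists are simplified illustrations, not Google's authoritative requirements, which live in the structured data documentation:

```python
# Simplified, illustrative property lists -- consult Google's structured
# data docs for the authoritative required/recommended sets per type.
REQUIRED = {"BlogPosting": {"headline", "datePublished"}}
RECOMMENDED = {"BlogPosting": {"image", "author", "dateModified"}}

def lint_schema(block: dict) -> tuple[list[str], list[str]]:
    """Return (errors, warnings): missing required properties block rich
    results; missing recommended properties are worth fixing when time allows."""
    schema_type = block.get("@type", "")
    errors = sorted(REQUIRED.get(schema_type, set()) - block.keys())
    warnings = sorted(RECOMMENDED.get(schema_type, set()) - block.keys())
    return errors, warnings

post = {"@type": "BlogPosting", "headline": "Example", "datePublished": "2024-03-15"}
errors, warnings = lint_schema(post)
print(errors)    # [] -- nothing blocking, safe to publish
print(warnings)  # ['author', 'dateModified', 'image'] -- recommended, not required
```

The same two-tier structure is what both validators report; the fix-all-errors-first rule falls directly out of it.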
A clean result (zero errors, minimal or zero warnings) means your schema is correctly structured, accurately describes your page content, and is eligible for every enhanced feature that schema type supports.[4]
Once your site is live and indexed, Google Search Console provides ongoing schema monitoring under the "Enhancements" section in the left navigation. This section shows rich result status for FAQ, Article/BlogPosting, and Breadcrumb schema across your entire site, not just individual pages.
What to look for in Search Console Enhancements: valid item counts for each schema type, any pages flagged with errors or warnings, and unexpected drops in valid items after a site or template change.
Search Console only reflects pages that Googlebot has crawled and indexed. It won't show pages that haven't been visited recently. Use the URL Inspection tool to request recrawling of specific pages after you make schema changes, rather than waiting for Googlebot to discover the changes on its own.[1]
The View Source step is the one I watch most experts skip, because their other tools showed green and they assume that means everything is working. It doesn't. Google's Rich Results Test renders your JavaScript. AI crawlers don't. These are fundamentally different reads of the same page.
I learned this distinction the hard way when I noticed that a popular CMS-based site had apparently perfect schema according to every tool, until I View Sourced it and found the <script type="application/ld+json"> block was not in the raw HTML at all. It was being injected by a JavaScript plugin after page load. That schema was invisible to every non-JavaScript AI crawler. The validation tools said "working." The AI crawlers said "nothing to see here."
The Authority Directory Method uses pure HTML/CSS/JS with no framework or CMS, specifically because it eliminates this problem entirely. When your schema is written directly into the <head> of a static HTML file, there is no injection risk, no plugin failure mode, no JavaScript dependency. View Source always shows exactly what AI crawlers see. That certainty is worth a lot.
In Google's Rich Results Test, errors prevent a schema type from generating rich results. They indicate required properties are missing or values are wrong. Warnings indicate optional properties are missing or recommended best practices aren't followed, but they don't block functionality. For AI authority purposes, you want zero errors and minimal warnings. Common warnings include missing image properties in BlogPosting and missing review counts in Product schema. Errors typically indicate broken JSON syntax, missing required @type declarations, or URL values that don't match canonical page URLs.
Google's Rich Results Test checks all schema types present in the page, but it only shows rich results eligibility for types that Google has defined rich results formats for, including FAQ, Article/BlogPosting, and Breadcrumb. For types without a Google-defined rich result (like Person schema on its own), the test will still parse and display the schema in the 'Detected structured data' section, but won't show a specific eligibility status. This is normal. Person schema embedded in BlogPosting's author property is evaluated in context of the BlogPosting type, not independently.
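The embedded-Person pattern looks like this in practice. A sketch of the nesting, built as a Python dict and serialized for a `<script type="application/ld+json">` tag; the name and URL are placeholders, not from the source:

```python
import json

blog_posting = {
    "@context": "https://schema.org",
    "@type": "BlogPosting",
    "headline": "Example Post",
    "datePublished": "2024-03-15",
    # Person nested inside the author property: evaluated in the context
    # of the parent BlogPosting, not as a standalone rich result type.
    "author": {
        "@type": "Person",
        "name": "Jane Doe",                    # placeholder author
        "url": "https://example.com/about",    # placeholder profile URL
    },
}

# Serialize for embedding in static HTML:
print(json.dumps(blog_posting, indent=2))
```

When the Rich Results Test reports on this page, it is the BlogPosting type that gets an eligibility status; the nested Person simply enriches it.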
Yes. Both the Rich Results Test and Schema.org validator accept raw HTML code input, not just live URLs. In the Rich Results Test, choose 'Code' instead of 'URL' and paste your full HTML. This lets you validate schema on pages under development, local files, or staging environments before they go live. This is the recommended workflow: validate in staging, fix any errors, then deploy. Never publish a new page template without validating its schema first.
The most common cause is that the page hasn't been crawled since the schema was added. Google Search Console's 'Enhancements' report only updates when Googlebot recrawls the page. Use Google Search Console's URL Inspection tool to request indexing on the specific page, then wait 48–72 hours. If the schema still doesn't appear after recrawling, check that the schema is in the static HTML source (not injected by JavaScript) and that the page is not blocked by robots.txt or a noindex directive.
Validate schema whenever you make changes to a page's structure, update FAQ content, or change URLs. Also re-validate any time Google updates its structured data documentation. Occasionally, requirements change and previously valid schema may develop new warnings. For a stable, well-built site, a quarterly audit of your top 10 content pages using both the Rich Results Test and Schema.org validator is sufficient to catch any drift. If you use a template-based approach (same schema structure across all pages), validating one representative page per template is efficient.
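The one-page-per-template audit can be sketched as a small script. This version works on raw HTML strings; in practice you would fetch one representative page per template from staging. The template names and HTML samples are hypothetical, and the regex extraction is an illustration rather than a full HTML parser:

```python
import json
import re

JSONLD_RE = re.compile(
    r'<script[^>]*type=["\']application/ld\+json["\'][^>]*>(.*?)</script>',
    re.DOTALL | re.IGNORECASE,
)

def detected_types(raw_html: str) -> list[str]:
    """List the @type of every JSON-LD block in the static HTML.
    json.loads raises on broken JSON -- that is an error, not a warning."""
    types = []
    for payload in JSONLD_RE.findall(raw_html):
        data = json.loads(payload)
        items = data if isinstance(data, list) else [data]
        types.extend(item.get("@type", "?") for item in items)
    return types

# One representative raw-HTML sample per template (hypothetical):
templates = {
    "blog": '<head><script type="application/ld+json">{"@type": "BlogPosting"}</script></head>',
    "faq": '<head><script type="application/ld+json">{"@type": "FAQPage"}</script></head>',
}

for name, html in templates.items():
    print(name, detected_types(html))
```

Because a template-based site repeats the same schema structure on every page, one clean representative per template is strong evidence the whole template family is clean.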
Take the free AI Visibility Scan to discover your current positioning, or explore the complete build system.