Free Tool

Robots Meta Tag Builder

Build the exact robots meta tag your page needs — index/noindex, follow/nofollow, snippet limits, image previews, per-bot overrides and X-Robots-Tag header — with a live preview.

index / noindex · follow / nofollow · noarchive · nosnippet · max-snippet · max-image-preview · unavailable_after · X-Robots-Tag header

Your Page

1. Core directives

The two choices Google checks first — whether the page can be indexed and whether links are followed.

2. Snippet & media controls

Fine-tune what Google can show in search results — snippet length, image preview, video preview.

3. Per-bot overrides

Optional — target a specific crawler when you need different rules for Google, Bing or news bots.
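Per-bot overrides swap the generic robots name for the crawler's own. A minimal sketch, assuming you want Google to skip the page while every other engine indexes it:

```html
<!-- Applies only to Googlebot; crawlers that don't match fall back to the generic tag -->
<meta name="googlebot" content="noindex" />
<!-- Generic rule for all other crawlers -->
<meta name="robots" content="index, follow" />
```

The same pattern works with other crawler names, such as bingbot for Bing.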

Robots Output

Status: Indexed · links followed
<meta name="robots" content="index, follow" />
X-Robots-Tag (HTTP header equivalent)
X-Robots-Tag: index, follow

Place the meta tag inside the <head> of the page. Use X-Robots-Tag as an HTTP response header for non-HTML resources (PDFs, images, JSON).

Field tips
  • All-clear configuration: this page is fully indexable with rich snippet support.

How to Use Robots Meta Tags

Four rules that stop 90% of noindex mistakes before they hit production.

1. Default is index, follow

Every public page should use index, follow — or simply omit the robots meta entirely. Only add noindex when you're certain the page should never appear in search.

2. noindex over robots.txt

To keep a page out of Google, use noindex in the meta tag — not Disallow in robots.txt. Blocked URLs can still appear without snippets; noindex actually removes them.
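One detail that trips people up: Google only sees the noindex if it can crawl the page, so the URL must not also be Disallowed in robots.txt. The tag itself is a single line in the page's <head>:

```html
<!-- The URL must remain crawlable, or Google never sees this directive -->
<meta name="robots" content="noindex" />
```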

3. X-Robots-Tag for non-HTML

PDFs, images, JSON feeds and any non-HTML file can't carry a meta tag. Use the X-Robots-Tag HTTP header on those responses instead.
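As a sketch, a PDF response carrying the header looks like this (exact header order and casing vary by server):

```http
HTTP/1.1 200 OK
Content-Type: application/pdf
X-Robots-Tag: noindex, nofollow
```

Most servers can attach the header by file type, e.g. via Apache's FilesMatch directive or nginx's add_header.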

4. Stage restrictive rules carefully

Never deploy noindex across your whole site on staging unless you're certain the build flag can't leak to production. A single misconfigured environment variable can de-index a site in a day.

Robots Directive Reference

Every value Google honours, with the practical effect on search.

index / noindex

Whether the page can appear in Google search. noindex is the definitive way to remove a URL.

follow / nofollow

Whether links on the page pass ranking signals. nofollow tells Google not to follow the page's links or pass signals through them.

noarchive

Blocks Google's cached version link in SERPs. Mostly obsolete — Google removed the cache link in 2024.

nosnippet

No text snippet in search results. Also opts out of featured snippets and AI Overview extraction.

noimageindex

Stops images on the page from appearing in Google Images search.

max-snippet

Maximum number of characters Google may show in the text snippet. Use 0 to block snippets entirely, -1 for no limit.

max-image-preview

none / standard / large. Controls thumbnail size in Discover feed and image-enabled SERPs.

unavailable_after

Auto-removes the URL from the index after this date. Perfect for events, expiring offers and time-boxed landing pages.
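Directives combine in one tag, comma-separated. A hypothetical time-boxed campaign page that allows large image previews but caps the snippet might use:

```html
<!-- Example values only; unavailable_after accepts standard date formats such as ISO 8601 -->
<meta name="robots" content="max-snippet:150, max-image-preview:large, unavailable_after: 2025-12-31" />
```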

When to Use Each Robots Preset

Common scenarios and which directives actually solve them.

Should be indexed

Home · Service pages · Blog posts · Product pages · Location pages · Landing pages

Use noindex

Thank-you pages · Internal search results · Tag archives · Duplicate paginated pages · Staging subdomain · Legal / admin pages

Use nofollow

User-generated comments · Forum pages · Untrusted outbound links · Affiliate-heavy pages · Paid placement pages · Unmoderated directories
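For individual links, a link-level rel attribute is often a better fit than a page-wide nofollow; Google also recognises sponsored and ugc values for paid and user-generated links (URLs below are illustrative):

```html
<!-- Link-level alternatives to a page-wide nofollow directive -->
<a href="https://example.com/offer" rel="sponsored">Paid placement</a>
<a href="https://example.com/comment-link" rel="ugc">User-submitted link</a>
<a href="https://example.com/unvetted" rel="nofollow">Untrusted link</a>
```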

Use unavailable_after

Event pages · Webinar registrations · Expiring offers · Temporary landing pages · Seasonal promos · Limited-time campaigns

Robots Meta FAQ

noindex vs Disallow — which removes a URL?

noindex in the meta tag actually removes the URL from Google's index. Disallow in robots.txt only blocks crawling — the URL can still appear in SERPs without a snippet.

How long until Google respects noindex?

Usually within 1–7 days, once Google re-crawls the URL. For urgent removals, submit via Search Console's Removals tool as well.

Does my input go anywhere?

No. Everything runs in your browser. Nothing is uploaded or stored.

Can I noindex a page that has backlinks?

You can, but those backlinks stop passing ranking signals once Google drops the URL. For signal consolidation, prefer canonical or 301 over noindex.
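For reference, the canonical hint is a single tag in the duplicate page's <head>, pointing at the URL that should collect the signals (the href below is illustrative):

```html
<!-- Lives on the duplicate page; href is the preferred URL -->
<link rel="canonical" href="https://example.com/preferred-page" />
```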

Need a Site-Wide Indexability Audit?

Our Australian SEO team audits every robots rule, canonical, sitemap and redirect — and flags pages accidentally blocked from Google.

  • Per-page robots review
  • Canonical + sitemap alignment
  • No lock-in commitment
Book a Free Consultation First
🔒 Secure checkout · Delivered within 48 hours · 100% money-back guarantee

No long-term commitment. Cancel anytime. 100% satisfaction guaranteed.