Free Tool

XML Sitemap Generator

Paste a list of URLs and get a valid, ready-to-submit sitemap.xml — with lastmod dates, changefreq and priority applied site-wide. Copy or download in one click.

sitemap.xml 0.9 schema · 50,000 URL validator · lastmod / changefreq / priority · Duplicate detection · XML-escaped output · Download .xml · In-browser only · Australian SEO

Your URLs

1

Paste URL list

One URL per line. Each must be a full absolute URL starting with http:// or https://.

5 lines · 5 valid · 5 unique · 0 invalid · 0 duplicates
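The counters above come from a simple classification pass: each pasted line is either a valid absolute URL, invalid, or a duplicate of one already seen. A minimal sketch in Python of that logic (the tool itself runs in the browser, so this is an illustration, not its actual code):

```python
from urllib.parse import urlparse

def classify_urls(lines):
    """Split pasted lines into valid, invalid and duplicate URLs.

    A line counts as valid only if it is an absolute http:// or
    https:// URL with a host. Duplicates are repeats of an
    already-seen valid URL. Blank lines are ignored.
    """
    valid, invalid, duplicates = [], [], []
    seen = set()
    for line in lines:
        url = line.strip()
        if not url:
            continue  # blank lines are skipped, not counted as invalid
        parts = urlparse(url)
        if parts.scheme not in ("http", "https") or not parts.netloc:
            invalid.append(url)
        elif url in seen:
            duplicates.append(url)
        else:
            seen.add(url)
            valid.append(url)
    return valid, invalid, duplicates

valid, invalid, dupes = classify_urls([
    "https://workspacein.com/",
    "https://workspacein.com/seo",
    "/contact-us",                  # relative path: invalid
    "https://workspacein.com/seo",  # repeat: duplicate
])
print(len(valid), len(invalid), len(dupes))  # → 2 1 1
```

The same pass yields "unique" for free: it is simply the valid count, since duplicates never re-enter the valid list.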
2

Global defaults

Applied to every URL. Override per-URL later in your CMS if needed.

sitemap.xml

Status: Sitemap valid
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://workspacein.com/</loc>
    <lastmod>2026-04-21</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
  <url>
    <loc>https://workspacein.com/digital-marketing</loc>
    <lastmod>2026-04-21</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
  <url>
    <loc>https://workspacein.com/seo</loc>
    <lastmod>2026-04-21</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
  <url>
    <loc>https://workspacein.com/web-design</loc>
    <lastmod>2026-04-21</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
  <url>
    <loc>https://workspacein.com/contact-us</loc>
    <lastmod>2026-04-21</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>

Upload sitemap.xml to your site root, then submit to Google Search Console → Sitemaps. Re-submit after big content changes.

Checks
  • info: 5 valid URL(s), 934 B. Ready to deploy.
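Output like the block above can be reproduced with the standard library alone. A hedged sketch in Python, assuming a flat URL list and the same site-wide defaults (not the tool's own implementation):

```python
from xml.sax.saxutils import escape

def build_sitemap(urls, lastmod, changefreq="monthly", priority="0.8"):
    """Build a sitemap.xml string with site-wide defaults applied.

    escape() entity-encodes &, < and > so URLs with query strings
    stay well-formed XML.
    """
    entries = []
    for url in urls:
        entries.append(
            "  <url>\n"
            f"    <loc>{escape(url)}</loc>\n"
            f"    <lastmod>{lastmod}</lastmod>\n"
            f"    <changefreq>{changefreq}</changefreq>\n"
            f"    <priority>{priority}</priority>\n"
            "  </url>"
        )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + "\n".join(entries)
        + "\n</urlset>\n"
    )

xml = build_sitemap(
    ["https://workspacein.com/", "https://workspacein.com/seo?a=1&b=2"],
    lastmod="2026-04-21",
)
print(xml)  # the & in the second URL comes out as &amp;
```

The escaping step is the one people skip by hand: a raw ampersand in a query-string URL makes the whole file invalid XML.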

How to Deploy Your Sitemap

One file, one robots.txt line and two submissions: that's all Google needs to start crawling your site faster.

1

Download and upload

Download sitemap.xml and upload it to your site root so it's accessible at https://yourdomain.com/sitemap.xml. Clear any CDN cache after uploading.

2

Reference in robots.txt

Add a Sitemap: https://yourdomain.com/sitemap.xml line to your robots.txt. Every crawler that respects robots.txt — Google, Bing, DuckDuckGo — picks it up automatically.
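For example, a robots.txt carrying that line might look like this (hypothetical domain and rules; the Sitemap directive can sit anywhere in the file and must use the full absolute URL):

```
User-agent: *
Disallow: /admin/

Sitemap: https://yourdomain.com/sitemap.xml
```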

3

Submit to Search Console

In Google Search Console → Sitemaps, paste the sitemap URL. Do the same in Bing Webmaster Tools. Both platforms will start crawling the URLs within hours.

4

Automate regeneration

A static sitemap.xml is fine for small sites. For sites adding content weekly, generate the sitemap automatically from your CMS so lastmod stays current.

XML Sitemap Rules Google Actually Enforces

Break any of these and Google quietly drops the sitemap; the errors surface in Search Console's Sitemaps report.

50,000 URLs max

Per sitemap file. Split into multiple sitemaps and reference them from a sitemap index for larger sites.
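A sitemap index uses the same schema namespace as a regular sitemap, but lists child sitemap files instead of pages. A minimal example (hypothetical file names):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://yourdomain.com/sitemap-pages.xml</loc>
    <lastmod>2026-04-21</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://yourdomain.com/sitemap-products.xml.gz</loc>
    <lastmod>2026-04-21</lastmod>
  </sitemap>
</sitemapindex>
```

You submit only the index to Search Console; Google discovers the child sitemaps from it.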

50 MB uncompressed

Per sitemap file. Gzip compression is allowed and recommended — most large sitemaps ship as sitemap.xml.gz.

UTF-8 encoding

Always. Non-UTF-8 characters must be entity-encoded. The tool handles this for every URL.

Absolute URLs only

Every loc must be a full http(s) URL. Relative paths are rejected. The host must match the sitemap's location.

Canonical URLs only

Include only indexable, canonical URLs. noindex pages, redirects and 4xx responses waste crawl budget.

lastmod is respected

Google uses lastmod to schedule re-crawls. Lying about freshness erodes Google's trust in the sitemap.

priority is mostly ignored

Google says they largely ignore priority. Include it for Bing — but don't stress optimising the value.

changefreq is advisory

Google treats changefreq as a hint, not a command. lastmod is a far stronger signal of freshness.

When a Sitemap Actually Helps

For tiny brochure sites Google will find every page anyway. These are the cases where sitemaps earn their keep.

Large sites

E-commerce catalogues · Publishers · Directory sites · Job boards · Real-estate portals · Property search

Orphan / deep URLs

Location pages · Archive pages · Filter combinations · Paginated URLs · Standalone landing pages · Programmatic SEO pages

Fresh content

News articles · Event pages · Blog posts · Product drops · Press releases · Changelog updates

Niche formats

Image sitemap · Video sitemap · News sitemap · Hreflang sitemap · Mobile sitemap (legacy) · Shopping feed

XML Sitemap FAQ

Do I need both robots.txt and a sitemap?

Yes. robots.txt tells crawlers what NOT to crawl. The sitemap tells them what TO crawl. They complement each other — reference the sitemap inside robots.txt for best results.

Should I include every page?

No. Include only canonical, indexable URLs you actually want ranking. Exclude noindex pages, redirects, filtered URLs, and tag/archive pages you don't want in Google.

Is my URL list sent anywhere?

No. Everything runs in your browser. URLs are not uploaded or stored.

How often should I regenerate it?

Automatically on every publish. If that's not possible, once a week is the minimum for any site adding new content regularly.

Want an Automated Sitemap + Indexing Workflow?

Our Australian SEO team sets up dynamic sitemap generation, index coverage monitoring, and automated re-submission after every content change.

  • Dynamic sitemap setup
  • GSC coverage monitoring
  • No lock-in commitment
Book a Free Consultation

No long-term commitment. Cancel anytime. 100% satisfaction guaranteed.