Sitemap Generator
Generate XML sitemaps instantly to improve SEO.
About this tool
What is the Sitemap Generator?
The Sitemap Generator creates valid XML sitemaps that you can submit to Google Search Console, Bing Webmaster Tools, and other search engines. A sitemap tells search engines which pages exist on your site, when they were last updated, how frequently they change, and how important they are relative to each other.
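A minimal sitemap carrying all four of those fields looks like this (the URL and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>
```

Only loc is required; lastmod, changefreq, and priority are optional hints covered in the settings below.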
Three Ways to Add URLs
Crawl a URL
Enter your website address and the tool fetches your pages automatically by following internal links — similar to how a search engine crawler works. You can control how many levels deep the crawler follows links (1–5), and exclude paths like /admin or /api that should not appear in your sitemap.
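The crawl described above can be sketched as a breadth-first walk over same-host links with a depth limit and path exclusions. This is an illustrative sketch, not the tool's actual implementation: the function names are invented, and page fetching is simulated with an in-memory dict standing in for an HTTP GET (which, as noted below, only ever sees static server-rendered HTML).

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags in static HTML."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, fetch, max_depth=3, exclude=("/admin", "/api")):
    """Breadth-first crawl of same-host links, up to max_depth levels.

    `fetch` returns the static HTML for a URL (or None on failure);
    it stands in for an HTTP request that sees server-rendered markup only.
    """
    host = urlparse(start_url).netloc
    seen = {start_url}
    queue = deque([(start_url, 0)])
    found = []
    while queue:
        url, depth = queue.popleft()
        found.append(url)
        if depth >= max_depth:
            continue
        html = fetch(url)
        if html is None:
            continue
        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:
            absolute = urljoin(url, href).split("#")[0]
            parsed = urlparse(absolute)
            if parsed.netloc != host:
                continue  # skip external links
            if any(parsed.path.startswith(p) for p in exclude):
                continue  # skip excluded paths like /admin
            if absolute not in seen:
                seen.add(absolute)
                queue.append((absolute, depth + 1))
    return found

# Simulated two-page site; note the /admin link is never followed.
site = {
    "https://example.com/": '<a href="/about">About</a><a href="/admin">Admin</a>',
    "https://example.com/about": '<a href="/">Home</a>',
}
urls = crawl("https://example.com/", site.get, max_depth=2)
```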
Important for SPAs and JavaScript-heavy sites: The crawler reads static HTML responses only. If your site is built with React, Vue, Angular, or any framework that renders content client-side without server-side rendering, the crawler cannot see your pages. Use the Manual or Upload input mode instead, and paste your route list directly. This works for any site regardless of technology stack.
Manual / Paste
Type or paste URLs directly — one per line or comma-separated. Use the bulk paste area to import a list all at once, or use the individual URL rows to add, edit, and remove pages one by one. Each URL has its own expandable panel for setting per-page options.
Upload a File
Upload a .txt or .csv file containing a list of URLs. Lines starting with # are treated as comments and ignored. URLs are one per line or comma-separated.
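The parsing rules above (blank lines skipped, # lines treated as comments, one URL per line or several separated by commas) can be sketched as follows; the function name is illustrative, not the tool's actual code.

```python
def parse_url_list(text):
    """Parse an uploaded .txt/.csv body into a flat list of URLs.

    Lines starting with '#' are comments; remaining lines may hold
    one URL or several separated by commas.
    """
    urls = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and comments
        for part in line.split(","):
            part = part.strip()
            if part:
                urls.append(part)
    return urls

sample = """# main pages
https://example.com/, https://example.com/about
https://example.com/blog
"""
urls = parse_url_list(sample)
```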
Sitemap Settings
Last modified date (lastmod)
The date the page was last significantly updated. Search engines use this to prioritize recrawling. Set a default for all pages, then override individual pages where needed. Format: YYYY-MM-DD.
Change frequency (changefreq)
A hint to search engines about how often the page content changes. Options range from always (changes with every access) to never (static archive content). This is advisory — search engines do not follow it strictly, but it helps prioritize crawl resources.
Priority (0.1–1.0)
The relative importance of a page within your site. This does not affect your site's ranking in search results — it only helps search engines prioritize which pages to crawl first when crawl budget is limited. The homepage typically gets 1.0, main section pages 0.8–0.9, and deeper content pages 0.5–0.7.
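Those conventions could be turned into a default-priority heuristic based on path depth. This is a hypothetical illustration of the convention, not necessarily how the tool assigns its defaults:

```python
from urllib.parse import urlparse

def default_priority(url):
    """Hypothetical heuristic: homepage 1.0, top-level section
    pages 0.8, deeper content pages 0.5."""
    # Count non-empty path segments: "/" -> 0, "/blog" -> 1, "/blog/post" -> 2
    depth = len([s for s in urlparse(url).path.split("/") if s])
    if depth == 0:
        return 1.0
    if depth == 1:
        return 0.8
    return 0.5
```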
Image sitemap entries
When enabled, each URL entry can include the URLs of images on that page. Image sitemaps help Google index images that might otherwise be missed, particularly images loaded dynamically or via CSS. The XML output includes the image:image namespace from Google's image sitemap extension.
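A URL entry with an image attached looks like this (URLs are placeholders); note the image namespace declared on the urlset element:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>https://example.com/gallery</loc>
    <image:image>
      <image:loc>https://example.com/photos/sunset.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```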
Hreflang / multi-language
When enabled, each URL entry can include alternate language and regional versions of the page using xhtml:link rel="alternate" tags. This is the standard way to tell search engines which version of a page to show to users in each language or region. Required for any site serving content in multiple languages.
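A URL entry with English and German alternates plus an x-default fallback looks like this (URLs are placeholders; the urlset element must also declare xmlns:xhtml="http://www.w3.org/1999/xhtml"):

```xml
<url>
  <loc>https://example.com/en/</loc>
  <xhtml:link rel="alternate" hreflang="en" href="https://example.com/en/"/>
  <xhtml:link rel="alternate" hreflang="de" href="https://example.com/de/"/>
  <xhtml:link rel="alternate" hreflang="x-default" href="https://example.com/"/>
</url>
```

Each language version should list all alternates, including itself, so the annotations are reciprocal.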
Output Formats
XML — The standard sitemap format. Download as sitemap.xml and place it at your domain root (https://example.com/sitemap.xml), then submit the URL to Google Search Console under Sitemaps.
URL list — A plain text list of all URLs in the sitemap. Useful for importing into other tools, sharing with developers, or checking coverage at a glance.
After Generating Your Sitemap
- Download sitemap.xml and upload it to your web server's root directory.
- Add a Sitemap: directive to your robots.txt file: Sitemap: https://example.com/sitemap.xml
- Submit the sitemap URL directly in Google Search Console under Indexing → Sitemaps.
- For large sites (over 50,000 URLs), split into multiple sitemaps and create a sitemap index file.
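A sitemap index file that ties multiple sitemaps together looks like this (filenames are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://example.com/sitemap-1.xml</loc>
    <lastmod>2024-01-15</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://example.com/sitemap-2.xml</loc>
  </sitemap>
</sitemapindex>
```

Submit the index file itself to the search engine; each child sitemap must still respect the 50,000-URL limit.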
Privacy
Crawling is performed server-side in response to the request your browser sends to the tool. No URLs or sitemap data are stored or logged.