How I Fixed My robots.txt and Sitemap to Help Google Index My Blog Faster
Posted on: June 24, 2025
When I first heard about robots.txt and sitemap.xml, I thought: "That sounds like stuff only web developers touch." But once I learned what they really do, I realized I had been ignoring a simple SEO fix that could help Google crawl and index my blog properly.
This post explains what they are, why they matter, and how I set them up correctly in Blogger — in under 10 minutes.
🤖 What is robots.txt?
robots.txt is a small file that tells search engines (like Google) which parts of your blog they can and can't crawl.
If it's set up wrong, Google might skip over important posts, or crawl pages you don't want showing up in search (like label and tag pages).
🧭 What is sitemap.xml?
sitemap.xml is like a map of all your blog posts and pages. You submit it to Google so it can find and index your content more efficiently.
Think of it as: robots.txt = what to ignore, sitemap.xml = what to crawl first
✅ How I Set Up robots.txt in Blogger
Follow these steps:
- Go to Blogger Dashboard → Settings
- Scroll to the Crawlers and Indexing section
- Turn ON Enable custom robots.txt
- Click “Custom robots.txt” and paste the following:
```
User-agent: *
Disallow: /search
Allow: /
Sitemap: https://yourblogname.blogspot.com/sitemap.xml
```
Make sure to replace yourblogname.blogspot.com with your blog's actual URL.
🔍 What this does:
- Disallow: /search stops Google from crawling Blogger's search, label, and tag pages (which are duplicate content)
- Allow: / allows everything else to be crawled
- Sitemap: tells Google where to find the list of your blog posts
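If you want to double-check those rules before relying on them, Python's built-in urllib.robotparser applies robots.txt logic the same way most crawlers do. This is a quick sanity check, not part of the Blogger setup; the blog URL below is a placeholder, so swap in your own domain:

```python
from urllib.robotparser import RobotFileParser

# The same rules we pasted into Blogger (minus the Sitemap line,
# which robotparser ignores for fetch decisions).
rules = """\
User-agent: *
Disallow: /search
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Label/search pages are blocked...
print(parser.can_fetch("*", "https://yourblogname.blogspot.com/search/label/seo"))  # False
# ...but regular posts are crawlable.
print(parser.can_fetch("*", "https://yourblogname.blogspot.com/2025/06/my-post.html"))  # True
```

If the first call ever prints True on your own file, the Disallow line didn't take effect.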
✅ How I Set Custom Robots Header Tags in Blogger
Blogger generates sitemap.xml automatically; these header tags control which pages search engines actually index:
- In Settings, turn ON Enable custom robots header tags
- Click "Homepage tags" → Check: all, noodp
- Click "Archive and search page tags" → Check: noindex, nofollow
- Click "Post and page tags" → Check: all
Why? This prevents Google from indexing duplicate search/archive pages but still indexes your actual blog posts.
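Under the hood, those settings make Blogger add a robots meta tag to the affected pages. Here's a small sketch (the sample markup is hypothetical, but the tag format is standard) that pulls the directive back out with Python's built-in html.parser, which is handy if you want to verify a live archive page really carries noindex:

```python
from html.parser import HTMLParser

# A made-up snippet of an archive page's <head>, the kind of page
# the "noindex, nofollow" setting applies to.
sample_head = """
<head>
  <title>Archive - June 2025</title>
  <meta content='noindex, nofollow' name='robots'/>
</head>
"""

class RobotsMetaFinder(HTMLParser):
    """Records the content of a <meta name='robots'> tag, if present."""

    def __init__(self):
        super().__init__()
        self.directives = None

    def handle_startendtag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name") == "robots":
            self.directives = attrs.get("content")

    # Some pages emit <meta ...> without the self-closing slash.
    handle_starttag = handle_startendtag

finder = RobotsMetaFinder()
finder.feed(sample_head)
print(finder.directives)  # noindex, nofollow
```

Seeing "noindex, nofollow" on archive pages and nothing (or "all") on posts means the header tags are doing their job.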
📬 Submitting Sitemap to Google
Once it's enabled:
- Go to Google Search Console
- Select your blog property
- Click “Sitemaps” from the sidebar
- Submit this: sitemap.xml
You can also add: feeds/posts/default?orderby=updated (this is Blogger's RSS-style sitemap)
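To preview what Google sees when it reads a sitemap, you can parse it and list the URLs. The two-entry XML below is made up but follows the standard sitemap format; a live check would fetch https://yourblogname.blogspot.com/sitemap.xml instead of using a string:

```python
import xml.etree.ElementTree as ET

# Minimal sample in the sitemaps.org format (hypothetical URLs).
sample_sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://yourblogname.blogspot.com/2025/06/first-post.html</loc></url>
  <url><loc>https://yourblogname.blogspot.com/2025/06/second-post.html</loc></url>
</urlset>
"""

# Sitemap elements live in this XML namespace, so findall needs it.
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(sample_sitemap)
urls = [loc.text for loc in root.findall("sm:url/sm:loc", ns)]

print(len(urls), "URLs in sitemap")
for u in urls:
    print(u)
```

If a post you expect to rank is missing from this list on your real sitemap, Google has no direct pointer to it.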
🔄 When to Update robots.txt or Sitemap
- When you change blog structure (new pages/labels)
- When you notice posts not being indexed in Search Console
- When doing a full SEO clean-up
📈 Result After Fixing Mine
- My posts started getting indexed within 48 hours (earlier it took weeks)
- Googlebot stopped indexing duplicate tag pages
- My impressions in Search Console jumped by 25% in 2 weeks
🙌 Final Thoughts
If you've been posting regularly but Google still isn't showing your content, fixing robots.txt and sitemap.xml might be the missing piece.
It’s quick, free, and requires zero technical skills. And once it’s set up, you don’t need to touch it often.
Need help writing your exact robots.txt file? Drop your blog link in the comments and I’ll send you a custom version 🚀
Tags: blogger seo, robots.txt, sitemap.xml, crawlcraft, indexing issues, google bot, search optimization