
Design an SEO-boosting XML sitemap for your blog.

You must have heard about the XML sitemap.

How do you create one and submit it to the search console?

You can find posts like that on many blogs.

But how can the sitemap be used to better effect? You will find only one or two blog posts about that.

I am assuming you already know how to generate an XML sitemap for your blog and submit it to the search console.

In this post, I will discuss the points that can further improve your blog/site's performance.

Before we go forward, a quick recap of the definition.

An XML sitemap works like a map for the searchbot. It makes crawling easier.

With the help of the XML sitemap, the searchbot can crawl every page included in it.
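For reference, here is a minimal sketch of what an XML sitemap file looks like, following the sitemaps.org protocol (the domain and dates below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one <url> entry per page the searchbot should crawl -->
  <url>
    <loc>https://example.com/my-first-post/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://example.com/my-second-post/</loc>
    <lastmod>2024-02-10</lastmod>
  </url>
</urlset>
```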

That was a short introduction to the XML sitemap.

Now we come to our main point.

Misconceptions about indexation.

You might have heard something like this:

"If you want your site indexed in Google's SERP, you must submit your site/blog's XML sitemap to the search engine console (Google, Yahoo, Bing)."

Does the search engine start indexing your posts/pages just because you submitted an XML sitemap?

The truth is, it does not.

If that were the case, you would never hear advice like this:

"If your post/page is not ranking, modify it and put the most-searched keyword in it."

The truth is something else.

Whenever we submit our site/blog's sitemap to the search console, we are giving the search engine a hint: all the pages, posts, etc. included in the XML sitemap contain good-quality content and are being searched for in the search results.

This material is also suitable for indexation on the SERP.

That is the request we make to the search engine.

But it is just a request. It does not mean that the search engine will start indexing everything.
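As an aside, the search console is not the only place to make this request: the robots.txt standard also lets you point any searchbot at your sitemap with a Sitemap line (the URL below is a placeholder):

```
# robots.txt - announces the sitemap to every crawler; still only a hint
Sitemap: https://example.com/sitemap.xml
```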

Suppose you write and publish a new post, and you do not submit its XML sitemap to the search console.

A few days later you search on Google, and you see that same post on the SERP. Can you tell me why?

That is because the crawler can access your post, so it can also crawl it.

The crawler judges the post to be of good quality and decides it is worth indexing.

This proves that merely submitting the XML sitemap is not what gets your post/page indexed.

For that, your post/page quality has to be good.

Adding noindex pages/posts to the XML sitemap.

Often, when we generate a sitemap with the help of a sitemap generator, we generate a sitemap of the entire site/blog.

And we submit that same sitemap to the search console.

Now the question arises: do we want every post/page of the blog/site indexed on the SERP?

Not at all.

There are two types of content on our site.

1. Utility pages, like contact us, privacy policy, about us, etc.

These are the pages that give information about us and our site/blog.

There is no need to index these pages on the SERP.

Yet we often give two contradictory instructions at once.

First, we include these pages in the XML sitemap, which asks for them to be indexed. At the same time, we disallow them in the robots.txt file, which instructs the crawler not to visit them.

Now tell me: which instruction will the searchbot follow, the first or the second?

Whatever instruction you give the searchbot, it should be unambiguous.

That is: disallow all the utility pages in robots.txt, or set "noindex, follow" in their meta robots tag. And do not include those pages in the XML sitemap.
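For example, the robots.txt route might look like this, assuming the utility pages live at paths such as /contact-us/ and /privacy-policy/ (adjust the paths to your own site):

```
# robots.txt - block crawling of the utility pages
User-agent: *
Disallow: /contact-us/
Disallow: /privacy-policy/
Disallow: /about-us/
```

The meta robots alternative goes in the <head> of each utility page instead:

```html
<!-- allows crawling, but asks search engines not to index the page -->
<meta name="robots" content="noindex, follow">
```

Note that these really are alternatives: a noindex tag only works if the crawler is allowed to fetch the page, so a page disallowed in robots.txt will never have its meta tag read.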

2. Search landing pages.

These are the pages visitors actually want to reach by searching on Google.

These pages should not be disallowed in robots.txt, nor should they carry "noindex, follow" in meta robots.

And these pages should definitely be included in the XML sitemap.

Correct use of Crawl Bandwidth.

When you click the Crawl Stats option under Google Search Console's Crawl menu, you will see some graphs. The first graph is "pages crawled per day".

Analysing this graph shows that the Google crawler has a limitation: it crawls only a limited number of pages per day.

[Graph: Crawl Stats, pages crawled per day]

From the above graph, it is clear that each site/blog has a limited bandwidth for crawling.

If the searchbot spends a large share of that bandwidth fetching the pages your blog/site marks "noindex, follow" in meta robots, only to discard them, it is left with too little to crawl the important pages of your site/blog.

So all the utility pages should be disallowed via robots.txt, and they should also be removed from the XML sitemap.
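If your sitemap generator cannot exclude pages on its own, a small script can strip the utility URLs out of the generated file. Here is a rough sketch in Python; the file names and excluded paths are assumptions, so adjust them to your own site:

```python
import xml.etree.ElementTree as ET

# Utility-page paths to drop from the sitemap (assumed values - edit for your site)
EXCLUDED_PATHS = ("/contact-us/", "/privacy-policy/", "/about-us/")

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", NS)  # keep the default namespace in the output file

tree = ET.parse("sitemap.xml")  # the sitemap your generator produced
urlset = tree.getroot()

for url in list(urlset):  # iterate over a copy so we can remove entries safely
    loc = url.find(f"{{{NS}}}loc")
    if loc is not None and loc.text and loc.text.strip().endswith(EXCLUDED_PATHS):
        urlset.remove(url)  # utility page: keep it out of the sitemap

tree.write("sitemap-clean.xml", encoding="UTF-8", xml_declaration=True)
```

Submit the cleaned file to the search console in place of the original.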

Keeping the XML sitemap clean in this way, with the utility pages left out and marked noindex, lets the crawler spend its bandwidth on the pages that should actually rank.

Are you confused about the robots.txt file? This post will help you overcome that confusion.

Final words:

An XML sitemap can improve your site's visibility on the SERP.

The condition is that you include only those pages/posts that are meant to be indexed.

The article published on MOZ gives detailed information. Please read it.

Any query regarding this article? Please comment below. I will be happy to answer you.

Founder, WebtechThoughts

Barun Chandra is a technology enthusiast and a blogger. He explores technology in depth and writes posts in simple words to make them easy to understand.

