How to Create Blogger Blog Robots.txt File Perfect for SEO in 2020

The robots.txt file is another teeny tiny yet powerful tool any webmaster can use to give his website or blog a leap forward. In this tutorial, I will be speaking specifically to blogger blog users: as a blogspot platform user, how can you help your blog rank better in search engines using your blogger blog Robots.txt file?


In one of our earlier posts, I discussed how to configure your blogger blog custom robots header tags to give your blogger blog a head start in the SERPs.


This time around, we will be discussing how to use your blogger blog robots.txt file to help your blog rank better in search engine results pages. And if by now you are wondering why I concentrate mainly on blogger blog users, I suggest that you check out our focus, mission and vision page under the company info page here.


[Image: robots.txt file skeletal form]

Well, back to the topic for today, using Robots.txt file to help your blogger blog compete better in search engine results pages.


Before we dive deep into the discussion, let us get the basics down first and then proceed gradually. To start, we must explain what the Robots.txt file is, to what extent it can help your blog, and the terms used in it, and clear up possible confusions before showing how best to use the Robots.txt file for your blog.


What is Robots.txt File?

The Robots.txt file, as the name implies, is a text file that contains a few lines of code. This text file is stored in the root folder of the website or blog. However, some CMSs such as WordPress use a virtual Robots.txt file generated on the server for each blog or website powered by the platform.
Meanwhile, for most websites and blogs, a Robots.txt file can be created and uploaded to the root directory of the site or blog, where search engine spiders or crawlers can locate and identify it for what it is.


What is the Function of the Robots.txt File?

Similar to the custom robots header tags discussed in one of our earlier posts on blogspot SEO, the Robots.txt file contains a set of instructions intended for and usable by search engine spiders.
The lines of code contained in the Robots.txt file simply instruct the search engine spiders which parts of the blog or website domain are to be crawled and indexed and which parts should be exempted.


How Does Robots.txt File Work?

The Robots.txt file does not do any work by itself. Rather, it sits there for the search engine spiders to act on. The search engine bots simply work based on the commands and instructions laid out in the Robots.txt file.


Any Similarities with Robots Header Tags?

Yes. In a way, robots header tags and the Robots.txt file are similar, in the sense that they both give instructions to the search engine spiders.


Additionally, on arriving at a website or blog for crawling, the search engine bots look out for both the robots header tags and the Robots.txt file and act on the instructions set in both.


Since this tutorial discusses the Robots.txt file, we will focus on just that. You may check our in-depth insight on the relationship between the Robots.txt file and robots header tags here.


Is Blogger Blog Robots.txt File Different?

No. A blogger blog Robots.txt file is the same as that of other websites and CMSs. In fact, you can copy and modify the Robots.txt file of another website or blog and upload it to your own site or server - though this should be done only if you believe that website has one of the best, if not the best, Robots.txt setups.


However, it should be noted that the major difference between Robots.txt files lies in the directory structure of the domain, which determines the instructions that are issued in the Robots.txt file and how they look.


Further down this page, you shall see what a blogger blog Robots.txt file looks like, the best instructions to issue in it, and where and how to upload it to your blogger blog.


How to Locate Blogger Blog Robots.txt File

Whether you are using a custom Robots.txt file or the default, there is one common way to check the Robots.txt settings or content of a blogger blog.


To see the Robots.txt file content of any website, including a blogger blog, simply enter this in your web browser's address bar:

http://www.example.com/robots.txt

And then press "enter".


When you do that, one of three things will happen:

  • The address displays the Robots.txt file content of the blog - meaning the domain has a Robots.txt file and that is its content.
  • The address displays an empty/blank page - meaning the domain has a Robots.txt file with no specific instructions. Such a domain is crawled and indexed with no constraints.
  • The address returns a 404 error - meaning the domain has no Robots.txt file, or it is not properly configured.
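If you prefer, the same check can be scripted. The sketch below classifies the response into the three cases above; note that `interpret_robots_txt` and `check_robots` are hypothetical helper names introduced here for illustration, not part of any official tool.

```python
from urllib.error import HTTPError
from urllib.request import urlopen

def interpret_robots_txt(status, body):
    """Classify the result of requesting /robots.txt into the three cases above."""
    if status == 404:
        return "no robots.txt configured"
    if status == 200 and body.strip():
        return "robots.txt present with rules"
    if status == 200:
        return "robots.txt present but empty (crawl everything)"
    return "unexpected response"

def check_robots(domain):
    """Fetch https://<domain>/robots.txt and classify it (requires network access)."""
    try:
        with urlopen(f"https://{domain}/robots.txt") as resp:
            return interpret_robots_txt(resp.status, resp.read().decode("utf-8", "replace"))
    except HTTPError as err:
        return interpret_robots_txt(err.code, "")
```

For example, `check_robots("www.example.blogspot.com")` would report which of the three cases applies to that blog.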

Meanwhile, for any blogger blog in this year 2020, here is the default Robots.txt file content you will see:

User-agent: Mediapartners-Google
Disallow: 

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://www.example.blogspot.com/sitemap.xml

Blogger blogs using the older custom Robots.txt file would have Robots.txt file content similar to this:

User-agent: Mediapartners-Google
Disallow: 

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://www.example.blogspot.com/feeds/posts/default?orderby=updated

And blogger blogs having more than 500 published posts (and using the older blogger Robots.txt method) should have a Robots.txt file similar to the one given below:

User-agent: Mediapartners-Google
Disallow: 

User-agent: *
Disallow: /search
Disallow: /b
Allow: /

Sitemap: https://www.example.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500
Sitemap: https://www.example.blogspot.com/atom.xml?redirect=false&start-index=501&max-results=500

Explanations of the Variations in Blogger Blog Robots.txt File

It was not until around the year 2015 that Blogger introduced the smarter XML sitemap format:

https://www.example.blogspot.com/sitemap.xml

That is the full XML sitemap.


Meanwhile, prior to that time, blogspot platform bloggers resorted to using the blogger blog feeds as the sitemap, which made the example in the code block below the common practice:

https://www.example.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500

This was used instead of the officially supported blogger blog feed sitemap.


The officially supported blogger blog feed that could be used as a sitemap is:

https://www.example.blogspot.com/feeds/posts/default?orderby=updated

That one was not popularly adopted because it reflects only the 25 most recent published posts, whereas the former reflects up to the 500 most recent published posts.


And if you had more than 500 published posts, you could include multiples of that line as your XML sitemap.


Hence, the reason you could see some blogger blog XML sitemaps like this:

Sitemap: https://www.example.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500
Sitemap: https://www.example.blogspot.com/atom.xml?redirect=false&start-index=501&max-results=500
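The paginated feed sitemap lines above follow a simple pattern, so a short script can generate as many as a blog needs. This is just an illustrative sketch; `feed_sitemap_lines` is a hypothetical helper name.

```python
def feed_sitemap_lines(domain, total_posts, page_size=500):
    """Generate the paginated atom.xml Sitemap lines used before the native
    sitemap.xml existed: one line per block of `page_size` posts."""
    lines = []
    for start in range(1, total_posts + 1, page_size):
        lines.append(
            f"Sitemap: https://{domain}/atom.xml"
            f"?redirect=false&start-index={start}&max-results={page_size}"
        )
    return lines

# A blog with 1200 posts needs three such lines (1-500, 501-1000, 1001-1200).
for line in feed_sitemap_lines("www.example.blogspot.com", 1200):
    print(line)
```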

And those were working fine until early in the year 2019, when the new Google Search Console started throwing errors when indexing the feed sitemaps.

Blogger .XML Sitemap Brief Updates

The blogger .xml sitemap, which was introduced as the solution, can reflect and contain up to 3000 published blog posts.


Thus, if you have up to 3000 published posts on your blogger blog, submitting the single, simple line in the example below is enough, and all 3000 of your blog posts will be crawled and indexed.


For Blogspot hosted blogs:

Sitemap: https://www.example.blogspot.com/sitemap.xml

For custom domain blogger blog:

Sitemap: https://www.example.com/sitemap.xml

NOTE: if you do not use the https:// version of your blog, simply enter http:// instead; otherwise, your sitemap may not go through and will throw various errors.


Meanings of Each Line in Blogger Blog Robots.txt File

As earlier revealed, the Robots.txt file contains instructions intended for and usable by search engine web spiders. Here, let us explain exactly what each line instructs the search engine bots to do, so that you may better understand how to use it.

  1. User-agent: Mediapartners-Google - this line does little for your website's SEO as it is meant for Google AdSense. It basically instructs Google's Mediapartners web spiders where not to crawl and index on your domain. By practice, it is recommended that you leave that line untouched whether you use Google adverts or not. Removing it does no harm either.
  2. User-agent: * - this line communicates with all search engine bots including Google, Bing, Yandex, Baidu and others. The asterisk to the right of User-agent: refers to all search engines. If you wish to communicate with a particular search engine's web spiders differently, you can replace the asterisk with the name of that search engine's web spider, repeating the process for each of them. Again, by practice, it is recommended that you leave the default.
  3. Disallow: - this line tells the search engine web spiders which paths of your domain you do not want crawled or indexed. You can repeat this for as many paths as you desire. See the example below:
  4. Disallow: /search
    Disallow: /p
    
    In this example, all pages whose URLs fall under /search and /p would be ignored by the search engine web spiders.
    
    
  5. Allow: - this line tells the search engine web spiders which path in your domain you want crawled and indexed. By practice, that single line with / is okay and works for all.
  6. Sitemap: - this line is not strictly required in the Robots.txt file, but it is recommended that you have it there. Why? It simply complements the Robots.txt file by helping search engine web spiders easily identify and recognise any updates in your blog sitemap, which they also use when navigating and indexing your blog.
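If you want to confirm how these lines behave, Python's standard `urllib.robotparser` module interprets robots.txt rules much as crawlers do. The sketch below feeds it the default Blogger robots.txt quoted earlier and checks a few paths:

```python
from urllib.robotparser import RobotFileParser

# The default Blogger robots.txt quoted earlier in this post.
ROBOTS_TXT = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://www.example.blogspot.com/sitemap.xml
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# /search pages (label and search-result pages) are blocked for general crawlers...
print(rp.can_fetch("*", "/search/label/SEO"))                    # False
# ...while ordinary post URLs remain crawlable.
print(rp.can_fetch("*", "/2020/01/my-post.html"))                # True
# The empty Disallow: means the AdSense bot may crawl everything.
print(rp.can_fetch("Mediapartners-Google", "/search/label/SEO")) # True
```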

Can I Use Custom Robots.txt File for Blogger Blog?

Yes. You can use a custom Robots.txt file for your blogger blog. However, if you have custom robots header tags enabled and set correctly, I strongly recommend that you use ONLY the Robots.txt file provided in our examples further down this page on your blogger blog.


I repeat: enable a custom Robots.txt file for your blogger blog if you have custom robots header tags enabled, and use the robots.txt file we provided in our example.

Can I Include Sitemap.xml Link in Blogger Blog Robots.txt File?

Yes. It is highly recommended because it simply complements the Robots.txt file by helping search engine web spiders easily identify and recognise any updates in your blog sitemap, which they also use when navigating and indexing your blog.


Which Version of Blogger Blog Sitemap.xml File Should I Use?

In this year 2020, if anyone should ask, without a second thought, the officially supported sitemap.xml file is your answer. Simply put, your-domain.com/sitemap.xml is the right one to use this year.
Along with that, I would also recommend that you create an HTML sitemap for your blogger blog as explained here. It is highly recommended. See the linked page for details.


Does Custom Robots.txt File Affect Robots Header Tags?

Yes. For a blogger blog, it does. Either you create the correct Robots.txt file, accurate and appropriate for your needs, and upload it to your blogger blog, or you configure the appropriate custom robots header tags. If you set up both, the chances that they will negate each other are high. We already covered detailed explanations on that here: custom Robots.txt file vs custom robots header tags.


Step by Step Guides to Creating Perfect Blogger Blog Robots.txt File for Good SEO in 2020

How to Create Perfect Blogger Blog Robots.txt File

Estimated cost: $0
Estimated time: 12 minutes
This tutorial explains, with illustrations, how to create a Robots.txt file perfect for good SEO in the year 2020. It is especially created for blogger blog users willing to do better in the competition.

Requirements:

Recommended tools: a plain text editor (Notepad, Notepad++, Sublime, or any similar app)

Preparation

  • Whether you intend to use your phone or PC, ensure that whichever device you wish to use has the tool listed above installed and ready for use.
[Image: installed plain text editor]

Launch your favourite plain text editor

  • For this procedure, a plain text editor is highly recommended because using WordPad, MS Word or any other similar application may add unnecessary formatting, which may render your Robots.txt file useless. Therefore, use only Notepad, Notepad++, Sublime or any other similar plain text editor app.
[Image: plain text editor]

Create A New Text File

  • From your chosen plain text application, create a new text file. Ensure that the new text file is empty, with nothing in it.
[Image: new empty text file]

Create a skeletal form of your Robots.txt file

  • Now, create a skeletal form of your Robots.txt file by typing its core commands first. The core lines of the Robots.txt file are User-agent:, Disallow: and Allow:. Each of them should occupy its own separate line so that it looks similar to what is in the image below.
[Image: Robots.txt file skeletal form created]

Fill up the core lines with your commands

  • Now that the core lines are written, you may start filling them in by entering your commands. For example, User-agent: may have "*" to the right of it. Allow: should have "/" to the right of it. Disallow: should have nothing after it unless you wish to disallow some paths on your website or blog. See the image below.
[Image: complete Robots.txt file created]

Add your blogger blog XML Sitemap

  • Although not strictly necessary, it is highly recommended that you add your blogger blog sitemap below all the other lines, so that it looks like what is in the image below:
[Image: complete Robots.txt file created]

Upload your Robots.txt file

  • The Robots.txt file is now ready and can be uploaded to your blogger blog. Seems too simple? Yes. But as simple as it seems, it can single-handedly spike your blog traffic.
[Image: blogger blog Robots.txt]
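If you would rather generate the file than type it, the steps above can be sketched programmatically. Note that `build_blogger_robots_txt` is a hypothetical helper name introduced here, and the domain is a placeholder for your own blog:

```python
def build_blogger_robots_txt(domain, disallow=("/search",)):
    """Assemble the robots.txt content built up in the steps above:
    the AdSense block, the wildcard block, then the sitemap line."""
    lines = ["User-agent: Mediapartners-Google", "Disallow: ", ""]
    lines.append("User-agent: *")
    lines += [f"Disallow: {path}" for path in disallow]
    lines.append("Allow: /")
    lines.append("")
    lines.append(f"Sitemap: https://{domain}/sitemap.xml")
    return "\n".join(lines) + "\n"

print(build_blogger_robots_txt("www.example.blogspot.com"))
```

Saving that output to a plain text file gives you the same result as the manual steps.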

How to Add Custom Robots.txt File to Blogger Blog

Now that you have created a perfect robots.txt file for your blogger blog, the following tutorial shows you how to upload and add it to your blogger blog.


To add robots.txt file to your blogger blog:

  1. Visit blogger.com and log in to your blogger account using your login details.
  2. From your blogger dashboard, go to Settings and then click Search preferences.
  3. From the search preferences screen, click on Custom robots.txt and then click the radio button to the left of Yes - that enables custom robots.txt for your blogger blog and opens a new text field where you can paste your newly created robots.txt file.
  4. In the text field that shows, paste your new or modified robots.txt file. Note: clear any text found in that field before you paste the new robots.txt file.
  5. Now, click save to apply the changes.

Final Note

This tutorial, to the best of my knowledge, is made to help you grow your blogger blog to the next level.


If you have any questions, or believe there are some points I should clarify or improve on, do let me know in the comment box and I shall respond accordingly and in good time.


Also, feel free to check my profile on our authors' page here and follow me via my social media. You can also get my attention by mentioning my name on our official fan pages.
