
There is nothing difficult about creating a basic robots.txt file. You can create it in Notepad or any text editor you prefer; just save it as plain text named robots.txt in the root directory of your site. The robots.txt file asks search engine bots to include or exclude particular pages, folders, or other content on your website. Each entry has just two lines:

User-Agent: [Spider or Bot name]
Disallow: [Directory or File Name]

These two lines can be repeated for each directory or file you want to exclude, and for each spider or bot you want to address.
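For instance, a file with several entries, one per bot and separated by blank lines, might look like this (Googlebot and Bingbot are real crawler names, but the paths here are only placeholders):

User-Agent: Googlebot
Disallow: /drafts/

User-Agent: Bingbot
Disallow: /drafts/
Disallow: /old-site/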

A few examples will make it clearer.

1. Exclude a file from an individual search engine

You have a file, privatefile.htm, in a directory called ‘private’, and you do not want Google to index it. You know that the spider Google sends out is called ‘Googlebot’. You would add these lines to your robots.txt file:

User-Agent: Googlebot
Disallow: /private/privatefile.htm
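If you instead wanted to keep Googlebot out of the entire ‘private’ directory, not just the one file, you would disallow the directory itself:

User-Agent: Googlebot
Disallow: /private/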

2. Exclude a section of your site from all spiders and bots

You are building a new section of your site in a directory called ‘newsection’ and do not wish it to be indexed before you are finished. In this case you do not need to name each robot you wish to exclude; you can simply use the wildcard character, ‘*’, to exclude them all.

User-Agent: *
Disallow: /newsection/

Note that there is a forward slash at the beginning and end of the directory name, indicating that you do not want any files in that directory indexed.
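If you leave off the trailing slash, the rule is treated as a plain path prefix, so the entry below would block /newsection.html as well as everything under /newsection/:

User-Agent: *
Disallow: /newsection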

3. Allow all spiders to index everything

Once again you can use the wildcard, ‘*’, to let all spiders know they are welcome. This time, leave the second, Disallow, line empty: an empty value disallows nothing, so the whole site may be indexed (the same effect as having no robots.txt file at all).

User-Agent: *
Disallow:

4. Allow no spiders to index any part of your site

This requires just a tiny change from the entry above, so be careful: that single slash blocks your entire site.

User-Agent: *
Disallow: /
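Once a file like this is live, you may want to confirm it says what you meant. Python’s standard-library urllib.robotparser module can read a published robots.txt and answer fetch queries against it; here is a minimal sketch, with example.com standing in as a placeholder domain:

from urllib import robotparser

# Point the parser at the published file (example.com is a placeholder).
rp = robotparser.RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()  # fetch and parse the live robots.txt

# Ask whether a given bot may fetch a given URL.
# With example 4's file in place, both answers would be False.
print(rp.can_fetch("Googlebot", "https://www.example.com/private/privatefile.htm"))
print(rp.can_fetch("*", "https://www.example.com/"))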

Thanks for your time 🙂
