Robots

Create a Perfect Robots.txt File

Robots.txt is a plain text file that tells search engine crawlers which pages on your website they may crawl and which they should not.

The robots.txt file must be placed in your website's root directory. If you don't want search engines to crawl certain parts of your site, such as the admin area or cgi-bin, use the Disallow directive. For example:

# Apply these rules to all crawlers
User-agent: *
# Block the WordPress admin area and CGI scripts
# (the trailing slash limits each rule to that directory)
Disallow: /wp-admin/
Disallow: /cgi-bin/
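Disallow rules match URL paths by prefix. If you want to sanity-check rules before deploying them, Python's standard-library robots.txt parser can evaluate them locally; this is just a sketch, and the paths below are placeholders for illustration:

from urllib.robotparser import RobotFileParser

# Parse the same rules shown above, without fetching anything
rules = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /cgi-bin/
"""
parser = RobotFileParser()
parser.parse(rules.splitlines())

# Blocked: the path starts with a disallowed prefix
print(parser.can_fetch("*", "/wp-admin/options.php"))  # False
# Allowed: no rule matches this path
print(parser.can_fetch("*", "/blog/"))                 # True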

Save these lines in a text file named "robots.txt" and upload it to your website's root directory. Crawlers that respect the file will then stop crawling paths such as /wp-admin/ and /cgi-bin/. Keep in mind that robots.txt controls crawling, not indexing: a blocked URL can still appear in search results if other sites link to it.
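Once the file is live, it's worth confirming that crawlers can actually reach it at the root of your domain. A minimal check, assuming example.com stands in for your own domain:

from urllib.robotparser import RobotFileParser

# example.com is a placeholder; substitute your own domain
parser = RobotFileParser()
parser.set_url("https://example.com/robots.txt")
parser.read()  # fetches the live file from the site root

# Should print False once the Disallow rules are in place
print(parser.can_fetch("*", "https://example.com/wp-admin/"))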

If you want to get more traffic to your website, get in touch today.