How to configure robots.txt and upload it to the root directory for WordPress blogs

The robots.txt file is a plain text file (.txt extension) used to tell search engines which pages of your site should not be crawled and indexed. With it you can control the bots of search engines such as Google, Yahoo and Bing and stop them from crawling specific areas like the WordPress admin pages, feeds and trackbacks, both as a precaution and to avoid duplicate content. In short, robots.txt is a simple way of telling search engines not to enter a part of your site that is not important for them.

Create a new text document and copy the following code into it. Save the document as robots.txt (the .txt extension keeps it a plain text file). Then upload this file to the root directory of your site.
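A typical robots.txt for a WordPress blog looks something like the block below. The blocked paths follow the areas mentioned above (admin pages, feeds, trackbacks); the Sitemap line uses a placeholder domain, so adjust the paths and the domain to match your own site.

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /trackback/
Disallow: /feed/
Disallow: /comments/feed/

Sitemap: http://yourdomain.com/sitemap.xml

Here Disallow tells every crawler (User-agent: *) to skip the listed folders, while the optional Sitemap line points crawlers to your XML sitemap.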

How to upload the file to the root directory

The robots.txt file has to sit in one specific place: the main (root) directory of your site, because search engines do not search the whole site to find it. Open your FTP client and log in to your server; you will see a number of folders. Look for the public_html folder and move your robots.txt file into it.
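If you prefer to script the upload instead of using a graphical FTP client, a minimal Python sketch using the standard ftplib module might look like this. The host name, login details and the public_html path are placeholders for your own hosting account.

from ftplib import FTP

# Placeholder credentials - replace with your own hosting details
FTP_HOST = "ftp.yourdomain.com"
FTP_USER = "your-username"
FTP_PASS = "your-password"

ftp = FTP(FTP_HOST)
ftp.login(FTP_USER, FTP_PASS)

# public_html is the web root on most shared hosts; adjust if yours differs
ftp.cwd("public_html")

# Upload the local robots.txt file in binary mode
with open("robots.txt", "rb") as f:
    ftp.storbinary("STOR robots.txt", f)

ftp.quit()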

Once the upload is finished, check that the file is really in place by opening http://yourdomain.com/robots.txt in your browser (using your own domain). The page will show the contents of your robots.txt file. With that your robots.txt is configured and you no longer need to worry about this area of your site, because your admin pages are now kept out of Google's crawling and indexing.
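If you want to script this check as well, a short Python sketch with the standard urllib module can fetch the file and print it; the domain is again a placeholder.

from urllib.request import urlopen

# Placeholder address - replace yourdomain.com with your own blog's domain
with urlopen("http://yourdomain.com/robots.txt") as response:
    print(response.read().decode("utf-8"))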
