How to use robots.txt in Blogger/WordPress


As a blogger, you have probably heard about robots.txt at least once, but you may still be unsure what it is actually used for. If so, you are in the right place. Robots.txt, formally the Robots Exclusion Protocol, is a standard that websites use to communicate with web crawlers such as Googlebot and Bingbot. Website owners place a /robots.txt file at the root of their site to give instructions to these robots: where to find the site's sitemap, and which directories or pages should be kept out of search results. Search engine robots read robots.txt as a set of permissions that tells them which parts of your site they may crawl and later index in search results.
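To see how a crawler interprets these permissions, here is a minimal sketch using Python's standard-library robots.txt parser. The rules and page paths below are hypothetical examples that mirror the template used later in this post:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules: AdSense's crawler may fetch everything,
# while all other bots are blocked from one private page.
rules = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /private-page.html
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A generic crawler (matched by "User-agent: *") is blocked from the
# disallowed page but allowed everywhere else.
print(parser.can_fetch("Googlebot", "/private-page.html"))
print(parser.can_fetch("Googlebot", "/public-post.html"))

# Mediapartners-Google has an empty Disallow, so it may fetch anything.
print(parser.can_fetch("Mediapartners-Google", "/private-page.html"))
```

Note that robots.txt is only a request, not an enforcement mechanism: well-behaved crawlers honor it, but it does not password-protect anything.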


  • How to use robots.txt in Blogger?

Step 1: Go to Blogger > Settings > Search Preferences.

Step 2: Edit and enable Custom robots.txt in the Crawlers and Indexing section.


Step 3: Copy and paste the code below, add your sitemap URL and the pages you want to disallow, and save your changes.

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /first-page-you-want-hidden
Disallow: /second-page-you-want-hidden
Allow: /
Sitemap: http://www.yourdomain.com/sitemap.xml


Replace the placeholder paths and the sitemap URL with your own details, then save your changes. Note that Disallow takes a path relative to your domain (for example /p/contact.html), not a full URL.
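For reference, here is what the finished file might look like for a hypothetical blog at www.example.com that wants to hide a contact page and one old post (the domain and paths are illustrative placeholders only):

```
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /p/contact.html
Disallow: /2019/01/private-post.html
Allow: /
Sitemap: http://www.example.com/sitemap.xml
```

The Mediapartners-Google entry with an empty Disallow lets Google's AdSense crawler see all pages, so ads stay relevant even on pages hidden from search.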

  • How to use robots.txt in WordPress?
Step 1: Open your favorite text editor; Notepad works fine if you are on Windows.

Step 2: Copy and paste the code below into the text editor.


User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /first-page-you-want-hidden
Disallow: /second-page-you-want-hidden
Allow: /
Sitemap: http://www.yourdomain.com/sitemap.xml


Step 3: Replace the placeholder paths with the pages you don't want crawlers to access, and add your own sitemap URL.

Step 4: Save the file as robots.txt (note the "s": crawlers will not find a file named robot.txt) and upload it to the root directory of your site.
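The two WordPress steps above can be sketched from the command line as follows; the rules and domain are hypothetical placeholders:

```shell
# Create the file locally. The exact name matters: crawlers only look
# for "robots.txt" at the site root.
cat > robots.txt <<'EOF'
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /private-page/
Allow: /
Sitemap: http://www.example.com/sitemap.xml
EOF

# Confirm the file exists and contains a sitemap line before uploading
# it (via FTP/SFTP or your host's file manager) to the root directory.
grep "^Sitemap:" robots.txt
```

After uploading, you can verify the result by visiting http://www.yourdomain.com/robots.txt in a browser; the file's contents should appear as plain text.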
