How to limit crawlers and bots on a website?

kumkumsharma

Administrator
Staff member
Our websites are crawled by various bots and search engines. Sometimes this is beneficial, but sometimes it causes problems, because bots and crawlers consume a lot of bandwidth. To limit this, we can upload a robots.txt file to the document root of our website. In this file we can block bots by adding entries like the following:

Code:
User-agent: Googlebot
Disallow: /

User-agent: Yandex
Disallow: /

User-agent: Slurp
Disallow: /
The above code blocks the entire website for those bots. If you only want to block particular directories, you can use rules like these instead:

Code:
User-agent: Googlebot
Disallow: /cgi-bin/

User-agent: Yandex
Disallow: /wp-admin
 