How to Manage Bots With robots.txt and .htaccess
Written by Rapyd Team
Updated over 3 months ago

What Are Bots?

Bots, also known as web crawlers or spiders, are automated programs designed to perform repetitive tasks on the internet. They range from beneficial bots, like search engine crawlers, to malicious bots, like scrapers or bots designed to exploit website vulnerabilities. While many bots are essential to the web ecosystem (helping search engines index websites, for example), not all bots are good for your site.

Pros and Cons of Bots

Bots, while often essential for various online functions, come with both advantages and disadvantages. On one hand, they automate tasks, improve efficiency, and enhance user experience. On the other hand, they can also lead to issues like server overloads and security vulnerabilities. Understanding these pros and cons can help you manage bot traffic effectively.

Pros of Bots

  • Search Engine Indexing: Bots like Googlebot and Bingbot crawl your site to index its content, making it discoverable on search engines.

  • Monitoring and Automation: Good bots can check website performance, uptime, and security vulnerabilities, ensuring optimal functionality.

Cons of Bots

  • Excessive Server Load: Some bots can crawl your site too aggressively, consuming server resources and slowing down your website.

  • Security Risks: Malicious bots can attempt to exploit security vulnerabilities, scrape content, or overload your site with requests, leading to downtime or data breaches.

  • Spam: Bots may fill out forms, leave spammy comments, or trigger automated actions that can damage your site’s credibility.

How to Manage or Block Bots

Bots can be managed in several ways, and one of the simplest is the robots.txt file: a text file located in your site’s root directory that tells bots which parts of your site they are allowed or disallowed from crawling. This is an effective way to direct well-behaved bots (like search engine crawlers), but not all bots respect the file, so further action may be required in some cases (see below).

Method 1: Managing Bots with the robots.txt File

The robots.txt file can be accessed through SFTP or from the Rapyd dashboard file manager. If you do not see one in the webroot directory, you can create one there. You can use the following as a standard boilerplate for WordPress websites:

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
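
Once the file is in place, it should be publicly reachable at /robots.txt on your domain. If you want to confirm it is being served, a minimal Python sketch (with example.com standing in as a placeholder for your own domain) looks like this:

from urllib import request

# Fetch the live robots.txt (replace the placeholder domain with your site).
with request.urlopen("https://example.com/robots.txt") as resp:
    print(resp.status)  # expect 200 once the file exists in the webroot
    print(resp.read().decode("utf-8", errors="replace"))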

Plugins like Yoast SEO or All in One SEO can also create and manage a robots.txt file for your site. However, these plugins touch nearly every part of your site and can sometimes cause unwanted performance issues, so review their settings carefully if you choose to use one.

If you prefer to do things manually, create the robots.txt file, add the boilerplate from above, then add the code snippets below according to your needs.

1. Blocking All Bots from Your Entire Site:

User-agent: *
Disallow: /

This blocks all bots from crawling your website.

2. Blocking Specific Bots from Your Entire Site:

User-agent: BadBot
Disallow: /

This blocks only the specified bot (e.g., BadBot) from crawling your website.

3. Blocking All Bots from a Specific Directory:

User-agent: *
Disallow: /private/

This blocks all bots from accessing the /private/ directory.

4. Blocking Specific Bots from a Specific Directory:

User-agent: BadBot
Disallow: /private/

User-agent: *
Allow: /private/

In this case, only BadBot is restricted from crawling the /private/ directory, while all other bots are allowed.

5. Allowing Specific Bots While Blocking Others from a Directory:

User-agent: Googlebot
Allow: /private/

User-agent: *
Disallow: /private/

This allows Googlebot to crawl the /private/ directory, while all other bots are blocked.
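
Before publishing a rule set, you can sanity-check how a compliant crawler will interpret it using Python’s standard-library urllib.robotparser. The sketch below tests the rules from example 5; the file paths are just illustrations:

from urllib import robotparser

# Rules from example 5: allow Googlebot into /private/, block all other bots.
rules = """
User-agent: Googlebot
Allow: /private/

User-agent: *
Disallow: /private/
""".splitlines()

parser = robotparser.RobotFileParser()
parser.parse(rules)

print(parser.can_fetch("Googlebot", "/private/report.html"))  # True
print(parser.can_fetch("BadBot", "/private/report.html"))     # False
print(parser.can_fetch("BadBot", "/blog/"))                    # True: only /private/ is disallowed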

Method 2: Managing Bots with WordPress Plugins

In addition to the robots.txt file, several WordPress plugins are designed to block harmful or unwanted bots. These plugins provide more advanced bot management features, such as automatic detection and blocking. Here are two popular WordPress plugins for controlling bot traffic:

  1. Blackhole for Bad Bots
    This plugin creates a hidden link that only bots can follow. When a bot accesses the hidden link, it triggers an automatic block, permanently preventing the bot from crawling your site. It’s lightweight and effective, especially for blocking malicious bots.

  2. StopBadBots
    StopBadBots works by using an updated blacklist of known malicious bots to block them from crawling your site. It logs all bot activity, allowing you to review and take action if needed. This plugin is highly effective at stopping spam and harmful bots.

Note: While plugins like Blackhole for Bad Bots and StopBadBots can be effective, we generally recommend avoiding the use of unnecessary plugins, as they can impact your site’s performance. Be sure to evaluate the need for each plugin and choose those that offer the most benefit with minimal performance impact.

Using .htaccess and Blocking User Agents

Some bad bots ignore the instructions in your robots.txt file and continue to crawl your site, causing issues. In these cases, you can take stronger measures by using .htaccess rules and blocking specific user agents.

When to Use .htaccess to Block Bots:

  • When bots are consuming excessive server resources.

  • When bots are ignoring the robots.txt file.

  • When bots are scraping your content or attempting to exploit vulnerabilities.

How to Block Bots Using .htaccess:

The .htaccess file is a powerful configuration file used by LiteSpeed and Apache web servers. You can block bots by user-agent or by IP address.

1. Blocking Specific User-Agents: If you know the user-agent of the bot you want to block (e.g., BadBot), add the following lines to your .htaccess file:

RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} ^BadBot [NC]
RewriteRule .* - [F,L]

This returns a 403 Forbidden response to any request whose user-agent string begins with BadBot.
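
If you want to confirm the rule is working, one option is to send a request that spoofs the blocked user agent and check the status code; a blocked agent should receive a 403 response. Below is a minimal Python sketch (https://example.com/ is a placeholder for your own site):

from urllib import error, request

def status_for(user_agent, url="https://example.com/"):
    # Send a request pretending to be the given user agent.
    req = request.Request(url, headers={"User-Agent": user_agent})
    try:
        with request.urlopen(req) as resp:
            return resp.status
    except error.HTTPError as exc:
        return exc.code

print(status_for("BadBot"))       # expect 403 if the rewrite rule is active
print(status_for("Mozilla/5.0"))  # a normal browser string should still get 200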

2. Blocking Multiple User-Agents: To block multiple bots by their user-agents, you can list them as follows:

RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} ^BadBot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^EvilBot [NC]
RewriteRule .* - [F,L]

This blocks both BadBot and EvilBot.
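
If the list of unwanted bots grows, you can keep it manageable by combining the user agents into a single condition with regular-expression alternation; the following is equivalent to the rules above:

RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} ^(BadBot|EvilBot) [NC]
RewriteRule .* - [F,L]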

3. Blocking Bots by IP Address: If a bot’s IP address is known, you can block it by adding the following to your .htaccess file:

Order Deny,Allow
Deny from 192.168.1.1

Replace 192.168.1.1 with the IP address of the bot you want to block.
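
The Order and Deny directives use the older Apache 2.2 syntax, which still works on servers that provide backward compatibility (Apache’s mod_access_compat module, for example). If your server uses the newer Apache 2.4 authorization syntax, the equivalent rule looks roughly like this:

<IfModule mod_authz_core.c>
    <RequireAll>
        Require all granted
        Require not ip 192.168.1.1
    </RequireAll>
</IfModule>

As before, replace 192.168.1.1 with the address you want to block.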

Considerations for Blocking with .htaccess:

  • Blocking bots at the server level using .htaccess can be very effective but must be done with caution. Be sure you’re only blocking malicious or unwanted bots and not legitimate traffic.

  • Monitor your logs regularly to avoid over-blocking and ensure that no legitimate bots (such as Googlebot) are affected by your rules; the sketch below shows one quick way to review which user agents are hitting your site.
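
For a quick pass over an access log, the sketch below counts requests per user agent so you can spot unusually aggressive crawlers. It assumes a combined-format log, and the log path is only an example; adjust it to wherever your host stores access logs.

import re
from collections import Counter

LOG_PATH = "/var/log/apache2/access.log"   # example path: adjust for your server
UA_PATTERN = re.compile(r'"([^"]*)"\s*$')  # the user agent is the last quoted field

counts = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        match = UA_PATTERN.search(line)
        if match:
            counts[match.group(1)] += 1

# Show the ten busiest user agents.
for user_agent, hits in counts.most_common(10):
    print(f"{hits:6d}  {user_agent}")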

Conclusion

Controlling bot traffic is essential for ensuring the performance, security, and SEO of your WordPress website. Using a combination of the robots.txt file and WordPress plugins like Blackhole for Bad Bots and StopBadBots provides a robust defense against unwanted bots. However, in cases where bots don’t respect the robots.txt file, using .htaccess to block specific user agents or IP addresses adds an additional layer of protection.

By managing bot traffic effectively, you can protect your website from performance degradation, content scraping, and malicious attacks while allowing search engines and beneficial bots to index your site properly.

For further assistance in managing bots or securing your website, feel free to contact our support team!
