Booter – Bots Crawlers Manager both prevents (treats in advance) and repairs the damage caused by crawlers and bots.
The plugin builds on a number of existing technologies that crawlers and bots already recognize, and takes them one step further – intelligently and almost fully automatically.
To allow the plugin to work correctly, you must follow the instructions and enter some data manually (a step that must be done by a human being to avoid errors).
At the prevention level
- Booter allows you to manage and create an advanced, dynamic robots.txt file.
- View a 404 error log to identify the most common bad links.
- Block bad bots that cause high server load through very frequent page crawls, or that are used to probe for security vulnerabilities.
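For illustration, a simple robots.txt of the kind the plugin can generate might look like this (the bot name and paths below are examples only, not the plugin's defaults):

```
# Block a misbehaving crawler entirely (example user-agent name)
User-agent: BadBot
Disallow: /

# Allow all other crawlers, but keep them out of the admin area
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
```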
At the treatment level
- Booter allows you to limit the number of requests from crawlers and bots; when a client exceeds the specified number of requests per minute, it is rejected for a specified period of time.
- Reject unwanted links in the fastest way possible – not by simply blocking them, but by sending the appropriate HTTP status code so that search engines forget them.
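The rate-limiting behaviour described above can be sketched roughly as follows – a minimal fixed-window limiter with a cooldown period. The class name, thresholds, and structure are illustrative assumptions, not the plugin's actual code:

```python
import time

class CrawlerRateLimiter:
    """Minimal sketch: per-client request limit with a rejection period."""

    def __init__(self, max_per_minute=60, reject_seconds=600, clock=time.time):
        self.max_per_minute = max_per_minute
        self.reject_seconds = reject_seconds
        self.clock = clock                  # injectable for testing
        self.windows = {}                   # client -> (window_start, count)
        self.rejected_until = {}            # client -> timestamp

    def allow(self, client_ip):
        now = self.clock()
        # Client is still serving its rejection period: refuse immediately.
        if self.rejected_until.get(client_ip, 0) > now:
            return False
        start, count = self.windows.get(client_ip, (now, 0))
        if now - start >= 60:               # start a fresh one-minute window
            start, count = now, 0
        count += 1
        self.windows[client_ip] = (start, count)
        if count > self.max_per_minute:
            # Over the limit: reject this client for a fixed period.
            self.rejected_until[client_ip] = now + self.reject_seconds
            return False
        return True
```

A request for which `allow` returns False would typically be answered with HTTP 429 (Too Many Requests).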
Instructions for use when treating existing damage
- Activate the plugin.
- Enable the 404 error log option.
- Set the access rate limit.
- Watch the 404 log and look for common URL fragments that repeat most often.
- Enter these common fragments on the “reject links” page, and make sure the rejection code is 410.
- Clear the 404 error log.
- Repeat the process once every few hours until the 404 error log remains blank.
- Check the status of your website’s index coverage every few days.
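The reject-links step above can be sketched as a simple substring match that maps unwanted URL fragments to a status code. HTTP 410 (Gone) tells search engines the page was permanently removed; the fragments below are made up for illustration:

```python
# Common URL fragments found in the 404 log, mapped to the rejection
# status code to send. These example values are illustrative only.
REJECT_RULES = [
    ("/old-shop/", 410),
    ("?attack_param=", 410),
]

def status_for(url):
    """Return the rejection status code for a URL, or None to serve normally."""
    for fragment, code in REJECT_RULES:
        if fragment in url:
            return code
    return None
```

Sending 410 rather than a plain block means crawlers receive a definitive answer and stop requesting the URL, which is why the log should eventually remain blank.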
Plugin General Settings
Reject Links Settings
- Upload the booter-crawlers-manager folder to the /wp-content/plugins/ directory
- Activate the plugin through the ‘Plugins’ menu in WordPress
- The plugin will start rate limiting as soon as it is activated; however, it is recommended to adjust the settings to suit your needs under the ‘Settings’ – ‘Booter – Crawlers Manager’ menu
- Updated default settings
- Made the robots block case-sensitive to reduce false positives
- Updated default settings
- UI improvements
- UI fixes
- Added option to create a simple predefined robots.txt file
- Reverted some changes from 1.2
- Default settings changes
- UI and text improvements
- Added more help text
- Readme changes
- Added disavow links tool
- Added help screens
- Added option to add rejected links to robots.txt
- Disabled sending daily 404 report if there were no 404 errors that day
- Minor bug fixes
- Changes in data structure to avoid hitting post max vars limits
- Added additional bad robots
- Added website name to 404 daily emails
- Minor bug fixes and changes
- Added option to reject links based on regular expressions