Recently we added two new options to SpamFireWall: Anti-Flood and Anti-Crawler. Both are meant to block unwanted bots. Most visiting bots do not appear in Google Analytics statistics, so you can't see the exact number of their visits. Nevertheless, bots can put a heavy load on your website and account for a large share of its overall traffic. They gather various data about your website: links, pictures, text and so on. More aggressive bots can copy your website's content to reuse it for themselves.
The Anti-Flood and Anti-Crawler options are intended to block unwanted bots: content parsers, shop price scrapers and aggressive website scanners. Legitimate web crawlers such as Google, Bing, MSN, Yandex and Ahrefs are excluded and will not be blocked.
Anti-Crawler is designed to block any bots (with the exceptions above) on your website. The first visit launches a check of whether the visitor is a bot or a human. If the check fails, the second page view returns a blocking screen. The IP address is added to the blacklist for 10 minutes; when this period expires, the data about that IP address is deleted.
This option helps to block content parsing and HTTP DDoS attacks.
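The Anti-Crawler flow described above can be sketched roughly like this. This is a minimal illustration only: the function names, the in-memory dictionaries and the `passed_check` flag are assumptions for the sketch, not CleanTalk's actual implementation, and the real bot/human detection logic is not shown.

```python
import time

BLACKLIST_TTL = 600   # 10 minutes, as described in the text

seen = {}        # ip -> result of the first-visit check (hypothetical storage)
blacklist = {}   # ip -> time when the blacklist record should be dropped

def handle_visit(ip, passed_check, now=None):
    """Return 'allow' or 'block' for one page view.

    `passed_check` stands in for the first-visit bot/human check;
    the real detection logic is CleanTalk's, here it is just a flag.
    """
    now = time.time() if now is None else now
    # Expired blacklist records are deleted, per the description
    if ip in blacklist and now >= blacklist[ip]:
        del blacklist[ip]
        seen.pop(ip, None)
    if ip in blacklist:
        return "block"
    if ip not in seen:
        # First visit: the check runs, the page itself is still served
        seen[ip] = passed_check
        return "allow"
    if not seen[ip]:
        # The check failed on the first visit: the second page is blocked
        # and the IP goes to the blacklist for 10 minutes
        blacklist[ip] = now + BLACKLIST_TTL
        return "block"
    return "allow"
```

A human visitor (who passes the check) browses freely; a bot gets its first page, fails the check, and is blocked from the second page onward until the 10-minute record expires.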
Anti-Flood is designed to stop aggressive bot behavior (with the same exceptions). The option counts how many pages one IP visits within 1 minute. If the number of visited pages exceeds the set threshold, that IP sees a blocking screen. The blocking screen stays active for 30 seconds; when this period expires, the IP address can visit the website again until it exceeds the threshold once more.
By default, the threshold is 10 pages per minute. This number was picked based on statistics: a normal visitor usually opens about 3-4 pages at once, not 10, so the threshold is set with a margin.
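The counting logic described above amounts to a per-IP sliding-window rate limiter. Below is a minimal sketch under the stated numbers (10 pages per minute, a 30-second block); the function name, data structures and the decision to reset the counter on a block are assumptions for the sketch, not CleanTalk's actual implementation.

```python
import time
from collections import defaultdict, deque

THRESHOLD = 10    # pages per window (the default from the text)
WINDOW = 60       # seconds
BLOCK_TIME = 30   # seconds the blocking screen stays active

visits = defaultdict(deque)   # ip -> timestamps of recent page views
blocked_until = {}            # ip -> time when the block expires

def allow_request(ip, now=None):
    """Return True if the page view may pass, False if the IP sees the block screen."""
    now = time.time() if now is None else now
    # Still inside the 30-second block?
    if ip in blocked_until:
        if now < blocked_until[ip]:
            return False
        del blocked_until[ip]   # block expired, the IP may browse again
    q = visits[ip]
    # Drop page views older than the 1-minute window
    while q and now - q[0] > WINDOW:
        q.popleft()
    q.append(now)
    if len(q) > THRESHOLD:
        # Threshold exceeded: block for 30 seconds and restart the count
        # (resetting the counter here is an assumption of this sketch)
        blocked_until[ip] = now + BLOCK_TIME
        q.clear()
        return False
    return True
```

With these numbers, ten page views in a minute pass, the eleventh triggers the 30-second blocking screen, and after it expires the IP browses normally until it exceeds the threshold again.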
You can set your own threshold at any time: https://cleantalk.org/help/anti-flood-and-anti-crawler