To block bad bots using their user-agent
You can add a directive in your .htaccess file to block a bad bot by matching an expression in the spambot's user-agent string. Don't copy and paste the long lists of user agents that you can find online; block only the bots that actually visit your website. In the past, I made the mistake of copying such a list into my .htaccess, and suddenly no visitor using Firefox could access my website. To block bad bots by user-agent, check your log, find the spambot's user-agent string, and identify a distinctive expression in it (like jojolaf). Then add a new directive RewriteCond %{HTTP_USER_AGENT} jojolaf [NC] at the top of your rules; NC makes the match case insensitive. You can combine several expressions like this:
RewriteCond %{HTTP_USER_AGENT} (andsoone|bloup|badbot) [NC]
RewriteRule .* - [L,R=404]
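To find out which user agents are actually hitting your site, you can summarize them from the Apache access log. This is a minimal sketch assuming the default combined log format, where the user agent is the sixth double-quoted field; the log path varies by distribution:

```shell
# Count requests per user agent (combined log format assumed).
# Splitting each line on double quotes puts the user agent in field 6.
awk -F'"' '{print $6}' /var/log/apache2/access.log | sort | uniq -c | sort -rn | head -20
```

The most frequent suspicious entries in this list are the ones worth blocking.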
To block bad bots using IP address
In this case, you list the offending IP addresses in your .htaccess file:
Order Deny,Allow
Deny from 192.168.1.1
Deny from 192.168.1.2
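Note that on Apache 2.4 and later, Order/Deny/Allow is deprecated (it lives in mod_access_compat). The equivalent with mod_authz_core, using the same example addresses, looks like this:

```apache
<RequireAll>
    Require all granted
    Require not ip 192.168.1.1
    Require not ip 192.168.1.2
</RequireAll>
```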
To block bad bots using “referer”
The “referer” header is now used by some people to advertise in your Google Analytics or website statistics (referrer spam). You can block those bots by rejecting all requests whose referer contains a specific expression.
RewriteCond %{HTTP_REFERER} (autoseoservice|nubuilderoo|anot) [NC]
RewriteRule .* - [L,R=404]
Explanation of the last command:
RewriteRule .* - [L,R=404] => It returns a 404 error page. This is done on purpose, so the spambot believes the page does not exist and stops coming back.
Compressed version:
RewriteCond %{HTTP_USER_AGENT} (baidu|jojolaf) [NC,OR]
RewriteCond %{HTTP_REFERER} (autoseoservice|nubuilderoo|dddd) [NC]
RewriteRule .* - [R=404,L]
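Before deploying, you can sanity-check a pattern locally with grep -iE: -i mirrors the case-insensitive [NC] flag and -E uses the same alternation syntax as the RewriteCond patterns. A sketch with the example expressions above:

```shell
# A referer that should be blocked: the pattern matches case-insensitively.
echo 'http://AutoSeoService.example/page' | grep -iqE '(autoseoservice|nubuilderoo|dddd)' && echo blocked

# A legitimate referer: no match, so "allowed" is printed by the fallback branch.
echo 'https://www.google.com/' | grep -iqE '(autoseoservice|nubuilderoo|dddd)' || echo allowed
```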
To check that it is working, download the Chrome extension ModHeader. Add a “User-Agent” (or “Referer”) request header containing one of the blocked expressions, then reload your site: you should get a 404 error.
Take into account that third-party anti-spam services like StopForumSpam, Akismet, honeypots, and DNS blacklists can complement these .htaccess rules.
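If you prefer the command line to a browser extension, curl can send a forged User-Agent (-A) and Referer (-e) and print only the HTTP status code. This is a sketch; replace the example.com URL with your own site. If the rules are in place and the expression matches, the printed code should be 404:

```shell
# Send a fake User-Agent and Referer, print only the status code.
# The URL is a placeholder: point it at your own site.
curl -s -o /dev/null -w '%{http_code}\n' \
  -A 'jojolaf' \
  -e 'http://autoseoservice.example/' \
  https://example.com/
```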