Let’s talk a little bit about the most common website spam issues and how to actually prevent or resolve them.
Every once in a while I get work order requests from customers on shared hosting environments who are suddenly exceeding their resource usage allowance; this can either cause severe latency and errors on their site or simply bring the entire server down.
In at least half of these situations the issue is caused by overgrown MySQL databases full of spam, and it isn't isolated to a specific CMS: it occurs with WordPress, Joomla and Drupal sites, but also with forums such as SMF and phpBB, or wikis such as MediaWiki.
The root cause is the same in every case: the sites are neglected or abandoned and have no form of security in place for public registration or posting, so they easily fall target to spamming tools such as Scrapebox or similar.
Neglecting to secure the submission forms on a website can lead to situations like the one in the screenshot below: over 250k comments posted and awaiting moderation, and a database grown to over 2GB in size, which obviously causes latency and can take the site down:
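To gauge and clean up damage like this on a WordPress site, a couple of direct queries against the database can help. This is only a sketch assuming a default WordPress schema with the `wp_` table prefix; adjust the prefix for your install, and always take a database backup before running any DELETE:

```sql
-- Count comments still awaiting moderation:
SELECT COUNT(*) FROM wp_comments WHERE comment_approved = '0';

-- Purge everything stuck in the moderation queue (back up first!):
DELETE FROM wp_comments WHERE comment_approved = '0';

-- Remove orphaned comment metadata left behind by the purge:
DELETE FROM wp_commentmeta
WHERE comment_id NOT IN (SELECT comment_id FROM wp_comments);
```

Deleting in SQL like this is far faster than paging through hundreds of thousands of comments in the admin dashboard, which may itself time out on a resource-limited shared account.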
Preventing this from occurring is rather easy. First of all, a website owner should determine whether public registration and commenting are actually needed. If not, disabling both would suffice to prevent such situations.
However, if public registration and commenting are needed, then protecting the forms with a CAPTCHA such as Google's reCAPTCHA should help minimize the volume of incoming spam.
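Whatever CMS or custom form is involved, the server-side half of reCAPTCHA is the same: the token submitted with the form is POSTed to Google's `siteverify` endpoint along with your secret key, and the form is only processed if Google reports success. Here is a minimal Python sketch of that check (the endpoint and the `secret`/`response`/`remoteip` parameters are Google's documented API; the helper names are my own):

```python
import json
import urllib.parse
import urllib.request

# Google's documented verification endpoint for reCAPTCHA.
RECAPTCHA_VERIFY_URL = "https://www.google.com/recaptcha/api/siteverify"

def build_verify_request(secret, response_token, remote_ip=None):
    """Build the urlencoded POST body the siteverify endpoint expects."""
    fields = {"secret": secret, "response": response_token}
    if remote_ip:
        fields["remoteip"] = remote_ip
    return urllib.parse.urlencode(fields).encode()

def verify_recaptcha(secret, response_token, remote_ip=None):
    """Return True only if Google confirms the token (makes a network call)."""
    body = build_verify_request(secret, response_token, remote_ip)
    with urllib.request.urlopen(RECAPTCHA_VERIFY_URL, data=body, timeout=10) as resp:
        result = json.load(resp)
    return bool(result.get("success"))
```

The key point is that the check happens server-side: a spam script can trivially skip the JavaScript widget, but it cannot forge a token that `siteverify` will accept.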
Additionally, security plugins, modules or extensions can be installed, depending on the script used, so that form submissions are cross-checked against spam blacklists, which adds another layer of security.
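Under the hood, most of these plugins query a shared blacklist service such as StopForumSpam, which reports whether an IP, email or username has already been seen in spam submissions. A minimal sketch of such a lookup, assuming the public StopForumSpam API at `api.stopforumspam.org` (the helper names are hypothetical):

```python
import urllib.parse

# Public lookup endpoint of the StopForumSpam blacklist service.
SFS_API = "https://api.stopforumspam.org/api"

def build_sfs_url(ip=None, email=None, username=None):
    """Build a StopForumSpam query URL for the given identifiers."""
    params = {"json": "1"}  # request a JSON-formatted response
    if ip:
        params["ip"] = ip
    if email:
        params["email"] = email
    if username:
        params["username"] = username
    return SFS_API + "?" + urllib.parse.urlencode(params)

def is_listed(api_response):
    """Given the decoded JSON response, return True if any checked
    identifier appears on the blacklist."""
    if not api_response.get("success"):
        return False
    return any(
        isinstance(field, dict) and field.get("appears")
        for field in api_response.values()
    )
```

A registration or comment handler would fetch `build_sfs_url(...)`, decode the JSON, and reject the submission when `is_listed` returns True, silently dropping the bulk of automated spam before it ever reaches the database.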
And lastly, code can be added to .htaccess (if you're on an Apache web server) to block automated submissions and mitigate such attacks before PHP execution is even invoked.
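As a sketch of that approach, the mod_rewrite rules below reject comment POSTs that arrive with no referer from your own domain or with an empty user agent, both common signatures of automated spam scripts. This example assumes WordPress (hence the `wp-comments-post.php` target); `example.com` is a placeholder for your own domain:

```apache
<IfModule mod_rewrite.c>
RewriteEngine On
# Deny POSTs to the comment handler that have no referer from our own
# site, or no user agent at all - typical of spam bots, not browsers.
RewriteCond %{REQUEST_METHOD} POST
RewriteCond %{REQUEST_URI} wp-comments-post\.php
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^$
RewriteRule .* - [F,L]
</IfModule>
```

Because Apache evaluates these rules before the request ever reaches PHP, the spam bot is turned away with a 403 at essentially zero cost, which matters a lot on a resource-capped shared account. Note that the referer check can also block legitimate users behind privacy proxies that strip the header, so test it on your own traffic before relying on it.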