4 Proven Ways to Combat Website Spam
A few days ago I introduced a comment feature to the site, allowing anyone to add a comment to a blog/tutorial post. The problem was that spambots found it quickly and started submitting spammy comments by the hundreds every day. I obviously had to react fast or I would have had a lot to clean up. So here are some great ways to rid your blog/website of spam.
How Spam Gets On Your Site
Spambots tend to look for forms they can submit information to. Whether it's a contact form or an order enquiry form, they will find it and use it to post spammy information. So these forms are what need to be improved to prevent spam from getting through.
Option 1: Login Required
The first obvious option is to require users to log in before they can use the form. This will prevent bots 99.9% of the time, as they would first have to sign up, then log in, then use the form to post spam.
The advantage of this is that your form is extremely secure; I would highly doubt any spam would get through.
The downside is that it requires extra effort from your users before they can use the form. This can often put users off, and you may find a decrease in form submissions.
Option 2: CAPTCHA
For those who don't know, a CAPTCHA is a way to verify that the user is a human by asking them to do something before they submit. A common CAPTCHA method is to generate an image of letters/numbers dynamically through some server-side scripting and make the user type them in. However, an even simpler form of CAPTCHA is to ask a basic maths question such as 'What is 1 + 2?'. The key with CAPTCHA is that every time the page is refreshed, a new CAPTCHA is generated.
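As a rough sketch of the maths-question idea (in Python for illustration; the dictionary stands in for whatever session storage your server-side scripting provides):

```python
import random


def generate_captcha(session):
    """Create a simple maths question and remember the answer server-side."""
    a, b = random.randint(1, 9), random.randint(1, 9)
    session["captcha_answer"] = a + b  # the answer never goes to the client
    return f"What is {a} + {b}?"


def check_captcha(session, submitted):
    """Validate the user's answer, then discard it so it can't be reused."""
    expected = session.pop("captcha_answer", None)
    try:
        return expected is not None and int(submitted) == expected
    except (TypeError, ValueError):
        return False
```

Because the answer lives only in the session and a fresh question is generated on every page load, a bot can't simply replay a known answer.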
The advantage of CAPTCHA is that it's nearly impossible for a spam robot to predict the answer, and the user doesn't have to go through a signup/login process to use the form.
The downside of CAPTCHA is that most human users find it really annoying, and you may also find that if your audience is older, they may have trouble providing the correct answer (no offence, older web users; even I have trouble with CAPTCHA).
Option 3: Hidden Form Field
Another possible option is to add a hidden form field with a field name like 'info' or 'comment'. The idea here is that normal users won't fill in hidden form fields, but robots probably will, as they tend to read only the code and see a field to fill. Then you check whether the field has anything in it. If it does, it must be a robot.
The advantage of this option is that it requires no extra steps from the human user. They can use your site as they wish without needing to log in or type out a random series of letters/numbers. Plus, it's really easy to do.
The downside of this option is that some robots still get around it. I have tried using type="hidden" and even using CSS to position the text field off the screen, and it seems that a lot of the robots have figured out how to overcome both.
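A minimal sketch of the honeypot check, with the field hidden via off-screen CSS rather than type="hidden" (the field name 'info', the CSS class and the form markup are my own illustrative choices, not something from a particular framework):

```python
# Name of the trap field; humans never see it, so any value means a bot.
HONEYPOT_FIELD = "info"

# Example form markup: the honeypot is a normal text input moved off-screen
# with CSS, since many bots skip inputs that are literally type="hidden".
FORM_SNIPPET = """
<style>.hp-field { position: absolute; left: -9999px; }</style>
<form method="post" action="/comment">
  <textarea name="comment"></textarea>
  <input class="hp-field" type="text" name="info" autocomplete="off">
  <input type="submit" value="Post comment">
</form>
"""


def looks_like_spam(form_data):
    """Reject the submission if the honeypot field contains anything."""
    return bool(form_data.get(HONEYPOT_FIELD, "").strip())
```

On the server you would call looks_like_spam() on the posted form data and silently drop anything it flags.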
Option 4: Akismet
The final option is to use a service called Akismet. Basically, it's a global database of information about website spammers; every form submission is compared against it, and Akismet tells you whether it thinks the submission is spam or not. I have to say, I didn't like the idea of relying on a third party, but they do a fantastic job. If WordPress.com uses it, it must be pretty good.
The advantage of Akismet is that it flags spam correctly 99.9% of the time, and it doesn't require any extra work from the user. It's really hard to fault it.
The downsides of Akismet that I can see are: 1) it's a third-party service, 2) it requires a WordPress API key (but they're free), and 3) it needs you to know a little bit about server-side scripting (but you probably already do).
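For illustration, here is a minimal Python sketch of calling Akismet's comment-check endpoint. The parameter names (blog, user_ip, user_agent, comment_content) and the "true"/"false" response come from Akismet's documented REST API; the helper names and example values are my own:

```python
import urllib.parse
import urllib.request

API_HOST = "rest.akismet.com"  # the API key is used as a subdomain


def build_comment_check_request(api_key, blog_url, user_ip, user_agent, content):
    """Build the URL and form-encoded body for Akismet's comment-check call."""
    url = f"https://{api_key}.{API_HOST}/1.1/comment-check"
    body = urllib.parse.urlencode({
        "blog": blog_url,
        "user_ip": user_ip,
        "user_agent": user_agent,
        "comment_content": content,
    }).encode()
    return url, body


def is_spam(api_key, blog_url, user_ip, user_agent, content):
    """POST the submission to Akismet; the response body is 'true' for spam."""
    url, body = build_comment_check_request(
        api_key, blog_url, user_ip, user_agent, content
    )
    with urllib.request.urlopen(url, body) as resp:
        return resp.read().decode() == "true"
```

You would call is_spam() with your API key and the commenter's details before saving the comment, and hold back or discard anything it flags.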
All four options can help prevent spam, and I have used every one of them. However, I have to say that Akismet would be my choice for the best way to combat website spam.
I hope this has helped the odd few people, and if you have any suggestions on ways to combat website spam that you have found helpful, please use the spam-free comment feature below to let us know.
Leave a Comment
Wilde @ 10 Jun 2009 09:26:11 AM
I was only aware of the first 2 before reading this, #3 is especially sneaky :)
David @ 13 Jun 2009 05:53:30 AM
@Wilde
Yes 1 and 2 are some great ideas. Personally, I think Akismet is the best option.
Brennan @ 17 Oct 2009 03:44:54 PM
Hey David,
I commented on your Wordpress article too and this one caught my attention since I built a news site for my school but never implemented any SPAM deterrent. I DID put in a small "Type 'NO SPAM'" which seemed to work a lot (read, all) of the time.
I'm very interested in Akismet though and I've seen it before but only in Wordpress blogs! Can it be used in custom designed sites? I know I'd need a Wordpress API but how hard is it then to use it in a non-crap-cookiecutter-Wordpress site?
David @ 18 Oct 2009 02:20:23 PM
@Brennan
It's very easy to implement. Yes, unfortunately you must have a WordPress API key (which seems a bit stupid if you don't use WordPress). A quick Google for 'akismet api php tutorial' will bring up the information you need.
If I get the time, I might write a quick one. It's very simple.
kcmartz @ 20 Dec 2009 11:56:04 AM
NICE, but there are better spam blockers out there, like wp-spam free, a wordpress spam blocker, it has blocked over 200 spam in 1 month! and I don't have to read each comment, because i rarely get a spam comment now!!!
Sarah @ 5 May 2010 01:25:55 PM
Here's an alternative CAPTCHA that is easier on humans than those awful distorted letters and warped words. It simply asks them to click on certain images, like dogs, flowers and cars: http://www.confidenttechnologies.com/products/confident_CAPTCHA.php.
In the interest of full disclosure -- I work for this company. That being said, I would be very interested in hearing what people think about it. I think that it's definitely easier on the human user, while still being tough on bots, because the pictures change with every session, making it difficult for a bot to break it using random guessing or a brute-force attack.
Traditional text-based CAPTCHAs are a frustrating eye-strain test for users, but a CAPTCHA remains a good layer of security to have on a site (among other security tools that your website should use). It's important to note that bots can do much more damage than simply spread annoying spam… they spread dangerous worms like the Koobface worm and can post links to malicious websites like phishing sites, or websites that can download Trojan horses and keyloggers onto visitors' computers.
David @ 5 May 2010 03:53:38 PM@Sarah
It's a great idea, but I can imagine there are plenty of free alternatives to your CAPTCHA method.