
Now that Xrumer 7.0 Elite can make sense of most CAPTCHAs, we need a new Completely Automated Public Turing test to tell Computers and Humans Apart.

Why are we making humans prove they are human when it is much simpler to catch robots being robots? Let’s create a completely confusing UX for spambots and be done with most of their shenanigans!

First of all, no system will catch all spam, and it is partly your responsibility as a website owner to control spam, even after the fact, rather than placing that responsibility on your human users. But do install a flagging system if you have a User Generated Content (UGC) community, and take action when your users help you out. Remember when all you could find on most sites was PORN? Website owners would say, “Well, we can’t control what users upload,” but Flickr came along and fixed that problem by tagging adult content. There’s still plenty of adult content on Flickr, but you can filter it out, which is the best solution. And so Flickr, with its teen- and woman-friendly site, prospered. I assume there are a million random cam sites on the web now, but because of Chat Roulette‘s biggest flaw, I would rather not go check whether they have installed a ‘wanker’ filter system. The site that does, however, might find a wider audience of women and reach mainstream popularity.

So on to the actual advice, which, as usual, is simple, unglamorous and dirt cheap to implement…

I's in Ur WordPress, fixin' Ur Spam

1. Implement confusing field names

Robots have been trying to overwhelm my clients’ websites with thousands of sign-ups and spam comments for 5-6 years. I have not personally faced an attack, and I use Akismet profusely, but all sites with a contact or signup form eventually fall prey to the dreaded spambots. Spambots can fill your entire database and lock you out of WordPress in very little time. After years of studying them, I have realized that these bots are tuned to recognize input labels and provide likely answers. Once I implement a spam-killing feature, a client’s site falls out of rotation, meaning it is removed from the ‘easy prey’ list, and the spamming ceases.

If your form validates certain fields, you can simply assign misleading names to those fields. For instance, your dual e-mail fields could be named ‘address’ and ‘phonenumber’. A bot keying off those labels is unlikely to provide content that validates as an e-mail address, and it is also unlikely to enter the same information twice. A human user will never see these internal names or have to jump through any hoops.
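The mislabeled-field trick can be sketched server-side in a few lines. This is a minimal Python sketch, assuming the two real e-mail inputs are named ‘address’ and ‘phonenumber’ as in the example above; the function name and the simple regex are my own illustration, not from any particular WordPress plugin.

```python
# Hypothetical server-side check for the mislabeled-field trick.
# The form's two real e-mail inputs are deliberately named 'address' and
# 'phonenumber'; a bot keying off those labels tends to fill them with a
# street address and a phone number, which fail e-mail validation and
# won't match each other.
import re

# Deliberately loose e-mail pattern -- enough to reject obvious non-e-mails.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def looks_like_spam(form: dict) -> bool:
    """Return True if the submission smells like a bot."""
    first = form.get("address", "").strip()       # really the e-mail field
    second = form.get("phonenumber", "").strip()  # really the confirmation field
    if not EMAIL_RE.match(first):
        return True  # bot supplied a street address, not an e-mail
    if first != second:
        return True  # bot won't repeat the same value twice
    return False
```

A submission like `{"address": "123 Main Street", "phonenumber": "555-1234"}` fails both checks, while a human who typed their e-mail into both visible fields passes untouched.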

2. Implement fake-CAPTCHA

I recently left a job that had me integrating contests on a buggy third-party platform and finding creative workarounds. One of the more grating problems had to do with removing template form fields I did not want: when I removed the code for these fields, the various forms would cease to work. An integrator colleague suggested simply hiding them with CSS.

And so came the revelation that this could be used to fool robots. Because robots fill out every field, simply reversing the CAPTCHA behavior and sending any submission WITH a CAPTCHA answer to the spam box will catch a lot of suspect submissions. A bonus of such a system is that it is so simple to implement: you don’t actually have to validate the CAPTCHA, only check whether the form field was filled out. A human user will never see the field or have to jump through any hoops.
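The fake-CAPTCHA check reduces to one line of server-side logic. A minimal sketch, assuming the decoy field is named ‘captcha_answer’ (my placeholder) and is hidden from humans with CSS such as `display: none`:

```python
# Sketch of the reverse-CAPTCHA idea: the form includes a CAPTCHA-looking
# field that humans never see because it is hidden with CSS, so only a
# bot that dutifully fills every field will submit an answer.
# The field name 'captcha_answer' is illustrative, not prescribed.
def is_bot_submission(form: dict) -> bool:
    """Flag any submission where the invisible fake CAPTCHA was answered."""
    # No validation of the answer is needed: the mere presence of a value
    # in a field no human can see marks the submission as spam.
    return bool(form.get("captcha_answer", "").strip())
```

Note that the check ignores whitespace-only values, so an autofilling browser that leaves the field blank won’t trip it.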

3. Implement a geeky or pop-culturally narrow CAPTCHA system

If you implement a custom CAPTCHA that is incomprehensible to people outside your field, or one based on a pop-cultural reference so narrow it would make no sense to the average gold-farmer-for-hire CAPTCHA breaker, then it might take a bit longer to break. While working on an excellent site aimed at women over 40, I came up with this CAPTCHA system to amuse my colleagues.

This CAPTCHA example was generated randomly from Google Image results for “cougar in a leopard print“. (As a 39-year-old woman whose ex is only 24, I hate the terminology BUT I love Kevin Nealon.)
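A niche-knowledge CAPTCHA like the one above can be sketched as a small question pool plus an answer check. The questions and accepted answers below are my own placeholders, not from the original system; the point is only the shape of the mechanism:

```python
# Minimal sketch of a narrow, in-joke CAPTCHA: a small pool of questions
# that only the site's intended audience would answer easily.
# Questions and answers are illustrative placeholders.
import random

QUESTIONS = {
    "Which actor plays Doug on 'Weeds'?": {"kevin nealon"},
    "What does UGC stand for (three words)?": {"user generated content"},
}

def pick_question() -> str:
    """Choose a random challenge to render in the form."""
    return random.choice(list(QUESTIONS))

def check_answer(question: str, answer: str) -> bool:
    """Accept any known answer, ignoring case and surrounding whitespace."""
    return answer.strip().lower() in QUESTIONS.get(question, set())
```

The obvious trade-off: a pool this small is trivial for a human spammer to enumerate, so the pool should grow with the community, and obscurity here only buys time, not immunity.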

I still have clients who think their sites are specifically being targeted for annihilation, and I will always have to remind them that spambots are a part of everyday life on the Web for EVERYONE. Any measure to discourage the automatic submission of unrelated or useless content can only do so much. With many individuals, corporations and government entities financing initiatives to create fake social media accounts to control the message about their brand or to do surveillance, we are entering a new era in which humans will have to moderate ‘shills’ and ‘spooks’ to preserve the sincerity and quality of their communities, and that cannot be done with technology alone.

ETA: I highly recommend this article by David Bushell on Smashing Magazine: In Search Of The Perfect CAPTCHA. It’s the most complete and fascinating list of (very expensive) solutions I’ve found. Of course, I completely disagree with the statement by Facebook’s Alex Rice that “Hackers halfway across the world might know your password, but they don’t know who your friends are,” because I have proven that the contrary is much more likely in my article Facebook Unintentional Feature – Retrieving a Hidden Friend List.