Brute force referer spamming

Over the last two days, I have been the target of some real brute-force referer spamming attempts, from IP blocks owned by a “Russian co-location provider.”:http://whois.sc/85.192.40.13
The current count for August 15th and 16th is a staggering _16189_ attempts at referer spam, and they’re still at it.
My experience tells me that complaining to the provider doesn’t help, so I have added the following to .htaccess:
bc. Deny from 85.192.40.
Deny from 85.192.41.
Alternatively, if you would like them to taste their own medicine, you can do something like this:
bc. RewriteEngine On
SetEnvIfNoCase Remote_Addr "^85\.192\.4(0|1)\.[0-9]{1,3}$" RefererSpam
RewriteCond %{ENV:RefererSpam} ^1$
RewriteCond %{HTTP_REFERER} ^(.*)$
RewriteRule ^(.*)$ %1 [R=301,L]

4 Comments

  1. Hello!
I’ve been fighting referer spam too for a few months now. I was getting it from many different IPs, but with various ‘specific’ keywords (domain names), offering different products (you know what type of medical products I mean), online gambling, and so on. To fight this kind of referer spam (I was getting A LOT of it) I was forced to use a RewriteMap prg:/check_referer.pl with a RewriteCond that checks the HTTP_REFERER against the RewriteMap Perl file. The Perl file, of course, checks for ‘bad’ keywords with a regular expression. The file is easy to update, and I can easily add new keywords. I went this route because I wasn’t happy with the widely available suggestions to just add the keywords (or the IPs) you want blocked to .htaccess.
Also, there seems to be a new ‘trick’ in town. I don’t know if it has happened to you, but there are now various bots which reload a specific page (and file) from the server many times in a single hour. Two bots managed to make about 24,000 hits each on my server in just one hour.
    Seems spammers can’t get no sleep :).
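The RewriteMap @prg:@ approach described in the comment above could look something like this minimal sketch. The commenter’s script is in Perl; this is an illustrative Python stand-in, and the keyword list is made up. Apache keeps such a program running for the lifetime of the server, feeding it one lookup key per line on stdin and reading one answer per line from stdout:

```python
import re
import sys

# Illustrative keyword list; the commenter keeps his in an easily updated file.
BAD_KEYWORDS = re.compile(r"casino|poker|viagra|pharmacy", re.IGNORECASE)

def verdict(referer):
    """Return 'spam' when the referer matches a bad keyword, 'ok' otherwise."""
    return "spam" if BAD_KEYWORDS.search(referer) else "ok"

if __name__ == "__main__":
    # Apache writes one lookup key per line; each answer must come back
    # as a single line on stdout, flushed immediately.
    for line in sys.stdin:
        print(verdict(line.strip()), flush=True)
```

On the Apache side this would be wired up with something along the lines of "RewriteMap referercheck prg:/check_referer.py" plus a "RewriteCond ${referercheck:%{HTTP_REFERER}} =spam" (map name and path are hypothetical here).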

  2. Daniel

     /  2005-08-17

What exactly does that .htaccess “script” do?

  3. The theory behind the referer spam URL is described in “Referer spam mirror”:http://virtuelvis.com/archives/2005/06/spam-mirror – but I’ll explain it quickly anyway:
    # The SetEnvIf checks that the remote address is one of the machines that is spamming.
# The next three rules pick up the referer of the request, and send a 301 Moved Permanently redirect back to the URL the spammers are using.
The idea is that instead of wasting my bandwidth, I send the traffic back to the spammers and waste theirs instead.

  4. To Daniel (if you asked about my “scripts”):
The check_referer Perl script simply checks the referer URL against a regular expression. I keep updating the list of keywords not allowed in referer URLs. Yes, I don’t care if a user searching Google for porn finds my site by some miracle; I feel better forbidding access to those users :) than risking letting true spam bots flood my server.
The check_IP script monitors, in memory, the hits from a specific IP. If that IP makes more hits than allowed in a specified interval, I simply ban it :). The ban duration is just half an hour (configurable, of course), and the ban timer is reset each time the bot tries again. This means that if you get banned now and try again in 25 minutes, then … you’ll have to wait another 30 minutes to get unbanned. It took some ‘calibration’ to get the settings right, but now the script no longer bans IPs that it shouldn’t.
I know the first thing you probably think of is the performance impact. I thought of that myself, but … the performance impact is insignificant compared to the performance gain (yes, performance gain). What do I mean? Instead of dynamically generating the page interface of my site (which is quite dynamic) every time a bot asks for it (like 24,000 times in just one hour), it’s much better to just give it a ‘forbidden’ error (I’ll probably use a 301 instead of a 403). That’s a true performance gain. Another gain is bandwidth (I currently have a slow internet connection for the server). Plus, the overhead of the Perl scripts themselves is minimal, because Apache starts them only once, as separate processes, and keeps them running, sending them just the HTTP_REFERER and REMOTE_ADDR via STDIN when a request comes in.
    That’s about all.
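The hit-counting and ban-reset behaviour described in the comment above could be sketched roughly like this. This is a minimal in-memory Python illustration, not the commenter’s Perl script; the class name, hit threshold, and window length are all made up, and only the half-hour ban and the reset-on-retry behaviour come from the comment:

```python
import time
from collections import defaultdict

# Illustrative thresholds; the commenter's real values came from 'calibration'.
MAX_HITS = 100          # hits allowed per window
WINDOW = 60.0           # seconds
BAN_DURATION = 1800.0   # half an hour, as in the comment

class IpBanList:
    def __init__(self):
        self.hits = defaultdict(list)  # ip -> timestamps of recent hits
        self.banned_until = {}         # ip -> time the ban expires

    def allow(self, ip, now=None):
        """Record a hit from ip; return False if the request should be refused."""
        now = time.time() if now is None else now
        # A banned IP that tries again has its ban timer reset, so retrying
        # 25 minutes into a ban means waiting another full 30 minutes.
        if ip in self.banned_until:
            if now < self.banned_until[ip]:
                self.banned_until[ip] = now + BAN_DURATION
                return False
            del self.banned_until[ip]
        # Drop hits that fell out of the window, then record this one.
        recent = [t for t in self.hits[ip] if now - t < WINDOW]
        recent.append(now)
        self.hits[ip] = recent
        if len(recent) > MAX_HITS:
            self.banned_until[ip] = now + BAN_DURATION
            return False
        return True
```

A hammering bot trips the threshold and gets refused for half an hour, while every retry during the ban pushes the expiry back out; well-behaved IPs are never touched.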