Automated Target Acquisition

by r00tNinjas: 0rbytal & blerbl (0rbytal@burntmail.com & theblerbl@gmail.com)

"Invincibility lies in the defense; the possibility of victory in the attack." - Sun Tzu

Whether you are tired of hackers messing with your server (defense), you've got mad hacking skills and no targets (offense), or perhaps both, this article should interest you.  I will briefly explain a brilliant system set up by my friend blerbl because he's a technohacker genius who doesn't like to write, and I'm a fairly decent writer who thought my 2600 brethren would love to replicate his defensive web server configuration.  But first, the standard disclaimer:

This information is strictly for educational purposes.  You should not try this outside of your own personally owned and operated test network.  Any consequences resulting from your application of the knowledge shared in this article are your own fault.  Do not try this at home.

blerbl runs his own web server, mostly as a front lobby to host various files he wants to access from any remote location with Internet access.  As any web administrator who monitors their server will notice, the number of automated scans occurring across the Internet is prodigious.  He doesn't mind being scanned, but he'd prefer that the scanners not launch remote file inclusion (RFI) attacks to enlist his server in a botnet.

Like most savvy web administrators, blerbl uses a robots.txt file on his server to politely ask the courteous web crawlers to refrain from searching or indexing specific directories.  Of course, blerbl also knows that cunning hackers look for robots.txt files on web servers because they often contain file paths that are much more interesting than what is published on the server.  With this in mind, blerbl makes sure to include in his robots.txt file paths to tantalizing pages like /myadmin.php as a sort of "honeypot" for the nefarious hackers and inconsiderate web crawlers.  Rather than copying the honeypot page a thousand times and renaming it for every permutation of myadmin.php, blerbl uses the mod_rewrite engine of Apache to accomplish his goal.
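
To give a sense of the bait, a robots.txt along these lines would do the job.  Only /myadmin.php is taken from blerbl's setup; the other entries are hypothetical examples of similarly tempting paths:

User-agent: *
Disallow: /myadmin.php
Disallow: /phpmyadmin/
Disallow: /backup/
Disallow: /private/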

When a user requests /myadmin.php (or any other path on the trap list) on his website, the request is transparently rewritten into a request for the trap page, and the trap page adds the user's IP address to a special log file.  blerbl then added a rule at the top of his Apache configuration that compares every incoming request's IP address with the addresses filed in that special log; if the requestor has ever previously accessed the trap page, the server simply answers with a Forbidden error page.
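
Both lists are plain text files that Apache reads as RewriteMaps: one entry per line, a key and a value separated by whitespace.  The key is either a request path (minus its leading slash) or an IP address, and the value "black" marks it as bad; anything not listed falls through to the "white" default used in the rules below.  The contents here are only a sketch: myadmin.php is the path named above, while the other paths and the documentation-range IP addresses are made-up examples.

/etc/security/bad_requests:

myadmin.php	black
phpmyadmin/index.php	black
wp-login.php	black

/etc/security/blacklistip (appended to automatically by trap.php):

203.0.113.7	black
198.51.100.23	black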

This routine prevents malicious users from accessing his server from that IP address, as was blerbl's intent, but this method isn't just an effective defensive measure... remember that when the user is blacklisted, his IP address is logged by the server in the special log file.  Most casual Internet users will only browse the pages that are linked, and have no interest in a robots.txt file, or any page listed in it.  Who has any interest in browsing pages and files listed in the robots.txt file?  Hackers.

The special log file containing the blacklisted IP addresses can now be used as a targeting list!  Clever and careful hackers won't hack directly from their own IP address... they use somebody else's.  So, the blacklisted IP addresses likely belong to either: (A) noobs who don't really know what they're doing, (B) script kiddies who disregard stealth, or (C) compromised systems.
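
As a rough sketch of turning that log into a target list, assuming the one-entry-per-line format that trap.php (below) writes, a few lines of PHP will print each offending address exactly once, ready to feed to whatever scanner you prefer:

<?php
    // extract_targets.php - sketch only: read the blacklist that trap.php
    // appends to and print each unique IP address on its own line.
    $lines = file("/etc/security/blacklistip",
                  FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
    if ($lines === false) {
        exit("no blacklist yet\n");
    }
    $targets = array();
    foreach ($lines as $line) {
        // each entry looks like "203.0.113.7<tab>black"
        $parts = preg_split('/\s+/', trim($line));
        $targets[$parts[0]] = true;   // key on the IP to deduplicate
    }
    foreach (array_keys($targets) as $ip) {
        echo $ip."\n";
    }
?>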

Regardless of the type of user that scanned the web server, the admin can now scan the scanner with a fair probability of gaining access (if the admin has the time/interest).  It's kind of like being an active agent of karma, teaching hackers the golden rule through the most effective (and often merciless) teacher: experience.

Another benefit of this defense implementation is that the web admin can add rules to discriminate based on the user agents that script kiddies often use (a sketch of such a rule appears after the security.conf listing), or on any other screening parameter.  Plus, because Apache re-reads a txt: RewriteMap whenever the map file's modification time changes, the blacklist can be modified or manually updated without having to restart the server.  The customization possibilities are endless.  Below is the code for the auto-blacklisting files you can use to defend your web server, or to automate your target acquisition.

Hack All The Things!

security.conf:

# Include the desired site's conf file here
RewriteEngine On
# (Apache 2.2-style rewrite debug logging; uncomment when troubleshooting)
#RewriteLog rwlog.log
#RewriteLogLevel 5
## BLACKLIST IPS ##
# Map of offending IPs; trap.php appends to this file
RewriteMap ipslist txt:/etc/security/blacklistip
# Capture the client address into %1, look it up in the map,
# and answer anyone marked "black" with a 403 Forbidden
RewriteCond %{REMOTE_ADDR} ^(.*)$
RewriteCond ${ipslist:%1|white} ^black$ [NC]
RewriteRule (.*) - [F]
## TRAP REQUESTS ##
# Map of bait paths (the ones advertised in robots.txt)
RewriteMap reqlist txt:/etc/security/bad_requests
# Strip the leading slash, look the path up, and serve the trap
# page for anything marked "black"
RewriteCond %{REQUEST_URI} ^/*(\S+)/*$ [NC]
RewriteCond ${reqlist:%1|white} ^black$ [NC]
RewriteRule (.*) "/trap.php" [L]
## RFI Prevention ##
# Proxy-style requests whose request line carries a full
# http(s)/ftp(s) URL land on the trap page too
RewriteCond %{THE_REQUEST} GET\ ((http)|(ftp))(://|s://)+.*
RewriteRule (.*) "/trap.php" [L]
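
As mentioned earlier, the same machinery can screen on the User-Agent header.  Here is a minimal sketch of a rule pair that could be appended to security.conf; the agent strings are arbitrary examples of common scanning tools, not part of blerbl's actual setup:

## USER AGENT SCREEN (example) ##
# Send requests from a few well-known scanning tools straight to the trap
RewriteCond %{HTTP_USER_AGENT} (sqlmap|nikto|masscan|nmap) [NC]
RewriteRule (.*) "/trap.php" [L]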

trap.php:

<html>
	<head><title> Oh my </title></head>
	<body>
	<center><p>Now what ???</p></center>
	</body>
<?php
    // Append the visitor's IP to the blacklist that security.conf reads.
    // Both files must be writable by the web server user.
    $bl_filename = "/etc/security/blacklistip";
    $f = fopen($bl_filename, 'a');
    $msg = $_SERVER['REMOTE_ADDR']."\tblack\n";
    fwrite($f, $msg);
    fclose($f);

    // Keep a separate log of when each offender tripped the trap.
    $bl_filename = "/etc/security/offenses.log";
    $f = fopen($bl_filename, 'a');
    $msg = $_SERVER['REQUEST_TIME']."\t".$_SERVER['REMOTE_ADDR']."\n";
    fwrite($f, $msg);
    fclose($f);
?></html>
