Gone Fishin'

by dcole

Recently, I decided to start learning some server-side programming for a project that I had in mind.

Having used JavaScript on the client side in the past, I chose Node.js as my server-side runtime due to my familiarity with the language.  If you are not familiar with Node.js, it is a JavaScript runtime built on Chrome's V8 JavaScript engine that runs on the command line of pretty much any operating system.  My project was to consist of a web page where people could post a message.  This message would then be sent by a client-side script to a web-facing CouchDB server.  Node.js was used at this point solely to serve the web page itself.

While starting the project, I initially wrote a short server-side test script that would respond to incoming requests from the web and display those requests on the console.  After testing the script on my local network, I opened up a port on the router.  This allowed me to successfully test the server from outside my local network.  As it was late in the evening at this point, I forgot to exit my server process and close the port on the router before heading to bed.

The next evening after work when I got back to programming, I found various requests on the terminal that I had not made myself.  Noticing I had left my router port open to the wild, it dawned on me that these requests came from the outside world.  This must be all those malicious hackers and script kiddies I hear about, trying to gain access to my server to wreak havoc!  Being a naturally curious fellow, this situation gave me the brilliant idea of going fishing for requests - basically, setting up a honeypot.  For my fishing lure, I used the following code saved as fishing.js:

const http = require('http');
const requestIp = require('request-ip');

http.createServer((req, res) => {
  // Consume the request body so the 'end' event will fire.
  req.on('data', () => {});
  req.on('end', () => {
    // Log the client IP (blue) and method (green) with ANSI colors, then the URL.
    const ip = requestIp.getClientIp(req);
    console.log(`\u001b[34m${ip}\u001b[0m:\u001b[32m${req.method}\u001b[0m ${req.url}`);
    // Answer every request with a 404 so we give nothing away.
    res.statusCode = 404;
    res.end();
  });
  req.on('error', (e) => { console.error(e); });
}).listen(8080);

The above code accepts an incoming request, pleasantly displays it on the console, and responds to the request with a 404 (Not Found) status code.  I then ran this code for approximately three days using the following commands in a Bash shell on my server:

$ npm install request-ip --save # the --save flag is optional
$ node fishing.js > fishing.log &
$ disown

Asking the server process to run in the background with the & symbol and then running disown allowed me to end my SSH session with my server while keeping the fishing program running.  After the three days were over, I logged back into my server and ran the following commands to terminate the process:

$ ps aux | grep node
$ kill <pid>

After killing my node process, I opened up the fishing.log file and proceeded to extract some interesting information.

I received 192 requests in total during the three days I let this fishing lure run.  Of these, 174 were GET requests.  In total, 114 unique IPs made requests, with one IP alone sending 36 requests.  Most IPs sent between one and nine requests, though.  The most requested URL was /, a root request.
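For the curious, tallying a log like this takes only a few lines of JavaScript.  Here is a sketch - the summarize function and the sample lines are invented for illustration - that strips the ANSI colors and counts requests, methods, IPs, and URLs:

```javascript
// Summarize a log in the format fishing.js emits.
// Each line looks like: <IP>:<METHOD> <URL>, wrapped in ANSI color codes.
function summarize(log) {
  // Strip the ANSI color escape sequences first.
  const clean = log.replace(/\u001b\[\d+m/g, '');
  const stats = { total: 0, methods: {}, ips: {}, urls: {} };
  for (const line of clean.split('\n')) {
    // Greedy match so an IP containing colons still parses correctly.
    const match = line.match(/^(.+):([A-Z]+) (.*)$/);
    if (!match) continue;
    const [, ip, method, url] = match;
    stats.total += 1;
    stats.methods[method] = (stats.methods[method] || 0) + 1;
    stats.ips[ip] = (stats.ips[ip] || 0) + 1;
    stats.urls[url] = (stats.urls[url] || 0) + 1;
  }
  return stats;
}

// Two made-up log lines in the same format fishing.js writes:
const sample =
  '\u001b[34m203.0.113.5\u001b[0m:\u001b[32mGET\u001b[0m /\n' +
  '\u001b[34m203.0.113.5\u001b[0m:\u001b[32mGET\u001b[0m /currentsetting.htm\n';
const stats = summarize(sample);
console.log(stats.total, Object.keys(stats.ips).length); // prints: 2 1
```

Reading fishing.log with fs.readFileSync and feeding it to a function like this would spit out the kind of numbers above.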

The next most requested URLs were as follows:

/currentsetting.htm
/vendor/phpunit/phpunit/src/Util/PHP/eval-stdin.php
/?a=fetch&content=<php>die(@md5(HelloThinkCMF))</php>
/?XDEBUG_SESSION_START=phpstorm

Now, I understand this information is nothing new to administrators of web servers, but it was new to me.  These types of requests have shown up in request logs for decades now, and I gather the requested URLs change over time as new vulnerabilities are found.  A little Internet research turned up that /currentsetting.htm is likely a probe for a Netgear router exploit, while the other three requests above all target PHP exploits (though I didn't need the Internet to tell me that since PHP is right in the request!).

The initial shock of seeing these requests was quickly overcome by the realization that most of them aim at fairly specific exploits.  If you are not running PHP, no worries.  If you don't have a Netgear router, no worries.  Being aware of these attacks allows one to stay on guard and keep tabs on the latest potential vulnerabilities.

What did I learn from this situation?  I learned that it is important to pay attention to how you plan your web-facing programs.  Initially, I was programming my web server without security in mind due to my lack of experience with this type of programming.  My CouchDB server was open to the wild, and my web server had no filtering for incoming requests.  After stumbling onto these requests, I changed my methods and started programming with security in mind.

I moved my CouchDB behind the web server code and started filtering for valid requests while denying the rest.  The client-side script now sends the POSTed messages to the server, and the server handles posting the messages to CouchDB.  With the CouchDB server open to the wild, anyone and their dog with some curl skills could have deleted all my databases or injected a bunch of useless information into them.  That would have been a kick to the nuts if I had gone live with the initial setup.

What I hope other people who are new to web programming take away from this article is that your initial ideas and excitement may lead you to program in a way that leaves your data or servers vulnerable to exploits.  Take some time at the beginning for a little research and think of your project as a whole in terms of its security needs and vulnerabilities.  This may help reduce the number of iterations required to produce a project as well as reduce the potential failure points.

Have fun and code on!
