The Hunt For Red October – The Job So Far
Today, Kaspersky Labs released a report on a long-running advanced persistent threat* (APT) they had uncovered, revealing a cyber-espionage campaign targeting a broad and diverse mixture of countries and sectors. As usual, fingers were pointed at China (Chinese exploit chains, Chinese hosts used…); however, there was also some evidence to implicate Russian involvement, which was speculated to be a “False Flag” attempt.
An associate of mine, after reading the report, came up with a SHODAN dork rather quickly to identify the C&C hosts.
After a few seconds, he realized that the etag header on all of them was the same, leading to the following query:
So, fingerprinting information: just check for ETag = 8c0bf6-ba-4b975a53906e4
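A minimal sketch of applying that fingerprint in Python (my own throwaway code, not anything from the report; the matching logic is split out so the known value is easy to swap):

```python
# Sketch: fetch a host's headers and compare its ETag against the known
# Red October fingerprint. Matching is a separate function so it can be
# tested without touching the network.
import http.client

REDOCTOBER_ETAG = "8c0bf6-ba-4b975a53906e4"

def is_redoctober_etag(etag):
    """True if the (possibly quoted) ETag value matches the fingerprint."""
    return etag.strip('"') == REDOCTOBER_ETAG

def check_host(host, port=80, timeout=10):
    """GET / on host and report whether its ETag matches the fingerprint."""
    conn = http.client.HTTPConnection(host, port, timeout=timeout)
    try:
        conn.request("GET", "/")
        return is_redoctober_etag(conn.getresponse().getheader("ETag", ""))
    finally:
        conn.close()
```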
The “offending IPs” are as follows; it appears they are used as proxies.
So, we now have a list of 7 C&C hosts. Time to break out nmap and see what they are doing.
The following scan string was used for an initial scan of all the hosts.
sudo nmap -sSUV -A -O -vvv -oA huntingredoctober 184.108.40.206 220.127.116.11 18.104.22.168 22.214.171.124 126.96.36.199 188.8.131.52 184.108.40.206
The tarball of report files is available here: huntingredoctober.tar
The hosts identified as alive are as follows:
The other four were not responsive, probably taken down already. No fun.
Once I had identified which hosts were, in fact, still alive (while the rest of the bloody slow scan was running), I decided to see what lay behind the scenes on these hosts by doing the “daft” thing of connecting to port 80 with my web browser. The clench factor was rather intense, as I half expected to be owned by about half a dozen super 0day exploits on crack while doing so. Instead, I was redirected harmlessly to the BBC.
The following HTML code was responsible for this redirect, which I thought was an incredibly clever way to hide their true purpose.
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0 Transitional//EN">
<HEAD>
<title>BBC – Homepage</title>
<meta http-equiv="REFRESH" content="0;url=http://www.bbc.com/"></HEAD>
Back to the nmap scan (it had FINALLY completed), the following was very interesting.
PORT STATE SERVICE VERSION
80/tcp open http?
|_http-title: BBC – Homepage
| http-methods: GET HEAD POST OPTIONS TRACE
| Potentially risky methods: TRACE
138/udp open|filtered netbios-dgm
520/udp filtered route
All of the servers looked like this: the same three ports – 80, 138, and 520 – open or filtered, with the rest closed. The 220.127.116.11 host began sending me RST packets midway through my scan, but regardless, the work went on. I decided to take a look at the web server using the information Kaspersky published.
Sending GET requests to the /cgi-bin/ms/check CGI script produced a 500 Internal Server Error, as did the other CGI scripts. This was interesting in that the error page told me to email email@example.com about it. I did so immediately, being a good netizen. Note the misspelling of example – “eaxample”.
Apparently the mail was delivered successfully, so I hope they reply soon with an explanation.
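For anyone wanting to reproduce those 500s, here is a rough sketch of the probing I did, assuming plain HTTP on port 80 (the two paths are the scripts mentioned above; the host is a placeholder):

```python
# Sketch: hit each known CGI path on a host and record the HTTP status.
# Assumption: the scripts answer over plain HTTP on the given port.
import http.client

CGI_PATHS = ["/cgi-bin/ms/check", "/cgi-bin/nt/th"]

def probe_cgi(host, port=80, paths=CGI_PATHS, timeout=10):
    """Return a {path: HTTP status} map; None for unreachable hosts."""
    results = {}
    for path in paths:
        conn = http.client.HTTPConnection(host, port, timeout=timeout)
        try:
            conn.request("GET", path)
            results[path] = conn.getresponse().status
        except OSError:
            results[path] = None  # connection refused / timed out
        finally:
            conn.close()
    return results
```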
On to more serious things, another analyst working with me uncovered another interesting thing.
He went and did the following:
printf "POST /cgi-bin/nt/th HTTP/1.1\r\nHost: 18.104.22.168\r\nContent-Length: 10000\r\n\r\n%s" `perl -e 'print "A"x(20000)'` | torsocks nc 22.214.171.124 80
Now, he had figured out that the page would return a 500 unless a Content-Length was set. So he set a long Content-Length and sent an even longer POST body.
The result was nothing short of fascinating.
HTTP/1.1 200 OK
Date: Mon, 14 Jan 2013 19:18:07 GMT
HTTP/1.1 414 Request-URI Too Large
Date: Mon, 14 Jan 2013 19:18:08 GMT
Content-Type: text/html; charset=iso-8859-1
<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
<title>414 Request-URI Too Large</title>
<h1>Request-URI Too Large</h1>
<p>The requested URL's length exceeds the capacity
limit for this server.<br />
“Look mom! Two headers!” Seriously, this is interesting. First it gives a 200 OK, then a second later something else says “LOL, NO”. The delay makes us think the proxy is saying “OK”, and then the real C&C is complaining. The fact that it complains about the request URI, when the excess length was in the POST body, makes me think the final data-to-C&C exchange might be sent as a GET. Just a theory.
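The same probe can be reproduced in Python over a raw socket, which makes it easy to read both replies (the proxy's 200 and the back-end's 414) off a single connection. This is my own sketch, not the original one-liner, and unlike it, it is not routed through Tor:

```python
# Sketch: build a POST whose body is twice as long as its Content-Length
# claims (mirroring the nc command above), then dump everything the server
# writes back so both responses are visible.
import socket

def build_probe(host, path="/cgi-bin/nt/th", claimed_len=10000, body_len=20000):
    """Build the malformed POST request as raw bytes."""
    headers = (
        "POST %s HTTP/1.1\r\n"
        "Host: %s\r\n"
        "Content-Length: %d\r\n\r\n" % (path, host, claimed_len)
    )
    return headers.encode() + b"A" * body_len

def send_probe(host, port=80, timeout=15):
    """Send the probe and return the raw bytes of every reply received."""
    with socket.create_connection((host, port), timeout=timeout) as s:
        s.sendall(build_probe(host))
        chunks = []
        try:
            while True:
                data = s.recv(4096)
                if not data:
                    break
                chunks.append(data)
        except socket.timeout:
            pass  # server went quiet; return what we have
        return b"".join(chunks)
```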
— TO BE CONTINUED —
// This post suffered a bad case of myself and fellow researchers having a lulz about it, and my cat straying onto my keyboard. It is a work in progress.
— Continuing the hunt —
Today we retrieved new intelligence (or rather, last night, but I could not act on it until now) from HD Moore of the Metasploit project that more C&C servers had been located. The following link is the list of IP addresses.
So, it was decided (once the cat had gotten the hell off my keyboard) to investigate this list.
@craiu provided us with a tip: check out “1c824e-ba-4bcd8c8b36340” and “186-1333538825000” too. We will act on this later.
I decided, seeing as my internet went down for a while, to test out my Python skillz and whipped up a quick program I named “SONAR”, which simply attempts a TCP connection to port 80 on each suspected C&C server and logs the responsive ones to a file. Source code attached.
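For those following along, a minimal sketch of the idea (the names and file layout here are mine, not necessarily those of the attached source):

```python
# Sketch of SONAR's core: a TCP connect check against port 80, with the
# responsive hosts logged one per line.
import socket

def is_alive(host, port=80, timeout=5):
    """True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def sonar(hosts, logfile="responsive.txt"):
    """Check each host and write the responsive ones to logfile."""
    alive = [h for h in hosts if is_alive(h)]
    with open(logfile, "w") as f:
        for h in alive:
            f.write(h + "\n")
    return alive
```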
I could have used nmap, but that would have been unimaginative and, frankly, no fun. And who says hunting cyber-spies (so much worse than normal spies, ‘cos they got the dreaded CYBER in there) is not supposed to be bloody fun anyway? Not me, for certain!
We quickly reduced the list to “a lot less than we had”, and I queued them up for nmap scanning, which has yet to be done, as the sysadmins on the network I am using do not like it when I portscan things, for some odd reason. Or when I use SSH, or email.
Anyway, I digress.
So far, more C&C servers have been identified, and more fingerprinting methods have been developed. I am considering writing a patch to sonar.py to dump out the ETag data along with the working IPs, but that can wait until later. A simple HTTP GET / should do the trick, with a few regexes.
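The patch might look something like this – a sketch only, with the second fingerprint being the one @craiu supplied:

```python
# Sketch of the proposed sonar.py patch: once a host answers, send a plain
# HTTP GET / and pull the ETag out of the headers with a regex, so live IPs
# can be matched against the known fingerprints.
import re
import socket

KNOWN_ETAGS = {"8c0bf6-ba-4b975a53906e4", "1c824e-ba-4bcd8c8b36340"}
ETAG_RE = re.compile(rb'ETag:\s*"?([^"\r\n]+)"?', re.IGNORECASE)

def grab_etag(host, port=80, timeout=5):
    """Return the ETag from a GET / response, or None if absent/unreachable."""
    req = ("GET / HTTP/1.1\r\nHost: %s\r\nConnection: close\r\n\r\n" % host).encode()
    data = b""
    try:
        with socket.create_connection((host, port), timeout=timeout) as s:
            s.sendall(req)
            while b"\r\n\r\n" not in data:  # read until headers are complete
                chunk = s.recv(4096)
                if not chunk:
                    break
                data += chunk
    except OSError:
        return None
    m = ETAG_RE.search(data)
    return m.group(1).decode() if m else None
```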
We also obtained a list of MD5 hashes from malware.lu showing the Red October samples they have in their repo – see here -> http://pastebin.com/7zayMrKt – so those were queued up for downloading (once on a network not monitored by the college) for some analysis using IDA. That is tonight’s job: a quick and dirty first-pass run at analysing these things.
* For the record, I think APT is another FUD term… But oh well, it has become “a thing”.