Sunday, January 17, 2010

Analysis Stuff

Metadata
Didier has posted new versions of his PDFiD and pdf-parser tools. Didier's offerings really kind of run the gamut, don't they? Well, hey...it's all good stuff! I mean, really, other than the fact that he's updated these really great tools, what else needs to be said?

Malware
The MMPC posted some new malware descriptions recently, regarding Hamweq and Rimecud. Nice names. Signatures have been added to MRT, apparently.

An interesting aspect of Hamweq is that it apparently drops files in the Recycle Bin and uses the Installed Components key in the Registry as a persistence mechanism. I wrote up a quick and dirty plugin for the key...entries with StubPath values that point to the Recycle Bin would be suspicious, even if they didn't specifically indicate that Hamweq was installed.
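
If you want to eyeball that sort of thing on a live box, here's a rough Python sketch of the same idea. To be clear, this is NOT the plugin (RegRipper plugins are Perl and run against hive files pulled from an image)...it just walks the Installed Components key via the standard winreg module and flags any StubPath data that points into the Recycle Bin. The folder names it looks for are my own assumptions covering older and newer versions of Windows:

    # Live-system sketch (Windows only): flag Installed Components entries
    # whose StubPath points into the Recycle Bin. Not the RegRipper plugin
    # mentioned above...just the same idea, run against the live Registry.
    import winreg

    KEY_PATH = r"SOFTWARE\Microsoft\Active Setup\Installed Components"
    SUSPECT = ("recycler", "recycled", "$recycle.bin")  # assumed folder names

    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH) as root:
        for i in range(winreg.QueryInfoKey(root)[0]):
            name = winreg.EnumKey(root, i)
            with winreg.OpenKey(root, name) as sub:
                try:
                    stub, _ = winreg.QueryValueEx(sub, "StubPath")
                except OSError:
                    continue  # no StubPath value in this subkey
                if any(s in stub.lower() for s in SUSPECT):
                    print(f"[SUSPICIOUS] {name}: {stub}")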

Other malware has other artifacts and persistence mechanisms. Take Win32.Nekat as an example...this one adds entries to the user's Control Panel\don't load key, effectively removing those applets from view. While not overly sophisticated (I mean, it is native functionality...), something like this would be enough to slow down most users and many admins. And yes, there's an app...er, a plugin...for that (actually, it was pretty trivial to write...one of several that I wrote yesterday).
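
Along the same lines, here's a rough live-system Python sketch of the check that plugin performs (the actual plugin runs against the NTUSER.DAT hive); it just lists whatever has been stuffed under the current user's "don't load" key:

    # Live-system sketch (Windows only): list Control Panel applets hidden
    # via the user's "don't load" key.
    import winreg

    KEY_PATH = r"Control Panel\don't load"

    try:
        with winreg.OpenKey(winreg.HKEY_CURRENT_USER, KEY_PATH) as key:
            num_values = winreg.QueryInfoKey(key)[1]
            if num_values == 0:
                print("No applets are being hidden.")
            for i in range(num_values):
                name, data, _ = winreg.EnumValue(key, i)
                print(f"Hidden applet: {name} (data: {data!r})")
    except OSError:
        print("No \"don't load\" key present for this user.")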

APT
With the Google thing, there's been considerable discussion of the advanced persistent threat, or APT, lately. I'm not going to jump on that bandwagon, as there are a lot of folks smarter than me talking about it, and that's a good thing. Even Hogfly has talked about APT.

I get the "threat" thing, but what I'm not seeing discussed is the "advanced" part. Wendi over at Mandiant blogged about M-Trends and APT, noting some...uhm...trends, such as outbound connections and mechanisms used to avoid anomaly detection. One of the mechanisms listed is "service persistence", which sounds like the malware being installed as a Windows service. While I do think it's a good idea to talk about this kind of thing, what I'm not seeing a lot of right now is actionable intel. Wendi and Hogfly presented some very useful information, demonstrating that all of this talk still comes down to a few basic questions: Have I been breached, and am I infected? How do I find out? What do I do if I am? How do I protect myself? So someone looks at both posts and uses the information there to see if they've been breached. If they don't find anything, does that mean they're clean? No, not at all...it means they didn't find what they searched for, and that's it. Both posts presented information that someone can use to scour systems, but is that all that's really available?

I think that a very important concept to keep in mind when doing this kind of analysis is what Pete Silberman said about malware; he was absolutely correct when he described it as having the least frequency of occurrence on a system. Think about it. Malware, particularly worms, no longer wants to keep infecting a system over and over again, so it uses a single, unique mutex to say, "hey, I'm infectin' here!" That way, the system doesn't get so massively infected that it stops functioning; not only would that alert folks that something's wrong, but it would also deprive the attacker of the use of the system. So, you can run handle.exe from MS on a live system, and then run the output through handle.pl to see mutants listed by least frequency of occurrence.
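
For illustration only...this isn't handle.pl, just a quick Python stand-in for the same least-frequency-of-occurrence idea. It assumes you've redirected the handle.exe output to a file, and that the lines listing mutants contain the word "Mutant" with the object name as the last whitespace-delimited token:

    # Tally Mutant object names from saved handle.exe output and print
    # them least-frequent first.
    import sys
    from collections import Counter

    counts = Counter()
    with open(sys.argv[1], errors="ignore") as f:
        for line in f:
            if "Mutant" in line:
                counts[line.split()[-1]] += 1

    for name, count in sorted(counts.items(), key=lambda kv: kv[1]):
        print(f"{count:4d}  {name}")

Run it with something like "python mutant_tally.py handle_output.txt", and the mutants that appear only once end up at the top of the list.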

I'm going to throw this out there...run it up the flagpole and see who salutes, as it were...but I think that the same sort of thing applies to intrusions and breaches, as well. For example, Windows systems have thousands of files, and intruders may install some tools to assist them in persistence and propagation, but the fact is that there are a number of native tools that are perfect for what most intruders want to do. I mean, why install a tool to locate systems on the network when you can use native tools (ipconfig, netstat, nbtstat, etc.)? So intruders don't have to install or add anything to the compromised system, unless the required functionality isn't available in a native utility. Remember what Wendi said in her M-Trends post about using services for persistence? How many services do you think there are on a system? Do intruders install their persistence mechanisms 50 or 100 times? No...likely, they only do it once. And they can either hide as svchost.exe, pointed at an executable in another location, or beneath the legit svchost.exe, using the ServiceDll entry (and adding an entry to the appropriate SvcHost key value in the Software hive).
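
To make that a bit more concrete, here's a rough Python sketch (my own illustration, using the standard winreg module on a live system) of one way to eyeball svchost-based persistence: for each service whose ImagePath points to svchost.exe, pull the ServiceDll value and note whether the service name even appears in one of the Svchost group values in the Software hive. It's not a catch-all...an intruder can always add their service name to a group...but it shows the kind of check I'm talking about:

    # Live-system sketch (Windows only): list svchost-hosted services, their
    # ServiceDll entries, and whether each service name appears in a Svchost
    # group value in the Software hive.
    import winreg

    SERVICES = r"SYSTEM\CurrentControlSet\Services"
    SVCHOST = r"SOFTWARE\Microsoft\Windows NT\CurrentVersion\Svchost"

    # Collect the service names that belong to a Svchost group.
    group_members = set()
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, SVCHOST) as key:
        for i in range(winreg.QueryInfoKey(key)[1]):
            _, data, vtype = winreg.EnumValue(key, i)
            if vtype == winreg.REG_MULTI_SZ:
                group_members.update(s.lower() for s in data)

    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, SERVICES) as services:
        for i in range(winreg.QueryInfoKey(services)[0]):
            name = winreg.EnumKey(services, i)
            try:
                with winreg.OpenKey(services, name) as svc:
                    image, _ = winreg.QueryValueEx(svc, "ImagePath")
                    if "svchost" not in image.lower():
                        continue
                    with winreg.OpenKey(svc, "Parameters") as params:
                        dll, _ = winreg.QueryValueEx(params, "ServiceDll")
            except OSError:
                continue  # no ImagePath, or no Parameters\ServiceDll
            note = "" if name.lower() in group_members else "  <-- not in any Svchost group"
            print(f"{name}: {dll}{note}")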

Timeline Analysis
To illustrate my point about least frequency of occurrence, let's talk briefly about timeline analysis. Given the minimalist nature of malware and intrusions, how can we use timeline analysis to our advantage? The approach I've been advocating allows the analyst to see multiple events, from many sources, side-by-side for comparison and analysis.
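
To make the "side-by-side" part a little more concrete, here's a minimal Python sketch that merges several event files into a single timeline sorted by time. I'm assuming a simple pipe-delimited format of "epoch_time|source|host|user|description", one event per line...the specific format matters far less than the fact that events from the file system, the Event Logs, the Registry, and so on end up interleaved:

    # Merge pipe-delimited event files (assumed format:
    # epoch_time|source|host|user|description) into one sorted timeline.
    import sys
    import time

    events = []
    for path in sys.argv[1:]:
        with open(path, errors="ignore") as f:
            for line in f:
                line = line.strip()
                if "|" not in line:
                    continue
                epoch, rest = line.split("|", 1)
                events.append((int(epoch), rest))

    for epoch, rest in sorted(events):
        stamp = time.strftime("%Y-%m-%d %H:%M:%S", time.gmtime(epoch))
        print(f"{stamp}Z  {rest}")

Something like "python merge_events.py fs_events.txt evt_events.txt reg_events.txt > timeline.txt" gives you a single view across all of your sources.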

One of the things that folks ask about with respect to timeline analysis is a graphical means of representing all of the data, in order to assist the analyst. IMHO, this simply does NOT work! I have yet to find a viable means of taking in all of the data from various sources, throwing it into a timeline, and using a graphical representation to pick out the anomalies that happen least often. As an example, check out Geoff Black's CEIC 2007 Timeline Analysis presentation...the fourth slide in the pack illustrates a graphical file system metadata timeline within EnCase. I agree with Geoff's assessment...it's not particularly effective.

Overall, in order to get a grip on APT, we as responders and analysts need to change our mindset. We need to understand that we can't keep looking for spikes in behavior or activity, and that there is no "Find All Evidence" button. When you respond to or analyze a single Windows system, and consider the variations in OS versions, the applications that can be installed, and the infrastructure itself, what constitutes an anomaly? Honestly, I don't think that this is something a software package can tell you.

I do, however, firmly believe that training and education are the key.

1 comment:

Darren said...

Re Timelines;

I'm debating with myself whether to go down the path of writing a script to make multiple entries for MACE, etc., or stick with my current method of manually looking through the data and highlighting items of interest, then putting those items into a database.

Manually highlighting takes time, but it reduces the data to something manageable. A script would speed things up, but it would end up producing too much data.

I think I'll stick with the manual way for now! (and yes, I'm in the process of writing a paper about the process I use for timeline analysis)

regards,
Darren_Q