SecureArtisan

My Road to Digital Forensics Excellence

Archive for the ‘State of Affairs’ Category

The malware made you do it?

Posted by Paul Bobby on June 25, 2012

I was having a conversation with a co-worker about various things and the topic of the trojan defense came up. He started chuckling. I asked why, and he dismissed it as the far-fetched final gasp of air muttered by the unfortunate soul facing the consequences of his actions. Well, perhaps not as dramatically stated as that, but you get the point. I told him I disagreed with that sentiment 100%, and proceeded to tell him why. Upon reflection I figured this is something to blog about and invite any comments in the event I’m completely off base with this.

TL;DR – You are not as in control of the content of your hard drive as you think you are, especially with respect to the content managed by your web browser.

TL;DR #2 – Any process that generates artifacts from computer-initiated actions as well as human-initiated actions is a candidate for the malware defense.

So when you think about the trojan defense, what comes to mind? Like my buddy, I bet most people who dismiss the trojan defense imagine some sort of malware that secretly makes use of your internet connection and downloads CP to a hidden folder, just waiting for law enforcement to discover it. Well, that’s not too unreasonable – maybe ransomware or extortionware designed specifically for financial gain – but that’s not what I’m thinking about. Or how about this: malware that plays solitaire, or does your homework, or watches movies, or is a cyber-yenta using your IM program? All this stuff in a corporate environment creates a case for mischarging – did the malware do it? Of course not. But again, that’s also not what comes to mind when I think of the malware defense, as it is not appropriate for those types of scenarios in which activity is generated by human-initiated actions.

Let me ask a couple of rhetorical questions. Do you believe it’s possible for malware to alter the contents of your hard drive? I would hope the answer is yes. Unless it’s some proof-of-concept or experimental code, most malware can and does manipulate your OS, which leads to manipulation of the hard drive. Do you accept the possibility that malware can introduce new content to your hard drive? Again, the answer should be yes. Thirdly, of the set of malware, is there a subset that is specifically designed to intercept, hijack or otherwise interfere with the normal web browsing process of your OS and its web browsers? The answer is yes: there is a group of malware designed to manipulate normal web browsing processes – for example, BHOs, ActiveX controls, Flash, even exploitation of the normal function of your browser by, say, opening up multiple tabs when you click ‘home’, tabs which may contain thumbnail pages of clickthrough content to objectionable websites.

Malware doesn’t even have to be on your computer. Imagine one day visiting CNN.com and discovering that a disgruntled employee decided to deface the website somehow. Your browser dutifully renders the requested content: it is delivered to you over the network, rendered in memory or virtual memory, and ultimately stored on your hard drive by the caching mechanism. Perhaps the web page is ‘too big’ to fit in your browser window and you would only see the content if you scrolled down to view it. Has that content already made it to your hard drive? Of course. There are accelerators and other browser add-ins that can cause all sorts of content to be transmitted over the network, rendered or processed in memory, and stored on your hard drive, giving the appearance of the end user specifically requesting such data.

So do you see where I’m going with this? It was at this point that my co-worker realized what I was hinting at. You are not as in control of the content of your hard drive as you think you are, especially with respect to the content managed by your web browser. The malware doesn’t have to specifically ‘go out and get badness’, no, on the contrary, by interfering with your web browsing process, malware can cause all sorts of content to inadvertently be deposited on to your hard drive.

Have you ever executed Sysinternals’ Process Monitor and watched all the activity go by when the OS is supposedly ‘idle’? It’s mind-boggling the amount of stuff that goes on. Put web browsing on top of that and you get what is, in my opinion, the ‘noisiest’ thing you can do with an operating system. Data is read from and written to network sockets, to memory and to the hard drive. Even the OS can cause web browsing traffic to occur – really gumming up the works when it comes to discerning human-initiated versus computer-initiated traffic. And all the while, malware designed to interfere with normal web browsing processes is generating its own traffic.

Let’s take an example from the corporate world. Imagine an employee walks by your cube and sees porn on your screen. They contact the ethics officer, who subsequently opens a case. Technical assistance is requested to discover any evidence that can substantiate the allegation, and the analyst finds pornography in the Temporary Internet Files area of the hard drive. Is the employee fired just because it’s there? Good lord, I hope not; in fact the analyst shouldn’t even submit a report based solely on the presence of content (note this might raise your hackles, but I’ve read/observed cases from both corporate and law enforcement in which this is exactly what happens). Rather, the analyst needs to provide a narrative describing how the content got there in the first place. Perhaps a mistyped search term or URL? Perhaps a compromised web advertisement placeholder? (You’ll need your manual reconstruction skills for that one, and the next.) Perhaps a compromised website in general? Perhaps malware on the local computer? Perhaps the individual really is seeking inappropriate content. The key to this investigation is accurately describing the actions that caused this content to be placed on the computer.

So, is the trojan defense actually plausible? You bet it is – for specific scenarios, of course, web browsing being the most common. Another scenario that comes to mind is P2P. So I would caution anyone against dismissing the malware defense too quickly simply because it sounds too fantastic or unrealistic.

Update: 7/7/12 http://www.net-security.org/malware_news.php?id=2177 (a real CP ransomware story. how about that)

Update2: 8/14 http://www.fbi.gov/news/stories/2012/august/new-internet-scam/new-internet-scam?utm_campaign=email-Immediate&utm_medium=email&utm_source=fbi-top-stories&utm_content=129647 (FBI alert regarding ransomware)

Posted in State of Affairs | Leave a Comment »

CEIC2012 Part 2

Posted by Paul Bobby on May 24, 2012

One of the best things about conferences is the social aspect: meeting people you only know online and getting reacquainted with those you’ve met previously. Spent some time chatting with Simon Key about the developer program, and with James Habben, my EnScript instructor. Finally met Geoff Black and Jon Stewart (you do have your Lightgrep beta, right?) (btw Jon, hopefully I catch you before the conference is over). Met plenty of names from Guidance (hey Joshua), folks who recognize my name from blogging (howdy Sgt Doug Collins), and names associated with vendors such as BlackBag, Clearwell etc.

Session 3: Anti-anti Forensics. So, ever encountered CCleaner usage or other system wipers/cleaners? Of course you have. This presentation focused on ‘what was executed’ and ‘when was it executed’. The hardest bit, of course: ‘what was deleted’? David Cowen (Hacking Exposed: Computer Forensics) has done some original research with the $LogFile. We’ve seen MFT records, INDX files and LNK files carved from the $LogFile, but those 4k record pages can contain much more. One part of the research he was willing to share is a portion of the record file that shows the before and after filenames when a file gets renamed (a typical function in system cleaners). There’s apparently plenty more research, but he’s hoping to present at Blackhat this year – watch his blog for more details. (He’ll post two tools called Splitter and SectionSearch.)
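To make the idea concrete, here’s a minimal, hypothetical sketch of the very first step of that kind of $LogFile analysis: finding the 4KB record pages by their standard “RCRD” signature and carving printable UTF-16LE runs that may be filenames. Interpreting the actual before/after rename records is Cowen’s research and isn’t reproduced here; the buffer below is synthetic, and the carving heuristic is my own assumption.

```python
# Hypothetical sketch: locate NTFS $LogFile 4KB record pages by their
# "RCRD" signature and pull out printable UTF-16LE runs that may be
# filenames. The page size and magic are standard NTFS; everything
# beyond that (decoding the rename before/after structures) is not
# attempted here.
import re

PAGE_SIZE = 4096
RCRD_MAGIC = b"RCRD"

def rcrd_page_offsets(data: bytes):
    """Yield byte offsets of pages that start with the RCRD signature."""
    for off in range(0, len(data) - len(RCRD_MAGIC) + 1, PAGE_SIZE):
        if data[off:off + 4] == RCRD_MAGIC:
            yield off

def utf16_strings(page: bytes, min_chars: int = 4):
    """Naive carve of printable UTF-16LE runs (candidate filenames)."""
    pattern = re.compile(b"(?:[\x20-\x7e]\x00){%d,}" % min_chars)
    return [m.group().decode("utf-16-le") for m in pattern.finditer(page)]

# Demo on a synthetic buffer standing in for a $LogFile extract.
buf = bytearray(PAGE_SIZE * 2)
buf[0:4] = RCRD_MAGIC
name = "secret.doc".encode("utf-16-le")
buf[64:64 + len(name)] = name

print(list(rcrd_page_offsets(bytes(buf))))    # [0]
print(utf16_strings(bytes(buf)[:PAGE_SIZE]))  # ['secret.doc']
```

A real parser would of course walk the record headers inside each page rather than string-carve, but the string pass is often how you first spot those rename artifacts.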

Session 4: What’s New in Windows Forensics? John Marsh presenting. A couple of things I didn’t know and need to research more fully. Microsoft Virtual Disks – never really played with them, but for testing purposes they look really cool. The transaction log, $TxF – anyone know if this has been parsed or the file structure documented anywhere? The ‘Virtual Store’ folder – if 32-bit apps are executed by non-admins, or 64-bit apps are executed that aren’t coded to ‘play well’, then their writes get redirected to a ‘virtual store’ folder structure, separated from the Program Files area. There’s also a ‘virtual store’ registry area – this is stored in the UsrClass.dat registry file, not in NTUser.dat. Something else I didn’t know. Gotta start looking there in general, and for testing purposes to see what gets written.

Session 5: Timelines with EnCase. Sgt Doug Collins, RCMP. Good presentation, and he called out my blog and a post on the Windows Reliability Monitor. Doug has created an EnScript to parse temporal data sources and feed them to a MySQL database he runs in a custom Linux VM. He’s included deduplication checks and a spreadsheet with a database query front end. Cool stuff. One source of temporal data I’d not considered before is Google Analytics cookies (almost every page uses them) – there are several timestamps in there associated with when the site was first visited, last visited etc. He also mentioned Google Chrome and how Chrome indexes every page you visit, storing that data locally – at least that’s how I heard him. I need to test that – but if that’s the case, good lord, that’s quite the treasure.
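As an illustration of why those cookies are interesting, here’s a hedged sketch that decodes the Unix timestamps embedded in a Google Analytics __utma value. The dotted layout (domain hash, visitor id, first visit, previous visit, current visit, session count) is the commonly documented format; the sample value below is invented.

```python
# Hedged sketch: decode the timestamps in a Google Analytics __utma
# cookie. Layout assumed (per common documentation):
#   domain-hash.visitor-id.first.previous.current.session-count
# where first/previous/current are Unix timestamps.
from datetime import datetime, timezone

def parse_utma(value: str) -> dict:
    parts = value.split(".")
    if len(parts) != 6:
        raise ValueError("unexpected __utma layout")
    ts = lambda s: datetime.fromtimestamp(int(s), tz=timezone.utc)
    return {
        "domain_hash": parts[0],
        "visitor_id": parts[1],
        "first_visit": ts(parts[2]),
        "previous_visit": ts(parts[3]),
        "current_visit": ts(parts[4]),
        "session_count": int(parts[5]),
    }

# Invented sample value for demonstration.
info = parse_utma("173272373.1091131424.1335600000.1337400000.1337500000.7")
print(info["first_visit"].isoformat())  # 2012-04-28T08:00:00+00:00
print(info["session_count"])            # 7
```

Three timestamps plus a visit counter from a single cookie is exactly the kind of temporal corroboration a timeline tool can feed on.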

Session 6: MMA Forensics Challenge. This is the session you live for – an opportunity to flex your 4n6 muscle and take on a class of forensicators. The challenge, contrived obviously, included memory analysis, dead box analysis, pcap analysis and timeline analysis. Points awarded based on answers to five groups of questions, and prizes for the first to solve each group. I won one of the groups, and took away the prized parrot 🙂 Yes there were signed copies of various 4n6 books to be won, but heck, that parrot spoke to me 🙂 Pieces of forensic8 (droll).

Posted in State of Affairs | 1 Comment »

My CEIC2012 Experience

Posted by Paul Bobby on May 23, 2012

Let me begin with the obligatory “I haven’t written in a while eh?”

Now that that is over with, and with some encouragement to continue posting, here we go.

I’m attending CEIC 2012 in Las Vegas, and with only two sessions left it’s time to post my thoughts on the conference. Two keynotes to attend, the first being the CEO of Guidance Software. His presentation focused on where we came from in the forensic world, through the 2000s, and a brief look into the future. It was also used to introduce the Guidance Software App Store, arriving in the Fall. This can only be a good thing. Forensics, hardware, techniques and everything else electronic are evolving at such a fast rate that any one company cannot keep up with it all – coders have been writing EnScripts for a while now, myself included, and no doubt we will continue to provide free EnScripts to the community. But allow a developer to be compensated for his/her work and you create a way for coders to spend serious time developing significant scripts and plugins for EnCase. Take a look at the Volume Shadow Copy problem – I’ve been waiting ages for Guidance to incorporate native support into EnCase; others have developed tools, so the solution is well understood, but while it likely appears on a future feature list, it is not a priority. An enterprising developer can take up the challenge and probably find many a shop willing to pay $s for it.

The second keynote came from General Richard Myers, retired chairman of the Joint Chiefs of Staff. While he retired from service in 2005, his points of view were definitely valid and offered clarity on the incident problem we are dealing with today. In his summary of the top five threats for today, cyber incidents were classified as number two. He also addressed the PRC, classifying them as highly aggressive when it comes to using cyberspace to gain intellectual property. He stressed that these incidents would not directly lead to military conflict; however, the persistence of the threat undermines whatever headway is being made via diplomacy, and an unintended conflict may occur because of this tension – for example, in another space, such as the South China Sea.

Okay, session time. Number 1: Manual Web Page Reconstruction. This was a 90-minute lab session, the purpose of which was to teach an approach to reassembling a web page from the artifacts present on the computer. Unfortunately we didn’t start the lab work until 70 minutes into the presentation; this alone was enough to rate the session as disappointing. However, one of the things I always gain from sessions is questions that require some research to answer. Here’s the problem I thought about during this session: if you see files called “1.jpg”, “1[1].jpg” and “1[2].jpg” in Temporary Internet Files, do you know what that means? That’s not the real question though. What I need to figure out is whether the web browsing mechanism (we’ll take Internet Explorer for example) is smart enough to know which of these JPGs to pull from the cache.

Let me state it this way, using IE, a user visits website1, website2 and website3. They all have a file called “1.jpg” that is loaded during the page render. The browser stores these JPGs in the cache, but the JPGs are completely different, only their filenames are the same. When the user visits one of the sites again, say, website2, and the image is loaded from the cache, is the correct one displayed? No idea, will have to test.
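While the cache-retrieval question needs testing, the naming convention itself is straightforward: when cached filenames collide, the browser decorates later copies with a bracketed counter. A small speculative helper like the one below can strip that decoration so an analyst can group colliding entries; mapping each copy back to its originating URL still requires the cache index (index.dat or the WebCache database), which this sketch deliberately does not parse.

```python
# Speculative sketch of the collision-naming convention described
# above: "1.jpg", "1[1].jpg", "1[2].jpg" are treated as three cached
# copies sharing the base name "1.jpg". Grouping by base name is an
# analyst convenience only; attribution to a URL needs the cache index.
import re
from collections import defaultdict

DECORATION = re.compile(r"^(?P<stem>.+?)\[(?P<n>\d+)\](?P<ext>\.[^.]+)?$")

def base_name(cached: str) -> str:
    """Strip the [n] collision decoration, if present."""
    m = DECORATION.match(cached)
    if m:
        return m.group("stem") + (m.group("ext") or "")
    return cached

def group_cache_entries(names):
    """Group cached filenames by their undecorated base name."""
    groups = defaultdict(list)
    for n in names:
        groups[base_name(n)].append(n)
    return dict(groups)

g = group_cache_entries(["1.jpg", "1[1].jpg", "1[2].jpg", "logo.png"])
print(g)
# {'1.jpg': ['1.jpg', '1[1].jpg', '1[2].jpg'], 'logo.png': ['logo.png']}
```

Seeing three distinct images under one base name is exactly the situation in the website1/website2/website3 scenario above.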

Session 2: Hunting for Unfriendly Easter Eggs. Two presenters from Deloitte walked us through “Cloppert’s Kill Chain”, modifying it slightly into a Kill Chain Life Cycle (btw this kill chain is to be credited to Amin and Hutchins as well). The life cycle modifies the chain by making the initial exploit/C2/exfil phase an external phase – that is to say, the first penetration into the network – followed by a cycle of internal phases that may repeat as often as needed while the attacker modifies the chain with new exploits/recon/C2 etc. The second part of the presentation built on the existing “indicators of compromise” proposed by Mandiant, with case studies of real incidents that led to 300+ IOCs for each stage in the chain. Good stuff.

More to come

Posted in State of Affairs | 1 Comment »

Criteria for an Effective Report

Posted by Paul Bobby on August 24, 2011

I work for a major defense contractor and have written many reports as the work product of ten years as a digital forensics practitioner. Have you looked at some of your own early reports? You may find poor use of language, incorrect conclusions, overreaching statements, inconsistent technical approaches and ambiguous data. While there is room in digital forensics analysis for 100% conclusive statements, the majority of statements you make are not conclusive, and learning what is and is not comes with experience.

I have supported security incidents, legal discovery and corporate investigations with digital forensics analysis. But more recently, my focus has been only on corporate investigations. Let me explain the difference. Security incidents are events that comprise network or computer intrusions, malware analysis, forensic deep-dives, root cause analysis, incident triage and damage assessment. Each sub-component of a security incident requires a unique approach to digital forensic analysis. For example, a triage typically requires assessing a large range of computing devices for evidence of compromise by analyzing registry indicators or file system indicators. Whereas a forensic deep-dive analyzes a specific device, already known to be compromised, in almost exhaustive detail: for example, to find evidence of exfiltration or to develop a complete timeline of the compromise. The work product of these analyses is formalized in a written report – the flavor, configuration, look-and-feel, whatever you want to call it, is very different from the type of report I would write, say, in support of a legal discovery or corporate investigation.

Corporate investigations are conducted by corporate officers (human resources, industrial security etc.) into an allegation of policy violation by an employee. A digital forensics analyst is engaged to support this investigation specifically to retrieve electronic data that may substantiate the allegation (and yes, we do look for exculpatory evidence also). The work product of this analysis is the final report: the narrative that discusses these findings. The format of this report is different from one I’d write about a security incident. The consumer of this report is typically non-technical, the authors, the digital forensics analysts, may have differing technical and rhetorical skills, and the technical data itself has changed over time.

Non-technical customers – when I talk about internet history and cache, one customer may understand the concept completely, another may not, so you write your report to the lowest common denominator. For example, a common misunderstanding about technical data is why none of it contains any information about the ‘duration’ of an activity: an employee visiting www[.]ebay.com is not important, but an employee spending 4 hours a day there is, and yet internet history doesn’t provide this data.
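One rough workaround for the missing ‘duration’ is to cluster visit timestamps into sessions using an idle cutoff and report each session’s span. This is a hedged sketch only: the 30-minute cutoff is an arbitrary assumption, the timestamps are invented, and a session span still says nothing about what the user was actually doing between visits.

```python
# Illustrative only: internet history records instants, not durations.
# We approximate "time spent" by grouping sorted visit timestamps into
# sessions separated by more than an idle cutoff (30 minutes here, a
# purely arbitrary analyst assumption).
from datetime import datetime, timedelta

IDLE_CUTOFF = timedelta(minutes=30)

def sessions(visits):
    """Group visit datetimes into [start, end] session spans."""
    spans = []
    for t in sorted(visits):
        if spans and t - spans[-1][1] <= IDLE_CUTOFF:
            spans[-1][1] = t          # extend the current session
        else:
            spans.append([t, t])      # start a new session
    return [(s, e) for s, e in spans]

d = datetime(2011, 8, 1, 9, 0)        # invented visit times
visits = [d, d + timedelta(minutes=5), d + timedelta(minutes=20),
          d + timedelta(hours=4)]
for start, end in sessions(visits):
    print(start.time(), "->", end.time(), "span:", end - start)
```

Even this crude grouping lets you tell a customer “roughly 20 minutes of continuous activity that morning” instead of handing them a bare list of timestamps.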

Technical data changing over time – storage of email in PSTs is a common issue. Employees store lots of email, so when providing 800MB of email to a customer, how do you present that effectively, analyze it, and provide an easy way for the customer to interact with that data?

Because of these factors, it is important that a consistent approach to report writing be adopted by a digital forensics analysis group. This consistent approach should include standard formatting, approved language and a common look and feel for various report elements. But before you can address these consistency items you should develop goals to be met by an effective report. Here are some suggestions:

Accurately reflect the technical investigation process.

While it is important that the analyst understand the allegation and take appropriate steps to discover technical data that may become evidence, documenting these steps in the final report is more critical. That way the customer can understand where you found data, why you went ‘there’ looking for data, and can compare these approaches with past investigations. This provides a teaching opportunity for our customers – they become more aware of our capabilities and limitations – and it also ensures that the forensic analyst follows consistent technical practices when analyzing data.

Understandable to decision makers

As I said earlier, there are few 100% conclusive statements that can be made in a report; the rest carry some degree of uncertainty. And that’s okay – the point of being understandable to decision makers is to make clear the reason for that uncertainty: to clarify why a particular set of electronic evidence may or may not substantiate an allegation.

Withstand a barrage of employee objections

Your analysis is complete, the report is written and handed off, and you move on to the next investigation. In the meantime your customer is interviewing the employee. The employee raises all sorts of objections to the technical data provided in the report. The customer, being non-technical, does not know how to rebut. Over the years I’ve heard many excuses for various pieces of technical evidence. For example: “Oh, I take my laptop home over the weekend, and that was my teenage son who used it to visit inappropriate websites.” Many of these excuses can be anticipated and specifically addressed within the final report. To continue the example, I could highlight specific inappropriate websites that were visited not only on the weekend but also during work hours, when badge records indicated that the employee was in the facility. This is a simple example, but it ties together two different pieces of electronic data to address an anticipated employee objection.
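The badge-record correlation above reduces to a simple interval check, sketched below. Both data sets are invented for illustration; in practice the visit times would come from your browser history parser and the entry/exit intervals from the physical-access system, with all the usual caveats about clock skew between the two sources.

```python
# Hypothetical sketch: flag web visits that occurred while badge
# records show the employee inside the facility. All data invented.
from datetime import datetime

badge_intervals = [  # (entry, exit) pairs from badge records
    (datetime(2011, 8, 22, 8, 0), datetime(2011, 8, 22, 17, 0)),
]

visits = [  # (timestamp, site) pairs from internet history
    (datetime(2011, 8, 20, 21, 15), "inappropriate.example"),  # weekend
    (datetime(2011, 8, 22, 10, 30), "inappropriate.example"),  # workday
]

def onsite_visits(visits, intervals):
    """Return visits falling inside any badge entry/exit interval."""
    return [(t, site) for t, site in visits
            if any(start <= t <= end for start, end in intervals)]

hits = onsite_visits(visits, badge_intervals)
for t, site in hits:
    print(t, site)  # only the workday visit survives the filter
```

The weekend visit drops out, which is precisely what defuses the “my teenage son did it” objection: the remaining hits occurred while the employee was badged in.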

Structured and easily referenced

This goes to look and feel – if our customers receive reports from our analysts and they all ‘look’ the same, the customer learns to bypass the structure of the report and instead focus on, and more easily consume, its content. Have you ever seen a complicated slide deck or spreadsheet and found yourself spending most of the time trying to figure out where the data is? The same goes for technical reports in digital forensics. The technical content is hard enough to understand; don’t let your report structure get in the way of it.

Offer opinions and recommendations

This may be controversial to some of you, but in the world of corporate investigations it is most welcome. The dialogue between a customer and forensic analyst isn’t just through a written report; there are many phone calls in which various technical concepts can be discussed, for example the significance of why a piece of data substantiates an allegation. Once the phone call is over, these conclusions and explanations will be forgotten. Writing them down as part of the final report helps the customer remember that conversation.

When you write a report, ask yourself if that report meets your established criteria for effectiveness. Peer review is key here, because after all, if another forensic analyst can make neither head nor tail of your report, a non-technical customer has no chance.

Posted in State of Affairs | 1 Comment »