My Road to Digital Forensics Excellence

CEIC Day 3 – The Lectures

Posted by Paul Bobby on May 20, 2009

Hands on techniques to go from Forensic Examiner to eDiscovery Practitioner

No course description, considered a basic class

The idea behind this class was to address the change in mindset required when executing an eDiscovery task. In the 4n6 world we concern ourselves with unallocated, file carving, imaging the entire drive and recreating behavior. For eDiscovery, throw all of that out of the window. Your focus is on those files that are readily accessible to the user – and yes, this may even exclude files still in the recycle bin.
It’s not really all that difficult to imagine my role during a discovery request – I create criteria, connect to all specified workstations, and pull files from some point in time until the present. Then I perform the culling (i.e. conditions and filtering) against the collected data, produce my LEFs (logical evidence files), and deliver them to counsel.
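A culling pass like that is easy to sketch. Here’s a minimal, hypothetical Python example – the directory, extension list, and date cutoff are stand-ins for real discovery criteria, not anything shown in class:

```python
import os
from datetime import datetime, timezone

def cull(root, extensions, cutoff):
    """Walk a collection directory and keep only files matching the
    discovery criteria: an extension filter plus a modified-after date."""
    hits = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            # Filter on file extension first (cheap check).
            if os.path.splitext(name)[1].lower() not in extensions:
                continue
            # Then filter on last-modified time.
            mtime = datetime.fromtimestamp(os.path.getmtime(path), tz=timezone.utc)
            if mtime >= cutoff:
                hits.append(path)
    return hits
```

In practice the criteria would come straight out of the signed search protocol, and the hits would feed the LEF creation step rather than a Python list.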
Effort needs to go into the creation of a pre-discovery questionnaire – this leads to a search-plus-criteria protocol which should be signed by counsel and becomes your get-out-of-jail-free card.
One reveal from the presenter: EnCase will feature a new indexing engine (at least that was alluded to), specifically to allow LexisNexis-style index searches – the implication that the search engine will be changed was also made.
Again, this session turned into a lecture – no ‘hands-on’ whatsoever.
Scripting Network Forensics – Featuring Powershell, Log Parser, Perl, Sysinternals

Again no course description

Let me start by saying – if you think the title implies the use of tools to conduct network forensics, you would be wrong. Network Forensics is the analysis of network traffic to determine activity. This particular session considered network forensics as analysis conducted against a target workstation ‘over the network’. Who QAs these session titles?

Powershell – quite a bit of time was spent on this tool, and it needs to get more use. Apparently everything is treated as a ‘hard drive’ – the registry, WMI, network connections, and the drives themselves. What’s cool? Powershell can be executed against drives mounted using PDE or Mount Image Pro (or some other tool). I tested this during class – and yes, it works, to an extent.
Log Parser – this tool really needs better exposure. I’ve made a note to spend some serious time with it and to start using it more often against event logs and restore point registries.
Using VMware Toolbox – Tools to conduct investigation

Again no course description

The presenter was all giddy about MojoPac – I’ll have to give it a shot. If you haven’t seen it, the idea is that your data and programs can live on a thumb drive, while the execution of those applications makes use of the already-installed Windows OS. The website also documents the artifacts left behind through usage of MojoPac – very cool.
Various other products were highlighted to enable virtual analysis – VMware, Virtual PC, VirtualBox, LiveView, Altiris, and EasyVMX for creating VMs for use in VMware Player.
Three slides later the 105-minute lab ran out of information. Yes, I was disappointed. These presenters need to stop calling lectures labs and actually start building labs.
Timeline Analysis

This lab will look at the challenges of timeline analysis and some of the key techniques for working around them

Timeline analysis has picked up recently on the interwebs, although as an analysis technique for a malware compromise it has been standard practice for some time. By determining when something happened, and what happened when, timeline analysis helps to identify the initial infection vector, the filesystem modifications that ensue (registry too), and, heaven forbid, any data exfiltration or the period when the computer was under the bad guy’s control.
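At its core the technique is just normalizing timestamps from different artifact sources and sorting them into one chronological sequence. A minimal Python sketch – the sources and events below are invented for illustration, not from the session:

```python
from datetime import datetime

def build_timeline(*sources):
    """Merge (timestamp, source, description) events from multiple
    artifact parsers into a single chronological timeline."""
    events = [event for source in sources for event in source]
    return sorted(events, key=lambda event: event[0])

# Hypothetical parsed artifacts from three different sources.
filesystem = [(datetime(2009, 5, 1, 10, 5), "MFT", "dropper.exe created")]
registry   = [(datetime(2009, 5, 1, 10, 6), "Registry", "Run key added")]
eventlog   = [(datetime(2009, 5, 1, 10, 4), "EventLog", "remote logon")]

timeline = build_timeline(filesystem, registry, eventlog)
for when, source, what in timeline:
    print(when.isoformat(), source, what)
```

Laid out this way, the logon-then-drop-then-persist sequence falls straight out of the sort – which is exactly the payoff of the technique.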

Today’s session focused on the FNA ($FILE_NAME) and SIA ($STANDARD_INFORMATION) timestamps, their differences, and how they can be affected by timestomp. Unfortunately, timeline analysis can be conducted by a far larger group of people than those who know how to manually parse an MFT record, and the instructor missed an opportunity to influence future investigations for the better.
A couple of things I noted for future testing and blogging: what can change the Last Accessed timestamp, and running various timestomp experiments to identify how the FNA and SIA timestamps change when files are moved/copied to the same/different NTFS volume under XP/Vista.
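For those experiments, the raw MFT values are 64-bit FILETIMEs (100-nanosecond intervals since 1601-01-01 UTC), and the classic heuristic is that timestomp rewrites the $STANDARD_INFORMATION times while the $FILE_NAME times are typically left alone – so an SIA creation time earlier than the FNA creation time is suspicious. A rough Python sketch, assuming you already have the two decoded values (the function names are my own, not from a real MFT parser):

```python
from datetime import datetime, timedelta, timezone

# The Windows FILETIME epoch: 1601-01-01 UTC.
FILETIME_EPOCH = datetime(1601, 1, 1, tzinfo=timezone.utc)

def filetime_to_datetime(filetime):
    """Convert a 64-bit FILETIME (100 ns ticks since 1601) to a datetime.
    Integer-dividing by 10 turns 100 ns ticks into microseconds."""
    return FILETIME_EPOCH + timedelta(microseconds=filetime // 10)

def looks_stomped(sia_created, fna_created):
    """Flag the classic timestomp artifact: the $STANDARD_INFORMATION
    creation time predating the $FILE_NAME creation time."""
    return sia_created < fna_created
```

It’s only a heuristic – a legitimate copy between volumes can also produce odd orderings, which is exactly why the move/copy experiments above are worth running.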
Lose the geek speak: Creating client friendly forensic reports

A brilliant examination’s value is nil if a client can’t understand the findings.

Unfortunately, the presenter got sick and the presentation and materials weren’t made available to the co-worker who stepped in – ugh. This session had some potential, especially at the end of the day, for some humor. I mean, after all, let’s throw up some snippets of either actual reports or just badly written ones – get people laughing and make your point.
Then produce snippets of well-written versions to drive the point home. Kleiman presented an interesting report: his rebuttal to a defense expert – the defense report was not poorly written per se, just incorrect in its assumptions. So while the analysis of that report and Kleiman’s rebuttal was interesting, it didn’t follow the spirit of the session, which was to look at effective report writing (whether you are correct or not 🙂 )

And with that, the day is done. I was planning on attending the Enscript Birds of a Feather session, but I saw most of the Enscript guys I know getting on the shuttle to head to Citywalk. So I went to see Star Trek.

Awesome 😉

