SecureArtisan

My Road to Digital Forensics Excellence

Archive for January, 2011

DC3 2011 Day 2 and 3

Posted by Paul Bobby on January 29, 2011

Visualization of Mobile Data – John Carey and Timothy Leschke
The bulk of the presentation was a case study into the George Ford Jr trial (see here). His wife had suspected Ford of infidelity and had secretly installed a GPS device under the driver’s seat of his truck. The data obtained from this device, along with visualization provided by LandAirSea Past-Track, secured the conviction.

VDL Slack in NTFS – David G Ferguson
This talk drew attention to the problem of the slack space between a file’s valid data length (VDL) and the logical file size reserved for it. The problem exists within the various 4n6 tools available to us, in that they appear to handle searches within that space differently.
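
You can see the effect yourself. A minimal Python sketch (assumes an NTFS volume; the filename is mine): extending a file with truncate() raises its logical size but leaves the VDL behind, so the OS reads the tail back as zeros even though the underlying clusters may still hold stale data that a tool reading raw clusters would see.

# Minimal VDL demo - assumes an NTFS volume; filename is hypothetical
f = open("vdl_demo.bin", "wb")
f.write(b"A" * 512)       # logical size and VDL are both 512 bytes
f.truncate(4096)          # logical size is now 4096; the VDL stays at 512
f.close()
# Bytes 512-4095 read back as zeros courtesy of NTFS, but the allocated
# clusters on disk may still contain stale data - that region is VDL slack
print(open("vdl_demo.bin", "rb").read()[512:520])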

I also learnt that the volume shadow service deliberately sets the VDL size of VSCs to 0 (zero) – this renders the VSC invisible to the normal Windows backup processes, and so the VSC is not backed up.

Advanced C2 Channels – Adam Meyers and Neal Keating
Some of the new channels being detected today:

  1. Twitter C2 – A Twitter account is created and C2 is posted to that account, read by controllers and bots. Base64 content.
  2. Facebook C2 – data posted to the Notes section of a Facebook account using English words as codewords.
  3. Gmail – SSL is allowed to Gmail (and now Facebook). C2 is communicated over draft emails containing hex codes.
  4. RSS feeds – malware drops JavaScript, the JS engine is instantiated, and it requests an XML feed from a website. That feed contains the C2.
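
As a hedged illustration of channel 1, here’s a minimal Python sketch that pulls candidate Base64 tokens out of captured post text and tries to decode them; the regex, function name and sample string are mine, not from the talk.

import base64, re

# Candidate Base64 runs of 16+ characters, with optional padding
B64_TOKEN = re.compile(r"\b[A-Za-z0-9+/]{16,}={0,2}")

def candidate_commands(text):
    for token in B64_TOKEN.findall(text):
        try:
            decoded = base64.b64decode(token, validate=True)
            if decoded.isascii():       # keep plausible command strings
                yield decoded.decode()
        except ValueError:              # not valid Base64 - skip it
            pass

# Hypothetical captured tweet text
print(list(candidate_commands("status update Y2hlY2tpbiBob3N0IDQyCg==")))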

These guys didn’t like the term APT. But they did like to say reminent instead of remnant.

Windows 7 Artifacts – Rob Attoe
Hey, a fellow Brit – he works for AccessData. This presentation condensed an otherwise 4-hour block into 50 minutes. Awesome – just hit me with everything and I’ll sort it out later.

Some of the artifacts I wasn’t aware of:

  1. BitLocker To Go – If the FveAutoUnlock key/value pair exists in the NTUSER.DAT, then the end-user selected the “remember this password” option when accessing that specific removable media. No password? Then use one of the many methods to boot up the OS, and simply insert the removable device. You may just get lucky.
  2. Jump Lists – Custom and Automatic destination files record per-application document usage. Valuable for behavior analysis; see the sketch below.
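
A minimal sketch of poking at an Automatic destinations jump list, assuming the third-party olefile package: the file is an OLE compound document whose numbered streams are embedded LNK files, plus a DestList stream of MRU metadata. The path and AppID below are hypothetical.

import olefile

# Hypothetical path; the AppID in the filename identifies the owning application
PATH = (r"C:\Users\bob\AppData\Roaming\Microsoft\Windows\Recent"
        r"\AutomaticDestinations\5f7b5f1e01b83767.automaticDestinations-ms")

jl = olefile.OleFileIO(PATH)
for entry in jl.listdir():
    data = jl.openstream(entry).read()
    if entry[0] == "DestList":
        print(f"DestList: {len(data)} bytes of MRU/usage metadata")
    else:
        # Numbered streams are LNK files: 4-byte header size, then the LNK CLSID
        print(f"stream {entry[0]}: embedded LNK, CLSID {data[4:20].hex()}")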

When did it happen? – Kieth Gould
Kieth rocks. A good, solid Geek Meter-5 presentation into NTFS timestamps and some of the gotchas/misconceptions that forensicators continue to fall prey to.

He reviewed SIA ($STANDARD_INFORMATION) and FNA ($FILE_NAME) timestamps, common scenarios in which the FNA timestamps are changed, file-system tunneling (see this earlier blog article), and reliability monitoring.
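
File-system tunneling is easy to demonstrate for yourself. A minimal sketch (assumes Windows with the default 15-second tunneling window; the filename is mine):

import os, time

NAME = "tunnel_test.txt"               # hypothetical filename
open(NAME, "w").close()
born = os.stat(NAME).st_ctime          # st_ctime = creation time on Windows
time.sleep(5)
os.remove(NAME)
open(NAME, "w").close()                # recreated inside the 15-second window
print(born == os.stat(NAME).st_ctime)  # True - the creation time was tunneled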

Knowledge Management – Sam Wenck
Much has been said about a threat-based approach to Incident Response (as opposed to traditional CND, which focuses on vulnerabilities, or traditional incident response, which presupposes a successful intrusion), and Sam demonstrated the Lockheed Martin implementation of threat-based IR using Request Tracker and some custom programming.

This system comprises the standard ticketing engine with a customized indicator database and a knowledge management database (like a wiki). The entire system is supported by back-end datastores such as IP databases (where on the perimeter each IP was seen), DNS lookups, proxy logs, and so on.

The indicator database has a systematic entry method to ensure proper canonicalization of indicator intelligence. At this time we store just atomic indicators. Future work is being pursued to create composite indicators, such as complete TTP models.
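
To illustrate what canonicalization buys you, a hedged sketch (the function and rules are mine, not Sam’s): force every atomic indicator into one canonical form before it hits the database, so “EVIL.Example[.]com.” and “evil.example.com” can’t become two records.

import ipaddress

def canonicalize(kind, value):
    value = value.strip()
    if kind == "ip":
        return str(ipaddress.ip_address(value))   # validates and normalizes
    if kind == "domain":
        # Undo defanging, lower-case, drop any trailing root dot
        return value.lower().replace("[.]", ".").rstrip(".")
    if kind == "md5":
        v = value.lower()
        if len(v) != 32 or any(c not in "0123456789abcdef" for c in v):
            raise ValueError(f"not an MD5: {value!r}")
        return v
    raise ValueError(f"unknown indicator type: {kind}")

print(canonicalize("domain", "EVIL.Example[.]com."))   # evil.example.com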


Day 1 DC3-2011 Part 2

Posted by Paul Bobby on January 27, 2011

Firefox plug-ins useful for online investigations – Jesse Varsalone
I attended this presentation halfway through, since the solid state drive one was so short. Plug-ins demo’d included geo-IP location, Tor, deepnet, one-touch downloading (Flash videos etc.), and a passive cache plug-in. The cache plug-in I didn’t know about – when viewing a cached page from Google, any images in the cached copy are still retrieved from the live website. Passive Cache ignores this and just displays the text.

Effective Expert Witness Testimony – Donald Flynn
This was a discussion about the requirements to be identified as an expert, and how to deal with cross-examination and technical presentation. An interesting comment made by the presenter jibes with my investigative approach: spend the time finding both inculpatory and exculpatory evidence.

Lifting the lid on Cyber Espionage – Randy Lee
This presentation had the largest ratio of doodles-to-notes in my notepad. Yep, when I decided to attend, it slipped past me that the presenter was a vendor. Ugh – must scream.

The presentation was just terrible! It was the usual pitch, one scare tactic right after the other, straight from 10 years ago when vendors were still trying to sell SIMs/IDS etc. While there is still a need for these tools, the security landscape has evolved, and so must the sales pitch.

It’s the 80/20 rule – we used to spend so much money fighting 80% of the attacks. Firewalls, SIMs, log tools, netflow etc. were all designed to provide real-time, behavioral (meh) capability as a defense against the 80% threat. That market is now saturated. The 20% threat is the focus now. The sales pitch needs to change.

Do you see what I see? – Paul Cerkez
Cerkez is a PhD student researching the automatic identification of semagrams. A semagram encodes a message into another file – yep, a type of steg, but the encoding makes use of pictures/icons to carry that message. It was an interesting cerebral diversion for the final presentation of the day.


Day 1 DC3-2011 Part 1

Posted by Paul Bobby on January 26, 2011

Shadow Volume Link Manager and VirtualBox – Timothy Leschke
The presenter discussed the challenge he faced analyzing data from VSCs – five years ago. At that time XP was still the most prominent desktop OS – Vista was still trying to eke out an existence. However, when the Vista examinations finally came along, how does one tackle the problem of the volume shadow copy?

The presenter walked us through the usual techniques of vssadmin list shadows and mklink, but again, the main problem was developing an operational analysis environment that could run under Windows XP.
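
For reference, a minimal sketch of that usual technique driven from Python (assumes a Windows host with admin rights; the link paths are mine):

import re, subprocess

# Enumerate shadow copies, then expose each one as a directory symlink
out = subprocess.run(["vssadmin", "list", "shadows"],
                     capture_output=True, text=True).stdout
for i, dev in enumerate(re.findall(r"Shadow Copy Volume: (\S+)", out), 1):
    # mklink is a cmd built-in; the trailing backslash on the target is required
    subprocess.run(f"cmd /c mklink /d C:\\vsc{i} {dev}\\", check=True)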

He settled on VirtualBox as the VM method of choice; a choice easily made since it was the only product that worked. The issue was the inability of the other VM products to mount a drive as a physical device – they all mounted drives as logical devices. A running volume shadow service can only interrogate the VSCs on a volume if that volume is listed as a physical disk in Disk Management.

The coup for this presentation came when it was cut short: Mark McKinnon was given the podium and demo’d ShadowAnalyzer for us (yep, that tool we’ve all been waiting for). It’s in beta at the moment, but he had a pile of CDs to hand out. Woot.

The tool works because the authors essentially reverse engineered the volume shadow service. Because of that, they promise versions for Linux and Mac OS X in the future. The other cool thing is that this tool can interpret multiple file versions even within the same VSC.

Don’t know what I mean? Well, imagine that a VSC is created every 24 hours, and in that 24 hours you changed a certain spreadsheet 10 times. If you need to get back a ‘previous version’ of that file, Windows will only give you the most recent version saved in the VSC, even though the ‘diff’ data is there for all 10 versions. The same thing occurs when you manipulate your host OS into interrogating VSCs on mounted media. ShadowAnalyzer will present all 10 different versions to you. Oh my.

Applying the Science of Similarity to Computer Forensics – Jesse Kornblum
Ever attended a talk by Jesse? Then you’ll know you’re in for some fun. My favorite quip: he asked all of us to turn off our cellphones, and if one did beep, he wouldn’t throw it out – instead he’d do a forensic analysis on the device in front of the entire class! Perfect way to start; I knew I was in store for something good.

Uh oh, this one got mathy. Fortunately all presentations came on some DVDs that were provided to us for the conference – this is one presentation that had some math, and plenty of ‘for more details’ references to go read on Wikipedia.

The problem of similarity began with the obvious but inefficient method: the simple MD5 hash-and-compare used to reduce data sets during review. While somewhat effective for operating system files (and even then, many files altered by patching are missed by this process), it is highly ineffective for user-created electronic files.
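
That classic reduction looks something like this minimal sketch (the hash set and paths are hypothetical; real known-file sets come from sources like the NSRL):

import hashlib, pathlib

KNOWN = {"d41d8cd98f00b204e9800998ecf8427e"}   # e.g. loaded from an NSRL hash set

def unknown_files(root):
    # Yield only files whose MD5 is not in the known-file set
    for p in pathlib.Path(root).rglob("*"):
        if p.is_file() and hashlib.md5(p.read_bytes()).hexdigest() not in KNOWN:
            yield p

for f in unknown_files(r"C:\evidence\mount"):  # hypothetical mount point
    print(f)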

He walked us through block hashing and fuzzy hashing, introducing various algorithms that generate an end product with a low false positive rate and a low false negative rate. This one I might come back to once I read that statistics primer again.
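
Jesse’s own ssdeep implements the fuzzy (context-triggered piecewise) hashing he described. A minimal sketch assuming the third-party ssdeep Python binding (the filenames are mine):

import ssdeep

# Fuzzy-hash two revisions of the same document (hypothetical filenames)
h1 = ssdeep.hash(open("report_v1.doc", "rb").read())
h2 = ssdeep.hash(open("report_v2.doc", "rb").read())

# compare() returns a 0-100 similarity score; 0 means no meaningful match
print(ssdeep.compare(h1, h2))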

Solid State Drives – Fred Barry
The class lasted 20 minutes, but the presenter essentially refreshed everyone’s memory on the workings of SSDs and the implications for forensic examiners. He presented some useful statistics validating that SSDs as a source medium for evidence files definitely increase the speed of analysis.

The most eye-opening of his tests concerned the TRIM-capable OSs (for example Windows 7, Windows Server 2008 R2, and some *nixes). He wrote the same data set (12 GB) to many test SSDs, and then deleted that data set. Through some measurement mechanism that he didn’t disclose, he timed how long it took for the data to disappear. While the details were not presented, the values varied from 24 hours (7 GB of the original data was still present) down to 60 seconds (all the data was gone!). So while we are all still used to the OS handling garbage collection (i.e. tracking free space etc.), with TRIM the OS hands that cleanup to the drive itself. And worst case, after 60 seconds, you will no longer be able to carve.
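
He didn’t disclose his measurement mechanism, but one plausible way to reproduce the test, as a rough sketch (assumes Linux, root, a filesystem mounted with -o discard, and a dedicated test SSD at a hypothetical /dev/sdb – never point this at evidence media):

import os, time

MARKER = b"TRIMTEST" * 512                         # 4 KB recognizable pattern
PATH, DEV = "/mnt/testssd/filler.bin", "/dev/sdb"  # hypothetical paths

with open(PATH, "wb") as f:                        # write ~1 GB of marker data
    for _ in range(256 * 1024):
        f.write(MARKER)
os.remove(PATH)                                    # delete triggers the TRIM
os.system("sync")

start = time.time()
while True:
    found = False
    with open(DEV, "rb") as d:                     # linear scan of the raw device
        while chunk := d.read(16 * 1024 * 1024):
            if MARKER in chunk:                    # ignores chunk-boundary splits
                found = True
                break
    if not found:
        print(f"marker gone after {time.time() - start:.0f} seconds")
        break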


Plenary Session

Posted by Paul Bobby on January 26, 2011

Some notes I took from today’s plenary session of the 2011 DC3 Cybercrime conference.

Howard Schmidt
Cyber deterrence – this is a cost/benefit analysis issue. Make it so that it costs more to steal/sell information.

Resiliency – we need a better early warning system to our critical infrastructure.

Privacy – Privacy and security are no longer at odds with each other. Security guarantees privacy.

Partnerships – Redefinition needed; more than just government and the private sector being in the same room talking about the problem.

Future – not cost effective to replace existing networks, so develop technology that can elevate existing networks to a more trusted communication method.

The challenge? – Evolution of money laundering. State governments are actively participating or turning a blind eye. It has yet to become a national priority.

NSTIC – National Strategy for Trusted Identities in Cyberspace. He promised that it wasn’t a national ID card or a driver’s license for the internet.

Alan Paller
There have been several events in the past few years that have led to public outrage towards lax computer security (Ed. Not sure that the public is entirely outraged yet), culminating with the Stuxnet episode. (Ed. A minority of our technical professionals can properly explain Stuxnet – not the average American. While the general public may not be brandishing pitchforks, CEOs from the DIB are starting to.)

There has been a non-linear adoption of technology. For example, no one liked Windows 1.0 and 2.0; then at 3.1 it took off. (Ed. My example is smartphones. We’d had Palms, PDAs etc. for years, but it wasn’t until, I believe, the iPhone that they became widely adopted. Until then, smartphones were geek toys and tools for business users. So making technology cool seems to be the key?)
We need to move from a report/paper/artifact based system of security measurement and assessment to a more real-time approach.

Baked-in security – pull a system engineer from each of your major projects, train them intensively on security and re-embed them. Products will now come off the line with security engineered from the beginning.

Hunters and Human Sensors – Sysadmins and the rank and file of our IT departments become human sensors; they are the ones who should notice abnormalities. Hunters are those select few who really get it, know what to look for and how to look for it. (Ed. I don’t think it’s a select few at all; his description fit the high-end policy maker more than the keen-eyed few you would hire into your SOC.)

Formula forensics becomes analytic forensics – (Ed. A common thread amongst existing practitioners; however, it is somewhat at odds with the latest industry buzzword of forensic triage and the lineup of products such as EnCase Portable and FTK Triage. Looks like a mixture of low-hanging-fruit/formula 4n6 and the deep dive is where things should be heading.)

Ovie Carroll
This was the most fun-to-watch presentation – nice slides 🙂 Nothing like good storytelling using Google auto-complete.

A couple of new issues:
  1. Social media – significant evidence about our daily activity within the social environment.
  2. Hard drives – large capacity, less likely for data to be overwritten. (Ed. okay, well not entirely new :))
  3. Every case, not just fraud and child porn, potentially has a digital evidence component.
  4. Cloud evidence – Dropbox, Google Docs, etc.

This leads to the importance of triage early in the investigation process. He used the example of a suspect making bail and immediately cleaning up his online presence, before the law enforcement process has time to catch up and start looking.

(Ed. Triage was being used in this presentation as nothing more than a fancy word for capturing low-hanging fruit: UserAssist, MRUs, internet history etc. Triage, in my mind, is an approach for getting to the critical/significant data as soon as possible – for example, in a large-scale computer incident you need to know which computers are infected. For an investigation such as OC described, this isn’t triage but basically on-site analysis to get some quick leads. This is a word that I think is going to be redefined because of product pressure.)

Report Writing – timelines, flowcharts, narratives, data visualization; write for the end consumer (for law enforcement that is the prosecutor). (Ed. I have a lot to say on report writing, and it may become a presentation point for me in the future as well as on this blog. Stay tuned.)

Jeff Troy
Social media – increases the speed of emerging threats based on social issues (e.g. flash mobs, mobilization to DDoS in support of WikiLeaks, etc).

Morphing of criminal successes – the Zeus botnet was highly successful as a method of stealing financial information, and is now being adopted by nation states as a launch platform for infection. (Ed. I predicted this a year ago internally – this is the first confirmation I’ve heard so far. No confirmation that APT is associated with botnet launch points, but it’s only a matter of time. Still, the noisier the APT, the easier to break it in the kill chain (Cloppert can give you details on this awesome intel tool).)

Criminal activity – This is reducing in scope; they think it’s because the criminals are bored with how easy it is to steal money. (Ed. yah, thass right)
