My Road to Digital Forensics Excellence

Archive for March, 2009

Repeatable Analysis Steps for Statusing

Posted by Paul Bobby on March 19, 2009

A frequently asked question in class and on forensic forums is “What steps should I take when conducting analysis?” I have blogged on this before, and provided several approaches to case analysis.

This time, let us consider the requirement of case status. Whether in law enforcement or in the corporate realm there is a dual-role requirement for investigations. This dual role separates the investigator from the examiner; typically one investigator, and one or more examiners. The dual role provides a separation of duties, but also permits the agency or corporation to maintain expertise in investigations separate from expertise in forensic examination. I consider these to be highly valued skills, and the individual capable of performing on both to excellence is held in high regard.

The timeframe for corporate investigations is much shorter than in law enforcement: often days or weeks versus months or years. Regardless of the timeframe, the investigator has an insatiable appetite for progress and status. How does the examiner provide adequate status to the investigator?

One method is to leverage the concept of repeatable forensic analysis steps and combine them with a standard 0% through 100% qualifier. With that, we have the beginnings of a repeatable status metric:

  • NA: Task not applicable
  • 0%: Task not yet started
  • 33%: Task started
  • 66%: Data collected
  • 100%: Ready for final report
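The metric above lends itself to simple automation. Here is a minimal sketch of a status tracker built on that scale; the task names and the averaging rule are my own illustration, not part of any standard.

```python
# A minimal sketch of the repeatable status metric described above.
# Task names and the roll-up rule are illustrative assumptions.

STATUS_SCALE = {
    "NA": "Task not applicable",
    0: "Task not yet started",
    33: "Task started",
    66: "Data collected",
    100: "Ready for final report",
}

def case_status(tasks):
    """Summarize overall progress as an average, ignoring tasks marked NA."""
    applicable = [s for s in tasks.values() if s != "NA"]
    if not applicable:
        return 0.0
    return sum(applicable) / len(applicable)

tasks = {
    "Timeline analysis": 100,
    "Registry review": 66,
    "Email extraction": 33,
    "Mobile device": "NA",
}
print(f"Overall: {case_status(tasks):.0f}%")
```

A roll-up like this gives the investigator a single number per case while the per-task values preserve the detail.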

What tasks? Well, look at my previous investigation posts and the Microsoft OneNote notebook I use to support investigations. These contain a variety of ideas for repeatable forensic analysis.

What are your thoughts?


Posted in Incident Response | Leave a Comment »

Evidence Verification as a Benchmark

Posted by Paul Bobby on March 11, 2009

Now that I have my 60GB Wikipedia evidence set completed, why not take it for a spin? I chose a basic EnCase operation, Verify File Integrity, as my first benchmark.

As a reminder, verification occurs automatically when you add evidence to a case in which it has not yet been verified. The examiner can also initiate a manual verification by right-clicking the evidence file in the Tree Pane and selecting Verify File Integrity.

The file verification function creates a Log Record under bookmarks, as shown in this screenshot:

 Evidence Verification

The verification process took 155, 155 and 152 seconds respectively across three runs in this example.

Before I start benchmarking EnCase and my examination environment, I need to establish a baseline of my system configuration.

Run msinfo32.exe and record some basic information. For example:

OS Name: Microsoft® Windows Vista™ Ultimate
Version: 6.0.6001 Service Pack 1 Build 6001
System Manufacturer: Dell Inc.
System Model: Precision WorkStation T7400
System Type: x64-based PC
Processor: Intel(R) Xeon(R) CPU X5472 @ 3.00GHz, 2992 MHz, 4 Core(s), 4 Logical Processor(s)
BIOS Version/Date: Dell Inc. A04, 8/21/2008
SMBIOS Version: 2.5
Hardware Abstraction Layer: Version = "6.0.6001.18000"
Time Zone: Eastern Standard Time
Installed Physical Memory (RAM): 16.0 GB
Total Physical Memory: 4.00 GB
Available Physical Memory: 13.5 GB
Total Virtual Memory: 32.1 GB
Available Virtual Memory: 29.9 GB
Page File Space: 16.3 GB
Page File: C:\pagefile.sys
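Some of these baseline fields can also be captured in a script so the record is timestamped and machine-readable. This is a hedged sketch using only Python's standard library; it covers far fewer fields than msinfo32 does.

```python
# A sketch: recording a few baseline fields programmatically, as a
# companion to copying them out of msinfo32 by hand. Standard library
# only; the field set is deliberately small.
import datetime
import json
import platform

baseline = {
    "recorded": datetime.datetime.now().isoformat(timespec="seconds"),
    "os": platform.system(),
    "version": platform.version(),
    "machine": platform.machine(),
    "processor": platform.processor(),
    "hostname": platform.node(),
}
print(json.dumps(baseline, indent=2))
```

Saving this JSON alongside each benchmark run makes it easy to tell later which system configuration produced which timing.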

Patch day? I am not sure how to reconcile that in the baseline yet. Any suggestions?

My workstation hard drive configuration comprises the following:

  • One 80GB drive for the OS and the pagefile (10k SATA) (OS Drive)
  • Two 500GB drives (7200rpm) in RAID0 (Investigation Drive)
  • One 80GB drive for EnCase and other applications (EnCase Drive)

My EnCase configuration is as follows:

a. Version:
b. Platform: AMD64
c. Build: EPBLD00000C22 02/20/09 12:23:53PM
d. System Cache
      i. Controlled by EnCase
      ii. Minimum 1, Maximum 13104
e. Configuration
      i. Autosave turned off (0 minutes)
      ii. ParseCache folder on the EnCase drive
f. New cases
 1) Investigation Drive\Test#
 2) Investigation Drive\Test#\Export
 3) Investigation Drive\Test#\Temp
 4) Investigation Drive\Test#\Index
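Since every test case uses the same folder layout, creating it can be scripted. A small sketch, with the "Test#" naming and subfolder list taken from the configuration above; the root path is whatever your Investigation Drive happens to be.

```python
# A sketch that creates the per-case folder layout listed above.
# The "Test#" naming convention comes from the post; the root path
# is an assumption (point it at your Investigation Drive).
import tempfile
from pathlib import Path

def create_case_folders(root, case_name):
    """Create the case folder plus its Export, Temp and Index subfolders."""
    case = Path(root) / case_name
    for sub in ("Export", "Temp", "Index"):
        (case / sub).mkdir(parents=True, exist_ok=True)
    return case

root = tempfile.mkdtemp()  # stand-in for the Investigation Drive
case = create_case_folders(root, "Test1")
print(sorted(p.name for p in case.iterdir()))
```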

General test considerations:

  • Have no other applications running
  • Turn off on-access antivirus scanning
  • Ensure no patching or other taskbar activities are running
  • Turn off your screen saver

Evidence Verification

  1. Create the test case
  2. Copy the evidence to the "Investigation Drive\Test#" folder
  3. Cancel the automatic verification process
  4. Right-click the evidence file in the Tree Pane and select Verify File Integrity
  5. Do not use the computer
  6. Once verification has completed, note the number of seconds in the Log Record under Bookmarks
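Once you have the per-run seconds from the Log Record, summarizing them keeps repeated runs comparable. A sketch using the three timings reported in this post; the summary format is mine, not EnCase's.

```python
# A sketch of summarizing repeated verification timings. The three
# figures are the ones reported in the post; the summary format is
# an illustration, not anything EnCase produces.
import statistics

runs = [155, 155, 152]  # seconds, from three verification runs

def summarize(seconds):
    return {
        "runs": len(seconds),
        "mean_s": statistics.mean(seconds),
        "stdev_s": statistics.stdev(seconds) if len(seconds) > 1 else 0.0,
    }

print(summarize(runs))
```

The small standard deviation here suggests the verification benchmark is fairly repeatable on a quiet system.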

For the above hardware configuration, I loaded Wikipedia-Part1.E0*, and the verification process took 155 seconds.


This is a great opportunity to test different hardware configurations: ReadyBoost, RAID0 versus a single hard drive, evidence stored on a network share, more RAM.

One test I conducted was to reacquire the evidence. The evidence was originally acquired using a 64KB block size. Per this post in the Guidance Software forums, Nik indicates that EnCase caches the block of data in the evidence file; I take that to mean it caches 64KB at a time. With the large memory possible on 64-bit machines (and with my 16GB of RAM), I will test different block sizes and see whether they have an impact on verification.
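Before running that test, some back-of-the-envelope arithmetic shows how much the block count shrinks as the block size grows. This is pure arithmetic; whether fewer, larger cached blocks actually speed up verification is exactly what the experiment will show.

```python
# Back-of-the-envelope arithmetic: number of blocks in the ~60GB
# evidence set at different acquisition block sizes. Whether this
# changes verification speed is what the test is meant to find out.

EVIDENCE_BYTES = 60 * 1024**3  # the ~60GB benchmark set

for block_kib in (64, 128, 256, 512, 1024):
    blocks = EVIDENCE_BYTES // (block_kib * 1024)
    print(f"{block_kib:>5} KiB blocks -> {blocks:>9,} blocks")
```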

Posted in EnCase | Leave a Comment »

Benchmarking EnCase (update)

Posted by Paul Bobby on March 6, 2009

I have completed an evidence set to be used for non-carving/recovery benchmarks.

The source was the complete database dump from Wikipedia, dated June 2008. This 15GB .gz file uncompressed to roughly 250GB, a process that itself took considerable time.

Once uncompressed, the directory structure for this Wikipedia archive is enormous and quite unwieldy to process as one big chunk. Performing operations against the full dataset would no doubt be a great benchmark, but it would bring most systems to a crawl, so I decided to break it up into smaller chunks; investigators can then load as many or as few as they want when running benchmarks.
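The chunking idea can be sketched as a simple greedy packing of subfolders into bins of at most ~32GB, one bin per evidence partition. The folder names and sizes below are illustrative; in practice they would come from walking the actual archive.

```python
# A sketch of the chunking idea: greedily grouping subfolders into
# bins of at most ~32GB so each bin fills one evidence partition.
# Folder names and sizes below are illustrative assumptions.

CHUNK_LIMIT = 32 * 1024**3  # bytes per partition

def chunk_folders(folder_sizes, limit=CHUNK_LIMIT):
    """Greedy in-order packing of (name, bytes) pairs into chunks."""
    chunks, current, current_size = [], [], 0
    for name, size in folder_sizes:
        if current and current_size + size > limit:
            chunks.append(current)
            current, current_size = [], 0
        current.append(name)
        current_size += size
    if current:
        chunks.append(current)
    return chunks

# Six 10GB folders pack into two 30GB chunks under the 32GB limit.
folders = [(chr(c), 10 * 1024**3) for c in range(ord("a"), ord("g"))]
print(chunk_folders(folders))
```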

The evidence sets were created from the articles\a-z subfolders. I chose to create .E01 evidence files as opposed to LEFs. I may still create LEFs, but E01 has the advantage of including the operating system and disk topology. If I do create LEFs, it will be from a subset of Wikipedia, as I am interested in benchmarking the L01 file format and pushing the limits of the internal tree structure it is capable of maintaining.

The evidence assembly process was as follows:

0. Create a 32GB NTFS partition
1. Format the partition using Quick mode
2. Run Eraser and overwrite the unallocated space on the partition (using a simple one-pass 0x00-byte overwrite)
3. Copy ~32GB of data from the Wikipedia archive to this partition
4. Run EnCase and acquire an image of the drive. I selected Good compression, the default block size, both an MD5 hash and a SHA1 hash, and 640MB file segments (you can fit seven to a DVD if you want)

Once the acquisition was completed, I went back to step 1 and started over with the next chunk of Wikipedia.

In the end I created Wikipedia-Part1.E01 through Wikipedia-Part11.E01, each part comprising approximately ten 640MB .E** files, for a grand total of 60GB in our first benchmarking evidence set.

Posted in EnCase | Leave a Comment »