HowTo

[How to] GPG and Signing Data

GNU Privacy Guard (GPG) uses public and private keys to secure communications (public-key cryptography). Many people use it to encrypt their email or other documents. An email encrypted with a user's public key can only be decrypted with that user's private key. This provides end-to-end encryption of the message, meaning that it is impractical for anyone listening in on the conversation to read the message in transit.


This is, of course, both good and bad. For example, Google and other email providers use email text to gain intelligence about the user, sell user information, and do better ad targeting. This revenue stream keeps these services free, but users pay for it with 'sold' privacy. Email using end-to-end encryption cannot be analyzed for useful marketing information. Because of this, these providers have little incentive to make mass encryption easy.

On the other hand, criminals also use Cloud-based email services. Making encryption somewhat difficult means that sloppy criminals are less likely to use encryption. If so, they may be easier to detect and catch.

Related Book: Lucas, Michael. PGP & GPG: Email for the Practical Paranoid. No Starch Press. 2006.

Whether you are paranoid and want all your emails encrypted (good luck), or you are trying to implement a personal or business data classification policy, GPG can help with encryption requirements.

Beyond encryption, GPG is useful for signing data. This is not exactly a signature that you would put on a document; instead, it is a cryptographic signature that verifies the data has not been altered. The video below describes how to sign data.




Signing data lets your contacts know that the data has not been modified since it left your possession. Signing is NOT encryption: everyone can still see the contents. Signing just allows your contact to know the data came from you, and that it is in its original state.
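In practice, creating and checking a detached signature from the command line looks something like this (a minimal sketch; the filename is just a placeholder):

<pre># create an ASCII-armored detached signature (report.pdf.asc)
gpg --armor --detach-sign report.pdf

# anyone with your public key can verify the file has not changed
gpg --verify report.pdf.asc report.pdf
</pre>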

1 min read

[How to] Beginner Introduction to The Sleuth Kit (command line)

Today we will give a beginner-level introduction to The Sleuth Kit from the command line. If this video is helpful, I highly recommend reading The Law Enforcement and Forensic Examiner’s Introduction to Linux.

<div class='embed-container'><iframe src='https://www.youtube.com/embed/R-IE2j04Chc' frameborder='0' allowfullscreen></iframe></div>

~1 min read

[How to] Installing and updating Linux in VirtualBox

Today we are going to install and update a Debian-based operating system in VirtualBox as a guest operating system.

The first video goes through creating a virtual machine in VirtualBox, and installing an operating system from an ISO disk image.

<div class="separator" style="clear: both; text-align: center;"><div class='embed-container'><iframe src='https://www.youtube.com/embed//-vVh550oaoI' frameborder='0' allowfullscreen></iframe></div></div>

The next video uses apt-get to update the software in the system, as well as ifconfig and ping to check if the network is working.
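For reference, the commands involved are along these lines (a sketch; package versions and interface names will differ on your system):

<pre>sudo apt-get update     # refresh the package lists
sudo apt-get upgrade    # install the available updates
ifconfig                # check that an interface has an IP address
ping -c 4 debian.org    # check that the network (and DNS) works
</pre>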

<div class="separator" style="clear: both; text-align: center;"><div class='embed-container'><iframe src='https://www.youtube.com/embed//w97PciH_XSw' frameborder='0' allowfullscreen></iframe></div></div>
The final video shows how to install VirtualBox Guest Additions to enable additional features inside the guest operating system.

<div class="separator" style="clear: both; text-align: center;"><div class='embed-container'><iframe src='https://www.youtube.com/embed//tAElCds6tu8' frameborder='0' allowfullscreen></iframe></div></div>

~1 min read

[How To] Digital Forensic Memory Analysis - Volatility

This week we will begin with a very basic introduction to the memory analysis framework Volatility. We will use Volatility to collect information about a memory image, recover the processes that were running in the system at the time of acquisition, and try to find malicious processes within the memory image. We will cover Volatility in more depth in a later video.
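As a rough preview of the commands involved, a basic Volatility 2.x session looks something like this (a sketch; the image name and profile are placeholders):

<pre>volatility -f memory.img imageinfo                     # suggest an OS profile for the image
volatility -f memory.img --profile=Win7SP1x86 pslist   # list processes running at acquisition time
volatility -f memory.img --profile=Win7SP1x86 psscan   # scan for hidden/terminated process structures
volatility -f memory.img --profile=Win7SP1x86 malfind  # look for suspicious injected code
</pre>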

<div class="separator" style="clear: both; text-align: center;"><div class='embed-container'><iframe src='https://www.youtube.com/embed//Cs0Gc3GtfZY' frameborder='0' allowfullscreen></iframe></div></div>

~1 min read

[How To] Digital Forensic Memory Analysis - strings, grep and photorec

This week we will show how to use basic data processing tools strings, grep and photorec to start an analysis of a Random Access Memory (RAM) image, even if we currently know nothing about the image. These methods are extremely basic types of analysis, but they are also fast and can produce some interesting results.
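A first pass with these tools might look like the following (a sketch; the filenames and keywords are placeholders):

<pre>strings -a memory.img > memory.strings       # extract printable ASCII strings
strings -a -el memory.img >> memory.strings  # also grab 16-bit little-endian (Windows) strings
grep -i "password" memory.strings            # search for interesting keywords
photorec memory.img                          # carve known file types out of the image
</pre>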

<div class="separator" style="clear: both; text-align: center;"><div class='embed-container'><iframe src='https://www.youtube.com/embed//4XoidAheuJE' frameborder='0' allowfullscreen></iframe></div</div>

~1 min read

[How To] Forensic Memory Acquisition in Linux - LiME

This week we will be using LiME to acquire a memory image from a suspect Linux system. LiME is a loadable kernel module that needs to be compiled for the specific kernel and architecture of the suspect device. We show the basics of compiling, and how to load the kernel object to copy out a RAW memory image.
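The general workflow is roughly the following (a sketch; the module must be built against the suspect system's kernel, and the output path is a placeholder):

<pre># on a machine running the same kernel as the suspect system
git clone https://github.com/504ensicsLabs/LiME.git
cd LiME/src
make                       # produces lime-$(uname -r).ko

# on the suspect system, dump RAM to external storage
sudo insmod lime-*.ko "path=/mnt/usb/mem.raw format=raw"
</pre>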

<div class="separator" style="clear: both; text-align: center;"><div class='embed-container'><iframe src='https://www.youtube.com/embed//_7Tq8dcmP0k' frameborder='0' allowfullscreen></iframe></div></div>

~1 min read

[How To] Forensic Data Recovery in Linux - tsk_recover

This week we will talk about The Sleuth Kit, and specifically the tool tsk_recover. tsk_recover is useful for both allocated and unallocated file recovery. It is a good, quick solution, but other tools tend to carve data better. I recommend using it in conjunction with other tools in an automated processing chain.
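Basic usage is along these lines (a sketch; the image and output directory names are placeholders):

<pre>tsk_recover image.dd recovered/                 # unallocated (deleted) files only - the default
tsk_recover -e image.dd recovered_all/          # allocated and unallocated files together
tsk_recover -e -o 2048 image.dd recovered_all/  # filesystem starting at sector offset 2048
</pre>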

<div class="separator" style="clear: both; text-align: center;"><div class='embed-container'><iframe src='https://www.youtube.com/embed/MS6zruRaxyA' frameborder='0' allowfullscreen></iframe></div></div></div>

~1 min read

[How To] Forensic Data Recovery in Windows - Photorec

This week we will show how to use PhotoRec to recover data from a suspect disk image. PhotoRec supports the recovery of many different file types, but we will focus on jpg image recovery. PhotoRec works in Windows, Mac and Linux and is a useful tool for automating data recovery on suspect disk images.
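PhotoRec itself is menu-driven, but you can launch it against an image with logging and a preset output directory (a minimal sketch; the filenames are placeholders):

<pre>photorec /log /d recovered_files image.dd   # open image.dd with logging and a preset recovery directory
</pre>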

<div class="separator" style="clear: both; text-align: center;"><div class='embed-container'><iframe src='https://www.youtube.com/embed//PTbgDEhqx1k' frameborder='0' allowfullscreen></iframe></div></div>

~1 min read

[How To] Forensic Acquisition in Windows - FTK Imager

In this video we show how to do a forensic acquisition of a suspect disk using FTK Imager in Windows.

<div class="separator" style="clear: both; text-align: center;"><div class='embed-container'><iframe src='https://www.youtube.com/embed/TkG4JqUcx_U' frameborder='0' allowfullscreen></iframe></div></div></div>

~1 min read

[How To] Forensic Acquisition in Linux - Guymager

This video shows how to acquire a forensic disk image of a suspect device in Linux using Guymager. Guymager is an extremely fast digital forensic imaging tool (the fastest in our experiments). It allows for the acquisition of many types of devices to RAW and Expert Witness formats.

<div class="separator" style="clear: both; text-align: center;"><div class='embed-container'><iframe src='https://www.youtube.com/embed//mqHx7HutQLo' frameborder='0' allowfullscreen></iframe></div></div>

~1 min read

[How To] Forensic Acquisition in Linux - DCFLDD

This video shows how to use DCFLDD to acquire a disk image from a suspect device on the Linux command line. DCFLDD is an expanded version of ‘dd’ that supports additional features useful for digital forensic acquisitions.
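A typical acquisition command looks something like this (a sketch; the device and filenames are placeholders):

<pre>sudo dcfldd if=/dev/sdb of=suspect.dd bs=4096 \
    hash=md5,sha256 hashlog=suspect.hashlog \
    conv=noerror,sync    # hash on the fly, log the hashes, and keep going past read errors
</pre>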

<div class="separator" style="clear: both; text-align: center;"><div class='embed-container'><iframe src='https://www.youtube.com/embed//5zSnCeaK-80' frameborder='0' allowfullscreen></iframe></div></div>

~1 min read

[How To] Forensic Data Acquisition - Hardware Write Blockers

In this video we will show external write-blockers and describe how they are used to prevent writing data to suspect devices. We will talk about bottlenecks in the connection and how to make sure your acquisition is as fast as possible.

<div class="separator" style="clear: both; text-align: center;"><div class='embed-container'><iframe src='https://www.youtube.com/embed/7eT8KSHMGFw' frameborder='0' allowfullscreen></iframe></div></div></div>

~1 min read

[How To] Copy a disk image to a physical hard drive using DD

In this video we show how to copy a disk image to a physical hard drive using DD in Linux. This is useful for working with live disk images (Linux live CDs), or potentially copying suspect data to an external disk for testing.
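The command itself is short (a sketch; triple-check the target device name, since dd will overwrite it without asking):

<pre>sudo dd if=image.dd of=/dev/sdX bs=4M   # write the image to the physical drive (destructive!)
sync                                    # flush buffers so the data is actually on disk
</pre>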

<div class="separator" style="clear: both; text-align: center;"><div class='embed-container'><iframe src='https://www.youtube.com/embed//N17rPCj9ye8' frameborder='0' allowfullscreen></iframe></div></div>

~1 min read

[How to] Create and verify a multi-part disk image with FTK Imager

This video shows how to make a disk image using FTK Imager on a Windows system.

FTK Imager is an easy-to-use tool for copying data from suspect disks, and it has other functions such as verification features and a hex view. It is a simple, stable tool that is useful at the beginning of an investigation.

<div class="separator" style="clear: both; text-align: center;"><div class='embed-container'><iframe src='https://www.youtube.com/embed//-jtRS7RTeoA' frameborder='0' allowfullscreen></iframe></div></div>

~1 min read

[How-to] Load a multi-part disk image into FTK Imager

When working with multi-part disk images, it can be difficult to tell whether your tool has loaded the entire image or just one part. Below is one way to determine whether FTK Imager has loaded all of your disk image or only the first part.

<div class="separator" style="clear: both; text-align: center;"><div class='embed-container'><iframe src='https://www.youtube.com/embed//bW7BBcbl_Vc' frameborder='0' allowfullscreen></iframe></div></div>

<h3>Verifying your disk image</h3>When working with your disk image, verification of the data should always be included in your workflow. In the case of a multi-part image, we should have at least two hashes:

<ul>
<li>A hash for the total disk image</li>
<li>A hash for each part of the disk image</li>
</ul>

This is especially true for raw disk images, since they have no built-in checksum like the Expert Witness Format.

A hash for the total disk image is normally created by your acquisition tool and can be found in the acquisition report. FTK Imager does not create a hash for each part of a multi-part image. In this case, we may need to generate our own hashes using FTK Imager or another tool.

<h4>Why do I need hashes for each part?</h4>If you have a hash value for the overall disk image, then - in terms of court - you will be able to show that the suspect data has not changed from the time the disk was first acquired. However, having hashes of each part of the image can help in one major way.

The Expert Witness Format that EnCase uses has a checksum every 32KB, which enables verification of parts of a disk image. If one part of a disk image changes, we can potentially still use the other parts of the image that can be verified with their checksums, even if the overall hash cannot be verified.

With a multi-part RAW image, we can get similar functionality by hashing each part. Each part can then be verified, along with the overall hash. If the overall hash is not valid, the hashes of each part can be used to determine which part has changed. Other parts that still verify may still be used.

<h3>Loading a multi-part image</h3>When many tools load a multi-part image, they may only show the filename of the first part of the image. If the tool is made ‘for forensics’, then it will likely load the entire image under the first filename. In this case, verify that the tool can:

<ol>
<li>Detect the full size of the original disk image</li>
<li>Generate the correct hash value for the original image</li>
</ol>
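For a multi-part raw image, per-part and overall hashes can be generated from the command line (a sketch, assuming parts named image.001, image.002, and so on):

<pre>md5sum image.0* > part.hashes    # hash each part individually
cat image.0* | md5sum            # hash the reassembled image as a whole
</pre>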

1 min read

How to print a double-sided PDF booklet with a single-sided printer

I only very rarely need to print something. However, printing things like grade reports and student schedules can come in handy. Since we don’t have a community printer, I bought a simple single-sided, black-and-white laser printer from Samsung. I turned off most of its connectivity features; Google Cloud Print, however, is surprisingly useful for printing from my phone or from outside the office.

Beyond printing schedules every now and then, I would like to print research articles (PDFs) from journals and read them on the train. My Galaxy Note II makes reading PDFs possible, but not great. eBooks are much better… when will journals provide ebooks?

So the goal is to print booklets from PDFs with a single-sided printer.

By booklet, I mean taking a normal A4 sheet of paper, holding it in landscape, and folding it in half to form a book with the ‘spine’ at the fold. There are 2 pages on each side of the sheet, and we want to print on both sides, so 4 pages per sheet of paper. The trick is page ordering.

<div class="separator" style="clear: both; text-align: center;"></div>

Some software has ‘booklet’ mode when printing. In LibreOffice if you click on ‘File-> Print’ and select the “Page Layout” tab, there will be a “Brochure” option. This option will automatically order pages into a small booklet style. If you have a double-sided printer, congrats, you are done. If you have a single-sided printer, select “Page Sides->Include Front Sides / Right Pages”. Then print, and put the paper back in the printer. For my printer, the paper prints on the top so I should keep the pages in the same rotation, and put the blank sides up.

<div class="separator" style="clear: both; text-align: center;"></div>

“Booklet” options are easy to use if you are creating your own documents, but I want to print already created PDFs. I heard that Adobe Reader has a booklet mode, but I am on Linux and don’t use Adobe Reader.

My default reader is “Document Viewer - Evince”, and it does not have a booklet feature.

I came across the program “pdfbook” which basically rearranges PDF pages for you so you can print booklets. If you use pdfbook like so:

<pre>pdfbook journal-paper.pdf</pre>
It will output a PDF with 2 pages per sheet, but some of the pages are flipped upside down. I think this is intended for a double-sided printer. To be able to print with a single-sided printer, we need to use the option ‘--short-edge’.

<pre>pdfbook --short-edge journal-paper.pdf</pre>
This will rearrange the pages with 2 pages per sheet, and all are facing upwards. There is just one more thing to do to print the booklet.

When printing, go to ‘Print -> Page Setup Tab’ and choose ‘Only Print -> Even Sheets’.  Make sure that your printer says 1 page per sheet. If you print 2 pages per sheet, you will have 4 ‘pages’ on one side of the paper.

After printing the even sheets, take the paper out of the printer. If you put the printed pages back in the printer in the order they are now, the first page will be on the bottom. We need to reverse the current order.

With the printed side facing you, put the first sheet on the table. Now put the next sheet on top of the first sheet with the printed side still facing you. Continue with the remaining sheets.

Once the sheets have been reordered, put the paper back in your printer with the printed side facing down (might be different on your printer). Now go to ‘Print -> Page Setup Tab’ and choose ‘Only Print -> Odd Sheets’.

You will need to determine how to feed the paper into your printer, but this is the method that works for mine.

If someone were printing a lot, I might recommend getting a double-sided printer, but since I am printing less than one paper per month, this method works for me. It makes a nice little - but not too little - booklet, and saves toner and trees.

3 min read

[How-To] Installing thc Hydra in Ubuntu

The steps below show how to easily install thc Hydra in Ubuntu with the majority of the libraries required for common tasks. Hydra is a pretty well-known remote authentication service brute-force cracker. It can perform rapid dictionary attacks against more than 30 protocols, including telnet, ftp, http, https, smb, several databases, and much more. I usually use it to test web forms on apps I’m making.

Please note: the main thc-Hydra website has been flagged as malicious. Do not visit it on your main system. All of the links on this page go straight to the source on GitHub.

<div class="separator" style="clear: both; text-align: center;"></div>
First you need to install git and tools to build the code. We will use this to get the source for thc-Hydra, and to update it from time to time.

<pre>sudo apt-get install git build-essential</pre>
Next, we need to get the source for thc-Hydra from github:

<pre>cd /opt/
git clone https://github.com/vanhauser-thc/thc-hydra.git
sudo chown -R [your username] thc-hydra
sudo chmod -R 755 thc-hydra
</pre>
Now you should have the source code, but if you install now, it will most likely be missing a lot of the libraries it needs. Install some of the most common libraries from the packages below:

<pre>sudo apt-get install zlib1g-dev libssl-dev libidn11-dev libcurses-ocaml-dev libpcre3-dev libpq-dev libsvn-dev libafpclient-dev libssh-dev
</pre>
Now try configuring Hydra:

<pre>./configure</pre>
Check the output and see if any libraries are still missing. Then build and install:

<pre>./configure
make -jX
sudo make install
</pre>
In “make -jX”, X is the number of processors your system has.

Now hydra should be installed. Type which hydra to see the install location. You can test ssh with a password list with the following command:

<pre>/usr/local/bin/hydra -l root -P PW.list -f -s 22 -t 4 -e ns 127.0.0.1 ssh
</pre>

1 min read

[How-To] Using GnuPG to verify data using detached signatures


Many software downloads come with a signature file. You normally need to download this signature file separately. Signatures are a great way to let people know that you are the person / company that is making the software available, and that no one else has changed the data since its release.
(Fig 1: Tails ISO and signature file download links with SHA256 checksum)

We are going to use Tails Linux as an example. On their download page, you will find a link to download the Tails ISO image. This is the data we are interested in running. Think of it as the main program that we want to install / use.

Next, we are given a link to the “Tails 1.4 signature”. This is the signature file that the distributor created. With this signature we can verify that the Tails ISO image has not been modified by anyone else.

Tails also provides a “SHA256 Checksum”. This is a less rigorous way than signatures to verify that the data has not changed.

First, download the ISO file AND the signature file. The signature file will almost always end with “.sig”. Make sure both files are in the same directory.

Once you have both files, open the command line / terminal and navigate to that directory. Next we need to use gpg to verify the signature. If we try to verify now, we may get the following results:
</div><pre>gpg2 –verify tails-i386-1.4.iso.sig gpg: assuming signed data in ‘tails-i386-1.4.iso’
gpg: Signature made Tue 12 May 2015 02:56:27 AM KST using RSA key ID 752A3DB6
gpg: Can’t check signature: No public key
</pre>
In this case, we also need to get the public key of the person that created the signature. From the Tails website, I found the ID of their signing key, so now we need to import it.

<pre>gpg2 --recv-keys A490D0F4D311A4153E2BB7CADBB802B258ACD84F
gpg: key 58ACD84F: public key "Tails developers (offline long-term identity key)" imported
gpg: 3 marginal(s) needed, 1 complete(s) needed, PGP trust model
gpg: depth: 0 valid: 2 signed: 0 trust: 0-, 0q, 0n, 0m, 0f, 2u
gpg: next trustdb check due at 2017-01-09
gpg: Total number processed: 1
gpg: imported: 1
</pre>
Make sure we have the right key:

<pre>gpg2 --list-keys
pub rsa4096/58ACD84F 2015-01-18 [expires: 2016-01-11]
uid [ unknown] Tails developers (offline long-term identity key)
sub rsa4096/752A3DB6 2015-01-18 [expires: 2016-01-11]
sub rsa4096/2F699C56 2015-01-18 [expires: 2016-01-11]
</pre>
Now verify the signature again:

<pre>gpg2 --verify tails-i386-1.4.iso.sig
gpg: assuming signed data in 'tails-i386-1.4.iso'
gpg: Signature made Tue 12 May 2015 02:56:27 AM KST using RSA key ID 752A3DB6
gpg: Good signature from "Tails developers (offline long-term identity key)" [unknown]
gpg: WARNING: This key is not certified with a trusted signature!
gpg: There is no indication that the signature belongs to the owner.
Primary key fingerprint: A490 D0F4 D311 A415 3E2B B7CA DBB8 02B2 58AC D84F
Subkey fingerprint: BA2C 222F 44AC 00ED 9899 3893 98FE C6BC 752A 3DB6
</pre>
Here we can see when the signature was made, and the ID of that key. Next we see “Good signature” which means that the signature does verify the data.

Remember, we were given the SHA256 value of the ISO file. Get the SHA256 hash with the following command (Linux):

<pre>sha256sum tails-i386-1.4.iso
339c8712768c831e59c4b1523002b83ccb98a4fe62f6a221fee3a15e779ca65d tails-i386-1.4.iso
</pre>
Now we can compare this hash value to the one on the website, and we see that they are the same.
<h3>If I can just check the hash value, why verify with a signature?</h3>

Hash values do allow you to make sure that the data has not changed; however, there are a number of weaknesses. For example, someone intercepting your network traffic could deliver the web page to you with an altered ISO link AND an altered hash value on the page. This means that the hash value will be valid, but the source of the information cannot be trusted.

Signatures help with this in a number of ways. Because the signature is generated by a developer’s private key, and we are verifying it with their public key, it is nearly impossible for someone to pretend to be the developer. Also, since we did not download the public key from the webpage, but looked it up on a different server, it is slightly more difficult for someone to trick us into downloading the wrong key. Further, we can try to use the Web of Trust to make sure we are getting the right key. In our case, we can see who has signed this key by checking a keyserver.

3 min read

Clearing USB disk read cache for testing and forensics in Linux


When copying data from USB devices in Linux (Debian / Ubuntu), you may have noticed that reading data from the disk the first time takes a while, and reading the second time takes only a few seconds.
<div>
</div><div>For example:</div><div>
</div><pre>[email protected] /media/joshua/ucdntfs $ time sudo md5sum /dev/sdf1
3a698f0c3155e494274e5e7829f4d246 /dev/sdf1

real 2m58.620s
user 0m7.032s
sys 0m1.429s

[email protected] /media/joshua/ucdntfs $ time sudo md5sum /dev/sdf1
3a698f0c3155e494274e5e7829f4d246 /dev/sdf1

real 0m3.467s
user 0m3.285s
sys 0m0.181s
</pre>
Here the first read took 2 minutes 58 seconds, while the second took only 3 seconds. This is because all data on the disk is cached to memory when it is read the first time. In cases where the disk may change between reads, caching may return results that are not consistent with the current state of the disk (like a hash).

When looking into how to disable the read cache, I found a lot of information about disabling the write cache, but not much about disabling the read cache.

To disable write cache (if supported) for the current session that the device is plugged in:

<pre>sudo hdparm -W 0 /dev/[device]
</pre>
But this does not solve our read cache problems. Unfortunately, I could not find a way to completely disable read cache, but we can clear the cache buffer.

First, determine the path to echo with

<pre>which echo</pre>
Then we want to tell the kernel to drop caches. To do this, we need to echo a value to /proc/sys/vm/drop_caches.
<blockquote class="tr_bq"><pre style="white-space: pre-wrap; word-wrap: break-word;">To free pagecache:
echo 1 > /proc/sys/vm/drop_caches
To free reclaimable slab objects (includes dentries and inodes):
echo 2 > /proc/sys/vm/drop_caches
To free slab objects and pagecache:
echo 3 > /proc/sys/vm/drop_caches</pre></blockquote>
So our echo command to clear all caches should look like:

<pre>sudo sh -c "/bin/echo 3 > /proc/sys/vm/drop_caches"
</pre>
Note: you probably cannot echo directly to drop_caches with sudo - the redirection happens as your user, not as root. The work-around is to wrap the whole command in a shell run through sudo. Make sure you use the full path to echo on your system.

<pre>[email protected] /media/joshua/ucdntfs $ time sudo md5sum /dev/sdf1
3a698f0c3155e494274e5e7829f4d246 /dev/sdf1

real 3m18.294s
user 0m6.389s
sys 0m1.390s

[email protected] /media/joshua/ucdntfs $ sudo sh -c "/bin/echo 3 > /proc/sys/vm/drop_caches"

[email protected] /media/joshua/ucdntfs $ time sudo md5sum /dev/sdf1
3a698f0c3155e494274e5e7829f4d246 /dev/sdf1

real 3m18.344s
user 0m6.545s
sys 0m1.438s
</pre>
If you use it a lot, like me, you might want to make an alias:

<pre>alias clearusbcache="sudo sh -c '/bin/echo 3 > /proc/sys/vm/drop_caches'"
</pre>
If you want to clear the cache in the background while running experiments, you can try this script:

<pre>#!/bin/bash
while true; do
/bin/echo 3 > /proc/sys/vm/drop_caches
sleep 1
done
</pre>

1 min read

Modifying Javascript Variables Real Time with Chrome aka AdBlock Detection Subversion

Sometimes you may want to see what scripts a website is trying to run on your system. Other times you may want to be able to not only watch, but also modify javascript variables.

<div class="separator" style="clear: both; text-align: center;"></div>Doing this with Google Chrome is relatively easy. After opening chrome, open “Developer Tools” either from Menu -> More tools -> Developer tools or with ctrl+shift+I.

From the top of the developer tools menu, choose “Sources”, and the source file you would like to look at. From here you should be able to see what the website is trying to run (just like view source with more information).

From here there are many things that you can do. For a more comprehensive list, please see the Chrome DevTools Overview.

Setting Javascript Variables
One of the most useful things about chrome developer tools is the ability to set javascript variables.

An example that I find often is sites using javascript to detect adblock software. When doing research on sites that might be malicious, you may want to access all features of the site without enabling ads (or other potentially malicious stuff). However, some sites will redirect or do other tricks when ad blocker software is detected.

Example adBlock detection script (real code I found at a random site):

<pre>var isBlockAds = true;
$(function () {
    setTimeout('DoDetect()', 3000);
});

function CheckAdImage(elem) {
    if (elem.is(":visible")) {
        isBlockAds = false;
        elem.hide();
    }
}

function DoDetect() {
    CheckAdImage($('#adElement'));
    if (isBlockAds) {
        // redirect to new page
    }
}
</pre>
This javascript code is doing a few things. First the variable “isBlockAds” is set to true. The next function runs the “DoDetect” function after 3 seconds. DoDetect calls CheckAdImage with the element identifier to check. If the element is visible on the page (not blocked) then isBlockAds is set to false. If isBlockAds is true at the end of the process, then redirect the page to somewhere else.

The end result is that if you are blocking ads, then you can use the page for a short period, but then are redirected to a new, possibly malicious page.

So how can we get around this? With the chrome developer tools do the following:

<ul>
<li>Open the target webpage.</li>
<li>Under the list of sources, hit the pause button.</li>
<li>Under “Console” at the bottom, there is a white box with a > character. You can type here.</li>
<li>In the console you can set global variables. Here we will set our variable isBlockAds to false.</li>
<li>If the variable has already been defined, developer tools should autocomplete it. If not, you can set it. Note: in our example, if we set isBlockAds before it is defined in the page, it will be reset to true later.</li>
</ul>

That is it. In this case, hitting the pause button allows us to find the variable we want to change before the page redirects. We can then use the console to check and set any variables we want.
For example, if we want to see the value of a random variable we can do something like:

<pre>console.log(iswhatsappCustomButton);
false
</pre>

Check out the Chrome DevTools Overview for much more information.



2 min read

John the Ripper shared library error path fix on Linux

If you are using John the Ripper with CUDA, and you start to see errors like:
<blockquote class="tr_bq">./unshadow: error while loading shared libraries: libcudart.so.6.5: cannot open shared object file: No such file or directory</blockquote><div>First, check your paths. An example .bashrc might look like (64bit system):</div><div>
export CUDA_HOME=/usr/local/cuda
export LD_LIBRARY_PATH=${CUDA_HOME}/lib64:$LD_LIBRARY_PATH
PATH=${CUDA_HOME}/bin:${PATH}
export PATH

<div>
If you are still getting the shared library error, then check /etc/ld.so.conf.
Add the path to your cuda library: /usr/local/cuda/lib64
Then run sudo ldconfig</div>
Now try running unshadow again, and the problem should be fixed.
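One way to make that change from the shell (a sketch; this appends the path system-wide, so only run it once):

<pre>echo "/usr/local/cuda/lib64" | sudo tee -a /etc/ld.so.conf
sudo ldconfig    # rebuild the shared library cache
</pre>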

<div class="separator" style="clear: both; text-align: center;">
</div><div class="separator" style="clear: both; text-align: center;"></div>
</div>

~1 min read

Attacking Zip File Passwords from the Command Line

There was recently a question on SuperUser, linking back to CybercrimeTech’s article about cracking passwords, about zip files that use ZipCrypto, where the password was never found. I left an answer, saying that I guess zip2john does not know how to accurately extract the hash from zip files using that particular algorithm.

<div class="separator" style="clear: both; text-align: center;"></div>In such a case, you can either 1) figure out the data structure, and update zip2john (https://github.com/magnumripper/JohnTheRipper), or use the same approach that we have used before with LUKS to attack the file directly from the command line.

Definitely, attempting to crack the hash is faster, but if you are stuck and don’t have time to reverse engineer a new file type, this would eventually work for you.

See the code below for an example of having John generate passwords and passing them to 7-Zip to try. This should work regardless of the chosen encryption, unless you have to specify it when opening the archive. It is not clean, but it should be enough to illustrate the idea.

    #!/bin/bash
    # Using john the ripper to brute-force a zip container
    startTime=$(date)
    if [ "$(file "$1" | grep -c 'Zip archive data')" -gt 0 ]; then
        john -i --stdout | while read -r i; do   # john generates candidate passwords on stdout
            echo -ne "\rtrying \"$i\" \r"
            7z e "$1" -p"$i" -so > /dev/null      # try extracting; data is discarded, 7-Zip status messages still print
            STATUS=$?
            if [ $STATUS -eq 0 ]; then
                echo -e "\nPassword is: \"$i\""
                break                             # if successful, print the password and quit
            fi
        done
        echo "Start time $startTime"
        echo "End time $(date)"
    else
        echo "The file does not appear to be a zip file"
    fi


This approach should work when you are unable to extract the hash, but is much, much slower (not really practical for most applications). See the results below.

    ...
    trying "pmc"
    7-Zip [64] 9.20  Copyright (c) 1999-2010 Igor Pavlov  2010-11-1
    Processing archive: test.zip
    Extracting  Sample_memo.pdf     Data Error in encrypted file. Wrong password?
    Sub items Errors: 1
 
    trying "1234"
    7-Zip [64] 9.20  Copyright (c) 1999-2010 Igor Pavlov  2010-11-18
    Processing archive: test.zip
    Extracting  Sample_memo.pdf
    Everything is Ok
    Size:       60936
    Compressed: 51033
 
    Password is: "1234"
    Start time 2015. 01. 03. (토) 19:02:51 KST
    End time 2015. 01. 03. (토) 19:02:51 KST

2 min read

[How To] Installing LIBEWF in Ubuntu Trusty

Installing LIBEWF is normally straightforward. Usually the most difficult part is remembering which packages are required for the dependencies. When running configure, I always like to have the “support” dependencies filled out. While some of these are not strictly necessary, you may find yourself needing them someday and having to recompile.

<div class="separator" style="clear: both; text-align: center;"></div>On a almost-brand-new install of Ubuntu Trusty (64bit), these are the required packages:

apt-get install build-essential autoconf automake libfuse-dev uuid-dev libbz2-dev zlib1g-dev

Then you just download LIBEWF, untar, and run ./configure. All dependencies should be filled out.

From there it is just a simple make to start working with forensic file formats.
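Put together, the whole build is along these lines (a sketch; substitute the version you actually downloaded):

<pre>tar -xzvf libewf-[version].tar.gz
cd libewf-[version]
./configure          # all 'support' dependencies should now be found
make
sudo make install
sudo ldconfig        # refresh the linker cache so the new library is found
</pre>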

~1 min read

[How To] Easy Install TexStudio on Ubuntu

I mess around with the internals of my operating systems a lot. This means that every few months I need to re-install my operating system, which, lately, is almost always Linux Mint. This also means that I have to remember most of the packages I have installed.
Out of all the software I normally use, LaTeX is usually one of the most difficult to remember. However, I have found a package combination that gets most - if not all - of the packages I would normally use for LaTeX.

First, I use TeXstudio as a graphical front-end for LaTeX. Thankfully, they provide .deb packages on their site, so download the latest version. There is a version in the apt repository, but it is older.

After downloading and installing TeXstudio, you may notice that LaTeX, BibTeX and all the other packages you likely need to produce documents are unavailable. Almost everything can be found in the following packages:

<pre>apt-get install texlive texlive-latex-extra texlive-latex-recommended latex-xcolor pgf</pre>

If you prefer things like Biber, it can also be found in the repository. However, it will likely take more tweaking to get working properly.

That’s all it takes to get a fully functional LaTeX editor.

~1 min read

[How to] Brute forcing password cracking devices (LUKS)

We have written in the past about how to crack passwords on password-protected RAR and ZIP files, but in those cases someone wrote a program to extract the password hashes from the RAR and ZIP files first. After that, we could use John the Ripper to generate passwords (or use a dictionary) to attack the password hashes. In this case, John the Ripper generates a password, then hashes the password with the same hashing/salting method as the hash we are attacking. If the hash matches, then we have the clear-text password.
However, in some situations we don’t have a password hash to attack, or maybe we don’t know what algorithm was used. Either way, we may not be able to extract a hash for whatever reason. In this case, we can still brute-force the password using the standard authentication mechanism.

In this tutorial, I will be brute-force attacking a LUKS encrypted file using John the Ripper. A LUKS encrypted file is similar to a TrueCrypt container. It could be used to encrypt a disk, partition, or file.

Let’s assume that an investigator extracted all files from a suspect disk. By looking at the file headers, we find the following:

<pre>encrypted: LUKS encrypted file, ver 1 [aes, xts-plain64, sha1] UUID: 811f8a08-85da-4f7d-b50f-3e64ed7a66f4</pre>
Maybe the investigator has no memory image of the suspect device or any other information regarding the file, but still needs to get into it….

To mount a LUKS file from the (Linux) command line, you have to use cryptsetup. The command usually looks like cryptsetup luksOpen container mountpoint. When you attempt to mount, by default it will ask for a password 3 times if the attempts are incorrect. We cannot pass the password directly to cryptsetup as an option; it will always prompt for a password.

Despite these challenges, cryptsetup does have some useful options, such as -T, which controls how many times it will ask for a password before giving up, and --test-passphrase, which checks whether the password works without actually opening the device. So right now we have the following command to open the LUKS device:

cryptsetup luksOpen $1 x --test-passphrase -T1


Now we can use John the Ripper in incremental mode sending the output to standard out to generate our password list.

john -i --stdout


Now we just need a small script to capture the output of JtR and test the cryptsetup password:

#!/bin/bash

# Using john the ripper to brute-force a luks container
startTime=$(date)
if [ "$(file "$1" | grep -c 'LUKS encrypted file')" -gt 0 ]; then
    john -i --stdout | while read -r i; do
        echo -ne "\rtrying \"$i\" \r"
        # run as root
        echo "$i" | cryptsetup luksOpen "$1" x --test-passphrase -T1 2> /dev/null
        STATUS=$?
        if [ $STATUS -eq 0 ]; then
            echo -e "\nPassword is: \"$i\""
            break
        fi
    done
    echo "Start time $startTime"
    echo "End time $(date)"
else
    echo "The file does not appear to be a LUKS encrypted file"
fi

This bash script first uses file to check whether the input is recognized as a LUKS encrypted file. If so, it runs JtR in incremental mode and outputs to stdout. A while loop is used to capture each generated password in the variable “i”. We then echo the password and pipe it to the luksOpen command. That way, when cryptsetup asks for a password, it will use the password we have piped to it. The status will be either 0 if the password worked or 2 if it failed. If the status is 0, we print the password and how long it took. If the status is not 0, we get a new password and try again.

Now, if you start to run this script you may notice that it is very, very, very slow. Generating passwords with JtR is relatively quick, but trying the passwords on a LUKS device is designed to be slow.

This process will likely take a very long time, but 1) it will eventually crack any type of device and 2) it can be used when you have no other option.

Try to use the script to play around with other password-protected files/devices.

3 min read

[How-to] Cracking ZIP and RAR protected files with John the Ripper

After seeing how to compile John the Ripper to use all of your computer’s processors, we can now use it for some tasks that may be useful to digital forensic investigators: getting around passwords. Today we will focus on cracking passwords for ZIP and RAR archive files. Luckily, the JtR community has done most of the hard work for us. For this to work you need to have built the community version of John the Ripper, since it has extra utilities for ZIP and RAR files.

<div class="separator" style="clear: both; text-align: center;"></div>For this exercise I have created password protected RAR and ZIP files, that each contain two files.

<pre>
test.rar: RAR archive data, v1d, os: Unix

test.zip: Zip archive data, at least v1.0 to extract
</pre>The password for the rar file is ‘test1234’ and the password for the zip file is ‘test4321’.

In the ‘run’ folder of John the Ripper community version (I am using John-1.7.9-jumbo-7), there are two programs called ‘zip2john’ and ‘rar2john’. Run them against their respective file types to extract the password hashes:
<pre>
./zip2john ../test.zip > ../zip.hashes
./rar2john ../test.rar > ../rar.hashes
</pre>This will give you files that contain the password hashes to be cracked… something like this:
<pre>
../test.zip:$pkzip$221001ba80c95e4e9547dcfcde4b8b2f05a80aaeb9d15dd76e7526b81803c8bf7201bf7205131204401ba808cbafdd390bf49ea54064ab3ff9f486e6260b9854e37d1ee3a41c54*$/pkzip$
</pre>After, that you can run John the Ripper directly on the password hash files:
<pre>./john ../zip.hashes
</pre>You should get a message like: Loaded 1 password hash (PKZIP [32/64]). When run with no options, John uses its default order of cracking modes. See the examples page for more information on modes.

Notice that in this case we are not using explicit dictionaries. You could potentially speed up the cracking process if you have an idea what the password may be. Also look at your processor usage: if only one processor is maxed out, then you did not enable OpenMP when building. On a multi-processor system, OpenMP greatly speeds up the cracking process.
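For example, to try a dictionary with word-mangling rules before falling back to brute force (a sketch; password.lst is the wordlist that ships in John's run directory):

<pre>./john --wordlist=password.lst --rules ../zip.hashes   # dictionary attack with rules
./john -i ../zip.hashes                                # incremental (brute-force) mode
</pre>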

Now sit back and wait for the cracking to finish. On a 64-bit quad-core i7 system, without using the GPU, and while doing some other CPU-intensive tasks, the password was cracked in 6.5 hours.
<pre>
Loaded 1 password hash (PKZIP [32/64])

guesses: 0 time: 0:00:40:29 0.00% (3) c/s: 2278K trying: eDTvw - ekTsl
guesses: 0 time: 0:01:25:10 0.00% (3) c/s: 1248K trying: ctshm#ni - ctshfon9
guesses: 0 time: 0:02:56:40 0.00% (3) c/s: 1499K trying: BR489a - BR48jf
guesses: 0 time: 0:03:56:04 0.00% (3) c/s: 1703K trying: fjmis5od - fjmidia0
guesses: 0 time: 0:04:46:09 0.00% (3) c/s: 1748K trying: Difg1ek - DifgbpS
guesses: 0 time: 0:05:21:22 0.00% (3) c/s: 1855K trying: btkululp - btkulene
guesses: 0 time: 0:06:02:43 0.00% (3) c/s: 1857K trying: ghmnymik - ghmnyasd
test4321 (../test.zip)
guesses: 1 time: 0:06:32:34 DONE (Mon Jul 28 17:50:22 2014) c/s: 1895K trying: telkuwhy – test43ac
</pre>Now if you want to see the cracked passwords give john the following arguments:
<pre>
./john ../zip.hashes --show
</pre>It should output something like:
<pre>
../test.zip:test4321
1 password hash cracked, 0 left
</pre>Note: the hash file should have the same type of hashes. For example, we cannot put the rar AND zip hashes in the same file. But this means you could try to crack more than one zip/rar file at a time.

For the rar file it did not take nearly as long, since the password was relatively common. If you take a look at john.conf in the run directory, it has a list of the patterns it checks (in order). The pattern 12345 is much more likely than 54321, so it is checked first, resulting in a quick crack.
<pre>
Loaded 1 password hash (RAR3 SHA-1 AES [32/64])

guesses: 0 time: 0:00:00:10 1.38% (1) (ETA: Mon Jul 28 18:23:58 2014) c/s: 24.86 trying: rar.tsett - ttests

guesses: 0 time: 0:00:02:12 13.40% (1) (ETA: Mon Jul 28 18:28:19 2014) c/s: 25.98 trying: Test29 - Test2rar9

test1234 (test.rar)

guesses: 1 time: 0:00:17:03 DONE (Mon Jul 28 18:28:56 2014) c/s: 24.01 trying: test1234 - testrar1234

Use the "--show" option to display all of the cracked passwords reliably

</pre>

3 min read

[How-to] Compiling John the Ripper to use all your processors for password cracking

John the Ripper is a fast password cracker, currently available for many flavors of Unix, Windows, DOS, BeOS, and OpenVMS. Its primary purpose is to detect weak Unix passwords. Besides several crypt(3) password hash types most commonly found on various Unix systems, supported out of the box are Windows LM hashes, plus lots of other hashes and ciphers in the community-enhanced version.

Today we are going to show you how to compile John the Ripper to use all of your processors (we will talk about compiling for NVIDIA GPUs later).

First you should visit Openwall's site and download the John the Ripper source code. I recommend getting the community-enhanced version since it contains support for many other hashes and ciphers. As of this writing, the current version of the community edition is 1.7.9.

You also need to install a compiler and the SSL development headers. On Ubuntu systems, you can just install the build-essential and libssl-dev packages.
sudo apt-get install build-essential libssl-dev

Download and verify the version suitable for your platform. In this example I am compiling on Linux Mint 17 (Ubuntu Trusty).

Extract the tar:
tar -xvf john*.tar.gz

Enter the “src” directory inside the newly created directory:
cd john*/src

Important: This step enables parallel-processing in John using OpenMP.
nano Makefile

Remove the # before
OMPFLAGS = -fopenmp
and
OMPFLAGS = -fopenmp -msse2

Now save, and close.

Type “make | more” and choose the type of system that you are using. I am running a 64bit version of Linux, so I will choose linux-x86-64-native. If you have a 32 bit system, make sure to choose x86. If you don't know what to choose then “generic” will probably work for you.

Once you have edited the Makefile, and picked the system to compile for, then build the program:
make clean linux-x86-64-native

On multi-processor systems you can also add -j5 where 5 is the number of processors on your system.
make clean linux-x86-64-native -j5

Once the process is done – if it had no errors – then the binaries will be in the 'run' directory.
cd ../run

You can test it by running ./john --test

Troubleshooting


<div style="line-height: 100%; margin-bottom: 0in;">If there was an error building, try building “generic”. If it works, then you probably chose the wrong build options.</div>

1 min read

[How-to] Check if your system is vulnerable to the Heartbleed OpenSSL bug

The Heartbleed OpenSSL bug can leave a lot of systems open to exploitation. To see whether your system is vulnerable try the following.
*I am using Ubuntu, but if OpenSSL is installed on your system, the commands should be similar.

Open a terminal or command prompt. First, check your version of OpenSSL:

<pre>sudo openssl version -a</pre>

The command should output the OpenSSL version number.

(Fig: OpenSSL version on Ubuntu that is vulnerable to Heartbleed)

OpenSSL says you should upgrade to version 1.0.1g. If you manually installed OpenSSL, get the latest source and install it.

If you are on Ubuntu, you should also look at the “built on” date. If the date is on or after April 7th, then the patch has been applied. If the date is before April 7th, update your packages:

<pre>sudo apt-get update
sudo apt-get upgrade</pre>


<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody><tr><td style="text-align: center;"></td></tr><tr><td class="tr-caption" style="text-align: center;">Apt-get upgrade will likely want to upgrade a number of packages, many of which are potentially vulnerable to the attack.</td></tr></tbody></table>Once the upgrade is complete, the “built on” date should be on or after April 7th.

<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody><tr><td style="text-align: center;"></td></tr><tr><td class="tr-caption" style="text-align: center;">Patched OpenSSL installation on Ubuntu. Note, the version is still 1.0.1e - make sure to check the build date.</td></tr></tbody></table>Make sure you reboot your system to make sure the changes are applied.

Many vulnerable products will likely be pushing out updates soon. Make sure you update all your devices, including mobile phones and routers.

1 min read

Installing Cinnamon 2.0 on Linux Mint 14

With only a few weeks (hopefully) until Linux Mint 16 is released, I have been installing different software that I may want to start using. With all my data backed up and ready to migrate, this is essentially a few weeks to experiment with different programs to see how well they work.


The major consideration for me is Cinnamon 2.0 - the Gnome-replacement desktop that ships with Linux Mint. Cinnamon 2.0 has a lot of features and fixes that I have been looking forward to that were not pushed down to my current version. The other major piece of software I am looking for is a good desktop search tool.

If you are already running Linux Mint with a prior version of Cinnamon, it will not be upgraded automatically to 2.0. You can upgrade the following way:
<blockquote class="tr_bq">sudo add-apt-repository ppa:gwendal-lebihan-dev/cinnamon-stable</blockquote>Then add the following to /etc/apt/preferences
<blockquote>Package: *
Pin: release o=LP-PPA-gwendal-lebihan-dev-cinnamon-stable
Pin-Priority: 800 </blockquote>Do an apt-get update and an apt-get dist-upgrade.
For me on Linux Mint 14, the package ‘mint-translations’ did not want to install, so I had to re-run apt-get dist-upgrade with ‘-f’.
<blockquote class="tr_bq">sudo apt-get dist-upgrade -f</blockquote>Once installed, reboot.

On reboot, the theme and background were changed, but all my settings were saved. Even conky still worked fine. Date and time settings were reset, however, and there is no app to change them back; the time and calendar still work, but you have to configure them manually.

The first thing I noticed, however, is that Cinnamon 2.0’s default settings are not as pretty as 1.8. To change it back to Mint’s old Cinnamon look (which is more sleek IMO), right click on menu -> ‘system settings’ -> ‘themes’ -> ‘other settings tab’. From there, I changed the controls, icons, etc. to ‘Mint-X’. From the ‘installed’ tab, you can also change the applied theme.

Overall Cinnamon 2.0 is much more responsive than 1.8. Even without reformatting I am seeing noticeably better performance. The look and feel is a bit more clean and polished.

1 min read

Sending a manually encrypted email using a public key

Some web-based email services don’t have an encryption client available, but if you still want to be able to encrypt an email using someone’s public key, you can do it in the following way.

First, get GPG.

<ul>
<li>There is GPG4Win for Windows: http://www.gpg4win.org/ (I’ve not used it, but I assume there is a command line client)</li>
<li>GPGTools for Mac: https://gpgtools.org/ (I’ve used it and liked it very well)</li>
<li>For Linux there are a few different options, so see this for more information.</li>
</ul>

Once you have GPG installed, you should be able to run the command ‘gpg’ from a terminal.

*note - On Windows or Mac the commands may be slightly different than what is shown here.

For this tutorial, you will be encrypting an email using someone’s public key. I will not show you how to create your own keys in this post.

Once you can run gpg, we need to try to find the public key of the person you will send the message to. If they have uploaded their key to a public key server you can query the server with
<blockquote class="tr_bq">gpg –list-keys [email protected]</blockquote>You can then import the key you have found using –recv-keys with the key’s ID. For example, my key looks like: pub   2048R/606B15C4 2013-01-09 [expires: 2017-01-09]. The key’s ID is 606B15C4.
<blockquote class="tr_bq">gpg –recv-keys 606B15C4 </blockquote>Once we have successfully imported the key, we can use it to encrypt messages that only the holder of the associated private key can decrypt. So, for example, if you use my public key to encrypt a file, then only my private key can be used to decrypt the file.

Likewise, if I use my private key to encrypt a file, then only my public key can be used to decrypt the file. This is a good way to show that you are the originator of the information.

OK - so let’s make some text to send in our email. I will create a text file as a simple file container.
<blockquote class="tr_bq">echo “Question: Can security students encrypt their email?” >> encryption_test.txt
echo “Hypothesis: Security students are too lazy to encrypt their email.” >> encryption_test.txt </blockquote>So now I have a plain text file called “encryption_test.txt” with two lines in it.

(In Linux) I can read the contents of the file using ‘cat’, and pipe the output into gpg for encryption.
<blockquote class="tr_bq">cat encryption_test.txt
gpg –encrypt –armor -r 606B15C4</blockquote>The first command takes the contents of the file “encryption_test.txt” and send it to gpg, which encrypts the text with the public key 606B15C4. The output is as follows:

-----BEGIN PGP MESSAGE-----
Version: GnuPG v1.4.11 (GNU/Linux)

hQEMA999TJtS1eU1AQgAn09ngR2pGDGCfFH0aKUazQInd+HtYqqFq6UmFyszb5RX
OS5mWB3gEKMh3Ki0Qvgs2tA9nBx1+uzyxvkLoXrLwkfVg7mu821bjkh9OMsFO9hb
EeKkZjAAHHcf1rkpWKgc9mcmYyaNBEexyDJy648osWiGyDfedVMKog0k44NFf9iG
aBNsrrqmS+S6SllCgUhbMtkJYB4+BbCBIDA0laWQF+wcKGIsQfk3XxYdJUAfTFIK
vJwKyyPd9JBwG70XYaycBVIPMYhflmNB0MKVV65iQXQslN0dlTMu9CZGRYDJ4IMD
eC89NDsSVwVBWGPxo76cE19McXC5EHQvEBZk9XUIY9KQAaqZOu8vJlz5kehprSnm
OsYY24fuvsbZ4tglgQQ2FC7QLwB+rMLXNWPXor/EjUUYfeRAtbADfpFgNX/Kyzf8
9CxHSRO0ARj/caQQHCjfothIcR6J+vSGe5z+QA9IG9LhvRYOQkbyqoz9ZxRmBgrG
zD++fpC+hgv56WRHe5BzSBq4iaYJ+M1FGb5YbpCBzrPI
=Umsn
-----END PGP MESSAGE-----


If I copy everything from -----BEGIN PGP MESSAGE----- to -----END PGP MESSAGE----- and paste it into the body of my email, and send it to the person who owns the private key, they should be able to decrypt it.

2 min read

Using XARGS to speed up batch processing

[Edit 24/7/2013] Be careful when using xargs to spawn multiple processes that write to the same file. I’ve been using it with md5sum and piping the output to a file. In data sets with a large number of files, several hundred hashes would be missing from the final file. A more reliable way is to use md5deep.


In our last post we compared md5sum, md5deep and openssl md5 to determine which is fastest at calculating hashes. For that experiment we were using 1 thread. Essentially, a file comes in, it gets hashed, then the next file comes in. With 1 thread, we saw 1 out of the 4 processors being used.

However, if we are processing a list of files, why not split the work into multiple jobs and use all of the processors? md5deep does this by default (4 threads), but splitting the jobs for any task is easily done with xargs.

xargs is a program that allows you to control arguments coming in from a pipe. For example, if we type ‘find -type f | md5sum’, then the list of found file names would be hashed as one glob of data. To give each file name to md5sum, instead of all file names at once, we can use xargs to control how the output of find is piped to md5sum. For example, ‘find -type f | xargs -n1 md5sum’ will feed one line at a time to md5sum, allowing md5sum to find the file and hash it.

You can also tell xargs how many threads to create using the -P switch. Since we have 4 processors, we will use 4 threads in this case.

Consider that in the last post, the times for hashing all files in a directory were as follows:
md5sum: real 3m46.117s
md5deep: real 3m57.595s
openssl: real 3m43.142s

(Example: ‘find -type f | xargs -P4 -n1 md5sum’)
Using xargs (and all of the processors), the real times can be reduced to:
md5sum: real 1m57.196s
md5deep: real 2m5.408s
openssl: real 1m56.084s

md5deep recursive, default threads (4), no xargs:
real 2m0.521s

Note: the user and system times are about the same as before, because the same amount of processor time is being devoted to the tasks; however, the work is spread across more processors, so the real time is much less.

Also, consider that all the files in the directory are different sizes, with the largest being 10GB. The majority of the time for each program was spent with 1 thread hashing the largest file, while the other processors had already finished hashing all the other files.
1 min read

md5sum vs. md5deep vs. openssl md5: MD5 calculation speed test

The following experiment is conducted to determine if md5sum, md5deep or openssl md5 hash calculations are faster than the others.

Methodology:
<div>Test 1: A directory of test files consisting of disk images and extracted Windows Registry files will be scanned with the command ‘find -type f | xargs -n 1 md5program’. The output of this command will be fed into each program, respectively. The time for the entire process to finish will be tracked with the time command. Xargs is being used to feed each line of output to the md5 program.</div><div>Test 1.1: After all programs have finished, the computer will be restarted, and the order the programs are run in will be changed. For example, if md5sum was the first program in the first round, it will be the second program in the second round. This attempts to account for any caching that takes place in the operating system.</div><div>
</div><div>The result will be 9 timed runs for calculating the md5 sum of all files in a directory.</div><div>
</div><div>Test 2: Big files - In test 1, multiple files of different sizes will be continuously fed to the hashing program. In test 2, a single 10 GB disk image will be given to each program and timed.</div><div>
</div><div>Test machine:</div><div>Intel(R) Core(TM) i7-3537U CPU @ 2.00GHz (4 cores)</div><div>8 GB RAM</div><div>Timing cached reads: 8892.40 MB/sec</div><div><div>Timing buffered disk reads: 403.98 MB/sec</div></div><div>
</div><div>Results:</div><div><table border="1" style="text-align: center;"><tbody><tr><td>md5sum, run 1</td><td>md5sum, run 2</td><td>md5sum, run 3</td></tr><tr><td>real 3m46.457s</td><td>real 3m46.117s</td><td>real 3m59.198s</td></tr><tr><td>user 0m42.183s</td><td>user 0m39.254s</td><td>user 0m42.235s</td></tr><tr><td>sys 3m3.595s</td><td>sys 3m6.300s</td><td>sys 3m16.096s</td></tr><tr><td>md5deep, run 1</td><td>md5deep, run 2</td><td>md5deep, run 3</td></tr><tr><td>real 3m57.595s</td><td>real 3m57.666s</td><td>real 4m2.255s</td></tr><tr><td>user 0m43.743s</td><td>user 0m43.567s</td><td>user 0m44.551s</td></tr><tr><td>sys 3m17.552s</td><td>sys 3m17.192s</td><td>sys 3m22.881s</td></tr><tr><td>openssl md5, run 1</td><td>openssl md5, run 2</td><td>openssl md5, run 3</td></tr><tr><td>real 3m48.130s</td><td>real 3m43.142s</td><td>real 3m50.619s</td></tr><tr><td>user 0m37.878s</td><td>user 0m37.462s</td><td>user 0m38.558s</td></tr><tr><td>sys 3m9.436s</td><td>sys 3m5.112s</td><td>sys 3m11.136s</td></tr></tbody></table>
Overall processor time:
Run 1, md5sum 3m45.77s; md5deep 4m01.29s; openssl 3m47.32s.
Run 2: md5sum 3m45.55s; md5deep 4m00.76s; openssl 3m42.57s.
Run 3: md5sum 3m58.34s; md5deep 4m07.43s; openssl 3m49.7s.

Average (rounded to second):
md5sum: 3m50s
md5deep: 4m03s
openssl: 3m46s

Test 1 Conclusions
While the ‘real’ time may be an interesting factor for investigators, what we are more interested in is the time that the program was using the processor. In this case ‘user’ is the time taken by the program in user mode, and sys is time in kernel mode. The basic idea is that we want a program that takes the least amount of time on the processor to do the same amount of work.

Note: from the data it appears that another process, perhaps an OS update, was taking place that affected all the programs.

Based on the average of the three runs, it appears that openssl is slightly faster, followed by md5sum and then md5deep.

It should be noted, however, that md5deep is not really being used in the way it was designed. For example, md5sum does not include a recursive mode, whereas md5deep does. If we run md5deep using its recursive mode - instead of find and xargs - then the time is actually slightly better than the others:

md5deep recursive - restricted to 1 thread
real 3m43.963s
user 0m41.327s
sys 3m2.103s

md5deep: 3m43s

So, in conclusion, if you are feeding a list of files into an md5 hash program, openssl appears to be a slightly better choice over md5sum and md5deep. However, if you can choose how the file list is ingested, md5deep is probably a better choice because of speed and available features.

Test 2: Testing the time for each program to hash a 10 GB disk image. This test will not use find and xargs - it will use the program directly. md5deep will be restricted to 1 thread.

<table border="1" style="text-align: center;"><tbody><tr><td>md5sum</td><td>md5deep</td><td>openssl md5</td></tr><tr><td>real 1m44.558s</td><td>real 1m49.491s</td><td>real 1m43.475s</td></tr><tr><td>user 0m19.369s</td><td>user 0m20.421s</td><td>user 0m18.233s</td></tr><tr><td>sys 1m24.877s</td><td>sys 1m30.534s</td><td>sys 1m24.889s</td></tr></tbody></table>
md5sum: 1m44s
md5deep: 1m51s
openssl: 1m43s

For a single large file, it appears that openssl is also the fastest, followed by md5sum and then md5deep.

Please note: all of these tests are, at best, quite shallow. A proper testing environment with many more runs should be conducted.
<hr />
Bonus observation: As set, xargs will send 1 line to the hashing program and spawn 1 process. This process seems to switch between processors, even while the same hash is calculating.
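If you want to check whether that processor switching costs anything, one option is to pin the hash job to a single core with taskset and compare the ‘time’ output against an unpinned run (bigfile.dd is a placeholder name):

time taskset -c 0 md5sum bigfile.dd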

</div>

3 min read

[How to] Install pHash on Ubuntu

pHash is an open source software library released under the GPLv3 license that implements several perceptual hashing algorithms, and provides a C-like API to use those functions in your own programs. pHash itself is written in C++.

From pHash.org: A perceptual hash is a fingerprint of a multimedia file derived from various features from its content. Unlike cryptographic hash functions which rely on the avalanche effect of small changes in input leading to drastic changes in the output, perceptual hashes are “close” to one another if the features are similar.

Ubuntu has a package libphash0 and libphash0-dev, but for this tutorial we will be installing from source.

Installation of pHash on Ubuntu (from source). This tutorial assumes you have build tools, like build-essential, installed.

Packages: apt-get install libavformat-dev libmpg123-dev libsamplerate-dev libsndfile-dev cimg-dev libavcodec-dev ffmpeg libswscale-dev

Next download pHash source from: http://phash.org/download/
Version at the time of this writing: 0.9.6

Extract the tar: tar -xvf pHash-0.9.6.tar.gz
Standard configure and make: ./configure && make && sudo make install

Remember this is just the library. For PHP bindings try this: http://useranswer.com/answer/compile-phash-on-centos-php-extension/
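Once the library is installed, a quick smoke test can confirm everything links. This is a sketch: it assumes the ph_dct_imagehash/ph_hamming_distance calls and the ulong64 type from pHash.h, that make install placed the library where the linker can find it (run sudo ldconfig if not), and the image names are placeholders:

cat > phash_test.cpp <<'EOF'
#include <pHash.h>
#include <cstdio>

int main(int argc, char **argv) {
    if (argc != 3) { fprintf(stderr, "usage: %s img1 img2\n", argv[0]); return 1; }
    ulong64 h1 = 0, h2 = 0;
    // Compute the DCT-based perceptual hash of each image
    if (ph_dct_imagehash(argv[1], h1) < 0) return 1;
    if (ph_dct_imagehash(argv[2], h2) < 0) return 1;
    // A smaller Hamming distance means more similar images
    printf("hamming distance: %d\n", ph_hamming_distance(h1, h2));
    return 0;
}
EOF
g++ phash_test.cpp -o phash_test -lpHash
./phash_test image1.jpg image2.jpg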

~1 min read

Notes on Installing Linux Mint on a Dell Inspiron 15z 5523

[Update 29/5/2013] For the last several days, the Banshee music player has tended to crash the audio driver when skipping tracks. Ctrl+alt+backspace brings sound back without a full restart. Spotify did not crash the driver, so I assume the problem is in Banshee itself. Everything else is still running with no problems.
[\update]

[Update 13/4/2013] It has been a few weeks with Linux Mint 14 (Ubuntu) on my Dell Inspiron 15z 5523. From the software side, Mint is running perfectly. Except for a few extra features lacking in the Cinnamon desktop (which I am sure they will integrate sooner or later), it has been a quick, solid system all around. Battery life is about 4 hours with wireless enabled, surfing the web and listening to music.

The only complaints I have are with the hardware. First, the resolution of the display (not a touch display) is too low. For most things it is fine, but attempting to work on multiple documents is not very convenient. An external monitor could solve this.

The other problem is with the keyboard. Compared to the trackpad, the center of the main typing space (the position of the spacebar, for example) is shifted slightly to the left. The result is that, when typing, my right palm hits the trackpad often, causing the mouse to go flying across the screen, deleting text, etc. Again, not a software problem, but a hardware design problem. For anyone who types a lot with the built-in keyboard and has the trackpad enabled, it can be annoying. Again, easily solved with an external mouse and/or keyboard.
[\update]

Installing Linux Mint 14 on a Dell Inspiron 15z 5523.
Disabled UEFI to boot.

Wireless works out of the box.
Intel video works out of the box.
Sound works out of the box.
Webcam works out of the box.

LAN connection not working (does not show up in network connections)
sudo apt-get install linux-backports-modules-cw-3.6-quantal-generic
After reboot, it should show up in network connections.

To be able to use the GeForce card, Bumblebee is required. This is recommended anyway because if the driver is not installed then the battery life will be about 2 hours (about 6 hours estimated after install).
NVIDIA GeForce 650M: https://wiki.ubuntu.com/Bumblebee

In my case the bumblebeed service was not starting, so if I wanted to run optirun it would fail. Try this:
http://askubuntu.com/questions/202644/how-to-install-nvidia-optimus-driver-on-ubuntu-12-10

If it complains about the service not being started, try ‘sudo service bumblebeed restart’.
If bumblebee does not start on reboot, use this (last entry):
http://askubuntu.com/questions/215146/ubuntu-12-10-nvidia-gt555m-bumblebee

I was getting about 108fps with glxspheres, and battery estimation doubled.

Disable bluetooth by default (can turn on again if you want)
http://askubuntu.com/questions/131684/how-to-boot-with-bluetooth-turned-off

Fix huge panel icons in Cinnamon.
Right click on the panel, click panel settings, click “use customized panel size” and “allow cinnamon to scale panel text and icons…”.

I’ve not tried the sd/mmc card slots or HDMI out yet. I will update as I find anything else.

2 min read

Creating and Configuring a Large Redundant Storage Array for a Network Share in Ubuntu using MDADM

We had a hardware RAID card that worked well in Windows, but was giving some issues in Linux (Ubuntu 12.04). So, we decided to try setting up a software array using mdadm. This is how we did it.

First, make sure your hardware RAID card has a "non-RAID" mode; basically, that it lets each of the attached drives show up as a single drive. Ensure this is enabled. On some cards you will have to flash the BIOS with a non-RAID version.
  • Do not create a hardware RAID array using the RAID configuration menu
  • Make sure no drives are assigned to a RAID array
  • Install the newest version of Ubuntu Server
    • When asked for partitioning information
    • Select manual partitioning
    • Select "Create Software RAID”
    • “Create MD device”
    • Select RAID5   (we are using RAID5)
    • Active devices = #   (where # is the number of drives you want in the array)
    • Spare devices = #
    • Select all the drives you want in the array
    • Select OK

After the array is created our new device has about 21TB. In versions of Ubuntu before 12.04 it was difficult to create a partition using the whole 21TB, but now you should be able to do it from the install menu.
<ul><li>Create a partition on the newly created device (usually md0)</li><li>Format as ext4</li><li>Save and continue with install per usual</li><ul><li>You might want to select “ssh” from the package selection</li></ul></ul>Once install is done, and you boot into the OS:
<ul><li>Make sure array device has been created</li><ul><li>sudo fdisk -l</li><li>Look for /dev/md0</li></ul><li>Check status of the software array</li><ul><li>sudo mdadm --detail /dev/md0</li><li>If the status of the array is “building” or “syncing”, let the process finish (may take several hours)</li></ul><li>Create a mount point for the array</li><ul><li>sudo mkdir /media/RAIDStorage</li></ul><li>Modify fstab to mount partition on boot</li><ul><li>sudo nano /etc/fstab</li><li>Add a new line:</li><ul><li>/dev/md0 /media/RAIDStorage ext4 defaults 0 0</li></ul><li>Save and exit</li><li>Remount</li><ul><li>sudo mount -a</li></ul><li>Check device was mounted to /media/RAIDStorage</li><ul><li>sudo mount | grep RAIDStorage</li></ul></ul><li>Share RAIDStorage with NFS</li><ul><li>sudo apt-get install nfs-kernel-server</li></ul><li>Edit exports file</li><ul><li>sudo nano /etc/exports</li><li>Add line</li><ul><li>/media/RAIDStorage 10.1.1.0/24(rw,insecure,async,no_subtree_check)</li></ul><li>Save and exit</li><ul><li>/etc/init.d/nfs-kernel-server restart</li></ul></ul></ul>In this case, permissions are NOT being set up on NFS. If you need a more secure environment, make sure you set it up.

Also, we are using ‘async’ instead of ‘sync’. For some reason, when writing very large files, sync had very poor performance, while async allowed for maximum write speeds.

If there are write permission errors, check that the permissions on the folder (/media/RAIDStorage) on the server are set correctly for the user.
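Two housekeeping commands worth knowing, sketched under the assumption that the array is /dev/md0 (the Ubuntu installer usually writes mdadm.conf for you, so check before appending):

# Watch rebuild/sync progress
cat /proc/mdstat

# Persist the array definition so it assembles on boot
sudo mdadm --detail --scan | sudo tee -a /etc/mdadm/mdadm.conf
sudo update-initramfs -u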

1 min read

Another SDHASH Test with Picture Files

After the last SDHASH test showed that fuzzy hashing on multiple sizes of the same picture files did not appear to work well, I decided to try the same size image with slight modifications like one might see in the real world. So, again there is an original image, the same image modified with text added, and the same image modified with a swirl pattern on the face.

<div class="separator" style="clear: both; text-align: center;"><table cellpadding="0" cellspacing="0" class="tr-caption-container" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto; text-align: center;"><tbody><tr><td style="text-align: center;"></td></tr><tr><td class="tr-caption" style="text-align: center;">Kitty Orig: 75K MD5 6d5663de34cd53e900d486a2c3b811fd</td></tr></tbody></table><table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody><tr><td style="text-align: center;"></td></tr><tr><td class="tr-caption" style="text-align: center;">Kitty Text: 82K MD5 bcbed42be68cd81b4d903d487d19d790</td></tr></tbody></table></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody><tr><td style="text-align: center;"></td></tr><tr><td class="tr-caption" style="text-align: center;">Kitty whirl: 92K MD5 4312932e8b91b301c5f33872e0b9ad98</td></tr></tbody></table>On this test, I hypothesized that there would be a high match between the original kitty, and the text kitty, and a low, if any, match between the original kitty and the whirl kitty. My reasoning for this is because I think the features of the data would be similar enough - excluding the text area.<div>
</div><div>Unfortunately, I was wrong. sdhash did not find similarity between any of the pictures (ssdeep did not either).</div><blockquote class="tr_bq">$sdhash -g *</blockquote><div><blockquote class="tr_bq">kitty_orig.jpeg|kitty_text.jpeg|000
kitty_orig.jpeg|kitty_whirl.jpeg|000
kitty_text.jpeg|kitty_whirl.jpeg|000</blockquote> So, both sdhash and ssdeep did not detect any similarity between the picture files. Perhaps these tools are not suitable for picture file analysis, nor a replacement for standard hashes like MD5 when looking for similar pictures.

</div>
~1 min read

Comparing Fuzzy Hashes of Different Sizes of the Same Picture (SDHASH)

In a previous post, we looked at setting up and using SDHASH. After comparing modified files and getting a high score for similarity, we started wondering how well fuzzy hashing works on different sized images. So today, we have a little experiment.

First, we have 4 images. One original, and 3 smaller versions of the original.
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody><tr><td style="text-align: center;"></td></tr><tr><td class="tr-caption" style="text-align: center;">Original: 75K, MD5 6d5663de34cd53e900d486a2c3b811fd</td></tr></tbody></table><table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody><tr><td style="text-align: center;"></td></tr><tr><td class="tr-caption" style="text-align: center;"> 1/2 Original: 44K, MD5 87ec8d4b69293161bca25244ad4ff1ac</td></tr></tbody></table>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody><tr><td style="text-align: center;"></td></tr><tr><td class="tr-caption" style="text-align: center;">1/4 Original: 14K, MD5 978f28d7da1e7c1ba23490ed8e7e8384</td></tr></tbody></table>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody><tr><td style="text-align: center;"></td></tr><tr><td class="tr-caption" style="text-align: center;">1/8 Original: 3.6K, MD5 3e8e0d049be8938f579b04144d2c3594</td></tr></tbody></table>So, if we have an original image, we can take the hash like so:
<blockquote class="tr_bq">$sdhash kitty_orig.jpeg > kitty_orig.sdbf</blockquote>Now, we want to take the hashes of other other files (manual way):
<blockquote class="tr_bq">$sdhash kitty_2.jpeg >> kitties.sdbf
$sdhash kitty_4.jpeg >> kitties.sdbf
$sdhash kitty_8.jpeg >> kitties.sdbf</blockquote>Now we can compare the hashes of the smaller versions to the hash of the original. Note: set the threshold to negative one (-t -1) if you want to see results below 1.
<blockquote class="tr_bq">$sdhash -t -1 -c kitty_orig.sdbf kitties.sdbf</blockquote>Unfortunately, but as expected, the results were not so good. Feature selection for the hash is done at the bit level, and those features do not carry over to smaller files since there are less bytes.
<blockquote class="tr_bq">kitty_2.jpeg
kitty_orig.jpeg 000
kitty_4.jpeg
kitty_orig.jpeg 000
kitty_8.jpeg
kitty_orig.jpeg 000 </blockquote>If you were working with more images, and you wanted to hash and compare at the same time, you could use the -g switch. For example:
<blockquote class="tr_bq">$sdhash -t -1 -g *</blockquote>The output of which (in this case) would be:
<blockquote class="tr_bq">kitty_2.jpeg
kitty_4.jpeg 000
kitty_2.jpeg
kitty_8.jpeg 000
kitty_2.jpeg
kitty_orig.jpeg 000
kitty_4.jpeg
kitty_8.jpeg 000
kitty_4.jpeg
kitty_orig.jpeg 000
kitty_8.jpeg
kitty_orig.jpeg 000</blockquote>So, in conclusion, sdhash’s feature selection does not allow for comparison of greatly different sized picture files. Note that a text file would be quite different, and would probably produce better results.
1 min read

Similarity Comparison with SDHASH (fuzzy hashing)

Being a fan of ssdeep for fuzzy hashing, I was interested in this article comparing ssdeep to sdhash.

As the article says, ssdeep basically breaks up a file into pieces, hashes each piece, and combines the produced hashes to create a hash for the whole file. Similarity for each piece of the file can then be calculated since small chunks of the file are examined. When comparing two files, the similarity of chunks will result in a similar hash.

sdhash, on the other hand, uses probabilistic feature extraction to find features that are unlikely to happen randomly. The extracted features are hashed and put into a Bloom filter. For the comparison of two files, Bloom filters are compared using a Hamming distance measure. The more similar the filter, the lower the Hamming distance.

The article An Evaluation of Forensic Similarity Hashes demonstrated that with the different approaches, sdhash appears to outperform ssdeep in many instances. Because of this, I took a closer look at sdhash.

One of the more interesting features of sdhash is the ability to tell the command how many threads you want to spawn. In Roussev’s paper he talks about parallelization. He had 12 processors on a test machine, with 24 cores. In many of his tests he was spawning 24 threads.

So, wanting to get a feel for sdhash, I built it on a 2.4 GHz Intel Core 2 Duo machine. Building was straightforward on OS X: make && sudo make install
<blockquote class="tr_bq">$ sdhash –version
sdhash 2.3alpha by Vassil Roussev, Candice Quates, August 2012, rev 476
       http://sdhash.org, license Apache v2.0</blockquote>When doing a comparison you first need to create a list of SDBFs (bloom filters) for the files of interest. The switches I was most interested in were:

<blockquote class="tr_bq">  -r [ –deep ]                         generate SDBFs from directories and files
  -g [ –gen-compare ]             generate SDBFs and compare all pairs
  -c [ –compare ]                   compare all pairs in SDBF file, or
                                            compare two SDBF files to each other</blockquote>
So, let’s say I had a corpus of illegal images in a folder (and potentially many sub-folders) named “evidence”. I could create the hashes of each of the images using the command:
<blockquote class="tr_bq">$sdhash -r evidence/ > illegal_images.sdbf</blockquote>In this case I am outputting the results to the file “illegal_images.sdbf” using a standard >, but you can also use -o to specify the output file.

If I then want to compare all the hashes in illegal_images.sdbf to hashes created from a new case, on the new machine I would have to create a sdbf (test_images.sdbf) file for the new hashes, and then compare:
<blockquote class="tr_bq">$sdhash -c illegal_images.sdbf test_images.sdbf</blockquote>The output will be the names of the two compared files, plus a score of similarity all separated by pipes (
). For an explanation about the scoring system please see Content triage with similarity digests: the M57 case study:
<blockquote class="tr_bq">warnings.py
warnings.py 100
opera8-2png1txt/file2.png
flask.ui/static/jquery.js 005
opera8-2png1txt/file2.png
flask.ui/venv/bin/python 009
opera8-2png1txt/file2.png
opera8-2png1txt/request.txt 066</blockquote>As Roussev says in his papers, the score is a calculation of similarity. In other words, even if the score is 100, the files may not be exactly the same.

To test this, I have a 3.3K text file named “Makefile”. First, I create a md5 and fuzzy hash of the file:
<blockquote class="tr_bq">$sdhash Makefile > make.sdbf</blockquote>Result: MD5 (Makefile) = 4fe15b4cf1591cdfe92b7efd65d291ec
<blockquote class="tr_bq">sdbf:03:8:Makefile:3417:sha1:256:5:7ff:160:1:50:AICBAiDMAAAAAoEACMAQAAAICA5gACQIgBAAQYAAAAACAhAAAFAEACIAIAoEAACCAAICAAAAAEgEEMCgQAACBAAACAAQAABCCCBSQFgACAAAABLAAiBCciCAAgAIAACIBCQAAgGAAAASJhAgAAAUAAgAACQAAIEAEAAIAAAIAIAAAAaQCAAQEgABAWABAAAAEAAAAAAik6CgAQEAAAAABRQUxwAAAEEGAAAACAAQAEAAABAAAAA8CAiAAAEpgIAkAAQGAAICCAYSBsIFEQCoiAAAQIYCAAAIAASAAAAMSCABOCAAAARAAAQAIAAEFIABAgACgAACAEAIDAIAAACEEg== </blockquote>Next, I modify “Makefile” to remove the first “# “ (hash and a space), create an md5 and fuzzy hash again and compare.

Result: MD5 (Makefile) = d7e9182ee1d8cb7c6ad41157637c7d62
<blockquote class="tr_bq">sdbf:03:8:Makefile:3415:sha1:256:5:7ff:160:1:50:AICBAiDMAAAAAoEACMAQAAAICA5gACQIgBAAQYAAAAACAhAAAFAEACIAIAoEAACCAAICAAAAAEgEEMCgQAACBAAACAAQAABCCCBSQFgACAAAABLAAiBCciCAAgAIAACIBCQAAgGAAAASJhAgAAAUAAgAACQAAIEAEAAIAAAIAIAAAAaQCAAQEgABAWABAAAAEAAAAAAik6CgAQEAAAAABRQUxwAAAEEGAAAACAAQAEAAABAAAAA8CAiAAAEpgIAkAAQGAAICCAYSBsIFEQCoiAAAQIYCAAAIAASAAAAMSCABOCAAAARAAAQAIAAEFIABAgACgAACAEAIDAIAAACEEg==</blockquote>Comparison of fuzzy hashes:
<blockquote class="tr_bq">$sdhash -c make.sdbf make_change.sdbf </blockquote>Resulting in:
<blockquote class="tr_bq">Makefile
Makefile 100 </blockquote>So from this we see that the MD5 hashes (4fe15b4cf1591cdfe92b7efd65d291ec and d7e9182ee1d8cb7c6ad41157637c7d62) changed dramatically - as expected. The result of sdhash, however, still returned a score of 100, meaning that the file is very similar, even if the content is slightly different. I may run more experiments to see how much the file can change before the score is reduced, but that is for a later day.

Another feature I am interested in is the specification of the number of threads.
<blockquote class="tr_bq">  -p [ –threads ] arg (=1)             compute threads to use</blockquote>Using this, we can (hopefully) easily utilize server farms to make the hash generation and comparison an easier task.

My test machine has 1 processor with 2 cores. Running sdhash without specifying the -p flag, ran with 1 thread and used around 80% - 90% of the CPU. When ‘sdhash -p 2’ was specified, 2 threads were spawned, and 170% - 190% of the processor was used (both cores). So, using multiple processors is quite easy, however, it may not always be the best option. In one of his papers, Roussev claims that running some threads on multiple cores of the same processor may not result in increased performance, but instead a competition for the processor’s time. This appeared to be the case in our brief experiment:

<blockquote class="tr_bq">$ time sdhash -r . > test.hashes
real 2m20.365s
user 0m57.093s
sys 1m11.767s
$ time sdhash -p 2 -r . > test.hashes
real 2m38.744s
user 2m20.538s
sys 2m41.520s</blockquote> So in other words, you may want to test to make sure the way you are processing is the most efficient. I was hashing only a few hundred files, and not doing any comparison. If this were thousands, then a lot of time might have been wasted.

Roussev suggested creating hashes at the same time as imaging by redirecting the output of the suspect device to an image file and as an input to sdhash. He suggested dcfldd. I’ve not tried it, but if you could hash and image at the same time, there could be some definite time savings depending on your forensic process.
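dcfldd will not produce sdhash digests, but its built-in cryptographic hashing shows the idea of hashing while imaging. A sketch, with /dev/sdb standing in for the suspect device:

# Image the device and compute an MD5 of the stream in one pass
dcfldd if=/dev/sdb of=suspect.raw hash=md5 hashlog=suspect.md5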

Overall fuzzy hashing appears interesting for finding similarity; however, I would like to test this method against full-size images vs. thumbnails. Since this method is NOT content analysis, but instead feature selection of raw data, I would be surprised if it found much similarity between an image and a corresponding thumbnail. If anyone has tested this, please leave a comment with your results.
4 min read

Installing Log2Timeline on Ubuntu 12.04

The maintainers of log2timeline have yet to set up a repository for Ubuntu Precise (12.04). Here are the required packages needed to get most of the functionality of log2timeline going:
liblwp-useragent-determined-perl, libdatetime-perl, libdate-manip-perl, libdatetime-format-builder-perl, libclass-dbi-perl, libparse-win32registry-perl, libnet-pcap-perl, libnetpacket-perl, libxml-perl, libxml-libxml-perl, libcarp-assert-perl, libdigest-crc-perl, libdata-hexify-perl, libimage-exiftool-perl, libarchive-zip-perl, libfile-mork-perl, libmac-propertylist-perl, libxml-entities-perl, libhtml-scrubber-perl

After that, log2timeline 0.64 was built from source with only DBD::SQLite still needed.
<blockquote class="tr_bq">perl Makefile.PL
make
sudo make install</blockquote>
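To clear the remaining DBD::SQLite dependency, either of these should work on Ubuntu:

sudo apt-get install libdbd-sqlite3-perl
# or, via CPAN:
sudo cpan DBD::SQLite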

~1 min read

ZFS and NFS for Forensic Storage Servers

We’ve been looking at different storage solutions to act as storage servers for forensic images, and some extracted data. Essentially we have a server with eight 3TB drives, and one 40GB flash drive. RAID cards are a RAIDCore BC4000 series and a HighPoint RocketRAID 272x series. The RAIDCore we have been unable to get working in either FreeNAS 8 or Ubuntu 12.04. It does, however, give the OS direct access to the connected drives if they are not RAIDed. The HighPoint works with Ubuntu 10.04 (kernel 2.6), and works with the newest version of FreeNAS (BSD driver). Though with FreeNAS, I have had drives somehow fall out of a zfs pool and not be able to be re-added. HighPoint also gives the OS direct access to the connected drives if they are not RAIDed.

So, since we were having trouble with the hardware RAID, we decided to look for alternatives. The ideal solution would aggregate all the storage across 4 servers into one volume. So initially we considered sharing each of the physical disks via iSCSI, then having some single-head initiator access them all. The initiator would then be responsible for some type of software RAID, LVM and sharing the NFS mount point. Connectivity and data redundancy are big issues, and I was worried that with this method, if one of the servers dropped out, a lot of data could be lost or corrupted; plus the head may be a bottleneck.

Instead, we decided that each server would host its own NFS share, so we just needed a way to combine all the 3TB physical disks into one large volume. I thought first about a software RAID+ext4, but instead decided to try ZFS. It seemed to have the redundancy we were looking for, plus some other interesting features, such as taking care of NFS from ZFS.

For ZFS we decided to try both Ubuntu and FreeNAS (BSD) to figure out 1) which is faster and 2) which is easier to setup and administer.

After installing Ubuntu 12.04, and following these instructions with a bit of help from here, we had terabytes of aggregate storage accessible over NFS in about 15 minutes. Pretty easy.
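The linked instructions boil down to something like the following sketch (pool name and device names are hypothetical, and we used more disks than shown):

# Create a double-parity pool and let ZFS manage the NFS export
sudo zpool create storage raidz2 /dev/sdb /dev/sdc /dev/sdd /dev/sde
sudo zfs set sharenfs=on storage
zfs list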

Then we started trying to image to the share. From a USB device to the server using Guymager, we could image about 11-13MB/sec. After 30 to 40 seconds of imaging, all the drives in the zpool would start flashing like crazy, and imaging would drop to around 3MB/sec. Faster initial speeds, but similar degradation happened with eSATA/IDE drives on the acquisition machine.

[edit] I believe part of the problem was with NFS having the ‘sync’ flag set on the server-side. I had less degradation and slightly faster performance with ‘async’.

So, for large files ZFS was quite slow. This did NOT seem to be the case when writing from the server to the local zpool, which leads me to think it might be partially a problem with NFS (which zfs handles itself with the ‘zfs set sharenfs=on … ‘ command). I think the initial high speed is because of caching. So now I am wondering if ZFS is meant for bursts of smaller reads and writes rather than writing large files.

Looking around for why ZFS seems degrade so much, I found this article, which claims to improve ZFS performance. NOTE: if you make any of the changes from the article, make sure you have a backup of your data. I found that after making some of the tweaks, my pool had been corrupted and had to be re-created. The tweaks did seem to make the server more stable over time (less freezing), but write speeds still degraded.

FreeNAS took much less time than Ubuntu to install, and as long as it can detect your disks, creation of a new pool and sharing the pool takes about 10 clicks of your mouse (from the web interface). ZFS on FreeNAS appears to be less stable than on Ubuntu (we had drives drop out, and more freezing), but we are getting write speeds of approximately 15MB/sec with less - but some - degradation over time.

To see if the slow write speed is ZFS related, we are currently building a software array, and will just format it with ext4, and create a standard NFS share. Syncing the 8 disks in the array will take hours, but after hopefully we will have much, much better transfer speeds. Ubuntu allows the creation of a software array on install, which makes it very easy to set up.

If you have any experience with ZFS, and know what some of the issues might be, or even just a better set-up for aggregating large disks and getting good performance writing large files, please leave a comment.


3 min read

Forensic Acquisition of a Virtual Machine with Access to the Host

Someone recently asked about an easy way to create a RAW image of virtual machine (VM) disks, so here is a quick how-to.

<div class="separator" style="clear: both; text-align: center;"></div>If you have access to the VM host, you could either copy and convert the virtual disks on the host using something like qemu-img, or if for some reason you cannot convert the virtual disks, you can image the VM from within the virtual environment. This how-to will go through relatively easy ways to image a live or offline virtual machine using the virtual environment with access to the host.

First, if the virtual machine cannot be shut down (live environment), you will make changes to the ‘suspect’ environment. If it is a suspect device, make sure your tools are on write-protected media. Verification of disk images won’t work in a live environment since the disk is changing while imaging is taking place. If you are doing an offline acquisition for forensic purposes, make sure you verify the images once you create them.

If the VM is live and cannot be shut down:
<ul><li>First check if the management interface allows devices to be attached to the VM; specifically USB/Firewire devices.</li><ul><li>If you cannot attach devices for whatever reason, then you will have to use a network share or a netcat tunnel to copy the image.</li><li>Ensure your storage media is larger than the Virtual Machine’s disk</li></ul><li>If it is a Windows environment, copy your imaging program like FTK Imager lite, or dd.exe from unxutils, to the network share/USB device. I also like chrysocome’s dd since it lets you list devices.</li><li>In the Virtual Machine, mount your share/USB device</li><li>From the mounted device, you should be able to access and run the imaging tools you copied previously - ensure you output the image to your share/USB device.</li><ul><li>dd</li><li>FTK Imager Lite</li></ul></ul><div>
</div>If the VM is offline or can be shut down:
<ul><li>First check if you can boot the VM from a CD/USB device</li><ul><li>If yes, use a live CD like FCCU or Helix to boot the VM</li><ul><li>All we are really interested in is 1) an imaging program on the live CD and 2) USB or network support and 3) netcat installed.</li></ul><li>If no:</li><ul><li>Can you add CD/USB support to the VM?</li><li>Can you copy the VM, and convert to a RAW image using qemu-img (part of qemu)?</li><li>Boot the VM, and make an image in the live environment (go to the live imaging section)</li></ul></ul><li>After you have booted the VM from a live CD…</li><ul><li>Using external storage to store the image:</li><ul><li>List the current disks - (fdisk -l) - and take note of the current suspect disks to image</li><ul><li>/dev/hda would be a physical disk, while /dev/hda1 would be partition 1 on disk ‘a’</li></ul><li>Attach and mount an external storage disk</li><li>Make a copy of the physical disk (usually something like /dev/sda) using an imaging program like dd or guymager.</li><ul><li>Make sure you are copying the image to your external disk!</li></ul></ul><li>Using the network:</li><ul><li>Network share:</li><ul><li>List the current disks, and take note of the suspect disks to image</li><li>Mount the network share</li><li>Make a copy of the physical disk (usually something like /dev/sda) using an imaging program like dd or guymager.</li></ul><li>No network share:</li><ul><li>Set up a netcat tunnel between a server (any computer you control), and the suspect VM (client) - see the sketch after this list</li><ul><li>Note: this connection is unencrypted!!!</li></ul><li>You can use ssh or cryptcat for an encrypted tunnel, and bzip2 for compression and faster transfer speeds.</li></ul></ul></ul></ul>
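A minimal sketch of that netcat route (the IP address, port, and device name are examples):

# On the collection server - listen and write the incoming stream to a file
nc -l -p 9000 > vm_disk.raw

# In the suspect VM - stream the physical disk to the server
dd if=/dev/sda bs=4M | nc 10.0.0.5 9000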
That’s it. Pretty generic, but it should be enough to get anyone started. Please comment if you have any questions.


2 min read

Installing OCFA 2.3.X with FIVES

In this post we will be installing OCFA 2.3.0 rc4 on Debian Squeeze (6).

I will be following the documentation from: http://sourceforge.net/apps/trac/ocfa/wiki/2.3%20installation%20notes

Make sure you do not have sleuthkit installed.
See the note at the bottom for the FIVES suggested packages - I installed everything but sleuthkit.

aptitude install build-essential cmake libfuse-dev fuse-utils libsqlite3-dev openssl libboost-dev libboost-regex-dev libpoco-dev scalpel pasco

wget http://sourceforge.net/projects/ocfa/files/ocfa/2.3.0/ocfa-2.3.0rc4gpl.tar.bz2
tar -xvf ocfa-2.3.0rc4gpl.tar.bz2

I had better luck with libcarvpath from sourceforge than from the OCFA download (problem finding sqlite.h).
wget http://sourceforge.net/projects/carvpath/files/LibCarvPath/libcarvpath2.3.0.tgz
tar -xvf libcarvpath2.3.0.tgz
cmake src, make, sudo make install

The carvfs that came with OCFA built without any issues
cd ocfa-2.3.x/carvfs
cmake src, make, sudo make install

Got an error on libcarvpathrepository with ocfa - it would not build because it was looking for libboost (install libboost-dev). Then:
make, sudo make install

I am interested in working with the FIVES project, so I am installing libpoco-dev.

That is pretty much it for the new version notes. Now we go to the excellent Installation guide. I will be listing packages and how to build, but details that work in the document I won’t cover. Make sure you have that as well.

aptitude install libpq5 libpq-perl postgresql
aptitude install autoconf automake autotools-dev g++ libace-dev libboost-dev libssl-dev libtool libpq-dev libxerces-c2-dev libxerces-c28 autogen valgrind
aptitude install apache2 libcgicc5 libcgicc5-dev libclucene-dev
aptitude install uuid-dev libdb-dev libmagic-dev samba antiword exiftags p7zip-full libspreadsheet-parseexcel-perl libmail-mboxparser-perl libmail-box-perl libxml-dom-xpath-perl python-dev libcv-dev libhighgui-dev xpdf-utils

wget http://www.rarlab.com/rar/rarlinux-4.0.0.tar.gz
extract, and make

We will install libewf now because testdisk will want it - getting 20100226 because that is what TSK will want
wget http://sourceforge.net/projects/libewf/files/libewf/libewf-20100226/libewf-20100226.tar.gz
extract, ./configure, make, sudo make install

wget http://www.cgsecurity.org/testdisk-6.11.tar.bz2
extract, ./configure --without-ncurses, make, sudo make install

(for tsk)
wget http://afflib.org/downloads/afflib-3.6.9.tar.gz
extract, ./configure, make, sudo make install

wget http://sourceforge.net/projects/sleuthkit/files/sleuthkit/3.2.1/sleuthkit-3.2.1.tar.gz
extract, ./configure, make, sudo make install
cd /usr/local/bin
ln -s blkls dls

cpan> install Mail::Box
(this automatically installs Mail::Transport::Dbx)

wget http://sourceforge.net/projects/vinetto/files/vinetto/vinetto-beta-0.07/vinetto-beta-0.07.tar.gz
extract, then: python setup.py install

** FIVES required packages - requires debian multimedia
aptitude install mplayer mencoder libjpeg62-dev libjpeg-progs tesseract-ocr python-numpy ffmpeg libavcodec-dev libavformat-dev libswscale-dev libavutil-dev libgtk2.0-dev pkg-config cmake imagemagick libpng12-dev libfftw3-dev libgsl0-dev

OCFA
OcfaLib
./configure, make, make install

OcfaArch
./configure, make, make install

OcfaModules
./configure - check for failures
make, make install

Now change the password for the ocfa user in psql, and you should be able to create a new case.


/* I have not finished the FIVES section yet */
You will need the FIVES Toolset Installation document to follow because I am not putting everything
**FIVES suggested packages for OCFA
aptitude install bzip2 libxerces27-dev libtool libboost-dev libboost-serialization-dev libxerces-c2-dev libssl-dev postgresql-dev libboost-regex-dev libdb4.4-dev exiftags unzip antiword xpdf-utils libmagic-dev apache2 libmime-perl openssh-server netpbm libcgicc5-dev libace-dev g++ libfuse-dev fuse-utils lynx libpq5 libpg-perl postgresql libclucene-dev libpq-dev libxml-dom-perl libmail-box-perl libspreadsheet-parseexcel-perl libsqlite3-dev make cmake phppgadmin

install iulib from source

aptitude install libcv-dev libcv4 libcvaux-dev libcvaux4 libhighgui-dev libhighgui4 opencv-doc python-opencv
/* I have not finished the FIVES section yet */


2 min read

Building FIVES Porndetect Image and Video

Installation of FIVES Porndetect was relatively painless on Debian Squeeze (Lenny is a bit of a pain).
First get the F_PORNDETECT.doc from the FIVES portal. Their documentation is pretty good. I am just adding extras that I come across while installing.

Requires:
iulib - Follow gsbabil’s post here for all the deps. If you are on Squeeze, autoconf is newer than specified, and you will have to run aclocal before ./build.
The genAM.py patch was required for both Lenny and Squeeze. Download it into the same dir, and apply it with ‘patch -p0 < genAM.py.patch’
After that iulib should build/install - you then just need to copy vidio/vidio.h and imglib/iulib.h to /usr/local/include/

I am building from the FIVES ‘everything.tar.gz’ - so extract it and go into the porndetect dir
All the packages in the document (.doc) seemed to work for me except ‘imagemagick++9-dev’ which was ‘libmagick++-dev’ in INSTALL.txt.

After that you should just be able to ‘make’ in porndetect, and it works - you will have three executables in the ‘build’ directory. The command line --help for them is lacking (and there is no manual), but the documentation on the FIVES portal makes up for it.

I will talk about usage in another post, but now on to building video:

For video, it has a lot of the same deps. We just need some video processing libs - see the README in porndetect-video - it lists all the packages needed. All packages were available from the squeeze repository with no issues (not so with lenny).
I was able to build with no issues, and afterwards had two executables in build: porndetect-video and vis-scores.

Both of the following require a global.makeinfo file that does not come in the everything package. I am waiting to hear from the group about compiling the modules standalone. Check back later.

F_SSEMATCH requires:
afflib and libewf2
Both installed from source with no problems based on the packages previously installed.

F_FDAE module, standalone. This module is for face and age detection.
First we need OpenCV which is available on Squeeze as libcv-dev, and also libsvm-dev which is for machine learning.

1 min read

Installing a Eucalyptus Cloud with Debian Squeeze

When trying to install Eucalyptus on Debian, the newest version seemed to be packaged for Squeeze. I tried this directly on Lenny, but it did not work. I have never had luck trying to upgrade from Lenny to Squeeze. I suggest a new install rather than upgrading. Squeeze (testing) can be found here: http://www.debian.org/devel/debian-installer/

Just completed the install with Squeeze RC1 with no problems.
Eucalyptus hosts its own repository for Squeeze, which makes everything much, much easier. Also the documentation is pretty good, so this part will mostly be links.
<ul>
<li>Install the front-end cloud controller and any nodes (nodes can be added later)</li>
<li> Next I suggest installing Euca2ools at the same time since you will need them anyway</li>
<li> Restart the hosts</li>
<li> Go to the first time setup to designate the front-end and join nodes to the pool</li>
<li> You should have logged into the front-end to see the first-time settings and change the admin password. Make sure you downloaded your credentials onto the front-end machine, and that you have sourced the key info file.</li>
<li> Now you will need an image. I suggest going to the repository and selecting one yourself to start out with.</li>
<li> Once you have an image, then add it to Eucalyptus</li>
<li> Make sure the service is running on the nodes: /etc/init.d/eucalyptus-nc start</li>
<li> See this documentation section 4 for a quick start for an instance</li></ul>
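As a quick sanity check that the front-end can see the nodes, assuming euca2ools are configured with your sourced credentials:

# A healthy cloud lists each cluster with its free/used VM slots
euca-describe-availability-zones verbose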
That should pretty much do it without any fancy configuration options (such as elastic IPs).

1 min read

Converting Parallels Disks to Raw on OS X

Update: See the forensic focus article: http://articles.forensicfocus.com/2012/07/05/parallels-hard-drive-image-converting-for-analysis/


Update: I have had problems with this method leading to corruption / being unreliable. Backup all your data before you attempt this.

We do quite a bit with Parallels, and commonly want to copy a virtual disk for analysis. If you come across a machine with Parallels disks, how do you copy a usable image file out? Parallels is set to use expanding disks by default, which are apparently compressed. Digfor talks about finding Parallels on a Windows machine, and how to convert the disk. I will just cover the process on OS X (very similar).

Edit (7-12): An easier and faster way is to use ‘qemu-img’. I might try to create a how-to on it in the future, but it is pretty straightforward.
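For reference, a rough qemu-img invocation (file names are examples; qemu-img understands the parallels format in recent builds):

qemu-img convert -f parallels -O raw OS.hdd.0.hds OS.raw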

Essentially we want to locate the .hds file. In Mac the image is usually in the .pvm package (unless location was manually specified).
<ul>
<li>Right click on the .pvm file, and click "Show Package Contents".</li>
<li>Move the .hds file to the .pvm directory</li>
<li>Rename the .hds file to OS.hdd (OS can be whatever is meaningful to you)</li>
<li>Open 'Applications/Parallels/Parallels Image Tool'</li>
<li>Choose the new disk image "OS.hdd"</li>
<li>Choose "Convert to plain disk"</li><ul>
<li>Note: This will expand the disk to its "true" size. Make sure your drive is big enough</li></ul>
<li>The converted disk is once again called OS.0.{###}.hds</li>
<li>The resulting file is now raw</li></ul>


img_stat ~/Documents/Parallels/Windows\ Server\ 2003.pvm/Windows\ Server\ 2003.hdd/Windows\ Server\ 2003.hdd.0.\{5fbaabe3-6958-40ff-92a7-860e329aab41\}.hds
IMAGE FILE INFORMATION
--------------------------------------------
Image Type: raw

Size in bytes: 8590675968



Note: You can also use Parallels Image Tool to split and combine the image file - though dd gives you more options.

1 min read

Video Preview from Command Line with ffmpeg

Earlier I posted about creating an animated preview gif from a given video. When using that method with a file list, ffmpeg would treat the file name as a data stream when read directly into the loop by piping the input file into the loop after done (see the first loop below). I like that method because it is easy, and uses ‘read’ which does not complain about spaces in file names.

You can read more about what I was trying to do here

The fix for the file-name-as-a-data-stream error is to dump each line of the input file into an array first. That is what I am doing below.
*Note: You have to set the IFS to something else if the file names have spaces

oldIFS=$IFS
IFS=:
# First pass: dump each line of the file list (passed as $1) into an
# array, so ffmpeg cannot consume the list as a data stream
filenames=()
while read line; do
filenames=("${filenames[@]}" "$line")
done < "$1"

# Second pass: iterate over the array and build a preview gif per file
COUNTER=0
for file in "${filenames[@]}"; do
if [ -f "$file" ]; then
ffmpeg -i "$file" -ss 120 -t 120 -r 0.05 -s 90x90 f%d.jpg 2>> $errorfile
fn=$(echo ${file// /}) # Remove spaces in filename
convert -delay 100 -loop 0 f*.jpg $fn.gif
rm -f f*.jpg
fi
let COUNTER=COUNTER+1
if [ "$COUNTER" -eq "100" ]
then
echo "First 100 Thumbnails Generated: `date`" > $OUTPUT/videosdone
fi
done
IFS=$oldIFS

~1 min read

Video Screenshot Preview gif Built from Command-Line Linux

Edit: This version will produce errors when using a file list. See this post for a more reliable way.

I have been searching for a while for a way to create a video preview from the command line in Linux. Not just a simple screen shot, but an animated gif of screen shots throughout the video. My thinking is that a screen shot of a video at a random time may not look suspicious, but the next frame may be something illegal. Essentially for a video I would like to take 4 - 6 screen shots regardless of the duration, compile these into an animated gif, and display the preview.

First I have been looking at my options:
I am on Debian ‘Lenny’, and while vlc might look like a good option, the lenny release is stuck at 0.8.6. The newest release is 1.1.4 (I think), but in 0.8.6 the --start-time switch is ignored. I tried upgrading using sid, but ran into a bunch of problems and decided not to mess with it.

I looked into mplayer, which created screen shots, but I could not easily find how to divide the duration into 6 and quickly take the snapshots at those times. Basically I just got a bunch of sequential snapshots which, when put together, would make the video again. I could delete some in the middle to get the desired effect, but thought there had to be an easier way. Also, the mplayer GUI always wanted to start, and I did not want that.

Finally ffmpeg - with ffmpeg and imagemagick I was able to get something similar to what I wanted.

First the ffmpeg line


ffmpeg -i $file -ss 120 -t 120 -r 0.05 -s 90x90 f%d.jpg


What this does is take the input video file ‘$file’, start at 2 minutes (-ss 120), run for 2 minutes (-t 120), set a very low frame rate (-r 0.05), re-size the preview to 90x90px (-s 90x90), and name all the output images f#.jpg (f%d.jpg). Rather than calculating the duration, making the frame rate low gives a similar effect. I will write about duration calculation later.

So once we run that, we have a directory full of *.jpg files. We need to roll them into one animated gif. For this I use imagemagick. I have seen a lot of people using gimp for this. I love gimp, but imagemagick is easier for converting a bunch of jpgs to an animated gif.


convert -delay 100 -loop 0 f*.jpg $file.gif

adapted from here

This will group all the jpg files in a loop with approx a 1 second pause per image. Works a treat!

Here is the first preview I tested (have only tested with .ogm and .mp4 so far)
Video Screenshot Preview gif - FLCL


Here is my full bash script to do the processing. It takes a file name as an argument - the loop is to deal with file names with spaces.


#!/bin/bash
echo "$1" | while read file
do
if [ -f "$file" ]; then
echo "Creating preview of $file"
ffmpeg -i "$file" -ss 120 -t 120 -r 0.05 -s 90x90 f%d.jpg
fn=$(echo ${file// /}) # Remove spaces in filename
convert -delay 100 -loop 0 f*.jpg $fn.gif
rm *.jpg
fi
done
exit 0

2 min read

CarvFS on Mac OSx

A while ago I briefly used CarvFS on a linux system for testing. It was nice. Zero-storage carving can come in handy, especially when you are dealing with live CD systems. But installing on Mac would make experimentation and testing a bit more handy than running a VM. If you are reading this you might have had the experience of trying to compile CarvFS on Mac, have failed, and are stuck. Fear not!
Error when compiling on Mac
<blockquote>CMake Error at CMakeLists.txt:21 (MESSAGE):
No compatible (>= 1.0.0) version of libcarvpath found</blockquote>
First, a blog I really enjoy, int for(ensic) blog, has notes and downloads to install via Darwin ports. These can be found here: http://computer.forensikblog.de/en/2010/08/carvfs_on_a_mac.html
*note - if you use the Darwin ports method he uses patches for libcarvpath, carvfs, and the ewf module that I do not use!

But me being stubborn, I don’t like to use Darwin ports since I can compile what I want 95% of the time. Welcome to the 5%. So looking at forensikblog’s port file you can see what you need to change. By the errors it looks like it is only a library file, but it is also a bit more. So here is my non-Darwin ports CarvFS tutorial:
<ul>
<li>Install cmake: http://www.cmake.org/cmake/resources/software.html</li>
<li>Install FUSE: http://code.google.com/p/macfuse/</li>
<li>Install libcarvpath: http://sourceforge.net/projects/carvpath/files/LibCarvPath/libcarvpath1.0.0.tgz/download</li></ul><ul><li>Download and extract carvfs: http://sourceforge.net/projects/carvpath/files/</li></ul>
In the carvfs directory there is a ‘src’ sub-folder. Inside that replace the CMakeLists.txt file with this one [broken link, sorry]

Edit ‘carvfs.c’ where it says
<blockquote>sprintf(imgtypelib,"libmod%s.so",imgtype);</blockquote>
to be
<blockquote>sprintf(imgtypelib,"libmod%s.dylib",imgtype);</blockquote>
Then in the main carvfs directory run the command:
<blockquote>
<div style="text-align: left;">cmake -DCMAKE_INCLUDE_PATH:PATH=/usr/local/include -DCMAKE_LIBRARY_PATH:PATH=/usr/local/lib -DCMAKE_INSTALL_PREFIX:PATH=/usr/local -DCMAKE_VERBOSE_MAKEFILE:BOOL=ON src</div>
</blockquote>
If everything is ok, you will get a make file. Then you just do the standard ‘make && sudo make install’

Thanks again to int for(ensic) blog.

1 min read

SIMILE Timeplot graphing hours minutes seconds

All of the examples for SIMILE Timeplot are in YYYY/MM/DD format. I wanted to plot data down to the minute/second. Looking around I found that the date format of the data (.txt) file should be ISO 8601. I tried, but still had problems parsing the time part. Thanks to this blog I saw two problems I was having. First, the time format should be: YYYY-MM-DD HH:MM:SS. So a [time,data] file would look like [2009-02-12 15:10:00,23.407].

Next is with the index.html that loads the plot data. Look for the line

timeplot1.loadText(dataURL, " ", eventSource);

What I did not immediately recognize was that the “ “ bit is actually a field separator. So for comma separated values that line should be:

timeplot1.loadText(dataURL, ",", eventSource);

Once done everything worked as expected. SIMILE Timeplot and Timeline are great tools. Hope this saves you some research time.
http://www.simile-widgets.org/timeplot/

~1 min read

SIMILE Widgets: Timeline and Timeplot Mac OSx Install

Looking around I just found the SIMILE project. I have been messing around with TSK’s fls and looking into log2timeline, and think SIMILE widgets might be useful. I am posting the install instructions here for future reference. The test machine is Mac OS X 10.5.8.

Ant is already installed on OS X, but if you want the newest version:
Download JUnit from here
Download Apache Ant from here

JUnit will be a .jar file. Move it into “/System/Library/Frameworks/JavaVM.framework/Versions/1.5.0/Home/lib/ext”

ant --version # Demonstrate builtin version
cd ~/Downloads # Let's get into your downloads folder.
tar -xvzf apache-ant-1.8.1-bin.tar.gz # Extract the folder
sudo mkdir -p /usr/local # Ensure that /usr/local exists
sudo cp -rf apache-ant-1.8.1-bin /usr/local/apache-ant # Copy it into /usr/local
# Add the new version of Ant to current terminal session
export PATH=/usr/local/apache-ant/bin:"$PATH"
# Add the new version of Ant to future terminal sessions
echo 'export PATH=/usr/local/apache-ant/bin:"$PATH"' >> ~/.profile
# Demonstrate new version of ant
ant --version

http://stackoverflow.com/questions/3222804/how-can-i-install-apache-ant-on-mac-os-x

Make a working directory for timeline/timeplot:
mkdir ~/Documents/Timelines
cd ~/Documents/Timelines
Get timeline/timeplot from svn - really only need the trunk:
svn checkout http://simile-widgets.googlecode.com/svn/timeline
svn checkout http://simile-widgets.googlecode.com/svn/timeplot

Now, in both working directories, if you have a JRE installed (the default on Mac) you can enter the trunk directory and type ./run
This will start a jetty webserver and you can access the time lines from a browser at address: http://127.0.0.1:9999/timeline (or timeplot) depending on which one you want to view.
The index.html file is in /src/webapp.

1 min read

cpan Date::Manip perl module on Darwin (not Darwin ports)

When attempting to install the Date::Manip perl module via cpan on Darwin it will probably give an error like:
<blockquote>
make: *** [test_dynamic] Error 255
/usr/bin/make test – NOT OK
Running make install
make test had returned bad status, won’t install without force</blockquote>

You are not really given much information, but if you attempt to build from source you will see a few dependencies are missing:
Test::Pod::Coverage and Test::Inter

Install these first via cpan:

cpan> install Test::Pod::Coverage
cpan> install Test::Inter


After that the cpan install of Date::Manip still did not work for me, but installing from source seemed to.
Download the Date::Manip source (make sure you have developer tools installed to build from source)
http://search.cpan.org/~sbeck/Date-Manip-6.11/lib/Date/Manip.pod

Building this module is a bit weird. Unpack it, and move into the directory. Then run the following to build and install:

perl Build.PL
./Build test
sudo ./Build install



source: http://faq.perl.org/perlfaq8.html#How_do_I_install_a_m

~1 min read

RE: Read-Only Loopback to Physical Disk

A reader sent a very informative email in reply to this post about Read-Only Loopback Devices.

http://www.denisfrati.it/pdf/Linux_for_computer_forensic_investigators.pdf has the results of some research which was done into various “forensic” Linux boot CDs.

>For mounting a drive under Linux you have the
>standard ‘mount’ command. When mounting you can specify the -o ro
>option, which theoretically puts you in a safe read-only state… or
>does it? Does it always work? Does it stop everything?

It definitely does not always work. For example, with partitions which use a journalling filesystem, the filesystem replays the journal on the disk (i.e. writes to the disk) even if “ro” is specified. For ext3, the “noload” option is supposed to prevent that happening. For XFS there is “norecovery” and for NTFS-3G there is “norecover”.

IMO that’s a terrible design decision; apart from the needlessly-differing option names, no writes at all should be done when “ro” is specified. (Maybe “forceload”/”forcerecovery” mount options could be used if the user for some reason does want to replay the journal and mount ro.)

It’s all too easy to forget and end up writing to the disk.

>Another option that I recently found was the ‘blockdev’ command. You can specify that the blockdev is ro even before mounting.
>blockdev --report
>blockdev --setro /dev/device
>But
>my professor brought up the point - these probably depend on the driver
>used. Maybe a driver for ntfs totally ignores the ro switch? I don’t
>totally agree that blockdev would be based on the driver, but how do
>you test whether the drive actually is in ro without writing? What if
>it fails?

Well, the filesystem code will (or should) go through the block layer, so using blockdev --setro should be effective. However, partitions don’t seem to inherit the read-only flag! In other words, if you have a hard disk /dev/sda with a single partition /dev/sda1, you can do
blockdev --setro /dev/sda
but if you then do
blockdev --getro /dev/sda1
you’ll notice that sda1’s read-only flag is not set! I haven’t verified yet whether sda1 can be written to in those circumstances.
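A quick way to observe this yourself (a sketch; device names are examples - test on a disposable disk):
blockdev --setro /dev/sda    # set the whole-disk read-only flag
blockdev --getro /dev/sda    # prints 1 - the disk is flagged read-only
blockdev --getro /dev/sda1   # may still print 0 - the partition did not inherit the flag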

That doesn’t of course prevent writes by the underlying driver (i.e. SCSI). You just have to trust that the underlying driver won’t do that. (But in theory a badly-written low-level driver might. E.g. to detect whether the medium is write-protected, the driver could read a sector and attempt to write it back.)

Any forensic Linux distribution should by default create all (disk) device nodes with the read-only flag set. That should provide another layer of confidence, in that the user must manually “blockdev --setrw /dev/name” before being able to write to the device. Apparently grml (in forensic mode) does that.

>Then the saving grace - loopback devices. Mount the partition as a file. You don’t need to worry about drivers, support, etc.
>To do this use losetup to create a loopback device:
>losetup -r /dev/loop1 /dev/hda1
>This creates a read-only loopback device pointing to /dev/hda1
>Then you can mount the loopback device (read-only if you are paranoid)
>mount -o ro /dev/loop1 /media/test
>This mounts the loopback device loop1 at /media/test. You can then traverse the directory of /dev/hda1 just like it was mounted.

According to the PDF document I mentioned above, doing this:
mount -o ro,loop /dev/hda1 /media/test
should work in a similar way. (But to be sure, you’d need to check the source to see whether mount passes the “-r” option to losetup if ro and loop are specified…)

Ideally, anyone creating a forensic Linux distribution would test it when booting with disks which have “dirty” journals. Are there any hardware write-blocker products which will e.g. sound a buzzer when any write is attempted?

Thanks for the comments, Mark!

2 min read

How to detect when OCFA is done processing

As emailed to me by Jochen:

I think it is possible to detect completion of the process, even if it is not that simple, due to the distributed nature of OCFA. To detect completion, you have to look at three to four places in OCFA:

First, in the database the location field of the metadatainfo table contains the phrase '[being processed]' for all items being processed. SELECT count(*) FROM metadatainfo WHERE location = '[being processed]'; should return zero on completion.

Second, in the case of a non-zero return from the previous query, you also have to check the persistent queues for pending messages. If the only messages pending are messages staying in the “never”-queue, the washing process is finished except for the evidence handled by the crashed modules. Further inspection of that evidence is necessary.

Third, in the case of a non-zero return from the query, but all persistent message queues are empty, it could be that a background process is filling the working directory before further processing can take place. This can be checked by inspecting the overall size of the separate working directories: “du -ms /var/ocfa//work/default/”


A last step could be the inspection of “top -u ocfa” to check for module activity.
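Putting Jochen’s steps together, a completion check might look like the sketch below (the database name and connection options are assumptions that depend on your OCFA configuration):
# step one: count items still marked as being processed
psql -U ocfa -t -c "SELECT count(*) FROM metadatainfo WHERE location = '[being processed]';" casedb
# step three: size of the separate working directories
du -ms /var/ocfa/*/work/default/
# last step: one batch snapshot of module activity
top -b -n 1 -u ocfa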

I hope this will help,

With kind regards,
Jochen

1 min read

REAPER SVN Access

Instructions for using SVN to get the newest version of the REAPER Project:

These instructions are for SVN from a Linux command line, and specifically Debian.

The REAPER forensics project hosted at SourceForge is split into 5 projects. Do not download the entire SVN tree unless you want the development version of everything included.

To download the entire SVN:

svn co https://reaperforensics.svn.sourceforge.net/svnroot/reaperforensics reaperforensics

To download a particular project from the SVN:

svn co https://reaperforensics.svn.sourceforge.net/svnroot/reaperforensics/%project_name% %project_name%


Where %project_name% is the particular one you want:
<ul><li>REAPERlive</li><li>REAPERliveDesktop</li><li>REAPERlivePreview</li><li>REAPERview</li><li>Scripts</li></ul>Explanation of command:
<ul><li>svn - the program to access the svn repository</li><li>co - “checkout” project</li><li>https://reaperforensics.svn.sourceforge.net/svnroot/reaperforensics - the address of the project (this is the trunk of the project)</li><li>reaperforensics - this is the local directory to download the project to</li></ul>

~1 min read

Read-Only Loopback to Physical Disk

I have been testing file carving to try to preview the contents of a drive before imaging. File carving takes a long, long time. A faster solution (I think) is to mount the drive and search. Now, for forensics, mounting a drive is a big no-no, but sometimes it is just needed. Especially if you want a 15 minute preview instead of a 2 day ‘preview’.

I work a lot with Debian Live, so the commands and how they work will pertain to Debian. Test everything (and tell me results)! Don’t take my word for it.

For mounting a drive under Linux you have the standard ‘mount’ command. When mounting you can specify the -o ro option, which theoretically puts you in a safe read-only state… or does it? Does it always work? Does it stop everything?

Another option that I recently found was the ‘blockdev’ command. You can specify that the blockdev is ro even before mounting.
<blockquote>blockdev --report
blockdev --setro /dev/device</blockquote>
But my professor brought up the point - these probably depend on the driver used. Maybe a driver for ntfs totally ignores the ro switch? I don’t totally agree that blockdev would be based on the driver, but how do you test whether the drive actually is in ro without writing? What if it fails?

Then the saving grace - loopback devices. Mount the partition as a file. You don’t need to worry about drivers, support, etc.
To do this use losetup to create a loopback device:
<blockquote>losetup -r /dev/loop1 /dev/hda1</blockquote>This creates a read-only loopback device pointing to /dev/hda1.
Then you can mount the loopback device (read-only if you are paranoid)
<blockquote>mount -o ro /dev/loop1 /media/test</blockquote>This mounts the loopback device loop1 at /media/test. You can then traverse the directory of /dev/hda1 just like it was mounted.
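Putting it together, with cleanup when you are finished (device and mount point are from the example above):
losetup -r /dev/loop1 /dev/hda1     # read-only loopback device over the partition
mkdir -p /media/test
mount -o ro /dev/loop1 /media/test  # mount the loop device read-only as well
ls /media/test                      # browse as if hda1 were mounted
umount /media/test && losetup -d /dev/loop1   # clean up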

1 min read

PostgreSql Problems on Debian

In Debian 5, when installing PostgreSQL, if /var/lib/postgresql/8.3/main is not created and the conf files are not available, use the following commands:

pg_createcluster 8.3 main
/etc/init.d/postgresql-8.3 start

~1 min read

Creating and Modifying a User in PSQL

When installing OcfaArch on Debian 5, the installer failed to create the ‘ocfa’ user in postgresql (psql). The error I get is “Warning: no local database found, please take care to configure your database correctly.” :)

Anyway, as far as I can tell we can just create the ocfa user manually. This is obviously for any psql implementation, and not OCFA specific.

1. Make sure the ‘ocfa’ user has been created in linux. If not, you might have bigger problems. Try installing OcfaArch again.

2. As root, login as user postgres
su - postgres
3. Login to psql
psql template1
4. Create the ‘ocfa’ user
CREATE USER ocfa WITH PASSWORD 'password';
5. Allow the ocfa user to create databases
ALTER USER ocfa CREATEDB;
6. exit psql
\q

Now the ocfa user has been created, and is able to create new databases. You can test this by creating a case, or just go directly to pt.3 OCFA Installation - DNS, Apache and Permissions

For more information on psql user administration try here.

~1 min read

pt.3 OCFA Installation - DNS, Apache and Permissions

After completing pt.1 and pt.2, BIND, Apache and some permissions still need to be set before everything will work all hunky-dory.

Setting up DNS
Navigate to /etc/bind/
Edit the file ‘named.conf.local’
Add:
zone "loc" {
type master;
file "/etc/bind/loc.hosts";
};


Create the file ‘/etc/bind/loc.hosts’
Add:
$ttl 38400
loc. IN SOA serverName. Temp.invalid.com. (
2006081401
28800
3600
604800
38400 )
loc. IN NS serverName.
*.ocfa.loc. IN A serverIPAddress


where ‘serverName’ is the name of the DNS server, and ‘serverIPAddress’ is the address of the server running Apache (the IP you want to resolve to)

Save ‘loc.hosts’, and restart BIND for good measure:
/etc/init.d/bind9 restart

Update the DNS servers on your local machine. Add the IP address of your newly created OCFA/DNS server. Now you should be able to ping any domain name from your local machine ending with ‘ocfa.loc’

Try: ‘ping monkey.ocfa.loc’ - if you get a reply then DNS is working.
Because it is a wildcard DNS entry, anything ending with the ocfa.loc domain will resolve to the server’s IP address.
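A quick check of the zone from the command line (a sketch; ‘dig’ is in the dnsutils package, and serverIPAddress is the placeholder from the zone file above):
ping -c 1 monkey.ocfa.loc                      # any name under ocfa.loc should answer
dig +short anything.ocfa.loc @serverIPAddress  # should print the server's IP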

Apache and Permissions for the OCFA user
Before we begin you will need to install some more packages to allow the cgi scripts to run. Install the following:
apt-get install libpg-perl libxml-dom-perl

Now to create a case you must log in as the newly created ocfa user.
If you use ‘su’ to switch to ocfa, make sure you use ‘su - ocfa’ so the environment variables are loaded.

You will be prompted for a case name. Just to test, let’s use ‘test’. The case should not have been created already, and you will get a message telling you to run ‘createcase.pl’. Attempt to run this by typing ‘createcase.pl test ocfa’.

At this point I have always gotten a ‘permission denied’ error. To remedy this, log in as ‘root’, and navigate to ‘/usr/local/ocfa(version)/’
Set permissions to 755 for the bin directory.
cd /usr/local/ocfa(version)/bin
chmod 755 *


Do the same for the following directories under ‘ocfa(version)/’: html, cgi-bin, sbin

Log back in as ocfa, and you should be able to run the script with ‘createcase.pl test ocfa’. Now (as root) restart apache with ‘/etc/init.d/apache2 restart’

*If you are still getting a permission denied, make sure you are changing permissions on the files, and not on the directory itself.

Now you can open a browser on your local computer, and navigate to ‘casename.ocfa.loc’ where ‘casename’ is the case you just created. In this example I am using ‘test.ocfa.loc’

You should get a page displaying the case name.

If you get a ‘500 - Internal Server Error’ message, ensure the directories listed above are set to ‘755’. If permissions are correct, check ‘/var/log/apache2/error.log’.
Most of the errors I received were similar to this:
<blockquote>If error: “can’t locate Pg.pm in @INC (@INC contains: blab la) at /usr/local/ocfa2.1.1pl0/html/index.cgi line 20”</blockquote>
In cases such as this, it was usually a perl module that needed to be installed. Verify that you did install the ‘libpg-perl libxml-dom-perl’ packages. If so try searching google and the apt repositories for the cause of the error; ‘Pg.pm’ in this case.

If you were successful you should be able to access OCFA from a browser to view added evidence.

Please see posts labeled ‘OCFAHowTo’ for instructions on using the Open Computer Forensic Architecture to analyze evidence.

2 min read

OCFA Installation - Creating the Hash Sets

Maybe I am just a novice, but I had a hard time figuring out the inputs for the creation of the hash database for the OCFA digest module. This step can be found at the end of the ‘HOWTO-INSTALL-debian-etch.txt’ within /ocfa/doc/usage/install/

<blockquote>If you are installing the publicly available ocfa distribution
you will not have the pre-build databases available and you will
thus need to build the digest files yourself.</blockquote>
The digest database files can be downloaded from http://www.nsrl.nist.gov/Downloads.htm

I have not been able to find the other accepted hash-sets for cp images. I will post a link when I find it. If you know where they are, please let me know. All I know was in the README - file name kp1.hke - cp hashset of BDE Nijmegen

<ul><li>Once you have downloaded the NIST disks (currently 1 - 4, about 300MB each), you should verify the files, then mount the ISOs. </li><li>In the base directory of each will be a RDS_222X.ZIP archive (where X is A - D).</li><li>Extract the contents of RDS_222X.ZIP.</li><li>For convenience create a folder called ‘hash’ to put each file in.</li><li>In the RDS_222X folder copy the ‘NSRLProd.txt’ and the ‘NSRLFile.txt’ into your newly created ‘hash’ folder.</li><li>You only need to copy ‘NSRLProd.txt’ once. It is the same in each ISO.</li><li>Rename ‘NSRLFile.txt’ to a unique name for each. I used A.txt, B.txt, C.txt, D.txt.</li><li>If you are not working on your local OCFA server, copy the hash folder to the server now (it will greatly speed up the processing).</li><li>On the OCFA server navigate to the OCFA source folder (where you built ocfa from).</li><li>From within the source folder navigate to ‘OcfaModules/minimal/digest/init/’</li><li>You should find a script called ‘createshadb2.pl’</li><li>Run the script with ‘./createshadb2.pl’ </li><li>You should get the message:</li></ul><blockquote>Give sourcename,productinfo file and a list of digest files or send an end of file:</blockquote><ul><li>The sourcename is ‘NIST’, the productinfo is the ‘NSRLProd.txt’ file, and the list of digest files is the ‘NSRLFile.txt’ files separated by a space.</li><li>For example, using the ocfaShare described in this post, and saving my hashes in a folder called ‘hash’ - I would use the following command:
./createshadb2.pl
NIST /ocfaShare/hash/NSRLProd.txt /ocfaShare/hash/A.txt
/ocfaShare/hash/B.txt /ocfaShare/hash/C.txt
/ocfaShare/hash/D.txt
</li><li>If everything is working the hash DB will start building. Mine took about two hours.</li><li>Once done, copy ‘adinfodb’, ‘digestdb’ and ‘proddb’ from ‘ocfa/OcfaModules/minimal/digest/init/’ to ‘/usr/local/digiwash2.1/static/hashsets/’</li></ul>

1 min read

OCFA Installation - Creating a Temporary Share

This post will cover creating a temporary file share on your Samba server to easily share packages. This tutorial is geared towards OCFA on Debian users, but is a general Samba share configuration.

The only assumption is that you have a working Samba installation.

Log in as root
Create the directory you want to share - mine will be called ‘ocfaShare’ and will be directly under /
cd /
mkdir ocfaShare
chmod 777 ocfaShare
Note 777 is dangerous if you are not on a trusted network. Make sure you apply permissions appropriate to your situation. Click here for a tutorial on setting Linux file permissions.

Now open an editor and edit /etc/samba/smb.conf
vi /etc/samba/smb.conf
Add the following to smb.conf:
[ocfaShare]
path = /ocfaShare/
valid users = (the name of a local user account)
public = no
writeable = yes
create mask = 775

Save smb.conf, and exit.
Create ‘smbusers’ in /etc/samba/
vi /etc/samba/smbusers
For each user you want to create add a line of the form:
username = “username”
where username is the name of the user you specified in the ‘valid users’ section. Then save, and exit.
Now set a samba password for the user:
smbpasswd -a username
Now restart Samba:
/etc/init.d/samba restart
Type ‘ifconfig’ to get the IP address of the local computer (usually eth0).
Using this IP address you should be able to access the share.

On a Windows computer, go to ‘Start’ and ‘Run’ then type ‘\\x.x.x.x\ocfaShare\’ where x.x.x.x is your IP address.
On MacOSX go to ‘Go’ and ‘Connect to Server’. For the server address type in ‘smb://x.x.x.x/ocfaShare’
In both cases you will be asked to log in. Use the username and password you specified earlier.
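From another Linux machine you can also test the share with smbclient (assuming the smbclient package is installed; x.x.x.x and username are the placeholders used above):
smbclient -L //x.x.x.x -U username         # list the available shares
smbclient //x.x.x.x/ocfaShare -U username  # open an interactive session on the share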

Just remove the line in smb.conf starting with [ocfaShare] to remove the shared folder when done.

1 min read

pt.2 OCFA Installation - Prep and Building

Now that we have a working Debian install, we can get it ready for OCFA.

Again, this is a supplement to the ‘HOWTO-INSTALL-debian-etch.txt’ found in /ocfa/doc/usage/

After pt.1 we have a basic Debian install with File (Samba) and DNS (BIND) services.

[ocfaShare]
If your server is on a trusted network you might consider creating a temporary share for packages you will need to download manually (like OCFA). To set this up please refer to the post ‘OCFA Installation - Creating a Temporary Share’.

Updating Apt Sources
Edit /etc/apt/sources.list and comment out (put a # before) any line that starts with ‘deb cdrom:’
This makes sure 1) you get the newest software versions from apt, and 2) you don’t need to mess with the install CD anymore.
Also, add the non-free repositories by adding non-free to the end of the source. For example:
deb http://ftp.ie.debian.org/debian/ etch main non-free
deb-src http://ftp.ie.debian.org/debian/ etch main non-free

Save the updated sources.list file, and run ‘apt-get update’


OCFA-Required Packages 4.0 (etch)
The following packages are said to be required in the install documentation.
Checked: Feb 19, 2009
apt-get install bzip2 libxerces27-dev libtool libboost-dev
libboost-serialization-dev libssl-dev postgresql-dev
libboost-regex-dev libdb4.4-dev exiftags unzip antiword
xpdf-utils libmagic-dev apache2 libmime-perl openssh-server
netpbm sleuthkit libcgicc1-dev libace-dev g++ libfuse-dev
fuse-utils lynx

Note: As of the time of this writing, the only difference is that ‘cgicc-dev’ is now ‘libcgicc1-dev’

Debian 5.0 has several packages that are different.
OCFA-Required Packages 5.0 (lenny)
Updated: Feb 19, 2009
apt-get install bzip2 libxerces-c2-dev libtool libboost-dev
libboost-serialization-dev libssl-dev postgresql
libboost-regex-dev libdb4.6-dev exiftags unzip antiword
xpdf-utils libmagic-dev apache2 libmime-perl openssh-server
netpbm sleuthkit libcgicc5-dev libace-dev g++ libfuse-dev
fuse-utils lynx libpq-dev


If apt does not find the package in question you can try to search for it with the following command:
apt-cache search (packageName)
You can also pipe the output through ‘| more’ or ‘| grep (searchString)’ if there are a lot of hits in the cache.

I am also installing the ‘Suggested Optional Packages’
apt-get install libextractor-dev extract mdbtools nrg2iso
If your database will be hosted on the same server then you need:
etch
apt-get install postgresql-8.1
lenny - should already be installed
apt-get install postgresql-8.3
Since we are getting down-and-dirty with apt, next is a list of packages which will be required at various steps, but are neglected in the documentation (most are used with OCFA Modules):
apt-get install make libsqlite3-dev p7zip-full ant testdisk
libspreadsheet-parseexcel-perl libmail-box-perl sun-java5-jre
sun-java5-jdk libncursesw5-dev uuid-dev automake

Edit (or create) the file /etc/ld.so.conf and make sure it contains a line
with the string ‘/usr/local/lib/’.
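For example (a sketch that only appends the line if it is not already there):
grep -qx '/usr/local/lib/' /etc/ld.so.conf || echo '/usr/local/lib/' >> /etc/ld.so.conf
ldconfig   # refresh the loader cache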
Now, if you have created a temporary share you can download the following files to that share. Direct links can be found below:
clucene 0.9.16 (tar.gz) - Project Page (it is extremely important that the version is the same)
libewf-beta-20061233 - Project Page (I am linking to 20061223; I think ‘33’ was a typo)
OCFA 2.2+ needs the newest libewf: libewf-20080501
libcarvpath-0.1.4 - OCFA Project Page *updated to 0.2.0

carvfs-0.2.1 - OCFA *updated to 0.4.1

Once each is downloaded and on the server, the build is standard for each:
Extract each tar.gz by using the command ‘tar -xvf (fileName).tar.gz’
Navigate into the newly decompressed folder, and run:
./configure
make
make install

I am not sure if the order matters, but just in case install in this order - clucene, libewf, libcarvpath, carvfs (as in the documentation)
After installing a library run ‘ldconfig’ to make sure the loader can find your libraries.

Although I think this is old, I am going to install scalpel and the older version of sleuthkit:
sleuthkit-2.07
lenny has scalpel as a package: apt-get install scalpel*
scalpel 1.60
They both install by just typing ‘make’ in their directory.

OCFA Install
Now it is finally time to start installing OCFA! Are you excited? I know I am.
Download OCFA - Project Page *currently 2.2.0

These MUST be installed in order - OCFALib, OCFAArch, OCFAModules
Navigate to OCFALib, and run ‘./configure’
You should not receive any errors, and all items in the list should be ‘found’. If there are no errors, run ‘make install’.
If there are errors attempt to find the package that is associated with the error using the apt-cache search method described above.
If you are really really really stuck, try the OCFA Mailing List.

While it’s building you should get some tea. mmMMmm.

Eventually it will finish (hopefully with no errors). You can navigate directly to OCFAArch and start building it.
Again in OcfaArch run ‘./configure’ - it should ‘find’ everything. If not look for the packages before continuing.

As of OCFA 2.2.0 I received an error about perl modules, and a message saying to create a symlink for clucene. Creating the symlink did not work for me, but installing the new clucene package in Lenny did. Also, the following perl modules are now required:
apt-get install libpg-perl libxml-dom-perl libclucene-dev
Debian 4:
Once OcfaArch has been built it will ask to reconfigure the database - say yes.
A user ‘ocfa’ is created. I allow the ocfa user to create new roles.
yes - Allow the script to overwrite the apache config.
Choose (t)est or (p)roduction server.
<blockquote>The difference between these is that a testing system will allow you
to edit and tune your configuration without administrator priviledges.</blockquote>
Debian 5: (this issue seems to be fixed in ocfa 2.2)
In lenny the install failed to create a database user. If you are not prompted to reconfigure the database, the ocfa user was not created. To create the user manually see ‘Creating and Modifying a User in PSQL’. (must be done before pt. 3)

Now restart the database:
/etc/init.d/postgresql-8.x restart
The documentation suggests you change the ocfa user’s password: ‘passwd ocfa’

Now you have a working OcfaArch, which you can test by following the instructions in ‘ocfa/doc/usage/install/HOWTO-INSTALL-TEST.txt’. HOWEVER, when accessing the interface you will receive a 500 web error because permissions are not set correctly.

At this point you can either continue installing the OcfaModules, or continue testing by going directly to the post ‘pt.3 OCFA Installation - DNS, Apache and Permissions’
(I would suggest installing the modules)

Okie dokie:
Navigate to the ‘OcfaModules’ directory. Run ‘./configure | more’ and check for errors. If everything went well you should only get one warning about ‘dissector/photorec’. To remedy this error:
<blockquote>(part 1) Since we have installed testdisk using apt, navigate to ‘/usr/local/sbin/’. If you have an executable called ‘photorec’, skip to (part 3). If you do not see photorec in sbin go to (part 2).

(part 2) Building photorec from source. Download photorec/testdisk - Project Page. I am using testdisk 6.11-WIP.
<ul><li>Extract the contents to your OCFA server and navigate to /testdisk-(versionNum)/</li><li>Run the command ‘./configure --without-ncurses’</li><li>Then the normal ‘make’ and ‘make install’</li><li>Navigate back to ‘/usr/local/sbin’ and check for the existence of ‘photorec’</li><li>If it is there continue to (part 3), if not attempt to build again.</li></ul>(part 3) Create a symbolic link from photorec - In version 2.1.1 OCFA searches for ‘photorec_cli’ rather than just ‘photorec’ in sbin.
<ul><li>Navigate to ‘/usr/local/sbin/’</li><li>(as root) type: ‘ln -s photorec photorec_cli’</li></ul>
Thanks to Jochen for a prompt response on this issue!</blockquote>
In OCFA 2.2.0 I received an error about the Mail::Transport::Dbx perl module. There is no Debian package for it, so you need to install it manually. It can be downloaded from here: http://search.cpan.org/~vparseval/Mail-Transport-Dbx-0.07/Dbx.pm. To install: extract, navigate into the created directory, run ‘perl Makefile.PL’, then make, make install. That should be it.
I also received an error about my java version telling me I would not be able to run jlucene. To fix this, edit the ‘configure’ file for OcfaModules, and in the function ‘javaok’ change the ‘javac test.java’ line to ‘javac -source 5.0 test.java’.

Build OcfaModules:
After fixing these issues, navigate back to OcfaModules, and run ‘./configure’
With all the errors fixed, now run ‘make’

On all the installs I have tried I receive an error dealing with ‘Lucene’.
On Debian 5 it will complain about your java version and say “jlucene will not be build”. If you get this as well, install the following package (installing this package did not fix the issue in OCFA 2.2.0; see above):
Do not uninstall the ‘clucene’ package you built earlier!
apt-get install libclucene-dev
Also make sure you have ‘ant’ installed in lenny.

Run ‘./configure’ again, then ‘make’.
If you receive no errors, run ‘make install’

If it completes without error, you have a mostly-working OCFA install with Modules.

Now before you go on to test you need to create the hash sets.
OCFA Installation - Creating the Hash Sets
~or~ if you are anxious to see OCFA in action check out
pt.3 OCFA Installation - DNS, Apache and Permissions
6 min read

pt.1 OCFA Installation - Introduction/OS

The installation document for the Open Computer Forensic Architecture was mostly accurate. However, I ran into some issues. Posts labeled OCFAInstall are supplements to the OCFA on Debian installation documentation, which can be found (once OCFA is downloaded) in ‘ocfa/doc/usage/install/HOWTO-INSTALL-debian-etch.txt’. Direct download links, a bit more detail in some areas, and troubleshooting advice for issues I ran into are given.

At the time of this writing OCFA 2.1.1 is the current version.
Installed on Debian 40r7
Installed on Debian 5.0

OCFA homepage
OCFA Project Page [Downloads]
Join the OCFA project mailing list

Debian 4.0r7 (etch) - The netInstall version has been removed. You’ll have to get the full version, or…

Get Debian 5.0 (lenny) Tested on Feb. 19, 2009 to work with OCFA.


Operating System
As suggested by the OCFA installation instructions, I am using the netInstall version of Debian. The target machine must be able to connect to the internet to download required packages. If this is not an option you can download the larger Debian install CDs/DVDs (650MB - 4.4GB), however they may not contain all required packages. In that case you will need a way to download and transfer the packages to the target machine.
You can manually look for Debian packages at packages.debian.org

Hardware
The suggested hardware is at least a 40GB internal disk, and some sort of SAN or RAID system with 1+TB of storage, and at least 2GB of memory.

That being said, these are recommendations for production servers. I am testing, and know I will not be processing real-world amounts of data. Because of this I can say that, for me, a virtual machine with a total of 40GB storage, and 2GB of memory has worked very well for my purposes. *The VM was originally assigned 512MB of memory, which was much too little and eventually caused errors.

Debian Install
The Debian netInstall is… ridiculously easy. There are really only three things I can suggest here:
1) To avoid future confusion, don’t name your machine simply ‘OCFA’. Try ‘OCFAServer’ or the name of a Greek god.
2) Do not create a user named ‘OCFA’ - one will be created automatically later.
3) The hardest part of the OS install is the partitioning. To partition the drives as suggested by the install documentation, do the following:
On the ‘Partition disks’ screen - scroll down to the ‘Manual’ option

Scroll down and find your disk - mine is IDE1 master (hda) - you may have more than one. If so start with the disk you want to install the operating system on.

In my case, there are no partitions. When you select the device, you will be asked if you want to partition the entire device. Say yes. This produces a primary partition equal to the size of the device.

Now select the newly created partition (FREE SPACE), and select ‘Create New Partition’

The installation documentation suggests the following setup:
1. swap - 2 GB
2. /boot - 200 MB - ext3
3. /var/log - 10 GB - xfs
4. / - remaining space - xfs

Starting with the swap space - enter ‘2 GB’ in the ‘New partition size’ box
Hit ‘Continue’ - for partition type choose ‘primary’ - for Location choose ‘Beginning’

Now scroll to ‘Use as’ and hit enter: This is where you set the file system (xfs, ext3, swap).
Scroll down and choose ‘swap area’. Once done, scroll down to ‘Done setting up the partition’ and hit enter.

You are now back to the Partition disks main menu. Repeat the same process for the remaining suggested partitions (and any extras you may have)

The final result should look similar to the following screen shot. Due to a small drive, I do not have a separate partition for /var/log.

Hit ‘Finish partitioning’ then ‘write changes to disk’, and your partitions are done.


In the Software selection section I chose ‘DNS Server, File Server and Standard System’ - I don’t think there is really a need for a Desktop environment, unless you are using this as a workstation as well.

After the installation of GRUB the system will reboot, and you should have a working Debian install ready for OCFA.


See pt.2 OCFA Installation - Prep and Building


3 min read

dfir

Testing File Systems for Digital Forensic Imaging

Introduction - the problem

Recently I’ve been doing a lot of large disk forensic imaging. I usually use Linux-based systems for forensic imaging. A normal case would be physical imaging of a source to an ext4 formatted destination. I would normally get about 120MB/s imaging speed, depending on the source disk.

2 min read

Getting started in Digital Forensics

A lot of people have asked how to get started with digital forensics. It’s great that so many people from so many different places are interested. There are many different paths available. To try to help aspiring digital forensic scientists, I put together the following recommendations for a good theoretical and practical background.

5 min read

EWF Tools: working with Expert Witness Files in Linux

Expert Witness Format (EWF) files, often saved with an E01 extension, are very common in digital investigations. Many forensic tools support E01 files, but many non-forensic tools don’t. This is a problem if you are trying to use other tools, like many Linux utilities, to do an investigation.

2 min read

Password Cracking Test Data

Here are some files to test your password cracking skills. All of them can be done in less than a few hours with CPU-based cracking. You can download the file and practice hash extraction + cracking, or just download the hashes directly.
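If you want to try them with John the Ripper, a minimal run might look like this (file names are illustrative; depending on the hash type you may also need a --format option):
john --wordlist=rockyou.txt practice_hashes.txt   # dictionary attack
john --show practice_hashes.txt                   # display cracked passwords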

~1 min read

[How To] Volatility Memory Analysis Building Linux Kernel Profiles

Memory forensics in Linux is not very easy. The reason is that the Linux kernel changes data structures and debug symbols often. Users can also easily modify and compile their own custom kernels. If we want to analyze Linux memory using Volatility, we have to find or create Linux profiles for the version of Linux that we are trying to analyze. Linux profile creation for Volatility is not that difficult. The documentation claims that Volatility will support profile sharing in the future, which should make Linux support much easier.

~1 min read

Using Autopsy 4 to export file metadata

Autopsy 4 is a very powerful digital forensic investigation tool. Today, we are going to extract file content and metadata from a disk image (mobile phone) to use in external programs. We also briefly introduce Autopsy’s timeline feature.

~1 min read

Imaging Android with ADB, Root, Netcat and DD

Today we are going to acquire an Android smartphone (Samsung Note II) using Android Debug Bridge (ADB), netcat and dd. The system I am using is Ubuntu Linux. On the “forensic workstation” you will need ADB and netcat installed. I’m using the excellent instructions from here.
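The rough shape of the technique is sketched below (the flash block device path and the use of busybox nc on the device are assumptions that vary by phone; see the linked instructions for specifics):
adb forward tcp:8888 tcp:8888   # forward a workstation port to the device
adb shell "su -c 'dd if=/dev/block/mmcblk0 | busybox nc -l -p 8888'"   # stream flash over netcat
# in a second terminal on the workstation, receive the raw image:
nc 127.0.0.1 8888 > note2.dd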

3 min read

[How To] Introduction to Autopsy for Digital Forensics

Autopsy is a free, open source digital forensic tool that supports a wide range of add-on modules. Available APIs allow an investigator to easily create their own modules using Java or Python. With Autopsy 4, there are a lot of new features - including ‘team collaboration’ - that make Autopsy extremely powerful.

2 min read

[How To] Digital Forensic Memory Analysis - Volatility

This week we will begin with a very basic introduction into the memory analysis framework Volatility. We will use volatility to collect information about a memory image, recover the processes that were running in the system at the time of acquisition, and try to find malicious processes within the memory image. We will cover volatility in more depth in a later video.

<div class="separator" style="clear: both; text-align: center;"><div class='embed-container'><iframe src='https://www.youtube.com/embed//Cs0Gc3GtfZY' frameborder='0' allowfullscreen></iframe></div></div>

~1 min read

What I’m Reading: Robust bootstrapping memory analysis against anti forensics

Today we are talking about ‘Robust bootstrapping memory analysis against anti-forensics’ by Lee Kyoungho, Hwang Hyunuk, Kim Kibom and Noh BongNam. This paper deals with anti-forensics techniques against memory analysis, as well as using KiInitialPCR as a more tamper-resistant data structure for OS fingerprinting and process list extraction.

K. Lee, H. Hwang, K. Kim, and B. Noh, “Robust bootstrapping memory analysis against anti-forensics,” Digit. Investig., vol. 18, Supplement, pp. S23–S32, Aug. 2016.

Science Direct: http://www.sciencedirect.com/science/article/pii/S1742287616300408
DFRWS Archive: https://www.dfrws.org/file/712/download?token=sWs0HHYB


<div class="separator" style="clear: both; text-align: center;"><div class='embed-container'><iframe src='https://www.youtube.com/embed//MBjFTrhcusE' frameborder='0' allowfullscreen></iframe></div></div>

<iframe seamless="" src="https://bandcamp.com/EmbeddedPlayer/track=3508204421/size=large/bgcol=ffffff/linkcol=0687f5/tracklist=false/artwork=small/transparent=true/" style="border: 0; height: 120px; width: 100%;">WIR:E02 - Robust bootstrapping memory analysis against anti forensics by Joshua I. James</iframe>

~1 min read

[How To] Digital Forensic Memory Analysis - strings, grep and photorec

This week we will show how to use basic data processing tools strings, grep and photorec to start an analysis of a Random Access Memory (RAM) image, even if we currently know nothing about the image. These methods are extremely basic types of analysis, but they are also fast and can produce some interesting results.
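A first pass might look something like this (a sketch; the image name is illustrative):
strings -a -t d ram.raw > ram.strings.txt   # extract printable strings with decimal offsets
grep -i 'password' ram.strings.txt | head   # quick keyword sweep over the strings
photorec /log /d carved ram.raw             # carve known file types out of the image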

<div class="separator" style="clear: both; text-align: center;"><div class='embed-container'><iframe src='https://www.youtube.com/embed//4XoidAheuJE' frameborder='0' allowfullscreen></iframe></div></div>

~1 min read

What I’m Reading: A functional reference model of passive systems for tracing network traffic

What I’m Reading: Today we are talking about ‘A functional reference model of passive systems for tracing network traffic’ by Thomas E. Daniels. This paper deals with network traffic origin analysis using passive methods.

T. E. Daniels, “A functional reference model of passive systems for tracing network traffic,” Digit. Investig., vol. 1, no. 1, pp. 69–81, Feb. 2004.

Link: http://www.sciencedirect.com/science/article/pii/S1742287603000045

<div class="separator" style="clear: both; text-align: center;"><div class='embed-container'><iframe src='https://www.youtube.com/embed/z-jthlQB6sE' frameborder='0' allowfullscreen></iframe></div></div>
Audio only:

<iframe seamless="" src="https://bandcamp.com/EmbeddedPlayer/track=2817486589/size=small/bgcol=ffffff/linkcol=0687f5/transparent=true/" style="border: 0; height: 42px; width: 100%;">WIR:E01 - A functional reference model of passive systems for tracing network traffic by Joshua James</iframe>

~1 min read

[How To] Forensic Memory Acquisition in Linux - LiME

This week we will be using LiME to acquire a memory image in a suspect Linux system. LiME is a loadable kernel module that needs to be compiled based on the specific arch of the suspect device. We show the basics of compiling, and how to load the kernel object to copy a RAW memory image.
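In outline, the workflow looks like this (a sketch; the module file name depends on the kernel it was compiled against, and the output path is illustrative):
make   # builds lime-$(uname -r).ko against the local kernel headers
sudo insmod ./lime-3.13.0-generic.ko "path=/mnt/usb/ram.lime format=raw"   # dump RAM to external media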

<div class="separator" style="clear: both; text-align: center;"><div class='embed-container'><iframe src='https://www.youtube.com/embed//_7Tq8dcmP0k' frameborder='0' allowfullscreen></iframe></div></div>

~1 min read

[How To] Forensic Data Recovery in Linux - tsk_recover

This week we will talk about The Sleuth Kit, and specifically the tool tsk_recover. tsk_recover is a useful tool for allocated and unallocated file recovery. tsk_recover is a good quick solution, but in terms of performance, other tools tend to carve data better. I recommend using this in conjunction with other tools in an automated processing chain.
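Typical invocations (image and output directory names are illustrative):
tsk_recover suspect.dd recovered/     # unallocated (deleted) files only - the default
tsk_recover -a suspect.dd allocated/  # allocated files instead
tsk_recover -e suspect.dd all_files/  # both allocated and unallocated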

<div class="separator" style="clear: both; text-align: center;"><div class='embed-container'><iframe src='https://www.youtube.com/embed/MS6zruRaxyA' frameborder='0' allowfullscreen></iframe></div></div>

~1 min read

[How To] Forensic Data Recovery in Windows - Photorec

This week we will show how to use Photorec to recover data from a suspect disk image. Photorec supports the recovery of many different file types, but we will focus on jpg image recovery. Photorec works in Windows, Mac and Linux and is a useful tool for automating data recovery on suspect disk images.

<div class="separator" style="clear: both; text-align: center;"><div class='embed-container'><iframe src='https://www.youtube.com/embed//PTbgDEhqx1k' frameborder='0' allowfullscreen></iframe></div></div>

~1 min read

Warning to Forensic Investigators: USB KILLER

This post is informational for digital forensic investigators and first responders. Be aware of the ‘USB Killer’. Very basically, it’s a USB device that contains a high-voltage capacitor that charges up from the USB power supply, then releases a large charge directly into the USB data bus, potentially destroying the motherboard.
USB Killer device from USB Kill [https://www.usbkill.com]
The device itself is made for ‘penetration testers’ to test the physical security of a system. The device shown is from USB Kill, but such a device would be trivial to create using any USB device and a high-voltage capacitor - like so.

Here are some comments on Reddit about whether a suspect would be liable if the police seize one of these and fry the investigation computer / write blocker.

This device is not to be confused with the USB Kill Switch, which checks if devices are added or removed and shuts the system down. The USB Killer is focused on physical damage.

Unfortunately, I’ve not seen more information on forensic forums about these types of devices. SANS and Forensic Focus have some short articles on it. The device looks like a normal USB stick. Be sure to check any USB devices before imaging.

~1 min read

[How To] Forensic Acquisition in Windows - FTK Imager

In this video we show how to do a forensic acquisition of a suspect disk using FTK Imager in Windows.

<div class="separator" style="clear: both; text-align: center;"><div class='embed-container'><iframe src='https://www.youtube.com/embed/TkG4JqUcx_U' frameborder='0' allowfullscreen></iframe></div></div>

~1 min read

Paid Graduate Positions Available: Digital Investigations in Internet of Things

The Legal Informatics and Forensic Science (LIFS) Institute in the College of International Studies at Hallym University, South Korea, currently has openings for full-time researchers at the Masters, Ph.D. and Postdoctoral levels.

These positions deal with Internet of Things (IoT) digital forensic investigations. The following skills are necessary:
  • Programming skills (any language)
  • Ability to plan and carry out research
  • Ability to work with a team

The following skills are preferred but not required:
  • Knowledge of embedded systems
  • Embedded system programming experience
  • Computer / Network administration experience
  • Competency in Linux / Unix systems
  • Knowledge of Digital Forensic Investigation techniques (esp. acquisition)

These positions include a full scholarship as well as a monthly living stipend. Candidates should be willing to relocate to Chuncheon, South Korea.

To apply for the Master’s and Ph.D. positions, please do the following:
  1. Send an email with your CV and links to any research papers you have published to [email protected] with the subject “IoT Graduate Application”.
  2. Apply for a graduate position with Hallym University [http://bit.ly/20Fvvi4] by November 10th, 2016.
    • Download the application files [http://bit.ly/2eWHVSM]
    • Complete the basic application files
    • Mail the application files to [email protected] and CC [email protected]
    • Other documents can be provided later (such as passport info, diploma, etc.)
    • No Visa will be issued until certified copies of supporting documents are provided.

To apply for a Post-doctorate in Digital Forensic Investigation of IoT Devices, please do the following:
  1. Send an email with your CV and links to any research papers you have published to [email protected] with the subject “IoT Postgraduate Application”.
    • Candidates must have already completed a PhD degree.
1 min read

Fully-Funded Master’s/Ph.D. Positions in Internet of Things (IoT) Digital Investigation

The Legal Informatics and Forensic Science major in the College of International Studies at Hallym University is currently recruiting full-time researchers at the Master’s, Ph.D. and postdoctoral levels.
These positions involve research related to Internet of Things (IoT) digital forensic investigation, so the following qualifications are required:
<ul><li>Programming skills (any language)
</li><li>Ability to plan and carry out research
</li><li>Ability to work in a team

</li></ul>The following skills are preferred but not required:
<ul><li>Knowledge of embedded systems
</li><li>Embedded system programming experience
</li><li>Computer / network administration experience
</li><li>Understanding of Linux / Unix systems
</li><li>Knowledge of digital forensic techniques (especially acquisition)


</li></ul>These positions include a full scholarship and a living stipend. Candidates are encouraged to relocate to Chuncheon, Gangwon-do.


To apply for the Master’s and Ph.D. positions, please do the following:
<ol><li>Send an email with a copy of your CV and links to any published research to [email protected] with the subject “IoT Graduate Application”.
</li><li>Apply to the Hallym University graduate school via the following link [http://bit.ly/20Fvvi4] by November 10th, 2016.
Download the application files from the following link [http://bit.ly/2eWHVSM],
complete the basic application files,
then send the completed application files to [email protected] with [email protected] in CC.

Other documents, such as your passport, ID and diploma, can be submitted later.</li></ol>To apply for the postdoctoral position, please do the following:

1. Send an email with a copy of your CV and links to any published research to [email protected] with the subject “IoT Graduate Application”.
Candidates must have already completed a Ph.D.

~1 min read

[How To] Forensic Acquisition in Linux - Guymager

This video shows how to acquire a forensic disk image of a suspect device in Linux using Guymager. Guymager is an extremely fast digital forensic imaging tool (the fastest in our experiments). It allows for the acquisition of many types of devices to RAW and Expert Witness formats.

<div class="separator" style="clear: both; text-align: center;"><div class='embed-container'><iframe src='https://www.youtube.com/embed//mqHx7HutQLo' frameborder='0' allowfullscreen></iframe></div></div>

~1 min read

[How To] Forensic Acquisition in Linux - DCFLDD

This video shows how to use DCFLDD to acquire a disk image from a suspect device in the Linux command line. DCFLDD is an expanded version of ‘dd’ that supports additional features that are useful for digital forensic acquisitions.
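A typical acquisition might look like this (a sketch; the source device and output paths are illustrative - always verify the source device first):
dcfldd if=/dev/sdb of=/mnt/evidence/suspect.dd \
       hash=md5,sha256 hashlog=/mnt/evidence/suspect.hashes \
       bs=64k conv=noerror,sync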

<div class="separator" style="clear: both; text-align: center;"><div class='embed-container'><iframe src='https://www.youtube.com/embed//5zSnCeaK-80' frameborder='0' allowfullscreen></iframe></div></div>

~1 min read

[How To] Forensic Data Acquisition - Hardware Write Blockers

In this video we will show external write-blockers and describe how they are used to prevent writing data to suspect devices. We will talk about bottlenecks in the connection and how to make sure your acquisition is as fast as possible.

<div class="separator" style="clear: both; text-align: center;"><div class='embed-container'><iframe src='https://www.youtube.com/embed/7eT8KSHMGFw' frameborder='0' allowfullscreen></iframe></div></div>

~1 min read

[How To] Copy a disk image to a physical hard drive using DD

In this video we show how to copy a disk image to a physical hard drive using DD in Linux. This is useful for working with live disk images (Linux live CDs), or potentially copying suspect data to an external disk for testing.
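For example (a sketch - double-check the of= target, since this overwrites the disk; status=progress requires a reasonably recent GNU dd):
dd if=debian-live.iso of=/dev/sdc bs=4M status=progress
sync   # flush buffers before removing the disk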

<div class="separator" style="clear: both; text-align: center;"><div class='embed-container'><iframe src='https://www.youtube.com/embed//N17rPCj9ye8' frameborder='0' allowfullscreen></iframe></div></div>

~1 min read

[How to] Create and verify a multi-part disk image with FTK Imager

This video shows how to make a disk image using FTK Imager on a Windows system.

FTK Imager is an easy to use tool for copying data from suspect disks, and has other functions such as verification features and a hex view. It is a simple, stable tool that is a useful part of the beginning of an investigation.

<div class="separator" style="clear: both; text-align: center;"><div class='embed-container'><iframe src='https://www.youtube.com/embed//-jtRS7RTeoA' frameborder='0' allowfullscreen></iframe></div></div>

~1 min read

[CFP] Digital Investigation: Special Issue on Volatile Memory Analysis

Deadline for submissions is 31 August 2016.
Memory analysis is a hot research topic with wide applications on many fronts - from malware detection and analysis, to recovery of encryption keys, to user activity reconstruction. As advanced contemporary malware increasingly reduces its on-disk footprint, and adopts increasingly sophisticated host detection subversion mechanisms, memory analysis is currently mainstreaming as a valuable technique for detection and response.
While memory analysis presents many new opportunities, it also presents new complications and challenges, ranging from reliance on undocumented program internals, to atomicity of acquisition methodologies. As memory analysis becomes the status quo methodology the use of directed anti-forensics is also becoming prevalent.
This special issue of the Journal of Digital Investigation invites original research papers that report on state-of-the-art and recent advancements in this rapidly expanding area of enquiry, with a particular emphasis on novel techniques and practical applications for the forensic and incident response community.
Topics of interest include but are not limited to:
  • Malware detection in memory
  • Live memory analysis
  • Live system introspection
  • Memory acquisition
  • Memory analysis of large systems
  • Userspace and application specific memory analysis
  • Cryptographic analysis, key recovery
  • Execution history analysis
  • Data fusion between memory/disk/network



~1 min read

[CFP] CLOUDFOR extended submission deadline

CLOUDFOR 2016: Workshop on Cloud Forensics
In conjunction with the 9th IEEE/ACM International Conference on Utility and Cloud Computing (UCC), Tongji University, Shanghai, China.
6-9 December 2016

Scope and Purpose
=================
As a consequence of the sharp growth in the Cloud Computing market share, we can expect an increasing trend in illegal activities involving clouds, and the reliance on data stored in the clouds for legal proceedings. This reality poses many challenges related to digital forensic investigations, incident response and eDiscovery, calling for a rethink in traditional practices, methods and tools which have to be adapted to this new context.
This workshop aims to bring researchers and practitioners together as a multi-disciplinary forum for discussion and dissemination of ideas towards advancing the field of Cloud Forensics.

Topics of interest comprise, but are not limited to:
* Digital evidence search and seizure in the cloud
* Forensics soundness and the cloud
* Cybercrime investigation in the cloud
* Incident handling in the cloud
* eDiscovery in the cloud
* Investigative methodologies for the cloud
* Forensics readiness in the cloud
* Challenges of cloud forensics
* Legal aspect of cloud investigations
* Tools and practices in cloud forensics
* Case studies related to cloud forensics
* Forensics-as-a-Service
* Criminal profiling and reconstruction in the cloud
* Data provenance in the cloud
* Law enforcement and the cloud
* Big data implications of cloud forensics
* Economics of cloud forensics
* Current and future trends in cloud forensics
* Grid forensics

Important dates
===============
* Paper submission: 15 August 2016 (extended deadline)
* Notification of acceptance: 05 September 2016
* Camera-ready submission: 21 September 2016

Workshop chairs
===============
Virginia N. L. Franqueira
University of Derby, UK
v.franqueira[at]derby.ac.uk

Kim-Kwang Raymond Choo
University of South Australia, AU

Tim Storer
University of Glasgow, UK

Andrew Jones
University of Hertfordshire, UK

Raul H. C. Lopes
Brunel University (GriPP & CMS/CERN), UK

Program Committee
=================
George Grispos, The Irish Software Research Centre (LERO), IE
Andrew Marrington, Zayed University, AE
Kiran-Kumar Muniswamy-Reddy, Amazon Web Services, US
Joshua I. James, Hallym University, KR
Geetha Geethakumari, BITS Pilani, IN
Shams Zawoad, Visa Inc., US
Olga Angelopoulou, University of Hertfordshire, UK
Vrizlynn Thing, Institute for Infocomm Research, SG
Theodoros Spyridopoulos, University of the West of England, UK
Vassil Roussev, University of New Orleans, US
Yijun Yu, Open University, UK
Ibrahim Baggili, University of New Haven, US
Martin Schmiedecker, SBA Research, AT
Ben Martini, University of South Australia, AU
Hein S. Venter, University of Pretoria, ZA
Ruy de Queiroz, Federal University of Pernambuco, BR
Martin Herman, National Institute of Standards and Technology, US
Mark Scanlon, University College Dublin, IE

Submission
==========
Authors are invited to submit original, unpublished work which will be reviewed by three committee members. Submission should be blind, i.e., with no stated authors, or self-references. Papers should comply with the IEEE format, and have a maximum of 6 pages; guidelines are available at: http://www.ieee.org/conferences_events/conferences/publishing/templates.html
All accepted papers will be published in the IEEE conference proceedings – provided they are presented at the workshop.
Submission will be handled through EasyChair: https://easychair.org/conferences/?conf=cloudfor2016
2 min read

[CFP] JDFSL Special issue on Cyberharassment Investigation: Advances and Trends

JDFSL Special issue on Cyberharassment Investigation: Advances and Trends.

Anecdotal evidence indicates that cyber harassment is becoming more prevalent as the use of social media becomes increasingly widespread, making geography and physical proximity irrelevant. Cyberharassment can take different forms (e.g., cyberbullying, cyberstalking, cybertrolling), and be motivated by the objectives of inflicting distress, exercising control, impersonation, and defamation. Investigation of these behaviours is particularly challenging because it involves digital evidence distributed across the digital devices of both alleged offenders and victims, as well as online service providers, sometimes over an extended period of time. As a result, little is currently known about the modus operandi of offenders.

This special issue invites original contributions from researchers and practitioners which focus on the state-of-the-art and state-of-the-practice of digital forensic investigation of cyberharassment of all kinds.  We particularly encourage multidisciplinary contributions that can help examiners to be more effective and efficient in cyberharassment investigations.
Topics of interest include, but are not limited to:
-Offender psychology and profiling
-Cyberharassment victimology
-Methodologies and process models specific to cyberharassment investigation
-Tools and techniques for dealing with the types of digital evidence encountered in cyberharassment investigation
-Cyberharassment indicators
-Challenges and particularities of different modalities of cyberharassment
-Trends and typologies of cyberharassment

Important dates:
-Paper Submission:                 1 June 2016
-Notification of Initial Decision: 30 June 2016
-Revision due:                     31 July 2016
-Notification of Final Decision:   31 August 2016
-Final Manuscript Due:             30 September 2016
-Publication Date:                 31 October 2016

Author instructions:
The submissions must be blind and original (i.e., must not have been published or be under review by any other publisher). Authors should refer to the following link for instructions: http://www.jdfsl.org/for-authors. The option “Cyberharassment Special Issue” must be selected as article type on JDFSL OJS Submission System.  Further queries can be directed to the guest editors.

Guest Editors:
Dr Joanne Bryce
School of Psychology
University of Central Lancashire

Dr Virginia Franqueira
College of Engineering and Technology
University of Derby

Dr Andrew Marrington
College of Technological Innovation
Zayed University

About JDFSL:


The Journal of Digital Forensics, Security and Law (JDFSL) is a peer-reviewed, multidisciplinary journal focussing on the advancement of the cyber forensics field through the publication of both basic and applied research. JDFSL is a no-fee open access publication, indexed in EBSCOhost, ProQuest, DOAJ, DBLP, arXiv, OAJI, ISI Web of Science, Google Scholar, and other databases. JDFSL is published by the Association of Digital Forensics, Security and Law.

1 min read

ICDF2C 2015 in Seoul, South Korea Final Program Now Available

The 7th EAI International Conference on Digital Forensics & Cyber Crime will be held OCTOBER 6–8, 2015 in SEOUL, SOUTH KOREA.

The final program is now available at http://d-forensics.org/2015/show/program-final
Be sure to register so you don’t miss the exciting talks and tutorials!

Keynote speakers include Max Goncharov from Trend Micro, Inc, and Dr. Dave Dampier from Mississippi State University:

Max Goncharov is a senior security Virus Analyst with Trend Micro Inc., and is responsible for cybercrime investigations, security consulting to business partners (internal, external), creation of security frameworks, designing technical security architecture, overseeing the build-out of an enterprise incident response process, and creation of the enterprise risk management program. During his 15 years with Trend Micro Inc. he has participated as a speaker in various conferences and training seminars on the topics of cybercrime and related issues, with a particular focus on cyberterrorism, cybersecurity and the underground economy, at venues such as DeepSec, VB, APWG, etc.


Dr. Dave Dampier is a Professor of Computer Science & Engineering at Mississippi State University specializing in Digital Forensics and Information Security. He currently serves as Director of the Distributed Analytics and Security Institute, the university level research center charged with Cyber Security Research. In his current capacity, Dr. Dampier is the university lead for education and research in cyber security. Prior to joining MSU, Dr. Dampier spent 20 years active duty as an Army Automation Officer. He has a B.S. Degree in Mathematics from the University of Texas at El Paso, and M.S. and Ph.D. degrees in Computer Science from the Naval Postgraduate School. His research interests are in Cyber Security, Digital Forensics and Software Engineering.


There will also be three tutorials on investigation, open source hardware for digital investigations and setting up a research environment for mobile malware research:

<ul><li>Tutorial 1: DUZON – Desktop Exercise: Crafting Information from Data</li><li>Tutorial 2: Pavel Gladyshev – FIREBrick: an open forensic device</li><li>Tutorial 3: Nikolay Akatyev – Researching mobile malware</li></ul>After the first day of the conference, we are also holding a special discussion session with Seoul Tech Society called “Safe Cyberspace”, with a panel consisting of the winners of the ICDF2C/STS essay contest. Everyone is welcome to join!

I hope to see you at ICDF2C in Seoul, South Korea! Don’t miss this exciting opportunity.

1 min read

[Webinar] EnCase & Python – Extending Your Investigative Capabilities

EnCase & Python – Extending Your Investigative Capabilities

Date: Wednesday September 9th, 2015
Time: 11:00am PDT / 2:00pm EDT / 7:00pm BST

Presenters: Chet Hosmer, Founder of Python Forensics, Inc. and author of Python Forensics; James Habben, Master Instructor, Guidance Software Training; Robert Bond, Product Marketing Manager, Guidance Software

Digital forensic investigators are quickly becoming familiar with the power of Python. The open-source programming language named after Monty Python has been around for approximately 20 years and is fairly simple to read and learn. While EnCase users have used the EnScript language for 15 years to extend the capabilities of EnCase and create the 130+ EnScripts on EnCase App Central, Python can add further powerful investigative capabilities.

In this webinar, Chet Hosmer, Founder of Python Forensics, Inc., and James Habben, Master Instructor at Guidance Software, will demonstrate examples of those capabilities in an investigation demonstration using EnCase. Whether you are performing a post-mortem investigation, executing live triage, extracting evidence from mobile devices or cloud services, or collecting and processing evidence from a network, Python forensic implementations can fill in the gaps.

Register now at https://encase.webex.com/encase/onstage/g.php?MTID=e8f1fdc29d4fc150f6c935f4ab3b9b95b


Also, FYI Autopsy 2 supports custom python extensions (and is awesome).
~1 min read

ICDF2C Revised Draft Program Released

7th International Conference on Digital Forensics and Cyber Crime (ICDF2C) updated program is now available here: http://bit.ly/1LsJpvM

<div class="separator" style="clear: both; text-align: center;"></div>
The conference will be held in Seoul, South Korea from October 6 - 8, 2015. You can register for the conference here: http://d-forensics.org/2015/show/registration

We offer discounts for Law Enforcement and Students.

We are also working with Seoul Tech Society to run an information security essay contest and panel discussion. For more information, please see the call for essays.

~1 min read

ICDF2C and SeoulTechSoc Call for Essays on Information Security

ICDF2C and Seoul Tech Society Essay Contest

Have you ever surfed the Dark Web? Are you worried about the security of your virtual property? Technology is changing, and for every good side, there is a dark side. With these new technologies, how can the public protect themselves? Should the public rely on their government, or take security into their own hands? Let us know what you think with the ICDF2C and Seoul Tech Society Cyber Crime Essay Contest.

<div class="separator" style="clear: both; text-align: center;"></div>

This year ICDF2C has two focus areas:

<ul><li>Usage, implications and investigation of the “Dark Web”</li><li>Preventing or investigating crimes using cryptocurrencies</li></ul>
Although these topics are recommended, essays are not limited to these topics. For the full list of conference topics, please see http://d-forensics.org/2015/show/cf-papers

Submission Instructions

<ul><li>Submissions should be in English</li><li>Submissions should be no longer than 3 pages (with references)</li><li>Submissions must be submitted as a PDF</li></ul>
Please send a PDF of your essay to Joshua at cybercrimetech.com

Important Dates

<ul><li>Submission Deadline: September 21, 2015 (any time zone)</li><li>Notification: October 1, 2015</li><li>ICDF2C/SeoulTech Discussion Session: October 6, 2015, 18:00 – 19:30</li></ul>
Rewards

<ul><li>The top 5 essays will present their ideas at the ICDF2C/SeoulTech Discussion Session</li><li>Selected essays will be published in discussion session proceedings, and made available on the Seoul Tech Society web page</li></ul><div>See d-forensics.org for more information.</div>

~1 min read

Webinar: Tackle the Legal Issues of Obtaining Digital Evidence in the Cloud

Webinar: Tackle the Legal Issues of Obtaining Digital Evidence in the Cloud

Cost: Free
Date: Wed August 12th, 2015
Time: 08:00am UTC / 10:00am CEST / 4:00pm AWST

Data stored on cloud services or on social networks can reflect a person’s motives, actions or consequences, which are essential to any investigation. But obtaining it is the real challenge. Does your agency currently face certain legal, operational and technical limitations in obtaining this important source of evidence?

Join our panel of global industry and product experts for a live discussion where they will address the legal and operational considerations for identifying, collecting and preserving cloud-based media.

Panelists: Pamela Kiesselbach, Senior Consultant Corporate Crime & Investigations, Herbert Smith Freehills; Stephen Mason, Barrister; Jy Millis, Corporate Associate, Herbert Smith Freehills; Shahaf Rozanski, Director of Forensics Products, Cellebrite Ltd.

Register now at http://go.cellebrite.com/web_ca_legal_aug2015_reg

~1 min read

Revisiting REAPER: Automating digital forensic investigations

The Rapid Evidence Acquisition Project for Event Reconstruction [1] was one of the first projects that I worked on during my PhD. It started around 2008, when I became interested in trying to completely automate digital forensic investigations. Yes, it sounds impossible, but I wanted to see how far we could automatically handle digital evidence.

This was a little before digital forensic triage [2] and preliminary analysis gained popularity.

The idea was that once the process started, the investigator would not need to interact with the system. At the end of the automated investigation process, the “smoking gun” would be presented to the investigator in context.

Literally push-button forensics.

The Process

An investigator would insert a forensic live CD into the suspect’s computer (post mortem). After starting the computer, the live CD (with attached external disk) would provide only an information panel showing the stage of the investigation process.

First, REAPER would check the suspect computer to see what disks it could access, and whether there was encryption or hidden data. If hidden or encrypted data was detected, it would try to recover or access the data. With toy examples this worked, but how it would work on real systems - especially now - I’m not sure. All detectable media would be hashed, and verbose logging was on by default (for every action).

Next, all detectable media would be automatically imaged to the investigator’s external disk. Once complete, the images would be verified. If verification failed, the disk would be re-imaged.
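The image-and-verify step might look something like this minimal sketch, assuming /dev/sda is the detected suspect disk and /mnt/evidence is the investigator’s external disk (the device paths and hash choice are illustrative, not REAPER’s actual code):

# Image the suspect disk, verify against the source, and re-image on mismatch
SRC=/dev/sda
IMG=/mnt/evidence/sda.dd
until sudo dd if="$SRC" of="$IMG" bs=4M &&
      [ "$(sudo sha256sum < "$SRC")" = "$(sha256sum < "$IMG")" ]
do
    echo "Verification failed - re-imaging"
done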
Next, REAPER would start standard carving, parsing and indexing. The Open Computer Forensic Architecture (OCFA) was used to extract as much data as possible. OCFA is an extremely powerful architecture, but the open source version is a bit difficult to use (especially from a live CD). I understand that the NFI has a commercial front-end that makes working with it much easier.

Once all data has been acquired, verified and processed, the actual investigation / analysis should take place.

Here is where things get tricky.

First, we have to know what the investigation question is, and we have to ‘tell’ the system what it is. We currently do this by specifying the type of investigation generally - for example, “hacking” or “child exploitation”. We then have a (manually) pre-set list of tasks related to those particular types of crime. Alternatively, we could search for ‘all crimes’.

Here, some basic analysis could take place. For example, we could automatically determine attack paths of intrusions based on processed data [3]. We could also test whether it was possible or impossible for a certain statement to be true based on the current state of the system [4]. Also, by building up ‘knowledge’ (models) about systems before an investigation, we could accurately and automatically determine user actions using traces that are difficult for humans to analyze [5].

Where it falls apart

The problem is, we are still essentially in the processing phase of the investigation. We are condensing the available information into a usable form, but we are not yet saying what this information means in the context of the investigation. While we can gain more information about the data in an automated way, a human still needs to ‘make sense’ of the information.

Even though we are not there yet, automation has been shown to be useful for investigations [6], and can help reduce the time investigations take while improving their accuracy [7]. For more comments on automation in investigations, please see [8].
1. James, J. I., Koopmans, M., & Gladyshev, P. (2011). Rapid Evidence Acquisition Project for Event Reconstruction. In The Sleuth Kit & Open Source Digital Forensics Conference. McLean, VA: Basis Technology. Retrieved from http://www.basistech.com/about-us/events/open-source-forensics-conference/2011/presentations/
2. Koopmans, M. B., & James, J. I. (2013). Automated network triage. Digital Investigation, 1–9. http://doi.org/10.1016/j.diin.2013.03.002
3. Shosha, A. F., James, J. I., & Gladyshev, P. (2012). A novel methodology for malware intrusion attack path reconstruction. In P. Gladyshev & M. K. Rogers (Eds.), Lecture Notes of the Institute for Computer Sciences, Social-Informatics and Telecommunications Engineering (Vol. 88 LNICST, pp. 131–140). Springer Berlin Heidelberg. http://doi.org/10.1007/978-3-642-35515-8_11
4. James, J., Gladyshev, P., Abdullah, M. T., & Zhu, Y. (2010). Analysis of Evidence Using Formal Event Reconstruction. In Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering (pp. 85–98). Springer Berlin Heidelberg. http://doi.org/10.1007/978-3-642-11534-9_9
5. James, J. I., & Gladyshev, P. (2014). Automated inference of past action instances in digital investigations. International Journal of Information Security. http://doi.org/10.1007/s10207-014-0249-6
6. James, J. I., & Gladyshev, P. (2013). A survey of digital forensic investigator decision processes and measurement of decisions based on enhanced preview. Digital Investigation, 10(2), 148–157. http://doi.org/10.1016/j.diin.2013.04.005
7. James, J. I., Lopez-Fernandez, A., & Gladyshev, P. (2014). Measuring Accuracy of Automated Parsing and Categorization Tools and Processes in Digital Investigations. In Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering (pp. 147–169). Springer International Publishing. http://doi.org/10.1007/978-3-319-14289-0_11
8. James, J. I., & Gladyshev, P. (2013). Challenges with Automation in Digital Forensic Investigations. Computers and Society. Retrieved from http://arxiv.org/abs/1303.4498

4 min read

Child Exploitation Forensic Tool: NuDetective

I met some Brazilian Law Enforcement at the 2014 World Forensic Festival. They were talking about Child Online Exploitation in Brazil, and a tool they developed called “NuDetective”. The NuDetective tool AND training are free for Law Enforcement (from the Brazilian Police).

For more information please see (Portuguese): http://www.eleuterio.com
Or contact nudetective (at) gmail (dot) com.

From: http://www.eleuterio.com/nudetective.html

NuDetective is a program developed in Java that supports the detection of juvenile pornography files while still at search-and-seizure sites and crime scenes in suspected pedophilia cases. The software was developed entirely by federal criminal forensic experts (PCFs) Pedro M. S. Eleuterio and Mateus C. Polastro, and may be used by law enforcement and public entities for free.

The idea for NuDetective arose from the November 25, 2008 amendment to Brazil’s Child and Adolescent Statute (ECA), which made possession of juvenile pornography files a crime. Criminal forensic experts therefore needed a way to quickly identify illegal files, even at crime scenes, among the millions of files that can be stored on a computer. The tool was developed for this purpose and currently uses four main features, including the new Video Analysis, to detect suspicious files:

<ul><li>Image Analysis: automatically detects nude images through skin-pixel identification and computational geometry techniques.</li><li>File Name Analysis: checks file names to detect the most common expressions associated with pedophilia.</li><li>Hash Analysis: compares the hash values of files against a list of known illegal file hashes.</li><li>Video Analysis (new): calculates an ideal sampling rate, extracts frames from videos, and runs the Image Analysis nudity-detection algorithms on those frames, allowing the identification of juvenile pornography videos.</li></ul>
The authors have published several articles on the development of the tool, first presenting it and its results at IEEE DEXA’10 in Bilbao, Spain. In 2011, the authors presented NuDetective at the 19th IAFS World Meeting (International Association of Forensic Sciences), a global gathering of forensic researchers, and to a number of other countries and forensic institutions/laboratories. In 2012, the video-detection strategy was first published at IEEE WSDF-ARES’12 in Prague, Czech Republic, in a research paper that brought great advances to the state of the art in detecting juvenile pornography files. Currently, police forces in many countries use the tool, a unique contribution to computer forensics and to the protection of the children and adolescents of our planet. The tool supports Portuguese, English and Spanish, and can easily be translated into new languages. In 2014, the authors presented the tool at Argentina’s main computing congress (JAIIO), which published another article on NuDetective showing results of using the tool in Brazil in the fight against pedophilia.

The NuDetective forensic tool is free and for the exclusive use of law enforcement and public institutions. For more information, send email to nudetective (at) gmail (dot) com. This is the official communication channel for the tool, where you can request more information.

2 min read

[CFP] ICDF2C 2015

Call for papers for the 7th International Conference on Digital Forensics and Cyber Crime (ICDF2C)

Conference Dates: October 6 - 8, 2015
Location: Seoul, South Korea
Paper Submission: 30 March, 2015 (any time zone)

Website: d-forensics.org
<div style="background-color: white; border: 0px; font-stretch: inherit; margin: 0px 0px 0.5em; padding: 0px; vertical-align: baseline;">
</div><div style="background-color: white; border: 0px; font-stretch: inherit; margin: 0px 0px 0.5em; padding: 0px; vertical-align: baseline;">The International Conference on Digital Forensics and Cyber Crime (ICDF2C) brings together leading researchers, practitioners, and educators from around the world to advance the state of the art in digital forensic and cyber crime investigation. Keeping up with our international and collaborative nature at ICDF2C, we are proud to announce that ICDF2C 2015 will run jointly with the Korean Digital Forensic Society’s Annual Conference (KDFS 2015).</div><div style="background-color: white; border: 0px; font-stretch: inherit; margin: 0px 0px 0.5em; padding: 0px; vertical-align: baseline;">ICDF2C 2015 will be held October 6 - 8, 2015 in Seoul, South Korea. We invite contributions for completed research papers, research-in-progress papers, industrial talks, panel and tutorial proposals, and round table discussions. Research papers are evaluated through a double-blind, peer-reviewing process and accepted research papers will be published in printed proceedings by Springer-Verlang.
</div><div style="background-color: white; border: 0px; font-stretch: inherit; margin: 0px 0px 0.5em; padding: 0px; vertical-align: baseline;"></div><h3>Special Themes</h3>This year, we have two themes that we intend to embrace. Authors are encouraged to submit papers relating to these themes:
<ul><li>Usage, implications and investigation of the “Dark Web”</li><li>Case studies and investigation techniques relating to cryptocurrencies</li></ul>
<h3>SCOPE</h3>The Internet has made it easier to perpetrate crimes by providing criminals an avenue for launching attacks with relative anonymity. The increased complexity of global communication and networking infrastructure and devices makes investigation of cybercrimes difficult. Clues of illegal activities are often buried in large volumes of data that need to be sifted through in order to detect crimes and collect evidence. The field of digital forensics and cybercrime investigation has become very important for law enforcement, national security, and information assurance. Digital forensics and cybercrime investigation are multidisciplinary areas that encompass law, computer science, finance, telecommunications, data analytics, policing and more. ICDF2C brings together practitioners and researchers from diverse fields, providing opportunities for business and intellectual engagement among attendees.
The following topics highlight the conference’s theme:
<ul><li>Anti-Forensics and Anti-Anti-Forensics</li><li>Big Data and Digital Forensics</li><li>Business Applications of Digital Forensics</li><li>Civil Litigation Support</li><li>Cloud Forensics</li><li>Cyber Crime Investigations</li><li>Cyber Criminal Psychology and Profiling</li><li>Cyber Culture & Cyber Terrorism</li><li>Data Hiding and Steganography</li><li>Database Forensics</li><li>Digital Forensic Science</li><li>Digital Forensic Tool Testing and Validation</li><li>Digital Forensic Trends</li><li>Digital Forensics & Law</li><li>Digital Forensics and Error Rates</li><li>Digital Forensics Novel Algorithms</li><li>Digital Forensics Process & Procedures</li><li>Digital Forensics Standardization & Accreditation</li><li>Digital Forensics Techniques and Tools</li><li>Digital Forensics Triage</li><li>e-Discovery</li><li>Hacking</li><li>Incident Response</li><li>Information Warfare & Critical Infrastructure Protection</li><li>Law Enforcement and Digital Forensics</li><li>Machine Learning and Digital Forensics</li><li>Malware & Botnets</li><li>Mobile / Handheld Device & Multimedia Forensics</li><li>Money Laundering</li><li>Network Forensics</li><li>New Chip-Off Techniques</li><li>Novel Digital Forensics Training Programs</li><li>Online Fraud</li><li>Programming Languages and Digital Forensics</li><li>SCADA Forensics</li><li>Sexual Abuse of Children on the Internet</li><li>Software & Media Piracy</li><li>Theoretical Foundations of Digital Forensics</li><li>Traditional Criminology Applied to Digital Forensics</li><li>Philosophical Accounts of Cyber Crime and Digital Forensics</li></ul>
<h3>RESEARCH PAPERS</h3>Papers describing original unpublished research are solicited. Submissions must not be concurrently under review by a conference, journal or any other venue that has proceedings. Papers in the topic areas discussed are preferred, although contributions outside those topics may also be of interest. Please feel free at any time to contact the conference general chair if you have questions regarding your submission.
<h3>BEST PAPER AWARD</h3>The program committee may designate up to three papers accepted to the conference as ICDF2C Best Papers. Every submission is automatically eligible for this award.
<h3>OTHER SUBMISSION CATEGORIES</h3>Submissions can be made in a number of categories: Completed research papers, research-in-progress papers, industrial talks, panel and tutorial proposals, and round table discussions. Please follow the following guidelines in preparing your submission.
<ul><li>Completed Research Papers: No longer than 10 pages (including abstract, figures, tables and references).</li><li>Research in Progress Papers: No longer than 6 pages (including abstract, figures, tables and references).</li><li>Industrial Talk: Typically a 1,000 word description of the proposed talk. All talks must be vendor neutral.</li><li>Round Table Discussion: Typically a 1,000 word synopsis of the topic area.</li><li>Panel Proposals: Typically a 1,000 word description, identifying the panelists to be involved.</li><li>Tutorial Proposals: Typically a 1,000 word description of topic(s), potential speakers, program length, and potential audience. Also, include proposer resume(s).</li></ul>
<h3>SUBMISSION INSTRUCTIONS</h3>Paper submission will be handled electronically. Papers must be formatted using Springer LNICST Authors’ Kit (http://d-forensics.org/2015/show/authors-kit) and submitted only through Easychair.org by going here: https://www.easychair.org/conferences/?conf=icdf2c2015.
<div style="background-color: white; border: 0px; font-stretch: inherit; margin: 0px 0px 0.5em; padding: 0px; vertical-align: baseline;">
All submitted papers will be judged based on their quality through double-blind reviewing. Authors’ names must not appear in the paper. All other submissions should be sent via email to the conference general chairs (Dr. Joshua I. James joshua at cybercrimetech dot com).
</div><h3>PUBLICATIONS</h3>Accepted papers will be published in the ICDF2C 2015 Conference Proceedings and by Springer-Verlag in the Lecture Notes of the Institute for Computer Sciences, Social-Informatics and Tele-communications Engineering (LNICST) series.
<div style="background-color: white; border: 0px; font-stretch: inherit; margin: 0px 0px 0.5em; padding: 0px; vertical-align: baseline;">
The proceedings will be available both as paper-based copies and via Springerlink, Springer’s digital library. In addition, the content of the proceedings will be submitted for inclusion in leading indexing services, including DBLP, Google Scholar, ISI Proceedings, EI, CrossRef and Zentralblatt Math, as well as ICST’s own EU Digital Library (EUDL).
</div><div style="background-color: white; border: 0px; font-stretch: inherit; margin: 0px 0px 0.5em; padding: 0px; vertical-align: baseline;">
Further, we are partnering with Elsevier’s “Digital Investigation: The International Journal of Digital Forensics & Incident Response” to invite expanded versions of specially selected papers for inclusion in their SCI-indexed publication.
</div><h2 style="background-color: white; border: 0px; font-stretch: inherit; margin: 0px 0px 0.5em; padding: 0px; vertical-align: baseline;"></h2>

4 min read

[How To] Installing LIBEWF in Ubuntu Trusty

Installing LIBEWF is normally straightforward. Usually the most difficult part is remembering which packages are required for the dependencies. When running configure, I always like to have “support” dependencies filled out. While some of these are not necessary, you may find yourself needing them someday, and having to recompile.

<div class="separator" style="clear: both; text-align: center;"></div>On a almost-brand-new install of Ubuntu Trusty (64bit), these are the required packages:

apt-get install build-essential autoconf automake libfuse-dev uuid-dev libbz2-dev zlib1g-dev

Then you just download LIBEWF, untar, and run ./configure. All dependencies should be filled out.

From there it is just a simple make to start working with forensic file formats.
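Putting it together, a typical build might look like this (the version number and download URL are examples; check the libewf project page for the current release):

wget https://github.com/libyal/libewf/releases/download/20140608/libewf-20140608.tar.gz
tar xzvf libewf-20140608.tar.gz
cd libewf-20140608
./configure        # check the summary to confirm the "support" dependencies are found
make
sudo make install
sudo ldconfig      # refresh the linker cache so tools can find the new library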

~1 min read

[Hash sets] Korea University DFRC Reference Data Set

If you work in the area of digital investigation, you probably know about NIST’s National Software Reference Library (NSRL).
<blockquote>The National Software Reference Library (NSRL) is designed to collect software from various sources and incorporate file profiles computed from this software into a Reference Data Set (RDS) of information. The RDS can be used by law enforcement, government, and industry organizations to review files on a computer by matching file profiles in the RDS. This will help alleviate much of the effort involved in determining which files are important as evidence on computers or file systems that have been seized as part of criminal investigations.</blockquote>In other words, the NSRL is a very large collection of file hashes for ‘known’ software. In most cases it can be treated as a known good hash set, for filtering out potentially uninteresting files from a case.

The NSRL hashes are also hosted on hashsets.com, allowing you to query their database for a particular hash if you don’t want to store the files locally.

<div class="separator" style="clear: both; text-align: center;"></div>
NSRL is very useful, but it does have some limitations. The Korea University Digital Forensic Research Center is attempting to address some of these limitations by providing the DFRC Reference Data Set.

Their Reference Data Set includes hash values from software used in South Korea as well as the NSRL. Currently, they have over 27 million hashes. Best of all, they provide a number of interfaces to test your data against. You can upload a suspect file directly, upload a list of hashes, search for a single SHA1 or MD5, or query their RDS via their REST interface! This way you can call their hash database directly from your tools.

Of course, you can also directly download their entire RDS.
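As an illustration of that kind of tool integration, a shell query against a hash-lookup REST service might look like the following. Note that the endpoint and parameter name here are hypothetical placeholders, not the actual DFRC API; check their site for the real interface.

# Hypothetical hash-lookup query; the URL and parameter name are invented for illustration
curl -s "https://dfrc.example.ac.kr/rds/lookup?md5=d41d8cd98f00b204e9800998ecf8427e"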


1 min read

Indicators of Anti-Forensics

Project: Indicators of Anti-Forensics (IoAF)
Purpose: Digital forensic triage for anti-forensic activities
Status: Active
License: GNU GPLv3
Developer(s): KITRI’s Best of the Best Information Security Program

More information:
The ‘Indicators of Anti-Forensics’ (IoAF) project is an effort towards automated anti-forensic trace detection using signature-based methods. Each “version” represents the work of different KITRI Best of the Best groups to advance the idea.

The main IoAF program uses parsing modules to extract file meta-data and Registry key information from a system under investigation. Pre-defined signatures are stored in a SQLite database that is queried for each extracted object.
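Conceptually, a lookup might work like the following sketch. The database name, table, and columns here are invented for illustration and are not IoAF’s actual schema.

# Query a signature database for matches against an extracted object path;
# table and column names are hypothetical
sqlite3 ioaf_signatures.db "SELECT action, trace_type FROM signatures WHERE object_path LIKE '%CCleaner%';"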

Signatures are created by using either real-time or snapshot based analysis on a similar system. Objects that are consistently updated by the action of interest are extracted, and further tested (e.g. how the object is updated). If the object is found to consistently correspond to the action of interest - and only the action of interest - it is included as a trace in the signature.

The purpose of the project so far is not to automatically reconstruct activities, but to quickly detect the presence of anti-forensic traces to let investigators know whether they should pay more attention to a given device over others (digital forensic triage).

Related Publications:

<ul><li>James, J. I., Kim, M. S., Choi, J., Lee, S. S., & Kim, E. (2014). A General Approach to Anti-Forensic Activity Detection. eForensics Magazine, vol.3(5). 30–35. [Link]</li></ul>
Links:

<ul><li>Github Repository: https://github.com/hvva/IoAF</li></ul>

1 min read

[BoB] Anti Forensics Techniques and eForensics Mag


<div style="clear: right; float: right; margin-bottom: 1em; margin-left: 1em;">
</div>
As a mentor with KITRI’s “Best of the Best v2.0” information security education program, I mentor a digital forensic analysis research group. This group has been specifically focusing on anti-forensic action detection, which fits closely with my dissertation work. The first group members produced a brief survey of anti-forensics encountered in the ‘wild’ by Korean law enforcement. The main contents of the survey are in Korean because I forgot to make an English version…

From two groups working on the same project, a number of similar tools have been created. I’ve forked the main modules that can be found under IoAF at github. Please feel free to contribute or even fork the projects. We are continuing the project this summer, so hopefully cleaner, consolidated code will be available.

<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: right; margin-left: 1em; text-align: right;"><tbody><tr><td style="text-align: center;">eForensics Magazine - Anti Forensics Techniques</td></tr><tr><td class="tr-caption" style="text-align: center;">eForensics Magazine: Anti Forensics Techniques</td></tr></tbody></table>While the first IoAF group is working on a paper for Digital Investigation, the second group decided to write an article about A general approach to anti-forensic activity detection. This article gives a pretty good literature review about some of the work done in general anti-forensic detection, then shows the investigators how to determine traces created by anti-forensic programs. The work is somewhat similar to the work of Geiger on ‘counter forensics’, but - I believe - the proposed method is easier for investigators to implement or even automate.

Their article can be found in eForensics Magazine Vol. 3 No. 5.
<div style="margin-left: 24pt; text-indent: -24.0pt;">
</div>While the developed tools are currently available on github, the next few months will see them refined. Stay tuned!

1 min read

[BoB] Indicators of Anti-Forensics Investigator Survey (Korean)

The following survey results are from Korean digital forensic investigators concerning the use of anti-forensics observed in their investigations. The survey was conducted by the KITRI Best of the Best (BoB) ‘Indicators of Anti-Forensics’ project group. The questions and responses below are translated from the original Korean.

1. How long have you been doing forensic analysis work?
2. On average, how much time does forensic analysis take per piece of evidence or per disk?
3. Have you heard of anti-forensic detection tools?
4. Have you ever analyzed a system on which anti-forensic tools had been used?
5. If you have analyzed a system on which anti-forensic tools were used, how much more time did it take than analyzing a system without them?
6. Have you ever felt the need for an anti-forensic detection tool?
7. Do you think being able to detect anti-forensic activity before analysis would make the analysis easier?
8. What features would you want in an anti-forensic detection tool?
9. What anti-forensic tools have you seen most often during analysis?


<table border="0" cellspacing="0" cols="9"> <colgroup span="9" width="127"></colgroup> <tbody><tr> <td align="LEFT" bgcolor="#EEEEEE" height="17" sdnum="1033;1033;General" valign="BOTTOM">3년 이상</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">‘4-5일</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">있다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">있다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">예상할 수 없음</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">있다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">용이하다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">안티포렌식의 범위가 어느 정도인지 모르겠음암호 프로그램의 사용부터 전문삭제 프로그램의 사용 또는 steaganography까지 사용하는 것을 전제로 하는 것인지 명확한 정의가 필요할 것 같음.해당 목적에 따라 안티포렌식 탐지 툴이 개별적으로 만들어지고 그것을 마지막에 통합하는 것이 가장 좋을 듯 싶네요.</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">패스워드 설정 암호화</td> </tr><tr> <td align="LEFT" bgcolor="#EEEEEE" height="17" sdnum="1033;1033;General" valign="BOTTOM">1년 이상 ~ 3년 미만</td> <td align="RIGHT" bgcolor="#EEEEEE" sdnum="1033;1033;General" sdval="10" valign="BOTTOM">10</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">없다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">있다</td> <td align="RIGHT" bgcolor="#EEEEEE" sdnum="1033;1033;General" sdval="48" valign="BOTTOM">48</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">있다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">용이하다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">방식과 설치 혹은 실행 날짜해당 방식에 대한 영향을 미치는 범위에 대한 안내</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">루팅툴</td> </tr><tr> <td align="LEFT" bgcolor="#EEEEEE" height="17" sdnum="1033;1033;General" valign="BOTTOM">1년 이상 ~ 3년 미만</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">24~48</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">있다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">있다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">24시간 이상</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">있다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">용이하다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">- 기존의 삭제된 파일의 정보(제목, 시간, 등)- 안티포렌식 도구 정보</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">spaceEraser</td> </tr><tr> <td align="LEFT" bgcolor="#EEEEEE" height="17" sdnum="1033;1033;General" valign="BOTTOM">3년 이상</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">500G기준 4시간</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">있다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">없다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">1-2시간정도</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">있다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">용이하다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">
</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">
</td> </tr><tr> <td align="LEFT" bgcolor="#EEEEEE" height="17" sdnum="1033;1033;General" valign="BOTTOM">3년 이상</td> <td align="RIGHT" bgcolor="#EEEEEE" sdnum="1033;1033;General" sdval="24" valign="BOTTOM">24</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">있다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">있다</td> <td align="RIGHT" bgcolor="#EEEEEE" sdnum="1033;1033;General" sdval="10" valign="BOTTOM">10</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">있다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">용이하다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">Install 여부, Portable 실행여부, 총 실행 횟수 및 삭제된 영역 확인 등</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">CCleaner</td> </tr><tr> <td align="LEFT" bgcolor="#EEEEEE" height="17" sdnum="1033;1033;General" valign="BOTTOM">3년 이상</td> <td align="RIGHT" bgcolor="#EEEEEE" sdnum="1033;1033;General" sdval="24" valign="BOTTOM">24</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">있다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">없다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">-</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">있다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">용이하다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">메모리 해킹 탐토르 네트워크 탐지</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">timestomp</td> </tr><tr> <td align="LEFT" bgcolor="#EEEEEE" height="17" sdnum="1033;1033;General" valign="BOTTOM">3년 이상</td> <td align="RIGHT" bgcolor="#EEEEEE" sdnum="1033;1033;General" sdval="6" valign="BOTTOM">6</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">있다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">있다</td> <td align="RIGHT" bgcolor="#EEEEEE" sdnum="1033;1033;General" sdval="6" valign="BOTTOM">6</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">있다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">용이하다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">타임라인 수정</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">final eraser</td> </tr><tr> <td align="LEFT" bgcolor="#EEEEEE" height="17" sdnum="1033;1033;General" valign="BOTTOM">6개월 이하</td> <td align="RIGHT" bgcolor="#EEEEEE" sdnum="1033;1033;General" sdval="6" valign="BOTTOM">6</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">있다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">없다</td> <td align="RIGHT" bgcolor="#EEEEEE" sdnum="1033;1033;General" sdval="3" valign="BOTTOM">3</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">있다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">용이하다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">먼저 안티포렌식 툴이 실행되는 것을 탐지할 것인지, 실행된 흔적을 탐지할 것인지에서 기능들이 달라지겠지만, 전자의 기준으로 보았을 때, 활성시스템 상태에서 현재 실행중인 프로세스에 대한 검사를 통해 탐지를 하는 퀵서치, 디스크내의 설치된 프로그램을 확인하는 정밀 검사 등의 기능이 있으면 좋을 듯 합니다.</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">일반적인 클리너, Wipe</td> </tr><tr> <td align="LEFT" 
bgcolor="#EEEEEE" height="17" sdnum="1033;1033;General" valign="BOTTOM">1년 이상 ~ 3년 미만</td> <td align="RIGHT" bgcolor="#EEEEEE" sdnum="1033;1033;General" sdval="72" valign="BOTTOM">72</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">있다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">있다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">상황에따라 다름</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">있다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">용이하다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">
</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">ccleaner</td> </tr><tr> <td align="LEFT" bgcolor="#EEEEEE" height="17" sdnum="1033;1033;General" valign="BOTTOM">6개월 이상 ~ 1년 미만</td> <td align="RIGHT" bgcolor="#EEEEEE" sdnum="1033;1033;General" sdval="80" valign="BOTTOM">80</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">있다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">없다</td> <td align="RIGHT" bgcolor="#EEEEEE" sdnum="1033;1033;General" sdval="0" valign="BOTTOM">0</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">있다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">용이하다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">
</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">
</td> </tr><tr> <td align="LEFT" bgcolor="#EEEEEE" height="17" sdnum="1033;1033;General" valign="BOTTOM">1년 이상 ~ 3년 미만</td> <td align="RIGHT" bgcolor="#EEEEEE" sdnum="1033;1033;General" sdval="120" valign="BOTTOM">120</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">없다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">인지한적 없다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">1.3배 정도</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">있다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">용이하다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">완전삭제 탐지, 레지스트리 정보 및 인터넷 삭제 흔적</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">고클린</td> </tr></tbody></table>

1 min read

Convert EnCase hash sets to md5sum

I managed to get a hold of a list of known-bad hashes to use in an experiment. The hashes, however, were in EnCase “.hash” format.
<div>
I am mostly using the Sleuth Kit’s hfind to do some hash comparisons. My setup could already use the NSRL hash sets with no problem, and TSK is supposed to support EnCase hash sets. I was able to create an index for the EnCase hash sets, but when I attempted to query, I would get an error:

Command: hfind db.hash [hash value]
Error: “Cannot determine hash database type (hdb_setupindx: Unknown Database Type in index header: encase)”
There were no responses when I asked about the error on the mailing list, so I looked for other ways to access the hashes.

Finally, I came across Jesse Kornblum’s EnCase hash file converter (encase2txt). The tool built fine in Linux (Ubuntu), and the Windows binary worked with no issue on Windows 7 (64-bit).

Just point the tool at the EnCase hash database and it will output all the hashes in md5sum format. Redirect this plain-text output to a file, and you have an md5sum hash file. From this I was able to build the index (hfind -i md5sum hashes.md5) and query the database with no problems.
Thanks Jesse!

Building and usage: http://jessekornblum.livejournal.com/166275.html

1 min read

Signature Based Detection of User Events for Post-Mortem Forensic Analysis

As seen on DigitalFIRE
The concept of signatures is used in many fields, normally for the detection of some sort of pattern. For example, antivirus and network intrusion detection systems sometimes implement signature matching to attempt to differentiate legitimate code or network traffic from malicious data. The principle of these systems is that, within a given set of data, malicious data will have some recognizable pattern. If malicious code, for example, has a pattern that differs in some way from non-malicious data, then the malicious data may be differentiated with signature-based methods. In terms of malware, however, signature-based methods are becoming less effective as malicious software gains the ability to alter or hide malicious patterns - for example, polymorphic or encrypted code.

This work suggests that signature-based methods may also be used to detect patterns of user actions on a digital system. This is based on the principle that computer systems are interactive: when a user interacts with the system, the system is immediately updated. In this work, we analyzed a user’s actions in relation to timestamp updates on the system.

During experimentation, we found that timestamps on a system may be updated for many different reasons. Our work, however, determined that there are at least three major timestamp update patterns given a user action. We define these as Core, Supporting and Shared timestamp update patterns.

Core timestamps are timestamps that are updated each time, and only when, the user action is executed.

Supporting timestamps are timestamps that are updated sometimes, and only when, the user action is executed.

Shared timestamps are timestamps that are shared between multiple user actions. So, for example, the timestamps of a single file might be updated by two different user actions. With shared timestamps it is impossible to determine which action updated the timestamp without more information.

By categorizing timestamps into these three primary categories, we can construct timestamp signatures to detect if and when a user action must have happened. For example, since only one action can update Core timestamps, the time value of the timestamp is approximately the time in which the user action must have taken place.

The same can be said for Supporting timestamps, but we would expect Supporting timestamp values to be at or before the last instance of the user action.

Using this categorization system, and finding associations of timestamps to user actions, user actions in the past can be reconstructed just by using readily available meta-data in a computer system.
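As an aside, the timestamp meta-data that such signatures are built on is readily collectable; for example, with The Sleuth Kit (image and output file names are examples):

fls -r -m / image.dd > body.txt         # body file of MAC times for every file in the image
mactime -b body.txt -d > timeline.csv   # render the body file as a sortable CSV timeline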

For more information, please see our article on this topic:

James, J., P. Gladyshev, and Y. Zhu. (2011) “Signature Based Detection of User Events for Post-Mortem Forensic Analysis”. Digital Forensics and Cyber Crime: Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering. Volume 53, pp 96-109. Springer. [PDF][arXiv:1302.2395]</div>


2 min read

Forensic Acquisition of a Virtual Machine with Access to the Host

Someone recently asked about an easy way to create a RAW image of virtual machine (VM) disks, so here is a quick how-to.

<div class="separator" style="clear: both; text-align: center;"></div>If you have access to the VM host, you could either copy and convert the virtual disks on the host using something like qemu-img, or if for some reason you cannot convert the virtual disks, you can image the VM from within the virtual environment. This how-to will go through relatively easy ways to image a live or offline virtual machine using the virtual environment with access to the host.

First, if the virtual machine cannot be shut down (live environment), you will make changes to the ‘suspect’ environment. If it is a suspect device, make sure your tools are on write-protected media. Verification of disk images won’t work in a live environment, since the disk is changing while imaging takes place. If you are doing an offline acquisition for forensic purposes, make sure you verify the images once you create them.

If the VM is live and cannot be shut down:
<ul><li>First check if the management interface allows devices to be attached to the VM; specifically USB/Firewire devices.</li><ul><li>If you cannot attach devices for whatever reason, then you will have to use a network share or a netcat tunnel to copy the image.</li><li>Ensure your storage media is larger than the virtual machine’s disk</li></ul><li>If it is a Windows environment, copy your imaging program, like FTK Imager Lite or dd.exe from unxutils, to the network share/USB device. I also like chrysocome’s dd since it lets you list devices.</li><li>In the virtual machine, mount your share/USB device</li><li>From the mounted device, you should be able to access and run the imaging tools you copied previously - ensure you output the image to your share/USB device.</li><ul><li>dd</li><li>FTK Imager Lite</li></ul></ul><div>
</div>If the VM is offline or can be shut down:
<ul><li>First check if you can boot the VM from a CD/USB device</li><ul><li>If yes, use a live CD like FCCU or Helix to boot the VM</li><ul><li>All we are really interested in is 1) an imaging program on the live CD, 2) USB or network support, and 3) netcat installed.</li></ul><li>If no:</li><ul><li>Can you add CD/USB support to the VM?</li><li>Can you copy the VM, and convert it to a RAW image using qemu-img (part of qemu)?</li><li>Boot the VM, and make an image in the live environment (go to the live imaging section)</li></ul></ul><li>After you have booted the VM from a live CD…</li><ul><li>Using external storage to store the image:</li><ul><li>List the current disks - (fdisk -l) - and take note of the current suspect disks to image</li><ul><li>/dev/hda would be a physical disk, while /dev/hda1 would be partition 1 on disk ‘a’</li></ul><li>Attach and mount an external storage disk</li><li>Make a copy of the physical disk (usually something like /dev/sda) using an imaging program like dd or guymager.</li><ul><li>Make sure you are copying the image to your external disk!</li></ul></ul><li>Using the network:</li><ul><li>Network share:</li><ul><li>List the current disks, and take note of the suspect disks to image</li><li>Mount the network share</li><li>Make a copy of the physical disk (usually something like /dev/sda) using an imaging program like dd or guymager.</li></ul><li>No network share:</li><ul><li>Set up a netcat tunnel between a server (any computer you control) and the suspect VM (client) - see the sketch after this list</li><ul><li>Note: this connection is unencrypted!!!</li></ul><li>You can use ssh or cryptcat for an encrypted tunnel, and bzip2 for compression and faster transfer speeds.</li></ul></ul></ul></ul>
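A minimal netcat imaging sketch (the IP address and port are examples; this uses classic netcat syntax - some netcat variants drop the -p flag when listening):

# On the receiving server:
nc -l -p 8888 | dd of=vm_disk.dd bs=4M
# On the suspect VM, booted from the live CD:
dd if=/dev/sda bs=4M | nc 192.168.1.10 8888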
That’s it. Pretty generic, but it should be enough to get anyone started. Please comment if you have any questions.


2 min read

Digital Forensics


[How To] Digital Forensic Memory Analysis - Volatility

This week we begin with a very basic introduction to the memory analysis framework Volatility. We will use Volatility to collect information about a memory image, recover the processes that were running in the system at the time of acquisition, and try to find malicious processes within the memory image. We will cover Volatility in more depth in a later video.

<div class="separator" style="clear: both; text-align: center;"><div class='embed-container'><iframe src='https://www.youtube.com/embed//Cs0Gc3GtfZY' frameborder='0' allowfullscreen></iframe></div></div>

~1 min read

What I’m Reading: Robust bootstrapping memory analysis against anti forensics

Today we are talking about ‘Robust bootstrapping memory analysis against anti-forensics’ by Lee Kyoungho, Hwang Hyunuk, Kim Kibom and Noh BongNam. This paper deals with anti-forensics techniques against memory analysis, as well as using KiInitialPCR as a more tamper-resistant data structure for OS fingerprinting and process list extraction.

K. Lee, H. Hwang, K. Kim, and B. Noh, “Robust bootstrapping memory analysis against anti-forensics,” Digit. Investig., vol. 18, Supplement, pp. S23–S32, Aug. 2016.

Science Direct: http://www.sciencedirect.com/science/article/pii/S1742287616300408
DFRWS Archive: https://www.dfrws.org/file/712/download?token=sWs0HHYB


<div class="separator" style="clear: both; text-align: center;"><div class='embed-container'><iframe src='https://www.youtube.com/embed//MBjFTrhcusE' frameborder='0' allowfullscreen></iframe></div></div>

<iframe seamless="" src="https://bandcamp.com/EmbeddedPlayer/track=3508204421/size=large/bgcol=ffffff/linkcol=0687f5/tracklist=false/artwork=small/transparent=true/" style="border: 0; height: 120px; width: 100%;">WIR:E02 - Robust bootstrapping memory analysis against anti forensics by Joshua I. James</iframe>

~1 min read

[How To] Digital Forensic Memory Analysis - strings, grep and photorec

This week we will show how to use basic data processing tools strings, grep and photorec to start an analysis of a Random Access Memory (RAM) image, even if we currently know nothing about the image. These methods are extremely basic types of analysis, but they are also fast and can produce some interesting results.

<div class="separator" style="clear: both; text-align: center;"><div class='embed-container'><iframe src='https://www.youtube.com/embed//4XoidAheuJE' frameborder='0' allowfullscreen></iframe></div</div>

~1 min read

[How To] Forensic Memory Acquisition in Linux - LiME

This week we will be using LiME to acquire a memory image from a suspect Linux system. LiME is a loadable kernel module that needs to be compiled for the specific kernel (and architecture) of the suspect device. We show the basics of compiling, and how to load the kernel object to copy a RAW memory image.

<div class="separator" style="clear: both; text-align: center;"><div class='embed-container'><iframe src='https://www.youtube.com/embed//_7Tq8dcmP0k' frameborder='0' allowfullscreen></iframe></div></div>

~1 min read

[How To] Forensic Data Recovery in Linux - tsk_recover

This week we will talk about The Sleuth Kit, and specifically the tool tsk_recover, which is useful for recovering allocated and unallocated files. tsk_recover is a good quick solution, but other tools tend to carve data better. I recommend using it in conjunction with other tools in an automated processing chain.

<div class="separator" style="clear: both; text-align: center;"><div class='embed-container'><iframe src='https://www.youtube.com/embed/MS6zruRaxyA' frameborder='0' allowfullscreen></iframe></div></div></div>

~1 min read

[How To] Forensic Data Recovery in Windows - Photorec

This week we will show how to use PhotoRec to recover data from a suspect disk image. PhotoRec supports the recovery of many different file types, but we will focus on JPG image recovery. PhotoRec works in Windows, Mac and Linux, and is a useful tool for automating data recovery on suspect disk images.

<div class="separator" style="clear: both; text-align: center;"><div class='embed-container'><iframe src='https://www.youtube.com/embed//PTbgDEhqx1k' frameborder='0' allowfullscreen></iframe></div></div>

~1 min read

Warning to Forensic Investigators: USB KILLER

This post is informational for digital forensic investigators and first responders: be aware of the ‘USB Killer’. Very basically, it is a USB device containing a high-voltage capacitor that charges from the USB power supply, then releases a large charge directly into the USB data bus, potentially destroying the motherboard.
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody><tr><td style="text-align: center;"></td></tr><tr><td class="tr-caption" style="text-align: center;">USB Killer device from USB Kill [https://www.usbkill.com]</td></tr></tbody></table>The device itself is made for ‘penetration testers’ to test the physical security of a system. The device shown is from USB Kill, but such a device would be trivial to create using any USB device and a high-voltage capacitor - like so.

Here are some comments on Reddit about whether a suspect would be liable if the police seize one of these and fry the investigation computer / write blocker.

This device is not to be confused with the USB Kill Switch, which checks whether devices are added or removed and shuts the system down. The USB Killer focuses on physical damage.

Unfortunately, I’ve not seen much information on forensic forums about these types of devices. SANS and Forensic Focus have some short articles on it. The device looks like a normal USB stick. Be sure to check any USB devices before imaging.

~1 min read

[How To] Forensic Acquisition in Windows - FTK Imager

In this video we show how to do a forensic acquisition of a suspect disk using FTK Imager in Windows.

<div class="separator" style="clear: both; text-align: center;"><div class='embed-container'><iframe src='https://www.youtube.com/embed/TkG4JqUcx_U' frameborder='0' allowfullscreen></iframe></div></div></div>

~1 min read

[How To] Forensic Acquisition in Linux - Guymager

This video shows how to acquire a forensic disk image of a suspect device in Linux using Guymager. Guymager is an extremely fast digital forensic imaging tool (the fastest in our experiments). It allows for the acquisition of many types of devices to RAW and Expert Witness formats.

<div class="separator" style="clear: both; text-align: center;"><div class='embed-container'><iframe src='https://www.youtube.com/embed//mqHx7HutQLo' frameborder='0' allowfullscreen></iframe></div></div>

~1 min read

[How To] Forensic Acquisition in Linux - DCFLDD

This video shows how to use DCFLDD to acquire a disk image from a suspect device in the Linux command line. DCFLDD is an expanded version of ‘dd’ that supports additional features that are useful for digital forensic acquisitions.

<div class="separator" style="clear: both; text-align: center;"><div class='embed-container'><iframe src='https://www.youtube.com/embed//5zSnCeaK-80' frameborder='0' allowfullscreen></iframe></div></div>

~1 min read

[How To] Forensic Data Acquisition - Hardware Write Blockers

In this video we will show external write-blockers and describe how they are used to prevent writing data to suspect devices. We will talk about bottlenecks in the connection and how to make sure your acquisition is as fast as possible.

<div class="separator" style="clear: both; text-align: center;"><div class='embed-container'><iframe src='https://www.youtube.com/embed/7eT8KSHMGFw' frameborder='0' allowfullscreen></iframe></div></div></div>

~1 min read

[How To] Copy a disk image to a physical hard drive using DD

In this video we show how to copy a disk image to a physical hard drive using DD in Linux. This is useful for working with live disk images (Linux live CDs), or potentially copying suspect data to an external disk for testing.
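A minimal sketch of the copy, assuming the image is named disk.img and the target drive appears as /dev/sdb (double-check the device name first; dd will happily overwrite the wrong disk):

# confirm which device is the target
fdisk -l
# write the image to the physical drive (status=progress needs a recent GNU dd), then flush buffers
dd if=disk.img of=/dev/sdb bs=4M status=progress
sync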

<div class="separator" style="clear: both; text-align: center;"><div class='embed-container'><iframe src='https://www.youtube.com/embed//N17rPCj9ye8' frameborder='0' allowfullscreen></iframe></div></div>

~1 min read

[How to] Create and verify a multi-part disk image with FTK Imager

This video shows how to make a disk image using FTK Imager on a Windows system.

FTK Imager is an easy-to-use tool for copying data from suspect disks, and has other functions such as verification features and a hex view. It is a simple, stable tool that is useful at the beginning of an investigation.

<div class="separator" style="clear: both; text-align: center;"><div class='embed-container'><iframe src='https://www.youtube.com/embed//-jtRS7RTeoA' frameborder='0' allowfullscreen></iframe></div></div>

~1 min read

[How-to] Load a multi-part disk image into FTK Imager

When working with multi-part disk images, it can be hard to tell whether your tool has loaded the entire image or just one part. Below is one way to determine whether FTK Imager has loaded your entire disk image or only the first part.

<div class="separator" style="clear: both; text-align: center;"><div class='embed-container'><iframe src='https://www.youtube.com/embed//bW7BBcbl_Vc' frameborder='0' allowfullscreen></iframe></div></div>

<h3>Verifying your disk image</h3>When working with your disk image, verification of the data should always be included in your workflow. In the case of a multi-part image, we should have at least two kinds of hashes:

<ul><li>A hash for the total disk image</li><li>A hash for each part of the disk image</li></ul><div>This is especially true for raw disk images, since they have no built-in checksums, unlike the Expert Witness Format.</div><div>
</div><div>A hash for the total disk image is normally created by your acquisition tool, and can be found in the acquisition report. FTK Imager does not create a hash for each part of a multi-part image.</div><div>
</div><div>In this case, we may need to generate our own hashes using FTK, or another tool.</div><h4>Why do I need hashes for each part?</h4><div>If you have a hash value for the overall disk image, then - in terms of court - you will be able to show that the suspect data has not changed from the time that the disk was first acquired. However, having hashes of each part of the image can help in one major way.</div><div>
</div><div>The Expert Witness Format that EnCase uses has a checksum every 32KB, which enables verification of parts of a disk image. If one part of a disk image changes, we can potentially still use the other parts of the image that can be verified with their checksums, even if the overall hash cannot be verified.</div><div>
</div><div>With a multi-part RAW image, we can get similar functionality by hashing each part. Each part can then be verified, along with the overall hash. If the overall hash is not valid, the hashes of each part can be used to determine which part has changed. Other parts that can be verified may still be used.</div><h3>Loading a multi-part image</h3><div>When many tools load a multi-part image, they may only show the filename of the first part of the image. If the tool is made ‘for forensics’, then the tool will likely load the entire image under the first filename. In this case, verify that the tool can:</div><div><ol><li>Detect the full size of the original disk image</li><li>Generate the correct hash value for the original image</li></ol></div>
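From the command line, both kinds of hashes are quick to generate for a raw multi-part image. A minimal sketch, assuming the parts are named image.001, image.002, and so on:

# hash each part individually (the shell expands image.* in sorted order)
md5sum image.* > part_hashes.md5
# hash the whole image by streaming the parts, in order, through md5sum
cat image.* | md5sum

The piped hash should match the overall acquisition hash reported by your imaging tool.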

1 min read

Postdoctoral Positions Available at Hallym University, South Korea

Hello everyone! We have an opportunity for postdoctoral research positions. Positions with the Legal Informatics and Forensic Science Institute at Hallym University provide support for up to 5 years. Applicants must have obtained a PhD no more than 5 years ago. A background in criminal justice or computer science is preferred. If you are interested, please email your CV and a short introduction to [email protected] by 20th April, 2016.

Please forward to anyone that may be interested.

~1 min read

Open Source Tools Accepted in Court

Reply to an email I received:<div class="separator" style="clear: both; text-align: center;"></div><div>
<div>
<div><div>Is it possible to use Linux live CDs (or open source software) without trouble in court?</div><div>
</div><div>The answer is yes, certainly.</div><div>
</div><div>First, there is precedent in North America and Europe. See this relatively old article from Italy [http://nannibassetti.com/digitalforensicsreport2007.pdf].</div><div>
</div><div>For a full discussion about open source tools in court, I highly recommend the following paper: http://www.digital-evidence.org/papers/opensrc_legal.pdf</div><div>
</div><div>Very basically, to have evidence obtained using open source tools / Linux live CDs accepted in court, you need to prove that the tools give ‘correct’ results and do not modify potential evidence. Check local court rules for any additional standards that need to be met. If you need any help with tool testing, please contact me.</div><div>
</div><div>For example, if your courts already accept EnCase and you want to compare acquisition and hashing, you can do the following:</div><div>1) acquire the data with EnCase and create a hash of the data</div><div>2) acquire the data with an open source tool and create a hash of the data</div><div>3) compare the hashes of the suspect data (should be the same)</div><div>4) repeat with 5+ different exhibits to show that the same result is always found</div><div>
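As a quick sketch of the comparison in step 3, assuming EnCase exported a raw image named encase.raw and the open source tool produced opensource.dd (both filenames are just examples):

# identical hash values mean both tools acquired identical data
md5sum encase.raw opensource.dd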
</div><div>If your courts accept EnCase, and you can demonstrate that an open source tool produces the same result, then there is a strong argument that the open source tool should also be accepted.</div><div>
</div><div>A procedure for tool testing should be created in your unit, if it does not already exist.</div><div>
</div><div>You might also be interested in the Open Source Digital Forensics Conference in the U.S.: http://www.osdfcon.org/</div><div>
</div><div>Please let me know if you need any help with testing, or if you have any further questions.</div></div></div></div>

1 min read

Finding private IP addresses in Email Headers

In some cases it may be necessary or helpful to find the private IP of a suspect. This can be difficult, especially since NAT is common in most networks. However, if a suspect is sending emails from a local client, the private, as well as public, address may be available in the email header.


If Gmail is used with a local client (like Thunderbird, Outlook, etc.), then the email header should have the private IP address. Note that it is possible that some of this information is stripped by the client or the client's network before reaching the SMTP server. Take a look below:

—– Mail sent from Thunderbird using googlemail SMTP —–
Received: from [10.0.0.101] ([211.111.111.111]) <— here you can see the private (10.0.0.101) and public (211.111.111.111) IP address of the sender connecting to the SMTP server.
by smtp.googlemail.com with ESMTPSA id <– this line tells you that the message was received by SMTP
for <[email protected]>
Mon, 02 Nov 2015 23:01:38 -0800 (PST)
To: Joshua James <[email protected]>
From: “Joshua I. James” <[email protected]>


If the email is sent from the Gmail web interface (in the browser), the private IP address is NOT available. Google’s server only sees the suspect’s public IP address accessing the Google web server.

——- Sent from gmail web interface ——
Received: by 10.50.10.233 with HTTP; <—- “with HTTP” means received via web interface on server 10.50.10.233 (google). The sender’s IP is not shown.
Date: Tue, 3 Nov 2015 16:08:03 +0900
Subject: test2
From: “Joshua I. James” <[email protected]>
To: “Joshua I. James” <[email protected]>

If the header only shows Google’s address, then the suspect must have been accessing the web interface (check for “with HTTP”). In that case, Google will only have the public IP of the suspect.
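If you have the raw message saved locally (the filename message.eml is just an example), a couple of grep commands pull out the relevant lines:

# show the Received chain; folded header lines continue on the indented lines that follow
grep -i -A 1 '^Received:' message.eml
# check whether the message was submitted through the web interface
grep -i 'with HTTP' message.eml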

1 min read

ICDF2C 2015 in Seoul, South Korea Final Program Now Available

The 7th EAI International Conference on Digital Forensics & Cyber Crime will be held OCTOBER 6–8, 2015 in SEOUL, SOUTH KOREA.

The final program is now available at http://d-forensics.org/2015/show/program-final
Be sure to register so you don’t miss the exciting talks and tutorials!

Keynote speakers include Max Goncharov from Trend Micro, Inc, and Dr. Dave Dampier from Mississippi State University:

<div class="separator" style="clear: both; text-align: center;"></div>Max Goncharov is a senior security Virus Analyst with Trend Micro Inc., and is responsible for cybercrime investigations, security consulting to business partners (internal, external), creation of security frameworks, designing technical security architecture, overseeing the build out of an enterprise incident response process, and creation of the enterprise risk management program. During his 15 years with Trend Micro Inc, he has participated as a speaker in various conferences and training seminars on the topic of cybercrime and related issues. He has especially focues on cyberterrorism, cybersecurity, underground economy; such as DeepSec, VB, APWG, etc.


Dr. Dave Dampier is a Professor of Computer Science & Engineering at Mississippi State University specializing in Digital Forensics and Information Security. He currently serves as Director of the Distributed Analytics and Security Institute, the university-level research center charged with Cyber Security Research. In his current capacity, Dr. Dampier is the university lead for education and research in cyber security. Prior to joining MSU, Dr. Dampier spent 20 years on active duty as an Army Automation Officer. He has a B.S. degree in Mathematics from the University of Texas at El Paso, and M.S. and Ph.D. degrees in Computer Science from the Naval Postgraduate School. His research interests are in Cyber Security, Digital Forensics and Software Engineering.


There will also be three tutorials on investigation, open source hardware for digital investigations and setting up a research environment for mobile malware research:

<ul><li>Tutorial 1: DUZON – Desktop Exercise: Crafting Information from Data</li><li>Tutorial 2: Pavel Gladyshev – FIREBrick; an open forensic device</li><li>Tutorial 3: Nikolay Akatyev – Researching mobile malware</li></ul><div>After the first day of the conference we are also holding a special discussion session with Seoul Tech Society called “Safe Cyberspace”, with the panel consisting of the winners of the ICDF2C/STS essay contest. Everyone is welcome to join!</div><div>
</div><div>I hope to see you at ICDF2C in Seoul, South Korea! Don’t miss this exciting opportunity.</div>

1 min read

Revisiting REAPER: Automating digital forensic investigations

The Rapid Evidence Acquisition Project for Event Reconstruction [1] was one of the first projects I worked on during my PhD. It started around 2008, when I became interested in trying to completely automate digital forensic investigations. Yes, it sounds impossible, but I wanted to see how far we could go in automatically handling digital evidence.<div>
</div><div>This was a little before digital forensic triage [2] and preliminary analysis gained popularity.</div><div>
<div>The idea was that once the process started, the investigator would not need to interact with the system. At the end of the automated investigation process, the “smoking gun” would be presented to the investigator in context.</div><div>
</div><div>Literally push-button forensics.</div><div>
</div><div>The Process</div><div>An investigator would insert a forensic live CD into the suspect’s computer (post-mortem). After starting the computer, the live CD (with an attached external disk) would provide only an information panel showing the stage of the investigation process.</div><div>
</div><div>First, REAPER would check the suspect computer to see what disks it could access, and if there was encryption / hidden data. If hidden / encrypted data was detected, it would try to recover / access the data. With toy examples, this worked, but how it would work on real systems - especially now - I’m not sure. All detectable media would be hashed, and verbose logging was on by default (for every action).</div><div>
</div><div>Next, all detectable media would be automatically imaged to the investigator’s external disk. Once complete, the images would be verified. If verification failed, the disk would be re-imaged.</div><div> </div><div>Next, REAPER would start standard carving, parsing and indexing. The Open Computer Forensic Architecture was used to extract as much data as possible. OCFA is an extremely powerful architecture, but the open source version is a bit difficult to use (especially from a live CD). I understand that the NFI has a commercial front-end that makes working with it much easier.</div><div>
</div><div>Once all data has been acquired, verified and processed, the actual investigation / analysis should take place.</div><div>
</div><div>Here is where things get tricky.</div><div>
</div><div>First, we have to know what the investigation question is, and we have to ‘tell’ the system what the investigation question is. We currently do this by specifying the type of investigation generally, for example, “hacking” or “child exploitation”. We then have a (manually) pre-set list of tasks related to those particular types of crime. Alternatively, we could search for ‘all crimes’.</div><div>
</div><div>Here, some basic analysis could take place. For example, we could automatically determine attack paths of intrusions based on processed data [3]. We could also test whether it was possible / impossible for a certain statement to be true based on the current state of the system [4]. Also, by building up ‘knowledge’ (models) about systems before an investigation, we could also accurately, automatically determine user actions using traces that are difficult for humans to analyze [5].</div><div>
</div><div>Where it falls apart</div><div>The problem is, we are still essentially in the processing phase of the investigation. We are condensing the available information into a usable form, but we are not yet saying what this information means in the context of the investigation. While we can gain more information about the data in an automated way, a human still needs to ‘make sense’ of the information.</div><div>
</div><div>Even though we are not there yet, automation has been shown to be useful for investigations [6], and can help reduce the time for investigations while improving the accuracy [7] of the investigation. For more comments on automation in investigations, please see [8].</div><div>
</div><div>
</div><div><ol><li>James, J. I., Koopmans, M., & Gladyshev, P. (2011). Rapid Evidence Acquisition Project for Event Reconstruction. In The Sleuth Kit & Open Source Digital Forensics Conference. McLean, VA: Basis Technology. Retrieved from http://www.basistech.com/about-us/events/open-source-forensics-conference/2011/presentations/</li><li>Koopmans, M. B., & James, J. I. (2013). Automated network triage. Digital Investigation, 1–9. http://doi.org/10.1016/j.diin.2013.03.002</li><li>Shosha, A. F., James, J. I., & Gladyshev, P. (2012). A novel methodology for malware intrusion attack path reconstruction. In P. Gladyshev & M. K. Rogers (Eds.), Lecture Notes of the Institute for Computer Sciences, Social-Informatics and Telecommunications Engineering (Vol. 88 LNICST, pp. 131–140). Springer Berlin Heidelberg. http://doi.org/10.1007/978-3-642-35515-8_11</li><li>James, J., Gladyshev, P., Abdullah, M. T., & Zhu, Y. (2010). Analysis of Evidence Using Formal Event Reconstruction. In Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering (pp. 85–98). Springer Berlin Heidelberg. http://doi.org/10.1007/978-3-642-11534-9_9</li><li>James, J. I., & Gladyshev, P. (2014). Automated inference of past action instances in digital investigations. International Journal of Information Security. http://doi.org/10.1007/s10207-014-0249-6</li><li>James, J. I., & Gladyshev, P. (2013). A survey of digital forensic investigator decision processes and measurement of decisions based on enhanced preview. Digital Investigation, 10(2), 148–157. http://doi.org/10.1016/j.diin.2013.04.005</li><li>James, J. I., Lopez-Fernandez, A., & Gladyshev, P. (2014). Measuring Accuracy of Automated Parsing and Categorization Tools and Processes in Digital Investigations. In Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering (pp. 147–169). Springer International Publishing. http://doi.org/10.1007/978-3-319-14289-0_11</li><li>James, J. I., & Gladyshev, P. (2013). Challenges with Automation in Digital Forensic Investigations. Computers and Society. Retrieved from http://arxiv.org/abs/1303.4498</li></ol></div></div>

4 min read

Child Exploitation Forensic Tool: NuDetective

I met some Brazilian law enforcement officers at the 2014 World Forensic Festival. They talked about online child exploitation in Brazil, and a tool they developed called “NuDetective”. The NuDetective tool AND training are free for law enforcement (from the Brazilian Police).

For more information please see (Portuguese): http://www.eleuterio.com
Or contact nudetective (at) gmail (dot) com.

From: http://www.eleuterio.com/nudetective.html

NuDetective is a program developed in Java that supports the detection of juvenile pornography files on site during search and seizure and at crime scenes where pedophilia is suspected. The software was developed entirely by federal criminal experts (PCFs) Pedro M. S. Eleutério and Mateus C. Polastro, and may be used by law enforcement and public entities for free.

The idea for NuDetective arose from the November 25, 2008 amendment to Brazil’s Child and Adolescent Statute (ECA), which made possession of juvenile pornography files a crime. Criminal experts therefore needed a way to quickly identify illegal files, even at crime scenes, among the millions of files that can be stored on a computer. The tool was developed for this purpose, and currently uses four main features, including the new video analysis, to detect suspicious files:

<ul><li>Image analysis: the software performs automatic detection of nude images through skin-pixel identification and computational geometry techniques.</li><li>File name analysis: NuDetective checks file names in order to detect the expressions most commonly used in pedophilia.</li><li>Hash analysis: the program compares the hash values of files against a list of known illegal files.</li><li>Video analysis (new): the program calculates an ideal sample and extracts frames from videos, applying the nudity-detection algorithms from the image analysis to the frames, allowing the identification of juvenile pornography videos.</li></ul>
The authors have published several articles on the development of the tool. It was first shown, along with its results, at IEEE DEXA’10 in Bilbao, Spain. In 2011, the authors presented NuDetective at the global gathering of forensic researchers, the 19th IAFS World Meeting (International Association of Forensic Sciences), and introduced it to a number of other countries and forensic institutions/laboratories. In 2012, the video-detection strategy was first published at IEEE WSDF (ARES’12) in Prague, Czech Republic, a research paper that brought great advances to the state of the art in the detection of juvenile pornography files. Currently, police forces in many countries use the tool, a unique contribution to computer forensics and to the protection of children and adolescents. The tool supports Portuguese, English and Spanish, but can be easily translated into new languages. In 2014, the authors presented the tool at the main computing congress of Argentina (JAIIO), which published another article on NuDetective showing the results of using the tool in Brazil in the fight against pedophilia.

NuDetective is free and for the exclusive use of law enforcement and public institutions. For more information, send email to nudetective (at) gmail (dot) com. This is the official communication channel for the tool, where you can request more information.

2 min read

[CFP] ICDF2C 2015

Call for papers for the 7th International Conference on Digital Forensics and Cyber Crime (ICDF2C)

Conference Dates: October 6 - 8, 2015
Location: Seoul, South Korea
Paper Submission: 30 March, 2015 (any time zone)

Website: d-forensics.org
<div style="background-color: white; border: 0px; font-stretch: inherit; margin: 0px 0px 0.5em; padding: 0px; vertical-align: baseline;">
</div><div style="background-color: white; border: 0px; font-stretch: inherit; margin: 0px 0px 0.5em; padding: 0px; vertical-align: baseline;">The International Conference on Digital Forensics and Cyber Crime (ICDF2C) brings together leading researchers, practitioners, and educators from around the world to advance the state of the art in digital forensic and cyber crime investigation. Keeping up with our international and collaborative nature at ICDF2C, we are proud to announce that ICDF2C 2015 will run jointly with the Korean Digital Forensic Society’s Annual Conference (KDFS 2015).</div><div style="background-color: white; border: 0px; font-stretch: inherit; margin: 0px 0px 0.5em; padding: 0px; vertical-align: baseline;">ICDF2C 2015 will be held October 6 - 8, 2015 in Seoul, South Korea. We invite contributions for completed research papers, research-in-progress papers, industrial talks, panel and tutorial proposals, and round table discussions. Research papers are evaluated through a double-blind, peer-reviewing process and accepted research papers will be published in printed proceedings by Springer-Verlang.
</div><div style="background-color: white; border: 0px; font-stretch: inherit; margin: 0px 0px 0.5em; padding: 0px; vertical-align: baseline;"></div><h3>Special Themes</h3>This year, we have two themes that we intend to embrace. Authors are encouraged to submit papers relating to these themes:
<ul><li>Usage, implications and investigation of the “Dark Web”</li><li>Case studies and investigation techniques relating to cryptocurrencies</li></ul>
<h3>SCOPE</h3>The Internet has made it easier to perpetrate crimes by providing criminals an avenue for launching attacks with relative anonymity. The increased complexity of global communication and networking infrastructure and devices makes investigation of cybercrimes difficult. Clues of illegal activities are often buried in large volumes of data that need to be sifted through in order to detect crimes and collect evidence. The field of digital forensics and cybercrime investigation has become very important for law enforcement, national security, and information assurance. Digital forensics and cybercrime investigations are multidisciplinary areas that encompass law, computer science, finance, telecommunications, data analytics, policing and more. ICDF2C brings together practitioners and researchers from diverse fields, providing opportunities for business and intellectual engagement among attendees.
The following topics highlight the conference’s theme:<ul><li>Anti Forensics and Anti-Anti Forensics</li><li>Big Data and Digital Forensics</li><li>Business Applications of Digital Forensics</li><li>Civil Litigation Support</li><li>Cloud Forensics</li><li>Cyber Crime Investigations</li><li>Cyber Criminal Psychology and Profiling</li><li>Cyber Culture & Cyber Terrorism</li><li>Data hiding and steganography</li><li>Database Forensics</li><li>Digital Forensic Science</li><li>Digital Forensic Tool Testing and validation</li><li>Digital Forensic Trends</li><li>Digital Forensics & Law</li><li>Digital Forensics and Error rates</li><li>Digital Forensics novel algorithms</li><li>Digital Forensics Process & Procedures</li><li>Digital Forensics Standardization & Accreditation</li><li>Digital Forensics Techniques and Tools</li><li>Digital Forensics Triage</li><li>e-Discovery</li><li>Hacking</li><li>Incident Response</li><li>Information Warfare & Critical Infrastructure Protection</li><li>Law Enforcement and Digital Forensics</li><li>Machine learning and Digital Forensics</li><li>Malware & Botnets</li><li>Mobile / Handheld Device & Multimedia Forensics</li><li>Money Laundering</li><li>Network forensics</li><li>New chip-off techniques</li><li>Novel Digital Forensics Training programs</li><li>Online Fraud</li><li>Programming Languages and Digital Forensics</li><li>SCADA Forensics</li><li>Sexual Abuse of Children on Internet</li><li>Software & Media Piracy</li><li>Theoretical Foundations of Digital Forensics</li><li>Traditional Criminology applied to Digital Forensics</li><li>Philosophical accounts for Cyber Crime and Digital Forensics</li></ul>
<h3>RESEARCH PAPERS</h3>Papers describing original unpublished research are solicited. Submissions must not be concurrently under review by a conference, journal or any other venue that has proceedings. Papers in the topic areas discussed are preferred, although contributions outside those topics may also be of interest. Please feel free at any time to contact the conference general chair if you have questions regarding your submission.
<h3>BEST PAPER AWARD</h3>The program committee may designate up to three papers accepted to the conference as ICDF2C Best Papers. Every submission is automatically eligible for this award.
<h3>OTHER SUBMISSION CATEGORIES</h3>Submissions can be made in a number of categories: Completed research papers, research-in-progress papers, industrial talks, panel and tutorial proposals, and round table discussions. Please follow the following guidelines in preparing your submission.
<ul><li>Completed Research Papers: No longer than 10 pages (including abstract, figures, tables and references).</li><li>Research in Progress Papers: No longer than 6 pages (including abstract, figures, tables and references).</li><li>Industrial Talk: Typically a 1,000 word description of the proposed talk. All talks must be vendor neutral.</li><li>Round Table Discussion: Typically a 1,000 word synopsis of the topic area.</li><li>Panel Proposals: Typically a 1,000 word description, identifying the panelists to be involved.</li><li>Tutorial Proposals: Typically a 1,000 word description of topic(s), potential speakers, program length, and potential audience. Also, include proposer resume(s).</li></ul>
<h3>SUBMISSION INSTRUCTIONS</h3>Paper submission will be handled electronically. Papers must be formatted using Springer LNICST Authors’ Kit (http://d-forensics.org/2015/show/authors-kit) and submitted only through Easychair.org by going here: https://www.easychair.org/conferences/?conf=icdf2c2015.
<div style="background-color: white; border: 0px; font-stretch: inherit; margin: 0px 0px 0.5em; padding: 0px; vertical-align: baseline;">
All submitted papers will be judged based on their quality through double-blind reviewing. Authors’ names must not appear in the paper. All other submissions should be sent via email to the conference general chairs (Dr. Joshua I. James joshua at cybercrimetech dot com).
</div><h3>PUBLICATIONS</h3>Accepted papers will be published in the ICDF2C 2015 Conference Proceedings and by Springer-Verlag in the Lecture Notes of the Institute for Computer Sciences, Social-Informatics and Telecommunications Engineering (LNICST) series.
<div style="background-color: white; border: 0px; font-stretch: inherit; margin: 0px 0px 0.5em; padding: 0px; vertical-align: baseline;">
The proceedings will be available both as paper-based copies and via Springerlink, Springer’s digital library. In addition, the content of the proceedings will be submitted for inclusion in leading indexing services, including DBLP, Google Scholar, ISI Proceedings, EI, CrossRef and Zentralblatt Math, as well as ICST’s own EU Digital Library (EUDL).
</div><div style="background-color: white; border: 0px; font-stretch: inherit; margin: 0px 0px 0.5em; padding: 0px; vertical-align: baseline;">
Further, we are partnering with Elsevier’s “Digital Investigation: The International Journal of Digital Forensics & Incident Response” to invite expanded versions of specially selected papers for inclusion in their SCI-indexed publication.
</div><h2 style="background-color: white; border: 0px; font-stretch: inherit; margin: 0px 0px 0.5em; padding: 0px; vertical-align: baseline;"></h2>

4 min read

[How To] Installing LIBEWF in Ubuntu Trusty

Installing LIBEWF is normally straightforward. Usually the most difficult part is remembering which packages are required for the dependencies. When running configure, I always like to have all the optional “support” dependencies satisfied. While some of these are not necessary, you may find yourself needing them someday and having to recompile.

<div class="separator" style="clear: both; text-align: center;"></div>On a almost-brand-new install of Ubuntu Trusty (64bit), these are the required packages:

apt-get install build-essential autoconf automake libfuse-dev uuid-dev libbz2-dev zlib1g-dev

Then you just download LIBEWF, untar it, and run ./configure. All dependencies should now be satisfied.

From there, a simple make is all it takes to start working with forensic file formats.
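Putting it all together, the whole build looks something like this (the tarball name is a placeholder; download the current release from the libewf project page):

tar -xzf libewf-VERSION.tar.gz
cd libewf-VERSION
./configure        # check the summary to confirm the 'support' dependencies were found
make
sudo make install && sudo ldconfig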

~1 min read

[Hash sets] Korea University DFRC Reference Data Set

If you work in the area of digital investigation, you probably know about NIST’s National Software Reference Library (NSRL).
<blockquote>The National Software Reference Library (NSRL) is designed to collect software from various sources and incorporate file profiles computed from this software into a Reference Data Set (RDS) of information. The RDS can be used by law enforcement, government, and industry organizations to review files on a computer by matching file profiles in the RDS. This will help alleviate much of the effort involved in determining which files are important as evidence on computers or file systems that have been seized as part of criminal investigations.</blockquote>In other words, the NSRL is a very large collection of file hashes for ‘known’ software. In most cases it can be treated as a known good hash set, for filtering out potentially uninteresting files from a case.

The NSRL hashes are also hosted on hashsets.com, allowing you to query their database for a particular hash if you don’t want to store the files locally.

<div class="separator" style="clear: both; text-align: center;"></div>
NSRL is very useful, but it does have some limitations. The Korea University Digital Forensic Research Center is attempting to solve some of these limitations by providing the DFRC Reference Data Set.

Their Reference Data Set includes hash values from software used in South Korea as well as the NSRL. Currently, they have over 27 million hashes. Best of all, they provide a number of interfaces to test your data with. You can upload a suspect file directly, upload a list of hashes, search for a single SHA1 or MD5, or query their RDS via their REST interface! This way you can call their hash database directly from your tools.
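I won't reproduce their API here, but calling a REST hash-lookup service from your own scripts is typically a one-liner. A hypothetical sketch (the URL and parameter name are placeholders, not the DFRC's actual endpoint):

# hypothetical endpoint -- substitute the real DFRC RDS API details
curl "https://rds.example.org/api/lookup?md5=d41d8cd98f00b204e9800998ecf8427e"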

Of course, you can also directly download their entire RDS.


1 min read

Indicators of Anti-Forensics

Project: Indicators of Anti-Forensics (IoAF)
Purpose: Digital forensic triage for anti-forensic activities
Status: Active
License: GNU GPLv3
Developer(s): KITRI’s Best of the Best Information Security Program

More information:
The ‘Indicators of Anti-Forensics’ (IoAF) project is an effort towards automated anti-forensic trace detection using signature-based methods. Each “version” represents the work of different KITRI Best of the Best groups to advance the idea.

The main IoAF program uses parsing modules to extract file meta-data and Registry key information from a system under investigation. Pre-defined signatures are stored in a SQLite database that is queried for each extracted object.

Signatures are created by using either real-time or snapshot based analysis on a similar system. Objects that are consistently updated by the action of interest are extracted, and further tested (e.g. how the object is updated). If the object is found to consistently correspond to the action of interest - and only the action of interest - it is included as a trace in the signature.
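Conceptually, detection is then just a lookup of each extracted object against the signature database. A hypothetical sketch using the sqlite3 command line (the table and column names are illustrative, not IoAF's actual schema):

# hypothetical schema: does this extracted Registry path match a known anti-forensic signature?
sqlite3 ioaf.db "SELECT tool_name FROM signatures WHERE object_path = 'HKCU\Software\ExampleEraser';"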

The purpose of the project so far is not to automatically reconstruct activities, but to quickly detect the presence of anti-forensic traces to let investigators know whether they should pay more attention to a device over others (digital forensic triage).

Related Publications:

<ul><li>James, J. I., Kim, M. S., Choi, J., Lee, S. S., & Kim, E. (2014). A General Approach to Anti-Forensic Activity Detection. eForensics Magazine, vol.3(5). 30–35. [Link]</li></ul>
Links:

<ul><li>Github Repository: https://github.com/hvva/IoAF</li></ul>

1 min read

[BoB] Anti Forensics Techniques and eForensics Mag


<div style="clear: right; float: right; margin-bottom: 1em; margin-left: 1em;">
</div>
As a mentor with KITRI’s “Best of the Best v2.0” information security education program, I work with a digital forensic analysis research group. This group is specifically focusing on anti-forensic action detection, which fits pretty closely with my dissertation work. The first group members produced a brief survey of anti-forensics encountered in the ‘wild’ by Korean law enforcement. The main contents of the survey are in Korean because I forgot to make an English version…

Two groups working on the same project have created a number of similar tools. I’ve forked the main modules, which can be found under IoAF on GitHub. Please feel free to contribute or even fork the projects. We are continuing the project this summer, so hopefully cleaner, consolidated code will be available.

<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: right; margin-left: 1em; text-align: right;"><tbody><tr><td style="text-align: center;">eForensics Magazine - Anti Forensics Techniques</td></tr><tr><td class="tr-caption" style="text-align: center;">eForensics Magazine: Anti Forensics Techniques</td></tr></tbody></table>While the first IoAF group is working on a paper for Digital Investigation, the second group decided to write an article about A general approach to anti-forensic activity detection. This article gives a pretty good literature review about some of the work done in general anti-forensic detection, then shows the investigators how to determine traces created by anti-forensic programs. The work is somewhat similar to the work of Geiger on ‘counter forensics’, but - I believe - the proposed method is easier for investigators to implement or even automate.

Their article can be found in eForensics Magazine Vol. 3 No. 5.
<div style="margin-left: 24pt; text-indent: -24.0pt;">
</div>While the developed tools are currently available on github, the next few months will see them refined. Stay tuned!

1 min read

[BoB] Indicators of Anti-Forensics Investigator Survey (Korean)

The following survey results are from Korean Digital Forensic Investigators concerning the use of anti-forensics observed in their investigations. This survey has been conducted by the KITRI Best of the Best (BoB) ‘Indicators of Anti-forensics’ project group.

1. How long have you been doing forensic analysis work?
2. On average, how much time does forensic analysis take per exhibit or per disk?
3. Have you heard of anti-forensic detection tools?
4. Have you ever analyzed a system on which anti-forensic tools had been used?
5. If you have analyzed a system where anti-forensic tools were used, how much more time did it take than analyzing a system without them?
6. Have you ever felt the need for an anti-forensic detection tool?
7. Do you think being able to detect anti-forensic activity before analysis would make the analysis easier?
8. What features would you want in an anti-forensic detection tool?
9. Which anti-forensic tools have you seen most often during analysis?


<table border="0" cellspacing="0" cols="9"> <colgroup span="9" width="127"></colgroup> <tbody><tr> <td align="LEFT" bgcolor="#EEEEEE" height="17" sdnum="1033;1033;General" valign="BOTTOM">3년 이상</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">‘4-5일</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">있다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">있다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">예상할 수 없음</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">있다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">용이하다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">안티포렌식의 범위가 어느 정도인지 모르겠음암호 프로그램의 사용부터 전문삭제 프로그램의 사용 또는 steaganography까지 사용하는 것을 전제로 하는 것인지 명확한 정의가 필요할 것 같음.해당 목적에 따라 안티포렌식 탐지 툴이 개별적으로 만들어지고 그것을 마지막에 통합하는 것이 가장 좋을 듯 싶네요.</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">패스워드 설정 암호화</td> </tr><tr> <td align="LEFT" bgcolor="#EEEEEE" height="17" sdnum="1033;1033;General" valign="BOTTOM">1년 이상 ~ 3년 미만</td> <td align="RIGHT" bgcolor="#EEEEEE" sdnum="1033;1033;General" sdval="10" valign="BOTTOM">10</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">없다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">있다</td> <td align="RIGHT" bgcolor="#EEEEEE" sdnum="1033;1033;General" sdval="48" valign="BOTTOM">48</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">있다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">용이하다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">방식과 설치 혹은 실행 날짜해당 방식에 대한 영향을 미치는 범위에 대한 안내</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">루팅툴</td> </tr><tr> <td align="LEFT" bgcolor="#EEEEEE" height="17" sdnum="1033;1033;General" valign="BOTTOM">1년 이상 ~ 3년 미만</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">24~48</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">있다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">있다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">24시간 이상</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">있다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">용이하다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">- 기존의 삭제된 파일의 정보(제목, 시간, 등)- 안티포렌식 도구 정보</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">spaceEraser</td> </tr><tr> <td align="LEFT" bgcolor="#EEEEEE" height="17" sdnum="1033;1033;General" valign="BOTTOM">3년 이상</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">500G기준 4시간</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">있다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">없다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">1-2시간정도</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">있다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">용이하다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">
</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">
</td> </tr><tr> <td align="LEFT" bgcolor="#EEEEEE" height="17" sdnum="1033;1033;General" valign="BOTTOM">3년 이상</td> <td align="RIGHT" bgcolor="#EEEEEE" sdnum="1033;1033;General" sdval="24" valign="BOTTOM">24</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">있다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">있다</td> <td align="RIGHT" bgcolor="#EEEEEE" sdnum="1033;1033;General" sdval="10" valign="BOTTOM">10</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">있다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">용이하다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">Install 여부, Portable 실행여부, 총 실행 횟수 및 삭제된 영역 확인 등</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">CCleaner</td> </tr><tr> <td align="LEFT" bgcolor="#EEEEEE" height="17" sdnum="1033;1033;General" valign="BOTTOM">3년 이상</td> <td align="RIGHT" bgcolor="#EEEEEE" sdnum="1033;1033;General" sdval="24" valign="BOTTOM">24</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">있다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">없다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">-</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">있다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">용이하다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">메모리 해킹 탐토르 네트워크 탐지</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">timestomp</td> </tr><tr> <td align="LEFT" bgcolor="#EEEEEE" height="17" sdnum="1033;1033;General" valign="BOTTOM">3년 이상</td> <td align="RIGHT" bgcolor="#EEEEEE" sdnum="1033;1033;General" sdval="6" valign="BOTTOM">6</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">있다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">있다</td> <td align="RIGHT" bgcolor="#EEEEEE" sdnum="1033;1033;General" sdval="6" valign="BOTTOM">6</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">있다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">용이하다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">타임라인 수정</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">final eraser</td> </tr><tr> <td align="LEFT" bgcolor="#EEEEEE" height="17" sdnum="1033;1033;General" valign="BOTTOM">6개월 이하</td> <td align="RIGHT" bgcolor="#EEEEEE" sdnum="1033;1033;General" sdval="6" valign="BOTTOM">6</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">있다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">없다</td> <td align="RIGHT" bgcolor="#EEEEEE" sdnum="1033;1033;General" sdval="3" valign="BOTTOM">3</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">있다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">용이하다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">먼저 안티포렌식 툴이 실행되는 것을 탐지할 것인지, 실행된 흔적을 탐지할 것인지에서 기능들이 달라지겠지만, 전자의 기준으로 보았을 때, 활성시스템 상태에서 현재 실행중인 프로세스에 대한 검사를 통해 탐지를 하는 퀵서치, 디스크내의 설치된 프로그램을 확인하는 정밀 검사 등의 기능이 있으면 좋을 듯 합니다.</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">일반적인 클리너, Wipe</td> </tr><tr> <td align="LEFT" 
bgcolor="#EEEEEE" height="17" sdnum="1033;1033;General" valign="BOTTOM">1년 이상 ~ 3년 미만</td> <td align="RIGHT" bgcolor="#EEEEEE" sdnum="1033;1033;General" sdval="72" valign="BOTTOM">72</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">있다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">있다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">상황에따라 다름</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">있다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">용이하다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">
</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">ccleaner</td> </tr><tr> <td align="LEFT" bgcolor="#EEEEEE" height="17" sdnum="1033;1033;General" valign="BOTTOM">6개월 이상 ~ 1년 미만</td> <td align="RIGHT" bgcolor="#EEEEEE" sdnum="1033;1033;General" sdval="80" valign="BOTTOM">80</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">있다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">없다</td> <td align="RIGHT" bgcolor="#EEEEEE" sdnum="1033;1033;General" sdval="0" valign="BOTTOM">0</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">있다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">용이하다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">
</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">
</td> </tr><tr> <td align="LEFT" bgcolor="#EEEEEE" height="17" sdnum="1033;1033;General" valign="BOTTOM">1년 이상 ~ 3년 미만</td> <td align="RIGHT" bgcolor="#EEEEEE" sdnum="1033;1033;General" sdval="120" valign="BOTTOM">120</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">없다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">인지한적 없다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">1.3배 정도</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">있다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">용이하다</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">완전삭제 탐지, 레지스트리 정보 및 인터넷 삭제 흔적</td> <td align="LEFT" bgcolor="#EEEEEE" sdnum="1033;1033;General" valign="BOTTOM">고클린</td> </tr></tbody></table>

1 min read

Convert EnCase hash sets to md5sum

I managed to get a hold of a list of known-bad hashes to use in an experiment. The hashes, however, were in EnCase “.hash” format.
<div>
</div><div>I am mostly using the SleuthKit’s hfind to do some hash comparisons. My setup could already use the NSRL hash sets with no problem, and TSK is supposed to support EnCase hash sets. I was able to create an index for the EnCase hash sets, but when I attempted to query, I would get an error:</div><div>
</div><div>Command: hfind db.hash [hash value]
Error: “Cannot determine hash database type (hdb_setupindx: Unknown Database Type in index header: encase)”</div><div>
</div><div>No responses when asking about the error on the mailing list, so I looked for other ways to access the hashes.</div><div>
</div><div>Finally, I came across Jesse Kornblum’s EnCase hash file converter (encase2txt). The tool built fine in Linux (Ubuntu), and the Windows binary worked with no issue on Windows 7 (64bit).</div><div>
</div><div>Just point the tool at the EnCase hash database and it will output all the hashes in a format like md5sum. Pipe this plain text output to a file, and you have an md5sum hash file. From this I was able to build the index (hfind -i md5sum hashes.md5) and query the database with no problems.</div><div>
</div><div>Thanks Jesse!</div><div>
</div><div>Building and usage: http://jessekornblum.livejournal.com/166275.html</div>
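For reference, the full workflow comes down to three commands (assuming the EnCase hash set is named db.hash; the hash value is just an example):

# convert the EnCase hash set to md5sum-style plain text
encase2txt db.hash > hashes.md5
# build a Sleuth Kit index over the converted file
hfind -i md5sum hashes.md5
# query the indexed database for a hash value
hfind hashes.md5 d41d8cd98f00b204e9800998ecf8427e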

1 min read

Signature Based Detection of User Events for Post-Mortem Forensic Analysis

As seen on DigitalFIRE
<div>
<div class="separator" style="clear: both; text-align: center;"></div>The concept of signatures is used in many fields, normally for the detection of some sort of pattern. For example, antivirus and network intrusion detection systems sometimes implement signature matching to attempt to differentiate legitimate code or network traffic from malicious data. The principle of these systems that that within a given set of data, malicious data will have some recognizable pattern. If malicious code, for example, has a pattern that is different in some way to non-malicious data, then the malicious data may be able to be differentiated with signature-based methods. In terms of malware, however, signature based methods are becoming less effective as malicious software gains the ability to alter or hide malicious patterns. For example, polymorphic or encrypted code.

This work suggests that signature-based methods may also be used to detect patterns of user actions on a digital system. This is based on the principle that computer systems are interactive: when a user interacts with the system, the system is immediately updated. In this work, we analyzed a user’s actions in relation to timestamp updates on the system.

During experimentation, we found that timestamps on a system may be updated for many different reasons. Our work, however, determined that there are at least three major timestamp update patterns given a user action. We define these as Core, Supporting and Shared timestamp update patterns.

Core timestamps are timestamps that are updated each time, and only when, the user action is executed.

Supporting timestamps are timestamps that are updated sometimes, and only when, the user action is executed.

Shared timestamps are timestamps that are shared between multiple user actions. So, for example, the timestamps of a single file might be updated by two different user actions. With shared timestamps it is impossible to determine which action updated the timestamp without more information.

By categorizing timestamps into these three primary categories, we can construct timestamp signatures to detect if and when a user action must have happened. For example, since only one action can update Core timestamps, the time value of the timestamp is approximately the time in which the user action must have taken place.

The same can be said for Supporting timestamps, but we would expect Supporting timestamps values to be at or before the last instance of the user action.

Using this categorization system, and finding associations of timestamps to user actions, user actions in the past can be reconstructed just by using readily available meta-data in a computer system.
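As a toy illustration of the raw data such signatures are built from (not the method from the paper itself), GNU find can list every file whose modification time falls inside a window of interest on a mounted image (paths and dates are examples):

# list files modified between 09:00 and 10:00, with epoch timestamps
find /mnt/image -newermt "2011-01-01 09:00" ! -newermt "2011-01-01 10:00" -printf "%T@ %p\n"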

For more information, please see our article on this topic:

James, J., P. Gladyshev, and Y. Zhu. (2011) “Signature Based Detection of User Events for Post-Mortem Forensic Analysis”. Digital Forensics and Cyber Crime: Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering. Volume 53, pp 96-109. Springer. [PDF][arXiv:1302.2395]</div>

Image courtesy of Salvatore Vuono / FreeDigitalPhotos.net

2 min read

Forensic Acquisition of a Virtual Machine with Access to the Host

Someone recently asked about an easy way to create a RAW image of virtual machine (VM) disks, so here is a quick how-to.

<div class="separator" style="clear: both; text-align: center;"></div>If you have access to the VM host, you could either copy and convert the virtual disks on the host using something like qemu-img, or if for some reason you cannot convert the virtual disks, you can image the VM from within the virtual environment. This how-to will go through relatively easy ways to image a live or offline virtual machine using the virtual environment with access to the host.

First, if the virtual machine cannot be shut down (live environment), you will make changes to the ‘suspect’ environment. If it is a suspect device, make sure your tools are on write-protected media. Verification of disk images won’t work in a live environment, since the disk is changing while imaging is taking place. If you are doing an offline acquisition for forensic purposes, make sure you verify the images once you create them.

If the VM is live and cannot be shut down:
<ul><li>First check if the management interface allows devices to be attached to the VM; specifically USB/Firewire devices.</li><ul><li>If you cannot attach devices for whatever reason, then you will have to use a network share or a netcat tunnel to copy the image.</li><li>Ensure your storage media is larger than the Virtual Machine’s disk</li></ul><li>If it is a Windows environment, copy your imaging program, like FTK Imager Lite or dd.exe from UnxUtils, to the network share/USB device. I also like chrysocome’s dd since it lets you list devices.</li><li>In the Virtual Machine, mount your share/USB device</li><li>From the mounted device, you should be able to access and run the imaging tools you copied previously - ensure you output the image to your share/USB device.</li><ul><li>dd</li><li>FTK Imager Lite</li></ul></ul><div>
</div>If the VM is offline or can be shut down:
<ul><li>First check if you can boot the VM from a CD/USB device</li><ul><li>If yes, use a live CD like FCCU or Helix to boot the VM</li><ul><li>All we are really interested in is 1) an imaging program on the live CD, 2) USB or network support, and 3) netcat installed.</li></ul><li>If no:</li><ul><li>Can you add CD/USB support to the VM?</li><li>Can you copy the VM, and convert it to a RAW image using qemu-img (part of qemu)?</li><li>Boot the VM, and make an image in the live environment (go to the live imaging section)</li></ul></ul><li>After you have booted the VM from a live CD…</li><ul><li>Using external storage to store the image:</li><ul><li>List the current disks - (fdisk -l) - and take note of the current suspect disks to image</li><ul><li>/dev/hda would be a physical disk, while /dev/hda1 would be partition 1 on disk ‘a’</li></ul><li>Attach and mount an external storage disk</li><li>Make a copy of the physical disk (usually something like /dev/sda) using an imaging program like dd or guymager.</li><ul><li>Make sure you are copying the image to your external disk!</li></ul></ul><li>Using the network:</li><ul><li>Network share:</li><ul><li>List the current disks, and take note of the suspect disks to image</li><li>Mount the network share</li><li>Make a copy of the physical disk (usually something like /dev/sda) using an imaging program like dd or guymager.</li></ul><li>No network share:</li><ul><li>Set up a netcat tunnel between a server (any computer you control) and the suspect VM (client)</li><ul><li>Note: this connection is unencrypted!!!</li></ul><li>You can use ssh or cryptcat for an encrypted tunnel, and bzip2 for compression and faster transfer speeds.</li></ul></ul></ul></ul>
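Two of the paths above, condensed into commands. The first converts a copied virtual disk on the host (assuming a VMDK file); the second is the netcat tunnel from a booted live CD (assuming your listening machine is at 192.168.0.10):

# on the host: convert a copied virtual disk to a RAW image
qemu-img convert -O raw suspect.vmdk suspect.raw

# on your listening machine: write the incoming stream to an image file (flags vary by netcat variant)
nc -l -p 6000 > vm_image.dd
# in the suspect VM (booted from the live CD): stream the disk over the tunnel -- unencrypted!
dd if=/dev/sda bs=4M | nc 192.168.0.10 6000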
That’s it. Pretty generic, but it should be enough to get anyone started. Please comment if you have any questions.

Image: FreeDigitalPhotos.net

2 min read
Back to Top ↑

infosec

Testing File Systems for Digital Forensic Imaging

Introduction - the problem

Recently I’ve been doing a lot of large disk forensic imaging. I usually use Linux-based systems for forensic imaging. A normal case would be physical imaging of a source to an ext4 formatted destination. I would normally get about 120MB/s imaging speed, depending on the source disk.

2 min read

Getting started in Digital Forensics

A lot of people have asked how to get started with digital forensics. It’s great that so many people from so many different places are interested. There are many different paths available. To try to help aspiring digital forensic scientists, I put together the following recommendations for a good theoretical and practical background.

5 min read

EWF Tools: working with Expert Witness Files in Linux

Expert Witness Format (EWF) files, often saved with an E01 extension, are very common in digital investigations. Many forensic tools support E01 files, but many non-forensic tools don’t. This is a problem if you are using other tools, like many Linux utilities to try to do an investigation.

2 min read

Password Cracking Test Data

Here are some files to test your password cracking skills. All of them can be cracked in a few hours or less with CPU-based cracking. You can download the file and practice hash extraction + cracking, or just download the hashes directly.
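If you need a starting point, here is a hedged example with two common tools (the file names are hypothetical; pick the hashcat mode that matches your hash type):

```
# John the Ripper with a wordlist
john --wordlist=rockyou.txt hashes.txt

# hashcat: -m 0 is raw MD5, -a 0 is a straight wordlist attack
hashcat -m 0 -a 0 hashes.txt rockyou.txt
```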

~1 min read

Horrible messaging is bad for national security

For over a year, anyone with a mobile phone in Korea has had to put up with spam text messages from Korea’s Ministry of Public Safety and Security (국민안전처). I thought it was wise to have such an emergency system, considering that I live about 30km from the DMZ (the border with the North). Unfortunately, this ministry found a way to make such an emergency system less effective.

5 min read

[How To] Volatility Memory Analysis Building Linux Kernel Profiles

Memory forensics in Linux is not very easy. The reason is that the Linux kernel changes data structures and debug symbols often. Users can also easily modify and compile their own custom kernels. If we want to analyze Linux memory using Volatility, we have to find or create Linux profiles for the version of Linux that we are trying to analyze. Linux profile creation for Volatility is not that difficult. The documentation claims that Volatility will support profile sharing in the future, which should make Linux support much easier.
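The usual recipe for Volatility 2 is sketched below (run it on a machine with the same kernel and headers as the memory image; the zip name is hypothetical and should describe your distribution and kernel):

```
# Build a dwarf module against the running kernel
cd volatility/tools/linux
make            # produces module.dwarf
cd ../..

# Package it with the matching System.map to create the profile
zip volatility/plugins/overlays/linux/MyDistro.zip \
    tools/linux/module.dwarf /boot/System.map-$(uname -r)

# The new profile should now appear in the list
python vol.py --info | grep Linux
```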

~1 min read

Using Autopsy 4 to export file metadata

Autopsy 4 is a very powerful digital forensic investigation tool. Today, we are going to extract files and metadata from a disk image (mobile phone) to use in external programs. We also briefly introduce Autopsy’s timeline feature.

~1 min read

Imaging Android with ADB, Root, Netcat and DD

Today we are going to acquire an Android smartphone (Samsung Note II) using Android Debug Bridge (ADB), netcat and dd. The system I am using is Ubuntu Linux. On the “forensic workstation” you will need ADB and netcat installed. I’m using the excellent instructions from here.
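In outline (the port number is arbitrary, and the flash device name - /dev/block/mmcblk0 here - varies by phone), the acquisition looks like this:

```
# On the workstation: forward a local TCP port to the phone over ADB
adb forward tcp:8888 tcp:8888

# On the phone (adb shell, then su for root): serve the flash memory
dd if=/dev/block/mmcblk0 bs=4096 | busybox nc -l -p 8888

# Back on the workstation: pull the image through the forwarded port
nc 127.0.0.1 8888 > note2-mmcblk0.dd
```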

3 min read
Back to Top ↑

Discussion

Korea’s first step to globalization through ODT

ZDNet Korea reports that the South Korean government is making a first step to shift from the proprietary Hangul Word Processor (HWP) file format (.hwp) to the Open Document Format (ODF). To understand why this is such a big deal, you first need to understand that HWP is part of the national identity. It is/was a government-sponsored monopoly. Even schoolchildren were reminded that it was their duty to buy HWP and hold valid licenses (because no other word processor was acceptable). The format is proprietary, and no other word processor can read it, although Microsoft released an HWP2DOC converter. Government organizations were forced to use HWP, and businesses were strongly “encouraged”.

5 min read

[How to] GPG and Signing Data

GNU Privacy Guard (GPG) uses public and private keys to secure communications (public-key cryptography). Many people use it to encrypt their email or other documents. An email encrypted with a user's public key can then only be decrypted with the same user's private key. This provides end-to-end encryption of the message, meaning that it is impractical for anyone that is listening in on the conversation to get the message in transit.


This is, of course, good and bad. For example, Google and other email providers use email text to gain intelligence about the user, sell user information and do better ad targeting. This revenue stream keeps these services free, but users pay for it in terms of 'sold' privacy. Email using end-to-end encryption cannot be analyzed for useful marketing information. Because of this, these providers don't want to make it easy for mass encryption.

On the other hand, criminals also use Cloud-based email services. Making encryption somewhat difficult means that sloppy criminals are less likely to use encryption. If so, they may be easier to detect and catch.

Related Book: Lucas, Michael. PGP & GPG: Email for the Practical Paranoid. No Starch Press. 2006.

Whether you are paranoid and want all your emails encrypted (good luck), or you are trying to implement a personal or business data classification policy, GPG can help with encryption requirements.

Beyond encryption, GPG is useful for signing data. This is not exactly a signature that you would put on a document. Instead it is a signature that verifies that the data is correct. The video below describes how to sign data.




<div style="text-align: justify;">Signing data lets your contacts know that the data has not been modified from the time it left your possession. Signing is NOT encryption. Everyone could see the contents. Signing just allows your contact to know the data came from you and that it is in its original state.</div>
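A few hedged examples of the commands involved (key setup is assumed, and the file names are hypothetical):

```
# Clearsign a text file: contents stay readable, signature is appended
gpg --clearsign message.txt            # produces message.txt.asc

# Or create a detached, ASCII-armored signature for any file
gpg --detach-sign --armor report.pdf   # produces report.pdf.asc

# Your contact verifies the signature with your public key
gpg --verify report.pdf.asc report.pdf
```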

1 min read

No More Ransom - Detecting and unlocking ransomware without paying

Data is valuable. Ransomware takes advantage of the financial or sentimental value of our data, as well as the fact that most homes and organizations do not have adequate data backup solutions in place.

Once a computer is infected with ransomware, individual files are normally encrypted and users are asked to pay a ransom to unlock their data. If the victim pays, the data may or may not be unlocked. Ransomware started off like most viruses, targeting average computer users opportunistically. Ransomware groups, however, started targeting hospitals, police organizations and others.

<div class="separator" style="clear: both; text-align: center;">Use nomoreransom.org to unlock your data</div><div class="separator" style="clear: both; text-align: left;">
</div><div class="separator" style="clear: both; text-align: left;">So what can you do if you are infected with ransomware? Internet vendors and law enforcement have come together to create No More Ransom. This website gives users information about current types of ransomware, lets them download unlocking tools for free, provides prevention information, and even has a tool that analyzes your encrypted files and recommends which unlocking tool to use.</div><div class="separator" style="clear: both; text-align: left;">
</div><div class="separator" style="clear: both; text-align: left;">The Problem</div><div class="separator" style="clear: both; text-align: left;">Ransomware is possible because people do not have backups in place.</div><div class="separator" style="clear: both; text-align: left;">
</div><div class="separator" style="clear: both; text-align: left;">The Solution</div><div class="separator" style="clear: both; text-align: left;">Backups.</div><div class="separator" style="clear: both; text-align: left;">
</div><div class="separator" style="clear: both; text-align: left;">If you have an extra hard-drive that you are not using, or even another computer that is often on, CrashPlan is a pretty straightforward backup solution that is free if you save data to your own computers.</div><div class="separator" style="clear: both; text-align: left;">
</div><div class="separator" style="clear: both; text-align: left;">Note: DropBox or similar are not good backup solutions because they constantly sync changes. If ransomware infects your systems, the changes may be synced to your cloud storage. With a backup solution like CrashPlan, 1) backup is not instantaneous and 2) CrashPlan keeps track of prior versions of data. So if encrypted files were backed up, you can still restore prior versions. Best of all, CrashPlan provides end-to-end encryption (if enabled).</div>



1 min read

Warning to Forensic Investigators: USB KILLER

This post is informational for digital forensic investigators and first responders. Be aware of the ‘USB Killer’. Very basically, it’s a USB device that contains a high-voltage capacitor that charges up from the USB power supply, then releases a large charge directly into the USB data bus, potentially destroying the motherboard.
[Image: USB Killer device from USB Kill (https://www.usbkill.com)]
The device itself is made for ‘penetration testers’ to test the physical security of a system. The device shown is from USB Kill, but such a device would be trivial to create using any USB device and a high-voltage capacitor - like so.

Here are some comments on Reddit about whether a suspect would be liable if the police seize one of these and fry the investigation computer / write blocker.

This device is not to be confused with the USB Kill Switch, which checks if devices are added or removed and shuts the system down. The USB Killer is focused on physical damage.

Unfortunately, I’ve not seen much more information on forensic forums about these types of devices. SANS and Forensic Focus have some short articles on it. The device looks like a normal USB stick. Be sure to check any USB devices before imaging.

~1 min read

Open Source Tools Accepted in Court

Reply to an email I received:

Is it possible to use Linux live CDs (or open source software) without trouble in court?

The answer is yes, certainly.

First, there is precedent in North America and Europe. See this relatively old article from Italy [http://nannibassetti.com/digitalforensicsreport2007.pdf].

For a full discussion about open source tools in court, I highly recommend the following paper: http://www.digital-evidence.org/papers/opensrc_legal.pdf

Very basically, to have evidence obtained using open source tools / Linux live CDs accepted in court, you need to prove that the tools give ‘correct’ results and do not modify potential evidence. Check local court rules for any additional standards that need to be met. If you need any help with tool testing, please contact me.

For example, if your courts already accept EnCase and you want to compare acquisition and hashing, you can do the following (see the sketch after this reply):
1) acquire the data with EnCase and create a hash of the data
2) acquire the data with an open source tool and create a hash of the data
3) compare the hashes of the suspect data (should be the same)
4) repeat with 5+ different exhibits to show that the same result is always found

If your courts accept EnCase, and you can demonstrate that an open source tool produces the same result, then the open source tool must also be accepted.

A procedure for tool testing should be created in your unit, if it does not already exist.

You might also be interested in the Open Source Digital Forensics Conference in the U.S.: http://www.osdfcon.org/

Please let me know if you need any help with testing, or if you have any further questions.
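A minimal sketch of the hashing side of that comparison (device and file names are hypothetical; always acquire behind a write blocker):

```
# Hash the suspect device itself
md5sum /dev/sdb

# Acquire with an open source tool, then hash the resulting image
dd if=/dev/sdb of=exhibit01.dd bs=4M
md5sum exhibit01.dd

# Both hashes should match each other, and match what EnCase reports
```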

1 min read

“Child Predator Social Experiment” Another Form of Child Abuse?

I recently found a video claiming to be a 'child predator social experiment'. The idea is that children have access to different types of social media, and trust communications on those platforms even if they have never actually met the person in real life. The video shows different situations in which young boys (in this case) are lured into vans or strangers' houses based on online texting with a grown man posing as a young girl.

Apparently, the parents of these kids had 'warned them of stranger danger' before, but this was supposed to teach them some sort of lesson. The parents, apparently, were willing to subject their children to this form of torture.

As a teacher who deals with many different personalities on a daily basis, I find it very difficult to justify terrorizing children to teach them a lesson (dictatorships don't last forever). Article 5 of the Universal Declaration of Human Rights: No one shall be subjected to torture or to cruel, inhuman or degrading treatment or punishment.

In the video below, one scenario involved a mother who immediately came in and started yelling at her child. If you are willing to lie to your kid just to show them that you are right... well, this seems like the best way to do it.

In the other scenarios, one child was actually pulled into a van while yelling for help, and another was locked in a room (and held down) with two half-naked men. Do you think the children were terrified? Is creating a situation in which the child believes he or she may be raped and/or murdered cruel, inhuman treatment? I think yes.

Children definitely need proper education about the dangers of the Internet. But in this case, the parents are taking the lazy, disgusting way out.

There are many ways to monitor your child's activities on and offline. It is a mistake to believe that unfettered Internet access is a right for the child in your home. Talking to your children is a great first step, but parents also need to become Internet-literate and know how to exert some sort of control over Internet access - other than on and off. Then, when the child demonstrates responsibility on the Internet, the parent can give them more trust and more freedom online. But currently, parents are all-or-nothing because they don't know how the technologies work. In such a case, the parent is 'forced' to take extreme measures to get a point across, because they were too lazy to learn how to control the situation.

Parents: if you care about your kids, take the time to learn. Giving them knowledge is much better than giving them rules that they don't understand.


<iframe allowfullscreen="" frameborder="0" height="315" src="https://www.youtube.com/embed/c4sHoDW8QU4" width="100%"></iframe>

2 min read

Ashley Madison Data and Ethical Use

On August 19th, the Impact Team released data of millions of alleged Ashley Madison users. Ashley Madison is a type of social networking website that promotes extra-marital affairs. After the main user data, the source code for the website and emails from the founder were also released.

The data was initially released on the Dark Web, but has since been added to many clear web sites.

[Image: Impact Team’s .onion site on Tor, where the data could be downloaded]
The data contains information about users’ names, email addresses, locations, addresses, credit card numbers, credit card transactions, sexual preferences, and much, much more.

If you are thinking about looking up your friends and neighbors, think about the following first:
<h3>You can’t trust most versions of the data</h3><div>Many people are interested in this data. Hackers and criminals know that it will be very popular, so they will add viruses and other malware to the data. It is also possible that copied versions had records added specifically to frame people. If you are going to use any version, make sure it came from Impact Team.</div><div><h3>You can’t trust websites that let you search the data</h3></div><div>Even before the data was released, some websites were created to be able to search the data if and when it was released. Some of these websites are created by trusted security researchers, some are created by hackers, some are created by people who just want to make money off of the situation. The result is that you should only use trusted websites when evaluating data like this. Other sites may have malware, and some sites may collect any email addresses, names, or phone numbers that you enter to “check” and resell that information to advertising companies. Be careful with websites you don’t know.</div><div><h3>The original data could have been fake or tampered with</h3></div><div>Data directly from Impact Team is the ‘most reliable’ version that we will get. However, this does not mean that it has not been tampered with. They may have added or modified entries.</div><div>
</div><div>Further, some accounts that exist in the system are likely to be fake anyway. The only accounts we can be reasonably sure of are those attached to credit card transactions, and even those may have been created with a stolen card.</div><h3>Think about what you are doing</h3><div>With data like this, there are a lot of things we can learn. I have a copy of the data, and I did not look up my friends or co-workers. Why? Because I don’t care. Many websites are using the data to find who is cheating on whom. That question is not interesting. What is interesting is, for example, why people are cheating. We might even ask: is cheating a bad thing? For 39 million people, apparently it isn’t. Other interesting questions include: how can an attack like this be prevented in the future? What are the most common passwords? Etc.</div><div>
</div><div>While the data is useful for information security to learn from its mistakes, making the data easily accessible for the sake of gossip is not useful, and could potentially cause mental and physical damage. Consider the ‘help’ that a woman received from radio talk show hosts: as soon as the woman found out her husband was cheating, even the host admitted he felt like a jerk.</div><div>
</div><iframe frameborder="0" height="573" id="molvideoplayer" scrolling="no" src="http://www.dailymail.co.uk/embed/video/1207589.html" title="MailOnline Embed Player" width="698"></iframe>

I completely agree with the approach from the people at haveibeenpwned.com, who explain in their blog post that it is not the job of security researchers to out people. It is our job to protect people.

Every time there is a data leak, the information is used for all sorts of scams, and criminals are already using the AM data. The people involved in this breach could have their entire lives destroyed by the release of all of their information. Some people will say that they deserve it for being on such a site. That’s a matter of opinion. But as security researchers, if we don’t look for ways to use (and release) data responsibly, we may be hurting people to find the ‘juicy bits’ rather than improving security, privacy and freedom for everyone.

3 min read

FIDO Alliance Password-less Authentication Spec.

[Edited 2015-02-02]
Last month, the FIDO Alliance released specifications that attempt to remove passwords from authentication. A few years ago, Google was already “declaring war on passwords”, even publishing an interesting article in IEEE Security and Privacy: Authentication at Scale. While some improvements have been made, like Google Authenticator for 2-factor authentication, password-less authentication does not appear to be widely implemented.
<div>
</div><div>The FIDO Alliance, however, is looking to change that with their Universal Authentication Framework (UAF) and Universal Second Factor (U2F) standards.</div><div>
</div>[Image: UAF and U2F process graphic from the FIDO Alliance]<div class="separator" style="clear: both; text-align: center;">
</div><div class="separator" style="clear: both; text-align: left;">Apparently a device with the UAF stack accepts either biometric input or a pin code to authenticate to UAF. UAF itself apparently keeps the user’s private key for associated websites. This key is used to send a login response when challenged.</div><blockquote class="tr_bq" style="clear: both; text-align: left;">A site or browser prepared to accept FIDO authentication can/will offer a user the option if a FIDO device is present. The first time a device is identified, a user will be offered the option to register their FIDO authenticator and use it. Subsequently, the registered device is automatically detected at the site and the user is presented with options for authentication, until/unless the user opts in or out. Please note that FIDO authentication is entirely device-centric. The authentication exchange occurs only between the FIDO device and the authenticating FIDO server, and the exchange is only in crypto.1</blockquote><div class="separator" style="clear: both; text-align: left;">U2F is not much different. It appears to be a USB or similar device much like PAM USB. Because the authentication is device-centric, backup pass codes to unlock the device are not interesting to an attacker (unless they can get local access).</div><blockquote class="tr_bq" style="clear: both; text-align: left;">Though a U2F device may store a password (really, it can be a 4-digit PIN) as a fallback for a user to unlock their own device locally (to effect changes, for example), this application can use a very simple, fixed password or code. In this way, the U2F PIN is not at all like OTP. The PIN available to a U2F user never needs to change, because it never does anything but allow a user to unlock the device locally. The PIN is only relevant to the FIDO device, so there is never the need to share to a server or a network, such as OTP must do. It has no value to a hacker, because it is meaningless to the server.1</blockquote><div class="separator" style="clear: both; text-align: left;">While this system may help with support for better authentication, of course there will have to be a ‘fall back’ method. Right now this comes in the form of backup one-time-passwords, which criminals have proven are easily stolen. Overall, this system appears to still be vulnerable to downgrade attacks (not every system will support this standard), and ultimately user error, but it does make things more difficult for mass attacks while still (potentially) being relatively easy for the end user.</div><div class="separator" style="clear: both; text-align: left;">
</div><div class="separator" style="clear: both; text-align: left;">Rightly, the FIDO Alliance answers the question “What makes FIDO different?” The answer being that they are providing on online crypto / authentication framework. Luckily, the FIDO Alliance has some big names that should be able to support large-scale standards like this for a long time. If not, basic passwords are better than security systems that can’t be updated.</div><div class="separator" style="clear: both; text-align: left;">
</div><hr width="80%" />
1 Clarification provided by Suzanne Matick
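Stripped of the FIDO framing, the core exchange described above is ordinary public-key challenge-response. A rough sketch with openssl (the file names are hypothetical, and real FIDO adds per-site keys and attestation on top of this):

```
# Enrollment: the device creates a keypair; only the public half is registered
openssl genpkey -algorithm RSA -pkeyopt rsa_keygen_bits:2048 -out device.key
openssl rsa -in device.key -pubout -out device.pub

# Login: the server issues a random challenge...
openssl rand -out challenge.bin 32

# ...the device signs it with the private key it never shares...
openssl dgst -sha256 -sign device.key -out challenge.sig challenge.bin

# ...and the server verifies against the registered public key
openssl dgst -sha256 -verify device.pub -signature challenge.sig challenge.bin
```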

2 min read

What is your password?

Jimmy Kimmel, a U.S. talk show host, commented on U.S. cyber security after the 2014 Sony attacks. To humorously demonstrate the problem, his show employed a bit of social engineering on the streets to see if they could get random users’ passwords. While most people did not directly give their passwords, it was not hard to get them to reveal some personal information. This is one reason why Google wanted to switch to security keyfobs (which do not seem to have taken off). Linux, by the way, has had device-based authentication for a while that can be configured to log into the system, websites, etc. using almost any connectable device (a sketch follows below).
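One such mechanism is pam_usb. A hedged sketch of its setup (device and user names are hypothetical):

```
# Register a USB stick as an authentication token
pamusb-conf --add-device my_usb_stick

# Associate the token with a user account
pamusb-conf --add-user alice

# Then enable it in PAM, e.g. in /etc/pam.d/common-auth:
#   auth sufficient pam_usb.so
```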
Luckily for hackers, fooling a mass of people online is much easier than this.

What can you do? Lifehacker has talked about how to pick a strong password, methods for creating passwords you will remember, and even a list of the best password managers. But most of all, just don’t tell people your password.

Photo by BM5K

~1 min read

U.K. Adopts Open Document Formats to Improve Communication

According to UK.gov, the UK Government is adopting open formats for all of its government documents. The formats are PDF/A and HTML for viewing government documents, and ODF (the Open Document Format) for sharing and collaborating on government documents.

“Open Formats” are a way of saying “Publicly Available Standards”. The difference between these formats, especially ODF, and a format like DOCX, for example, is that anyone can easily get access to and understand the data structure of such documents. This means that any company could easily make a program that correctly opens or produces ODF documents.

But who cares about document formats? Well, anyone who has ever created content with a computer probably does. I remember we used ‘WordPerfect’ at home a (few) years ago. Any documents created with that program and its “.wpX” file format would now need to go through a conversion process to be viewed. Most likely the conversion process would not work very well. This means that the information in those documents is mostly, if not completely, lost [without a great deal of effort]. Proprietary formats only last as long as the company that created them. Information about open formats will likely exist as long as the Internet.
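Open formats make that conversion story far less painful. For instance (a hedged example; the file name is hypothetical), LibreOffice can convert legacy documents from the command line:

```
# Convert an old proprietary document to ODF in one pass
libreoffice --headless --convert-to odt old-report.doc
```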

Document formats directly relate to who can get access to information. South Korea has a company called Hancom that makes a Microsoft Office replica, except with better support for Korean. Although Hancom does support saving documents to DOCX and ODF formats, they also invented their own format, called Hangul Word Processor (HWP), which is DOCX modified just enough to not work with MS Word. So what is the problem? Well, HWP can only be opened with the Hancom Word Processor or the viewer. The viewer is free, but available only for Windows. The problem then becomes that you have to use a version of Windows to view the document, and if you want to edit the document, you have to buy a copy of Hancom Word. In other words, if you are running OSX or Linux you cannot communicate. If you have any other office suite installed, you cannot communicate. This means that the only people Koreans can communicate with via HWP are other Koreans. Unfortunately, it is a national standard, which means that they have a huge problem communicating internationally. Best case, foreigners will pirate a copy of Hangul Word Processor to view/edit the documents. Worst case, they won’t bother opening the document at all.

The UK’s move is brilliant for the simple fact that more people can see what they are publishing (regardless of their computer setup), while at the same time potentially saving the Government some money.

Some groups have seen this as a move to boot out Microsoft, but I do not see it that way. They could still use Microsoft products; they are just making it easier for others who don’t use Microsoft to actually get access to information. Governments should try to improve communication nationally and internationally, and making documents available in easy-to-access formats is a step in the right direction.

2 min read

Dark Nets and Why They Are a Challenge for Police

Based on the BBC News article “Dark net used by tens of thousands of paedophiles” (2014), one might wonder what “Dark Net” is, and why Police are having such a hard time catching criminals.

To understand “Dark Net” you first need to understand a little bit about how the Internet works. As an example, think about how you are connecting to this blog. Your computer has to have an IP address, which is used as a unique identifier for you online. This IP address is normally assigned by your Internet Service Provider. When you want to connect to this blog, you are sending information back and forth from your IP address to the IP address of the server.

This is good; however, whenever I get an IP address to connect to the Internet, everyone else can also connect back to me. It is similar to having a phone number. You need a phone if you want to call someone else’s phone, but that means that anyone who finds your number can also call you, whether you want them to or not.

The result of this is that when we send information on the Internet, it is possible for other people on the Internet to copy our information. For this reason, many services use different types of encryption to hide the information going from one point to another. Many critical services (like banks) use, or should use, encryption to protect your information. Because people need to protect their legitimate information - like banking transactions, credit cards, emails, etc. - the Internet has to support mechanisms to protect this information.

Dark Nets
Dark Nets like Tor and FreeNet take advantage of two things that also make the Internet work. First, they use public IP addresses to connect to other computers that are also running the program. This means that a computer is connected to several other computers on the network.

Once connected with a public IP address, the computers encrypt the connections between themselves. In this way, no one can see what information is being sent between two computers; this is what we call an encrypted “tunnel”.

The Dark Net then usually does two things. First, if there are a lot of computers connected to the network, each connects to a few other computers. They use these encrypted tunnels to route traffic through other computers before it reaches the final destination.

For example, if I am computer A, and I want to access a resource at computer D, normally I would make a direct connection A->D. If police investigate computer D, they can normally find information about computer A directly connecting. Dark Nets (or onion routing) would instead use other computers to hide my request. If I am computer A and want to reach a resource at computer D, a Dark Net may send my request through C, then B, then to D [A->C->B->D]. The next time I make a request, it may change its path [A->B->C->D]. What’s more, other computers’ requests will be coming through MY computer. In this way, it is very difficult to determine if MY computer is making a request, or if it was someone else. And since all this traffic is encrypted, to investigate the traffic you must be in the network. So routing traffic through different computers over encrypted networks can be used to hide information and make it very difficult to determine which computer actually sent the request. These networks cannot simply be blocked; otherwise you would also block all the good uses of encryption.
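You can see this in practice with Tor. A hedged example, assuming the Tor service is running locally on its default SOCKS port (9050):

```
# Route a single request through the Tor network instead of connecting directly
curl --socks5-hostname 127.0.0.1:9050 https://check.torproject.org/

# Or wrap an arbitrary program with torsocks
torsocks wget https://example.org/
```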

But many Dark Net clients go a step further. When you install a client like FreeNet, it will allocate a part of your hard drive to store data (also encrypted). If every computer on the network gives a small part of its hard drive space, then the network has a lot of distributed storage. This storage can only be accessed from inside the encrypted network. This means that people can host blogs, web pages… basically any service they want on this encrypted space. The data will be spread across many computers in many different countries, none of which will know exactly what information they are sharing on this allocated space (since they cannot access it themselves).

What Can Police Do About It?
Now that you know some of the things that Dark Nets do (different networks do different things), why is it such a challenge for Police?

First, consider that cybercrime investigation is a relatively new field. Except for officers that enjoy self-study, most Police update their knowledge only when the number of cases requiring new knowledge gets past a certain threshold. Granted, there is just too much to learn - too many types of cybercrime to focus on one area. And Dark Nets (until now) have been too difficult a problem with too little return to seriously invest much time in. That being said, people are working on the problem, and other government organizations are also throwing a lot of resources at the problem of crime on Dark Nets.

Another problem is jurisdiction. Police, at most, have jurisdiction only at a national level. Since all governments have budgets, they don’t usually investigate other countries’ criminals (unless there is some benefit). Since it is difficult to establish where a criminal on a dark net is located, police risk investigating thousands of people that are not in their country, not citizens, etc. (an investigation dead-end). This implies not only a waste of time, but a waste of resources - including taxpayer money. Since taxpayers usually want a visible ‘return on investment’, many forces think it is better to go after the easy cases that can make quick headlines and better statistics.

Establishing reliable information takes time. Most countries that I have worked with do not have the ability (or desire) to consistently conduct cyber operations. Working on dark nets requires long-term operations and planning that many countries would not be capable of executing.

Countries like the U.S. and U.K. are quite obsessed with the investigation of child exploitation material (rightly so, IMO), but for many other countries it is hardly a consideration. Even if the talk is of protecting children, the resources and planning dedicated to the task reflects how low-priority it actually is.

And finally, there are hundreds of thousands of pedophiles on newsgroups, websites, peer-to-peer networks, chat programs, etc. Indeed, the Dark Net is a problem, but it is just one (more) problem. Police have no shortage of pedophile-related cases, and they won’t until we take a look at the social problems that are causing them. Focusing on one network won’t solve the problem, and until that network becomes the primary sharing method it won’t be a major focus.

5 min read

An Argument for Assumed Extra-territorial Consent During Cybercrime Investigations

As seen on DigitalFIRE.ucd.ie

During cybercrime investigations it’s common to find that a suspect has used technology in a country outside of the territorial jurisdiction of Law Enforcement investigating the case. The suspects themselves may also be located outside of the territory of the investigating group. A country may be able to claim jurisdiction over a suspect or device that is located outside of their territory [1], however, foreign Law Enforcement would not have jurisdiction within the territorial jurisdiction of another country unless explicitly granted. This means that if a suspect or digital device is located in another territory, the investigating country may need to request assistance from the country that has territorial jurisdiction. This request could be in the form of mutual legal assistance requests, international communication channels such as INTERPOL and United Nations networks, through a personal contact within the country of interest, etc.
It appears to be increasingly common that Law Enforcement will use personal contacts to quickly begin the investigation process in the country of interest and request data be preserved, while at the same time making an official request for cooperation through official channels. This is simply because official channels are currently far too slow to deal with many types of cybercrime that rely on preserving data before the records are overwritten or deleted; a problem that has been communicated by Law Enforcement for over a decade.

For similar reasons, Law Enforcement in many countries commonly access data stored on servers in countries outside of their jurisdiction. When and how they access this data is usually not well defined because law too, in most — if not all — countries, is failing to keep up with changes in cross-border digital crime. However, a recent work by the NATO Cooperative Cyber Defence Centre of Excellence — Tallinn Manual on the International Law Applicable to Cyber Warfare (Tallinn Manual) — attempted to explicitly state some of these issues and their practical implications, albeit in the context of Cyber Warfare.

In the Tallinn Manual the expert group considered issues of jurisdiction applied to cyber infrastructure. Of these considerations, they claim that “… States may exercise sovereign prerogatives over any cyber infrastructure located on their territory, as well as activities associated with that cyber infrastructure” [2] with some exceptions. Further, Rule 1 paragraph 8 stipulates that:
<blockquote class="tr_bq">A State may consent to cyber operations conducted from its territory or to remote cybercrime operations involving cyber infrastructure that is located on its territory.</blockquote>In this rule, the expert group gives the explicit example that a State may not have the technical ability to handle a situation within their territory, and thus may give permission for another State to conduct cyber activities within their jurisdiction.

Much of the discussion on sovereignty, jurisdiction and control stipulate the scope of control a State possesses; however, Rule 5 specifies the obligation of the State to other states. Specifically that “the principle of sovereign equality entails an obligation of all States to respect the territorial sovereignty of other States”. The expert group elaborates with Rule 5 paragraph 3 claiming that:
<blockquote class="tr_bq">The obligation to respect the sovereignty of another State… implies that a State may not `allow knowingly its territory to be used for acts contrary to the rights of other States’.</blockquote>Rule 5 paragraph 3 has interesting implications in cyber space. For example, the infrastructure of many different countries may be used in an attack against a single victim. Because of this rule, each country whose infrastructure was involved is obliged to not allow these attacks to continue once they are aware of such attacks. A State, however, is not necessarily obliged to actively look for attacks against other countries from its infrastructure.

In other words, if an attack is made from (or through) State A to State B, and State B makes State A aware of the attack, then State A is normally obliged to help in stopping — and presumably helping to investigate — the attack on State B, if possible.

The Tallinn Manual goes on with Rule 7, stating that an attack originating from a State is not proof of a State’s involvement, but “… is an indication that the State in question is associated with the operation”. However, instead of assuming that the State could be guilty, in this work we propose to assume the innocence of the State whose infrastructure is being used in an attack.

Let’s assume State B is affected by a cyber attack apparently originating from State A. State B then attempts to make State A aware of the attack. There are essentially three responses that State B may receive from State A: a response to collaborate, a response not to collaborate, or no response. In the case of no response, if there is an assumption of innocence of State A, then State B may also assume that State A, being obliged to help, cannot stop the attacks because of a lack of technical ability, resources, etc. In this way, consent to conduct remote cyber investigations on infrastructure within State A could potentially also be assumed.

In this way, when requests for assistance are made between States, if one State does not, or cannot, respond to the request, then cyber investigations can continue. Under this assumption, countries with intention to collaborate but limited investigation capacity, convoluted political and/or communication processes, or just no infrastructure will gain increased capacity to fight abuses of their infrastructure from countries that have more resources.

By assuming innocence of a state, at least four current problem areas can be improved. First, by assuming a State’s consent for remote investigation upon no reply to international assistance requests, this will lead to a reduction in delay during cross-border investigations for all involved countries despite weaknesses in bureaucratic official request channels. Second, such an assumption will force States to take a more active role in explicitly denying requests, if so desired, rather than just ignoring official requests, which is a waste of time and resources for everyone involved. Third, depending on the reason for the denial, such an explicit denial to investigate attacks against other countries would be slightly more conclusive proof of State A’s intention to attack, or allow attacks, on State B, and could potentially help where attack attribution is concerned. And finally, such an assumption may also hold where mutual legal assistance currently — and oftentimes — breaks down; when dual criminality does not exist between two countries [3].

Essentially, if an attack on Country B occurs from infrastructure in Country A, Country A will either want to help stop the attack or not. By assuming that Country A does want to help but is simply unable to, this forces Country A to be explicit about their stance on the situation while at the same time ensuring that international cybercrime investigations can be conducted in a timely manner.


James, J. I. (2013) “An Argument for Assumed Extra-territorial Consent During Cybercrime Investigations”. VFAC Review. Issue 25. [PDF]

Bibliography


1. Malanczuk, P. (1997). Akehurst’s Modern Introduction to International Law (7th ed.). Routledge.
2. Schmitt, M. N. (Ed.). (2013). Tallinn Manual on the International Law Applicable to Cyber Warfare. Cambridge University Press.
3. Harley, B. (2010). A Global Convention on Cybercrime? Retrieved from http://www.stlr.org/2010/03/a-global-convention-on-cybercrime/


Image courtesy of jscreationzs / FreeDigitalPhotos.net

6 min read

Cybersecurity and Challenges to Democracy

South Korea’s democracy can only be described as… developing. In the late 1970s, after the assassination of Military Dictator Park Chung-hee (who Koreans often refer to as ‘President Park’), slow but relatively steady progress in terms of democracy was made in South Korea. This despite the fact that the North Korean threat, and communism in general, was a topic constantly abused to allow the government to gain more power.

Recent incidents, however, have prompted a struggle for power within the South Korean government that will have serious consequences for South Korea’s democracy going forward. In recent weeks North Korea has been saber rattling as usual, by which South Koreans are generally unaffected. One issue with this situation, however, is that the recent cyber attacks on South Korean banking and broadcasting systems are also being attributed to North Korea without, at the time of this writing, any verifiable proof that North Korea carried out the attacks. South Korea currently has IP addresses tracing back to various countries, and attack patterns that are “similar” to those used by North Korea in the past. At best, the evidence is unsubstantial. However, the South Korean National Intelligence Service (NIS) (think C.I.A. on steroids) has formally released a statement claiming North Korean involvement, which, of course, no one can confirm or deny.

This statement comes days after the exceedingly toned-down proposal of the South Korean “National Cyber Terror Prevention Act”. This act, in essence, gives the NIS, or NIS-controlled groups, full power to create, vote on and enforce anti-cyber terror policies through the creation of an NIS-led “cyber control tower”. Ironically, the act itself was proposed by the very members of Congress who are responsible for keeping the NIS in check. The most interesting aspect of the act, however, is the definition of cyber terror. Cyber terror, according to the act, can potentially be almost anything.

Consider that the NIS was created during the time of a military dictatorship with the express intention of fighting communism. Because of that, the NIS effectively reports to no authority save for the president. The NIS has, in the past, attempted to push anti-terrorism acts that would allow them even more power with few checks on how that power is used and abused. However, when discussing physical terrorism, many experts can, and have, resisted such a push. Cyber terrorism, however, is extremely vague. So much so that even experts in the field do not agree on its definition and how it should be handled. This is a problem most countries are currently facing, related to the lack of definition of terms like cyber crime, cyber war and cyber terrorism. Because of this lack of definition, government agencies all over the world are attempting to create jurisdiction around whatever term they choose, generally focusing on which term will bring the most power and budget benefits rather than which term correctly describes the situation.

South Korea is an extreme example of how agencies can combine fear with vague terms to expand their power. Of course, this is all made possible because the South Korean people allow it to happen, normally through complacency and cultural restrictions. However, regardless of culture, similar situations are happening in most countries right now. While a centralized all-knowing government ‘cyber control tower’ may have some benefits, it simply won’t stop cyber crime/terror/war. Further, whatever benefit such an over-powered group could contribute to a democratic society is completely undermined by that group’s ability to abuse such power.

The reality is, cyber crime/terrorism/war is largely made possible by the public, and just like ‘traditional crime’ will never be completely stopped. The general public (globally) appear to have little interest in securing themselves, and they are about to give up what little freedom they have so their governments can ineffectively protect the people from themselves. Anyone who chooses to use technology should take responsibility for themselves, get informed and start implementing basic cyber security practices. If the people start securing themselves, the majority of digital crimes can be drastically reduced, and it won’t cost anyone hard-earned freedoms to do it.

[PDF]

3 min read

What is Cybersecurity?

Last week, a number of Korean organizations fell victim to cyber attacks. This has prompted discussions about cybersecurity in Korea, and while following this issue I’ve realized that Korea’s main challenge appears to be understanding what cybersecurity actually is.

From many of the discussions, representatives from various organizations appear to believe that security is a force, much like the police or military. Cybersecurity, however, is not an organization. It is not something that can be provided by a single group. Cybersecurity is a responsibility – a mindset – that each technology user must adopt. Everyone plays a part in the cybersecurity of Korea (and the world), and anyone not considering the security of their devices is putting not only themselves, but also their friends/family/workplace/bank/government/etc. in danger.

Indeed, organizations can play a part in helping to improve cybersecurity. Police investigations, for example, can lead to catching cyber criminals, and thus potentially reduce on-line crime. But Police cannot be everywhere, and are inherently reactionary. And unless citizens want the government protecting the people from themselves (via pre-incident monitoring to make sure you don’t click on the ‘wrong’ link), security of the country should be achieved through the education of everyone.

The thing that every citizen, company and government entity needs to realize is that your device probably will be compromised. So think of security as a function of time. With enough time, even the strongest security can be broken. So just give hackers less time. Change your passwords often, factory reset your phone and reformat your computers every 6 to 12 months, make sure your software is always up to date, use anti-virus software and firewalls on all your devices, and be very selective about the software and websites you use. There is a lot of information available about on-line security, so there is really no reason not to understand and implement the basics. It doesn’t take a lot of time, and it could end up saving you, or someone you love, a lot of inconvenience later.

Remember, cybercrime is not static. Security that worked yesterday may not work today. So securing devices should become a way of life, not a once-off effort.

Security resources: www.google.com/intl/ko/goodtoknow, www.kisa.or.kr and www.ctrc.go.kr

Letter to Editor: [PDF ENG] [PDF KOR]

1 min read

Social Media and Intelligence Gathering


As seen on DigitalFIRE

Online social media has changed the way many people, businesses and even governments interact with each other. Because of Twitter’s popularity and its ability to broadcast small pieces of information to a large number of people, it is an effective form of mass communication. However, the ease of communication that allows the public to freely share anything they wish can be used for both benefit and harm in a number of ways.

For example, in 2011 panic ensued as parents in Veracruz, Mexico rushed to pick up their children from school amongst reports of gang-related kidnappings and shootings [1]. During this time, it was reported that the panic led to an increased number of car accidents and denial of service on emergency response numbers [2]. The panic, however, was based on (plausible) claims from two people who posted about the false gang-related activity on Twitter, which later went ‘viral’.

In 2012 a teen gained worldwide notoriety by asking her followers on Twitter to call the police, claiming someone broke into her home [3]. Her case was later determined to be a runaway attempt, but was not discovered before reaching the number 2 most popular worldwide topic on Twitter for that time-period [4].

And in the 2008 terrorist attacks in Mumbai, India, claims emerged that the terrorists were monitoring social media outlets to extract operational intelligence to avoid police and potentially locate more victims [5][6].

These, and other similar cases, are not necessarily new. Abuse of emergency response numbers for non-emergencies are relatively common [7][8], and are even sometimes used as a way to attempt to distract police [9][10]. But just like emergency response numbers, social media can also be used to help in many situations.

For example, many law enforcement agencies, and even some communities themselves, have been creating and advocating the use of social networks to create a ‘virtual neighborhood watch’ that can consist of crime alerts from law enforcement and the public alike [11][12][13].

Even though social media was potentially used by the 2008 Mumbai terrorists, it was also used during the attacks by the public to report the news before traditional media outlets, warn of dangerous locations, communicate to loved ones, and even help organize support services such as blood donation [14][15]. This type of public emergency coordination was again demonstrated during the 2011 Mumbai bombings, where social media was used to track the bombings as well as organize support services for victims [16].

The negative aspects of online social media have prompted some countries to consider shutting down communication infrastructure services when they can be used against the public or state [17][18], with one extreme example being the 2011 Egyptian Internet outage during riots against the government in an attempt by the government to suppress information and disrupt public coordination [19]. However, some experts believe that the benefits of social media far outweigh any potential abuse. For example, Schneier [20] claims that “[t]errorist attacks are very rare, and it is almost always a bad trade-off to deny society the benefits of a communications technology just because the bad guys might use it too”.

How social media will continue to shape the public, governments and even crime remains to be seen. However, from a law enforcement perspective, the ability to communicate with and inform a large number of citizens at a time can be invaluable during a crisis. Further, intelligence about crime and criminals can often be gained via online social media sites such as Twitter. Again, gained intelligence can be used for positive or negative purposes depending on the perspective, but nevertheless, many users and criminals are constantly producing a stream of publicly accessible data that may help investigations.

While the content of postings should normally be considered hearsay and treated with caution, analysis of the produced metadata may provide some potentially relevant information for investigations. Several related online social networking security awareness campaigns have been created to raise awareness of the amount of personal information people are – normally unwittingly – posting.

One such site, “ICanStalkU.com” (I can stalk you), pulled Geo-Tagging information from the metadata of pictures posted on TwitPic.com. This Geo-Tagging information was then used to plot the user’s current location in real-time, and could potentially be used to track the current location and movements of a suspect, or help to place them at or near a location at the time of an incident. A more advanced, stand-alone program called “Creepy” [21] also uses the same Geo-Tagging information from many more social media outlets.
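The same metadata is easy to inspect directly. A hedged example with exiftool (the file name is hypothetical):

```
# Extract embedded GPS coordinates and timestamps from a shared photo
exiftool -gpslatitude -gpslongitude -createdate photo.jpg
```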

Another similar site that claims to raise awareness about over-sharing is “PleaseRobMe.com”, which uses Twitter, Foursquare or Gowalla check-ins with associated times to attempt to report if the user is home or not. Similar location-tagging features now exist on many social media sites, and could potentially be used to gain intelligence about a particular user.

While location information may be relevant, perhaps an investigator needs to plan when an operation should take place. For this, the site “SleepingTime.org” may provide an analyst with the best time to find the user at home or away. SleepingTime.org uses a user’s Twitter account activity and time zone to estimate when the user is most likely to be asleep based on the time they normally do not have any twitter activity.

And finally, online social media is about social networks. Paterva’s Maltego [22] is a more advanced web mining application that can work with social network data, among others, to generate communication networks and conduct entity link analysis.

These are just some of the tools and potential intelligence that can be extracted from public sources for many users. Even without specific tools, publicly available information about a particular user can oftentimes be mined with very little skill or time investment.

Because of social networking sites such as Twitter, a large amount of potentially valuable information can be provided to – and found about – the public, businesses, Law Enforcement, governments, and even criminals. Communication technologies can benefit the world; however, the same communication channels could also be abused. With the large amount of data being generated at present, and ability to easily communicate with a large population in near real-time, Law Enforcement should embrace social media outlets to more effectively share information, and also to receive intelligence that can help in the protection and prevention of crime.

Originally posted in the Virtual Forum Against Cybercrime Newsletter, Issue 16 [PDF]

1. Miglierini, Julian. (2011) “Mexico Twitter terrorism charges cause uproar.” BBC News. http://www.bbc.co.uk/news/world-latin-america-14800200.
2. (2011) “2 Mexicans face 30 years in prison for tweets that caused panic in violence-wracked city.” NY Daily News. http://articles.nydailynews.com/2011-09-04/news/30136979_1_tweet-panic-private-schools.
3. Murphy, Samantha. (2012) “Police: Teenage Girl’s Viral Tweet Was Kidnapping Hoax.” http://mashable.com/2012/10/01/teenage-girl-tweet-kidnapping/.
4. http://www.twee.co/topics/helpfindkara
5. (2008) “Terrorists turn technology into weapon of war in Mumbai.” http://www.couriermail.com.au/news/world-old/terrorists-and-technology/story-e6freop6-1111118178210.
6. Oh, Onook, Manish Agrawal, H. Raghav Rao. (2011) “Information control and terrorism: Tracking the Mumbai terrorist attack through twitter”. Information Systems Frontiers. Vol. 13. Issue 1. P. 33-43. Springer.
7. Nichols, Mike. (2008) “False 911 calls are alarmingly common.” Journal Sentinel Inc. http://www.jsonline.com/news/29432374.html.
8. Esposito, Richard, Christina Ng. (2012) “Police: Angry Ex-Girlfriend Triggered US Airways Bomb Hoax.” ABC News. http://abcnews.go.com/US/police-angry-girlfriend-triggered-us-airways-bomb-hoax/story?id=17170280#.UHu9A2lrZ3J.
9. FitzPatrick, Lauren. (2011) “Man pleads guilty to making fake 911 call to try to help buddy.” Sun-Times Media, LLC. http://www.suntimes.com/news/6734423-418/man-pleads-guilty-to-making-fake-911-call-to-try-to-help-buddy.html.
10. (2012) “Smugglers Use Fake 911 Calls to Distract Police.” http://www.krgv.com/news/smugglers-use-fake-911-calls-to-distract-police/.
11. Catone, Josh. (2009) “Virtual Neighborhood Watch: How Social Media is Making Cities Safer.” http://mashable.com/2009/10/01/social-media-public-safety/.
12. Johnson, Kirk. (2012) “Hey, @SeattlePD: What’s the Latest?.” New York Times. http://www.nytimes.com/2012/10/02/us/seattle-police-department-uses-twitter-to-report-crime.html.
13. Barr, Meghan. (2009) “Neighbors Twitter, blog to keep criminals at bay.” NBC News. http://www.msnbc.msn.com/id/32372082/ns/technology_and_science-security/t/neighbors-twitter-blog-keep-criminals-bay/#.UHt5kGlrZ3J.
14. Beaumont, Claudine. (2008) “Mumbai attacks: Twitter and Flickr used to break news.” Telegraph Media Group Limited. http://www.telegraph.co.uk/news/worldnews/asia/india/3530640/Mumbai-attacks-Twitter-and-Flickr-used-to-break-news-Bombay-India.html.
15. Stelter, Brian, Noam Cohen. (2008) “Citizen Journalists Provided Glimpses of Mumbai Attacks.” New York Times. http://www.nytimes.com/2008/11/30/world/asia/30twitter.html.
16. Ribeiro, John. (2011) “Mumbai Uses Internet, Twitter to Cope with Terror Blasts.” IDG Consumer & SMB. http://www.pcworld.com/article/235672/article.html.
17. (2011) “British Government Considering Social Media Ban. Was China Right?” http://technode.com/2011/08/15/british-government-considering-social-media-ban-was-china-right/.
18. Phillip, Joji Thomas, Soma Banerjee. (2012) “Government for state-specific ban on social media, asks ISPs to build embedded technology.” Bennett, Coleman & Co. Ltd. http://articles.economictimes.indiatimes.com/2012-09-08/news/33696582_1_home-ministry-websites-social-media.
19. Bates, Theunis. (2011) “Protesters Left in the Dark as Egypt Blocks Internet.” http://www.aolnews.com/2011/01/28/protesters-left-in-the-dark-as-egypt-blocks-internet-cell-phone/
20. Schneier, Bruce. (2009) “Helping the Terrorists.” http://www.schneier.com/blog/archives/2009/01/helping_the_ter.html
21. http://ilektrojohn.github.com/creepy/
22. http://paterva.com/web6/

6 min read

Encrypted backup and the importance of redundancy

As more online storage is made available, it is often convenient to store our personal documents on the web to share between devices or with friends, family, co-workers, etc. How much you trust these services is entirely up to you. There are many benefits such as convenience and accessibility, as well as drawbacks, such as privacy concerns, denial of service and others.

I have noticed that, with online storage, more people are backing up their documents (consciously or not) than before online storage became so seamless. There is also an assumption that their data will be available whenever they want or need it. The important thing to remember, however, is that no backup solution is perfect. Be it online storage in the Cloud or backup to a local disk, there are potential risks to the access and integrity of your data.

In my case, I rarely use Cloud-based storage services. I find that I don’t always have access to an Internet connection, and many of the files I want to back up are unlikely to be accessed in the short term. Basically, I am archiving some of my data, and there is really no benefit for me to archive to the Cloud. So for backup and archiving I have two external drives that I keep synced with rsync.

Both disks use full disk encryption. I do not keep personal information or secrets per se, but in 2010 I had an unencrypted disk stolen (and miraculously recovered). When it went missing, I was not worried about the information about me that was on the disk. I found that I was more worried about the pictures and movies of my friends and family (especially my niece and nephew), and what someone could potentially do with them. It is unlikely that a thief would use the pictures, but that was still my concern.

Since then most of my data is encrypted. However, there are still potential risks to the data, which I was reminded of recently. For data backup I have a primary backup drive to which the local machine is backed up daily, and a secondary drive that gets synced from the primary weekly.
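For anyone replicating this two-tier setup, the weekly primary-to-secondary sync can be a single rsync call. A minimal sketch, assuming rsync is installed; the mount points are hypothetical:

import subprocess

PRIMARY = "/mnt/backup-primary/"      # receives the daily machine backup
SECONDARY = "/mnt/backup-secondary/"  # mirrored from the primary weekly

# --archive preserves permissions and timestamps; --delete keeps the
# secondary an exact mirror of the primary (trailing slash syncs contents)
subprocess.run(["rsync", "--archive", "--delete", "--verbose",
                PRIMARY, SECONDARY], check=True)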

Last week my system crashed while the computer was apparently in the process of backing up. Either the encrypted device or the file system got corrupted, and the backed-up data was effectively lost. Recovery might have been possible, but I didn’t take the time to do an analysis. This is because I could just recreate the primary encrypted disk, rsync from the secondary backup, then back up the local system as normal. Without the secondary backup, my current working documents would still exist, but I would potentially lose some archived information of sentimental value.

Long story short: no matter how you decide to back up your data, if you care about the data, keep at least a secondary backup in another location. This can minimize the risks to the access and integrity of your data, since you never know what might happen.


Image courtesy of Renjith Krishnan / FreeDigitalPhotos.net

2 min read

Revisiting the Four Grand Challenges in Trustworthy Computing: Challenge 2

A while back we looked at Challenge 1 in the Four Grand Challenges in Trustworthy Computing from 2003. In my opinion, we have fallen quite short on Challenge 1, that is “eliminating epidemic attacks by 2014”. Today, we will look at Challenge 2.
<div class="separator" style="clear: both; text-align: center;"></div>
Challenge 2 is generally defined as “ensur[ing] that new, critical systems currently on the drawing board are immune from destructive attack”.

Challenge 2 looks at systems of critical importance that are currently being designed and implemented. Unlike Challenge 1, which focuses on systems that are already deployed, Challenge 2 focuses on the security, reliability and trustworthiness of systems that are (or were at that time) under development.

The metric of success is based on the CIA model, focusing on systems that ensure:
<ul><li>Confidentiality</li><li>Integrity</li><li>Availability</li></ul>and is extended with:
<ul><li>“Auditability”</li><li>Global Accessibility</li></ul><div>The group identified a number of critical systems (Figure 1), and stated that “there is very little reason to believe that such systems, if developed under current technology, will be trustworthy”.</div>[Image: Figure 1. Critical systems and infrastructure identified by the CRA group in 2003.]<div>This statement comes almost five years after U.S. Presidential Decision Directive 63, which stated a national goal:
<blockquote class="tr_bq">No later than the year 2000, the United States shall have achieved an initial operating capability and no later than five years from today the United States shall have achieved and shall maintain the ability to protect the nation’s critical infrastructures from intentional acts that would significantly diminish the abilities of:
<ul><li>the Federal Government to perform essential national security missions and to ensure the general public health and safety;</li><li>state and local governments to maintain order and to deliver minimum essential public services;</li><li>the private sector to ensure the orderly functioning of the economy and the delivery of essential telecommunications, energy, financial and transportation services.</li></ul></blockquote>The Colloquium for Information Systems Security Education in 2008 again identified critical systems, and specifically SCADA systems, as a priority area in need of organized research. There has been a growing amount of research into critical system defense, security and forensics, but the 2011 alleged hacking of an Illinois water system, as well as some infrastructures we have seen, leads me to believe that the research is not being practically implemented.

From discussions with people dealing with critical infrastructure, there seems to be an attitude much like that of a home computer user. They know there is a risk, but in many cases don’t feel the risk is big enough to justify investing the amount of money necessary to update, secure and monitor their systems (even some physical systems). In the U.S., government regulation was proposed that would allow the DHS to “enforce minimum cybersecurity standards on infrastructure computer systems that, if damaged, would lead to mass casualties or economic loss”. The regulation, however, was opposed.

I somewhat understand why some critical infrastructure providers may find it hard to justify large investments in cybersecurity. Last year, 198 cyber incidents were reported to DHS across all critical infrastructure sectors, most of which were reportedly spear-phishing attempts. Granted, many more attacks probably took place that were not discovered or reported, but with numbers like that, a director may well conclude that it is statistically unlikely they would get hit.

For me, the takeaway is that critical systems are still not being designed with cybersecurity, and sometimes even physical security, in mind. Further, critical infrastructure providers have the same problems as any other business; their people - as well as their technology - can be a security gap. Since critical infrastructure is a hot topic right now, I hope security and risk awareness has increased, but I have yet to see any real changes implemented in many countries. Almost 10 years after the grand challenge was proposed, I would say that not only are we not designing systems that are “immune from destructive attack”, we are still not designing critical systems with basic cybersecurity in mind.

</div>
Image: FreeDigitalPhotos.net

3 min read

Future Crimes Ted Talk

[Update] See Bruce Schneier’s response

Our friends at FutureCrimes.com recently had a good Ted talk about technology, crime and a potential way to fight crime in the future.

<div class="separator" style="clear: both; text-align: center;"></embed></div>
From Ted.com: “Marc Goodman imagines the future crime and terrorism challenges we will all face as a result of advancing technologies. He thinks deeply about the disruptive security implications of robotics, artificial intelligence, social data, virtual reality and synthetic biology. Technology, he says, is affording exponentially growing power to non-state actors and rogue players, with significant consequences for our common global security. How to respond to these threats? The crime-fighting solution might just lie in crowdsourcing.”


~1 min read

Revisiting the Four Grand Challenges in Trustworthy Computing: Challenge 1

Almost a decade ago, the Computing Research Association published Four Grand Challenges in Trustworthy Computing. Working in a rapidly evolving digital field, it is easy to think everything we see is new, especially when it comes to digital crime. On a technological level this may be true, but if we look at higher-level concepts, has anything changed in 9 years, and what progress have we made?

In the introduction to the challenges, references are made to “increasingly portable systems in every aspect of life”; cyber defense/war; threats relating to power outages, transportation and communications system (critical infrastructure) denial of service (DoS); insider attacks; “loss of privacy; alteration of critical data; and new forms of theft and fraud on an unprecedented scale”. Each of the mentioned technology trends and identified threat areas is still relevant today, and most are gaining increasing awareness from the general public.

<div>The group’s overall goal was “to create an alternative future in which spam, viruses and worms… have been eliminated. In this vision of the future individuals would control their own privacy and could count on the infrastructure to deliver uninterrupted services… In such a world, policy and technology fit together in a rational way, balancing human needs with regulation and law enforcement.”</div><blockquote class="tr_bq">“In short, it would be a world in which information technology could be trusted.”</blockquote>To this end, the four identified grand challenges of trustworthy computing are:
<ol><li>Develop new approaches for eradicating widespread, epidemic attacks in cyberspace.</li><li>Ensure that new, critical systems currently on the drawing board are immune from destructive attack.</li><li>Provide tools to decision-makers in government and industry to guide future investment in information security.</li><li>Design new computing systems so that the security and privacy aspects of those systems are understandable and controllable by the average user.</li></ol><div>This post will focus on Challenge 1. The other challenges will be looked at in later posts.</div><div>
</div>Challenge 1: Eliminate Epidemic Attacks by 2014
Epidemic attacks, or “Cyber Epidemics”, in this case are basically categorized as viruses and worms, spam, and Distributed Denial of Service (DDoS) attacks.

Suggested approaches to achieve this goal are summarized as follows:
<ul><li>Immune System for Networks - respond to and disable viruses and worms by dynamically managed connectivity</li><li>Composability - two systems operating together will not introduce vulnerabilities that neither has individually</li><li>Knowledge Confinement - partitioning information so an attacker never has enough knowledge to propagate through the network</li><li>Malice Tolerance - continue operating in spite of arbitrarily destructive behavior of a minority of system components</li><li>Trusted Hardware - tie software and services guarantees to the physical security of hardware devices</li></ul>With the 9th of July 2012 recently past, the first DoS attack that comes to mind is DNSChanger. And with the increase of cyber activism / terrorism, DDoS is a relatively common issue for businesses and government entities. Looking at the Q1 2012 reports from McAfee and Trend Micro, viruses and malware are at all-time highs for all platforms. Spam is also still increasing, with Norton’s Cyber Crime Index projecting spam to be about 68% of all sent email traffic as total email traffic continues to increase.

Some advancements have been made in the suggested areas. For example, Malice Tolerance has improved as a natural consequence of massively distributed computing. Trusted hardware became a big topic when Windows Vista used trusted platform modules (TPM) with BitLocker for disk encryption. TPM is still used, mostly in businesses, but did not really catch on with the public, seemingly because of privacy concerns and a general lack of interest.

The Immune System for Networks approach has advanced, but there are limitations. Security management platforms have to account for activities from many different sources, where malicious activities may not be known, or even suspicious. For businesses, targeted attacks using social engineering are increasing. The weak link continues to be a lack of awareness on the part of the user, which allows many attacks to be successful. In these situations, current security systems are not robust enough to detect, and deal with, all types of incidents introduced by users, while at the same time allowing the flexibility and access that users require. Further, many organizations have not given proper consideration, or devoted enough resources, to their cyber security. For example, many businesses do not fully consider their cyber security, and are being targeted because of it [Irish Times][Infosec Island][CBS News].

As for systems operating together without introducing vulnerabilities that neither has individually, I immediately think of application programming interfaces (APIs). APIs have become a common way for software components to communicate with each other, but many additional risks exist on both the client and server sides when APIs are used [Dark Reading]. The Cloud Security Alliance even listed “Insecure Interfaces and APIs” as one of the top threats to Cloud computing (v1.0).

Finally, consider making sure an attacker never has enough information to propagate through the network. Unfortunately, this reminds me of Stuxnet. In the case of Stuxnet, the malware propagates across Windows machines via network/USB, and checks for Siemens software. If the software is known to run on, or interface with, Windows machines, then the attacker just needs to take advantage of a vulnerability (or three) that is likely to exist in the system. Not much information is needed to propagate through the network if network connectivity exists and both systems are vulnerable to the same attack. Knowledge confinement could potentially be achieved by using non-standard configurations but, then again, Verizon claims that in 2011, “97% of breaches were avoidable through simple or intermediate controls”.

So looking at Challenge 1 as it was defined in 2003, eliminating cyber epidemics by 2014 seems unrealistic at this stage. While some of the suggested approaches have been developed, the application of these ideas into the practices of people, businesses, and even governments has not come to fruition on a large scale. This does not mean we are less secure. WhiteHat Security claims that in Q1 of 2012 there were fewer (web) vulnerabilities, and those that are identified are being fixed faster than in previous years. But, like Verizon, they also found that basic mitigation techniques, such as application firewalls, could have reduced the risk of 71% of all custom Web application vulnerabilities.

Until everyone begins to understand cyber security and their role in it (and takes it seriously), the challenge will not be met. Will the Challenge 1 recommendations ever completely eliminate cyber epidemics? I don’t think so. They can most definitely help, just like implementing basic security measures, but the Internet is not a closed system, and it only takes one weak link.

Image: FreeDigitalPhotos.net

5 min read

Predictive Policing and Online Crime

FutureCrimes.com just passed on the post “Sci-fi policing: predicting crime before it occurs”. Crime modeling used by the LAPD appears to have contributed to a 13% decrease in crime in the area in which it was being tested.

The crime model is apparently based on models used to predict earthquake aftershocks. While I’ve not yet found any publications specific to the method, I assume the model predicts crime based on the likelihood of reoccurrence in a particular area.
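If the aftershock analogy holds, the core of such a model would be a self-exciting point process: each past crime temporarily raises the expected rate of new crime nearby, decaying over time. The toy sketch below shows such an intensity function; the parameters are illustrative assumptions, not the LAPD’s model:

import math

def intensity(t, past_event_times, mu=0.5, alpha=0.8, beta=1.2):
    # Baseline rate `mu` plus a decaying bump `alpha * exp(-beta * dt)`
    # for every earlier event; a higher value means more expected crime
    return mu + sum(alpha * math.exp(-beta * (t - ti))
                    for ti in past_event_times if ti < t)

# Two recent events push the expected rate well above the baseline
print(intensity(10.0, [2.0, 7.5, 9.0]))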

I have discussed this type of modeling with officers from Chile, who had been working on something similar before 2008. Crime types, and the locations in which they were likely to happen, could be accurately predicted, they said. The issue with disrupting crime hot-spots, however, is that the overall amount of crime is not actually reduced, but instead dispersed to other areas. For example, if a model predicts crime is likely to happen in a certain area, and officers begin patrolling the area, the crime is probably not tied specifically to the physical location. Yes, the patrol is a deterrent, and crime is reduced in the area, but what the Chilean officers found was that the crime moved to other areas of the city.

In the LAPD’s case, crime was modeled and measured within a specific area. The article submits that “[c]rimes were down in the area 13 percent following the rollout compared to a slight uptick across the rest of the city where the program wasn’t being used”. The question is: was the “slight uptick” in other areas of the city a result of naturally increasing crime, or is it the result of displacement of crime from identified hot-spots? Based on Chile’s experience, I am guessing the latter.

Predictive policing is an interesting concept that seems to be a natural extension of data mining over crime data. However, I have not yet seen research dealing with predictive policing of online crime. Just as a victim of opportunistic crime is likely to be re-victimized in the physical world, is a victim of opportunistic crime online also more likely to be re-victimized? If so, what methods do ‘cyber cops’ have to disrupt such crime? And finally, if disruption were possible online, would the crimes just be dispersed instead of reduced?


Related:

<ul><li>Stopping Crime Before it Starts</li></ul>

Google Scholar Search:

<ul><li>Predictive Policing</li><li>Modeling Crime</li></ul>

Image: FreeDigitalPhotos.net

1 min read

International Symposium on Cybercrime Response (ISCR) 2012

I’m just back from the 1st INTERPOL NCRP Cybercrime Training Workshop and International Symposium on Cybercrime Response 2012, held in Seoul, South Korea. The joint INTERPOL and Korea National Police (KNP) conference was hosted by the KNP Cyber Terror Response Center (CTRC).

ISCR 2012 Agenda

<div class="separator" style="clear: both; text-align: center;"></div>The first day was a look at Law Enforcement (LE) communication networks, including INTERPOL, the G8 24/7 High Tech Crime Network1, and even more informal communication channels. The overall consensus seems to be that the more formal networks are too slow to deal with the requirements of international cybercrime investigation requests. This appears to be partially a limitation with the efficiency of the networks as well as the ability of receiving countries to process the requests either because of resource issues or laws (or lack of) in the requested country to deal with the investigation request.

It was determined that informal channels of LE communication are currently more effective since they bypass international bureaucracy. These channels appeared to be created mostly by networking (conferences, etc.), and luck.

There essentially seemed to be three camps: Formal communication networks like INTERPOL and G8 24/7, less formal networks created via bilateral agreements, and LE social networks (p2p). Each camp had success stories, and I know each has had failures.

The question is, how can the situation be improved? Criminal communication networks at an international level work much more efficiently than law enforcement networks. There are many reasons why, but what can be done?

The issue of trust in LE communication was brought up: if you are requesting information or cooperation, the person with whom you are communicating should be more than just a name on a list. This is an interesting point to me. If LE is given a list of contact points per country from a formal communication network, do they question the contact point? I think they would automatically trust the contact point via the reputation of the network referring them, even without meeting the contact personally. The issue comes when these contacts are slow, or fail, to respond to requests from the network. Trust, then, comes from showing you are reliable when something is requested, whether or not you physically meet the contact representative.

Another interesting point was the concept of “exercising” your team(s) in international request response; LE essentially creates an incident response (IR) plan for international requests. Incident response is a huge topic in network security. If you read this article, for example, it is geared (at a high level) towards setting up an incident response plan, yet each of the tips could be directly transposed into international LE response. The discussed point of exercising your team would be the final testing requirement. Unfortunately, this is the phase that is most often neglected, usually due to time and resources. In the case of LE, especially at an international level, it would be difficult to coordinate, and perhaps even justify, the time needed just to test communication that was not really requested.

The topic of international LE communication came down to looking at a few different questions (and I added a few): What exactly is the problem, and has a solution been identified? What type of information is needed? Who has legal authority? Have international procedures been established? Are all concerned bodies part of the procedure and willing to cooperate? How do we test the procedure? How do we measure success? Who is responsible for updates?

These questions are not exactly easy to answer, even within a single organization, and working with multiple organizations in multiple jurisdictions to find answers to these questions is even more difficult and time consuming. In my opinion, this is where providers of formal networks should be filling in the gaps. I should not expect my local investigators to create their own international networks, and unless this process is centralized then different procedures will be created, incomplete networks will be formed and there will be much duplication of effort.

The rest of the conference further discussed communication and law, examined current threats, and some gave case studies (success stories) involving international communication and collaboration between international law enforcement, private sector and sometimes academia.

Overall, the conference was directed at practitioners. It did not get very technical or theoretical, and could probably be understood by anyone regardless of their familiarity with cybercrime. Some cybercrime damage estimates were given, although how to measure damage accurately is a problem that was not addressed. The estimates looked impressively dramatic, but it felt like the stats from different presentations did not relate to each other well.

Similarly, definitions used in each presentation were quite different for the same terminology. The group was composed of people from many different countries, all practitioners, but a lack of consistency in the use (and scope) of terms was an obvious communication problem, even for terms as general as “cybercrime”. Sometimes nonstandard term usage made it difficult for me to know exactly what the speaker really meant. This made me realize that even in the same area of cybercrime investigation, we are speaking different languages. How do we expect to be able to communicate at a practical level when it is so difficult to accurately communicate our needs in a way that can be understood by everyone in the area?

Many case studies were given by law enforcement that dealt with international communication but, other than “we need more / better communication”, I really did not see any actionable solution proposed beyond ad-hoc cooperation. Even after these great case studies and the information from the private sector, I was still left with the feeling: where do we start?

Overall, I found the conference to be interesting. Topics were mostly on communication but, unfortunately, few actionable items were discussed. Case studies are useful for understanding problems and potential solutions. Some slightly more technical presentations outlined how technologies could potentially be used to help law enforcement’s current situation when dealing with cybercrime. The (potentially) most useful benefit of the conference, however, was the contacts made. There was not enough time to talk to everyone as much as I would have liked, but there appears to be potential in the group to help drive effective law enforcement communication on a global scale.


Image: FreeDigitalPhotos.net

1. The G8 24/7 High Tech Crime Network (HTCN) is an informal network that provides around-the-clock, high-tech expert contact points: IT Law Wiki 

5 min read

Research

What I’m Reading: A functional reference model of passive systems for tracing network traffic

What I’m Reading: Today we are talking about ‘A functional reference model of passive systems for tracing network traffic’ by Thomas E. Daniels. This paper deals with network traffic origin analysis using passive methods.

T. E. Daniels, “A functional reference model of passive systems for tracing network traffic,” Digit. Investig., vol. 1, no. 1, pp. 69–81, Feb. 2004.

Link: http://www.sciencedirect.com/science/article/pii/S1742287603000045

<div class="separator" style="clear: both; text-align: center;"><div class='embed-container'><iframe src='https://www.youtube.com/embed/z-jthlQB6sE' frameborder='0' allowfullscreen></iframe></div></div></div>
Audio only:

<iframe seamless="" src="https://bandcamp.com/EmbeddedPlayer/track=2817486589/size=small/bgcol=ffffff/linkcol=0687f5/transparent=true/" style="border: 0; height: 42px; width: 100%;">WIR:E01 - A functional reference model of passive systems for tracing network traffic by Joshua James</iframe>

~1 min read

Paid Graduate Positions Available: Digital Investigations in Internet of Things

The Legal Informatics and Forensic Science (LIFS) Institute in the College of International Studies at Hallym University, South Korea, currently has openings for full-time researchers at the Masters, Ph.D. and Postdoctoral levels.

These positions deal with Internet of Things (IoT) digital forensic investigations. The following skills are necessary:
  • Programming skills (any language)
  • Ability to plan and carry out research
  • Ability to work with a team

The following skills are preferred but not required:
  • Knowledge of embedded systems
  • Embedded system programming experience
  • Computer / Network administration experience
  • Competency in Linux / Unix systems
  • Knowledge of Digital Forensic Investigation techniques (esp. acquisition)

These positions include a full scholarship as well as a monthly living stipend. Candidates should be willing to relocate to Chuncheon, South Korea.

To apply for the Master’s and Ph.D. positions, please do the following:
  1. Send an email with your CV and links to any research papers you have published to [email protected] with the subject “IoT Graduate Application”.
  2. Apply for a graduate position with Hallym University [http://bit.ly/20Fvvi4] by November 10th, 2016.
    • Download the application files [http://bit.ly/2eWHVSM]
    • Complete the basic application files
    • Mail the application files to [email protected] and CC [email protected]
    • Other documents can be provided later (such as passport info, diploma, etc.)
    • No Visa will be issued until certified copies of supporting documents are provided.

To apply for a Post-doctorate in Digital Forensic Investigation of IoT Devices, please do the following:
  1. Send an email with your CV and links to any research papers you have published to [email protected] with the subject “IoT Postgraduate Application”.
    • Candidates must have already completed a PhD degree.
1 min read

Fully Funded Master’s/Ph.D. Positions in Internet of Things (IoT) Digital Investigation

The Legal Informatics and Forensic Science program in the College of International Studies at Hallym University is currently recruiting full-time researchers at the Master’s, Ph.D. and Postdoctoral levels.

These positions involve research related to Internet of Things (IoT) digital forensic investigation, so the following qualifications are required:
  • Programming skills (any language)
  • Ability to design and carry out research
  • Teamwork skills

The following skills are preferred but not required:
  • Knowledge of embedded systems
  • Embedded system programming experience
  • Computer / network administration experience
  • Understanding of Linux / Unix systems
  • Knowledge of digital forensic techniques (especially acquisition)

These positions include a full scholarship as well as a living stipend. Candidates are encouraged to relocate to Chuncheon, Gangwon-do.

To apply for the Master’s and Ph.D. positions, please do the following:
  1. Send an email with your CV and links to any research papers you have published to [email protected] with the subject “IoT Graduate Application”.
  2. Apply for a graduate position with Hallym University [http://bit.ly/20Fvvi4] by November 10th, 2016.
    • Download the application files [http://bit.ly/2eWHVSM]
    • Complete the basic application files
    • Mail the completed application files to [email protected] and CC [email protected]
    • Other documents (such as passport / ID and diploma) can be provided later.

To apply for the Postdoctoral position, please do the following:
  1. Send an email with your CV and links to any research papers you have published to [email protected] with the subject “IoT Graduate Application”.
    • Candidates must have already completed a Ph.D. degree.

~1 min read

Finding private IP addresses in Email Headers

In some cases it may be necessary or helpful to find the private IP address of a suspect. This can be difficult, especially since NAT is common in most networks. However, if a suspect is sending emails from a local client, the private, as well as the public, address may be available in the email header.


If Gmail is used with a local client (like Thunderbird, Outlook, etc.), then the email header should have the private IP address. Note that it is possible that some of this information is stripped by the client, or the client’s network, before reaching the SMTP server. Take a look below:

—– Mail sent from Thunderbird using googlemail SMTP —–
Received: from [10.0.0.101] ([211.111.111.111]) <— here you can see the private (10.0.0.101) and public (211.111.111.111) IP address of the sender connecting to the SMTP server.
by smtp.googlemail.com with ESMTPSA id <– this line tells you that the message was received by SMTP
for <[email protected]>
Mon, 02 Nov 2015 23:01:38 -0800 (PST)
To: Joshua James <[email protected]>
From: “Joshua I. James” <[email protected]>


If the email is sent from the Gmail web interface (in the browser), the private IP address is NOT available. Google’s server only sees the suspect’s public IP address accessing the Google web server.

——- Sent from gmail web interface ——
Received: by 10.50.10.233 with HTTP; <—- “with HTTP” means received via web interface on server 10.50.10.233 (google). The sender’s IP is not shown.
Date: Tue, 3 Nov 2015 16:08:03 +0900
Subject: test2
From: “Joshua I. James” <[email protected]>
To: “Joshua I. James” <[email protected]>

If the header only shows Google’s address, then the suspect must have been accessing the web interface (check for “with HTTP”). In that case, Google will only have the public IP of the suspect.
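To triage headers in bulk, a short script can classify every IPv4 address found in the Received lines as private (RFC 1918) or public. Here is a minimal sketch using only the Python standard library; the file name is hypothetical:

import re
from ipaddress import ip_address
from email import policy
from email.parser import BytesParser

# Parse a raw email saved to disk
with open("message.eml", "rb") as f:
    msg = BytesParser(policy=policy.default).parse(f)

# Walk every Received header and classify each IPv4 address found
for received in msg.get_all("Received", []):
    for raw in re.findall(r"\b(?:\d{1,3}\.){3}\d{1,3}\b", str(received)):
        try:
            addr = ip_address(raw)
        except ValueError:
            continue  # e.g. "999.1.2.3" matched by the loose regex
        print("private" if addr.is_private else "public ", raw)

A private address in a hop that also shows “with ESMTPSA”, as in the first example above, is a good hint that a local client was used.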

1 min read

ICDF2C 2015 in Seoul, South Korea Final Program Now Available

The 7th EAI International Conference on Digital Forensics & Cyber Crime will be held OCTOBER 6–8, 2015 in SEOUL, SOUTH KOREA.

The final program is now available at http://d-forensics.org/2015/show/program-final
Be sure to register so you don’t miss the exciting talks and tutorials!

Keynote speakers include Max Goncharov from Trend Micro, Inc, and Dr. Dave Dampier from Mississippi State University:

<div class="separator" style="clear: both; text-align: center;"></div>Max Goncharov is a senior security Virus Analyst with Trend Micro Inc., and is responsible for cybercrime investigations, security consulting to business partners (internal, external), creation of security frameworks, designing technical security architecture, overseeing the build out of an enterprise incident response process, and creation of the enterprise risk management program. During his 15 years with Trend Micro Inc, he has participated as a speaker in various conferences and training seminars on the topic of cybercrime and related issues. He has especially focues on cyberterrorism, cybersecurity, underground economy; such as DeepSec, VB, APWG, etc.


Dr. Dave Dampier is a Professor of Computer Science & Engineering at Mississippi State University specializing in Digital Forensics and Information Security. He currently serves as Director of the Distributed Analytics and Security Institute, the university level research center charged with Cyber Security Research. In his current capacity, Dr. Dampier is the university lead for education and research in cyber security. Prior to joining MSU, Dr. Dampier spent 20 years active duty as an Army Automation Officer. He has a B.S. Degree in Mathematics from the University of Texas at El Paso, and M.S. and Ph.D. degrees in Computer Science from the Naval Postgraduate School. His research interests are in Cyber Security, Digital Forensics and Software Engineering.


There will also be three tutorials on investigation, open source hardware for digital investigations and setting up a research environment for mobile malware research:

<ul><li>Tutorial 1: DUZON – Desktop Exercise: Crafting Information from Data</li><li>Tutorial 2: Pavel Gladyshev – FIREBrick; an open forensic device</li><li>Tutorial 3: Nikolay Akatyev – Researching mobile malware</li></ul><div>After the first day of the conference we are also holding a special discussion session with Seoul Tech Society called “Safe Cyberspace”, with the panel consisting of the winners of the ICDF2C/STS essay contest. Everyone is welcome to join!</div><div>
</div><div>I hope to see you at ICDF2C in Seoul, South Korea! Don’t miss this exciting opportunity.</div>

1 min read

Ashley Madison Data and Ethical Use

On August 19th, the Impact Team released data on millions of alleged Ashley Madison users. Ashley Madison is a type of social networking website that promotes extra-marital affairs. After the main user data, the website’s source code and emails from the founder were also released.

The data was initially released on the Dark Web, but has since been added to many clear web sites.

<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody><tr><td style="text-align: center;">Impact Teams .onion site on Tor where the data can be downloaded</td></tr><tr><td class="tr-caption" style="text-align: center;">Impact Team’s .onion site</td></tr></tbody></table>
The data contains users’ names, email addresses, locations, addresses, credit card numbers, credit card transactions, sexual preferences, and much, much more.

If you are thinking about looking up your friends and neighbors, think about the following first:
<h3>You can’t trust most versions of the data</h3><div>Many people are interested in this data. Hackers and criminals know that it will be very popular, so they will add viruses and other malware to the data. It is also possible that copied versions had records added specifically to frame people. If you are going to use any version, make sure it came from Impact Team.</div><div><h3>You can’t trust websites that let you search the data</h3></div><div>Even before the data was released, some websites were created to be able to search the data if and when it was released. Some of these websites are created by trusted security researchers, some are created by hackers, and some are created by people who just want to make money off of the situation. The result is that you should only use trusted websites when evaluating data like this. Other sites may have malware, and some sites may collect any email addresses, names, and phone numbers that you enter to “check”, and resell that information to advertising companies. Be careful with websites you don’t know.</div><div><h3>The original data could have been fake or tampered with</h3></div><div>Data directly from Impact Team is the ‘most reliable’ version that we will get. However, this does not mean that it has not been tampered with. They may have added or modified entries.</div><div>
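On the provenance point above (“make sure it came from Impact Team”), one practical check is to hash a downloaded archive and compare it against a digest published by the original source. A minimal sketch; the file name and digest below are placeholders, not real values:

import hashlib

# Digest as published by the original source (placeholder value)
KNOWN_SHA256 = "0" * 64

h = hashlib.sha256()
with open("dump.tar.gz", "rb") as f:  # hypothetical file name
    for chunk in iter(lambda: f.read(1024 * 1024), b""):
        h.update(chunk)

print("match" if h.hexdigest() == KNOWN_SHA256 else "MISMATCH: do not trust")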
</div><div>Further, some accounts that exist in the system are likely to be fake anyway. The only accounts we can be reasonably sure of are those attached to credit card transactions, and even those may have been created with a stolen card.</div><h2>Think about what you are doing</h2><div>With data like this, there are a lot of things we can learn. I have a copy of the data, and I did not look up my friends or co-workers. Why? Because I don’t care. Many websites are using the data to find out who is cheating on whom. That question is not interesting. What is interesting is, for example, why people cheat. We might even ask: is cheating a bad thing? For 39 million people, apparently it isn’t. Other interesting questions include how to prevent an attack like this in the future, what the most common passwords are, etc.</div><div>
</div><div>While the data is useful for the information security field to learn from its mistakes, making the data easily accessible for the sake of gossip is not useful, and could potentially cause mental and physical harm. Consider the ‘help’ that a woman received from radio talk show hosts: as soon as the woman found out her husband was cheating, even the host admitted he felt like a jerk.</div><div>
</div><iframe frameborder="0" height="573" id="molvideoplayer" scrolling="no" src="http://www.dailymail.co.uk/embed/video/1207589.html" title="MailOnline Embed Player" width="698"></iframe>

I completely agree with the approach of the people at haveibeenpwned.com, who explain in their blog post that it is not the job of security researchers to out people. It is our job to protect people.

Every time there is a data leak, the information is used for all sorts of scams, and criminals are already using the AM data. The people involved in this breach could have their entire lives destroyed by the release of all of their information. Some people will say that they deserve it for being on such a site. That’s a matter of opinion. But as security researchers, if we don’t look for ways to use (and release) data responsibly, we may be hurting people to find the ‘juicy bits’ rather than improving security, privacy and freedom for everyone.

3 min read

Revisiting REAPER: Automating digital forensic investigations

The Rapid Evidence Acquisition Project for Event Reconstruction [1] was one of the first projects I worked on during my PhD. It started around 2008, when I became interested in trying to completely automate digital forensic investigations. Yes, it sounds impossible, but I wanted to see how far we could go in automatically handling digital evidence.<div>
</div><div>This was a little before digital forensic triage [2] and preliminary analysis gained popularity.</div><div>
<div>The idea was that once the process started, the investigator would not need to interact with the system. At the end of the automated investigation process, the “smoking gun” would be presented to the investigator in context.</div><div>
</div><div>Literally push-button forensics.</div><div>
</div><div>The Process</div><div>An investigator would insert a forensic live CD into the suspect’s computer (post mortem). After starting the computer, the live CD (with attached external disk) would provide only an information panel showing the stage of the investigation process.</div><div>
</div><div>First, REAPER would check the suspect computer to see what disks it could access, and whether there was encryption or hidden data. If hidden / encrypted data was detected, it would try to recover / access the data. With toy examples this worked, but how it would work on real systems - especially now - I’m not sure. All detectable media would be hashed, and verbose logging was on by default (for every action).</div><div>
</div><div>Next, all detectable media would be automatically imaged to the investigator’s external disk. Once complete, the images would be verified. If verification failed, the disk would be re-imaged.</div><div> </div><div>Next, REAPER would start standard carving, parsing and indexing. The Open Computer Forensic Architecture was used to extract as much data as possible. OCFA is an extremely powerful architecture, but the open source version is a bit difficult to use (especially from a live CD). I understand that the NFI has a commercial front-end that makes working with it much easier.</div><div>
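The image-then-verify loop is straightforward to illustrate. The sketch below is a simplification under assumed conditions (raw, dd-style reads from a device node, MD5 hashing, hypothetical paths); it is not REAPER’s actual implementation:

import hashlib

def md5_of(path, block_size=1024 * 1024):
    # Stream the file/device so large media never sits in memory
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(block_size), b""):
            h.update(chunk)
    return h.hexdigest()

def image_and_verify(device, image_path, max_attempts=3):
    for _ in range(max_attempts):
        source_hash = md5_of(device)               # hash the source media
        with open(device, "rb") as src, open(image_path, "wb") as dst:
            for chunk in iter(lambda: src.read(1024 * 1024), b""):
                dst.write(chunk)                   # raw, dd-style copy
        if md5_of(image_path) == source_hash:      # verify; retry on mismatch
            return source_hash
    raise RuntimeError("image failed verification after %d attempts" % max_attempts)

# image_and_verify("/dev/sdb", "/mnt/evidence/sdb.dd")  # hypothetical paths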
</div><div>Once all data has been acquired, verified and processed, the actual investigation / analysis should take place.</div><div>
</div><div>Here is where things get tricky.</div><div>
</div><div>First, we have to know what the investigation question is, and we have to ‘tell’ the system what the investigation question is. We currently do this by specifying the type of investigation generally. For example, “hacking” or “child exploitation”. We then have a (manually) pre-set list of tasks related to those particular types of crimes. Either that or we could search for ‘all crimes’.</div><div>
</div><div>Here, some basic analysis could take place. For example, we could automatically determine attack paths of intrusions based on processed data [3]. We could also test whether it was possible / impossible for a certain statement to be true based on the current state of the system [4]. Also, by building up ‘knowledge’ (models) about systems before an investigation, we could also accurately, automatically determine user actions using traces that are difficult for humans to analyze [5].</div><div>
</div><div>Where it falls apart</div><div>The problem is, we are still essentially in the processing phase of the investigation. We are condensing the available information into a useable form, but we are not yet saying what this information means in the context of the investigation. While we can gain more information about the data in an automated way, a human still needs to ‘make sense’ of the information.</div><div>
</div><div>Even though we are not there yet, automation has been shown to be useful for investigations [6], and can help reduce the time for investigations while improving the accuracy [7] of the investigation. For more comments on automation in investigations, please see [8].</div><div>
</div><div>
</div><div><ol><li><div style="margin-left: 24pt; text-indent: -24.0pt;">James, J. I., Koopmans, M., & Gladyshev, P. (2011). Rapid Evidence Acquisition Project for Event Reconstruction. In The Sleuth Kit & Open Source Digital Forensics Conference. McLean, VA: Basis Technology. Retrieved from http://www.basistech.com/about-us/events/open-source-forensics-conference/2011/presentations/ </div></li><li><div style="margin-left: 24pt; text-indent: -24.0pt;">Koopmans, M. B., & James, J. I. (2013). Automated network triage. Digital Investigation, 1–9. http://doi.org/10.1016/j.diin.2013.03.002</div></li><li><div style="margin-left: 24pt; text-indent: -24.0pt;">Shosha, A. F., James, J. I., & Gladyshev, P. (2012). A novel methodology for malware intrusion attack path reconstruction. In P. Gladyshev & M. K. Rogers (Eds.), Lecture Notes of the Institute for Computer Sciences, Social-Informatics and Telecommunications Engineering (Vol. 88 LNICST, pp. 131–140). Springer Berlin Heidelberg. http://doi.org/10.1007/978-3-642-35515-8_11</div></li><li><div style="margin-left: 24pt; text-indent: -24.0pt;">James, J., Gladyshev, P., Abdullah, M. T., & Zhu, Y. (2010). Analysis of Evidence Using Formal Event Reconstruction. In Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering (pp. 85–98). Springer Berlin Heidelberg. http://doi.org/10.1007/978-3-642-11534-9_9</div></li><li><div style="margin-left: 24pt; text-indent: -24.0pt;">James, J. I., & Gladyshev, P. (2014). Automated inference of past action instances in digital investigations. International Journal of Information Security. http://doi.org/10.1007/s10207-014-0249-6</div></li><li><div style="margin-left: 24pt; text-indent: -24.0pt;">James, J. I., & Gladyshev, P. (2013). A survey of digital forensic investigator decision processes and measurement of decisions based on enhanced preview. Digital Investigation, 10(2), 148–157. http://doi.org/10.1016/j.diin.2013.04.005</div></li><li><div style="margin-left: 24pt; text-indent: -24.0pt;"></div><div style="margin-left: 24pt; text-indent: -24.0pt;">James, J. I., Lopez-Fernandez, A., & Gladyhsev, P. (2014). Measuring Accuracy of Automated Parsing and Categorization Tools and Processes in Digital Investigations. In Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering (pp. 147–169). Springer International Publishing. http://doi.org/10.1007/978-3-319-14289-0_11</div></li><li><div style="margin-left: 24pt; text-indent: -24.0pt;"></div><div style="margin-left: 24pt; text-indent: -24.0pt;">James, J. I., & Gladyshev, P. (2013). Challenges with Automation in Digital Forensic Investigations, 17. Computers and Society. Retrieved from http://arxiv.org/abs/1303.4498</div></li></ol></div><div><div style="margin-left: 24pt; text-indent: -24.0pt;">
</div><div style="margin-left: 24pt; text-indent: -24.0pt;">
</div></div><div>
</div><div>
</div><div>
</div><div>
</div><div>
</div></div>

4 min read

Philipp Amann interviewed about robustness and resilience in digital forensics laboratories

Forensic Focus recently interviewed Philipp Amann, Senior Strategic Analyst, Europol about our DFRWS EU 2015 paper “Designing robustness and resilience in digital investigation laboratories”. Philipp and his team are doing some great work that is definitely worth following. See the full interview here.
<div class="separator" style="clear: both; text-align: center;"></div>

~1 min read

[BoB] Anti Forensics Techniques and eForensics Mag


<div style="clear: right; float: right; margin-bottom: 1em; margin-left: 1em;">
</div>
As a mentor with KITRI’s “Best of the Best v2.0” information security education program, I worked (and still work) with a digital forensic analysis research group. This group focused specifically on anti-forensic action detection, which fits pretty closely with my dissertation work. The first group members produced a brief survey of anti-forensics encountered in the ‘wild’ by Korean Law Enforcement. The main contents of the survey are in Korean because I forgot to post an English version…

From the two groups working on the same project, a number of similar tools have been created. I’ve forked the main modules, which can be found under IoAF on GitHub. Please feel free to contribute, or even fork the projects. We are continuing the project this summer, so hopefully cleaner, consolidated code will be available.

<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: right; margin-left: 1em; text-align: right;"><tbody><tr><td style="text-align: center;">eForensics Magazine - Anti Forensics Techniques</td></tr><tr><td class="tr-caption" style="text-align: center;">eForensics Magazine: Anti Forensics Techniques</td></tr></tbody></table>While the first IoAF group is working on a paper for Digital Investigation, the second group decided to write an article about A general approach to anti-forensic activity detection. This article gives a pretty good literature review about some of the work done in general anti-forensic detection, then shows the investigators how to determine traces created by anti-forensic programs. The work is somewhat similar to the work of Geiger on ‘counter forensics’, but - I believe - the proposed method is easier for investigators to implement or even automate.

Their article can be found in eForensics Magazine Vol. 3 No. 5.
<div style="margin-left: 24pt; text-indent: -24.0pt;">
</div>While the developed tools are currently available on github, the next few months will see them refined. Stay tuned!

1 min read

Help us understand Mutual Legal Assistance and win a FIREBrick Write Blocker

Please help DigitalFIRE Labs understand the current state of Mutual Legal Assistance Requests relating to digital evidence, and be entered for a chance to win a FIREBrick write-blocker or an Amazon gift card.

The survey on Mutual Legal Assistance Requests Concerning Digital Evidence can be found here: http://goo.gl/gnrJtN

This survey has been commissioned by the United Nations Office on Drugs and Crime (UNODC), in conjunction with the Digital Forensic Investigation Research Laboratory (DigitalFIRE), to assess existing approaches to requesting and obtaining electronic evidence in international cooperation under the conditions of Mutual Legal Assistance Treaties.

The survey consists of 36 questions, which will take approximately 20 minutes to complete.

For any questions or comments about the following survey, please email [email protected]

To help improve the effectiveness of mutual legal assistance requests, please share this survey with your colleagues. Thank you.

Image courtesy of mrpuen / FreeDigitalPhotos.net

~1 min read

Comparing Similarity of Images using SIFT Features

I’ve been playing around with VLFeat, and specifically SIFT, to compare images using SIFT feature extraction. A while back I looked at comparing files and images using sdhash and ssdeep, and they did not work well with images (which completely makes sense!).

So I was looking at some computer vision implementations, and found Programming Computer Vision with Python. Using a basic example from the book, we can now visually compare similarity on the kitty corpus used last time.

5a762d8cdf4f1beae208595e79990a01 /corpus/kitty_hex.jpg
1704cd46c5c0f994278769e533015525 /corpus/kitty_sm.jpg
bcbed42be68cd81b4d903d487d19d790 /corpus/kitty_text.jpg
6d5663de34cd53e900d486a2c3b811fd /corpus/kitty_orig.jpg
4312932e8b91b301c5f33872e0b9ad98 /corpus/kitty_whirl.jpg

comparing corpus/kitty_text.jpg corpus/kitty_sm.jpg number of matches = 107
comparing corpus/kitty_text.jpg corpus/kitty_orig.jpg number of matches = 375
comparing corpus/kitty_text.jpg corpus/kitty_hex.jpg number of matches = 375
comparing corpus/kitty_text.jpg corpus/kitty_whirl.jpg number of matches = 358
comparing corpus/kitty_sm.jpg corpus/kitty_orig.jpg number of matches = 108
comparing corpus/kitty_sm.jpg corpus/kitty_hex.jpg number of matches = 108
comparing corpus/kitty_sm.jpg corpus/kitty_whirl.jpg number of matches = 88
comparing corpus/kitty_orig.jpg corpus/kitty_hex.jpg number of matches = 389
comparing corpus/kitty_orig.jpg corpus/kitty_whirl.jpg number of matches = 343
comparing corpus/kitty_hex.jpg corpus/kitty_whirl.jpg number of matches = 343

Just by extracting SIFT features and comparing which features match, we can do pretty well at identifying similar images. As a reference, see an unrelated image compared to a kitty image:

comparing corpus/kitty_text.jpg corpus/cheese.jpg number of matches = 0

It is interesting to note that even if the image is modified (the swirled face, for example), similarity to the original image is still relatively high. The lowest performance was seen when the image size was reduced, which is probably because fewer features are extracted from the smaller image. Note that in this experiment the only image pre-processing we do is conversion to grayscale. We are not resizing, doing PCA, or anything like that.
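For anyone who wants to reproduce a similar experiment without VLFeat, the sketch below uses OpenCV’s SIFT implementation with Lowe’s ratio test to count matches. It is a stand-in for, not the exact code behind, the numbers above:

import cv2  # requires OpenCV >= 4.4 (or opencv-contrib for older versions)

def sift_matches(path_a, path_b, ratio=0.75):
    # Detect SIFT keypoints/descriptors on grayscale images, as above
    sift = cv2.SIFT_create()
    img_a = cv2.imread(path_a, cv2.IMREAD_GRAYSCALE)
    img_b = cv2.imread(path_b, cv2.IMREAD_GRAYSCALE)
    _, des_a = sift.detectAndCompute(img_a, None)
    _, des_b = sift.detectAndCompute(img_b, None)
    # Brute-force match, then keep only matches that are clearly better
    # than the second-best candidate (Lowe's ratio test)
    pairs = cv2.BFMatcher().knnMatch(des_a, des_b, k=2)
    return sum(1 for p in pairs
               if len(p) == 2 and p[0].distance < ratio * p[1].distance)

print(sift_matches("corpus/kitty_orig.jpg", "corpus/kitty_whirl.jpg"))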

I also want to point out something:
comparing corpus/kitty_text.jpg corpus/cheese_text.jpg number of matches = 2

In this case, two different images have the same small bit of text inserted into them. Feature detection was able to find some similarity (the text looked the same) between completely different images. This could potentially be used to determine whether a string or watermark was added to a group of pictures in a directory.

1 min read

South Korean National Cyber Terrorism Prevention Act (English)

An unofficial English translation of the proposed South Korean National Cyber Terrorism Prevention Act.

The recently proposed South Korean National Cyber Terrorism Prevention Act: [Korean PDF]
[English PDF]


<div style="text-align: center;">National Anti-Cyberterrorism Act (Legislative Bill)</div>(April 9, 2013)
Rationale
The cyberspace, which is a virtual space connecting computers and other information technology
devices through networks, now has not only become a common space for the daily lives of all
citizens, but also transcends borders among states and public/private sector.
Due to its uniqueness, both public and private sectors cannot unilaterally block all cyber-attacks
that occur beyond space and time. Furthermore, unlike disorder in the physical world, a single
person can potentially cause national crisis by launching a cyber terror attack.
Cyber crisis can have enormous influence on the whole society. Attacks on national major
information communications network as shown in the case of 1.25 Internet crisis(Slammer Worm),
and organized cyber terrorism from overseas that can lead to leakage of national secrets and
technology, is increasing everyday, and is more likely to occur in the future.
However, Korea has yet to establish a systematic national anti-cyberterrorism plan or policies and
specific procedures to manage such crises. In case of a cyber crisis, this situation may lead to severe
danger and serious damage to national security and national interest.
Therefore this bill proposes the establishment of a national comprehensive response system
involving the government and the private sector, hence detect cyber terror attacks in advance to
prevent national cyber crisis and concentrate national capacity in times of crisis to take prompt
action.
Executive Summary
A. The Director of the National Intelligence Service (hereinafter NIS) may organize and operate a
public-private consultative body in order to efficiently manage cyber crisis and mutually share
information related to cyber-attacks (Article 5).
B. The Director of the NIS should establish the master plan of national cyber terror prevention and
crisis management and construct an implementation plan to distribute to the Heads of responsible
agencies (Article 7).
C. Establishment of the National Cyber Security Center under the Director of the NIS for a
comprehensive and systematic national response to cyber terrorism and management of cyber crisis
(Article 9).
D. Heads of responsible organizations should either establish and operate a Security Management
Centre capable of detecting and analyzing cyber-attack information and promptly responding to such
threats, or delegate the task to a Security Management Centre established and operated by other
institutions (Article 12).
E. Heads of central administrative agencies should immediately carry out investigations when
damage occurs due to a cyber terror attack, and should notify the Heads of related administrative
agencies and the Director of the NIS in cases of critical damage (Article 13).
F. The Director of the NIS can issue the Cyber Crisis Alert in the interest of systematic response to
cyber terrorism, and the Heads of responsible organizations should take appropriate measures to
minimize or restore damage (Article 15).
G. The government may organize and operate a Cyber Crisis Response Headquarters, consisting of
related institutions and professionals, for analysis, investigation, immediate response, and damage
restoration when a Cyber Crisis Alert at the “Alert” level or above is issued (Article 16).
H. The government may carry forward policy measures necessary for cyber crisis management,
such as technology development, international cooperation, industry development, and human
resource development (Article 19, 20, and 21).
I. The government may give monetary rewards to those who provide information on cyber terror
attempts or report cyber terrorists (Article 24).
J. Those who divulge official secrets shall be sentenced to a maximum of five years in prison or
fined a maximum of 30 million won. Those who have not established a Security Management Centre may
be fined a maximum of 20 million won for negligence (Articles 25 and 26).
*Legislative Bill for National Anti-Cyberterrorism Act
Chapter 1 General Provisions
Article 1 (Purpose)
The purpose of this Act is to contribute to national security and the national interest by
stipulating the fundamental matters of national cyber terror prevention, in order to prevent cyber
terror which threatens national security and to enable prompt response by concentrating national
capacity in cases of cyber crises.
Article 2 (Definitions)
1) The definitions of the terms used in this Act are as follows.
1. “Cyber Terrorism” refers to all offensive actions that intrude upon, disturb, paralyze, or
destroy information telecommunication infrastructure, or actions related to information theft,
damage and dissemination of distorted information by electronic means such as hacking, computer
viruses, denial of service, and electromagnetic waves.
2. “Cyber Security” refers to measures and responses through administrative, physical, and
technological means in order to protect information telecommunication infrastructure and
information from cyber terrorism, and includes cyber crisis management.
3. “Cyber Crisis” refers to a situation in which a cyber terror attack causes serious disruption to the
function of the state and society or could potentially spread damage nationwide.
4. “Cyber Terror information” refers to information on actions determined by information systems
and information security systems (including software) to be cyber terrorism, and includes the IP
addresses and MAC addresses used to identify the source of the cyber attack.
5. “Cyber Crisis Alert” refers to alerts issued in order to provide cyber threat information when a
cyber crisis is expected to occur, thereby enabling related organizations to take appropriate
measures according to the threat level.
6. “Cyber Crisis Management” refers to all national-level measures and actions pertaining to cyber
terror detection, response, investigation, damage restoration, mock exercises, issuance of
warnings, and cooperation among agencies, in order to prevent cyber crises and to take prompt,
systematic action in case of a cyber crisis.
7. “Agencies Responsible for Cyber Terror Prevention and Crisis Management (“Responsible
Agencies”)” refers to the following agencies performing tasks related to cyber terror prevention
and crisis management.
a. National agencies and local autonomous entities established by the <Constitution
(대한민국헌법)>, the <Government Organization Act (정부조직법)>, or other legislation, and public
institutions according to Article 3, paragraph 10 of <Framework Act on National
Informatization (국가정보화 기본법)>
b. Agencies in charge of managing major information and communications infrastructure
according to Article 5, paragraph 1 of <Act on the Protection of Information and
Communications Infrastructure (정보통신기반 보호법)>
c. Business operators of clustered information and communications facilities according to Article
46, paragraph 1, and major providers of information and communications services according
to Article 47-3, paragraph 2 of <Act on Promotion of Information and Communications
Network Utilization and Information Protection, etc. (정보통신망 이용촉진 및 정보보호
등에 관한 법률)>
d. Corporations or research institutes which hold National Core Technology according to
Article 9 of <Act on Prevention of Divulgence and Protection of Industrial Technology
(산업기술의 유출방지 및 보호에 관한 법률)>
e. Defense contractors according to Article 3, paragraph 9, and specialized research institutes
according to Article 3, paragraph 10 of <Defense Acquisition Program Act (방위사업법)>
8. “Support Institution for Anti-Cyberterrorism and Crisis Management (“Support Institution”)”
refers to the following institutions or corporations which support the prompt detection, response,
investigation, and damage restoration related to cyber terror.
a. The research institute affiliated with the Electronics and Telecommunications Research
Institute (ETRI) according to Article 8 of <Act for the Establishment, Management and Promotion of
Government-supported Research Institutes in the Science and Technology Sector
(과학기술분야 정부출연연구기관 등의 설립 운영 및 육성에 관한 법률)>
b. Korea Internet Security Agency (KISA) according to Article 52 of <Act on Promotion of
Information and Communications Network Utilization and Information Protection, etc.
(정보통신망 이용촉진 및 정보보호 등에 관한 법률)>
c. Software business operators who produce or sell antivirus software according to Article 24 of
<Software Industry Promotion Act (소프트웨어산업 진흥법)>
d. Producers or importers of information protection system according to Article 3, paragraph 9
of <Framework Act on National Informatization (국가정보화 기본법)>
e. Designated consulting companies specialized in knowledge and information security
according to Article 9 of <Act on the Protection of Information and Communications
Infrastructure (정보통신기반보호법)>
f. Companies specialized in security management designated by the Head of related
administrative agencies
2) Apart from the terms defined above, other terms used in this Act accord with the definitions in
<Act on the Protection of Information and Communications Infrastructure (정보통신기반 보호법)>,
<Act on Promotion of Information and Communications Network Utilization and Information
Protection, etc. (정보통신망 이용촉진 및 정보보호 등에 관한 법률)>, <Framework Act on
National Informatization (국가정보화 기본법)>, and <Telecommunications Business Act
(전기통신사업법)>.
Article 3 (Relation to Other Legislation)
This Act shall be applied except where clauses of other legislation specifically regulate the
prevention of cyber terror and crisis management. However, if a cyber crisis occurs, this Act shall
supersede other legislation in application.
Article 4 (Obligation of Responsible Agencies)
Heads of Responsible Agencies are responsible for maintaining the safety of information
communications facilities within their jurisdiction in order to prevent cyber terror. Accordingly,
they should devise measures to secure a professional workforce dedicated to, and the budget
necessary for, the prevention of cyber terrorism and crisis management.
Article 5 (Public-Private Cooperation)
1) The government may organize joint consultative bodies with private institutions according to
Presidential Decree in order to efficiently manage cyber crises and to cooperate in preventing
cyber terror and taking appropriate countermeasures.
2) Other considerations necessary for paragraph 1 shall be prescribed by Presidential Decree.
Chapter 2 National Cyber Terror Prevention and Crisis Management System
Article 6 (National Cyber Security Strategy Council)
1) The National Cyber Security Strategy Council (hereinafter referred to as “Strategy Council”)
shall be established under the Director of the NIS to deliberate important matters regarding
national anti-cyberterrorism and crisis management.
2) The Director of the NIS chairs the Strategy Council.
3) The Strategy Council shall consist of vice-minister-level officials of central administrative
agencies and those appointed by the Chairman of the Strategy Council (the Director of the NIS).
4) The Strategy Council shall deliberate the following.
1. Establishment and improvement of cyber terror prevention and crisis management strategies,
policies, and institutions
2. Problems pertaining to role adjustment among agencies related to cyber terror prevention and
crisis management
3. Problems pertaining to sharing and protecting cyber threat information according to Article
12, paragraph 2
4. Other problems presented by the Chairman of the Strategy Council (the Director of the NIS) or
submitted by council members
5) The Strategy Council may set up a National Cyber Security Countermeasure Council (hereinafter
referred to as “Countermeasure Council”) in order to operate the Strategy Council efficiently.
6) Other specific matters pertaining to the organization and operation of the Strategy Council or the
Countermeasure Council shall be prescribed by Presidential Decree.
Article 7 (Establishment of a Master Plan for National Anti-cyberterrorism and Crisis
Management)
1) The Government should establish and implement a National Anti-cyberterrorism and Crisis
Management Master Plan (hereinafter referred to as “Master Plan”) in order to efficiently and
systematically push forward measures for cyber terror prevention and crisis management.
2) The Master Plan shall be drawn up through deliberation by the Strategy Council and discussion
between the Director of the NIS and the Heads of related central administrative agencies,
according to Presidential Decree.
3) Heads of central administrative agencies should devise a National Anti-cyberterrorism and
Crisis Management Implementation Plan (hereinafter referred to as “Implementation Plan”) and
disseminate it to the Heads of their sub-agencies so that they can implement the Master Plan under
paragraph 1.
Article 8 (Confirmation of Compliance to the Implementation Plan and Report to the
National Assembly)
1) Heads of central administrative agencies should annually confirm their sub-agencies' compliance
with the Implementation Plan.
2) The Director of the NIS shall gather the confirmation results under paragraph 1, conduct an
inspection of the actual state of national cyber terror prevention and crisis management, and
report the results to the National Assembly. However, inspection and assessment of the National
Assembly, the Courts, the Constitutional Court, and the National Election Commission shall be
conducted only when requested by each entity.
3) Procedures and means necessary for paragraphs 1 and 2 shall be prescribed by Presidential
Decree.
Article 9 (Establishment of National Cyber Security Center)
1) For comprehensive and systematic state-level prevention of and response to cyber terrorism and
for cyber crisis management, a National Cyber Security Center (hereinafter referred to as
“Security Center”) shall be established under the Director of the NIS.
2) The Security Center shall undertake the following duties.
1. Establishment of national anti-cyberterrorism and crisis management policies
2. Support for the Strategy Council and the operation of the Countermeasure Council
3. Collection, analysis and dissemination of information related to cyber terror
4. Securing the safety of the national information communications network
5. Devising and disseminating national cyber terror prevention and crisis management manuals
6. Investigating cyber terror cases and supporting recovery from damage
7. Cooperation with other countries in sharing cyber attack information
3) The Director of the NIS may set up and operate a Private-Public-Military Joint Response Team
(hereinafter referred to as “Joint Response Team”) in order to support comprehensive judgment,
situation management, threat causation analysis, investigation, etc. under paragraph 1 when
necessary.
4) The Director of the NIS may request the Heads of central administrative agencies or other
support agencies to dispatch workforce and provide equipment necessary for the establishment and
operation of the Joint Response Team when needed.
Chapter 3. Anti-cyberterrorism and Cyber Crisis Management Activities
Article 10 (Establishment and Operation of Anti-Cyberterrorism Policy)
1) To secure the safety and reliability of information and information networks, Heads of
Responsible Agencies shall establish an anti-cyberterrorism policy.
2) The Director of the NIS may prepare and deliver guidelines necessary for the establishment of
anti-cyberterrorism policy. In this case, the Director of the NIS shall consult with the Heads of
the central administrative agencies concerned in advance.
3) Paragraph 2 applies to the administrative agencies of the National Assembly, the Courts, the
Constitutional Court, and the National Election Commission only when the head of the agency or
institution concerned considers it necessary.
Article 11 (Precluding the Spread of Malicious Programs)
1) If the Government becomes aware of websites or software containing malicious programs, it shall
provide related information to their operators so that they can take the security measures
necessary to preclude the spread of the malicious programs.
2) The Government may, if it judges that the compromised websites or software are highly likely to
be misused and remain potentially dangerous despite measures under paragraph 1), delete or block
them using anti-virus programs.
3) Specific details of the measures necessary pursuant to paragraphs 1) and 2) shall be prescribed
by Presidential Decree.
Article 12 (Establishment of Security Control Centers, etc.)
1) Heads of Responsible Agencies shall establish and manage an organization capable of detecting
and analyzing cyberterrorism information and responding immediately, or entrust the work to a
security control center established and managed by one of the agencies in the following
subparagraphs. An Information Sharing & Analysis Center in accordance with Article 16 of the Act
on the Protection of Information and Communications Infrastructure is deemed a security control
center:
1. Relevant central administrative agencies
2. The National Intelligence Service
3. Specialized security control corporations in accordance with Article 2, paragraph 1),
subparagraph f
2) Heads of Responsible Agencies shall share cyberterrorism information and information on the
vulnerabilities of information networks or software, etc. under paragraph 1) (hereinafter referred
to as “Cyber Threat information”) with the heads of relevant agencies and the Director of the NIS.
3) For the efficient management and operation of Cyber Threat information under paragraph 2), the
Director of the NIS may establish and manage a Cyber Threat information integration system
together with the heads of relevant agencies.
4) Any person shall use and manage the information shared in accordance with paragraph 2) fairly,
and only within the scope of Cyber Crisis management.
5) Matters concerning the security control centers under paragraph 1), the establishment,
management and information handling of the Cyber Threat information integration system under
paragraph 3), and the scope, process and method of Cyber Threat information sharing under
paragraph 2) shall be prescribed by Presidential Decree.
Article 13 (Incident Investigation)
1) When damage occurs within the jurisdiction of a central administrative agency, the head of that
agency shall swiftly conduct an incident investigation into the cause and extent of the damage,
and in cases of severe damage or concern that the damage may spread, shall immediately notify the
results to the heads of related central administrative agencies and the Director of the NIS.
2) Notwithstanding paragraph 1), when the case is deemed to have severe impacts on national
security and the national interest, the Director of the NIS may consult the related central
administrative agency and conduct the investigation on its own.
3) When swift action is deemed necessary for restoration and the prevention of further damage,
based on the notification of investigation results under paragraph 1) or the NIS’s own
investigation under paragraph 2), the Director of the NIS may request necessary measures from the
heads of responsible agencies. The heads of responsible agencies must abide by the request if
there is no substantial reason not to.
4) No person shall delete, damage or modify data related to cyber terror before the incident
investigations under paragraphs 1) and 2) are complete.
Article 14 (Response Training)
1) The Government shall execute training programs to prevent cyber terror and to respond
systematically and efficiently.
2) Training under paragraph 1) may be executed regularly every year or conducted as occasion
demands, and regular training may be executed together with the Emergency Preparedness Training
under Article 14 of <Emergency Preparedness Resources Management Act (비상대비자원관리법)>.
3) Necessary matters, including the execution method and procedure of the training under
paragraph 1), shall be prescribed by Presidential Decree.
Article 15 (Announcement of Cyber Crisis Alert)
1) To systematically prepare for and respond to cyber terror, the Director of the NIS may announce
a four-stage Cyber Crisis Alert (Attention/Caution/Alert/Critical), based either on the request of
the head of a responsible agency or on the compilation and judgment of intelligence gathered under
Article 12, paragraph 2).
2) When a cyber terror attack is deemed to cause serious damage to national security, the Director
of the NIS may consult the Senior Secretary to the President for National Crisis Control in the
National Security Department to announce a Critical-level Alert, and when a Critical-level Alert
is announced, shall inform the National Assembly of the reason for the announcement.
3) The Director of the NIS shall consult on the level of a Cyber Crisis Alert prior to its
announcement.
4) Heads of responsible agencies should take measures to minimize damage and to foster restoration
immediately after the announcement of a Cyber Crisis Alert under paragraph 1).
5) Necessary matters regarding the procedure, criteria, measures by heads of responsible agencies,
etc. shall be prescribed by Presidential Decree.
Article 16 (Organizing Cyber Crisis Response Headquarters)
1) In the case of a Cyber Crisis Alert at the “Alert” level or above, the Government may organize
and operate a Cyber Crisis Response Headquarters (hereinafter referred to as “Response HQ”), in
which experts from the private sector, the Government and the military participate to concentrate
national capacity, for swift response measures such as cause analysis, investigation, emergency
response, and damage recovery.
2) The head of the Response HQ (hereinafter referred to as “Head”) shall be the Director of the
NIS, and necessary matters regarding the organization and operation of the Response HQ shall be
determined after the Director of the NIS consults with the heads of related central administrative
agencies.
3) The Head may request needed workforce and equipment from the heads of responsible agencies and
supporting institutions in order to organize and operate the Response HQ under paragraph 1). The
heads of responsible agencies and supporting institutions must abide by the request if there is no
substantial reason not to.
4) The Head may reimburse the expenses incurred by the heads of agencies that dispatched workforce
and equipment to the Response HQ.
Article 17 (Technical Assistance)
1) When the head of a responsible agency needs support in performing duties under Article 12,
paragraph 1) and Article 15, paragraph 4), he/she may request support from the head of the related
central administrative agency or the Director of the NIS.
2) The head of the related central administrative agency or the Director of the NIS shall take
necessary measures, including technical assistance, upon receiving a request under paragraph 1),
so as to ensure a swift response.
3) For the support under paragraph 2), the head of the related central administrative agency or
the Director of the NIS may request support from a Support Institution, and shall provide the
content and period of the support beforehand.
4) The head of the related central administrative agency or the Director of the NIS may reimburse
the expenses incurred by the heads of agencies for the support under paragraph 3).
Chapter 4. R&D, Support, etc.
Article 18 (Provision for Responsible Agencies)
The Government may provide Responsible Agencies with necessary technical assistance, equipment and
other needed support in order to protect information and communications networks.
Article 19 (R&D)
1) The Government may promote the following policies for the development and improvement of
technology necessary for Cyber Terror prevention and Cyber Crisis management:
1. Establishment and operation of a national research and development plan for Cyber Terror
prevention and Cyber Crisis management
2. Surveys on the needs of Cyber Terror prevention and Cyber Crisis management technology and
projects on trend analysis
3. Projects for the development, supply and distribution of Cyber Terror prevention and Cyber
Crisis management technology
4. Other necessary matters concerning the development and improvement of Cyber Terror prevention
and Cyber Crisis management technology
2) In accordance with paragraph 1), the Director of the NIS may establish a research institute or
designate an agency established under other legislation as a specialized agency.
3) The Director of the NIS may settle detailed matters concerning the process and method of
developing Cyber Terror prevention and Cyber Crisis management technology.
Article 20 (Industry Promotion)
1) To promote and support industry related to Cyber Terror prevention and Cyber Crisis management
technology, the Government shall establish and operate the following policies and devise methods
to secure the necessary finances:
1. Support for policy establishment regarding the Cyber Terror prevention and Cyber Crisis
management industry
2. Support for market invigoration for Cyber Terror prevention and Cyber Crisis management
technology development
3. Establishment of industry-academic cooperation for the promotion of the Cyber Terror prevention
and Cyber Crisis management industry
4. Support for international exchange and cooperation, and overseas expansion, of the Cyber Terror
prevention and Cyber Crisis management industry
2) The Government may have a research institute or specialized agency under Article 19,
paragraph 2) carry out the tasks necessary for the industry promotion under paragraph 1).
Article 21 (Education, Training and Public Awareness)
To establish the foundations of Cyber Terror prevention and Cyber Crisis management, and to
improve public awareness of cyber crises, the Government shall devise the following measures:
1. Manpower training for Cyber Terror prevention and Cyber Crisis management
2. Publicity and education about Cyber Terror prevention and Cyber Crisis management
3. Other necessary matters concerning education, training and public awareness of Cyber Terror
prevention and Cyber Crisis management
Article 22 (International Cooperation)
The Government may execute the following matters to strengthen cooperation with international
organizations, agencies and foreign countries regarding anti-cyberterrorism and Cyber Crisis
management:
1. Construction of a mutual cooperation system for cyber terror prevention and Cyber Crisis
management
2. Sharing of information and response coordination in technologies regarding anti-cyberterrorism
and Cyber Crisis management
3. Secondment and training of personnel in charge of anti-cyberterrorism and Cyber Crisis
management
Article 23 (Confidentiality)
Any person who is or was engaged in work related to the prevention of cyber terror and Cyber
Crisis management affairs shall not divulge to another person any secret that he/she has learned
while performing his/her duties, nor use it for any purpose other than the performance of his/her
duties.
Article 24 (Reward, etc.)
1) Regarding the prevention of cyber terror and Cyber Crisis management, the Director of the NIS
may award a prize to any person falling under the following subparagraphs, and may provide
monetary rewards within the limits of the budget.
1. A person who provided information on an attempted cyber terror attack
2. A person who reported the actor(s) of a cyber terror attack
3. A person who provided outstanding service in detecting, responding to, and recovering from a
cyber terror attack
2) The criteria, manner and procedure, specific amounts to be paid, and other necessary matters in
awarding prizes and rewards shall be determined by the Director of the NIS.
Chapter 5. Penal Provisions
Article 25 (Penal Provisions)
1) Any person falling under any of the following subparagraphs shall be punished by imprisonment
with prison labor for at most five years or by a fine not exceeding 30 million won:
1. Any person in violation of Article 12, paragraphs 2) and 4)
2. Any person in violation of Article 13, paragraph 4)
3. Any person in violation of Article 23
2) Any person who commits an offense under paragraph 1) due to negligence in the conduct of
business shall be punished by imprisonment with prison labor for at most two years or by a fine
not exceeding 10 million won.
Article 26 (Penalty)
1) Any person falling under any of the following subparagraphs shall be charged a penalty not
exceeding 20 million won:
1. Any person in violation of Article 13, paragraph 1)
2. Any person in violation of Article 16, paragraph 3)
2) Any person in violation of Article 13, paragraph 3) shall be charged a penalty not exceeding
10 million won.
3) Penalties under paragraphs 1) and 2) shall be imposed and collected by the related central
administrative agency as prescribed by Presidential Decree.

This unofficial translation was made for academic research and review.

Any comments and/or suggestions will be very helpful for the improvement of the cyber-security of
Korea and the world. Please leave a comment, or email me.
19 min read

South Korean National Cyber Terrorism Prevention Act (Korean)

The recently proposed South Korean National Cyber Terrorism Prevention Act: [Korean PDF] [English PDF]


êµ?? ?¬ì´ë²„í…Œ??ë°©ì???ê´€??법률??br />(?œìƒê¸°ì˜???€?œë°œ??
????br />ë²???br />4459
발의?°ì›”??: 2013. 4. 9.
ë°?br />??br />??: ?œìƒê¸?조명ì²??¤ìž¬??br />ê°•ì????¤ìƒ???•ë¬¸??br />민병ì£??¬í•™ë´??´ì² ??br />ê¹€?¥ì‹¤.? ê²½ë¦?ê¹€?¬ê²½
?¡ì˜ê·??˜ì›(13??
?œì•ˆ?´ìœ 
?¬ì´ë²„공간ì? ?•ë³´?µì‹ ê¸°ìˆ ??비약?ì¸ 발전ê³??”불???•ë³´ê¸°ê¸°?€
컴퓨??그리ê³??¸í„°???±ì˜ ?¤íŠ¸?Œí¬ë¡??°ê²°??ê°€?ì˜ 공간?¼ë¡œ ?´ë?
êµ?? ?í™œ??보편?ì¸ ?ì—­?¼ë¡œ ?ë¦¬ë§¤ê??˜ì?ê³? êµ?²½??초월?˜ì—¬ ë²?br />지구적?´ë©´???•ë??€ 민간부분이 ?í˜¸ ë°€?‘히 ?°ê³„?˜ì–´ ?ˆìŒ.
?´ëŸ¬???¹ìˆ˜?±ìœ¼ë¡?말ë??”ì•„ 복잡.고도?”되ë©? ?œê³µê°„의 ?œì•½??ë²?br />?´ë‚˜ 발생?˜ëŠ” 모든 ?¬ì´ë²„공격을 ?•ë??€ 민간 ?´ëŠ ?˜ë‚˜???¨ë…?¼ë¡œ
차단?˜ê¸°?ëŠ” 분명???œê³„ê°€ ?ˆìŒ. 게다가 ?¬ì´ë²„í…Œ?¬ë¡œ 초래?˜ëŠ” ?¬ì´
버상???„기???„실?¸ê³„??물리??질서?¼ë?ê³??¬ë¦¬ ?¹ì •ê°œì¸???€??br />것일지?¼ë„ êµ???„ì²´???„기ë¡??•ë??????ˆìŒ.
- 1 – 2 -
그리ê³?과거 1.25 ?¸í„°???€?€ê³?ê°™ì? ?„êµ­?ì¸ 규모??êµ?? 주요
?•ë³´?µì‹ ë§?마비?¬íƒœ 발생ê³??´ì™¸ë¡œë???조직?ì¸ ?¬ì´ë²„í…Œ?¬ë¡œ êµ??
기ë? ë°?첨단기술??? ì¶œ ??êµ??.?¬íšŒ ?„ë°˜??중ë????í–¥??미칠
???ˆëŠ” ?¬ì´ë²„위ê¸?발생 ê°€?¥ì„±??? ë¡œ 증ë??˜ê³  ?ˆìŒ.
그러???°ë¦¬?˜ë¼???„직 êµ??차원?ì„œ ?¬ì´ë²„í…Œ??ë°©ì? ë°??„기관ë¦?br />?…무ë¥?체계?ìœ¼ë¡??˜í–‰?????ˆëŠ” ?œë„?€ 구체??방법.?ˆì°¨ê°€ ?•ë¦½
?˜ì–´ ?ˆì? ?Šì•„ ?¬ì´ë²„위ê¸?발생 ??êµ???ˆë³´?€ êµ?µ??중ë????„í—˜ê³?br />막ë????í•´ë¥??¼ì¹  ?°ë ¤ê°€ ?ˆìŒ.
?°ë¼????법에?œëŠ” ?•ë??€ 민간??참여??êµ??차원??종합?ì¸ ?€??br />체계ë¥?구축?˜ë„ë¡??˜ê³ , ?´ë? ?µí•˜???¬ì´ë²„í…Œ?¬ë? ?¬ì „???ì??˜ì—¬
?¬ì´ë²„위ê¸?발생 ê°€?¥ì„±??조기??차단?˜ë©°, ?„기 발생 ??êµ??????Ÿ‰??br />결집?˜ì—¬ ? ì†???€?‘í•  ???ˆë„ë¡??˜ê³ ????
주요?´ìš©
ê°€. êµ???•ë³´?ìž¥?€ ?¬ì´ë²„위기ë? ?¨ìœ¨?ìœ¼ë¡?관리하ê³??¬ì´ë²„ê³µê²?br />ê´€?¨ì •ë³´ë? ?í˜¸ 공유?˜ê¸° ?„하??ë¯?ê´€ ?‘의체ë? 구성.?´ì˜??br />???ˆìŒ(????ì¡?.
?? êµ???•ë³´?ìž¥?€ êµ???¬ì´ë²„í…Œ??ë°©ì? ë°??„기관??기본계획??br />?˜ë¦½?˜ê³  ?´ì— ?°ë¼ ?œí–‰ê³„획???‘성?˜ì—¬ 책임기ê????¥ì—ê²?ë°?br />?¬í•˜?¬ì•¼ ??????ì¡?.?? ?¬ì´ë²„í…Œ?¬ì— ?€??êµ??차원??종합?ì´ê³?체계?ì¸ ?€?‘ê³¼ ??br />?´ë²„?„기관리ë? ?„하??êµ???•ë³´?ìž¥ ?Œì†?¼ë¡œ êµ???¬ì´ë²„안?„센
?°ë? ??????ì¡?.
?? 책임기ê????¥ì? ?¬ì´ë²„ê³µê²??•ë³´ë¥??ì?.분석?˜ì—¬ 즉시 ?€??br />?????ˆëŠ” 보안관?œì„¼?°ë? 구축.?´ì˜?˜ê±°???¤ë¥¸ 기ê???êµ?br />ì¶??´ì˜?˜ëŠ” 보안관?œì„¼?°ì— ê·??…무ë¥??„탁?˜ì—¬????????2
ì¡?.
ë§? 중앙?‰ì •ê¸°ê????¥ì? ?¬ì´ë²„í…Œ?¬ë¡œ ?¸í•´ ?¼í•´ê°€ 발생??경우??br />??? ì†?˜ê²Œ ?¬ê³ ì¡°ì‚¬ë¥??¤ì‹œ?˜ê³ , ?¼í•´ê°€ 중ë???경우 ê´€ê³?ì¤?br />?™í–‰?•ê¸°ê´€????ë°?êµ???•ë³´?ìž¥?ê²Œ ê·?ê²°ê³¼ë¥??µë³´?˜ì—¬????br />(????3ì¡?.
ë°? êµ???•ë³´?ìž¥?€ ?¬ì´ë²„í…Œ?¬ì— ?€??체계?ì¸ ?€??ë°??€ë¹„ë? ??br />?˜ì—¬ ?¬ì´ë²„위기경보ë? 발령?????ˆìœ¼ë©? 책임기ê????¥ì? ?¼í•´
발생??최소?”하거나 ?¼í•´ë³µêµ¬ 조치ë¥?취해????????5ì¡?.
?? ?•ë???경계?¨ê³„ ?´ìƒ???¬ì´ë²„위기경보ê? 발령??경우 ?ì¸ë¶?br />?? ?¬ê³ ì¡°ì‚¬, 긴급?€?? ?¼í•´ë³µêµ¬ ?±ì„ ?„하??ê´€ê³?기ê? ë°???br />문인?¥ì´ 참여?˜ëŠ” ?¬ì´ë²„위기ë?책본부ë¥?구성.?´ì˜?????ˆìŒ
(????6ì¡?.
?? ?•ë????¬ì´ë²„위기ê?리에 ?„ìš”??기술개발.êµ? œ?‘력·?°ì—…?¡ì„±Â·
?¸ë ¥?‘성 ???„ìš”???œì±…??추진?????ˆìŒ(????9ì¡? ??0ì¡?ë°?br />??1ì¡?.
- 3 – 4 -
?? ?•ë????¬ì´ë²„í…Œ??기도??ê´€???•ë³´ë¥??œê³µ?˜ê±°???¬ì´ë²„í…Œ??br />ë¥?ê°€???ë? ? ê³ ???ì— ?€?˜ì—¬ ?¬ìƒê¸ˆì„ 지급할 ???ˆìŒ(??br />??4ì¡?.
ì°? 직무??비ë????„설??경우?ëŠ” 5???´í•˜??징역 ?ëŠ” 3천만??br />?´í•˜??벌금??처하ê³? 보안관?œì„¼?°ë? 구축?˜ì? ?„니??경우??br />??2천만???´í•˜??과태료에 처할 ???ˆìŒ(????5ì¡?ë°???6ì¡?.법률 ??br />??br />êµ?? ?¬ì´ë²„í…Œ??ë°©ì???ê´€??법률??br />????총칙
??ì¡?목적) ??법ì? êµ?? ?¬ì´ë²„í…Œ??ë°©ì???ê´€??기본?ì¸ ?¬í•­??br />규정?˜ì—¬ êµ???ˆë³´ë¥??„협?˜ëŠ” ?¬ì´ë²„í…Œ?¬ë? ?ˆë°©?˜ê³  ?¬ì´ë²??„기
발생 ??êµ?? ??Ÿ‰??결집?˜ì—¬ ? ì†?˜ê²Œ ?€ì²˜í•¨?¼ë¡œ??êµ?????ˆì „
보장ê³??´ìµë³´í˜¸???´ë°”지?¨ì„ 목적?¼ë¡œ ?œë‹¤.
??ì¡??•ì˜) 1 ??법에???¬ìš©?˜ëŠ” ?©ì–´???•ì˜???¤ìŒê³?같다.
1. ?œì‚¬?´ë²„?ŒëŸ¬?ë? ?´í‚¹Â·ì»´í“¨??바이?¬ìŠ¤Â·?œë¹„?¤ë°©?´Â·ì „?ê¸°????br />?„자???˜ë‹¨???˜í•˜???•ë³´?µì‹ ?œì„¤??침입·교ë?·마비·?Œê´´?˜ê±°??br />?•ë³´ë¥??ˆì·¨Â·?¼ì†Â·?œê³¡?„파 ?˜ëŠ” ??모든 공격?‰ìœ„ë¥?말한??
2. ?œì‚¬?´ë²„?ˆì „?ì´?€ ?¬ì´ë²„í…Œ?¬ë¡œë¶€???•ë³´?µì‹ ?œì„¤ê³??•ë³´ë¥?보호
?˜ê¸° ?„하???˜í–‰?˜ëŠ” 관리적.물리??기술???˜ë‹¨ ë°??€?‘ì¡°ì¹?br />?±ì„ ?¬í•¨???œë™?¼ë¡œ???¬ì´ë²„위기ê?리ë? ?¬í•¨?œë‹¤.
3. ?œì‚¬?´ë²„?„기?ë? ?¬ì´ë²„í…Œ?¬ë¡œ ?¸í•˜??êµ??.?¬íšŒê¸°ëŠ¥???¬ê°??br />지?¥ì„ 초래?˜ê±°???¼í•´ê°€ ?„êµ­?ìœ¼ë¡??•ì‚°??ê°€?¥ì„±???ˆëŠ” ê²?br />?°ë? 말한??
- 5 – 6 -
4. ?œì‚¬?´ë²„?ŒëŸ¬?•ë³´?ë? ?•ë³´?œìŠ¤??ë°??•ë³´ë³´í˜¸?œìŠ¤???Œí”„?¸ì›¨?´ë?
?¬í•¨?œë‹¤) ?±ì— ?˜í•´ ?¬ì´ë²„í…Œ???‰ìœ„ë¡??ë‹¨?˜ëŠ” ?•ë³´ë¡œì„œ ?¬ì´ë²?br />?ŒëŸ¬ 근원지ë¥??Œì•…?˜ê¸° ?„í•œ ?¸í„°?·í”„로토콜주??IP)?€ ?¤íŠ¸?Œí¬
카드주소(MAC)ë¥??¬í•¨?œë‹¤.
5. ?œì‚¬?´ë²„?„기경보?ë? ?¬ì´ë²„í…Œ??징후ë¥??ë³„?˜ê±°???¬ì´ë²„위ê¸?br />발생???ˆìƒ?˜ëŠ” 경우 ê·??„í—˜ ?ëŠ” ?„협?˜ì???부?©ë˜??조치ë¥?br />?????ˆë„ë¡?미리 ?•ë³´ë¥??œê³µ?˜ê³  경고?˜ëŠ” 것을 말한??
6. ?œì‚¬?´ë²„?„기관리”ë? ?¬ì´ë²„위기ë? ?ˆë°©?˜ê³  ?¬ì´ë²„위기ê? 발생?˜ì?
??경우 ? ì†?˜ê³  체계?ìœ¼ë¡??€?‘하ê¸??„í•œ ?¬ì´ë²„í…Œ???ì?.?€??
?¬ê³ ??조사.복구, 모의?ˆë ¨, 경보발령 ë°?관계기관 간의 ?‘ì¡° ??br />êµ??차원??모든 ?œë™??말한??
7. ?œì‚¬?´ë²„?ŒëŸ¬ ë°©ì? ë°??„기관ë¦?책임기ê?(?´í•˜ ?œì±…?„기관?ì´??br />?œë‹¤)?ì´?€ ?¬ì´ë²„í…Œ??ë°©ì? ë°??„기관리에 ê´€???…무ë¥??˜í–‰??br />ê³??ˆëŠ” ?¤ìŒ ê°?목의 기ê???말한??
ê°€. ?Œë??œë?êµ?—Œë²•ã€? ?Œì •ë¶€ì¡°ì§ë²•ã€? ê·?밖의 법령???°ë¼ ?¤ì¹˜??br />êµ??기ê?ê³?지방자치단ì²?및「국가?•ë³´??기본법」제3ì¡°ì œ10??br />???°ë¥¸ 공공기ê?
?? ?Œì •ë³´í†µ? ê¸°ë°?보호법ã€???ì¡°ì œ1??— ?°ë¥¸ 주요?•ë³´?µì‹ ê¸?br />반시?¤ì„ 관리하??기ê?
?? ?Œì •ë³´í†µ? ë§ ?´ìš©ì´‰ì§„ ë°??•ë³´ë³´í˜¸ ?±ì— ê´€??법률????6ì¡?br />????— ?°ë¥¸ 집적?•ë³´?µì‹ ?œì„¤?¬ì—…??ë°?ê°™ì? ë²???7조의4????— ?°ë¥¸ 주요?•ë³´?µì‹ ?œë¹„???œê³µ??br />?? ?Œì‚°?…기? ì˜ ? ì¶œë°©ì? ë°?보호??ê´€??법률????조에 ?°ë¥¸
êµ???µì‹¬ê¸°ìˆ ??보유??기업체나 ?°êµ¬ê¸°ê?
ë§? ?Œë°©?„사?…법????ì¡°ì œ9?¸ì— ?°ë¥¸ 방위?°ì—…ì²?ë°?ê°™ì? ë²???ì¡?br />??0?¸ì— ?°ë¥¸ ?„문?°êµ¬ê¸°ê?
8. ?œì‚¬?´ë²„?ŒëŸ¬ ë°©ì? ë°??„기관ë¦?지?ê¸°ê´€(?´í•˜ ?œì??ê¸°ê´€?ì´??br />?œë‹¤)?ì´?€ ?¬ì´ë²„í…Œ?¬ì— ?€??? ì†???ì?·?€??ë°??¬ê³ ì¡°ì‚¬Â·
복구 ?±ì„ 지?í•˜???¤ìŒ ê°?목의 기ê? ?ëŠ” ?…ì²´ë¥?말한??
ê°€. ?Œê³¼?™ê¸°? ë¶„???•ë?출연?°êµ¬ê¸°ê? ?±ì˜ ?¤ë¦½Â·?´ì˜ ë°??¡ì„±??br />ê´€??법률????조에 ?°ë¥¸ ?œêµ­?„자?µì‹ ?°êµ¬??부?¤ì—°êµ¬ì†Œ
?? ?Œì •ë³´í†µ? ë§ ?´ìš©ì´‰ì§„ ë°??•ë³´ë³´í˜¸ ?±ì— ê´€??법률????2ì¡?br />???°ë¥¸ ?œêµ­?¸í„°?·ì§„?¥ì›
?? ?Œì†Œ?„트?¨ì–´?°ì—… 진흥법」제24조에 ?°ë¼ ?Œí”„?¸ì›¨?´ì‚¬?…자ë¡?br />? ê³ ????ì¤?컴퓨?°ë°”?´ëŸ¬??백신?Œí”„?¸ì›¨?´ë? ?œìž‘ ?ëŠ” ?ë§¤
?˜ëŠ” ??br />?? ?Œêµ­ê°€?•ë³´??기본법ã€???ì¡°ì œ6?¸ì˜ ?•ë³´ë³´í˜¸?œìŠ¤?œì„ ?œìž‘
?˜ê±°???˜ìž…?˜ëŠ”??br />ë§? ?Œì •ë³´í†µ? ê¸°ë°?보호법」제9조에 ?°ë¼ 지?•ëœ 지?ì •ë³´ë³´??br />컨설?…전문업ì²?br />ë°? ê´€ê³??‰ì •ê¸°ê????¥ì´ 지?•í•œ 보안관?œì „문업ì²?br />2 ??법에???¬ìš©?˜ëŠ” ?©ì–´???•ì˜??????—???•í•˜??것을 ?œì™¸
- 7 – 8 -
?˜ê³ ???Œì •ë³´í†µ? ê¸°ë°?보호법」·「정보통? ë§ ?´ìš©ì´‰ì§„ ë°??•ë³´ë³´í˜¸
?±ì— ê´€??법률?Â·ã€Œêµ­ê°€?•ë³´??기본법」·「전기통? ì‚¬?…법?ì—??br />?•í•˜??바에 ?°ë¥¸??
??ì¡??¤ë¥¸ 법률과의 ê´€ê³? ?¬ì´ë²„í…Œ??ë°©ì? ë°??„기관리에 ê´€?˜ì—¬
?¤ë¥¸ 법률???¹ë³„??규정???ˆëŠ” 경우ë¥??œì™¸?˜ê³ ????법에???•í•˜??br />바에 ?°ë¥¸?? ?¤ë§Œ, ?¬ì´ë²„위기ê? 발생??경우?ëŠ” ?¤ë¥¸ 법률???°ì„ 
?˜ì—¬ ?ìš©?œë‹¤.
??ì¡?책임기ê???책무) 책임기ê????¥ì? ?¬ì´ë²„í…Œ?¬ë? ?¬ì „ ?ˆë°©?˜ê¸°
?„하???Œê? ?•ë³´?µì‹ ?œì„¤???ˆì „??? ì???ì±…ìž„???ˆìœ¼ë©? ?´ë? ??br />?˜ì—¬ ?¬ì´ë²„í…Œ??ë°©ì? ë°??„기관ë¦??…무ë¥??„ë‹´?˜ëŠ” ?„문?¸ë ¥ ë°?br />?ˆì‚° ?•ë³´ ?±ì˜ ?„ìš”??조치ë¥?강구?˜ì—¬???œë‹¤.
??ì¡?민·ê? ?‘ë ¥) 1 ?•ë????¬ì´ë²„í…Œ??ë°©ì? ë°??€?‘에 ?‘ë ¥?˜ê³  ??br />?´ë²„?„기ë¥??¨ìœ¨?ìœ¼ë¡?관리하ê¸??„하???€?µë ¹?¹ìœ¼ë¡??•í•˜??바에
?°ë¼ 민간기ê? ?±ê³¼ ?‘의체ë? 구성.?´ì˜?????ˆë‹¤.
2 ????— ê´€?˜ì—¬ ?„ìš”???¬í•­ ?±ì? ?€?µë ¹?¹ìœ¼ë¡??•í•œ??
????êµ?? ?¬ì´ë²„í…Œ??ë°©ì? ë°??„기관ë¦?추진체계
??ì¡?êµ???¬ì´ë²„안?„ì „?µíšŒ??) 1 êµ???¬ì´ë²„í…Œ??ë°©ì? ë°??„기관ë¦?br />??ê´€??중요?¬í•­???¬ì˜?˜ê¸° ?„하??êµ???•ë³´?ìž¥ ?Œì†?˜ì— êµ???¬ì´ë²„안?„ì „?µíšŒ???´í•˜ ?œì „?µíšŒ?˜â€ë¼ ?œë‹¤)ë¥??”다.
2 ?„ëžµ?Œì˜???˜ìž¥?€ êµ???•ë³´?ìž¥???œë‹¤.
3 ?„ëžµ?Œì˜???„원?€ 중앙?‰ì •ê¸°ê???ì°¨ê?ê¸?공무?ê³¼ ?„ëžµ?Œì˜
?˜ìž¥???„ì´‰?˜ëŠ” ?ë¡œ ?œë‹¤.
4 ?„ëžµ?Œì˜???¤ìŒ ê°??¸ì˜ ?¬í•­???¬ì˜?œë‹¤.
1. ?¬ì´ë²„í…Œ??ë°©ì? ë°??„기관ë¦?ê´€???„ëžµ.?•ì±….?œë„ ?˜ë¦½ ë°?ê°?br />??br />2. ?¬ì´ë²„í…Œ??ë°©ì? ë°??„기관ë¦?ê´€??기ê?ê°???• ì¡°ì •??ê´€???¬í•­
3. ??2ì¡°ì œ2??— ?°ë¥¸ ?¬ì´ë²„위?‘ì •ë³?공유 ë°?보호??ê´€???¬í•­
4. ?¬ì´ë²„í…Œ??ë°©ì? ë°??„기관ë¦?ê´€???€?µë ¹ 지?œì‚¬??— ?€??조치
방안
5. ê·?밖에 ?„ëžµ?Œì˜ ?˜ìž¥??부?˜í•˜ê±°ë‚˜ ?„원???œì¶œ???¬í•­
5 ?„ëžµ?Œì˜???¨ìœ¨???´ì˜???„하???„ëžµ?Œì˜??êµ???¬ì´ë²„안?„ë?ì±?br />?Œì˜(?´í•˜ ?œë?책회?˜â€ë¼ ?œë‹¤)ë¥??????ˆë‹¤.
6 ?„ëžµ?Œì˜ ë°??€ì±…회?˜ì˜ 구성.?´ì˜ ?±ì— ê´€?˜ì—¬ ?„ìš”??구체?ì¸
?¬í•­?€ ?€?µë ¹?¹ìœ¼ë¡??•í•œ??
??ì¡?êµ???¬ì´ë²„í…Œ??ë°©ì? ë°??„기관ë¦?기본계획 ?˜ë¦½ ?? 1 ?•ë???br />?¬ì´ë²„í…Œ??ë°©ì? ë°??„기관ë¦??€ì±…의 ?¨ìœ¨??체계??추진???„하
??êµ???¬ì´ë²„í…Œ??ë°©ì? ë°??„기관ë¦?기본계획(?´í•˜ ?œê¸°ë³¸ê³„?â€ì´??br />?œë‹¤)???˜ë¦½.?œí–‰?˜ì—¬???œë‹¤.
2 기본계획?€ ?€?µë ¹?¹ìœ¼ë¡??•í•˜??바에 ?°ë¼ êµ???•ë³´?ìž¥??ê´€
- 9 – 10 -
ê³?중앙?‰ì •ê¸°ê????¥ê³¼ ?‘의?˜ì—¬ ?„ëžµ?Œì˜ ?¬ì˜ë¥?거쳐 마련?œë‹¤.
3 중앙?‰ì •ê¸°ê????¥ì? ????˜ 기본계획???°ë¼ ?Œê? 책임기ê???br />?¥ì´ ?œìš©?????ˆë„ë¡?êµ???¬ì´ë²„í…Œ??ë°©ì? ë°??„기관ë¦??œí–‰ê³„획
(?´í•˜ ?œì‹œ?‰ê³„?â€ì´?¼ê³  ?œë‹¤)???‘성?˜ì—¬ ?Œê? 책임기ê????¥ì—ê²?br />배포?˜ì—¬???œë‹¤.
??ì¡??œí–‰ê³„획???´í–‰?¬ë? ?•ì¸ ë°?êµ?šŒ ë³´ê³ ) 1 중앙?‰ì •ê¸°ê????¥ì?
?Œê? 책임기ê????€?˜ì—¬ 매년 ?œí–‰ê³„획???´í–‰?¬ë?ë¥??•ì¸?˜ì—¬??br />?œë‹¤.
2 êµ???•ë³´?ìž¥?€ ????˜ ?•ì¸ê²°ê³¼ë¥?종합?˜ì—¬ êµ???¬ì´ë²„í…Œ??br />ë°©ì? ë°??„기관ë¦??¤íƒœë¥??ê?.?‰ê??˜ê³  ê·?ê²°ê³¼ë¥?êµ?šŒ??ë³´ê³ ??br />?¬ì•¼ ?œë‹¤. ?¤ë§Œ, êµ?šŒ, 법원, ?Œë²•?¬íŒ?? 중앙? ê±°ê´€ë¦¬ìœ„?íšŒ???€
???ê?.?‰ê???êµ?šŒ ?¬ë¬´ì´ìž¥, 법원?‰ì •ì²˜ìž¥, ?Œë²•?¬íŒ???¬ë¬´ì²?br />??ë°?주앙? ê±°ê´€ë¦¬ìœ„?íšŒ ?¬ë¬´ì´ìž¥???”ì²­??경우???œí•œ??
3 ????ë°?????˜ ?ˆì°¨?€ 방법 ?±ì— ê´€?˜ì—¬ ?„ìš”???¬í•­?€ ?€??br />?¹ë ¹?¼ë¡œ ?•í•œ??
??ì¡?êµ???¬ì´ë²„안?„센?°ì˜ ?¤ì¹˜) 1 ?¬ì´ë²„í…Œ?¬ì— ?€??êµ??차원??br />종합?ì´ê³?체계?ì¸ ?ˆë°©.?€?‘ê³¼ ?¬ì´ë²„위기ê?리ë? ?„하??êµ??
?•ë³´?ìž¥ ?Œì†?¼ë¡œ êµ???¬ì´ë²„안?„센???´í•˜ ?œì•ˆ?„센?°â€ë¼ ?œë‹¤)ë¥?br />?”다.
2 ?ˆì „?¼í„°???¤ìŒ ê°??¸ì˜ ?…무ë¥??˜í–‰?œë‹¤.
1. êµ???¬ì´ë²„í…Œ??ë°©ì? ë°??„기관ë¦??•ì±…???˜ë¦½2. ?„ëžµ?Œì˜ ë°??€ì±…회???´ì˜???€??지??br />3. ?¬ì´ë²„í…Œ??ê´€???•ë³´???˜ì§‘.분석.?„파
4. êµ???•ë³´?µì‹ ë§ì˜ ?ˆì „???•ë³´
5. êµ?? ?¬ì´ë²„í…Œ??ë°©ì? ë°??„기관ë¦?매뉴???‘성.배포
6. ?¬ì´ë²„í…Œ?¬ë¡œ ?¸í•˜??발생???¬ê³ ??조사 ë°?복구 지??br />7. ?¸êµ­ê³¼ì˜ ?¬ì´ë²?공격 ê´€???•ë³´???‘ë ¥
3 êµ???•ë³´?ìž¥?€ ????˜ ?ˆì „?¼í„°ë¥??´ì˜?¨ì— ?ˆì–´ êµ??차원??br />종합?ë‹¨, ?í™©ê´€?? ?„협?”인 분석, ?¬ê³  조사 ?±ì„ ?„í•´ ë¯?ê´€.
êµ??©ë™?€?‘í?(?´í•˜ ?œí•©?™ë??‘í??ì´???œë‹¤)???¤ì¹˜.?´ì˜??????br />??
4 êµ???•ë³´?ìž¥?€ ?©ë™?€?‘í????¤ì¹˜.?´ì˜?˜ê¸° ?„하???„ìš”??ê²?br />?°ì—??중앙?‰ì •ê¸°ê? ë°?지?ê¸°ê´€???¥ì—ê²??¸ë ¥???Œê²¬ê³??¥ë¹„??지??br />???”ì²­?????ˆë‹¤.
?????¬ì´ë²„í…Œ??ë°©ì? ë°??„기관ë¦??œë™
??0ì¡??¬ì´ë²„í…Œ??ë°©ì??€ì±…의 ?˜ë¦½.?œí–‰) 1 책임기ê????¥ì? ?Œê?
?•ë³´?µì‹ ë§ê³¼ ?•ë³´ ?±ì˜ ?ˆì „??ë°?? ë¢°???•ë³´ë¥??„í•œ ?¬ì´ë²„í…Œ??br />ë°©ì??€ì±…을 강구?˜ì—¬???œë‹¤.
2 êµ???•ë³´?ìž¥?€ ????— ?°ë¥¸ ?¬ì´ë²„í…Œ??ë°©ì??€ì±…의 ?˜ë¦½????br />- 11 – 12 -
?”í•œ 지침을 ?‘성 배포?????ˆë‹¤. ??경우, êµ???•ë³´?ìž¥?€ 미리
ê´€ê³?중앙?‰ì •ê¸°ê????¥ê³¼ ?‘의?˜ì—¬???œë‹¤.
3 ????„ ?ìš©???Œì—??êµ?šŒ, 법원, ?Œë²•?¬íŒ?? 중앙? ê±°ê´€ë¦¬ìœ„
?íšŒ???‰ì •?¬ë¬´ë¥?처리?˜ëŠ” 기ê???경우?ëŠ” ?´ë‹¹ 기ê????¥ì´ ??br />?”하?¤ê³  ?¸ì •?˜ëŠ” 경우?ë§Œ ?ìš©?œë‹¤.
??1ì¡??…성?„로그램???•ì‚° 차단) 1 ?•ë????…성?„로그램 ?±ì´ ?¬í•¨
???¹ì‚¬?´íŠ¸ ?ëŠ” ?Œí”„?¸ì›¨???±ì„ 발견??경우??ê·??´ì˜?ì—ê²?br />?…성?„로그램??감염 ?•ì‚°ë°©ì? ?±ì— ?„ìš”??보안조치ë¥??????ˆë„
ë¡?ê´€???•ë³´ë¥??œê³µ?˜ì—¬???œë‹¤.
2 ?•ë???????˜ 조치?ë„ 불구?˜ê³  ?¬ì´ë²„í…Œ?¬ì— ?…ìš©?˜ê±°??br />?„í—˜?±ì´ ?’다ê³??ë‹¨?˜ëŠ” 경우??백신?„로그램??? í¬ ?±ì„ ?µí•´
?…성?„로그램 ?±ì„ ?? œ ?ëŠ” 차단?˜ê²Œ ?????ˆë‹¤.
3 ????ë°?????— ?°ë¼ ?„ìš”??조치??구체?ì¸ ?¬í•­?€ ?€?µë ¹
?¹ìœ¼ë¡??•í•œ??
??2ì¡?보안관?œì„¼???±ì˜ ?¤ì¹˜) 1 책임기ê????¥ì? ?¬ì´ë²„í…Œ???•ë³´ë¥?br />?ì?·분석?˜ì—¬ 즉시 ?€??조치ë¥??????ˆëŠ” 기구(?´í•˜ ?œë³´?ˆê???br />?¼í„°?ë¼ ?œë‹¤)ë¥?구축·?´ì˜?˜ê±°???¤ìŒ ê°??¸ì˜ 기ê???구축·?´ì˜
?˜ëŠ” 보안관?œì„¼?°ì— ê·??…무ë¥??„탁?˜ì—¬???œë‹¤. ?¤ë§Œ, ?Œì •ë³´í†µ??br />기반 보호법ã€???6조에 ?°ë¥¸ ?•ë³´ê³µìœ Â·ë¶„석?¼í„°??보안관?œì„¼?°ë¡œ
본다.
1. ê´€ê³?중앙?‰ì •ê¸°ê?2. êµ???•ë³´??br />3. ??ì¡°ì œ1?? œ8?¸ë°”목의 보안관?œì „문업ì²?br />2 책임기ê????¥ì? ????— ?°ë¥¸ ?¬ì´ë²„í…Œ???•ë³´?€ ?•ë³´?µì‹ ë§?
?Œí”„?¸ì›¨?´ì˜ 취약???±ì˜ ?•ë³´(?´í•˜ ?œì‚¬?´ë²„?„협?•ë³´?ë¼ ?œë‹¤)ë¥?br />ê´€ê³?중앙?‰ì •ê¸°ê?????ë°?êµ???•ë³´?ìž¥ê³?공유?˜ì—¬???œë‹¤.
3 êµ???•ë³´?ìž¥?€ ????˜ ?¬ì´ë²„위?‘정보의 ?¨ìœ¨?ì¸ ê´€ë¦?ë°???br />?©ì„ ?„하??관계기관???¥ê³¼ 공동?¼ë¡œ ?¬ì´ë²„위?‘정보통?©ê³µ? ì²´
계ë? 구축·?´ì˜?????ˆë‹¤.
4 ?„구? ì? ????— ?°ë¼ 공유?˜ëŠ” ?•ë³´???€?˜ì—¬???¬ì´ë²„위기ê?
리ë? ?„하???„ìš”???…무범위???œí•˜???•ë‹¹?˜ê²Œ ?¬ìš© 관리하?¬ì•¼
?œë‹¤.
5 ????— ?°ë¥¸ 보안관?œì„¼?°ì? ????— ?°ë¥¸ ?¬ì´ë²„위?‘ì •ë³?br />?µí•©ê³µìœ ì²´ê³„ 구축·?´ì˜ ë°??•ë³´ 관리에 ê´€???¬í•­ê³?????—
?°ë¥¸ ?¬ì´ë²„í…Œ???•ë³´??공유??ê´€??범위.?ˆì°¨.방법 ?±ì—
ê´€???¬í•­?€ ?€?µë ¹?¹ìœ¼ë¡??•í•œ??
??3ì¡??¬ê³ ì¡°ì‚¬) 1 중앙?‰ì •ê¸°ê????¥ì? ?¬ì´ë²„í…Œ?¬ë¡œ ?¸í•˜???Œê?
분야???¼í•´ê°€ 발생??경우?ëŠ” ê·??ì¸ê³??¼í•´?´ìš© ?±ì— ê´€?˜ì—¬
? ì†???¬ê³ ì¡°ì‚¬ë¥??¤ì‹œ?˜ê³ , ?¼í•´ê°€ 중ë??˜ê±°???•ì‚°???°ë ¤ê°€ ??br />??경우 즉시 ê´€ê³?중앙?‰ì •ê¸°ê?????ë°?êµ???•ë³´?ìž¥?ê²Œ ê·?ê²?br />ê³¼ë? ?µë³´?˜ì—¬???œë‹¤.
2 êµ???•ë³´?ìž¥?€ ????—??불구?˜ê³  êµ???ˆë³´ ë°??´ìµ??중ë???br />- 13 – 14 -
?í–¥??미친?¤ê³  ?ë‹¨?˜ëŠ” 경우 ê´€ê³?중앙?‰ì •ê¸°ê????¥ê³¼ ?‘의??br />??직접 ê·??¬ê³ ì¡°ì‚¬ë¥??¤ì‹œ?????ˆë‹¤.
3 êµ???•ë³´?ìž¥?€ ????— ?°ë¼ ?¬ê³ ì¡°ì‚¬ ê²°ê³¼ë¥??µë³´ë°›ê±°????
??— ?°ë¼ ?¬ê³ ì¡°ì‚¬ë¥???ê²°ê³¼, ?¼í•´??복구 ë°??•ì‚°ë°©ì?ë¥??„하??br />? ì†???œì •???„ìš”?˜ë‹¤ê³??ë‹¨?˜ëŠ” 경우 책임기ê????¥ì—ê²??„ìš”
??조치ë¥??”ì²­?????ˆë‹¤. ??경우 책임기ê????¥ì? ?¹ë³„???¬ìœ ê°€
?†ëŠ” ???´ì— ?°ë¼???œë‹¤.
4 ?„구? ì? ????ë°?????— ?°ë¥¸ ?¬ê³ ì¡°ì‚¬ë¥??„료?˜ê¸° ?„에
?¬ì´ë²„í…Œ?¬ì? ê´€?¨ëœ ?ë£Œë¥??„의ë¡??? œÂ·?¼ì†Â·ë³€ì¡°í•˜?¬ì„œ???„니
?œë‹¤.
??4ì¡??€?‘훈?? 1 ?•ë????¬ì´ë²„í…Œ?¬ë? ë°©ì??˜ê³  ?¬ì´ë²„위기에
체계?ì´ê³??¨ìœ¨?ìœ¼ë¡??€?‘하ê¸??„하???ˆë ¨???¤ì‹œ?˜ì—¬???œë‹¤.
2 ????˜ ?ˆë ¨?€ 매년 ?•ê¸° ?ëŠ” ?˜ì‹œë¡?구분?˜ì—¬ ?¤ì‹œ?????ˆìœ¼ë©?
?•ê¸° ?ˆë ¨?€ ?Œë¹„?ë?비자??관리법????4조에 ?°ë¥¸ 비상?€ë¹„훈??br />ê³??¨ê»˜ ?¤ì‹œ?????ˆë‹¤.
3 ????˜ ?ˆë ¨ ?¤ì‹œë°©ë²• ë°??ˆì°¨ ?±ì— ê´€?˜ì—¬ ?„ìš”???¬í•­?€ ?€
?µë ¹?¹ìœ¼ë¡??•í•œ??
??5ì¡??¬ì´ë²„위기경보의 발령) 1 êµ???•ë³´?ìž¥?€ ?¬ì´ë²„í…Œ?¬ì— ?€??br />체계?ì¸ ?€ë¹„ì? ?€?‘을 ?„하??책임기ê????¥ì˜ ?”ì²­ê³???2ì¡°ì œ2
??— ?°ë¼ ?˜ì§‘???•ë³´ë¥?종합·?ë‹¨?˜ì—¬ ê´€??·ì£¼?˜Â·ê²½ê³„·심ê°??¨ê³„
???¬ì´ë²„위기경보ë? 발령?????ˆë‹¤.2 êµ???•ë³´?ìž¥?€ ?¬ì´ë²„í…Œ?¬ê? êµ???ˆë³´??중ë????„í•´ë¥?초래??br />것으ë¡??ë‹¨?˜ëŠ” 경우?ëŠ” êµ???ˆë³´?¤ì˜ êµ???„기?í™© ?…무ë¥??´ë‹¹
?˜ëŠ” ?˜ì„ë¹„서관ê³??‘의?˜ì—¬ ?¬ê° ?˜ì???경보ë¥?발령?????ˆìœ¼ë©?
?¬ê°?¨ê³„???¬ì´ë²„위기경보ë? 발령??경우?ëŠ” ê·??¬ìœ ë¥?êµ?šŒ??br />?µë³´?˜ì—¬???œë‹¤.
3 êµ???•ë³´?ìž¥?€ ?¬ì´ë²„위기경보ë? 발령??경우 관계기관???¥ê³¼
경보 ?˜ì????¬ì „ ?‘의?˜ì—¬???œë‹¤.
4 책임기ê????¥ì? ????— ?°ë¥¸ ?¬ì´ë²„위기경보ê? 발령??경우
즉시 ?¼í•´ë°œìƒ??최소??ë°??¼í•´ë³µêµ¬ë¥??„í•œ 조치ë¥?취하?¬ì•¼ ?œë‹¤.
5 ?¬ì´ë²„위기경ë³?발령???ˆì°¨Â·ê¸°ì? ë°?책임기ê????¥ì˜ 조치 ?±ì—
ê´€?˜ì—¬ ?„ìš”???¬í•­?€ ?€?µë ¹?¹ìœ¼ë¡??•í•œ??
??6ì¡??¬ì´ë²„위기ë?책본부??구성) 1 ?•ë???경계?¨ê³„ ?´ìƒ???¬ì´
버위기경보ê? 발령??경우 ?ì¸ë¶„석, ?¬ê³ ì¡°ì‚¬, 긴급?€?? ?¼í•´ë³µêµ¬
?±ì˜ ? ì†??조치ë¥?취하ê¸??„하??êµ?? ??Ÿ‰??결집??ë¯?ê´€.êµ?br />?„문가가 참여?˜ëŠ” ?¬ì´ë²„위기ë?책본부(?´í•˜ ?œë?책본부?ë¼ ?œë‹¤)ë¥?br />구성·?´ì˜?????ˆë‹¤.
2 ?€ì±…본부?????´í•˜ ?œë?책본부?¥â€ì´???œë‹¤)?€ êµ???•ë³´?ìž¥?¼ë¡œ
?˜ë©°, ?€ì±…본부??구성·?´ì˜ ?±ì— ê´€?˜ì—¬ ?„ìš”???¬í•­?€ êµ???•ë³´??br />?¥ì´ ê´€ê³?중앙?‰ì •ê¸°ê????¥ê³¼ ?‘의?˜ì—¬ ?•í•œ??
3 ?€ì±…본부?¥ì? ????— ?°ë¥¸ ?€ì±…본부ë¥?구성·?´ì˜?˜ê¸° ?„하??ì±?br />?„기관 ë°?지?ê¸°ê´€???¥ì—ê²??„ìš”???¸ë ¥???Œê²¬ ë°??¥ë¹„???œê³µ
- 15 – 16 -
???”ì²­?????ˆë‹¤. ??경우 책임기ê? ë°?지?ê¸°ê´€???¥ì? ?¹ë³„??br />?¬ìœ ê°€ ?†ëŠ” ???´ì— ?°ë¼???œë‹¤.
4 ?€ì±…본부?¥ì? ????— ?°ë¼ ?¸ë ¥???Œê²¬ ë°??¥ë¹„???œê³µ????br />기ê????¥ì—ê²?ê·??Œìš”경비ë¥?지?í•  ???ˆë‹¤.
??7ì¡?기술지?? 1 책임기ê????¥ì´ ??2ì¡°ì œ1??ë°???5ì¡°ì œ4??˜
?…무ë¥??˜í–‰?¨ì— ?ˆì–´ ?„ìš”??경우 ê´€ê³?중앙?‰ì •ê¸°ê?????ë°?êµ??
?•ë³´?ìž¥?ê²Œ 지?ì„ ?”ì²­?????ˆë‹¤.
2 ê´€ê³?중앙?‰ì •ê¸°ê?????ë°?êµ???•ë³´?ìž¥?€ ????— ?°ë¥¸ 지??br />???”청받았????? ì†???€?‘이 ?´ë£¨?´ì§ˆ ???ˆë„ë¡?기술지????br />?„ìš”??조치ë¥??˜ì—¬???œë‹¤.
3 ????— ?°ë¥¸ 조치ë¥??„하??ê´€ê³?중앙?‰ì •ê¸°ê?????ë°?êµ????br />보원?¥ì? ?´ë‹¹ 지?ê¸°ê´€???¥ì—ê²??„ìš”??지?ì„ ?”ì²­?????ˆìœ¼ë©?
??경우, 지?ê¸°ê´€???¥ì—ê²?지?í•  ?´ìš©ê³?기간??미리 ?µë³´?˜ì—¬??br />?œë‹¤.
4 ê´€ê³?중앙?‰ì •ê¸°ê?????ë°?êµ???•ë³´?ìž¥?€ ????— ?°ë¼ 지??br />????기ê????¥ì—ê²?ê·??Œìš”경비ë¥?지?í•  ???ˆë‹¤.
?????°êµ¬ê°œë°œ ë°?지??br />??br />??8ì¡?책임기ê????€??지?? ?•ë???책임기ê????€?˜ì—¬ ?•ë³´?µì‹ ë§?br />??보호?˜ê¸° ?„하???„ìš”??기술???´ì „, ?¥ë¹„???œê³µ ë°?ê·?밖의?„ìš”??지?ì„ ?????ˆë‹¤.
??9ì¡??°êµ¬ê°œë°œ) 1 ?•ë????¬ì´ë²„í…Œ??ë°©ì? ë°??„기관리에 ?„ìš”??br />기술개발ê³?기술?˜ì????¥ìƒ???„하???¤ìŒ ê°??¸ì˜ ?œì±…??추진??br />???ˆë‹¤.
1. ?¬ì´ë²„í…Œ??ë°©ì? ë°??„기관리에 ê´€??êµ?? ?°êµ¬ê°œë°œ 계획????br />ë¦??œí–‰
2. ?¬ì´ë²„í…Œ??ë°©ì? ë°??„기관리기???˜ìš”조사 ë°??™í–¥ë¶„석 ?±ì—
ê´€???¬ì—…
3. ?¬ì´ë²„í…Œ??ë°©ì? ë°??„기관리에 ê´€??기술??개발.보급.??br />???¬ì—…
4. ê·?밖에 ?¬ì´ë²„í…Œ??ë°©ì? ë°??„기관ë¦?ê´€??기술개발 ë°?기술
?¥ìƒ ?±ì— ê´€?˜ì—¬ ?„ìš”???¬í•­
2 êµ???•ë³´?ìž¥?€ ????— ?°ë¼ ?°êµ¬?Œë? ?¤ë¦½?˜ê±°?? ?¤ë¥¸ 법령??br />?˜í•˜???¤ë¦½??기ê????„문기ê??¼ë¡œ 지?•í•  ???ˆë‹¤.
3 êµ?? ?¬ì´ë²„í…Œ??ë°©ì? ë°??„기관ë¦?기술???°êµ¬ê°œë°œ??ê´€????br />차·방ë²????¸ë??¬í•­?€ êµ???•ë³´?ìž¥???°ë¡œ ?•í•œ??
??0ì¡??°ì—…?¡ì„±) 1 ?•ë????¬ì´ë²„í…Œ??ë°©ì? ë°??„기관리에 ?„ìš”??br />?°ì—…???¡ì„±??지?í•˜ê¸??„하???¤ìŒ ê°??¸ì˜ ?œì±…???˜ë¦½.?œí–‰?˜ê³ 
?´ì— ?„ìš”???¬ì› ?•ë³´ 방안 ?±ì„ 마련?˜ì—¬???œë‹¤.
1. ?¬ì´ë²„í…Œ??ë°©ì? ë°??„기관ë¦??°ì—… ?•ì±…?˜ë¦½ 지??br />2. ?¬ì´ë²„í…Œ??ë°©ì? ë°??„기관ë¦??°ì—… 발전???„í•œ ? í†µ?œìž¥ ?œì„±
- 17 – 18 -
??지??br />3. ?¬ì´ë²„í…Œ??ë°©ì? ë°??„기관ë¦??°ì—… ?¡ì„±???„í•œ ??????br />?‘력체계 구축
4. ?¬ì´ë²„í…Œ??ë°©ì? ë°??„기관ë¦??°ì—… ê´€??êµ? œêµë¥˜.?‘ë ¥ ë°???br />?¸ì§„출의 지??br />2 ?•ë?????9ì¡°ì œ2??˜ ?°êµ¬??ë°??„문기ê??¼ë¡œ ?˜ì—¬ê¸?????˜
?°ì—…?¡ì„±???„ìš”???…무ë¥??˜í–‰?˜ê²Œ ?????ˆë‹¤.
??1ì¡??¸ë ¥?‘성 ë°?교육?ë³´) ?•ë????¬ì´ë²„í…Œ??ë°©ì? ë°??„기관리의
기반??조성?˜ê³  ?¬ì´ë²„위기에 ?€??êµ?????¸ì‹???œê³ ?˜ê¸° ?„하??br />?¤ìŒ ê°??¸ì˜ ?œì±…??강구?˜ì—¬???œë‹¤.
1. ?¬ì´ë²„í…Œ??ë°©ì? ë°??„기관ë¦?ê´€???„문?¸ë ¥???‘성
2. ?¬ì´ë²„í…Œ??ë°©ì? ë°??„기관리에 ê´€???€êµ?? ?ë³´?œë™ ë°?교육
3. ê·?밖에 ?¬ì´ë²„í…Œ??ë°©ì? ë°??„기관ë¦?ê´€???„문?¸ë ¥ ?‘성
ë°?교육?ë³´ ?±ì— ê´€?˜ì—¬ ?„ìš”???¬í•­
??2ì¡?êµ? œ?‘ë ¥) ?•ë????¬ì´ë²„í…Œ??ë°©ì? ë°??„기관리에 ê´€?˜ì—¬ êµ? œ
기구·?¨ì²´ ë°??¸êµ­ê³¼ì˜ ?‘ë ¥??증진?˜ê¸° ?„하???¤ìŒ ê°??¸ì˜ ?…무
ë¥??˜í–‰?????ˆë‹¤.
1. ?¬ì´ë²„í…Œ??ë°©ì? ë°??„기관리ë? ?„í•œ ?í˜¸ê°??‘력체계 구축
2. ?¬ì´ë²„í…Œ??ë°©ì? ë°??„기관ë¦?기술??ê´€???•ë³´??교류?€ 공동?€??br />3. ?¬ì´ë²„í…Œ??ë°©ì? ë°??„기관ë¦??„ë‹´?¸ë ¥???í˜¸ê°??Œê²¬êµìœ¡
??3ì¡?비ë? ?„수???˜ë¬´) ??법에 ?°ë¼ ?¬ì´ë²„í…Œ??ë°©ì? ë°??„기관리업무에 종사?˜ê±°??종사?˜ì????ëŠ” ê·?직무???Œê²Œ ??비ë????€??br />?ê²Œ ?„설?˜ê±°??직무??목적 ?¸ì— ?´ë? ?¬ìš©?˜ì—¬?œëŠ” ?„니 ?œë‹¤.
??4ì¡??¬ìƒ ?? 1 êµ???•ë³´?ìž¥?€ ?¬ì´ë²„í…Œ??ë°©ì? ë°??„기관리ì?
ê´€?¨í•˜???¤ìŒ ê°??¸ì˜ ?´ëŠ ?˜ë‚˜???´ë‹¹?˜ëŠ” ?ì— ?€?˜ì—¬ ?¬ìƒ??br />ê³? ?ˆì‚°??범위?ì„œ ?¬ìƒê¸ˆì„ 지급할 ???ˆë‹¤.
1. ?¬ì´ë²„í…Œ??기도??ê´€???•ë³´ë¥??œê³µ????br />2. ?¬ì´ë²„í…Œ?¬ë? ê°€???ë? ? ê³ ????br />3. ?¬ì´ë²„í…Œ?¬ì˜ ?ì? ë°??€?‘·복구에 공이 많ì? ??br />2 ????— ?°ë¥¸ ?¬ìƒê³??¬ìƒê¸?지급의 기ì?·방법ê³??ˆì°¨, 구체?ì¸
지급액 ???„ìš”???¬í•­?€ êµ???•ë³´?ìž¥???•í•œ??
????벌칙
??5ì¡?벌칙) 1 ?¤ìŒ ê°??¸ì˜ ?´ëŠ ?˜ë‚˜???´ë‹¹?˜ëŠ” ?ëŠ” 5???´í•˜
??징역 ?ëŠ” 3천만???´í•˜??벌금??처한??
1. ??2ì¡°ì œ2??ë°???2ì¡°ì œ4??„ ?„ë°˜????br />2. ??3ì¡°ì œ4??„ ?„ë°˜????br />3. ??3ì¡°ë? ?„ë°˜????br />2 ?…무??과실ë¡??¸í•˜??????˜ 죄ë? 범한 ?ëŠ” 2???´í•˜??징역
?ëŠ” 1천만???´í•˜??벌금??처한??
??6ì¡?과태ë£? 1 ?¤ìŒ ê°??¸ì˜ ?´ëŠ ?˜ë‚˜???´ë‹¹?˜ëŠ” ?ëŠ” 2천만??br />?´í•˜??과태료에 처한??
- 19 – 20 -
1. ??3ì¡°ì œ1??„ ?„ë°˜????br />2. ??6ì¡°ì œ3??„ ?„ë°˜????br />2 ??3ì¡°ì œ3??„ ?„ë°˜???ëŠ” 1천만???´í•˜??과태료에 처한??
3 ????ë°?????— ?°ë¥¸ 과태료는 ?€?µë ¹?¹ì´ ?•í•˜??바에 ?°ë¼
ê´€ê³?중앙?‰ì •ê¸°ê???부ê³?징수?œë‹¤.
부
ì¹?br />??ì¡??œí–‰?? ??법ì? 공포 ??6개월??경과??? ë????œí–‰?œë‹¤.
??ì¡?보안관?œì „문업체에 ?€??경과조치) ??ë²??œí–‰ ?¹ì‹œ 책임기ê???br />??2ì¡???˜ 보안관?œì„¼???…무ë¥??œê³µ?˜ê³  ?ˆëŠ” ?…ì²´????ë²??œí–‰
?¼ë???6개월 ?´ì— ?‰ì •ê¸°ê????¥ì—ê²?보안관?œì „문업체로 지??ë°?br />?„야 ?œë‹¤.êµ?? ?¬ì´ë²„í…Œ??ë°©ì???ê´€??법률??비용추계??미첨부 ?¬ìœ ??br />1. ?¬ì •?˜ë°˜?”인
?œì •?ˆì? êµ???•ë³´?ìž¥ ?Œì†?˜ì— êµ???¬ì´ë²„안?„센?°ë? ?¤ì¹˜?˜ë„ë¡??˜ê³  ??br />ê³?????ì¡?, ?¬ê°?¨ê³„???¬ì´ë²„위기경보ê? 발령??경우?ëŠ” ?¬ì´ë²„위기ë?책본
부ë¥?별도ë¡?구성·?´ì˜?˜ë˜ 본ë??¥ì? ?¸ë ¥???Œê²¬ ë°??¥ë¹„ë¥??œê³µ??기ê?????br />?ê²Œ ê·??Œìš”경비ë¥?지?í•  ???ˆë„ë¡?규정?˜ê³  ?ˆìœ¼ë©?????6ì¡?, 책임기ê???br />?¥ì˜ 기술지???”ì²­???°ë¼ 지?ì„ ??기ê????¥ì—ê²?ê·??Œìš”경비ë¥?지?í•  ??br />?„록 ?˜ê³  ?ˆê³ (????7ì¡?, 책임기ê????€?˜ì—¬ 기술???´ì „, ?¥ë¹„???œê³µ ?±ì˜
지?ì„ ?????ˆë„ë¡??˜ê³  ?ˆìœ¼ë©?????8ì¡?, ?¬ì´ë²„í…Œ??ë°©ì? ë°??„기관리ì?
ê´€?¨í•˜???¬ìƒê¸ˆì„ 지급할 ???ˆë„ë¡??˜ê³  ?ˆë‹¤(????4ì¡?.
2. 미첨부 근거 규정
?Œì˜?ˆì˜ 비용추계 ?±ì— ê´€??규칙?ì œ3ì¡°ì œ1?? œ2??비용추계???€?ì´ êµ??
?ˆì „보장·군사기ë???ê´€???¬í•­??경우) ë°?????기술?ìœ¼ë¡?추계가 ?´ë ¤??ê²?br />?????´ë‹¹?œë‹¤.
3. 미첨부 ?¬ìœ 
?œì •?ˆì— ?°ë¼ ?¬ì •?Œìš”ê°€ 발생??것으ë¡??ˆìƒ?˜ëŠ” 부ë¶?ì¤?첫째, êµ???¬ì´ë²?br />?ˆì „?¼í„°???¤ì¹˜Â·?´ì˜ë¹„ìš©ê³?ê´€?¨í•˜?¬ì„œ???„재 ?˜êµ­ê°€?¬ì´ë²„안?„ê?리규???€??br />?¹í›ˆ????22????ì¡??™ì— ?˜í•˜???´ë? 조직???œë™ 중에 ?ˆìœ¼ë©? ?œì •?ˆì˜ ??br />?‰ì— ?°ë¼ 별도ë¡?추ê????¸ë ¥?´ë‚˜ ?œì„¤???†ì„ 것이?´ì„œ 추ê?비용 발생?€ ??br />- 21 – 22 -
견되지 ?ŠëŠ”?¤ëŠ” ?´ë‹¹?ì˜ ?µë????ˆì—ˆ?? ?í•œ ?„재 ?´ì˜ 중인 조직?´ì˜ ë°???br />?…ë‚´??? êµ???ˆì „보장ê³?ê´€?¨ëœ ?œë™?´ì—­???¬í•¨?˜ì–´ ?ˆìœ¼ë¯€ë¡??´ë? 공개?˜ê¸°
곤ë??˜ë‹¤ê³??˜ëŠ” ??êµ???€?ŒëŸ¬?¼í„°ë¥??¤ì¹˜Â·?´ì˜ë¹„ìš©??추정?˜ëŠ”???„실?ìœ¼ë¡?br />?œê³„ê°€ 존재?œë‹¤.
?˜ì§¸, ?¬ì´ë²„위기ë?책본부??구성ê³??´ì˜???€???Œìš”경비 지?ì? ?¨ê??°ì •??br />참고??과거 ? ì‚¬?¬ë?(?¬ê°?¨ê³„???¬ì´ë²„위기경보ê? 발령???¬ë?)ê°€ 충분??ì¡?br />?¬í•˜ì§€ ?Šê³ , ?¬ì´ë²„위기의 종류?€ ?¼í•´???•ë„·범위 ë°?그에 ?°ë¥¸ ?¸ë ¥???Œê²¬
ê³??¥ë¹„???œê³µ?¼ë¡œ ?¸í•´ 발생?˜ëŠ” ?Œìš”경비ë¥??°ì •?˜ëŠ”???„ìš”???©ë¦¬??기ì?
?±ì´ 부족하???„재?œì ?ì„œ ? ë¹™???ˆëŠ” 추계ë¥??‘성?˜ê¸°ê°€ 곤ë??˜ë‹¤.
?‹ì§¸, 책임기ê? 지?ê¸°ê´€???€??지?? 책임기ê????€??지??ë°??¬ì´ë²„í…Œ??br />ë°©ì??€ ê´€?¨í•œ ?¬ìƒê¸?지급과 ê´€?¨í•˜?¬ì„œ???„재 ?œì ?ì„œ 책임기ê???지?ìš”
ì²??´ìš©?´ë‚˜ ê·?규모ë¥??ˆì¸¡?˜ê±°??책임기ê????€??지?ì˜ 구체?ì¸ ?´ìš©??br />?ˆìƒ?˜ëŠ” 것이 ?¬ì‹¤??곤ë??˜ê³ , ?¬ì´ë²„í…Œ??ë°©ì??€ ê´€?¨í•œ ?¬ìƒê¸?지급에 ê´€??br />기ì? ë°?지급액 ?±ì´ 미정???íƒœ?´ê¸° ?Œë¬¸??ê´€??비용??추계?˜ê¸°ê°€ 곤ë???br />??
4. ?‘성??br />?œìƒê¸??˜ì›???¥ì˜¤??보좌관 (02-788-2271)
7 min read

Signature Based Detection of User Events for Post-Mortem Forensic Analysis

As seen on DigitalFIRE
<div>
<div class="separator" style="clear: both; text-align: center;"></div>The concept of signatures is used in many fields, normally for the detection of some sort of pattern. For example, antivirus and network intrusion detection systems sometimes implement signature matching to attempt to differentiate legitimate code or network traffic from malicious data. The principle of these systems that that within a given set of data, malicious data will have some recognizable pattern. If malicious code, for example, has a pattern that is different in some way to non-malicious data, then the malicious data may be able to be differentiated with signature-based methods. In terms of malware, however, signature based methods are becoming less effective as malicious software gains the ability to alter or hide malicious patterns. For example, polymorphic or encrypted code.

This work suggests that signature-based methods may also be used to detect patterns of user actions on a digital system. This is based on the principle that computer systems are interactive: when a user interacts with the system, the system is immediately updated. In this work, we analyzed a user’s actions in relation to timestamp updates on the system.

During experimentation, we found that timestamps on a system may be updated for many different reasons. Our work, however, determined that there are at least three major timestamp update patterns given a user action. We define these as Core, Supporting and Shared timestamp update patterns.

Core timestamps are timestamps that are updated each time, and only when, the user action is executed.

Supporting timestamps are timestamps that are updated sometimes, and only when, the user action is executed.

Shared timestamps are timestamps that are shared between multiple user actions. So, for example, the timestamps of a single file might be updated by two different user actions. With shared timestamps it is impossible to determine which action updated the timestamp without more information.

By categorizing timestamps into these three primary categories, we can construct timestamp signatures to detect if and when a user action must have happened. For example, since only one action can update Core timestamps, the time value of the timestamp is approximately the time at which the user action must have taken place.

The same can be said for Supporting timestamps, but we would expect Supporting timestamp values to be at or before the last instance of the user action.

Using this categorization system, and finding associations between timestamps and user actions, past user actions can be reconstructed using only readily available metadata in a computer system.
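
A minimal sketch (Python) of how a signature of this kind might be evaluated is below. The action name and file paths in the signature are hypothetical placeholders; real Core and Supporting associations would come from controlled experiments like those described in the paper.

```python
import os
from datetime import datetime

# Hypothetical signature for one user action (placeholder paths).
SIGNATURE = {
    "action": "run_example_application",
    "core": ["/var/log/example_app/last_run.log"],
    "supporting": ["/home/user/.cache/example_app/index.dat"],
}

def estimate_last_occurrence(sig):
    """Estimate when an action last happened from its signature.

    Core timestamps are updated each time, and only when, the action
    executes, so the newest Core timestamp approximates the time of
    the last occurrence. Supporting timestamps are updated only by
    the action, but not every time, so they should be at or before
    that time; anything newer contradicts the signature.
    """
    core_times = [os.path.getmtime(p) for p in sig["core"]
                  if os.path.exists(p)]
    if not core_times:
        return None  # no evidence that the action ever happened
    last_run = max(core_times)
    for p in sig["supporting"]:
        if os.path.exists(p) and os.path.getmtime(p) > last_run:
            print("inconsistent Supporting timestamp, investigate:", p)
    return datetime.fromtimestamp(last_run)

print(estimate_last_occurrence(SIGNATURE))
```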

For more information, please see our article on this topic:

James, J., P. Gladyshev, and Y. Zhu. (2011) “Signature Based Detection of User Events for Post-Mortem Forensic Analysis”. Digital Forensics and Cyber Crime: Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering. Volume 53, pp 96-109. Springer. [PDF][arXiv:1302.2395]</div>

Image courtesy of Salvatore Vuono / FreeDigitalPhotos.net

2 min read

Automata Intersection to Test Possibility of Statements in Investigations

As seen on DigitalFIRE.

When conducting an investigation, many statements are given by witnesses and suspects. A “witness” could be considered as anything that provides information about the occurrence of an event. While a witness may traditionally be a human, a digital device, such as a computer or cell phone, could also help to provide information about an event. Once a witness provides a statement, the investigator needs to evaluate the level of trust he or she places in the validity of the statement. For example, a statement from a witness that is known to lie may be considered less trustworthy. Similarly, in the digital realm, information gathered from a device may be less trustworthy if the device has been known to be compromised by a hacker or virus.

When an investigator gets statements from witnesses, the investigator can then begin to restrict the possibilities of what events happened based on that information. For example, if a trustworthy witness says she saw a specific suspect at a specific time, and the suspect claims to have been out of the country at that time, these are conflicting statements. A witness statement may not be true for a number of reasons, but the statement may be highly probable. At a minimum, conflicting statements indicate that one or both should be investigated further to find either inculpatory or exculpatory evidence.

If an action happens that affects a computer system, observation of the affected data in the system could be used as evidence to reduce the possible states the system could have been in before its current state. Taking this further, if we create a complete model of a system, then without any restriction on the model, any state of the system could possibly be reachable.

Computer systems can be modeled as finite state automata (FSA). In this model, each state of the system is represented as a state in the FSA. The set of all states is defined as Q. Each action that alters the state of the system can be represented as a symbol in the alphabet (Σ) of the automaton. Moving from one state to another is controlled by a transition function δ where δ: Q × Σ → Q.

In the case of an investigation of a computer system, the investigator may be able to directly observe only the final state of the system. The set of final, or accepting, states is defined as F, where F ⊆ Q. The start state (q0, where q0 ∈ Q) is likely to be unobservable, and may be unknown. Because of this, any state in the model may potentially be a start state. To account for this, a generic start state g, where g ∉ Q, can be defined. g is a generic start state with a transition to each state in Q on each input leading to that particular state. The result of this process is a model of the system that allows for any possible transitions in the system that result in the observed final state from any starting state.

This FSA of the system can then be used to test statements about interactions with the system. As a very basic example, consider an FSA with only two states. The first state is the system before a program is run, when no prefetch entry has been created (!PrefetchX). The second state is after a program has been run, when a prefetch entry has been created (PrefetchX). The transition symbol is defined as “Run_Program_X”. The FSA can be visualized as:

(!PrefetchX) -> Run_Program_X -> (PrefetchX)

For the sake of this example, it is known that a prefetch entry will not be created unless a program is run, so the start state is defined as (!PrefetchX). An investigator observes that PrefetchX did exist in the final state of the system, so the final accepting state is (PrefetchX).

A suspect who normally uses the system is asked whether she executed Program X, and she claims she did not. Her statement may then also be modeled in terms of the previous FSA, where any transition is allowed except “Run_Program_X”. Her statement can be visualized as:

() -> !Run_Program_X -> ()

In this statement, she is claiming that any state and transition is possible except for “Run_Program_X”. When both the system and the suspect’s statement are modeled, the FSAs can be intersected to determine whether the final observed state of the system is reachable under the restrictions the suspect’s statement places on the model. In the given example, the only possible transition to the observed final state is Run_Program_X. If the system model were intersected with the suspect’s statement, the final state (PrefetchX) would not be reachable, because the transition that leads to it would not be possible. In this case, the suspect’s statement is inconsistent with the observed final state, and should therefore be investigated further.
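
Below is a minimal Python sketch of this reachability test, using the state and event names from the prefetch example. The “Other_Activity” event is a hypothetical stand-in for everything else a user could do, and removing the denied event from the alphabet is, for a statement of this simple form, equivalent to intersecting the two automata.

```python
from collections import deque

EVENTS = {"Run_Program_X", "Other_Activity"}

def system_delta(state, event):
    """System model: running Program X creates the prefetch entry;
    any other activity leaves the prefetch state unchanged."""
    return "PrefetchX" if event == "Run_Program_X" else state

def reachable(start, final_states, allowed_events):
    """Breadth-first search over the model restricted to the events
    a statement allows."""
    seen, queue = set(), deque([start])
    while queue:
        state = queue.popleft()
        if state in final_states:
            return True
        if state in seen:
            continue
        seen.add(state)
        for event in allowed_events:
            queue.append(system_delta(state, event))
    return False

# Unrestricted model: the observed final state is reachable.
print(reachable("!PrefetchX", {"PrefetchX"}, EVENTS))  # True

# Intersected with the statement "I did not run Program X":
print(reachable("!PrefetchX", {"PrefetchX"},
                EVENTS - {"Run_Program_X"}))  # False
```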

This very simple example can be extended to more complex situations and models; however, a challenge with using a computational approach to model real-world systems is the very large state space that results, even for relatively simple systems.

For a more in-depth explanation, please see Analysis of Evidence Using Formal Event Reconstruction.

[1] James, J., P. Gladyshev, M.T. Abdullah, Y. Zhu. (2010) “Analysis of Evidence Using Formal Event Reconstruction.” Digital Forensics and Cyber Crime 31: 85-98. [PDF][arXiv:1302.2308]

Image courtesy of Stuart Miles / FreeDigitalPhotos.net

4 min read

Korean National Police University International Cybercrime Research Center


<div class="p1">Today is the inauguration of the Korean National Police University (KNPU) International Cybercrime Research Center (ICRC). The inauguration ceremony will immediately be followed by the 1st International Cybercrime Research Seminar.</div><blockquote class="tr_bq">Grand opening: The International Cybercrime Research Center, Korean National Police University.
The opening ceremony will be hosted by the president of the Korean National Police University on September 18th, 2012. To commemorate the event, the Police orchestra will perform, and the 1st International Cybercrime Research Seminar will be held. The International Cybercrime Research Center will focus on multi-disciplinary research and support dealing with cyber-policing strategy, quality education and training. The official website for the International Cybercrime Research Center will be available soon.
For more information or to submit proposals for partnerships and collaboration, please contact: [email protected]r</blockquote><div class="p1">The 1st International Cybercrime Research Seminar will cover current trends and the future of training, education and research, child exploitation in South Korea (International Centre for Missing and Exploited Children: South Korean Chapter), and criminological aspects of digital crime.

Update:
<blockquote class="tr_bq">경찰대학(학장 치안정감 서천호)에서는 2012. 9. 18(화) 14:00 영상강의실에서 오전 1부행사로 국제 사이버범죄 연구센터 개소식에 이어 국내외 전문가들이 ‘유럽 사이버수사의 교육훈련 동향’ , ‘호주의 정보보안 침해범죄의 범죄학적 연구’, ‘인터넷 아동음란물 실태와 대응방안’ 등에 대한 제1회 국제사이버 범죄 세미나를 개최하였다.
경찰대학(학장 치안정감 서천호)에서는 2012. 9. 18(화) 14:00 영상강의실에서 오전 1부행사로 국제 사이버범죄 연구센터 개소식에 이어 국내외 전문가들이 ‘유럽 사이버수사의 교육훈련 동향’ , ‘호주의 정보보안 침해범죄의 범죄학적 연구’, ‘인터넷 아동음란물 실태와 대응방안’ 등에 대한 제1회 국제사이버 범죄 세미나를 개최하였다.</blockquote>
http://www.police.ac.kr/open/photo_news.jsp?SEQ=33307&BoardMode=view</div>

1 min read

Digital Forensic Investigation and Cloud Computing

We have a chapter in an upcoming book, Cybercrime and Cloud Forensics: Applications for Investigation Processes.

Our chapter aims to be a high-level introduction to fundamental concepts of both digital forensic investigations and cloud computing for non-experts in one or both areas. Once fundamental concepts are established, we examine cloud computing security-related questions; specifically, how past security challenges are inherited or solved by cloud computing models, as well as new security challenges that are unique to cloud environments. Next, an analysis is given of the challenges and opportunities cloud computing brings to digital forensic investigations. Finally, the Integrated Digital Investigation Process model is used as a guide to illustrate considerations and challenges during investigations involving cloud environments.


~1 min read

Digital Crime Categorization using the AFP CCPM

When attempting to gather statistics about digital crime and investigations in different countries, and sometimes even at a national level, inconsistency in the categorization of different digital crimes makes measurement and analysis difficult. In various digital crime categorization schemes, like those described in Casey’s Digital Evidence and Computer Crime, categorization is limited to the role of the technology in the crime. Crime categorization by law enforcement, however, is based on crime type. From what I have seen, digital investigators categorize the crimes they are interested in not by the role of technology in the crime, but by the type of crime committed. The digital component seems to rarely be a factor in categorization - the digital component is assumed - and practical classification instead relies on the type of crime.

The Australian Federal Police have made available their crime categorization and prioritization model (CCPM), and it is the only publicly available model that I have found. If you know of another case categorization and prioritization model that is available, either publicly or for law enforcement specifically, please leave a comment or contact us. The CCPM is a good starting point for the categorization of all types of crime.

We are currently looking at applying the AFP’s model as a template for the categorization and prioritization of crime with a digital component. Prioritization is an interesting problem that will be discussed further later. But in attempting to prioritize, it is important to have crime categories clearly and meaningfully defined. We will be working on this more over the next few days.

Update: Financial fraud category reference: http://fightfraud.nv.gov/fin_fraud_types.htm

1 min read

Infosec


What I’m Reading: A functional reference model of passive systems for tracing network traffic

What I’m Reading: Today we are talking about ‘A functional reference model of passive systems for tracing network traffic’ by Thomas E. Daniels. This paper deals with network traffic origin analysis using passive methods.

T. E. Daniels, “A functional reference model of passive systems for tracing network traffic,” Digit. Investig., vol. 1, no. 1, pp. 69–81, Feb. 2004.

Link: http://www.sciencedirect.com/science/article/pii/S1742287603000045

<div class="separator" style="clear: both; text-align: center;"><div class='embed-container'><iframe src='https://www.youtube.com/embed/z-jthlQB6sE' frameborder='0' allowfullscreen></iframe></div></div></div>
Audio only:

<iframe seamless="" src="https://bandcamp.com/EmbeddedPlayer/track=2817486589/size=small/bgcol=ffffff/linkcol=0687f5/transparent=true/" style="border: 0; height: 42px; width: 100%;">WIR:E01 - A functional reference model of passive systems for tracing network traffic by Joshua James</iframe>

~1 min read

No More Ransom - Detecting and unlocking ransomware without paying

Data is valuable. Ransomware takes advantage of the financial or sentimental value of our data, as well as the fact that most homes and organizations do not have adequate data backup solutions in place.

Once a computer is infected with ransomware, individual files are normally encrypted and users are asked to pay a ransom to unlock their data. If the victim pays, the data may or may not be unlocked. Ransomware started off like most viruses, targeting average computer users opportunistically. Ransomware groups, however, started targeting hospitals, police organizations and others.

<div class="separator" style="clear: both; text-align: center;">Use nomoreransom.org to unlock your data</div><div class="separator" style="clear: both; text-align: left;">
</div><div class="separator" style="clear: both; text-align: left;">So what can you do if you are infected with ransomware? Internet vendors and law enforcement have come together to create No More Ransom. This website gives users information about current types of ransomware, download unlocking tools (for free), provides prevention information and even has a tool to analyze your encrypted files and recommends which unlocking tool to use.</div><div class="separator" style="clear: both; text-align: left;">
</div><div class="separator" style="clear: both; text-align: left;">The Problem</div><div class="separator" style="clear: both; text-align: left;">Ransomware is possible because people do not have backups in place.</div><div class="separator" style="clear: both; text-align: left;">
</div><div class="separator" style="clear: both; text-align: left;">The Solution</div><div class="separator" style="clear: both; text-align: left;">Backups.</div><div class="separator" style="clear: both; text-align: left;">
</div><div class="separator" style="clear: both; text-align: left;">If you have an extra hard-drive that you are not using, or even another computer that is often on, CrashPlan is a pretty straightforward backup solution that is free if you save data to your own computers.</div><div class="separator" style="clear: both; text-align: left;">
</div><div class="separator" style="clear: both; text-align: left;">Note: DropBox or similar are not good backup solutions because they constantly sync changes. If ransomware infects your systems, the changes may be synced to your cloud storage. With a backup solution like CrashPlan, 1) backup is not instantaneous and 2) CrashPlan keeps track of prior versions of data. So if encrypted files were backed up, you can still restore prior versions. Best of all, CrashPlan provides end-to-end encryption (if enabled).</div><div class="separator" style="clear: both; text-align: left;">
</div><div class="separator" style="clear: both; text-align: left;">
</div><div class="separator" style="clear: both; text-align: left;">
</div><div class="separator" style="clear: both; text-align: left;">
</div><div class="separator" style="clear: both; text-align: left;">
</div>


<div class="alignleft"> </div>

1 min read

Paid Graduate Positions Available: Digital Investigations in Internet of Things

The Legal Informatics and Forensic Science (LIFS) Institute in the College of International Studies at Hallym University, South Korea, currently has openings for full-time researchers at the Masters, Ph.D. and Postdoctoral levels.

These positions deal with Internet of Things (IoT) digital forensic investigations. The following skills are necessary:
  • Programming skills (any language)
  • Ability to plan and carry out research
  • Ability to work with a team

The following skills are preferred but not required:
  • Knowledge of embedded systems
  • Embedded system programming experience
  • Computer / Network administration experience
  • Competency in Linux / Unix systems
  • Knowledge of Digital Forensic Investigation techniques (esp. acquisition)

These positions include a full scholarship as well as a monthly living stipend. Candidates should be willing to relocate to Chuncheon, South Korea.

To apply for the Master’s and Ph.D. positions, please do the following:
  1. Send an email with your CV and links to any research papers you have published to [email protected] with the subject “IoT Graduate Application”.
  2. Apply for a graduate position with Hallym University [http://bit.ly/20Fvvi4] by November 10th, 2016.
    • Download the application files [http://bit.ly/2eWHVSM]
    • Complete the basic application files
    • Mail the application files to [email protected] and CC [email protected]
    • Other documents can be provided later (such as passport info, diploma, etc.)
    • No Visa will be issued until certified copies of supporting documents are provided.

To apply for a Post-doctorate in Digital Forensic Investigation of IoT Devices, please do the following:
  1. Send an email with your CV and links to any research papers you have published to [email protected] with the subject “IoT Postgraduate Application”.
    • Candidates must have already completed a PhD degree.
1 min read

Call for Applications: Fully Funded Master’s/Ph.D. Positions in Internet of Things (IoT) Digital Investigation

The Legal Informatics and Forensic Science major in the College of International Studies at Hallym University is currently recruiting full-time researchers at the Master’s, Ph.D. and Postdoctoral levels.

These positions involve research related to Internet of Things (IoT) digital forensic investigations, so the following skills are required:
  • Programming skills (any language)
  • Ability to plan and carry out research
  • Teamwork skills

The following skills are preferred but not required:
  • Knowledge of embedded systems
  • Embedded system programming experience
  • Computer / network administration experience
  • Understanding of Linux / Unix systems
  • Knowledge of digital forensic techniques (esp. acquisition)

These positions include a full scholarship as well as a living stipend. Candidates are encouraged to relocate to Chuncheon, Gangwon Province.

To apply for the Master’s and Ph.D. positions, please do the following:
  1. Send one copy of your CV and links to any published research to [email protected] with the subject “IoT Graduate Application”.
  2. Apply to the Hallym University graduate school via the following link [http://bit.ly/20Fvvi4] by November 10th, 2016.
    • Download the application files via the following link [http://bit.ly/2eWHVSM]
    • Complete the basic application files
    • Send the completed application files to [email protected] and CC [email protected]
    • Other documents (such as passport, ID and diploma) can be submitted later.

To apply for the Postdoctoral position, please do the following:
  1. Send one copy of your CV and links to any published research to [email protected] with the subject “IoT Graduate Application”.
    • Candidates must have already completed a Ph.D. degree.

~1 min read

[CFP] CLOUDFOR extended submission deadline

CLOUDFOR 2016: Workshop on Cloud Forensics
In conjunction with the 9th IEEE/ACM International Conference on Utility and Cloud Computing (UCC), Tongji University, Shanghai, China.
6-9 December 2016

Scope and Purpose
=================
As a consequence of the sharp growth in the Cloud Computing market share, we can expect an increasing trend in illegal activities involving clouds, and the reliance on data stored in the clouds for legal proceedings. This reality poses many challenges related to digital forensic investigations, incident response and eDiscovery, calling for a rethink in traditional practices, methods and tools which have to be adapted to this new context.
This workshop aims to bring researchers and practitioners together as a multi-disciplinary forum for discussion and dissemination of ideas towards advancing the field of Cloud Forensics.

Topics of interest comprise, but are not limited to:
* Digital evidence search and seizure in the cloud
* Forensics soundness and the cloud
* Cybercrime investigation in the cloud
* Incident handling in the cloud
* eDiscovery in the cloud
* Investigative methodologies for the cloud
* Forensics readiness in the cloud
* Challenges of cloud forensics
* Legal aspects of cloud investigations
* Tools and practices in cloud forensics
* Case studies related to cloud forensics
* Forensics-as-a-Service
* Criminal profiling and reconstruction in the cloud
* Data provenance in the cloud
* Law enforcement and the cloud
* Big data implications of cloud forensics
* Economics of cloud forensics
* Current and future trends in cloud forensics
* Grid forensics

Important dates
===============
* Paper submission: 15 August 2016 (extended deadline)
* Notification of acceptance: 05 September 2016
* Camera-ready submission: 21 September 2016

Workshop chairs
===============
Virginia N. L. Franqueira
University of Derby, UK
v.franqueira[at]derby.ac.uk

Kim-Kwang Raymond Choo
University of South Australia, AU

Tim Storer
University of Glasgow, UK

Andrew Jones
University of Hertfordshire, UK

Raul H. C. Lopes
Brunel University (GriPP & CMS/CERN), UK

Program Committee
=================
George Grispos, The Irish Software Research Centre (LERO), IE
Andrew Marrington, Zayed University, AE
Kiran-Kumar Muniswamy-Reddy, Amazon Web Services, US
Joshua I. James, Hallym University, KR
Geetha Geethakumari, BITS Pilani, IN
Shams Zawoad, Visa Inc., US
Olga Angelopoulou, University of Hertfordshire, UK
Vrizlynn Thing, Institute for Infocomm Research, SG
Theodoros Spyridopoulos, University of the West of England, UK
Vassil Roussev, University of New Orleans, US
Yijun Yu, Open University, UK
Ibrahim Baggili, University of New Haven, US
Martin Schmiedecker, SBA Research, AT
Ben Martini, University of South Australia, AU
Hein S. Venter, University of Pretoria, ZA
Ruy de Queiroz, Federal University of Pernambuco, BR
Martin Herman, National Institute of Standards and Technology, US
Mark Scanlon, University College Dublin, IE

Submission
==========
Authors are invited to submit original, unpublished work which will be reviewed by three committee members. Submission should be blind, i.e., with no stated authors or self-references. Papers should comply with the IEEE format and have a maximum of 6 pages; guidelines are available at: http://www.ieee.org/conferences_events/conferences/publishing/templates.html
All accepted papers will be published in the IEEE conference proceedings – provided they are presented at the workshop.
Submission will be handled through EasyChair: https://easychair.org/conferences/?conf=cloudfor2016
2 min read

Facebook Capture the Flag Platform Now Available

Facebook’s hacking education platform and capture the flag is now available. See their release post here. Their goal is to educate people about different types of web attacks by giving access to CTF infrastructure and letting more groups run hacking competitions. From their github repository:

  • Organize a competition. This can be with as few as two participants, all the way up to several hundred. The participants can be physically present, active online, or a combination of the two.
  • Follow setup instructions below to spin up platform infrastructure.
  • Enter challenges into admin page
  • Have participants register as teams
    • If running a closed competition: in the admin page, generate and export tokens to be shared with approved teams, then point participants towards the registration page
    • If running an open competition: point participants towards the registration page
  • Enjoy!

I’m playing with it now, but it looks like it will be an amazing resource for students.

~1 min read

Honeypot Fun

At the Legal Informatics and Forensic Science Institute, we are preparing to do some research on IoT smart homes. Part of that is setting up a slightly-less-secure system. I run some honeypots on my home networks, but I was interested to see what is coming into the known University IP range.
I had an extra Raspberry Pi laying around, and decided to run the cowrie (kippo) SSH honeypot, mostly because it is very fast to set up, gives you an idea of where attacks are coming from, and also gives a list of usernames and passwords that people are trying. More on the setup of cowrie later.

After putting cowrie online, it took 28 minutes before the first connection. This is actually longer than I expected, possibly because the IP was up before but port 22 was not open.

After 12 hours, there were login attempts from the following addresses:

Login Attempts   IP Address        Country
1                146.66.163.107    Russia
3                185.103.252.14    Russia
9                195.154.58.76     France
18               159.122.123.183   Germany
40               117.102.109.18    Indonesia
41               193.201.227.200   Ukraine
91               94.79.5.102       Russia
126              193.201.227.86    Ukraine
336              202.83.25.95      India
Remember that the country doesn’t actually mean anything. These could be proxies, Tor exit nodes, hacked servers, etc.

The top usernames and passwords are not very surprising.

<table class="tg"><tbody><tr> <th class="tg-yw4l">Tries</th> <th class="tg-yw4l">Username / Password</th> </tr><tr> <td class="tg-yw4l">21</td> <td class="tg-yw4l">[root/123456]</td> </tr><tr> <td class="tg-yw4l">19</td> <td class="tg-yw4l">[root/default]</td> </tr><tr> <td class="tg-yw4l">18</td> <td class="tg-yw4l">[admin/support]</td> </tr><tr> <td class="tg-yw4l">18</td> <td class="tg-yw4l">[admin/default]</td> </tr><tr> <td class="tg-yw4l">18</td> <td class="tg-yw4l">[admin/123123]</td> </tr><tr> <td class="tg-yw4l">8</td> <td class="tg-yw4l">[root/admin]</td> </tr><tr> <td class="tg-yw4l">6</td> <td class="tg-yw4l">[admin/admin]</td> </tr><tr> <td class="tg-yw4l">5</td> <td class="tg-yw4l">[test/test]</td> </tr><tr> <td class="tg-yw4l">5</td> <td class="tg-yw4l">[support/support]</td> </tr><tr> <td class="tg-yw4l">5</td> <td class="tg-yw4l">[root/qwerty]</td> </tr></tbody></table></div>
Probably the most interesting thing is that the first attack was trying some sort of buffer overflow. Although they were connecting to SSH and sending (weird) user/pass combinations, after the connection was rejected they were sending really long strings. I suspect it is some sort of honeypot detection, or an exploit for certain versions of SSH. Not sure.

Anyway, for a 1 hour project it is easy and interesting. Definitely something that students could do in an afternoon.

1 min read

[CFP] JDFSL Special issue on Cyberharassment Investigation: Advances and Trends

JDFSL Special issue on Cyberharassment Investigation: Advances and Trends.

Anecdotal evidence indicates that cyber harassment is becoming more prevalent as the use of social media becomes increasingly widespread, making geography and physical proximity irrelevant. Cyberharassment can take different forms (e.g., cyberbullying, cyberstalking, cybertrolling), and be motivated by the objectives of inflicting distress, exercising control, impersonation, and defamation. Investigation of these behaviours is particularly challenging because it involves digital evidence distributed across the digital devices of both alleged offenders and victims, as well as online service providers, sometimes over an extended period of time. As a result, little is currently known about the modus operandi of offenders.

This special issue invites original contributions from researchers and practitioners which focus on the state-of-the-art and state-of-the-practice of digital forensic investigation of cyberharassment of all kinds.  We particularly encourage multidisciplinary contributions that can help examiners to be more effective and efficient in cyberharassment investigations.
Topics of interest include, but are not limited to:
-Offender psychology and profiling
-Cyberharassment victimology
-Methodologies and process models specific to cyberharassment investigation
-Tools and techniques for dealing with the types of digital evidence encountered in cyberharassment investigation
-Cyberharassment indicators
-Challenges and particularities of different modalities of cyberharassment
-Trends and typologies of cyberharassment

Important dates:
-Paper Submission:                 1 June 2016
-Notification of Initial Decision: 30 June 2016
-Revision due:                     31 July 2016
-Notification of Final Decision:   31 August 2016
-Final Manuscript Due:             30 September 2016
-Publication Date:                 31 October 2016

Author instructions:
Submissions must be blind and original (i.e., must not have been published or be under review by any other publisher). Authors should refer to the following link for instructions: http://www.jdfsl.org/for-authors. The option “Cyberharassment Special Issue” must be selected as the article type in the JDFSL OJS Submission System. Further queries can be directed to the guest editors.

Guest Editors:
Dr Joanne Bryce
School of Psychology
University of Central Lancashire

Dr Virginia Franqueira
College of Engineering and Technology
University of Derby

Dr Andrew Marrington
College of Technological Innovation
Zayed University

About JDFSL:


<div style="font-family: -webkit-standard;">The Journal of Digital Forensics, Security and Law (JDFSL) is a peer-reviewed, multidisciplinary journal focussing on the advancement of the cyber forensics field through the publication of both basic and applied research. JDFSL is a no-fee open access publication, indexed in EBSCOhost, ProQuest, DOAJ, DBLP, arXiv, OAJI, ISI Web of Science, Google Scholar, and other databases. JDFSL is published by the Association of Digital Forensics, Security and Law.</div>

1 min read

ICDF2C 2015 in Seoul, South Korea Final Program Now Available

The 7th EAI International Conference on Digital Forensics & Cyber Crime will be held October 6–8, 2015 in Seoul, South Korea.

The final program is now available at http://d-forensics.org/2015/show/program-final
Be sure to register so you don’t miss the exciting talks and tutorials!

Keynote speakers include Max Goncharov from Trend Micro, Inc, and Dr. Dave Dampier from Mississippi State University:

<div class="separator" style="clear: both; text-align: center;"></div>Max Goncharov is a senior security Virus Analyst with Trend Micro Inc., and is responsible for cybercrime investigations, security consulting to business partners (internal, external), creation of security frameworks, designing technical security architecture, overseeing the build out of an enterprise incident response process, and creation of the enterprise risk management program. During his 15 years with Trend Micro Inc, he has participated as a speaker in various conferences and training seminars on the topic of cybercrime and related issues. He has especially focues on cyberterrorism, cybersecurity, underground economy; such as DeepSec, VB, APWG, etc.


Dr. Dave Dampier is a Professor of Computer Science & Engineering at Mississippi State University specializing in Digital Forensics and Information Security. He currently serves as Director of the Distributed Analytics and Security Institute, the university level research center charged with Cyber Security Research. In his current capacity, Dr. Dampier is the university lead for education and research in cyber security. Prior to joining MSU, Dr. Dampier spent 20 years active duty as an Army Automation Officer. He has a B.S. Degree in Mathematics from the University of Texas at El Paso, and M.S. and Ph.D. degrees in Computer Science from the Naval Postgraduate School. His research interests are in Cyber Security, Digital Forensics and Software Engineering.


There will also be three tutorials on investigation, open source hardware for digital investigations and setting up a research environment for mobile malware research:

  • Tutorial 1: DUZON – Desktop Exercise: Crafting Information from Data
  • Tutorial 2: Pavel Gladyshev – FIREBrick: an open forensic device
  • Tutorial 3: Nikolay Akatyev – Researching mobile malware

After the first day of the conference we are also holding a special discussion session with Seoul Tech Society called “Safe Cyberspace”, with the panel consisting of the winners of the ICDF2C/STS essay contest. Everyone is welcome to join!

I hope to see you at ICDF2C in Seoul, South Korea! Don’t miss this exciting opportunity.

1 min read

ICDF2C Revised Draft Program Released

7th International Conference on Digital Forensics and Cyber Crime (ICDF2C) updated program is now available here: http://bit.ly/1LsJpvM

<div class="separator" style="clear: both; text-align: center;"></div>
The conference will be held in Seoul, South Korea from October 6 - 8, 2015. You can register for the conference here: http://d-forensics.org/2015/show/registration

We offer discounts for Law Enforcement and Students.

We are also working with Seoul Tech Society to run an information security essay contest and panel discussion. For more information, please see the call for essays.

~1 min read

ICDF2C and SeoulTechSoc Call for Essays on Information Security

ICDF2C and Seoul Tech Society Essay Contest

Have you ever surfed the Dark Web? Are you worried about the security of your virtual property? Technology is changing, and for every good side, there is a dark side. With these new technologies, how can the public protect themselves? Should the public rely on their government, or take security into their own hands? Let us know what you think with the ICDF2C and Seoul Tech Society Cyber Crime Essay Contest.

<div class="separator" style="clear: both; text-align: center;"></div>

This year ICDF2C has two focus areas:

  • Usage, implications and investigation of the “Dark Web”
  • Preventing or investigating crimes using cryptocurrencies
Although these topics are recommended, essays are not limited to these topics. For the full list of conference topics, please see http://d-forensics.org/2015/show/cf-papers

Submission Instructions

  • Submissions should be in English
  • Submissions should be no longer than 3 pages (with references)
  • Submissions must be submitted as a PDF
Please send a PDF of your essay to Joshua at cybercrimetech.com

Important Dates

  • Submission Deadline: September 21, 2015 (any time zone)
  • Notification: October 1, 2015
  • ICDF2C/SeoulTech Discussion Session: October 6, 2015, 18:00 – 19:30
Rewards

  • The top 5 essays will present their ideas at the ICDF2C/SeoulTech Discussion Session
  • Selected essays will be published in the discussion session proceedings, and made available on the Seoul Tech Society web page

See d-forensics.org for more information.

~1 min read

Ashley Madison Data and Ethical Use

On August 19th, the Impact Team released data of millions of alleged Ashley Madison users. Ashley Madison is a type of social networking website that promotes extra-marital affairs. After the main user data, the website’s source code and emails from the founder were also released.

The data was initially released on the Dark Web, but has since been added to many clear web sites.

<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody><tr><td style="text-align: center;">Impact Teams .onion site on Tor where the data can be downloaded</td></tr><tr><td class="tr-caption" style="text-align: center;">Impact Team’s .onion site</td></tr></tbody></table>
The data contains information about users’ names, email addresses, locations, addresses, credit card numbers, credit card transactions, sexual preferences, and much, much more.

If you are thinking about looking up your friends and neighbors, think about the following first:
You can’t trust most versions of the data

Many people are interested in this data. Hackers and criminals know that it will be very popular, so they will add viruses and other malware to the data. It is also possible that copied versions had records added specifically to frame people. If you are going to use any version, make sure it came from Impact Team.

You can’t trust websites that let you search the data

Even before the data was released, some websites were created to be able to search the data if and when it was released. Some of these websites are created by trusted security researchers, some are created by hackers, and some are created by people who just want to make money off of the situation. The result is that you should only use trusted websites when evaluating data like this. Other sites may have malware, and some sites may collect any email addresses, names and phone numbers that you enter to “check”, and resell that information to advertising companies. Be careful with websites you don’t know.

The original data could have been fake or tampered with

Data directly from Impact Team is the ‘most reliable’ version that we will get. However, this does not mean that it has not been tampered with. They may have added or modified entries.
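As far as checking where a copy came from, releases like this are often accompanied by a detached PGP signature (Impact Team reportedly signed theirs). A minimal sketch of verifying such a signature with the standard gpg CLI from Python, using hypothetical file names, and assuming the signer’s public key has already been imported into your keyring:

    import subprocess

    # Hypothetical file names: a detached signature (.asc) and the archive
    # it covers. The signer's public key must already be in the keyring.
    result = subprocess.run(
        ["gpg", "--verify", "dump.7z.asc", "dump.7z"],
        capture_output=True,
        text=True,
    )

    # gpg exits with 0 only for a good signature; details go to stderr.
    if result.returncode == 0:
        print("Good signature:\n" + result.stderr)
    else:
        print("Verification FAILED:\n" + result.stderr)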
Further, some accounts that exist in the system are likely to be fake anyway. The only accounts we can be reasonably sure of are those attached to credit card transactions, and even those may have been created with a stolen card.

Think about what you are doing

With data like this, there are a lot of things we can learn. I have a copy of the data, and I did not look up my friends or co-workers. Why? Because I don’t care. Many websites are using the data to find who is cheating on whom. That question is not interesting. What is interesting is, for example, why people are cheating. We might even ask: is cheating a bad thing? For 39 million people, apparently it isn’t. Other interesting questions include how to prevent an attack like this in the future, what the most common passwords are, etc.
While the data is useful for information security to learn from its mistakes, making the data easily accessible for the sake of gossip is not useful, and could potentially cause mental and physical harm. Consider this ‘help’ that a woman received from radio talk show hosts. As soon as the woman found out on air that her husband was cheating, the host admitted he felt like a jerk.
</div><iframe frameborder="0" height="573" id="molvideoplayer" scrolling="no" src="http://www.dailymail.co.uk/embed/video/1207589.html" title="MailOnline Embed Player" width="698"></iframe>

I completely agree with the approach from the people at haveibeenpwned.com, who explain in their blog post that it is not the job of security researchers to out people. It is our job to protect people.

Every time there is a data leak, the information is used for all sorts of scams, and criminals are already using the AM data. The people involved in this breach could have their entire lives destroyed by the release of all of their information. Some people will say that they deserve it for being on such a site. That’s a matter of opinion. But as security researchers, if we don’t look for ways to use (and release) data responsibly, we may be hurting people to find the ‘juicy bits’ rather than improving security, privacy and freedom for everyone.

3 min read

Conferences 2012

Webinar: Pitfalls of Interpreting Forensic Artifacts in the Windows Registry


Forensic Focus Webinar concerning analysis of the Windows Registry, from UCD’s very own Jacky Fox, titled: Pitfalls of Interpreting Forensic Artifacts in the Windows Registry. From ForensicFocus.com:
<blockquote>In our next webinar, Jacky Fox, student at UCD School of Computer Science and Informatics, presents the results of her dissertation on Windows Registry reporting - focusing on automating correlation and interpretation. After the webinar Jacky will be available in the Forensic Focus webinars forum to answer any questions. </blockquote><blockquote>Date: Thursday, November 1st 2012
Time: 12PM (midday) EDT US / 4PM GMT UK / 5PM CET Europe
Duration: 20 mins </blockquote><blockquote>There is no need to register for this webinar, simply visit http://www.forensicfocus.com/webinars at the above time (the webinar has been pre-recorded and will be archived for viewing later if you are unable to attend).</blockquote>I’ve worked with Jacky, and know her thesis well. It should be very interesting for anyone who works with the Windows Registry.

~1 min read

InfoSecurity Russia 2012


Last week, Pavel and I gave an invited talk at InfoSecurity Russia 2012. From Digital FIRE:
<blockquote class="tr_bq">Our talk explored the issues of digital forensics in the cloud environment. The first part of the talk introduced the concepts of cyber crime investigations and the challenges faced by the digital forensic practitioners. The second part of the talk explored investigative difficulties posed by cloud computing. A possible approach to dealing with some of these difficulties based on I-STRIDE methodology was then outlined.</blockquote><div class="separator" style="clear: both; text-align: center;"></div>
Discussed security challenges with Cloud environments are further elaborated on in our chapter “Digital Forensics and Cloud Computing” that can be found in Cybercrime and Cloud Forensics: Applications for Investigation Processes. Some investigation challenges were introduced based on the work of our friends at CloudForensicsResearch.org, with a few of my own thoughts added. Finally, a very quick overview of the Investigation STRIDE (I-STRIDE) model was given to attempt to help investigators and first responders identify potential sources of evidence, their jurisdiction, and other factors that may effect the admissibility of evidence extracted from a Cloud Service Provider.

http://infosecurityrussia.ru/speakers/
Image from Koraxdc

~1 min read


CFP: OMICS Group International Conference on Forensic Research and Technology

Open call for abstracts (which will receive a DOI) for the International Conference on Forensic Research and Technology. It looks like an interesting mix of all forensic science topics, including cyber.

October 15 - 17, 2012
Chicago-Northshore, U.S.A.

“Forensic Research-2012, a remarkable event that is bound to set new standards in research. This conference would witness new breakthroughs and new strategies in investigation followed by different investigational agencies both government and crime investigation bodies. Forensic Research-2012 is an apt platform for investigators to collaborate and share new ideas, innovations & strategies in fight against crime and terrorism. The scientific program paves a way to gather visionaries through the research talks and presentations and put forward many thought provoking strategies in Forensic Research & Technology.

Modern era of network and cyber evolution has emerged with a great outcome of profits as well as destructive phases which influence the economical barriers of the nations. Hence the field of digital forensics and cyber crime investigation has evolved as an important subject of interest for national security and to develop the security systems. Forensics is a multidisciplinary area that monitors and encompasses law, investigation for network hacking, cyber crimes, finance, telecommunications, digital data recovery, and data reconstruction.

This conference on Forensic Research brings together forensic medicine professionals and forensic scientists from diverse fields providing opportunities for business and intellectual engagement among attendees.”

Conference Highlights the following topics:
• Role of Psychology in Forensic Science
• Nuclear Forensics
• Forensic Medicine
• Forensic & Investigative Sciences
• Forensic Toxicology
• Cyber Investigation
• Forensic Research & Medical Ethics
• Forensic Analysis and Tools
• Environmental Forensic Tools
• Evolving Fields in Forensic Science
• Modern Developments in Forensic Science
• Forensic Evaluation

1 min read

LawTech Europe Congress 2012

LawTech Europe Congress 2012

12 November, 2012
Prague, Czech Republic

“Over the past few years there have been huge advances in Electronic Evidence support and guidelines for civil litigation in America and Western Europe. These advances have not been adequately mirrored in Central and Eastern Europe. As a result, multi-jurisdictional disputes have become more drawn out and complex in nature. In criminal proceedings, the lack of clear guidelines for the collection and processing of electronic evidence has led to low crime detection rates and ineffective criminal prosecutions. What this annual congress aims to achieve is to address this imbalance by bringing together the brightest minds in technology, law, governance, and compliance.”

~1 min read

International Workshop on Digital Forensics in the Cloud 2012 (IWDFC 2012)

Unfortunately too late to submit a paper this year, but the conference may be interesting nonetheless!

From International Workshop on Digital Forensics in the Cloud (IWDFC 2012):

“The 2012 workshop will comprise presentations from a cloud forensic expert from industry, and a representative from one of the popular cloud platform developers (e.g. a Nimbula director). A lab session will also be carried out on setting up and managing a cloud infrastructure.”


<div class="MsoNormal" style="background-color: white; color: #434343; font-family: arial, helvetica; font-size: 12.222222328186035px;">The work will run in conjunction with the 2012 ISSA conference ( http://www.infosecsa.co.za/) and will take place on the third day (17th August 2012).</div><div class="MsoNormal" style="background-color: white; color: #434343; font-family: arial, helvetica; font-size: 12.222222328186035px;">
</div><div class="MsoNormal" style="background-color: white; color: #434343; font-family: arial, helvetica; font-size: 12.222222328186035px;">Submissions are welcome for people who wish to present at the workshop.</div><div class="MsoNormal" style="background-color: white; color: #434343; font-family: arial, helvetica; font-size: 12.222222328186035px;">
</div><div class="MsoNormal" style="background-color: white; color: #434343; font-family: arial, helvetica; font-size: 12.222222328186035px;">Topics of interest must be based on the cloud and are but not limited to:</div><div class="MsoNormal" style="background-color: white; color: #434343; font-family: arial, helvetica; font-size: 12.222222328186035px;">
</div><div class="MsoListParagraphCxSpFirst" style="background-color: white; color: #434343; font-family: arial, helvetica; font-size: 12.222222328186035px; text-indent: -0.25in;">·         Cloud security</div><div class="MsoListParagraphCxSpMiddle" style="background-color: white; color: #434343; font-family: arial, helvetica; font-size: 12.222222328186035px; text-indent: -0.25in;">·         Digital forensic tools</div><div class="MsoListParagraphCxSpMiddle" style="background-color: white; color: #434343; font-family: arial, helvetica; font-size: 12.222222328186035px; text-indent: -0.25in;">·         Incident response</div><div class="MsoListParagraphCxSpMiddle" style="background-color: white; color: #434343; font-family: arial, helvetica; font-size: 12.222222328186035px; text-indent: -0.25in;">·         Data visualization</div><div class="MsoListParagraphCxSpMiddle" style="background-color: white; color: #434343; font-family: arial, helvetica; font-size: 12.222222328186035px; text-indent: -0.25in;">·         Digital evidence identification</div><div class="MsoListParagraphCxSpMiddle" style="background-color: white; color: #434343; font-family: arial, helvetica; font-size: 12.222222328186035px; text-indent: -0.25in;">·         Digital forensic process</div><div class="MsoListParagraphCxSpLast" style="background-color: white; color: #434343; font-family: arial, helvetica; font-size: 12.222222328186035px; text-indent: -0.25in;">·         Privacy</div><div class="MsoNormal" style="background-color: white; color: #434343; font-family: arial, helvetica; font-size: 12.222222328186035px;">
</div><div class="MsoNormal" style="background-color: white; color: #434343; font-family: arial, helvetica; font-size: 12.222222328186035px;">Important Dates:</div><div class="MsoNormal" style="background-color: white; color: #434343; font-family: arial, helvetica; font-size: 12.222222328186035px;">Paper Submission: 20 April 2012</div><div class="MsoNormal" style="background-color: white; color: #434343; font-family: arial, helvetica; font-size: 12.222222328186035px;">Author notification: 29 June 2012</div>

~1 min read

Webinar: Industrial Espionage, Weaponized Malware, and State-Sponsored Cyber Attacks: How to Identify, Counter, and React

Webinar: Industrial Espionage, Weaponized Malware, and State-Sponsored Cyber Attacks: How to Identify, Counter, and React

Date: 31 July, 2012
Time: 15:00 PT (22:00 GMT, 23:00 IST)
Duration: 60 min.
Price: Free

From Info Security:
The information security industry has been issuing warnings of an increase in sophisticated state-sponsored cyber attacks in the wake of Flame.

Neither the U.S. nor Israel has denied their role in the use of recently discovered weaponized malware. Add into the mix the fact that India recently announced the empowerment of its government agencies to carry out state-sponsored cyberattacks, and suddenly political – and thus industrial – espionage has never been more of a threat. Join Infosecurity Magazine in a free one-hour webinar to take a detailed look at how weaponized malware is getting on to crucial systems and how you can protect against it.

You will learn:
  1. How state-sponsored cyber attacks are succeeding
  2. How weaponized malware is getting on to crucial systems
  3. How you can protect against these attacks
  4. How to recognize and identify these targeted attacks
  5. How to counter weaponized malware

~1 min read

CFP: IRISSCERT Cyber Crime Conference

The IRISSCERT Cyber Crime Conference will be held November 22, 2012 in Dublin, Ireland. More information can be found here.

They are currently running a call for papers on the topics below. The audience is the business community within Ireland.

Submission deadline: July 20, 2012 17:00 GMT.
  • Cyber Crime
  • Cyber Security
  • Cloud Security
  • Incident Response
  • Data Protection
  • Incident Investigation
  • Information Security
  • Threats Information
  • Security Trends
  • Securing the Critical Network Infrastructure

Technical Streams:
  • Security Tools
  • Application Security
  • Network Security
  • Cloud Security
  • Database Security
  • Electronic Device Security
  • Computer Forensics

~1 min read

Webinar: Finding Evidence in an Online World - Trends & Challenges in Digital Forensics

[Edit] A recording of the webinar can be found here: http://www.forensicfocus.com/DF_Multimedia/page=watch/id=79/d=1/

Reposted from: http://www.forensicfocus.com/News/article/sid=1898/

Learn about the methods and techniques used to recover Internet-related evidence left behind on hard drives and RAM by registering for a free Forensic Focus webinar delivered by Jad Saliba of JADsoftware, developers of Internet Evidence Finder (IEF). Jad will discuss a wide range of potential evidence sources including cloud, social networking, chat, web history, P2P, and webmail artifacts.

Date: Tuesday, July 17 2012
Time: 11AM EDT US / 4PM BST UK / 15:00 GMT
Duration: 30 mins

All attendees will receive 10% off the purchase price of IEF until August 31st.

Register now at http://forensicfocus.enterthemeeting.com/m/4BGB7KYU

~1 min read

ICTTF - Cyber Threat Summit 2012

The ICTTF Cyber Threat Summit will be held in Dublin on September 20-21, 2012. Have a look at this year’s agenda. You can get a 10% registration discount if you use the code: SPGNSPXV.


For those of you who will not be able to attend, there is a free web broadcast that will be offered. Register for login details.

More information can be found at cyberthreatsummit.com.

From Cyber Threat Summit.com:
<blockquote>Following last year’s hugely successful event, ICTTF are proud to announce an enhanced event this year over two days.
The event will be a conference and exhibition. The syllabus will be delivered by over 20 of the world’s leading cyber security experts with a specific European perspective.
A cross industry master class developed for senior executives to understand the developments, strategies and best practice in cyber security. 
Learn to understand the types of threats, potential impacts, motivational factors and trends to observe in cyber security. 
Review best practice in protecting your organisation and develop appropriate cyber defence strategies. 
Network with your industry peers and discuss the latest cyber security technologies in the marketplace.</blockquote>

~1 min read

International Symposium on Cybercrime Response (ISCR) 2012

I’m just back from the 1st INTERPOL NCRP Cybercrime Training Workshop and International Symposium on Cybercrime Response 2012, held in Seoul, South Korea. The joint INTERPOL and Korea National Police (KNP) conference was hosted by the KNP Cyber Terror Response Center (CTRC).

ISCR 2012 Agenda

<div class="separator" style="clear: both; text-align: center;"></div>The first day was a look at Law Enforcement (LE) communication networks, including INTERPOL, the G8 24/7 High Tech Crime Network1, and even more informal communication channels. The overall consensus seems to be that the more formal networks are too slow to deal with the requirements of international cybercrime investigation requests. This appears to be partially a limitation with the efficiency of the networks as well as the ability of receiving countries to process the requests either because of resource issues or laws (or lack of) in the requested country to deal with the investigation request.

It was determined that informal channels of LE communication are currently more effective since they bypass international bureaucracy. These channels appeared to be created mostly by networking (conferences, etc.), and luck.

There essentially seemed to be three camps: Formal communication networks like INTERPOL and G8 24/7, less formal networks created via bilateral agreements, and LE social networks (p2p). Each camp had success stories, and I know each has had failures.

The question is, how can the situation be improved? Criminal communication networks at an international level work much more efficiently than law enforcement networks. There are many reasons why, but what can be done?

The issue of trust in LE communication was brought up, where if you are requesting information or cooperation the person with whom you are communicating should be more than just a name on a list. This is an interesting point to me. If LE is given a list of contact points per country from a formal communication network, do they question the contact point? I think they would automatically trust the contact point via the reputation of the network referring them, even without meeting the contact personally. The issue comes when these contacts are slow or fail to respond to requests from the network. Trust, then, comes from showing you are reliable when something is requested, whether or not you physically meet the contact representative.

Another interesting point was the concept of “exercising” your team(s) in international request response. LE basically creates an incident response (IR) plan for international requests. Incident response is a huge topic in network security. If you read this article, for example, it is geared (at a high level) towards setting up an incident response plan. Each of the tips, however, could be directly transposed into international LE response. The discussed point of exercising your team would be the final testing requirement. Unfortunately, this is the phase that is often neglected, usually due to time and resources. In the case of LE, especially at an international level, it would be difficult to coordinate and perhaps even justify the time needed just for testing communication when it was not really requested.

The topic of international LE communication came down to looking at a few different questions (and I added a few): What exactly is the problem, and has a solution been identified? What type of information is needed? Who has legal authority? Have international procedures been established? Are all concerned bodies part of the procedure and willing to cooperate? How do we test the procedure? How do we measure success? Who is responsible for updates?

These questions are not exactly easy to answer, even within a single organization; working with multiple organizations in multiple jurisdictions to find answers is even more difficult and time consuming. In my opinion, this is where the providers of formal networks should be filling in the gaps. I should not expect my local investigators to create their own international networks, and unless this process is centralized, different procedures will be created, incomplete networks will be formed, and there will be much duplication of effort.

The rest of the conference further discussed communication and law, examined current threats, and presented case studies (success stories) of international communication and collaboration between law enforcement, the private sector and, sometimes, academia.

Overall, the conference was directed at practitioners. It did not get very technical or theoretical, and could probably be understood by anyone regardless of their familiarity with cybercrime. Some cybercrime damage estimates were given, although how to measure damage accurately is a problem that was not addressed. The estimates looked impressively dramatic, but the statistics from different presentations did not seem to relate to each other well.

Similarly, definitions for the same terminology differed from presentation to presentation. The group was composed of practitioners from many different countries, but the lack of consistency in the use (and scope) of terms was an obvious communication problem, even for terms as general as “cybercrime”. Sometimes nonstandard term usage made it difficult for me to know exactly what a speaker really meant. This made me realize that, even within the same area of cybercrime investigation, we are speaking different languages. How do we expect to communicate at a practical level when it is so difficult to express our needs in a way that everyone in the field can understand?

Many case studies given by law enforcement dealt with international communication, but beyond “we need more / better communication” I did not see any actionable solution proposed other than ad-hoc cooperation. Even after these great case studies and the information from the private sector, I was still left wondering: where do we start?

Overall, I found the conference interesting. Topics were mostly about communication, but, unfortunately, few actionable items were discussed. The case studies are useful for understanding the problems and potential solutions, and some slightly more technical presentations outlined how technology could be used to help law enforcement's current situation in dealing with cybercrime. The (potentially) most useful benefit of the conference, however, was the contacts made. There was not enough time to talk to everyone as much as I would have liked, but there appears to be potential in the group to help drive effective law enforcement communication on a global scale.



1. The G8 24/7 High Tech Crime Network (HTCN) is an informal network that provides around-the-clock, high-tech expert contact points (source: IT Law Wiki).

5 min read

ICDF2C 2012



The 4th International Conference on Digital Forensics and Cyber Crime (ICDF2C), hosted at Purdue University, will be held from October 24-26, 2012.

Website: http://d-forensics.org

Paper submission is due the 1st of June 2012.

[Update]
Submission deadline: 6th July 2012
<div class="p1">Notification of Acceptance: 1st August 2012</div><div class="p1">Camera Ready: 1st September 2012</div><div class="p1">Conference Date: 24th and 26th October 2012 </div>

The following topics highlight the conference’s theme (from the conference page):
  • Business Applications of Digital Forensics
    • e-Discovery
    • Civil Litigation Support
    • Incident Response
    • Cyber Crime Investigations
    • Online Fraud
    • Money Laundering
    • Hacking
    • Malware & Botnets
    • Sexual Abuse of Children on Internet
    • Software & Media Piracy
  • Digital Forensics Techniques and Tools
  • Digital Forensics Process & Procedures
  • Cybercrime Investigation Management
  • Theoretical Foundations of Digital Forensics
  • Digital Forensics & Law
  • Mobile / Handheld Device & Multimedia Forensics
  • Digital Forensics Standardization & Accreditation
  • Cyber Criminal Psychology and Profiling
  • Cyber Culture & Cyber Terrorism
  • Information Warfare & Critical Infrastructure Protection

~1 min read

DFRWS 2009 - Montreal

Our group in the Centre for Cybercrime Investigation (CCI) gave a presentation at the Digital Forensic Research Workshop (DFRWS) 2009. The submitted paper can be found here. Another paper, from Damir Kahvedzic (also from CCI), was accepted as well. Bam.

Currently it is day 2 of the conference, just before the “Forensic Rodeo”. I don’t really know what to expect. The presentations and keynotes so far have been high quality. They have given me lots of ideas to apply to my own research, so I guess that’s the point, eh?

I think some of the concepts that were talked about can be applied (in some shape or form) to the REAPER project, but overall the focus of the community (represented via DFRWS) seems to be on distributed forensic systems and more intelligent ways to represent data. Some automation was talked about, but not as much as I expected. There was also a closed-source tool similar to OCFA, but I can’t find the project page right now. More on that later.

~1 min read

CFP 2015

ICDF2C and SeoulTechSoc Call for Essays on Information Security

ICDF2C and Seoul Tech Society Essay Contest

Have you ever surfed the Dark Web? Are you worried about the security of your virtual property? Technology is changing, and for every good side, there is a dark side. With these new technologies, how can the public protect themselves? Should the public rely on their government, or take security into their own hands? Let us know what you think with the ICDF2C and Seoul Tech Society Cyber Crime Essay Contest.

<div class="separator" style="clear: both; text-align: center;"></div>

This year ICDF2C has two focus areas:

  • Usage, implications and investigation of the “Dark Web”
  • Preventing or investigating crimes using cryptocurrencies

Although these topics are recommended, essays are not limited to these topics. For the full list of conference topics, please see http://d-forensics.org/2015/show/cf-papers

Submission Instructions

  • Submissions should be in English
  • Submissions should be no longer than 3 pages (with references)
  • Submissions must be submitted as a PDF
Please send a PDF of your essay to Joshua at cybercrimetech.com

Important Dates

  • Submission Deadline: September 21, 2015 (any time zone)
  • Notification: October 1, 2015
  • ICDF2C/SeoulTech Discussion Session: October 6, 2015, 18:00–19:30
Rewards

  • The authors of the top 5 essays will present their ideas at the ICDF2C/SeoulTech Discussion Session
  • Selected essays will be published in the discussion session proceedings and made available on the Seoul Tech Society web page

See d-forensics.org for more information.

~1 min read

[CFP] DFRWS EU 2016

The DFRWS EU 2016 conference will be held in Lausanne, Switzerland from March 30th to April 1st, 2016.
<div class="separator" style="clear: both; text-align: center;"></div>
http://www.dfrws.org/2016eu

The DFRWS is dedicated to the advancement of digital forensics research through the open sharing of knowledge and ideas. Since organizing the first open workshop in 2001, the DFRWS has continued to bring leading researchers, developers, practitioners, and educators from around the world together in an informal, collaborative environment. DFRWS conferences publicize and discuss high-quality research outcomes selected through a thorough peer-review process.

The DFRWS EU 2016 conference extends the 15-year tradition of research conferences organized by DFRWS.org, including the DFRWS US 2015 conference held August 9–13, 2015 in Philadelphia. Information on the US program and how to register can be found at http://dfrws.org/2015

The continued expansion of DFRWS EU conferences is intended to provide a focal point for the European digital forensics community, allowing participants to meet and exchange ideas without the need for transatlantic travel. The proceedings of DFRWS EU 2016 will be published in a special issue of Elsevier’s Digital Investigation journal, and will be freely available on the DFRWS website.

NOTE: Immediately before the conference, on March 29, rooms are available at the venue to be booked by research consortia to have meetings. If you are interested in reserving one of these meeting rooms, please contact us at eu-sponsorship <at> dfrws <dot> org.
<h3>Possibilities to contribute</h3>In recent years, DFRWS conferences have added practitioner presentations and hands-on tutorials/workshops taught by leading experts in the field. Presentations are opportunities for industry researchers and practitioners who do not have the time to write a paper, but who have forensics information and experiences that would be of interest to DFRWS attendees. Presentation proposals undergo a light reviewing process to filter out sales pitches and ensure the topic is relevant to our audience.

We invite original contributions as research papers, presentation proposals, panel proposals, tutorial/workshop proposals, and demo or poster proposals on the following topics:
<h3>TOPICS OF INTEREST:</h3>
  • “Big data” approaches to forensics, including data collection, data mining, and large scale visualization
  • Addressing forensic challenges of systems-on-a-chip
  • Anti-forensics and anti-anti-forensics
  • Bridging the gap between analog and digital traces/evidence/investigators
  • Case studies and trend reports
  • Data hiding and discovery
  • Data recovery and reconstruction
  • Database forensics
  • Digital evidence and the law
  • Digital evidence storage and preservation
  • Event reconstruction methods and tools
  • Impact of digital forensics on forensic science
  • Incident response and live analysis
  • Interpersonal communications and social network analysis
  • Malware and targeted attacks: analysis and attribution
  • Memory analysis and snapshot acquisition
  • Mobile and embedded device forensics
  • Multimedia analysis
  • Network and distributed system forensics
  • Non-traditional forensic scenarios and approaches (e.g. vehicles, control systems, and SCADA)
  • Storage forensics, including file system and Flash
  • Tool testing and development
  • Triage, prioritization, automation: efficiently processing large amounts of data in digital forensics
  • Typology of digital traces
  • Virtualized environment forensics, with specific attention to the cloud and virtual machine introspection
The above list is only suggestive. We welcome new, original ideas from people in academia, industry, government, and law enforcement who are interested in sharing their results, knowledge, and experience. Authors are encouraged to demonstrate the applicability of their work to practical issues.  Questions about submission topics can be sent via email to: eu-papers <at> dfrws <dot> org

IMPORTANT DATES - Please note that all deadlines are firm.

  • Papers & Presentation/Panel Proposals: October 5, 2015
  • Author Notification: December 14, 2015
  • Final Draft Papers Due and Presenter Registration (*): January 25, 2016
  • Workshop/Tutorial Submission Deadline: October 26, 2015
  • Demo & Poster Proposals: January 18, 2016
  • Conference Dates: March 30 – April 1, 2016

(*) Papers for which no author has registered by this date may be dropped from the program.
The FULL CFP with details about submissions is located at: http://dfrws.org/2016eu/cfp.shtml
<h3>SUBMISSIONS</h3>Research papers and presentation proposals must be submitted through the EasyChair site at http://www.easychair.org/conferences/?conf=dfrws2016eu.
Submissions must be in Adobe Acrobat PDF format. Send any questions about research paper / presentation proposal submissions to: eu-papers (at) dfrws (dot) org.

Panel proposals must be emailed to eu-panels (at) dfrws (dot) org in PDF or plain text format.
Demo proposals must be emailed to eu-demos (at) dfrws (dot) org in PDF or plain text format.

To submit a tutorial/workshop proposal please visit the Call for Workshop Proposals page at http://dfrws.org/2016eu/cfw.shtml.
<h3>STUDENT AWARD and STUDENT SCHOLARSHIP PROGRAM</h3>DFRWS continues its outreach to students studying digital forensics. This year DFRWS will be offering an award with a cash prize for the best student paper. A student paper is any paper in which the majority of the work was performed, and the paper written, by full-time students at an accredited university, college, or high school.

A limited number of scholarships may be awarded to students presenting a paper at the conference. The intent is to help alleviate the financial burden due to the cost of hotel expenses and conference registration. For more information, see the DFRWS EU 2016 homepage at: http://www.dfrws.org/2016eu

3 min read

[CFP] SADFE-2015



Call for Papers
SADFE-2015
Tenth International Conference on
Systematic Approaches to Digital Forensic Engineering
September 30 – October 2, 2015, Malaga, Spain


Important dates

  • Paper Due Date: June 24, 2015 (anywhere in the world)
  • Acceptance Notification Date: August 15, 2015
  • Final Paper: September 1, 2015
  • Conference: September 30 – October 2, 2015
Invitation to submit
We invite you to SADFE-2015, the tenth international conference on Systematic Approaches to Digital Forensic Engineering to be held in Malaga, Spain, September 30 – October 2, 2015.

Digital forensics engineering and the curation of digital collections in cultural institutions face pressing and overlapping challenges related to provenance, chain of custody, authenticity, integrity, and identity. The generation, analysis and sustainability of digital evidence require innovative methods, systems and practices, grounded in solid research and an understanding of user needs. The term “digital forensic readiness” describes systems that are built to satisfy the need for secure digital evidence.

SADFE-2015 investigates requirements for digital forensic readiness and methods, technologies, and building blocks for digital forensic engineering. Digital forensics at SADFE focuses on a variety of goals, including criminal and corporate investigations, data records produced by calibrated devices, and the documentation of individual and organizational activities. Another focus is on the challenges brought about by globalization and cross-jurisdictional digital applications. We believe digital forensic engineering is vital to security, the administration of justice and the evolution of culture.

Topics
We welcome previously unpublished papers on engineering approaches to digital forensics, forensic readiness, and technical building blocks for secure digital evidence, as well as digital forensics and preservation in civil, criminal and national security investigations, whether for use within a court of law, for the execution of national policy, or to aid in understanding the past and digital knowledge in general.

SADFE discusses digital forensic principles in new areas of the information society. We hope you will consider submitting to this international conference.

Potential topics to be addressed by submissions include, but are not limited to:

  • Digital Data and Evidence Management: advanced digital evidence discovery, collection, management, storage and preservation
    • Identification, authentication and collection of digital evidence
    • Extraction and management of forensic data/metadata
    • Identification and redaction of personally identifying information and other forms of sensitive information
    • Post-acquisition handling of evidence and the preservation of data integrity and admissibility
    • Evidence and digital memory preservation, curation and storage
    • Architectures and processes (including network processes) that comply with forensic requirements
    • Managing geographically, politically and/or jurisdictionally dispersed data artifacts
    • Data, digital knowledge, and web mining systems for identification and authentication of relevant data
    • Botnet forensics
  • Digital Evidence, Data Integrity and Analytics: advanced digital evidence and digitized data analysis, correlation, and presentation
    • Advanced search, analysis, and presentation of digital evidence
    • Cybercrime scenario analysis and reconstruction technologies
    • Legal case construction and digital evidence support
    • Cyber-crime strategy analysis and modeling
    • Combining digital and non-digital evidence
    • Supporting both qualitative and statistical evidence
    • Computational systems and computational forensic analysis
    • Digital evidence in the face of encryption
    • Forensic-support technologies: forensic-enabled and proactive monitoring/response
    • Forensics of embedded or non-traditional devices (e.g. digicams, cell phones, SCADA, obsolete storage media)
    • Innovative forensic engineering tools and applications
    • Proactive forensic-enabled support for incident response
    • Forensic tool validation: methodologies and principles
    • Legal and technical collaboration
    • Digital forensics surveillance technology and procedures
    • “Honeypot” and other target systems for data collection and monitoring
    • Quantitative attack impact assessment
    • Comprehensive fault analysis, including, but not limited to, DFE study of broad realistic system and digital knowledge failures, criminal and non-criminal, with comprehensive DFE (malicious/non-malicious) analysis in theory, methods, and practices
  • Forensic and digital data integrity issues for digital preservation and recovery, including:
    • Technological challenges
    • Legal and ethical challenges
    • Economic challenges
    • Institutional arrangements and workflows
    • Political challenges
    • Cultural and professional challenges
  • Scientific Principle-Based Digital Forensic Processes: systematic engineering processes supporting digital evidence management which are sound on scientific, technical and legal grounds
    • Legal/technical aspects of admissibility and evidence tests
    • Examination environments for digital data
    • Courtroom expert witness and case presentation
    • Case studies illustrating privacy, legal and legislative issues
    • Forensic tool validation: legal implications and issues
    • Legal and privacy implications for digital and computational forensic analysis
    • Handling increasing volumes of digital discovery
    • New evidence decisions, e.g., United States v. Jones, _ U.S. _ (2012) and United States v. Cotterman, _ F.3d _ (9th Cir. 2013)
    • Computational forensics and validation
    • Transnational investigations/case integration under the Convention on Cybercrime of the Council of Europe
    • Issues in forensic authentication and validation
  • Legal, Ethical and Technical Challenges
    • Forensic, policy and ethical implications of the Internet of Things, the “Smart City”, “Big Data” and Cloud systems

Publication
SADFE-2015 papers will be published in the Journal of Digital Forensics, Security and Law (JDFSL) and will undergo a double-blind review process.

Best paper award
A Best Paper Award will be selected from among the final papers.

Submission Instructions
Manuscripts submitted are expected to be:
- new and original,
- well organized and clearly written,
- of interest to the academic and research communities,
- not published previously, and
- not under consideration for publication in any other journal or book

PLEASE NOTE: To be published, an author of an accepted paper is required to register for the conference. Non-refundable registration fees must be paid prior to uploading the final JDFSL-formatted, publication-ready version of the paper.

All submissions should be written in English with a maximum paper length of six (6) printed pages (minimum 10-point font) including figures, without incurring additional page charges.

To ensure a blind review process, the following information should be excluded from the submission:

- Authors' Names section
- Biography section
- Acknowledgments section (if it contains information identifying the authors)

If an article is accepted, author(s) must provide a version in either Microsoft Word or LaTeX with graphics (figures) in GIF, TIF, or PowerPoint formats. Permissions for reprinted material are the sole responsibility of the author(s) and must be obtained in writing prior to publication.

Templates are available for:
Microsoft Word: www.jdfsl.org/files/JDFSL-Template-2014.docx

Only PDF files will be accepted for the review process, and all submissions must be made through EasyChair. The submission site is already open. Visit the SADFE 2015 EasyChair submission page to submit your manuscripts.


Organization
General Chair: Antonio Maña (University of Malaga)

Program Committee Co-Chairs:
Carsten Rudolph (Huawei European Research Center)
Nicolai Kuntze (Huawei European Research Center)
Barbara Endicott-Popovsky (University of Washington)

Publication Chair: Ibrahim Baggili (University of New Haven)

Publicity Chair Europe: Joe Cannataci (University of Malta)
Publicity Chair North-America: Dave Dampier (Mississippi State University)
Publicity Chair Asia: Ricci Ieong (University of Hong Kong)

Steering Committee
Deborah Frincke, Co-Chair (Department of Defense)
Ming-Yuh Huang, Co-Chair (Northwest Security Institute)
Michael Losavio (University of Louisville)
Alec Yasinsac (University of South Alabama)
Robert F. Erbacher (Army Research Laboratory)
Wenke Lee (Georgia Institute of Technology)
Barbara Endicott-Popovsky (University of Washington)
Roy Campbell (University of Illinois, Urbana/Champaign)
Yong Guan (Iowa State University)

Program Committee
Sudhir Aggarwal, Florida State University
Galina Borisevitch, Perm State University
Frank Breitinger, University of New Haven
Joseph Cannataci, University of Groningen
Long Chen, Chongqing University of Posts and Telecommunications
Raymond Choo, University of South Australia
K.P. Chow, University of Hong Kong
David Dampier, Mississippi State University
Hervé Debar, France Telecom R&D
Barbara Endicott-Popovsky, University of Washington
Robert Erbacher, Northwest Security Institute
Xinwen Fu, UMass Lowell
Simson Garfinkel, Naval Postgraduate School
Brad Glisson, University of Glasgow
Lambert Großkopf, Universität Bremen
Yong Guan, Iowa State University
Barbara Guttman, National Institute for Standards and Technology
Brian Hay, University of Alaska, Fairbanks
Jeremy John, British Library
Ping Ji, John Jay College Of Criminal Justice
Andrina Y.L. Lin, Ministry of Justice Investigation Bureau
Pinxin Liu, Renmin University of China Law School
Michael Losavio, University of Louisville
David Manz, PNNL
Nasir Memon, Polytechnic Institute of NYU
Mariofanna Milanova, University of Arkansas at Little Rock
Carsten Momsen, Leibniz Universität Hannover
Kara Nance, University of Alaska, Fairbanks
Ming Ouyang, University of Louisville
Gilbert Peterson, Air Force Institute of Technology
Slim Rekhis, University of Carthage
Golden Richard, University of New Orleans
Corinne Rogers, University of British Columbia
Ahmed Salem, Hood College
Viola Schmid, Technische Universität Darmstadt
Clay Shields, Georgetown University
Vrizlynn Thing, Institute for Infocomm Research, Singapore
Sean Thorpe, University of Technology, Jamaica
William Underwood, Georgia Tech
Wietse Venema, IBM Research
Hein Venter, University of Pretoria
Xinyuan Wang, George Mason University
Kam Woods, University of North Carolina
Yang Xiang, Deakin University, Australia
Fei Xu, Inst. of Information Eng., Chinese Academy of Sciences
Alec Yasinsac, University of South Alabama
SM Yiu, Hong Kong University
Wei Yu, Towson University
Nan Zhang, George Washington University

Contact Information
Antonio Maña, University of Malaga,
+34 951 952 940

Michael Losavio, University of Louisville,
+1 502 852 3509
6 min read

[CFP] WSDF 2015: The 8th International Workshop on Digital Forensics

WSDF 2015: The 8th International Workshop on Digital Forensics

August 24-28, 2015
Toulouse, France

Important Dates
Submission Deadline: April 1, 2015
Author Notification: May 11, 2015
Proceedings Version: June 8, 2015

Digital forensics is a rapidly evolving field primarily focused on the extraction, preservation and analysis of digital evidence obtained from electronic devices in a manner that is legally acceptable. Research into new methodologies, tools and techniques within this domain is necessitated by an ever-increasing dependency on tightly interconnected, complex and pervasive computer systems and networks. The ubiquitous nature of our digital lifestyle presents many avenues for the potential misuse of electronic devices in crimes that directly involve, or are facilitated by, these technologies. The aim of digital forensics is to produce outputs that can help investigators ascertain the overall state of a system. This includes any events that have occurred within the system and entities that have interacted with that system. Due care has to be taken in the identification, collection, archiving, maintenance, handling and analysis of digital evidence in order to prevent damage to data integrity. Such issues, combined with the constant evolution of technology, provide a large scope for digital forensic research. WSDF aims to bring together experts from academia, industry, government and law enforcement who are interested in advancing the state of the art in digital forensics by exchanging their knowledge, results, ideas and experiences.

The aim of the workshop is to provide a relaxed atmosphere that promotes discussion and free exchange of ideas while providing a sound academic backing. The focus of this workshop is not only restricted to digital forensics in the investigation of crime. It also addresses security applications such as automated log analysis, forensic aspects of fraud prevention and investigation, policy and governance.

Topics of interest comprise but are not limited to:
Digital Evidence
Network Forensics
Anti Forensics
Physical Memory Acquisition and Analysis
Digital Forensic Information Visualisation
Fraud Investigations Involving Technology
Portable Devices
Cyber Terrorism                              
Log Analysis
Risk and Incident Management
Investigative Case Studies
Data Hiding Techniques and Steganography
Novel Data Recovery Techniques
Cyber Criminal Profiling
Big Data in Digital Forensics
Cyber Crime

Workshop Chairs
Richard Overill
King’s College London, UK
richard.overill[at]kcl.ac.uk

Virginia N. L. Franqueira
University of Derby, UK
v.franqueira[at]derby.ac.uk

Martin Mulazzani
SBA Research, Austria
mmulazzani[at]sba-research.org

Program Committee
Aswami Ariffin, CyberSecurity Malaysia and University of South Australia
Harjinder Singh Lallie, University of Warwick, UK
Max Huber, Vienna University of Technology, Austria
Werner Schneider, University of Regensburg, Germany
Olga Angelopoulou, University of Derby, UK
Andrew Marrington, Zayed University, United Arab Emirates
Joanne Bryce, University of Central Lancashire, UK
Yijun Yu, Open University, UK
Joshua James, Soon Chun Hyang University, Korea
Ibrahim Baggili, University of New Haven, USA
Frank Breitinger, University of New Haven, USA
Kim-Kwang Raymond Choo, University of South Australia, Australia
Aniello Castiglione, University of Salerno, Italy
Kam-Pui Chow, Hong Kong University, China
Chris Hargreaves, Cranfield University, United Kingdom
Katharina Krombholz, SBA Research, Austria
Grant Osborne, iiNet, Australia
Vassil Roussev, University of New Orleans, United States
Vrizlynn Thing, Institute for Infocomm Research, Singapore
Simon Tjoa, St. Pölten University of Applied Sciences, Austria
Stefano Zanero, Politecnico di Milano, Italy
Antonio Colella, Italian Army, Italy

Submission
The proceedings of ARES (including workshops) have been published by Conference Publishing Services (CPS) of IEEE.

The submission guidelines valid for the WSDF workshop are the same as for the ARES conference.
Authors of selected papers accepted for WSDF will be invited to produce extended and updated versions of their papers (with at least 30% new material) for review and publication in the Journal of Digital Forensics, Security and Law (JDFSL), http://www.jdfsl.org/, the official journal of the Association of Digital Forensics, Security and Law (ADFSL).
2 min read

2015 International Conference on Digital Forensics and Cyber Crime · Korean Digital Forensics Society Call for Papers

2015 International Conference on Digital Forensics and Cyber Crime · Korean Digital Forensics Society Call for Papers

Please note: all submissions and presentations must be in English.
The International Conference on Digital Forensics and Cyber Crime (ICDF2C) is an annual international academic event that promotes exchange among researchers, practitioners and educators around the world for the advancement of digital forensics and cybercrime investigation. To continue ICDF2C's international, collaborative tradition, ICDF2C 2015 will be hosted jointly with the annual conference of the Korean Digital Forensics Society (KDFS 2015).
ICDF2C 2015 will be held in Seoul, South Korea, from October 6 to 8, 2015, and invites submissions of completed research papers, work-in-progress papers, industry presentations, panels and tutorials, and panel discussions. Papers will be evaluated through a double-blind peer-review process; accepted research papers will be published in printed proceedings by Springer-Verlag, and selected papers will be published in the SCI journal Digital Investigation.
※ The official language of the conference is English: submitted papers must be written in English, and all other forms of participation will also be conducted in English.
1. Special Themes
This year, ICDF2C will be organized around two special themes. Submissions related to the following topics are encouraged:
  • Usage, implications and investigation of the “Dark Web”
  • Case studies and investigative techniques relating to cryptocurrencies
2. Research Topics
The advent of the Internet has made it easier for criminals to commit crimes anonymously. As global communication and networking infrastructure and devices grow more complex, cybercrime investigation becomes correspondingly more difficult. In particular, clues and evidence are usually buried in massive volumes of data and are hard to find, and technologies that can filter out irrelevant data are urgently needed to detect crimes and collect evidence. Digital forensics and cybercrime investigation have become critical fields for law enforcement, national security and information security, forming an interdisciplinary area that spans law, computer science, finance, telecommunications, data analytics, policing and many other domains. ICDF2C is a creative meeting place that brings together practitioners and researchers from these diverse fields, promoting scholarly exchange and creating new business opportunities.
Conference topics:
  • Anti-Forensics and Anti-Anti-Forensics
  • Big Data and Digital Forensics
  • Business Applications of Digital Forensics
  • Civil Litigation Support
  • Cloud Forensics
  • Cybercrime Investigations
  • Cyber Criminal Psychology and Profiling
  • Cyber Culture & Cyber Terrorism
  • Data Hiding and Steganography
  • Database Forensics
  • Digital Forensic Science
  • Digital Forensic Tool Testing and Validation
  • Digital Forensic Trends
  • Digital Forensics & Law
  • Digital Forensics and Error Rates
  • Novel Digital Forensics Algorithms
  • Digital Forensics Process & Procedures
  • Digital Forensics Standardization & Accreditation
  • Digital Forensics Techniques and Tools
  • Digital Forensics Triage
  • e-Discovery
  • Hacking
  • Incident Response
  • Information Warfare & Critical Infrastructure Protection
  • Law Enforcement and Digital Forensics
  • Machine Learning and Digital Forensics
  • Malware & Botnets
  • Mobile/Handheld Device & Multimedia Forensics
  • Money Laundering
  • Network Forensics
  • New Chip-Off Techniques
  • Novel Digital Forensics Training Programs
  • Online Fraud
  • Programming Languages and Digital Forensics
  • SCADA (Supervisory Control and Data Acquisition) Forensics
  • Sexual Abuse of Children on the Internet
  • Software & Media Piracy
  • Theoretical Foundations of Digital Forensics
  • Traditional Criminology Applied to Digital Forensics
  • Philosophical Accounts of Cyber Crime and Digital Forensics
3. Research Papers
Research papers must describe original, previously unpublished work. Papers under review by another conference, journal or other venue may not be submitted. Papers on the topics listed above are encouraged, but submissions on other topics are also welcome. If you have any questions about a submission, please feel free to contact the conference organizers.

4. Other Types of Participation
The conference accepts many forms of participation, including completed research papers, work-in-progress papers, industry presentations and panel discussions. Please follow the guidelines below for the relevant participation type:
  • Completed Research Papers: no more than 10 pages (including abstract, figures, tables and references)
  • Work-in-progress