Revisiting the Four Grand Challenges in Trustworthy Computing: Challenge 2

A while back we looked at Challenge 1 in the Four Grand Challenges in Trustworthy Computing from 2003. In my opinion, we have fallen quite short on Challenge 1, namely “eliminating epidemic attacks by 2014”. Today, we will look at Challenge 2.
Challenge 2 is generally defined as “ensur[ing] that new, critical systems currently on the drawing board are immune from destructive attack”.

Challenge 2 looks at systems of critical importance that are currently being designed and implemented. Unlike Challenge 1, which addresses systems that are already deployed, Challenge 2 focuses on the security, reliability and trustworthiness of systems that are (or were at that time) still under development.

The metric of success is based on the CIA model, focusing on systems that ensure:
<ul><li>Confidentiality</li><li>Integrity</li><li>Availability</li></ul>and is extended with:
<ul><li>“Auditability”</li><li>Global Accessibility</li></ul><div>The group identified a number of critical systems (Figure 1), and stated that “there is very little reason to believe that such systems, if developed under current technology, will be trustworthy”.</div><table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody><tr><td style="text-align: center;"></td></tr><tr><td class="tr-caption" style="text-align: center;">Figure 1. Critical systems and infrastructure identified by the CRA group in 2003.</td></tr></tbody></table><div>This statement comes almost five years after the U.S. Presidential Directive 63, which had a national goal stating:
<blockquote class="tr_bq">No later than the year 2000, the United States shall have achieved an initial operating capability and no later than five years from today the United States shall have achieved and shall maintain the ability to protect the nation’s critical infrastructures from intentional acts that would significantly diminish the abilities of:
<ul><li>the Federal Government to perform essential national security missions and to ensure the general public health and safety;</li><li>state and local governments to maintain order and to deliver minimum essential public services.</li><li>the private sector to ensure the orderly functioning of the economy and the delivery of essential telecommunications, energy, financial and transportation services.</li></ul></blockquote>The Colloquium for Information Systems Security Education in 2008 again identified critical systems, and specifically SCADA systems, as a priority area in need of organized research. There has been a growing amount of research into critical system defense, security and forensics, but the 2011 alleged hacking of an Illinois water system, as well as the state of some infrastructures we have seen, leads me to believe that this research is not being practically implemented.

From discussions with people dealing with critical infrastructure, there seems to be an attitude much like that of a home computer user: they know there is a risk, but in many cases don’t feel the risk is big enough to justify investing the money necessary to update, secure and monitor their systems (even some physical systems). In the U.S., government regulation was proposed that would allow the DHS to “enforce minimum cybersecurity standards on infrastructure computer systems that, if damaged, would lead to mass casualties or economic loss”. The regulation, however, was opposed.

I somewhat understand why some critical infrastructure providers may find it hard to justify large investments in cybersecurity. Last year, 198 cyber incidents were reported to DHS across all critical infrastructure sectors, most of which were reportedly spear-phishing attempts. Granted, many more attacks probably took place that were not discovered or reported, but with numbers like that, a director may well conclude that it is statistically unlikely they would get hit.

For me, the takeaway is that critical systems are still not being designed with cybersecurity, and sometimes even physical security, in mind. Further, critical infrastructure providers have the same problems as any other business: their people, as well as their technology, can be a security gap. Since critical infrastructure is a hot topic right now, I hope security and risk awareness has increased, but I have yet to see any real changes implemented in many countries. Almost 10 years after the grand challenge was proposed, I would say that not only are we not designing systems that are “immune from destructive attack”, but we are still not designing critical systems with basic cybersecurity in mind.

</div>


InfoSecurity Russia 2012


Last week, Pavel and I gave an invited talk at InfoSecurity Russia 2012. From Digital FIRE:
<blockquote class="tr_bq">Our talk explored the issues of digital forensics in the cloud environment. The first part of the talk introduced the concepts of cyber crime investigations and the challenges faced by the digital forensic practitioners. The second part of the talk explored investigative difficulties posed by cloud computing. A possible approach to dealing with some of these difficulties based on I-STRIDE methodology was then outlined.</blockquote><div class="separator" style="clear: both; text-align: center;"></div>
The security challenges with cloud environments discussed in the talk are further elaborated on in our chapter “Digital Forensics and Cloud Computing”, which can be found in Cybercrime and Cloud Forensics: Applications for Investigation Processes. Some investigation challenges were introduced based on the work of our friends at CloudForensicsResearch.org, with a few of my own thoughts added. Finally, a very quick overview of the Investigation STRIDE (I-STRIDE) model was given to help investigators and first responders identify potential sources of evidence, their jurisdiction, and other factors that may affect the admissibility of evidence extracted from a Cloud Service Provider.

http://infosecurityrussia.ru/speakers/


Creating and Configuring a Large Redundant Storage Array for a Network Share in Ubuntu using MDADM

We had a hardware RAID card that worked well in Windows, but was giving us some issues in Linux (Ubuntu 12.04). So, we decided to set up a software array using mdadm instead. This is how we did it.

First, make sure your hardware RAID card has a "non-RAID" mode; that is, a mode that lets each of the attached drives show up as an individual drive. Ensure this is enabled. On some cards you will have to flash the BIOS with a non-RAID version.
  • Do not create a hardware RAID array using the RAID configuration menu
  • Make sure no drives are assigned to a RAID array
  • Install the newest version of Ubuntu Server
    • When asked for partitioning information:
    • Select manual partitioning
    • Select “Create Software RAID”
    • Select “Create MD device”
    • Select RAID5   (we are using RAID5)
    • Active devices = #   (where # is the number of drives you want in the array)
    • Spare devices = #
    • Select all the drives you want in the array
    • Select OK
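A quick way to confirm the controller really is passing the disks through individually (a sketch; your device names will differ):

```shell
# Each attached disk should show up as its own device (sda, sdb, ...),
# not as one big hardware-RAID volume. /proc/partitions lists them
# without needing root; 'sudo fdisk -l' gives the same picture.
cat /proc/partitions
```

If only a single large device shows up, the card is still in hardware-RAID mode.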

After the array is created, our new device has about 21TB. In versions of Ubuntu before 12.04 it was difficult to create a partition using the whole 21TB, but now you should be able to do it from the install menu.
<ul><li>Create a partition on the newly created device (usually md0)</li><li>Format as ext4</li><li>Save and continue with the install as usual<ul><li>You might want to select “ssh” from the package selection</li></ul></li></ul>Once the install is done, and you boot into the OS:
<ul><li>Make sure the array device has been created<ul><li>sudo fdisk -l</li><li>Look for /dev/md0</li></ul></li><li>Check the status of the software array<ul><li>sudo mdadm --detail /dev/md0</li><li>If the status of the array is “building” or “syncing”, let the process finish (it may take several hours)</li></ul></li><li>Create a mount point for the array<ul><li>sudo mkdir /media/RAIDStorage</li></ul></li><li>Modify fstab to mount the partition on boot<ul><li>sudo nano /etc/fstab</li><li>Add a new line:<ul><li>/dev/md0 /media/RAIDStorage ext4 defaults 0 0</li></ul></li><li>Save and exit</li><li>Remount<ul><li>sudo mount -a</li></ul></li><li>Check the device was mounted to /media/RAIDStorage<ul><li>sudo mount | grep RAIDStorage</li></ul></li></ul></li><li>Share RAIDStorage with NFS<ul><li>sudo apt-get install nfs-kernel-server</li></ul></li><li>Edit the exports file<ul><li>sudo nano /etc/exports</li><li>Add a line:<ul><li>/media/RAIDStorage 10.1.1.0/24(rw,insecure,async,no_subtree_check)</li></ul></li><li>Save and exit</li><li>Restart the NFS server<ul><li>sudo /etc/init.d/nfs-kernel-server restart</li></ul></li></ul></li></ul>In this case, permissions are NOT being set up on NFS. If you need a more secure environment, make sure you set them up.

Also, we are using ‘async’ instead of ‘sync’. When writing very large files, ‘sync’ had very poor performance, while ‘async’ allowed for maximum write speeds.
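The difference lives in the sync/async option in /etc/exports. With sync, the server must commit each write to disk before acknowledging it; async acknowledges first, which is much faster for large sequential writes but can lose data if the server crashes mid-write. The two variants side by side (path and subnet from this setup):

```
# Safer, slower: writes are committed to disk before the server replies
/media/RAIDStorage 10.1.1.0/24(rw,insecure,sync,no_subtree_check)
# Faster, riskier: the server replies before data is on disk
/media/RAIDStorage 10.1.1.0/24(rw,insecure,async,no_subtree_check)
```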

If there are write permission errors, check that the permissions on the folder (/media/RAIDStorage) on the server are set correctly for the user.


Korean National Police University International Cybercrime Research Center


<div class="p1">Today is the inauguration of the Korean National Police University (KNPU) International Cybercrime Research Center (ICRC). The inauguration ceremony will immediately be followed by the 1st International Cybercrime Research Seminar.</div><blockquote class="tr_bq">Grand opening: The International Cybercrime Research Center, Korean National Police University.
The opening ceremony will be hosted by the president of the Korean National Police University on September 18th, 2012. To commemorate the event, the Police orchestra will perform, and the 1st International Cybercrime Research Seminar will be held. The International Cybercrime Research Center will focus on multi-disciplinary research and support dealing with cyber-policing strategy, quality education and training. The official website for the International Cybercrime Research Center will be available soon.
For more information or to submit proposals for partnerships and collaboration, please contact: [email protected]</blockquote><div class="p1">The 1st International Cybercrime Research Seminar will cover current trends and the future of training, education and research, child exploitation in South Korea (International Centre for Missing and Exploited Children: South Korean Chapter), and criminological aspects of digital crime.

Update:
<blockquote class="tr_bq">The Korean National Police University (President: Chief Superintendent General Seo Cheon-ho) held the opening ceremony of the International Cybercrime Research Center at 14:00 on September 18, 2012, in the university’s video lecture hall. Following the ceremony, domestic and international experts held the 1st International Cybercrime Seminar, with presentations including “Trends in Education and Training for European Cyber Investigation”, “A Criminological Study of Information Security Breach Crime in Australia”, and “The Current State of Internet Child Pornography and Countermeasures”.</blockquote>
http://www.police.ac.kr/open/photo_news.jsp?SEQ=33307&BoardMode=view</div>


Another SDHASH Test with Picture Files

The last SDHASH test showed that fuzzy hashing on multiple sizes of the same picture file did not appear to work well, so I decided to try the same-size image with slight modifications, like one might see in the real world. So, again, there is an original image, the same image modified with text added, and the same image modified with a swirl pattern on the face.

<div class="separator" style="clear: both; text-align: center;"><table cellpadding="0" cellspacing="0" class="tr-caption-container" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto; text-align: center;"><tbody><tr><td style="text-align: center;"></td></tr><tr><td class="tr-caption" style="text-align: center;">Kitty Orig: 75K MD5 6d5663de34cd53e900d486a2c3b811fd</td></tr></tbody></table><table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody><tr><td style="text-align: center;"></td></tr><tr><td class="tr-caption" style="text-align: center;">Kitty Text: 82K MD5 bcbed42be68cd81b4d903d487d19d790</td></tr></tbody></table></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody><tr><td style="text-align: center;"></td></tr><tr><td class="tr-caption" style="text-align: center;">Kitty whirl: 92K MD5 4312932e8b91b301c5f33872e0b9ad98</td></tr></tbody></table>For this test, I hypothesized that there would be a high match between the original kitty and the text kitty, and a low match, if any, between the original kitty and the whirl kitty. My reasoning was that the features of the data should be similar enough, excluding the text area.<div>
</div><div>Unfortunately, I was wrong. sdhash did not find similarity between any of the pictures (ssdeep did not either).</div><blockquote class="tr_bq">$sdhash -g *</blockquote><div><blockquote class="tr_bq">kitty_orig.jpeg
kitty_text.jpeg 000
kitty_orig.jpeg
kitty_whirl.jpeg 000
kitty_text.jpeg
kitty_whirl.jpeg 000</blockquote>So, neither sdhash nor ssdeep detected any similarity between the picture files. Perhaps these tools are not suitable for picture file analysis, nor a replacement for standard hashes like MD5 when looking for similar pictures.

</div>

Comparing Fuzzy Hashes of Different Sizes of the Same Picture (SDHASH)

In a previous post, we looked at setting up and using SDHASH. After comparing modified files and getting a high score for similarity, we started wondering how well fuzzy hashing works on different-sized images. So today, we have a little experiment.

First, we have 4 images. One original, and 3 smaller versions of the original.
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody><tr><td style="text-align: center;"></td></tr><tr><td class="tr-caption" style="text-align: center;">Original: 75K, MD5 6d5663de34cd53e900d486a2c3b811fd</td></tr></tbody></table><table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody><tr><td style="text-align: center;"></td></tr><tr><td class="tr-caption" style="text-align: center;"> 1/2 Original: 44K, MD5 87ec8d4b69293161bca25244ad4ff1ac</td></tr></tbody></table>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody><tr><td style="text-align: center;"></td></tr><tr><td class="tr-caption" style="text-align: center;">1/4 Original: 14K, MD5 978f28d7da1e7c1ba23490ed8e7e8384</td></tr></tbody></table>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody><tr><td style="text-align: center;"></td></tr><tr><td class="tr-caption" style="text-align: center;">1/8 Original: 3.6K, MD5 3e8e0d049be8938f579b04144d2c3594</td></tr></tbody></table>So, if we have an original image, we can take the hash like so:
<blockquote class="tr_bq">$sdhash kitty_orig.jpeg > kitty_orig.sdbf</blockquote>Now, we want to take the hashes of the other files (manual way):
<blockquote class="tr_bq">$sdhash kitty_2.jpeg >> kitties.sdbf
$sdhash kitty_4.jpeg >> kitties.sdbf
$sdhash kitty_8.jpeg >> kitties.sdbf</blockquote>Now we can compare the hashes of the smaller versions to the hash of the original. Note: set the threshold to negative one (-t -1) if you want to see results below 1.
<blockquote class="tr_bq">$sdhash -t -1 -c kitty_orig.sdbf kitties.sdbf</blockquote>Unfortunately, but as expected, the results were not good. Feature selection for the hash is done at the bit level, and those features do not carry over to smaller files since there are fewer bytes.
<blockquote class="tr_bq">kitty_2.jpeg
kitty_orig.jpeg 000
kitty_4.jpeg
kitty_orig.jpeg 000
kitty_8.jpeg
kitty_orig.jpeg 000 </blockquote>If you were working with more images, and you wanted to hash and compare at the same time, you could use the -g switch. For example:
<blockquote class="tr_bq">$sdhash -t -1 -g *</blockquote>The output of which (in this case) would be:
<blockquote class="tr_bq">kitty_2.jpeg
kitty_4.jpeg 000
kitty_2.jpeg
kitty_8.jpeg 000
kitty_2.jpeg
kitty_orig.jpeg 000
kitty_4.jpeg
kitty_8.jpeg 000
kitty_4.jpeg
kitty_orig.jpeg 000
kitty_8.jpeg
kitty_orig.jpeg 000</blockquote>So, in conclusion, sdhash’s feature selection does not allow for comparison of greatly different-sized picture files. Note that a text file would be quite different, and would probably produce better results.
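The intuition behind that last note can be illustrated without sdhash: a small edit leaves most bytes of a raw text file intact, but in a compressed stream (and JPEG data is compressed) the change cascades through everything that follows. A rough shell illustration of the byte-level effect, using gzip as a stand-in for JPEG's compression (this is not sdhash's algorithm, just the reason its byte-level features stop lining up):

```shell
# Two text files that differ by one same-length line.
seq 1 1000 > a.txt
seq 1 1000 | sed 's/^500$/xxx/' > b.txt
# Raw text: only the three edited bytes differ.
cmp -l a.txt b.txt | wc -l
# Compressed: the edit shifts the rest of the stream, so the two
# outputs differ in many more places than the original three bytes.
gzip -c a.txt > a.gz
gzip -c b.txt > b.gz
cmp -l a.gz b.gz | wc -l
```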