Category Archives: Information Assurance

Windows Server 2012, Hyper-V, Ubuntu+ZFS VM for Backups

I set this up only to test ZFS with deduplication as an alternative to using a standard hardware RAID configuration for storing SQL backups.

Hardware used:

  • Dell R720xd
  • Dell Perc H800 RAID controller
  • Dell PowerVault MD1220
  1. Configure RAID on the Dell R720xd / PERC H800 controller. Set up all 24 disks (2.5″ 900GB SAS in my case) as independent RAID-0 logical volumes, since the H800 (as far as I can tell) doesn’t offer an easy JBOD option.
  2. In Windows Server 2012 Computer Management, I initialized each of the 24 volumes with a GPT partition table when prompted, then set all 24 volumes “offline”. Taking them offline lets Hyper-V access each disk directly while preventing Windows from using them.
  3. Install the Hyper-V role and reboot.
  4. I went into the VM’s settings. First I added a SCSI controller, since IDE controllers are limited to 2 devices while SCSI supports up to 64. Then I added the 24 volumes as “physical hard disks”, matching SCSI location numbers 0–23 to volume target numbers 0–23.
  5. Also, make sure to create a virtual switch and configure a network interface for the Ubuntu VM.
  6. Download Ubuntu Server 12.04 x64, install it with the OpenSSH and Samba options, and start the Ubuntu VM.
  7. Verify Ubuntu can see the 24 volumes:
    sudo lshw -C disk


  8. sudo apt-get update
  9. sudo apt-get install python-software-properties software-properties-common -y
  10. sudo add-apt-repository ppa:zfs-native/stable -y
  11. sudo apt-get update && sudo apt-get dist-upgrade -y
  12. sudo apt-get install ubuntu-zfs -y
  13. dmesg | grep ZFS
    [    0.000000] Command line: BOOT_IMAGE=/vmlinuz-3.11.0-15-generic root=/dev/mapper/DPMHOST--ZFS--vg-root ro
    [    0.000000] Kernel command line: BOOT_IMAGE=/vmlinuz-3.11.0-15-generic root=/dev/mapper/DPMHOST--ZFS--vg-root ro
    [    8.529432] Adding 4190204k swap on /dev/mapper/DPMHOST--ZFS--vg-swap_1.  Priority:-1 extents:1 across:4190204k SSFS
  14. sudo vim /etc/modules
  15. Add these lines:
  16. Rebuild the initramfs so the new modules are included at boot:
    sudo update-initramfs -u
  17. sudo reboot
  18. I created a ZFS pool called “zfs0” across all 24 volumes using raidz3, which tolerates up to three disk failures:
    sudo zpool create zfs0 raidz3 /dev/sdb /dev/sdc /dev/sdd /dev/sde /dev/sdf /dev/sdg /dev/sdh /dev/sdi /dev/sdj /dev/sdk /dev/sdl /dev/sdm /dev/sdn /dev/sdo /dev/sdp /dev/sdq /dev/sdr /dev/sds /dev/sdt /dev/sdu /dev/sdv /dev/sdw /dev/sdx /dev/sdy -f
  19. sudo zpool status
    pool: zfs0
     state: ONLINE
     scan: none requested
     config:
     NAME      STATE     READ WRITE CKSUM
     zfs0      ONLINE       0     0     0
     raidz3-0  ONLINE       0     0     0
     sdb     ONLINE       0     0     0
     sdc     ONLINE       0     0     0
     sdd     ONLINE       0     0     0
     sde     ONLINE       0     0     0
     sdf     ONLINE       0     0     0
     sdg     ONLINE       0     0     0
     sdh     ONLINE       0     0     0
     sdi     ONLINE       0     0     0
     sdj     ONLINE       0     0     0
     sdk     ONLINE       0     0     0
     sdl     ONLINE       0     0     0
     sdm     ONLINE       0     0     0
     sdn     ONLINE       0     0     0
     sdo     ONLINE       0     0     0
     sdp     ONLINE       0     0     0
     sdq     ONLINE       0     0     0
     sdr     ONLINE       0     0     0
     sds     ONLINE       0     0     0
     sdt     ONLINE       0     0     0
     sdu     ONLINE       0     0     0
     sdv     ONLINE       0     0     0
     sdw     ONLINE       0     0     0
     sdx     ONLINE       0     0     0
     sdy     ONLINE       0     0     0
    errors: No known data errors
  20. sudo zfs list
     NAME   USED  AVAIL  REFER  MOUNTPOINT
     zfs0   297K  16.7T  89.8K  /zfs0
  21. df -h
    Filesystem                         Size  Used Avail Use% Mounted on
     /dev/mapper/DPMHOST--ZFS--vg-root   15G  1.5G   13G  11% /
     udev                               2.0G  4.0K  2.0G   1% /dev
     tmpfs                              790M  668K  789M   1% /run
     none                               5.0M     0  5.0M   0% /run/lock
     none                               2.0G     0  2.0G   0% /run/shm
     /dev/sda1                          236M   32M  192M  14% /boot
     zfs0                                17T  128K   17T   1% /zfs0
  22. Configure Samba:
    sudo vim /etc/samba/smb.conf
  23. Create the dataset, then share it over SMB (the dataset must exist before it can be shared):
    sudo zfs create zfs0/backuptest1
    sudo zfs set sharesmb=on zfs0/backuptest1
  24. sudo chmod 0777 /zfs0/backuptest1
  25. sudo service smbd restart
  26. sudo zfs get sharesmb,sharenfs
    NAME              PROPERTY  VALUE     SOURCE
    zfs0              sharesmb  off       default
    zfs0              sharenfs  off       default
    zfs0/backuptest1  sharesmb  on        local
    zfs0/backuptest1  sharenfs  off       default
  27. I set compression to LZ4, which does wonders for raw SQL files (set it before copying data in, since only newly written blocks are compressed):
    sudo zfs set compression=lz4 zfs0/backuptest1

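The ZFS-side steps above can be sketched as a single script. This is a sketch, not the original procedure: the device names (/dev/sdb through /dev/sdy) and the pool/dataset names are the ones from this post, and the zpool/zfs calls are commented out because they require root and a live ZFS install.

```shell
# Build the list of 24 passthrough disks instead of typing them out.
# sda holds the Ubuntu install, so the pool disks are sdb..sdy.
devices=""
for letter in b c d e f g h i j k l m n o p q r s t u v w x y; do
    devices="$devices /dev/sd$letter"
done
echo "$devices" | wc -w   # sanity check: 24 disks

# On a real system, as root, the pool and dataset would then be created with:
# sudo zpool create zfs0 raidz3 $devices -f
# sudo zfs create zfs0/backuptest1
# sudo zfs set compression=lz4 zfs0/backuptest1
# sudo zfs set sharesmb=on zfs0/backuptest1
```

Generating the device list in a loop also makes it easy to double-check the disk count before handing the list to `zpool create`.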

Ubuntu 13.10 + ZFS / raidz2 Samba share

These are the steps I took and what worked for me; I hope they help someone else. Configure the RAID controller as either JBOD or with each HDD as an independent RAID-0 logical volume, then install Ubuntu Server 13.10 x64 with OpenSSH and Samba.

sudo add-apt-repository ppa:zfs-native/stable
sudo apt-get update
sudo apt-get dist-upgrade
sudo apt-get install ubuntu-zfs python-software-properties
dmesg | grep ZFS
sudo vim /etc/modules

(add the following…)

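The module lines themselves didn’t survive in this post. On the zfs-native PPA builds, the file typically only needs the ZFS kernel module listed so it loads at boot; the line below is my assumption of what was added, not the post’s original content:

```
zfs
```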

(then run…)

sudo update-initramfs -u
sudo reboot
sudo zpool status
sudo zpool create zfsshare raidz2 /dev/sdb /dev/sdc /dev/sdd /dev/sde /dev/sdf -f
sudo zfs list
sudo zfs create zfsshare/backup
sudo zpool status
sudo vim /etc/samba/smb.conf

(configured smb.conf…)
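The smb.conf changes aren’t preserved above either. As a sketch: on the ZFS-on-Linux builds, `zfs set sharesmb=on` publishes the dataset via Samba usershares, so the global section mainly needs usershare support enabled. Every value below is an assumption (Ubuntu defaults), not the post’s original config:

```
[global]
   workgroup = WORKGROUP
   server string = ZFS backup host
   security = user
   # sharesmb=on registers shares as Samba usershares
   usershare path = /var/lib/samba/usershares
   usershare max shares = 100
   usershare owner only = false
```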

sudo zfs set sharesmb=on zfsshare/backup
sudo chmod 0777 /zfsshare/backup
sudo service smbd restart
sudo zfs get sharesmb,sharenfs
sudo zfs set compression=lz4 zfsshare/backup
sudo zdb -b zfsshare
sudo zfs set dedup=on zfsshare/backup

(after copying SQL .bak files, etc, to the share…)

ls -alh /zfsshare/backup/
sudo zfs get compressratio zfsshare/backup
sudo zfs get all | grep comp
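One way to track how well LZ4 and dedup are paying off over time is to parse the scriptable (`-H`) output of `zfs get`, which is tab-separated name/property/value/source. The sample line below is hypothetical stand-in data; on a live system you would pipe `sudo zfs get -H compressratio zfsshare/backup` instead.

```shell
# Extract the value column from `zfs get -H` style output and strip the
# trailing 'x' so the ratio can be logged or compared numerically.
# The ratio shown here (3.77x) is made up for illustration.
sample='zfsshare/backup	compressratio	3.77x	-'
ratio=$(echo "$sample" | awk '{print $3}' | tr -d 'x')
echo "compression ratio: $ratio"
```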

Statement of Purpose

Thank you for allowing me the opportunity to share my ambitions and goals regarding the University of Washington (UW) master’s degree program in Infrastructure Planning and Management (MIPM).

Earlier this year, I passed the interview portion for a network administration position within the Seattle Police Department (SPD). Following the extensive background-check process, I was denied the position due to a lack of work-related experience compared to another candidate. I consider the SPD application experience a success for three reasons. First, it was an honor simply to spend time with SPD information technology managers and to be challenged with technical and non-technical questions. Working for the City of Seattle has been a long-time desire, especially concerning the security of critical infrastructure. Second, at the end of my interview, I was praised for my ability to articulate my answers. For nearly two years, I have been employed by Big Fish Games in their network operations center (NOC), and since I have made it a specific point to develop clear, well-measured communication skills, it was wonderful feedback to hear. Finally, during the SPD’s interview process, I was asked if there was anything I would like to add to bolster my prospects of being hired. I specifically mentioned the MIPM program, with the intention of working directly with the SPD on any related projects. Two of my three interviewers were clearly interested in the UW’s MIPM program; one responded by explaining the SPD’s desire to work more closely with the University of Washington. I hope to be presented with a future opportunity to work for the SPD on some form of city-level information assurance development.

For over two years, I have been maintaining high standards for information technology (IT) infrastructure incident response and problem management in two separate NOCs. My first NOC position was with Microsoft supporting online business communications technologies across 19 internationally-spread datacenter co-locations. The majority of my professional NOC experience has been with Big Fish Games where I help support their entire IT infrastructure, encompassing 4 internationally-spread datacenter co-locations.

IT professionals who are fortunate enough to rely on a NOC for all initial triage and communications support differ in terms of knowledge specialization and development. Unlike network or database administrators, NOC personnel must holistically understand all operational aspects of their entire business infrastructure. In contrast to Microsoft, an extremely large company, Big Fish Games is a medium-sized company with 99% of its revenue dependent on IT infrastructure and uptime. I feel very fortunate to be valued as a peer in a company like Big Fish Games, where one can clearly see where professional specializations and business drivers merge.

Prior to my NOC experiences, I had a successful internship at Microsoft as a Support Analyst on a datacenter deployment operations team. Although I feel that this internship was too short (three months), I helped perform a diverse set of large-scale hardware and software deployments, including some in Microsoft’s famous 470,000 square foot facility in Quincy, WA. I believe that these experiences with large-scale IT systems are setting the stage for even greater work in support of critical infrastructure.

Working as an information systems problem manager has allowed me to gain a unique understanding and appreciation for the IT field. I am looking forward to shifting gears from a response-oriented (reactive) career to a forward-thinking (pro-active) career in IT.

My future plans involve working in a security-focused role in Seattle while maintaining high academic performance in the MIPM program. Professional IT security experience is a requirement for Certified Information Systems Security Professional (CISSP) certification. Additionally, in 2013, I would like to attend the Oxford Scenarios Programme, hosted by the Saïd Business School, University of Oxford. This futures-development coursework would dramatically increase my contributions to the MIPM program. The Oxford Scenarios Programme would also fit in with my long-term objectives of executive-level information assurance development.

The MIPM program is a clear next step. Being a critical and strategic thinker, I have outlined two primary objective-oriented paths with many levels of goals: one academic, one career-oriented. I have taught myself to pursue what I can when I can, and to merge these two paths whenever possible. The MIPM program would undoubtedly be one of those rare opportunities to merge both paths. My long-term career objectives include the executive management of information assurance processes. Furthermore, I hope to advance my company, Sagawa LLC, which will build a network security appliance that utilizes Suricata, an intrusion detection and prevention system (IDPS) developed by the Open Information Security Foundation (OISF). OISF is funded by the Department of Homeland Security (DHS) and the Navy’s Space and Naval Warfare Systems Command (SPAWAR). I originally became interested in developing my skills in security information and event management (SIEM) using Suricata because of the DHS’s and the Navy’s direct support for its development; supporting federal information assurance initiatives greatly appeals to me.

I have many hobbies. For entertainment (please keep in mind that I am an introvert), I study information philosophy and information systems theory, and have a general interest in complex systems theory and intelligence analysis. I also read a great deal of information-security-related media. I contribute to the project by developing public-domain-licensed standard operating procedures for installing and using open-source cryptographic communication tools. Additionally, I maintain two Tor exit routers. I have a keen interest in supporting international freedom of expression and the right to read (anti-censorship).

Every single year of my formal education has been an outstanding challenge. The one exception was a single quarter spent with Dr. Barbara Endicott-Popovsky in the UW’s IMT 551, where I was on the edge of my seat during every class because of my excitement about the course material. As a young child, I was diagnosed as both gifted and learning-disabled. Like many students with this “twice-exceptional” condition, I have dealt with an unnecessary amount of frustration from teachers and mentors. Elementary and middle school teachers repeatedly called me lazy. High school administrators told me not to pursue higher education. Toward the end of my undergraduate degree, my disabilities-support adviser declared me a failure who would only succeed in life as an entrepreneur. These once-frustrating setbacks have not overcome my tenacity.

Due to my academic history, you may not view me as an ideal candidate for a respected tier-one research institution. My twice-exceptional condition is rooted in a physical re-conditioning of my brain, and it forces me to assimilate and process information differently. For example, my team lead at Big Fish Games told me that he values my feedback when problem-solving because I present unique, useful information. There is no doubt that I have academic weaknesses; however, my cognitive differences also give me uncharacteristic academic strengths. I passionately believe that my differences will aid the MIPM program, from which I clearly see myself graduating successfully.

Toward an Open Privacy Specification

Information privacy is the claim of individuals to determine what information about them is disclosed to others and encompasses the collection, maintenance, and use of identifiable information. Privacy is an important value in a democratic society. For individuals, it enhances their sense of autonomy and dignity by permitting them to influence what others know about them. For associations, privacy enhances the ability of individuals to function collectively by permitting the association to keep deliberations and membership and other activities confidential. For society, privacy fosters individual and associational contributions to society, promotes diversity, and limits undesirable conduct and abuse of authority by government and other institutions.

Toward an Information Bill of Rights & Responsibilities

This post is a brain dump of my ideas for designing a new, community-driven (#open) certification. I’d like to eventually make this a program that is actively maintained by the Open Knowledge Foundation and licensed under the Creative Commons.

First, check out this 5-minute video presentation from the Chaos Communication Congress, where this idea spawned:

28c3 LT Day 2: Securing the Servers: Privacy Policy for Providers

The PCP is a policy for communication service providers who seek to respect the privacy of their user base. It includes a set of modules covering various aspects of server configuration, with three levels in each module that provide progressively more privacy.

I’d like to adapt their work, specifically, to create an open framework made up of a spectrum of policies and procedures for auditing and implementing privacy-centric services for information hosting providers. So, whether you’re a blogger or an internet service provider, you would use this specification to audit yourself, make specific changes to your network or hosting infrastructure, and then precisely outline those capabilities publicly. This would be a voluntary, trust-based process, since service providers will be their own auditors.

Their existing work:

Open Privacy Specification


Collaboratively build an open framework for a broad range of internet-based information service providers with the objective of creating and maintaining specific policies, procedures, and certifications for objectively controlling personal information.


Fundamentally, maintaining individual privacy requires the ability to control the confidentiality, integrity, and availability of specific information. Information that cannot be controlled by a service’s user must be defined and made publicly available, in detail, without compromising the security of the information hosting provider.

The purpose of the Open Privacy Specification is to:

  • define the relative privacy expectations between the information hosting provider’s service and its users;
  • design and implement services that safeguard users whenever possible against voluntary and involuntary compromise;
  • provide users with meaningful information about their ability to maintain their privacy while using said services;
  • implement routine processes and secure controls via standardized policies and procedures;
  • implement a standardized public disclosure document outlining the information service provider’s metric-based capabilities and limitations.


Certifications will be built around service capabilities and information management infrastructure.

Service capability examples include:

  1. Pertinent regional laws (when available)
  2. Organizational management (as permissible)
  3. Automated and manual processes (as permissible)

Information management infrastructure examples, standardized around the OSI model, may include:

  1. Physical, data link, network, transport, and session layers:
    1. Upstream providers’ capabilities and limitations
    2. Hardware configuration, capabilities, and limitations
    3. Network configuration, capabilities, and limitations
  2. Presentation and application layers:
    1. Operating system configuration, capabilities, and limitations
    2. Software application configuration, capabilities, and limitations

Future revisions of specific policies or procedures should be adaptable to existing information assurance frameworks such as PCI-DSS, COBIT, NIST, or ISO/IEC 27002. At the moment, I’m thinking about sponsoring a hack-day event with the University of Washington to launch the initial draft. I think it would be a solid start. As always, feel free to share any commentary.

Layered email security

There are two takeaways from CloudFlare’s recent security breach that are outstanding pieces of actionable information.


Ensure your password on your email account is extremely strong and not used on any other services…



…using an out-of-band authentication that doesn’t rely on the phone company’s network (e.g., Google Authenticator App, not SMS or voice verification).

If you already have two-factor authentication turned on for your Gmail or Google Apps account, you likely have a cell phone number or a landline number in use. It’s really easy to remove the number once you are using the Google Authenticator app. If you have a rooted phone like me and enjoy reflashing your phone to try out new ROMs or mods, be sure to deactivate two-factor authentication before you purge your apps!

If you aren’t using two-factor authentication… may the internet gods be with you XD

Password Reset

Hello Company,

Can you please assist me with resetting my account password for the company customer portal? I don’t remember how I answered my “security questions”. I never reuse answers, since answering the same question at multiple locations (like my bank) is no different from using a password twice, except that an attacker could actually figure these answers out just by finding the right information.

If security is important to you, you should look into multi-factor authentication rather than simply increasing the number of passwords a person has to type in. Please forward this suggestion to Jane Doe, your CIO, who apparently designed the company customer portal.

By the way, when you prevent web browsers from remembering my randomly generated passwords, it gets in the way of my workflow. I must have saved the password in clear text somewhere, but instead I’m now spending my employer’s time emailing you for help.


[changed for privacy]

Corporate-centric Information Operations

The general applicability of information warfare as an extension of information assurance seems prudent. CIO’s and CISO’s should be aware of correlating risks and act appropriately. The modern enterprise is dependent on an ever-growing need for information and automated information management systems. While information assurance offers a holistic approach to defending a business organization, the risks of all three classes of information warfare are not even brought into question in corporate-level information assurance policy management frameworks. The following model is an attempt to describe where information operations risk management could be implemented. Short-term damage can largely be applied to class 2 information warfare (corporate implications), while long-term damage can be applied to class 3 information warfare (regional or global implications).