Dealing with Users Gone Bad…

We have all been there: someone who gets paid more than you do runs in and says "We need to remove User X from the network and lock them out of their PC to preserve evidence!!!" If not, you will be.

So…aside from any 3rd-party endpoint software you might have in that environment…what do you do? I have a list of steps, all native Windows functions, that I have found help eliminate avenues for User X's continued use of ComputerX.

  1. Disable User X’s Active Directory Account
    This can be done in a number of ways, but this post has a PowerShell tag, so I recommend downloading Remote Server Administration Tools for your flavor of Windows if you don't have it already. Then it is a simple matter of … (see the sketch after this list). Even if the user's AD account is disabled, they can unplug from the network and work off of cached credentials for a good while, so we need to nuke those too.
  2. Delete All Cached Credentials
    This will set the number of cached creds Windows is storing to 0 and erase the previous creds.
  3. Change their BitLocker Key Remotely
    This is a fun one. This will delete the user’s BitLocker key and save a new one that the user doesn’t know so they can’t unlock the drive after a reboot.
  4. Shut down the remote system using any number of methods.
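Here is a rough PowerShell sketch of all four steps. Treat it as an illustration rather than battle-tested code: it assumes RSAT's ActiveDirectory module is installed, PowerShell Remoting is enabled on the endpoint, and "UserX"/"ComputerX" stand in for the real names.

# 1. Disable User X's Active Directory account.
Disable-ADAccount -Identity "UserX"

# 2. Stop the endpoint from caching credentials.
Invoke-Command -ComputerName "ComputerX" -ScriptBlock {
    Set-ItemProperty -Path "HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon" `
        -Name "CachedLogonsCount" -Value "0"
}

# 3. Rotate the BitLocker recovery password so the user no longer holds a working recovery key.
Invoke-Command -ComputerName "ComputerX" -ScriptBlock {
    manage-bde -protectors -delete C: -Type RecoveryPassword
    manage-bde -protectors -add C: -RecoveryPassword
}

# 4. Shut the remote system down.
Stop-Computer -ComputerName "ComputerX" -Force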

This is not meant to be an exhaustive list but rather a collection of practical considerations. Lemme know if this helps or you have other suggestions.

UnXORing a RAT

I was looking at a sample of malware and I wanted to highlight a very simple technique I used to remove a layer of encryption without knowing the key. This post is not going to impress the seasoned veterans, but since I think it will probably help someone in the community, I will bang out a post on it.

I was in the middle of reverse engineering a sample of AdWind RAT. McAfee has a great post on it here. As I read their article, my sample seemed to be in line with Variant 2, which, among numerous other layers of obfuscation, has an XORed config.ini file that contains another malicious payload.

Looks like this:
[screenshot of the XORed config.ini]

Since the McAfee article mentions this file is an XML document, I saw an opportunity for a cheap win. The article shows the first line of the file as:
"<?xml version="1.0" encoding="UTF-8" standalone="no"?>"
One of the biggest weaknesses of using XOR as an encryption mechanism is that if you have enough plaintext, you can XOR the ciphertext with the plaintext to derive the key that was used. So with that being said…TO THE CODE!
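This isn't my original script, just a minimal sketch of the idea, assuming the encrypted file is sitting in the working directory as config.ini:

known_plaintext = b'<?xml version="1.0" encoding="UTF-8" standalone="no"?>'

with open("config.ini", "rb") as f:
    ciphertext = f.read()

# XOR the start of the ciphertext against the known first line to expose the key stream.
key_stream = bytes(c ^ p for c, p in zip(ciphertext, known_plaintext))
print(key_stream)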

Here is what we get:
[screenshot of the recovered key stream]
Immediately you can see the "ABJSIOODKKDIOSKKJDJUIOIKASJIOOQKSJIUDIKDKIAS" pattern starting over from the top of the file. This is our XOR key. Now that we have the key, all we need to do is XOR the original ciphertext file against it and get the complete plaintext document. MORE CODE!
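Again, a rough sketch rather than my exact script: XOR the whole file against the repeating key.

from itertools import cycle

key = b"ABJSIOODKKDIOSKKJDJUIOIKASJIOOQKSJIUDIKDKIAS"

with open("config.ini", "rb") as f:
    ciphertext = f.read()

# Repeat the key for the full length of the file and XOR byte-for-byte.
plaintext = bytes(c ^ k for c, k in zip(ciphertext, cycle(key)))

with open("config.decoded.xml", "wb") as f:
    f.write(plaintext)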

and voilà!
[screenshot of the decoded XML payload]

With this cleartext document in hand I could continue with my analysis. Again, this isn’t breathtaking but it is worth sharing.

Getting a Grep on Things

Sometimes when it rains, it pours. It was 3pm on Christmas Eve and I was mentally out the door and halfway home already when proxy alerts came in for a known malicious file (which, for the sake of this blog, will be referred to as "malicious.exe") hitting a handful of endpoints. Here are a couple of tricks I used to get out of the door by 4:15pm.

I use a tool that pulls back a targeted logical disk image from remote machines for my triage effort. So as you read this it is probably easier to think in terms of forensic disk analysis rather than incident response.

I saw numerous requests for malicious.exe from each machine in the proxy logs. From the surrounding traffic, it was more consistent with a drive-by redirect to a watering hole than anything a user clicked on.

So I mounted the disk images from the 4 machines to C:\Cases and here is how I quickly triaged the extent of compromise:
C:\Cases> strings -s * | grep malicious.exe
Here is what I got back:

So for this first machine the only mention of malicious.exe was found in the Internet Explorer 11 history. Upon further investigation, it turns out that the user cancelled all of the download attempts and the malware never hit the filesystem. NEXT!

Looks like Machine2 was not so lucky. My 30 second analysis on this data tells me:

  1. There is an $MFT record for malicious.exe. This means it successfully downloaded and found a home in the file system. It also looks like there is a small file resident in the $MFT that contains that filename; looking at the tail end of the string, it looks like JavaScript. This makes for a great IOC.
  2. I see numerous records in the $UsnJrnl-$J that can most likely be attributed to Internet Explorer attempting to download the malware 3 times before the user said "Ok" and allowed the download. IE creates a "filename.randomstring.partial" file for downloading files. Once the download is complete and everything checks out, it renames the file to filename.exe. Here I see malicious.exe.3l98fuk.partial, malicious.exe.sx2afu8.partial, malicious.exe.trtsnvz.partial. There are also a number of $UsnJrnl-$J records for just malicious.exe. While most of these could be associated with the changes in file size during download and/or the file rename, it is enough of a red flag to pull this box from the network and move on.
  3. While I pulled the box off the network, it was mostly to err on the side of caution. I really only see malicious.exe in the IE WebCacheV01.dat, and nothing that screams "execution": no prefetch, no user or system registry hits, no shortcut files, and no hits in the event logs. Again, this is my 30-second analysis (I still had presents to wrap).
  4. This machine validated my findings from the first machine: Machine2 exhibits signs of the file hitting the file system; Machine1 did not.

NEXT!

Much like Machine1, Machine3 shows very little filesystem activity. The only files that contain the “malicious.exe” file name are:

  • C:\Cases\Machine3\C\Users\User3\AppData\Local\Microsoft\Windows\WebCache\V01001B0.log
  • C:\Cases\Machine3\C\Users\User3\AppData\Local\Microsoft\Windows\WebCache\WebCacheV01.dat

NEXT!

This user had Chrome set up as the default browser, and the only place in the collected evidence that we see the malware's filename is in the Chrome History file. I decided to crack it open and get some better definition on what happened.
[screenshot of the Chrome History downloads record]

  1. First things first…Chrome didn't log a current or target path for the malware. This means the download was never written to disk. Great, but how?
  2. The interrupt_reason is populated with code 40. A quick look at a copy of the Chromium source code reveals what happened:
    // The user canceled the download.
    // "Canceled".
    INTERRUPT_REASON(USER_CANCELED, 40)

  3. No end time for the download because the user cancelled it.
  4. Finally, the malware wasn't opened because the user cancelled it.
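If you want to pull those rows yourself, a minimal Python sketch looks something like this (the History path is just an example location on a mounted image):

import sqlite3

history = r"C:\Cases\Machine4\C\Users\User4\AppData\Local\Google\Chrome\User Data\Default\History"

conn = sqlite3.connect(history)
# Grab the download records, including the interrupt_reason code discussed above.
for row in conn.execute("SELECT current_path, target_path, state, interrupt_reason, end_time, opened FROM downloads"):
    print(row)
conn.close()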

Now, before I catch a lot of flak for not being thorough enough: this malware has well-established behavior patterns that are easy to check for. The Sysinternals version of strings checks for Unicode and ASCII strings at the same time, which offers the best avenue for finding evidence of the filename on disk. Even from this small sample of workstations, we can quickly see how far a malicious file has penetrated a file system and determine the extent of compromise.

Does this technique work for all malware? No, but it is a great start.

Review Of Mastering Python Forensics

On October 30th, Dr. Michael Spreitzenbarth and Dr. Johann Uhrmann released Mastering Python Forensics.
Here are a few of my thoughts on it:

Chapter 1: Setting Up the Lab and Introduction to Python ctypes

I enjoyed learning about Python virtualenv. For those that don't know, virtualenv is a tool that keeps the dependencies required by different projects in separate places by creating virtual Python environments for them. This helps when packaging tools for distribution. The book does an excellent job of explaining the basics and incorporates virtualenv throughout all of the examples in the text. I also discovered this nice write-up that gives a great rundown.
The Introduction to Python ctypes was worth reading 3 or 4 times. I have been using ctypes and structs in Python for a while, but this chapter helped organize and clarify the "copy and paste" knowledge I was working with. I expect to use the portion about interacting with DLLs as a reference numerous times in the pursuit of reverse engineering malware.
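This isn't an example from the book, just a tiny illustration of the ctypes idea on a Windows box: load a DLL and call one of its exports directly.

import ctypes

# Load kernel32.dll and call GetTickCount() straight from Python.
kernel32 = ctypes.WinDLL("kernel32")
kernel32.GetTickCount.restype = ctypes.c_uint32
print("Milliseconds since boot:", kernel32.GetTickCount())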

For me, this first chapter alone was worth the cost of the book. I think I can leverage the principles covered in this chapter to do some interesting stuff.

Chapter 2, Forensic Algorithms

While this chapter is surely a necessary topic for beginners, it was not particularly revolutionary for me. They did a great job of covering the subject though. I did like the addition of a Python client for nsrlsvr to compare file hashes against the list of known files provided by the good people at NIST. That might come in handy.

Chapter 3, Using Python for Windows and Linux Forensics

This is a deep subject to cover in one chapter. This chapter highlights one of the things I like the most about this book: it leans on assumed knowledge by quickly covering the foundations and moving on to advanced topics. This is not a "basics of forensics" or "Python for Dummies" style of book, but it covers enough to still be inclusive for the ambitious novice. This chapter:

  • does a great job explaining how to use Willi Ballenthin's Python-EVTX to parse Windows Event Logs and search for Indicators of Compromise.
  • does feature a small blurb about the plaso and log2timeline tools. For what it is worth, I would have liked to see more on this project but, again, they had a lot of ground to cover, so I understand the brevity.
  • covers the Windows Registry structure and highlights what to look for in the registry. The chapter fails to mention Mr. Ballenthin's Python-Registry module and the resulting Python Registry Parser from Patrick Olsen, both of which I think are missed opportunities. It does cover Andrew Davis' ShimCacheParser, which uses an old version of Ballenthin's Registry module. The surface was scratched, but in my opinion they could have dug a little deeper.
  • nails Linux Forensics 101 with Python perfectly. I have nothing but praise for this section of the book. It is rich with practical examples and insightful explanations.
  • does a good job of using matplotlib to visualize data with histograms. The method demonstrated could be leveraged to display all kinds of data.

Chapter 4, Using Python for Network Forensics

This chapter is short and doesn’t cover as much ground as I would have hoped. This chapter:

  • Introduces Dshell to dissect packet captures. I learned a lot from this section and see a lot of practical applications for the examples provided. Dshell makes carving files out of pcaps easy peasy.
  • Offers an example of using Scapy during a forensic investigation. The script they provide records statistics about the geolocation of the source and destination IP addresses of an ongoing network connection. While this seems cool, it falls more into the realm of Incident Response than Forensics.
  • Explains how to use Scapy to create a simple port scanner. Building a port scanner for a Forensic investigation feels like a stretch to me. Again, I feel like this is more suited for IR or Penetration Testing. Neat though.
  • I would have loved to see more deep packet analysis or visualization of traffic patterns. Perhaps decrypting CryptoWall’s RC4 C&C traffic in Python? Just a thought.
  • For the next edition, newer projects like Omri Herscovici’s CapTipper would make a fine addition.

Chapter 5, Using Python for Virtualization Forensics

Chapter 5 is well executed and covers its topic thoroughly. This is another chapter that made me glad I bought the book, though I will be the first to admit that I didn't lob any of the example scripts at an actual VMware vSphere environment. I enjoyed this portion of the book because it describes an area of digital forensics that is outside of my usual wheelhouse. The chapter describes how to use pyVmomi, VMware's Python SDK for the vSphere API, to analyze ESX, ESXi, or vCenter systems for the creation of rogue virtual machines, the creation of rogue virtual network devices on existing virtual machines, and virtual machines' direct hardware access. Pretty cool stuff. Since I am more of a VMware Workstation/Fusion type of guy, I have done similar things using the vmrun command-line interface.

Chapter 6, Using Python for Mobile Forensics

I am not a seasoned mobile forensics analyst, but I struggle with the mechanics of rooting an Android phone in the pursuit of a forensic investigation. I say all that to say this: to me it seems like introducing an exploit to jailbreak or root a mobile device would not only be a tough sell in court but would also trample all over filesystem metadata during the rooting process. I have some experience using Cellebrite's equipment to perform logical extractions of BlackBerries and Android devices, which doesn't dig deep but doesn't compromise the device either.

For the record, I am familiar with rooting. I rooted all four of my Motorola Droids (Droid 1-4) (I was a sucker for the sliding keyboard, don't judge me) and my Motorola Xoom (Motorola had me in their clutches for a while). Like many early Android rooters, I pretty much only rooted my phone to install hacking tools and use the phone as a WiFi hotspot. Once the mobile hotspot became a mainstream feature and rooting became an enormous pain in the ass, I stopped scouring the Android forums for hours trying to find shady recovery images and poorly written instructions for my exact model of phone. All things considered, it was a good turning point in my life. I digress…

This chapter assumes you have rooted/jailbroken your device, so, tabling all of my silly apprehensions about compromising the device, I found this to be another very interesting chapter. It covers using Python to grab the hash of the screen PIN and cracking the hash with hashcat to unlock the screen. I could see this being very handy. Overall, I see enormous practicality in the examples provided in this chapter. I was previously unaware of the ADEL project, and while I haven't evaluated it, up front it says it works on Android 4.X. I wonder if the development of this tool and tools like it is keeping up with the versions of Android; as of October 5th, we are up to Android 6.0 Marshmallow. For the record, the project looks awesome, and this book does a great job of explaining its applications.

Chapter 7, Using Python for Memory Forensics

This is not the Art of Memory Forensics and it doesn't try to be. I like that. The subject of analyzing Windows memory images with Volatility is covered, in depth, all over the place, so I appreciate that this book doesn't attempt to cover it again. Smart move. Instead, it offers examples of how to analyze volatile memory from Android and Linux devices. By targeting this subset of Volatility's functionality, I think it adds a lot of value to the text. This chapter was another that broadened my horizons and made me glad I purchased the book.

Conclusion

If you are someone who would read this blog, you should buy this book. It is easy to digest and a wonderful starting point for DFIR professionals who are interested in leveraging Python to accomplish their work. "Forensics with Python" is a broad and ambitious topic to cover in 192 pages with diagrams and source code. Spreitzenbarth and Uhrmann did an impressive job of tackling the variety of subjects in an appropriate level of detail to make this book useful for forensic analysts of any experience level. Well done.

What happens when Windows Defender Quarantines Stuff…

Recently a colleague of mine asked me what happens in the file system when a malicious file is "quarantined." The answer varies widely, and as this is the "secret sauce" for many antivirus vendors, most of the time it is not well documented how they do the voodoo they do. Seems like something that might make for a good blog or two, so I sat down and did a few tests.

This post is going to cover what happened on my Windows 8 VM when I turned Windows Defender against a vicious EICAR.TXT file!

Windows Defender is a software product that attempts to detect and remove malware. Initially released as an antispyware program, it was first released as a free download for Windows XP, shipped with Windows Vista by default, and currently ships with antivirus capabilities as part of Windows 10. –Wikipedia

I chose to beat up on Windows Defender mostly because it is free and has a huge market share. Nothing personal.

So first things first: I grabbed the EICAR file and saved it to C:\temp.
Then I grabbed a copy of the $MFT to take a look at this file's record. Looks like this:
[screenshot of the EICAR.TXT $MFT record]
There is a lot going on in there but I just wanted to focus on a few things. If you are lost, read this.

NEXT, I turned on Windows Defender real-time protection. It was recommended.
[screenshot of Windows Defender real-time protection]

Then a whole bunch of stuff happened.

Let’s start with $MFT record number 27152. So I quickly dumped the $MFT again and here’s what I got:

[screenshot of the reused $MFT record 27152]

So what changed? Pretty much everything except the $MFT record number.

The sequence number was incremented by 4, indicating several changes to the record: specifically, the rename and the move to a new parent folder.
Let's take a closer look at the $UsnJrnl-$J to get an idea of what happened:

So, in short, Windows Defender deleted the original file. The MFT record number was up for grabs, so it was picked up by a newly created file: C:\ProgramData\Microsoft\Windows Defender\Scans\History\RemCheck\5A7D7B64F11FF203E09434276A974A97

So where did my EICAR file go? Windows Defender puts quarantined files under C:\ProgramData\Microsoft\Windows Defender\Quarantine\ResourceData\. Mine was saved as C:\ProgramData\Microsoft\Windows Defender\Quarantine\ResourceData\50\50761523FA79FDF68E04707959836D1F6DBA9969.
Let’s take a look at that:
[screenshot of the quarantined file in a hex viewer]
For those that don’t know, Windows Defender and Microsoft Security Essentials Quarantine files have a magic number of 0B AD 00. Clever.

Looking at the histogram of the data, it is pretty obvious that it was stored using some kind of encryption.
[screenshot of the byte-value histogram]
After doing a bit more digging, it turns out that Windows Defender uses a hard-coded RC4 key to encrypt quarantined files.
A colleague of mine pointed me at this cool script from Cuckoo.
Here is the relevant chunk of their code that I bastardized for this blog post:
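What follows is a minimal stand-in rather than Cuckoo's actual code: a plain RC4 routine run over the quarantined blob. DEFENDER_RC4_KEY is a placeholder; the real hard-coded key has to be carved out of mpengine.dll (or lifted from Cuckoo's script).

DEFENDER_RC4_KEY = b"\x00"  # hypothetical placeholder, not the real key

def rc4(key, data):
    # Standard RC4: key-scheduling algorithm followed by the PRGA.
    S = list(range(256))
    j = 0
    for i in range(256):
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    out = bytearray()
    i = j = 0
    for byte in data:
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(byte ^ S[(S[i] + S[j]) % 256])
    return bytes(out)

path = (r"C:\ProgramData\Microsoft\Windows Defender\Quarantine\ResourceData"
        r"\50\50761523FA79FDF68E04707959836D1F6DBA9969")
with open(path, "rb") as f:
    print(rc4(DEFENDER_RC4_KEY, f.read())[:64])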

The RC4 cipher can be found twice in each of these files:
c:\ProgramData\Microsoft\Windows Defender\Definition Updates\Backup\mpengine.dll
c:\ProgramData\Microsoft\Windows Defender\Definition Updates\Default\MpEngine.dll
c:\ProgramData\Microsoft\Windows Defender\Definition Updates\{D45C13C3-59B3-4726-B82F-03461072F006}\mpengine.dll
c:\Users\All Users\Microsoft\Windows Defender\Definition Updates\Backup\mpengine.dll
c:\Users\All Users\Microsoft\Windows Defender\Definition Updates\Default\MpEngine.dll
c:\Users\All Users\Microsoft\Windows Defender\Definition Updates\{D45C13C3-59B3-4726-B82F-03461072F006}\mpengine.dll
c:\Windows\WinSxS\amd64_windows-defender-am-engine_31bf3856ad364e35_6.3.9600.16384_none_efe9bba68a38095a\MpEngine.dll

Looks like this:
[screenshot of the RC4 key material in mpengine.dll]
I might dig a little deeper on this but this is all for now. Hope this helps.

Parsing Chrome Artifacts with Python! Part 3

Continuing on my mission to bore the crap out of my readers, I took a look at the HTML local storage databases I discovered in Part One.

What is HTML Local Storage?

With local storage, web applications can store data locally within the user’s browser.
Before HTML5, application data had to be stored in cookies, included in every server request. Local storage is more secure, and large amounts of data can be stored locally, without affecting website performance. Unlike cookies, the storage limit is far larger (at least 5MB) and information is never transferred to the server. Local storage is per origin (per domain and protocol). All pages, from one origin, can store and access the same data.
–http://www.w3schools.com/html/html5_webstorage.asp

If this is a foreign concept to you, try it out:


Pretty simple stuff. If you browsed this from Chrome, you should now have an https_jon.glass_0.localstorage file in your local storage directory.
On my Windows 7 VM the Chrome Local Storage files are located here:
C:\Users\UserName\AppData\Local\Google\Chrome\User Data\Default\Local Storage
Here is a look at a few of mine:

It is worth mentioning that there are a few investigative points you can glean from just this directory listing:

  • The protocol of the site (http/https).
  • The domain.
  • The last modified date and time of the local storage file.

Let’s look at my local storage entry from a few angles.
First, let's check out Chrome's developer tools > Resources > Local Storage:
[screenshot of the entry in Chrome developer tools]
Here is a shot of what it looks like in the ol' SQLite Browser:
[screenshot of the entry in SQLite Browser]
Here is some Python that will read it, too:
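This is a minimal sketch rather than my full script, assuming the legacy .localstorage format: a SQLite database with a single ItemTable whose values are UTF-16-LE blobs.

import sqlite3

path = (r"C:\Users\UserName\AppData\Local\Google\Chrome\User Data\Default"
        r"\Local Storage\https_jon.glass_0.localstorage")

conn = sqlite3.connect(path)
# ItemTable holds key/value pairs; values are stored as UTF-16-LE blobs.
for key, value in conn.execute("SELECT key, value FROM ItemTable"):
    print(key, "=", value.decode("utf-16-le", errors="replace"))
conn.close()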

[screenshot of the Python output]
I have looked at a bunch of these, and the format is mostly the developer's choice. It is interesting to see how they are leveraged. For example, Wikipedia's local storage is so big, comparatively, because they are storing Base64-encoded images in the SQLite database.

Kinda makes you wonder about what is hiding in YOUR local storage.
Well that is all for now…

Parsing Chrome Artifacts with Python! Part 2

Continuing on with my riveting series on how to parse Chrome SQLite files with Python, let's dig a little deeper into the SQL portion by taking a look at the History database.

History Database

Here are the tables and associated fields:

  • downloads: id,current_path,target_path, start_time, received_bytes, total_bytes, state, danger_type, interrupt_reason, end_time, opened, referrer, by_ext_id, by_ext_name, etag, last_modified, mime_type, original_mime_type
  • downloads_url_chains: id, chain_index, url
  • keyword_search_terms: keyword_id, url_id, lower_term, term
  • meta: key,value
  • segment_usage:  id, segment_id, time_slot, visit_count
  • segments: id, name, url_id
  • urls: id, url, title, visit_count, typed_count, last_visit_time, hidden, favicon_id
  • visits: id, url, visit_time, from_visit, transition, segment_id, visit_duration
  • visit_source: id, source
  • ses

Some of these tables are useful on their own, but much more information can be derived from querying more than one at a time. For example, the visits table stores information about when and how often you visit a URL. To reduce redundant information in the database, instead of storing the URL again, the visits table contains a pointer to the id of the URL in the urls table.

Here are some more examples of SQL statements you could run against the History file if you are interested in the exercise.
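These are stand-ins rather than my original queries, but they give the flavor: typed search terms and the most-visited URLs, wrapped in Python.

import sqlite3

conn = sqlite3.connect("History")

# Search terms typed into the omnibox, tied back to the URL they produced.
for term, url in conn.execute(
        "SELECT k.term, u.url FROM keyword_search_terms k JOIN urls u ON u.id = k.url_id"):
    print(term, "->", url)

# The ten most-visited URLs.
for url, count in conn.execute(
        "SELECT url, visit_count FROM urls ORDER BY visit_count DESC LIMIT 10"):
    print(count, url)

conn.close()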

I can hear the groaning already…
“Jon, I am a seasoned Information Security professional and I am taking the time to read your silly blog. Don’t bore me.” Imaginary Reader, I respect your candor and thirst for knowledge. Fine! Challenge accepted.
Here is a query that pulls, from the visits and urls tables, the URL for each visit and the URL of its referring visit, assuming both are available.
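A rough reconstruction of that query (not my original), wrapped in Python: join visits to urls, then walk from_visit back to the referring visit's URL.

import sqlite3

conn = sqlite3.connect("History")
query = """
    SELECT v.id, v.visit_time, u.url, ru.url AS referring_url
    FROM visits v
    JOIN urls u         ON u.id  = v.url
    LEFT JOIN visits rv ON rv.id = v.from_visit
    LEFT JOIN urls ru   ON ru.id = rv.url
"""
for row in conn.execute(query):
    print(row)
conn.close()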

From the output of this query we can see the referring URL that sponsored each record in the visits table. One note about this particular artifact: it doesn't look like the visits.from_visit field is populated for new sessions or tabs, so this more or less illustrates the URL changes within one tab. Looking through the sample output, I see a lot of http-to-https redirects, so these transitions are not dependent upon user interaction. I'm not saying this is particularly useful; it is more or less just an example of how you can build tools to leverage SQL differently. You could see the same thing, arguably better, from:

I will use these kinds of techniques in the next few posts to show off various nifty Chrome artifacts. Thanks for reading. More to come soon.

Parsing Chrome Artifacts with Python! Part 1

I recently googled myself and saw that I had a blog. After several failed guesses at my e-mail address and password, I was able to stumble back in here. For what it is worth, I have been very busy doing some cool stuff at work. Along those lines, I wanted to highlight some tips on Chrome forensics. As usual with my blog posts, you are going to have to do your homework for them to make complete sense.
I recommend reading these:
http://forensicswiki.org/wiki/Google_Chrome
https://digital-forensics.sans.org/blog/2010/01/21/google-chrome-forensics/

In fact, the more I look at those posts…I am not even certain any of the following is new or original, but I am posting examples of how to use SQLite and Python to parse Chrome files. If you want one comprehensive Python script for parsing all of the Chrome artifacts from the various versions of Chrome, grab a copy of hindsight by Ryan Benson from Obsidian Forensics. Looks like he used the Chromium source to interpret the values in each of the tables. Top notch.

First things first, which of Chrome's files can you parse with SQLite?
Here is a quick and dirty way to find all of the Chrome SQLite files on Windows:
All of the sqlite files have an associated rollback journal with them. A rollback journal is a temporary file used to implement atomic commit and rollback capabilities in SQLite. Rollback journals have the same filename as the database file except there is a “-journal” appended to the end.
Example:
C:\Users\User\AppData\Local\Google\Chrome\User Data\Default\History is the sqlite database that stores Chrome’s web history.
C:\Users\User\AppData\Local\Google\Chrome\User Data\Default\History-journal is its rollback journal. With that in mind…
Or if you are into Apple’s:
Or if you use Linux like a real forensicator…
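As a rough, cross-platform stand-in for those one-liners, this walks the Default profile directory (on macOS it lives under ~/Library/Application Support/Google/Chrome/Default, on Linux under ~/.config/google-chrome/Default) and flags anything that starts with the SQLite magic header.

import os

# Windows profile path shown; swap in the macOS or Linux path as needed.
profile = os.path.expandvars(r"%LOCALAPPDATA%\Google\Chrome\User Data\Default")

for root, _, files in os.walk(profile):
    for name in files:
        path = os.path.join(root, name)
        try:
            with open(path, "rb") as f:
                # Every SQLite 3 database starts with this 16-byte header.
                if f.read(16) == b"SQLite format 3\x00":
                    print(path)
        except OSError:
            pass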

By reviewing the resulting directory listings, we can see that all three versions have the following files:
Origin Bound Certs – DB of Origin-Bound Certificates (OBC). OBCs are self-signed certificates that the browser uses to perform TLS Client Authentication.
QuotaManager – Handles offline content quotas for AppCache, IndexedDB, WebSQL and File System API.
Shortcuts – Contains info about the “omnibox” shortcuts that come up when you open a new tab or window.
databases/Databases.db – Not 100% on these but I found an evercookie in one.
Favicons – Keeps track of icons associated with web sites.
History – The file that gets you in trouble with your boss/wife/priest.
Network Action Predictor – When you start typing stuff in the navigation bar, Google makes a guess at what you want based on previous stuff you have looked at and straight-up voodoo. This file keeps track of what you typed, what Google guessed, and how accurate the guess was, based on whether or not you clicked on one of the guesses in the drop-down list that appears. This is a great artifact for attribution because it is generated by hands on keyboards.
Application Cache/Index – Cache for Chrome Apps.
Web Data – Contains mostly autofill data. Some useful timestamps.
Login Data – Contains any username and password you have asked Chrome to store for you. The usernames are in plaintext; on Windows the passwords are protected only by the user's own DPAPI key, so anything running in that user's context can recover them.
Cookies – This is where Chrome stores all of the bits of crap that web sites use to remember you.
Top Sites – Name says it all really.
Extension Cookies – Cookies for Chrome Extensions
Safe Browsing Cookies – Google uses these for determining how well their server-side components are functioning.
Additionally, all 3 versions have local storage for extensions and websites that have been visited.
The format looks like this: Local Storage/PROTOCOL_DOMAINNAME_0.localstorage
We are already profiling user activity without even trying! Thanks Chrome!

This is all well and good, but I promised you Python, and Python you shall have. Before you get too deep into code, though, download SQLite Browser.
It's cross-platform and painless to use. This will make the road ahead a lot easier.

A note about using SQLite Browser to look at Chrome files: SQLite Browser, by default, looks for .sqlite files, and while these files are SQLite databases, they do not have the nifty extension. So you will need to select All files (*) from the drop-down menu to see them.

Since I know most of you can manage a GUI, I am not going to bore you with explaining everything…
Open up any of the SQLite files we found earlier…let’s say…History
[screenshot of opening the History file in SQLite Browser]
And here is what that looks like:
[screenshot of the History database structure]
This is everything you need to start cranking out some sweet SQLite parsin’ Python scripts!
So lets say we wanted to get a list of all of the downloads from the History database.
Specifically, I want to know where it was saved, where it came from, when it started downloading, when it finished, and how big it was once downloaded. Let's wrap some Python around the select statement and bring this post to a close:

The output looks like this:
c:\Demo> ChromeDownloads.py
Download: C:\Users\User\Downloads\SysinternalsSuite.zip
From: http://technet.microsoft.com/en-us/sysinternals/bb842062.aspx
Started: 2014-12-27 02:44:33.643417
Finished: 2014-12-27 02:45:01.690324
Size: 13708848
Download: C:\Users\User\Downloads\7z936.msi
From: http://sourceforge.net/projects/sevenzip/files/7-Zip/9.36/7z936.msi/download
Started: 2014-12-27 02:46:00.532712
Finished: 2014-12-27 02:46:04.400309
Size: 1196032
Download: C:\Users\User\Downloads\Wireshark-win32-1.12.2.exe
From: https://www.wireshark.org/download.html
Started: 2014-12-27 03:36:24.090374
Finished: 2014-12-27 03:36:33.952502
Size: 23571488
Download: C:\Users\User\Downloads\0xED.tar.bz2
From: http://www.suavetech.com/0xed/
Started: 2014-12-27 03:48:12.519877
Finished: 2014-12-27 03:48:13.893038
Size: 896330

That is all for this post but I plan on digging a bit deeper on the next few.

Use Python to Encrypt Memory Files

After some digging, I ran across this post on stackoverflow.com. The basic idea here is to use standard Python libraries to take a plaintext file and make an AES-encrypted copy in a manner that is compatible with OpenSSL tools.
The code provided in the post works like a charm, but when collecting memory, disk space is a big issue. Most systems these days are running at least 8GB. Making two full-size copies of a memory dump on a host machine is not practical even under ideal conditions. Even if I compress first, we still need enough free space to cover the original plus the compressed copy. Sometimes that is not available.
To help with this issue, I adapted the script to encrypt the file as it reads it.

Here is the important tweak that I made:
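My exact tweak isn't shown here; this sketch illustrates the same idea using pycryptodome's AES in CFB mode (which keeps ciphertext the same length as plaintext, so each chunk can be written straight back over itself), with key and IV handling simplified for illustration.

import os
from Crypto.Cipher import AES  # pycryptodome

CHUNK = 64 * 1024
key = os.urandom(32)
iv = os.urandom(16)
cipher = AES.new(key, AES.MODE_CFB, iv=iv)

with open("memdump.raw", "r+b") as f:
    while True:
        pos = f.tell()
        chunk = f.read(CHUNK)
        if not chunk:
            break
        f.seek(pos)
        f.write(cipher.encrypt(chunk))  # overwrite the plaintext chunk in place

# Stash key and iv somewhere safe (e.g., wrapped with a passphrase) or the dump is gone for good.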

So we are reading a chunk of the file, encrypting it, then overwriting the chunk with the encrypted data until we get to the end of the file.
Does this really work you ask? Of course it does.
[screenshot of OpenSSL decrypting the result]
Don’t know why this feels like a magic trick but TA DA!
This doesn’t fix all of my issues with moving huge files around but this does encrypt it locally before it is sent over the network without needing double the space.
Big thanks to Thijs van Dien for putting this solution out there.