Practical SSD Secure Wiping/Erasure

In a previous post, I worked through various ways of securely erasing a Solid State Drive (SSD) — that is, a data storage device using memory chips rather than rotating magnetic disks and other moving parts typical of hard disk drives (HDDs). Now I needed to put those ideas to use. I had a laptop, running Windows 7, whose drive C was located on an SSD. I wanted to securely erase that internal SSD, so as to sell or otherwise dispose of it. (Another post discusses the challenge of wiping free space on an SSD without deleting files that the user might want to keep on that drive.)

This was a SATA SSD — that is, it used a SATA connector to communicate with the motherboard. If it had not been a SATA (or eSATA, or mSATA) drive, I might have had to use techniques different from the ones discussed here.

Also, since SSDs did not use magnetic recording technology, they did not need and would not necessarily benefit from the redundant overwrite procedures used by popular disk-wiping programs like DBAN. Those sorts of procedures could leave data unerased and, in the worst case, could even damage an SSD. Therefore, I would want to use secure erase procedures specifically intended for SSDs.

Adjusting the Partitions

In this laptop, I had the SSD and also an HDD. It seemed that I might want to perform some Windows operations on the SSD, in the course of securely erasing it. Therefore, I mirrored my Windows installation from drive C, on the SSD, over to an empty partition on the HDD. The plan here was to delete the drive C partition, but still be able to boot the machine in Windows from the HDD.

There were multiple ways to clone Windows from drive C (which I had named PROGRAMS) to that empty partition on the HDD (NEWPROGS). The approach I used was to create an image backup of PROGRAMS, and then restore it on NEWPROGS. I used Acronis to make the image. I could have used other imaging programs instead (e.g., Easeus, Macrium). This approach required me to have a third partition: I would save the image of PROGRAMS onto this third partition, and then restore that image of PROGRAMS onto the NEWPROGS partition.

As an alternative, I think I probably could have booted the computer with a Linux live CD and used its file management tool (i.e., its counterpart of Windows Explorer) to copy over all of the files from PROGRAMS to NEWPROGS. There were also other programs that would clone a partition.

Some of these operations would obviously require booting the system with something other than the Windows installation on drive C. For example, booting Linux from a live CD would require that CD, or something like it; manipulating partitions would call for the use of a bootable partitioner; programs like Acronis would be booted from a program CD. It would be possible to maintain separate CDs for each of these and other bootable tools. In another post, however, I had explored the process of creating a single, convenient multiboot drive that would combine several of these tools. Ordinarily, I would have created that multiboot drive on a USB flash drive. I had discovered, however, that my laptop ceased to boot from USB once I installed the SSD. Therefore, that other post describes the process by which I created a multiboot DVD disc. I was now using that multiboot DVD to run Acronis, the Ubuntu (Linux) live CD, and other bootable programs mentioned here.

Once I had my image of drive C and had restored it into the NEWPROGS partition, I felt I was safe in deleting the drive C (PROGRAMS) partition and everything else on the SSD. For this, I rebooted and used the GParted partitioning tool. GParted was available on Linux live CDs and elsewhere. The copy I used was the Partition Editor in Parted Magic, another highly useful bootable tool incorporated into my multiboot DVD. While in GParted, I selected NEWPROGS > right click > Manage Flags > boot.

Booting the Cloned Windows Installation

Unfortunately, Windows would not boot. I got an error message, citing Status: 0xc000000e, “The boot selection failed because a required device is inaccessible.” A search led to a variety of suggested solutions. These included resetting the BIOS, temporarily removing the second drive (in my case, the SSD), and running various commands in the Windows System Repair CD (another option in my multiboot DVD; the same functionality was also available via the Repair Your Computer option when booting the Windows 7 installation disc).

Most of the suggestions called for entry of a command. To move in that direction, I booted the Windows System Repair CD and went through its System Recovery Options > Use recovery tools > Startup Repair. It said, “If problems are found, Startup Repair will fix them automatically.” It indicated that it was “Attempting repairs…” That took maybe ten minutes. Then, sadly, it said, “Startup Repair cannot repair this computer automatically.”

An HP Support webpage, addressing multiple possibilities, suggested that maybe I should look at the BIOS setup after all. Now I recalled that there was a setting, in the BIOS, that told the computer which drive to look at first. Maybe it would not see past the SSD unless I rearranged that boot order. So I rebooted the machine and hit F2 and DEL rapidly until it let me into the BIOS settings. I moved the SSD to come after the HDD in the boot order list. Now, back in the Windows System Repair CD, this time System Recovery did see a Windows installation. It said, “Windows found problems with your computer’s startup options.” I clicked “Repair and restart.” That solved the problem: Windows booted from the HDD.

In Windows, I went into Disk Management (Run > diskmgmt.msc) and changed a few drive letters, so that certain programs on drive C would find things that I had relocated. For example, my customized Start Menu was located on drive X. But drive letters had changed, now that I had deleted a few partitions from the SSD. So I had to make sure that the partition containing the Start Menu files was once again assigned the letter X.

ATA Secure Erase via Parted Magic

Now it was time to start securely erasing the SSD. The previous post had indicated that, ideally, the SSD manufacturer would supply a reliable utility for wiping the drive. It had seemed that Intel might be an example of an SSD manufacturer that would do a good job, whereas other manufacturers had not done so. My SSD was a Crucial M500 (CT240M50, according to Speccy; CT240M500SSD1, according to the receipt). A search suggested there was no secure erase tool or capability accompanying or built into the device. So this approach was not available in my case.

Another approach identified in the previous post called for an attempt to use the ATA Secure Erase command that was supposedly available through various programs, notably Parted Magic (above). I booted again from the multiboot DVD and chose Parted Magic. In Parted Magic I went into the Start button > Disk Management > Erase Disk > Internal Secure Erase. It listed both the HDD and the Crucial SSD. Both were listed as Frozen. As advised by the tooltip, I clicked the Sleep button to try to unfreeze them. This immediately powered down the machine. I hit the computer’s power button. Instantly Parted Magic was back, and now the drives were listed as Not Frozen.

It seemed that some things had changed, in Parted Magic, since the time of the previous post. I was not sure whether this was because I was using a different version of the program (this time, it was Parted Magic 2014_02_26; not sure what version I was using previously), or because, at that point, I had perhaps been looking at an SSD mounted in an external dock rather than one plugged into the computer internally, as this SSD was.

This time around, at any rate, Parted Magic was showing me an option for Secure erase of the SSD. The tooltip said this:

Secure Erase overwrites HDD user data areas with binary zeroes. For SSD devices, Secure Erase activates the SSD controller firmware’s pre-programmed, ATA Security Erase Unit command. It applies a voltage spike to all of the NAND simultaneously, flushing the stored electrons from the flash memory cells, thus cleaning the NAND flash memory.

I went with that: secure erase of the SSD. It estimated two minutes to complete the task. I confirmed the order. It took less than one minute, and then said “Successfully Erased.” That was the timeframe I would have expected from real ATA Secure Erase.

I booted back into Windows and ran the Recuva file-recovery program. It would only work with Windows partitions, so I went into Disk Management and created a partition to fill the SSD, using the quick format option. Recuva’s Deep Scan found nothing beyond the few system files that Windows installed on a formatted drive. I tinkered briefly with two hex editors — Frhed and Hexinator — thinking that these might give me a direct look at what remained on the SSD. Unfortunately, I did not have the knowledge needed to do a proper job of that, and at this moment could not invest the time needed to develop that knowledge.

Recuva was the top data recovery program in an About Tech list of 19 freeware undeleters. The second item in that list was Puran File Recovery. I ran Puran — again, in Deep Scan mode — and also Glary Undelete, fourth on the list. Glary found nothing. Puran, running in Deep and Full scan modes, found seven files. This was very different from the result achieved in the previous post, when I had used Parted Magic on an SSD that was not connected internally, and had seen thousands of recoverable files.

I selected the seven files detected by Puran and clicked Recover with Folder Structure. One was desktop.ini, containing no significant data. Two others were unrecoverable. Of the remaining four, one contained no data. The other three contained relatively small amounts of data (10MB, in two of the three; 0.06MB in the third). When viewed in Notepad, that data was gibberish. These (e.g., $TxfLog.blf) appeared to be the same files that Recuva had found, evidently created by Windows for the new partition. As far as I could tell, then, the ATA Secure Erase command via Parted Magic had succeeded.
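Eyeballing recovered files in Notepad is one way to judge whether they contain real data. A more systematic check — a sketch I am adding here for illustration, not something I ran at the time — is to measure byte entropy: wiped, encrypted, or random content scores near 8.0 bits per byte, while real documents score noticeably lower.

```python
import math
import os
from collections import Counter

def byte_entropy(data: bytes) -> float:
    """Shannon entropy of a byte string, in bits per byte (0.0 to 8.0)."""
    if not data:
        return 0.0
    total = len(data)
    counts = Counter(data)
    # Sum p * log2(1/p) over the observed byte values.
    return sum((n / total) * math.log2(total / n) for n in counts.values())

print(byte_entropy(b"A" * 1024))        # 0.0 -- pure repetition, no information
print(byte_entropy(os.urandom(65536)))  # close to 8.0 -- looks like random noise
```

Gibberish like the $TxfLog.blf contents would score high on this scale; a recovered spreadsheet or email archive would not.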

Encryption and Reformatting

The previous post observed that, with existing technologies, short of physical destruction, one can rarely if ever be completely sure that an SSD has been fully erased. The best one could do, it seemed, was to use two or three different approaches, so as to minimize the amount of recoverable data that might remain.

In that spirit, for additional security, the previous post suggested using a program like TrueCrypt to encrypt the drive, and then quick-formatting the drive. That quick-formatting step would return the drive to ordinary usable form, and would also expose lingering data to the automatic deletion performed by the TRIM function in SSDs. The idea here was that a fully encrypted drive, using a complex password and other appropriate precautions, would be difficult if not impossible to read even if its data were not wiped — and then TRIM, operating on the finally reformatted drive, would wipe out anything that was left.

As a precaution, the previous post advised entering this command at the Windows 7 command prompt: “fsutil behavior query DisableDeleteNotify.” Once again, this produced the response, “DisableDeleteNotify = 0,” indicating that TRIM was enabled. Another precaution detailed in that post involved resizing the Recycle Bin to reduce the number of files protected from TRIM.
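The fsutil query produces a short text report. For a scripted check, one could capture that output (say, via subprocess on the Windows machine) and interpret it. A minimal sketch, assuming the output text has already been captured as a string:

```python
def trim_enabled(fsutil_output: str) -> bool:
    """Interpret the output of 'fsutil behavior query DisableDeleteNotify'.
    Counterintuitively, DisableDeleteNotify = 0 means delete notifications
    (and hence TRIM) are ENABLED."""
    for line in fsutil_output.splitlines():
        if "DisableDeleteNotify" in line:
            value = line.split("=")[1].strip()
            return value.startswith("0")
    raise ValueError("no DisableDeleteNotify line found")

print(trim_enabled("DisableDeleteNotify = 0"))  # True -- TRIM is on
print(trim_enabled("DisableDeleteNotify = 1"))  # False -- TRIM is off
```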

I had not previously attempted to encrypt an SSD. A search now led to various discussions suggesting potential vulnerabilities in SSD encryption. My situation was a bit unusual, though: I was encrypting empty space, so that others could not scan it for traces of deleted data, rather than protecting live data that I wanted to keep. Suffice it to say that I was not certain whether this step contributed much. Fortunately, the 240GB SSD was encrypted in less than 20 minutes, and it did not appear that the encryption could have done much harm. After finishing the encryption, I quick-formatted the SSD.

Some insight emerged from a Belkasoft article (2014) discussing the recovery of evidence from the computers of alleged criminals. It seemed that encryption, by itself, would actually improve the odds of recovering data from a drive, because the data was not completely gone. The investigator would still have to find a password, or something like it. But the encryption would prevent deleted data, inside the encrypted space, from being eliminated via TRIM. Hence it did seem advisable to reformat the drive, returning it to normal use, after encrypting it.

I did not bother attempting to run Recuva again at this point. What I probably should have done would be to fill the drive with files (see below) and then see whether any of them were recoverable, with or without my password, at the conclusion of this process. I would be leaving the drive connected and the system turned on for a while afterwards; I guessed that this would be enough to permit TRIM to operate.

Overwriting the Drive with Junk Files

Programs like DBAN might be confused into thinking that they were writing to each space on the drive, when in fact the SSD’s wear-leveling mechanism might be reallocating those writes, so as to repeatedly rewrite some spots while missing others. (See Micron document TN-29-42.) This did not appear to be a risk in the case of an effort to fill the drive with junk files. Assuming that each file was completely full of data — that, for instance, a 2KB file actually contained 2KB worth of junk data, without empty spaces — it did seem that a 240GB drive should be capable of holding approximately 240GB of data, not counting system overhead.

Thus, as discussed in another post, I used Tahionic Disk Tools to fill the SSD with junk data. Specifically, I pointed Tahionic’s output toward M:\Data, a newly created folder on drive M (i.e., the SSD). I pointed toward that subfolder, rather than the root (M:), because various sources indicated that Windows 7 could accommodate only 512 files in the root directory, but 65,534 files per folder. I told Tahionic to create 60,000 randomly named files, filled with random content. This implied that each such file would have to contain nearly 4 million bytes to fill the 223GB shown as “free space” in the Windows Explorer Properties for that drive. Tahionic confirmed that these numbers (i.e., 60K files, 4M bytes each) would produce a total of 223.51GB.
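Tahionic's arithmetic is easy to double-check, remembering that Windows Explorer's "GB" figures are really binary gigabytes (GiB, 1024^3 bytes):

```python
# Double-check the Tahionic plan: 60,000 files of 4,000,000 bytes each.
files = 60_000
bytes_per_file = 4_000_000
total_bytes = files * bytes_per_file  # 240,000,000,000 bytes

# Explorer's "223GB free" is in binary units: 1 GiB = 1024**3 bytes.
gib = total_bytes / 1024**3
print(f"{gib:.2f} GiB")  # 223.52 GiB -- agreeing with Tahionic's 223.51GB figure, give or take rounding
```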

That raised a question. If I had a junk file of, say, 3.9MB, and a disk cluster size of 4.0MB, would I be saving one file per cluster, leaving 0.1MB of slack space in each cluster? In other words, would there be 60,000 holes in my plan, where the previous data would not be reliably overwritten by the new file?

According to the Belkasoft article, in SSD terms, a “page” (e.g., 4KB) is the smallest unit of storage that can be written to, and a “block” is the smallest unit of storage that can be erased (typically 128 to 256 pages). The authors said,

In practical terms, this means that files or file fragments (chunks) sized less than 512 KB or less than 2 MB depending on SSD model, may not be affected by the TRIM command, and may still be forensically recoverable.

The Belkasoft authors cited Micron article TN-29-19. I was not able to find that article at their cited location, nor elsewhere on Micron’s website. I did find copies in Google’s cache and also at the University of Maryland. That article (pp. 2, 13) said, “A NAND Flash block is 128KB” and “The BLOCK ERASE (60h) operation erases an entire block of 64 pages, or 128KB total.”

It was not clear to me, however, what this might mean. Combined with the foregoing Belkasoft quote, it appeared that TRIM would not affect file fragments of up to 128KB in the hardware Micron was discussing, and up to 2MB in other kinds of SSD hardware. It seemed that I would want to create my overwriting files to be not less than 2MB in size. Granted, the architecture of this Crucial SSD might be different; I was just trying to get the general concepts.

But would a file of, say, 2.1MB leave an unerased fragment after TRIM ran? Figuring that out would require more knowledge and time than I could invest. As a general rule, it seemed that I might be best advised not to use odd file sizes: to choose 2MB, for example, rather than 2.1MB per file. It also seemed that I would want the files to be as large as possible, so as to minimize the number of files and, hence, the number of file fragments left over after the bulk of the file was erased.
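Following that reasoning, a helper that rounds a proposed junk-file size down to a whole multiple of an assumed erase-block size might look like the following sketch. The 2MB default is just the worst case quoted by Belkasoft; the real figure would vary by SSD model.

```python
def aligned_file_size(target_bytes: int, erase_block: int = 2 * 1024 * 1024) -> int:
    """Round a junk-file size down to a whole multiple of the assumed
    erase-block size, so that no file ends partway through a block."""
    if target_bytes < erase_block:
        return erase_block  # never create a file smaller than one block
    return (target_bytes // erase_block) * erase_block

print(aligned_file_size(2_202_009))           # 2097152 bytes, i.e. exactly 2 MiB
print(aligned_file_size(1024**3) == 1024**3)  # True -- a 1 GiB file is already aligned
```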

On that basis, I revised my Tahionic approach. For the first round, I asked it to fill M:\Data with 224 files of exactly 1GB each. That should have slightly overfilled the SSD. Tahionic produced those files quite rapidly, taking only about 15 minutes on my machine.

I inspected one of these 1GB files. It was too large to view in Notepad; I used Notepad++. I noticed that, in Notepad++, each of those files was reported to have a size of 1,048,576 KB (rounded, in my copy of Explorer++, to 1024MB). The one I looked at was indeed filled with gibberish. I thought that, if I could read through it page by page, somewhere in there I might find the collected speeches of several well-known politicians. Alas, time was short; I had to press on.

I was not able to verify that the 1GB file contained, in fact, a billion characters or more. The status bar in Notepad++ reported only 336,040,341 characters. I was not sure whether it was perhaps not counting some of the file’s many special characters.
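Rather than trusting an editor's character count, one can ask the file system directly for the byte length. A short sketch, demonstrated here on a small stand-in rather than one of the actual 1GB junk files:

```python
import os
import tempfile

def exact_size(path: str) -> int:
    """Byte length straight from the file system; unlike an editor's
    character count, this is not confused by encodings or null bytes."""
    return os.path.getsize(path)

# Demo on a small stand-in for one of the 1GB junk files:
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(os.urandom(1024 * 1024))  # 1 MiB of random bytes
print(exact_size(f.name))             # 1048576
os.remove(f.name)
# A true "1GB" (1 GiB) junk file should report 1024**3 = 1,073,741,824 bytes.
```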

Explorer++ was now reporting that M:\Data contained 224 items, but the last one was only 194MB. I deleted that file. Now, after emptying the Recycle Bin, Explorer++ reported 198MB free. I ran Tahionic again, this time creating 50 files of 4MB each. That task finished very quickly. After hitting Shift-Del on the runt at the end, I created a handful of 512KB files to fill the remaining space.
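For anyone without Tahionic, the same tiered-fill idea can be sketched in a few lines of Python. This is an illustration of the approach, not the tool I actually used; the function and its parameters are hypothetical.

```python
import errno
import os

def fill_with_junk(target_dir, tier_sizes, max_files_per_tier=1_000_000):
    """Write random-content files in decreasing size tiers, dropping to the
    next tier when the volume reports ENOSPC (disk full). Returns the total
    bytes written. For brevity each file is generated with a single
    os.urandom() call; chunked writes would be kinder to RAM at 1GB sizes."""
    os.makedirs(target_dir, exist_ok=True)
    written = 0
    serial = 0
    for size in tier_sizes:                   # e.g. [1 GiB, 4 MiB, 512 KiB]
        for _ in range(max_files_per_tier):
            path = os.path.join(target_dir, f"junk_{serial:06d}.bin")
            serial += 1
            try:
                with open(path, "wb") as f:
                    f.write(os.urandom(size))
            except OSError as e:
                if e.errno == errno.ENOSPC:   # this tier no longer fits:
                    os.remove(path)           # delete the partial "runt"
                    break                     # fall through to a smaller tier
                raise
            written += size
    return written
```

A real run mirroring the tiers described above would look something like fill_with_junk(r"M:\Data", [1024**3, 4 * 1024**2, 512 * 1024]).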

I was not sure how to determine exactly what those junk files had overwritten, or what they had failed to overwrite. As far as I could tell, I had indeed filled the drive with junk data. It tentatively appeared that this approach would complicate an effort to recover data from the SSD. Given the absence of magnetic traces, it appeared that this approach would leave most of the drive’s data completely unrecoverable.

Update: as discussed at Stack Exchange, repeatedly overwriting the available space on an SSD (or on a USB drive) appeared likely to minimize the amount of residual data lurking in hidden corners of the SSD — not because repeated overwriting would eliminate magnetic traces, as on an HDD, but because the drive’s wear leveling firmware would swap the drive’s reserve storage into and out of use with each rewrite, so that eventually almost all of the reserve would have been overwritten. That Stack Exchange discussion did reiterate that physical destruction might be the only sure way to wipe data from an SSD or USB drive, especially where the drive has removed bad sectors (still containing data) from active use, thereby putting their contents beyond the reach of any number of overwrites.

Testing by Deleting and Reformatting

At this point, I had a set of odd experiences. It started when I deleted those junk files. I expected that this would put them into the Recycle Bin. Then I ran Recuva and found that, sure enough, they were recoverable. Oddly, Recuva reported 496 recoverable files, which was far more than the number of files that I had added on what was supposedly a wiped disk.

I emptied the Recycle Bin and ran Recuva again. It said they were still in the Recycle Bin, strangely enough, and still recoverable. They were junk files, so I could not verify their contents, but a quick test did seem to indicate that Recuva was able to restore one of them.

So then I deleted the Recycle Bin, using Unlocker. Once again, Recuva found them. I retried this on another computer, this time with the SSD connected via external SATA dock. Each time, I allowed minutes — in some cases, hours — before trying again. It certainly appeared that TRIM was not cleaning out deleted files.

I wondered if reformatting the drive would help — not as a simple way of deleting the files directly, but rather as an engraved invitation for TRIM to step up and do its job. I started with the Quick Format option. That took only a few seconds. This time, Recuva’s Deep Scan found only 278 files. Judging by their sizes, only a handful were the junk files that I had lately generated. So it seemed that even a quick format would make it harder for the casual snoop to recover data.

Even so, I did again succeed in restoring the first 1GB file that I tried to recover using Recuva. The computer on which I was doing this lacked the capacity to display a 1GB file with special characters in Notepad++, but it did appear that the restored file had opened successfully.

I was not sure why some files were gone and others still lingered. This outcome did suggest that, for data security, one should not use the overwriting method by itself, and also that one should use a good data recovery tool to test the outcome.
