I had installed Windows 7 on a solid state drive (SSD). I was using VeraCrypt to secure my data drives, but had decided not to use VeraCrypt on the Windows system drive (C:). I had also learned that programs like (Heidi) Eraser were designed to wipe free space, or optionally to wipe all space, on traditional hard disk drives (HDDs), but were not necessarily very effective on SSDs. It seemed that, on an SSD, significant amounts of data could remain unscathed after treatment with Eraser.
The potentially recoverable data could include data stored on drive C by various programs. Internet browsing information would be one example, unless I used a portable browser rather than the installed version. Downloads could be another. But even my private data, encrypted on drive D, could wind up in an unencrypted state on drive C. This could include data from running processes that Windows 7 would store in the paging or hibernation file. Another example: a working or temporary file not reliably deleted by the program that created it (e.g., when a spreadsheet program crashes without a clean exit, or when a program doesn't even try to delete its temporary files). From some previous reading, I also had the vague impression that data in compressed (e.g., RAR, ZIP) files, securely encrypted on drive D, might be at least temporarily saved on drive C by some decompression programs.
Of course, even after a file was deleted from drive C, it could be readily recoverable through the use of an unerasing tool like Recuva. That was because, when a file was deleted (e.g., by emptying the Recycle Bin), Windows 7 would not wipe the disk sectors containing the file's data. It would merely remove the directory entry pointing to the file. The data would remain recoverable until other data happened to be written to those sectors.
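On a traditional HDD, the standard countermeasure is to overwrite a file's sectors before deleting it, which is roughly what tools like Eraser and Recuva's Secure Overwrite attempt. A minimal sketch of that idea follows (my own illustration, with a made-up filename; as discussed below, it offers no guarantee on an SSD, where the controller may quietly write to different flash blocks):

```python
import os

def secure_delete(path, passes=1):
    """Overwrite a file's contents in place, flush to the device, then delete it.
    Effective on an HDD; on an SSD, wear leveling may leave the old blocks intact."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))   # overwrite with random bytes
            f.flush()
            os.fsync(f.fileno())        # force the write out of OS caches
    os.remove(path)

# Example: create a throwaway file and securely delete it.
with open("secret.tmp", "wb") as f:
    f.write(b"confidential data")
secure_delete("secret.tmp")
assert not os.path.exists("secret.tmp")
```

The point of the `fsync` call is that an overwrite sitting in the operating system's write cache has not yet touched the disk at all.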
So there seemed to be a need for a method of reliably wiping free space on my SSD drive C. This would not be the same as wiping the entire drive. I still wanted Windows 7 to be installed and running on drive C. I just wanted to remove data from portions of the disk that were no longer associated with any officially registered files.
TRIM and Tahionic
As discussed in another post, the TRIM function was supposed to remove deleted files from an SSD automatically, within a matter of moments. As that post advised, I ran the “fsutil behavior query DisableDeleteNotify” command and got back “DisableDeleteNotify = 0,” confirming that TRIM was enabled in Windows 7 on this computer. CrystalDiskInfo confirmed that my drive C SSD, a highly regarded Samsung SSD 850 EVO M.2 drive, did list TRIM among its features. And yet these deleted files were still there. It seemed that TRIM was simply failing to perform as advertised, just as it had in my own attempt described in that other post, on another computer with a different kind of SSD, and in the attempts made by others cited in that post.
So it looked like I would have to find another way. As a seemingly easy solution, I tried the same Tahionic Disk Tools Toolkit that I had previously used to generate junk files. (In that earlier case, I had wanted a specified number of unique files, of a specified size, containing random data, one of which would suffice to keep an otherwise empty folder from being deleted by an empty-folder deleter like Remove Empty Directories.) This time, I told Tahionic to create just one file, almost as large as the amount of available space reported by Windows Explorer for drive C. Then I repeated that operation, creating a much smaller file to fill almost all of the remaining space, and so on, until Windows Explorer reported that the drive was completely full. It seemed that this would overwrite the SSD’s free space.
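The fill-and-delete approach I was attempting with Tahionic can be sketched in a few lines. This is a conceptual sketch, not Tahionic's actual implementation; the function name, file names, and the `max_files` safety valve are mine:

```python
import os
import shutil

def fill_free_space(target_dir, chunk_size=64 * 1024 * 1024, max_files=None):
    """Fill a volume's free space with files of random data, then delete them.
    Overwriting the addressable free space this way defeats simple unerasing on
    an HDD, but an SSD's over-provisioned area remains out of reach."""
    created = []
    try:
        while shutil.disk_usage(target_dir).free > chunk_size:
            if max_files is not None and len(created) >= max_files:
                break  # safety valve so a demo run doesn't fill the disk
            path = os.path.join(target_dir, "junk%05d.bin" % len(created))
            with open(path, "wb") as f:
                f.write(os.urandom(chunk_size))
                f.flush()
                os.fsync(f.fileno())  # push the random data to the device
            created.append(path)
    finally:
        for path in created:  # free the space again
            os.remove(path)
```

As the events below suggest, filling the drive this way only overwrites the logical free space that the operating system can see; it says nothing about what the SSD controller keeps in its spare area.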
Then I deleted those newly created junk files, and ran Recuva. And you know what? It reported that I still had more than 400,000 files on the disk. If I recall correctly, Recuva did not provide its usual indication that a given file had an Excellent or Poor (or some other) chance of being recovered. Instead, it said simply that the files were “Not Deleted.” I think that changed when I closed Recuva and installed its most recent version. At the time when I started writing this post, I was looking at the results of a Recuva deep scan showing a huge number of files, many of which were supposedly in Excellent condition for recovery.
I did not know why Recuva was still showing recoverable files, and I did not explore SSD technology in detail. The short answer appeared to involve over-provisioning: an SSD actually contains more flash memory than the 128GB (or whatever) you thought you were buying, and the drive's controller uses that surplus to speed up some of its operations. In other words, evidently I was dealing with a kind of drive that could store data in locations that conventional disk-writing software would never reach.
Trying to Wipe Files with Recuva
I was not sure that Recuva’s indications were always accurate. Starting with Recuva’s list of files having Excellent recovery prospects (above), I tried recovering some PDF files, but when I tried to read them, Acrobat said, “There was an error processing a page. There was a problem reading this document.” On the other hand, when I tried recovering other file types, I did seem to be getting some contents. They were gibberish when viewed in Notepad (which I used because I did not know how else to view them); but they might have been perfectly readable when viewed with a more appropriate program. At any rate, the point remains: I was viewing files with contents, of some sort, as opposed to beautiful, pristine, empty disk space with no files at all.
So I tried another approach. In Recuva, I used Ctrl-A to highlight all of those hundreds of thousands of files, right-clicked, and selected the Check Highlighted option. Then I right-clicked again, intending to choose Secure Overwrite Checked. Or, eliminating an unnecessary step, I could have just highlighted them all and chosen Secure Overwrite Highlighted.
Unfortunately, those Overwrite options were grayed (or, if you prefer, greyed) out. That was the point at which I began writing this post. I intended to explore the reasons why those Secure Overwrite options were not available. I had already run a search on that question, and had not found an immediate solution, so I was planning to try selecting and deleting just a subset of the listed files, starting with those that Recuva said had Excellent prospects for recovery, to see if that piecemeal approach would work. If not, I planned to explore other file wiping programs.
But then, for some reason, by the time my writing had caught up — that is, by the time I got to this point in this post — the graying problem was gone. I was able to choose the Secure Overwrite option for all recoverable files listed in Recuva. One possible explanation was that the update had fixed a bug in the previous version. But I didn’t think so: I thought I had already verified this problem after running a deep scan in the latest version. Another possibility was just that it took Recuva a while to wrap its head around the vast number of files listed — that it appreciated how long it took me to write all these words, and by this point it was ready to proceed.
So, OK, I highlighted all files listed by Recuva’s deep scan, and I right-clicked and chose Secure Overwrite Highlighted. It asked if I was sure. I said yes. Now it said this:
Files you are going to overwrite might be located on a SSD drive (C:). Are you sure you want to continue?
I wondered why it was asking me that. It sounded like an admission that it would be using old-school overwrite techniques suited for HDDs. Or possibly it was reflecting the old concern that the multiple writes commonly used in erasing programs would significantly shorten the life of an SSD. That concern had since been debunked. Anyway, I told Recuva to Continue. At first, it estimated that this overwriting process would take hours. And on an HDD, it might have. But on the SSD, it was done within a few minutes.
Then I re-ran Recuva’s deep scan. But — what’s this? It appeared that the Secure Overwrite option had achieved nothing: there was still a list of more than a hundred thousand files. I ran it again, in case it took more than one try. This time, I paid more attention to the dialog that reported the results. It said that about half of the files had been overwritten. Most of the others, it appeared, were labeled thus: “Not Overwritten – File is already overwritten by existing file(s).” But that didn’t explain the list of 75,154 files that, according to Recuva, had Excellent recovery prospects, with “No overwritten clusters detected.” Each time, Recuva was claiming that it was overwriting these files, with progress reports: “22656 files overwritten.” And yet, after another try, the deep scan still produced an extensive list of files with Excellent prospects, and the Secure Overwrite option offered to eliminate 75,040 files. It appeared that Recuva was not reliably able to overwrite files on an SSD.
How-To Geek said that my problems with Recuva were not surprising: “The conventional wisdom is that, with TRIM enabled, the SSD will automatically delete its data when you delete the file. This isn’t necessarily true, and it’s more complicated than that.” How-To Geek and MakeUseOf agreed that the only really secure ways of erasing data from an SSD were to wipe the whole disk or, even better, to destroy the disk; otherwise, they said, encryption was the only alternative.
Alternatives: Drive Imaging and Two Computers
There did seem to be another option. A drive imaging program (I used Acronis True Image Home 2011), not attempting a sector-by-sector backup, and with the Recycle Bin excluded (and perhaps with other Windows log and User files deleted and their folders emptied beforehand), would presumably save an image of only the visible files (i.e., not the deleted fragments that Recuva could detect). So I could make an image of what I wanted to preserve from my drive C SSD; I could secure-erase the SSD; and then I could restore the drive image to the clean SSD.
That would be a cumbersome procedure. It seemed there might be ways to speed it up somewhat, especially for those who were willing to buy additional hardware. One way of reducing downtime on the primary computer would be to use a drive imaging or cloning program (or Linux) to copy the files from the active SSD to another SSD; swap the SSDs; and then use some other machine to secure-erase the old, clogged SSD.
Another approach would be to use two substantially identical computers. As detailed in another post, I had used GoodSync to make sure both computers would have the same data. Both computers also had the same programs installed. If one went down for maintenance, I could use my KVM switch to turn instantly to the other one. In this scenario, I could get the first machine started on the image-wipe-restore sequence just described, with hardly any downtime: I would be continuing my work on the second machine in the meantime. Having a second computer had other incidental advantages. One example was to run programs that would be heavy on resources (e.g., video rendering, some kinds of file indexing or file comparison tasks) on the secondary machine, so as to prevent such programs from slowing down work on the primary machine.
Drive imaging would be faster, of course, if the drive contained fewer essential program files. It seemed that the contents of drive C would be reduced if I used portable or installed programs that could run from some drive other than C, and likewise if there were tasks I could perform using a virtual machine (VM) stored on an encrypted drive. There could also be an advantage to using Linux, because (aside from being able to run Windows in VMs) it would allow significant portions of the operating system to be installed on other drives.
Finally, there did seem to be room for improvement in Recuva, and perhaps some other utility had already implemented such improvements. Recuva could, for example, include a verified-delete option that would resolve the inconsistency described above: a user seeking to delete files would want either to see those files completely eliminated from the list or, where applicable, an explanation of why they continued to be listed after supposedly being wiped. I also wondered whether, if Recuva could ascertain the existence of a file somewhere on the SSD, it might be possible to overwrite it by creating a zeroed version of that file in a corresponding folder and then writing that zeroed file back to the SSD.
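That zeroed-file idea could be sketched as follows (purely hypothetical on my part; whether the SSD controller actually reuses the file's original flash blocks, rather than writing the zeros somewhere fresh, is entirely up to its firmware, which is the crux of the whole problem):

```python
import os

def zero_overwrite(path):
    """Replace a file's contents with zeros of the same length, in place.
    On an HDD this reuses the same logical clusters; on an SSD the
    controller may silently direct the zeros to different flash blocks."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        f.write(b"\x00" * size)
        f.flush()
        os.fsync(f.fileno())  # force the zeros out to the device

# Example: zero out a small leftover file.
with open("leftover.bin", "wb") as f:
    f.write(b"recoverable remnant")
zero_overwrite("leftover.bin")
with open("leftover.bin", "rb") as f:
    assert f.read() == b"\x00" * 19
os.remove("leftover.bin")
```

Even if a utility did this, it would need some way to verify, at the flash level rather than the file-system level, that the original data was actually gone.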