Choosing a User-Friendly Drive Imaging Tool for Linux


This post describes the steps I took to explore backup imaging software for use on a computer running Linux Mint Xfce. I had been using Acronis True Image Home for years, and found that few other programs matched its features. I also found, however, that Acronis was not very good for imaging Linux ext4 partitions. That turned out to be true of other Windows-oriented programs as well (e.g., AOMEI, Macrium).

There were Linux-oriented alternatives, but I found these intimidating, given my still-limited knowledge of Linux. I also found them undesirable, in the sense that I would not usually want surprises or complexities when my system crashed or my hard drive died: I would want to be able to restore the thing promptly, with absolute clarity as to what the imaging program and I were doing. I was able to eliminate some Linux programs.

I tried Clonezilla and Redo Backup & Recovery. I found that, for me at this point, Redo was a simple, well-designed tool that did the job. With the insight afforded by subsequent adverse experience, however, I found myself distinctly less pleased with that tool, relying instead on Clonezilla.

Introducing Some Possibilities

In the Windows world, I had been using Acronis True Image Home 2011 (not a free or open-source program) for some years. It had been very reliable. I had heard that newer versions of Acronis were not of the same quality, however. A recent inquiry had suggested that the primary competitors of Acronis ($50) now included ShadowProtect 5 ($100), Paragon ($40), AOMEI (free), and Macrium (free). Until such time as I found myself completely free of the Windows world, I felt that the best solution would be cross-platform (i.e., capable of working with both Windows and Linux partitions).

I was not sure which of those Windows-oriented programs, if any, would be comfortable with ext4 partitions and other aspects of Linux systems. So I thought I had better look at imaging options from the Linux world. A search pointed toward numerous sites endorsing (among others) Clonezilla, dd, FSArchiver, Partimage, Partclone, and G4L. I had used some of those programs, and I agreed with some of the comments I saw on some of those websites. For example: Clonezilla definitely “isn’t quite as user-friendly as truly graphical options,” and dd was said to be short for “disk destroyer”: someone described it as “incredibly easy to use,” aside from its many technical command-line options.

I realized that the command line could be superior to a graphical user interface (GUI) for tasks that would always be done the same way, and for people who used those commands frequently and who would immediately notice if something was wrong. But for a task as infrequent as program drive imaging was for me, I vastly preferred a GUI. I’d had the experience of returning to a backup when I needed it, only to discover it wasn’t what I had expected. I wanted to be as certain as possible that I was doing it right.

In the Windows world, it was also typical for the imaging tool to run from a bootable USB or CD drive. That apparently wasn’t necessary in the Linux world. So it could make sense to make a backup of the system partition(s) just by using an archiving program like Tar (typically with gzip compression), or a file-by-file copying tool like rsync.
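
For instance, assuming the external backup drive were mounted at /mnt/backup (a made-up mount point), a rough sketch of that approach might look like this:

# archive the running root filesystem, skipping pseudo-filesystems and other mounts
sudo tar -czpf /mnt/backup/mint-root.tar.gz --one-file-system \
  --exclude=/proc --exclude=/sys --exclude=/dev --exclude=/run \
  --exclude=/tmp --exclude=/mnt --exclude=/media /

# or copy it file by file with rsync, preserving permissions and attributes
sudo rsync -aAXH --one-file-system / /mnt/backup/mint-root-copy/

Either way, the result would be a file-level copy rather than a sector- or block-level image.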

The websites listed above did point to other Linux-oriented imaging tools, some of which offered GUIs and/or other appealing features. In particular, they said that Mondo Rescue was able to create a custom live CD, and doClone could do live cloning (i.e., while Linux was running). They presented Redo Backup & Recovery as perhaps the most user-friendly. I wasn’t sure whether I could safely run Gnome software on my Mint Xfce installation, but one site did make Gnome Disk Utility sound worthwhile.

Windows-Oriented Options

Since I already had Acronis on a bootable USB drive, I started it up and took a look at the system that I was proposing to back up. That system had / (root) and /home partitions in ext4 format, as well as a swap partition. Acronis recognized the / and /home partitions and, for its own reasons, assigned them the arbitrary Windows letters of drives H: and I:. The drive also had another ext4 partition for miscellaneous data. Acronis decided that must be drive J. I marked drives H and I for backup, and told Acronis to save the backup image on drive J. But Acronis wouldn’t do that. It gave me an error: “The file name, location, or format is invalid.” A search, seeking an explanation, led to an Acronis page stating, “You cannot use Acronis True Image Home . . . for backups to disks or partitions with these [i.e., Ext2, Ext3, Ext4, ReiserFS, or Linux SWAP] file systems.”

I decided to try again. I used a partitioning program to reformat that miscellaneous data partition in NTFS format. With that done, Acronis was willing to make the backup. But I noticed that it took a long time. When it was finally done, I saw that, for some reason, Acronis had created a 90GB backup file. I wasn’t sure why. The / and /home partitions being backed up had only a total of about 8GB of material in them. To try to figure out what was going on, I booted the Acronis USB drive, went into its Recovery Wizard, and indicated that I just wanted to restore certain files and folders from the 90GB image. That allowed me to browse the image contents. There didn’t seem to be anything unusual in there. It was just 8GB of files that, somehow, had become a 90GB image file. Presumably Acronis, unable to make sense of the ext4 filesystems, was backing up unused space along with the actual files.

A search led to indications that others were having similar problems. It appeared that, as of this writing, Acronis was still not fully supporting ext4 partitions. I was going to have to find a non-Acronis solution. I deleted the 90GB image and started over.

My next choice was Paragon, one of the Windows solutions mentioned above. I had just found out that there was a free version of Paragon Backup & Recovery, whose User Manual (p. 9) promised full read/write access to ext4 partitions. I downloaded and installed it on my Windows machine. While I was at it, I also installed Macrium Reflect Free 6.1 and AOMEI Backupper. I used these installations to create image backup ISOs, and added those ISOs to my YUMI multiboot USB drive.

Unfortunately, there did not seem to be a 32-bit version of the Paragon tool, and the 64-bit download could not run on this old 32-bit laptop. The same was true of the 64-bit version of AOMEI Backupper Standard 3.2. But then I noticed that Softpedia also offered a Linux version of AOMEI. That ran. It offered options to clone an entire disk, an operating system, or a partition. It did not want to clone two out of three partitions on a drive in a single pass. Moreover, when I selected one partition, it did not offer compression options; rather, it indicated that it was going to wipe out the target partition. In other words, it was indeed cloning the partition, not making a backup image of it. The situation it found on the source drive was exactly the situation that it was going to create on the target drive.

I bailed out of that and tried Macrium, for which I had put both 32- and 64-bit versions on the YUMI drive. Its 32-bit version ran, and it did offer the options of clicking checkboxes to select individual partitions for either cloning or imaging. It allowed me to create the image onto an external USB drive, and to choose among no/medium/high compression, and between intelligent or sector-by-sector imaging. That was all good.

(Note that, for some of the programs discussed in this post, the decision of when to connect an external USB drive could be a little tricky. On this particular machine, as on some others I had used, the system might not boot if the nonbootable external USB storage drive was plugged in at time of boot. But unplugging it at the wrong time could mess up its filesystem. Probably the safest approach was to plug in after the system booted, and unplug after the system was shut down. Alternately, I seemed to do OK unplugging when the system was first showing its splash screen, at the start of a reboot.)
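
When I did have to detach the drive from a running system, flushing and unmounting it first seemed like cheap insurance against exactly that kind of filesystem damage (the mount point shown is hypothetical):

sync                          # flush any writes still buffered in memory
sudo umount /media/usbbackup  # unmount before physically unplugging the drive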

Unlike Acronis, Macrium did not offer the option of excluding unnecessary files and folders (e.g., pagefile.sys; \System Volume Information). It also offered only two compression options, as compared to three or four in Acronis. Macrium also didn’t show the Linux labels for the ext4 partitions (i.e., / (root) and /home). But it did seem to be showing them in order, and it recognized that those first two partitions were indeed “ext” partitions, and that the third one (DATA) was NTFS; it felt that the fourth one (swap) was an “Unformatted logical” partition. So Macrium was not completely Linux-savvy.

I was also a little concerned that, like Acronis, Macrium believed that the / (root) partition contained 14GB and that the /home partition contained 34GB, when in fact GParted (running in Mint) reported that / contained only 6GB and /home contained only 2GB (including system overhead). Moreover, Macrium was not willing to save the resulting image file to an ext4 partition — as I found when, for instance, I tried to image only the root partition: /home was not offered as a destination option. Finally, as I feared, Macrium’s output image was vastly larger than the program files being imaged from root and home partitions: 45GB, requiring 44 minutes to create. Not as awful as the Acronis result, but still plainly unacceptable.

At this point, I had not yet tried 64-bit Paragon, which would have to wait for a different installation, nor had I tried any paid solutions other than my long-ago purchased copy of Acronis. There were many other Windows drive imaging solutions that I would not have time to try and/or money to purchase, such as the apparently respectable O&O DiskImage. But I had to suspect that the reason why those Linux websites were praising programs like Clonezilla was that none of these flashier Windows imaging programs handled Linux partitions very well.

Linux-Oriented Alternatives

If a Windows-oriented imaging program wasn’t going to work, it seemed I would have to tinker with some of those Linux-oriented programs. I decided to start with Redo Backup & Recovery. A look at its webpage provided an indication that it is “simply a front end to partclone,” as well as a statement that its current version 1.0.4 (i.e., the one also available on Softpedia, where it rated 4.4 out of 5 stars) was released in November 2012. At SourceForge, where it rated 4.6 out of 5 stars, there were a few claims that “the forums are littered with people having restoration problems” and (in suspiciously similar phrasing) that “the forums are riddled with . . . [users] that have ZERO data being written to disk” with Redo. Those accusations did not appear to be true, either for Redo or for the underlying partclone program, which Wikipedia described as “the default backup application in Clonezilla” (see also e.g., FredsHack and SourceForge). Given Clonezilla’s strong positive reputation (on Amazon, compare ratings for Clonezilla vs. e.g., Acronis; see also SourceForge; Softpedia; DistroWatch), it appeared that Redo would be a fairly safe Linux-oriented imaging tool.

I downloaded the Redo 1.0.4 ISO and added it to my YUMI multiboot USB drive. Redo’s instructions seemed to prefer burning the ISO to a CD or to a single-boot USB drive, but I hoped this would be all right; I much preferred to have my bootable tools on a single USB drive if possible. I booted the old laptop with Redo and felt immediately that I was dealing with a more appropriate tool: its opening screen identified the laptop’s internal hard disk drive (HDD) and its partitions — noting, for example, that the drive’s first partition was a 28GB ext4 containing Linux Mint 17.3. Redo appeared willing to back up every partition on that drive, including its swap partition, if I wished. I designated only the root and home partitions for backup. I felt that Redo could have been clearer about the destination — I was not yet entirely refamiliarized with the Linux way of viewing the external drive and its partitions — and I would also have preferred that Redo offer a summary screen, to confirm what I had told it to do, before launching into the backup. Redo also offered no options for compression, for excluding any files or folders in the source partitions, or for anything else. The objective was plainly to keep it very simple. If I wanted more than that, apparently I would have to find a different tool.

When the backup finished, I found myself on Redo’s main screen. There were options, at the lower left corner, to open (among other things) a file viewer, a text editor, web browser, or the GParted partition editor; to create a bootable USB; and to change Redo’s appearance and adjust other settings. But there was no tool to view the contents of the backup image that Redo had just created.

Among those options, I used GParted to verify that the target drive was formatted in NTFS, and I used the file viewer to see what Redo had put onto it. Along with what appeared to be administrative files (e.g., a text file listing the partitions being backed up), there were three files with numerical extensions (i.e., 000 or 001). These were much larger than the administrative files. They appeared to contain the actual drive image. Two of them appeared to contain the backup of the root partition: their names contained a reference to “part 5,” which I took to be short for Partition No. 5, which was where Redo had listed the root partition. The first of those two ended in 000 and was reported to be a Gzip archive of 2.1GB. The second ended in 001. It appeared, in other words, that even though the target was NTFS, and could therefore handle larger files, Redo was going to spit out 2GB backup files. So if I used it to image a 60GB Windows installation, I would apparently have about 30 of these files, numbered 000, 001, 002 et seq.

Redo finished the backup in less than 11 minutes. The file viewer said that the combined size of the files that Redo had placed into the designated folder on the target drive was only 3.0 GB. Plainly, there had been some compression. So far, then, Redo was the first program in this review that recognized and could work effectively with ext4 partitions.
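
Since Redo was just a front end to partclone, I assumed (without testing it) that, in a pinch, those split Gzip pieces could be reassembled and fed back to partclone by hand, roughly as follows, with illustrative file and device names:

# concatenate the split pieces, decompress, and stream into partclone's restore mode
cat root_part5.img.000 root_part5.img.001 | gunzip -c | \
  sudo partclone.extfs -r -s - -o /dev/sda5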

There was some uncertainty as to whether Redo would work with GPT disks. It looked like a number of people had asked about that in Redo’s forums, or had scoured Redo’s website or SourceForge page for further information. There were hints that, at least until sometime in 2015, Redo did not support GPT — but also that some people had been able to use Redo with GPT drives for at least some purposes. The read-me file accompanying the most recent revision (11/21/2012) made no reference to GPT. One person suggested that GPT would be an issue only if the source was GPT; a GPT target would work OK. That person did not specify the grounds for that belief. Interestingly, the Ubuntu manual pages for partclone said nothing about GPT, whereas the main page for Clonezilla indicated support for GPT. Evidently Clonezilla used one of its other supporting programs (i.e., Partimage, ntfsclone, or dd) when working with GPT.
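
For what it was worth, it was easy enough to check whether a given disk was GPT or MBR before trusting any of these tools with it:

sudo parted /dev/sda print | grep "Partition Table"
# reports "gpt" for a GPT disk, or "msdos" for a traditional MBR disk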

Redo Backup & Recovery appeared to provide a smooth and easy solution for ordinary use. In case of problems with GPT drives, and to explore the possibility of getting some of the other choices mentioned above (e.g., compression ratio, size of output files, files or folders to exclude from the backup), I ran a search naming some other programs. That search led to an AlternativeTo page suggesting that Clonezilla was far more popular than Redo, and that the latter had been discontinued. Search results gave the impression, consistent with my previous browsing, that dd and Clonezilla were the most widely used Linux imaging tools. Reviewing the search that I had run previously, I looked again at this warning:

dd/ddrescue/dcfldd are power tools. You need to understand what it does, and you need to understand some things about the machines it does those things to, in order to use it safely.

Given my present level of Linux knowledge, it seemed I could and probably should rule out dd as an imaging solution. Tar, another option mentioned above, had certain strengths, but also appeared to entail more complexities and difficulties than I would want to have to address when trying to recover from a system failure (see Ubuntu; Debian; Stack Exchange). For example, after restoring from a Tar backup, files added to the system since the backup was made would still be present on the system. Gnome Disk Utility (GDU) apparently required creation of a customized bootable USB drive or on-the-spot modification of a Linux live CD, as distinct from other self-contained USB solutions (such as Redo) that would work immediately upon booting. GDU also did not appear to offer many more options than Redo, and doClone appeared to be inactive beta software. Initially, Mondo Rescue appeared to be an established tool unfortunately requiring more knowledge, and entailing potentially greater complexity, than I could presently muster. But after trying Clonezilla and Redo, I came back to Mondo, and tried to include it in the following comparison.
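
For completeness, the sort of dd command those warnings were about would look roughly like the following (device and file names purely illustrative); mixing up if= and of= is exactly how dd earns its nickname:

# copy every sector of the root partition, used or not, into an image file
sudo dd if=/dev/sda5 of=/mnt/backup/root-partition.img bs=4M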

Clonezilla

The foregoing reflections yielded the conclusion that Clonezilla might be the tool for me, to the extent that I wanted something with more options than Redo Backup & Recovery. With the help of a TechRepublic tutorial, I decided to try it out on this old Dell laptop.

The first step was to download the stable Clonezilla live ISO. (Later, I found there was also an option of running it as a program available for installation via Synaptic (i.e., Start > System > Package Manager), but it would work only on unmounted partitions.) I chose the older i686 download, since I was going to be running it on this old 32-bit machine; but as the download page said, it would work for most machines. I added it to my YUMI drive and booted the Dell laptop with it. I accepted the default boot option and other defaults that followed, including the choices to work with images and use local devices. I chose the Beginner option, and waited when the program told me to wait. At various points, it seemed like I had to use a combination of arrow keys and the Tab key to select things I wanted to select.
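
For the record, the command-line equivalent of that Synaptic installation would presumably have been just the following, with the caveat already mentioned that the installed version could only work on partitions that were not currently mounted:

sudo apt-get install clonezilla   # the packaged (non-live) version
mount | grep /dev/sda             # confirm which partitions are currently mounted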

The first real decision came when it said, “Now we need to mount a device as /home/partimag.” This was the selection of the destination. For that, I didn’t choose the ext4 partitions that I wanted to back up; instead, I chose the same external USB drive that I had used when testing Redo (above). Next, I opted for the top or root directory. I chose saveparts, not savedisk, because I wanted to back up just the ext4 partitions, not the DATA (NTFS) partition that I had created on the Dell. I used the spacebar and the arrow keys to select the ext4 partitions, when Clonezilla gave me that opportunity. On Clonezilla’s advice, I declined to run a disk check on the external NTFS target.

Really, it wasn’t bad at all. I was kind of surprised that someone had built a GUI for Partclone but not for Clonezilla. Then it occurred to me that maybe they had. But a search didn’t turn up any obvious contenders. So, for me, it was going to be a choice between the slick Redo and the more convoluted Clonezilla. When Clonezilla finished, I thought I was going to see a report of what it had done and how long it had taken; but for some reason, that didn’t happen. But it was fast.

At this point, I realized that I didn’t have enough information to make a decision about Clonezilla because, by choosing the beginner option, I had probably skipped the question of compression and other desired options. So I ran it again. In Expert mode, there were options for restoring an image, creating an ISO for a live CD or USB drive, and encrypting. There were options to exclude Windows page and hibernation files (not applicable here) — but no option to exclude individual files or file types (e.g., *.tmp). There were, after all, compression options. Since I didn’t have a multicore CPU, I chose gzip compression, based on the information Clonezilla provided. There was an option not to split the output file.
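
Conceptually, what those Expert-mode choices assemble is a pipeline along these lines; this is an illustration of the idea, not the literal command Clonezilla runs:

# save only the used blocks of the partition, then compress the stream;
# the chosen compressor and any file splitting are applied after partclone
sudo partclone.extfs -c -s /dev/sda5 -o - | gzip -c > sda5.ext4-ptcl-img.gz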

So, yeah, Clonezilla had a lot. I was pleasantly surprised. It was not as smooth and simple as Redo, for the most basic kind of backup, but it had almost all of the options that I liked in Acronis, and perhaps at some point I would switch to the command line for other options built into Clonezilla but not offered in the text-based interface.

Choosing a Compression Option

As I was writing the foregoing words, I noticed that Clonezilla’s Expert backup was taking a long time. Closer inspection of the screen revealed that Clonezilla believed it was working with a “raw” filesystem, and expected the whole job to take about 45 minutes — same as Macrium Reflect (above). The 45 minutes ended, and then Clonezilla started something else. What was it doing? It estimated that this new process would take nearly three hours. To back up 8GB of material? Eventually I came to understand that Clonezilla could not figure out what type of filesystem I was using on the root and home partitions, so it just defaulted to a sector-by-sector backup. There didn’t seem to be any way to stop it, short of pulling the plug. Fortunately, Ctrl-C worked.

I rebooted and looked at what we had achieved. Linux Mint still booted OK, so we had not trashed anything. The external drive where I had intended to create the backups contained the 3GB backup created by Clonezilla in Beginner mode, and a 23GB backup created in Expert mode.

I deleted that 23GB monstrosity and tried again, thinking that perhaps I had made a wrong choice somewhere. But all of my choices seemed right, until I came to the compression choice in Expert mode. There, I was not so sure. I ran a search, flailed around with its inconclusive results, and then tried another. That one did produce several somewhat informative webpages, summarized as follows:

  • Spiros Georgaras seemed to indicate that, for a single-core CPU, gzip was fast; lzop was much faster, producing images that were only slightly larger; bzip2 and lzma were slower, but managed smaller images, and between the two bzip2 was faster; xz and lzip were much faster than bzip2 and lzma, and produced images that were only slightly larger.
  • Stephane Lesimple seemed to conclude that gzip was nearly as fast as lzop, with much better compression; that bzip2 achieved better compression than gzip, but had very slow decompression times; that xz produced superior compression, but at the cost of a lot of time and high demands on RAM; and that gzip was probably the best general-purpose solution.
  • A study posted on Linuxaria concluded that lzop produced high compression and speed; that gzip was very good for general use, having both high compression and high speed; that bzip2 was inferior in both compression and speed; and that xz was “the clear winner in the compression ratio” but was very slow.

Note that most of these articles were five or more years old. And by the time I got through with this reading, I had long since started the Dell on another try. The compression choice was the only change I made to the Expert mode settings: this time, I chose bzip2. (It was next on the list after gzip, in Clonezilla, for single-core CPUs.)
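
Since those write-ups were dated, a crude way to sanity-check the tradeoffs on one's own hardware would be to time the candidates on a representative chunk of data, for example:

# compare speed and output size of the single-core compressors on a sample file
# (lzop may need to be installed separately; the sample path is hypothetical)
for c in gzip bzip2 lzop; do
  echo "== $c =="
  time $c -c /path/to/sample.file > /tmp/sample.$c
done
ls -lh /tmp/sample.gzip /tmp/sample.bzip2 /tmp/sample.lzop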

That change made a big difference. It appeared that Clonezilla had failed to detect that the source partitions were ext4, last time, merely because I had chosen gzip. This time, when I chose bzip2, Clonezilla was closer: it identified the source partitions as at least being the Linux extfs type. The process was still pretty slow for the root partition: making its image took about a half-hour. For some reason, it sailed right through the much larger but emptier /home partition, requiring only four minutes.

Continuing down the list of single-core options in Clonezilla, I tried lzo, which I assumed was related to lzop. This would be my last try; Clonezilla’s notes seemed to say that the others in the list, after lzo, would be slower and more demanding. I wasn’t that concerned about getting the smallest possible image size. With lzo, Clonezilla again identified the root and home partitions as having an extfs filesystem, and it got through both of them in about five minutes.

Now I needed to look at the results. I rebooted into Mint and viewed the contents of the external USB drive. The bzip2 archive for the root partition was about 2.5GB. The lzo archive was about 3.3GB. So the lzo archive was about one-third larger. That was a substantial difference. The Redo backup (above) was about halfway between the two, and (unlike the bzip2 archive) it hadn’t required a half-hour. Given ease of use, if space was not crucial, and if I didn’t mind dealing with a pile of 2GB backup files, it seemed I might just go with Redo after all.
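
(Those sizes came from the file manager; from a terminal, the same comparison would have been a matter of something like the following, with the directory names being whatever the tools had actually assigned:)

du -sh /media/usbdrive/*img*   # -s totals each image directory, -h prints human-readable sizes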

Mondo Rescue

I was surprised to see that Mondo Rescue came in versions specifically tailored to various Linux distributions. There was none for Linux Mint; but since Mint 17.3 was based on Ubuntu 14.04 LTS, I believed I could probably use the Ubuntu 14.04 version. But as I viewed the many files and options connected with that version, I realized I was not sure how to proceed. There did not seem to be a way to obtain Mondo Rescue through Synaptic Package Manager (i.e., Start > System > Package Manager); instead, it seemed I should run the following commands:

cd /tmp
rm *.list
wget <mondorescue repository URL>/`lsb_release -r|awk '{print $2}'`/mondorescue.sources.list

There were more commands, but that third one was enough: it produced an error: “No such directory ‘ubuntu/17.3’.” Perhaps more on target, MintGuide suggested these commands instead, best copied and pasted one at a time:

sudo wget -q -P /etc/apt/sources.list.d <mondorescue sources.list URL>
wget -q -O - <mondorescue GPG key URL> | sudo apt-key add -
sudo apt-get update
sudo apt-get install -y mondo

Those seemed to work, or at least they produced no errors until the very end, when I got a warning that several packages (including mondo) could not be authenticated and a message that said, “E: There are problems and -y was used without --force-yes.” I had no idea whether any of that was significant.

The effort did result in a change in Synaptic: a search for mondo now produced exactly that item. I marked it for installation, but before installing it, I tried MintGuide’s advice to type “sudo mondoarchive.” That produced an error: “command not found.” So I went ahead with the Synaptic installation. It marked some additional items. I said OK. It warned me that some items were Not Authenticated. These appeared to be approximately the same items as I had been warned about after the foregoing commands. I was not sure why Synaptic felt that these packages were not installed, when supposedly apt-get had already installed them. Or perhaps the error messages cited at the end of the previous paragraph meant that installation had not succeeded.
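
In hindsight, a quick way to check whether apt-get really had installed anything, before going around again with Synaptic, would have been something like:

apt-cache policy mondo mindi     # shows "Installed: (none)" if a package is not installed
dpkg -l | grep -E "mondo|mindi"  # lists any mondo/mindi packages dpkg knows about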

This time around, “sudo mondoarchive” produced different error messages. It appeared that the errors, both times, were produced by Mondo’s own efforts to verify that it was compatible with my system. Some of these messages were rather amusing. For example:

Perhaps your /etc/fstab file is insane. Perhaps Mindi’s MakeMountlist() subroutine has a bug. We’ll see.

Ultimately, though, the program failed to run, and I was not sophisticated and determined enough to work out why not. Maybe there was a reason why Mondo was available in versions for Mageia, Gentoo, and Asianux, but not for Linux Mint. It did not appear to be available as an ISO that I could run from a live CD. For now, I had to let this rest.

Testing the Backups

So. It was time to put these backups to the test. I was going to use them to overwrite what was on the Dell laptop’s hard disk. If everything went according to plan, the restore would proceed perfectly, and Linux Mint on the Dell would be functioning as well after the restore as it had before.

I didn’t think I would need to try all of these backups. I figured I would try the Redo Backup & Recovery image, to see how well Redo did, and then I would try the Clonezilla lzo image. If that went well, I wouldn’t bother testing the others.

I rebooted with the YUMI USB drive, went into Redo, and chose Start Redo Backup > Restore > Select Source Drive > Select Backup Image > Select Destination Drive. There were no further options. Redo was apparently going to restore both the root and home partitions, because I had backed them both up together. If I wanted the option of restoring just one of them, apparently I should have backed them up separately.

The Redo restore process took less than five minutes. When it was done, I rebooted the machine. Linux Mint started normally. Everything seemed fine. I concluded that Redo worked.

Now it was time to try Clonezilla. I booted it, went partway through the default options, and chose the external USB drive when it asked me what device I wanted to mount as /home/partimag. I chose the top directory on the external drive, and then chose Expert mode. That gave me the restoreparts option. I chose the lzo image, the last of the ones I had made.

I indicated that I wanted to restore both of the partitions saved in that image. This gave me a message: “Two or more partitions from image dir were selected. Only the same partitions on the destination disk could be restored.” I wasn’t sure what that meant, but it said, “Press Enter to continue,” so I did. This brought me to a long list of options. This was the point at which I felt I would much rather have a tool like Redo. I read the list, mostly agreed with it, and kept the defaults. That took me to another list of “Advanced Extra Parameters.” Here, again, I agreed with the advice: “If you have no idea, keep the default values.” That’s what I did. In the following screen, I told it to go ahead and check the image before restoring. The check only took a minute or so, and then we were on to the restore process.

I didn’t like that, throughout these Clonezilla image and restore processes, I did not get straightforward reminders of what I had said I wanted to do, or of what the program was now doing. I did appreciate the text, provided with multiple exclamation marks and “Warning!” statements, telling me that something dire was about to happen — after the image check completed, for instance — but precisely what was going to happen, I was not sure. Not as sure as I was with Redo. So at the end, I was holding my breath. But it worked. My system was fine.

Conclusion

My review led me to Redo Backup & Recovery. I found it to be simple and effective. There probably were other Linux-oriented tools to which I would have had the same reaction. There might also have been some Windows-oriented tools that would have worked well with Linux partitions on a 64-bit machine.

While I liked Redo, it lacked features that would probably matter to me eventually. Perhaps the most important was the ability to work with GPT drives. If I used it to back up Windows partitions, I would surely want to be able to exclude certain files and folders, so as to prevent unnecessary image bloat. On any platform, I wished that it could save an image in a single file, instead of spreading it out over numerous 2GB files. I would have liked an option to choose among compression levels.

Unfortunately, Redo appeared to be no longer actively developed. Moreover, I had some problems with it in later usage. I gravitated toward using Clonezilla after all, but maintained an active interest in finding a superior GUI or text-based tool or, perhaps, becoming more familiar with command-line alternatives.


2 Responses to Choosing a User-Friendly Drive Imaging Tool for Linux

  1. Steve says:

    Unless you wipe all sectors on the disk first, any restore test you do is completely useless!
