Choosing the Best Benchmarks: A Comparison of VirtualBox vs. VMware Virtual Machines

As described in two other posts, I was looking at VirtualBox 5.2.6 and VMware Workstation Player 14.1.1 as possible ways to run certain software in a Windows 7 guest virtual machine (VM) on a Windows 10 host computer. The question considered in this post is whether one had visibly better performance than the other.

This was a rough, first-cut exploration of numerous factors related to that question. As with other long posts in this blog, I may write a more concise version at some point in the future, if I revisit this question.

Contents

Summary
System Setup
Benchmarking Software
An Incidental Usage Comparison
Windows Experience Index
VirtualBox Guest Additions Problem
PassMark PerformanceTest 9
PCMark10
Novabench
SiSoftware Sandra Lite 2017
Review
Basic Benchmarks for Virginal VMs
Hands-On Exploration: Video Editing
Deciding on CPU Cores


Summary

This post explores benchmarking programs, to find the best way of comparing VirtualBox and VMware. It turned out that the best benchmarkers required unspecified Windows updates. So there was a choice of whether to install a lot of updates, which might themselves affect performance, or just settle for the relatively limited benchmarkers that would run without Windows updates.

Windows updates could have a performance impact by imposing clutter; by increasing exposure to Windows 7 instabilities, introduced by Microsoft starting in 2016, to push users toward Windows 10; and by requiring antivirus software, which itself could impose a major performance hit. It seemed that I might avoid the need for antivirus software by keeping my VMs offline. That could work: I only needed these VMs to run programs like Adobe Premiere Elements and Microsoft Office. Those did not need Internet access.

Nonetheless, I went ahead, downloaded and installed the full supply of Windows updates, and ran a variety of benchmarking programs. I found that most of these programs were component-oriented (measuring the performance of, e.g., the CPU or RAM), but PCMark10 provided a nice alternative, with a set of eight practical elements (e.g., spreadsheets, photo editing). On a machine with the requisite updates, I found that PassMark PerformanceTest 9, SiSoftware Sandra Lite 2017, and CrystalMark all provided relatively sophisticated general-purpose comparisons, though CrystalMark was only 32-bit.

Those full-picture benchmarking tools supported an impression that VirtualBox was slightly faster than VMware Player — and, not surprisingly, that the host machine was, in most regards, much faster than either VM. Although benchmarking programs varied in their results, it seemed pretty clear that VMware was not generally faster than VirtualBox.

Of course, there were non-performance-related tradeoffs. VMware was easier to use, for purposes of file operations (e.g., copying or renaming VMs); on the other hand, VirtualBox offered features (e.g., snapshots) that were not available in VMware’s free Player.

I considered the possibility that the Windows updates might have had an unequally negative impact on the VMware VM. To address that, I tried several simpler benchmarking tools on a version of the VMware VM that did not have any updates installed, beyond the installation DVD’s built-in Service Pack 1 updates. Those simpler benchmarkers provided a sense that the non-updated VMware VM might be a little faster than the updated VMware VM, but still wasn’t faster overall than VirtualBox.

I took a brief look at the subjective difference between host and VM, but decided not to pursue that. Specifically, there were early signs that video editing in Adobe Premiere Elements would feel somewhat slower in a VM, but it would take hours of experience before I could offer more specific comments on that.

I also took a brief look at the question of how many CPUs to allocate to the VM. It tentatively appeared that the best approach would be to assign half of the total number of CPUs available (e.g., four of eight) to the VM. Some contended that it was possible to get better performance without instability, at least in VirtualBox, by assigning all CPUs to the VM and setting them at a maximum 80-90% execution cap.

Overall, it seemed that the host computer would be the fastest and simplest place to do one’s computing. But where one had a particular reason to use a VM (as in my discovery that Adobe Premiere Elements would no longer run on my Windows 10 host), it appeared that both VirtualBox and VMware Player could provide an acceptable albeit visibly slower alternative.

My impression, based on this and other recent explorations, plus several mostly satisfactory years of using VMware Workstation circa 2009, was that VMware Player would provide a more stable, organized, and user-friendly feel, while VirtualBox offered more options, more complexity, and probably somewhat faster performance.

System Setup

The computer used for this performance comparison was running Windows 10 Pro x64 on an ASUSTeK H97-PLUS motherboard with an Intel Core i7-4790, with 24GB DDR3 RAM and a 4GB NVIDIA GeForce GT 730. The Windows 10 installation was on an M.2 SSD; the VMs were on a PNY CS1311 SSD.

I made similar performance adjustments on each VM. These included two CPU cores and 8GB RAM allocated; 256MB graphics RAM; 2D and 3D acceleration, if available; no other significant tasks running on the host; antivirus on the host configured not to scan VM program or storage directories; and other settings described in another post. I verified that VirtualBox had Guest Additions installed, and that VMware had VMware Tools installed.
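Incidentally, on the VirtualBox side, most of those allocations could also have been made from the command line instead of the Settings dialog. The following is only a sketch, assuming a VM named Win7 Test (substitute the actual name shown in VirtualBox Manager), a command prompt opened in the VirtualBox program folder, and the VM powered off:

VBoxManage modifyvm "Win7 Test" --cpus 2 --memory 8192 --vram 256
VBoxManage modifyvm "Win7 Test" --accelerate3d on --accelerate2dvideo on

The first line sets the CPU, RAM (in MB), and video RAM allocations; the second turns on the same 3D and 2D acceleration options available as checkboxes in the Display settings.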

In this comparison, I was interested in general-purpose, real-life performance. I was not a gamer. Video editing tended to be my most demanding task. I was also not an academic researcher, concerned with precise and reproducible results. To the contrary, I expected to be using my chosen VM in a system with other stuff running, so I did not feel particularly compelled to develop a test machine that would eliminate all other processes. Rather, I wanted to see VM performance in a system where assorted things would always be going on. So I had various programs open (e.g., web browsers, Microsoft Excel), although not demanding significant system resources (e.g., not doing calculations) nor changing their demands (e.g., not starting or closing programs) while these tests were underway.

Until I got to SiSoftware Sandra (below), except as otherwise noted, I ran the benchmark tests twice on each VM and once on the underlying Windows 10 host system. In one set of VM tests, I ran the benchmarkers simultaneously on both VMs. I did that partly to see whether one VM would be more likely to grab system resources, and partly to ensure that test conditions were identical. I ran the other set of tests on one VM at a time, to see whether that made any difference.

Benchmarking Software

Searches led to multiple sites discussing VM benchmarks. A StackOverflow discussion (2009) suggested that benchmarking tools not designed for VMs “tend to crash the VM,” but I did not find that to be the case. For example, to compare VirtualBox against Microsoft’s Hyper-V, Caleb Lloyd (2015) used Geekbench 3 (replaced now by Geekbench 4, $10-15 for single-user licenses) to measure memory and CPU performance. Lifewire (Nelson, 2016) used Geekbench, plus CineBench to test GPU and graphics. I saw only very expensive versions of CineBench at the Maxon website, but Softpedia offered a free version (3.7 stars from 127 raters). To compare Mac’s Parallels against VMware Fusion and VirtualBox, TekRevue (Tanous, 2015) used Geekbench and CineBench, along with 3DMark (advertised as “The Gamer’s Benchmark,” free), FurMark OpenGL (“GPU Stress Test,” free), PCMark (“the complete benchmark for Windows”), and PassMark PerformanceTest (tests most system components; free trial). The PassMark software had the advantage of permitting comparison against the PassMark database of other users’ results. Among these alternatives, PassMark and PCMark appeared to be the only general-purpose tests (i.e., not focused just on CPU or other specific components).

For general-purpose PC (not specifically VM) benchmarking, my browsing led to Novabench, PCMark, and SiSoft Sandra Lite 2017 as tools recommended by multiple sources (e.g., How-To Geek (2017), Logical Increments (2016), TechWhoop (2017), Softpedia users). I decided to try those three, plus PassMark’s PerformanceTest 9. As discussed below, I would also eventually try the obsolete (i.e., 32-bit) CrystalMark 2004R7.

Unfortunately, I was unable to install those benchmarking tools. The problem seemed to be that these bare-bones VMs, ideal for testing the VMs themselves without the vagaries of additional software, lacked certain Windows updates needed for installation. I would have to install those updates before I could install the benchmarking software.

Installing Windows updates would be contrary to my original purpose of developing simple VMs solely to run a few specific programs in Windows 7. I realized that many Windows updates could be helpful if not essential — making the software run better, for instance, and protecting against viruses. As detailed in other posts, however, I had also reached the unfortunate conclusion that some Windows updates undermined Windows 7 rather than helping it, and that this was consistent with Microsoft’s sometimes unscrupulous efforts to push Windows 7 users to switch to Windows 10.

To run these benchmarking programs, though, it seemed that some Windows updates were unavoidable. I did make a brief attempt to figure out exactly which updates these programs needed. Unfortunately, it appeared that the necessary updates were now available only in service packs and/or cumulative updates. The simplest and possibly most practical solution seemed to be to install all Windows updates. That would take hours, but not hours of my time: the machine would handle it, with the occasional reboot.

So I brought these VMs up to date, in terms of Windows updates. Those updates, followed by installation of the benchmarking programs listed above, vastly increased the sizes of these VMs, from roughly 10-15GB to more than 50GB. As far as I could tell, those updates did not significantly affect the conclusion that VirtualBox VMs were moderately faster than VMware VMs.

An Incidental Usage Comparison

Originally, the maximum sizes of the virtual disks in these VMs were only 25GB (VirtualBox) and 50GB (VMware). The Windows updates changed that: I had to enlarge the virtual disks in order to accommodate the updates and the testing software. The steps required to enlarge those disks served, in themselves, as a point of comparison between VirtualBox and VMware Player. (This section summarizes some steps, and refers to some tools (e.g., CloneVDI) that are discussed in more detail in the other post.)

To let both VMs finish installing all Windows updates, I decided to resize both to 80GB. Resizing the VM in VMware seemed relatively simple, partly because VMware guided me through some of the steps. To enlarge the VMware VM, I edited the Settings to expand the VM, then clicked Play virtual machine, and then installed and used a third-party partition editor (i.e., MiniTool Partition Wizard) in the VM, to enlarge the drive C partition within that expanded space. I have not elaborated, here, upon all of the steps in that multi-step process, mostly because VMware made it seem straightforward and easy.
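Incidentally, VMware also offers a command-line route for the disk-expansion part of that process: the vmware-vdiskmanager utility shipped with VMware Workstation (and, at least in some installations, with Player). I did not need it here, since the Settings dialog handled the expansion, but a sketch would look like this, assuming the VM is powered off, has no snapshots, and its disk file happens to be named Windows 7.vmdk:

vmware-vdiskmanager.exe -x 80GB "Windows 7.vmdk"

Either way, the extra space still has to be claimed inside the guest, by extending the drive C partition with a tool like MiniTool Partition Wizard or Windows Disk Management.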

VirtualBox likewise needed a third-party tool, CloneVDI, to expand the VM. CloneVDI itself was simple to use, but it cloned only the .vdi. There was no obvious way to save the old VM’s settings into this larger VM. Experimenting, I copied the old .vbox into the new folder, in Windows Explorer, renamed it as VirtualBox Test 80GB, and then used VirtualBox Manager > Machine > Add > navigate to the new .vbox file. Then VirtualBox Manager > select the new VM > Settings > General > Basic Tab > rename the VM. Then File > Virtual Media Manager > Release the old .vdi > Remove the old .vdi > Close. Then Settings > Storage > Storage Devices > right-click > Add Hard Disk > Choose existing disk > navigate to the new 80GB .vdi > Open > OK.
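For the record, there was also a VBoxManage alternative to CloneVDI that would reportedly have resized the existing .vdi in place, avoiding the clone-and-reattach shuffle, though I did not take that route here. A sketch, assuming the VM is powered off, the disk is a dynamically allocated .vdi named Windows 7.vdi, and the new size is given in MB (81920 MB being 80GB):

VBoxManage modifymedium disk "Windows 7.vdi" --resize 81920

As with the VMware route, drive C inside the guest would then still need to be extended into the newly added space.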

As those notes indicate, I found the VirtualBox process much more complicated than the VMware Player process. And that was just the tip of the iceberg. As documented in the other post, I had already spent many hours digging for information, trying various approaches, failing, and retrying, to achieve even this limited familiarity with the steps needed in VirtualBox. Having devoted those hours, I did feel that I was learning the VirtualBox way; I felt that I would get the hang of it eventually, and that these ease-of-use issues would not determine my choice of VM software. But I also felt that VirtualBox would be much improved, for users unwilling or unable to work with this moderately technical stuff, if the program included simple options for copying, moving, and renaming VMs.  Unfortunately, after years of obvious user struggles, it still appeared that no such improvements were imminent. To the contrary, as noted in another post, VirtualBox forum moderators actually ridiculed such suggestions.

Windows Experience Index

Windows 7 contained a built-in benchmarking tool known as the Windows Experience Index (WEI). A search led to a Reddit discussion dismissing WEI as a mere gimmick, a Quora discussion stating WEI was inaccurate, a SuperUser discussion indicating that WEI scores would influence a few aspects of system performance, a Tom’s Hardware discussion characterizing WEI as a tool of convenience, and a MakeUseOf article treating it, somewhat more seriously, as an approximation of whether a computer was suitable for basic (3.0+), multitasking (5.0+), or high-end (7.0+) tasks and features. Since the final score was simply the lowest of the individual component scores, I was more interested in the component scores.

To view the WEI, in each VM, I went to Control Panel > System > Windows Experience Index. I ran the WEI assessment once in the fresh new VMs, before installing updates, and again after installing updates.

I wanted to compare those scores against the underlying Windows 10 host. WEI itself no longer existed in Windows 10, but TenForums led me to view potentially comparable results by using Win-R > perfmon (Performance Monitor) > left pane > Data Collector Sets > System > right-click on System Diagnostics > Start. Within a minute or so, the report produced by that step was visible at Performance Monitor > left pane > Reports > System > System Diagnostics > double-click the report > Hardware Configuration section > Desktop Rating > click on the plus signs (+) to expand the available items. Like this:

[Image: Windows Experience Index scores on a Windows 10 machine]

The image shows the scores from my host Win10 system. (D3DScore was reportedly the Gaming Graphics score.) Those scores appeared not to be comparable to the Windows 7 WEI, which could not go higher than 7.9. (The scores might have been even higher if I hadn’t obtained them while the system was handling a somewhat demanding file comparison process.)
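For those who prefer the command line, the same WinSAT scores could reportedly be refreshed and read from an administrator command prompt. I did not test that route for this post, but the sketch would be:

winsat formal
wmic path Win32_WinSAT get CPUScore,D3DScore,DiskScore,GraphicsScore,MemoryScore,WinSPRLevel

The first command re-runs the full assessment; the second reads the stored subscores, where WinSPRLevel is the overall (lowest-subscore) rating corresponding to the old WEI base score.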

The WEI was the one benchmark that I was able to run on the basic VMs, before installing Windows updates. The WEI scores before and after updates were the same, with one exception: VirtualBox primary hard disk drive (HDD) access improved from 6.6 to 7.8. I was not sure whether that was due to an improvement in those updates (in e.g., a driver) or perhaps to some quirk in my test procedure. The only really notable result from the WEI comparison was that VirtualBox graphics and gaming graphics tests remained stuck at 1.0, the lowest possible score, while VMware scored 5.7 for graphics and 5.0 for gaming graphics. That did turn out to be an accurate predictor that VirtualBox would have problems when confronted by the graphics aspects of various benchmarking tools.

VirtualBox Guest Additions Problem

The first time I ran PassMark (below), VirtualBox said, “Unknown error running Direct 2D test.” Actually, it produced a screen that seemed partly frozen; I saw that error only when I killed the top window. I clicked OK, and the test resumed; but then I got this notice:

PassMark DirectX Testing Suite

Warning: Nothing will be rendered.

The reference rendering device was selected, but your computer only has a reduced-functionality reference device installed. Please insure that the most recent device driver for your graphics adapter is correctly installed.

Other graphics errors followed (i.e., “The specified video adapter does not exist. Please specify a valid video adapter”; “Unable to open the GPU Compute benchmark’s results file. Results for this test are unavailable”). It was not clear why I would get such errors, in a VM where I had checked the boxes supposedly offering 2D and 3D graphics.

As I figured out in a previous post, it was possible that the Guest Additions, useful for improving some aspects of VirtualBox performance, could be mounted but not yet installed. Hints to that effect could include the absence of the VirtualBox blue cube icon in the VM’s system tray, no Guest Additions installation listed in the guest Windows 7 Control Panel > Programs and Features, and a grayed-out option at VirtualBox menu > View > Auto-resize Guest Display.

In this case, of those possible symptoms, I did notice the absence of Guest Additions in Programs and Features. I tried using Start Menu > Programs > Oracle VM VirtualBox Guest Additions > Uninstall. That required a reboot. After reboot, VirtualBox menu > Machine > Settings > Storage still showed VBoxGuestAdditions.iso as being mounted. Further tinkering yielded confusion rather than clarity.

I noticed, around this time, that VirtualBox 5.2.8 had been released, so I tried upgrading to that, including its Extension Pack and Guest Additions. Downloading the Guest Additions required a full investigation, as Oracle had artfully removed any Guest Additions download link from (you guessed it) the VirtualBox Downloads page (though at least they did include a helpful suggestion that the user should be sure to download them — from somewhere). Oracle also made sure not to include a link to the Guest Additions download on its separate VirtualBox 5.2.8 downloads page. I did at last find the Guest Additions through a good old 1990s-style index page.

Then the mystery continued: now what? What I figured out was, make sure the VM is powered down. Then go to VirtualBox Manager > select the VM > Settings > Storage > Storage Devices > VBoxGuestAdditions.iso > right-click > Remove Attachment (because it seemed to be fubar) > ignore the dire warnings > Remove. Or, even better (since that didn’t seem to do the trick) try VirtualBox Manager > File > Virtual Media Manager > Optical disks tab > right-click on VBoxGuestAdditions.iso > Remove > Close. Then VirtualBox Manager > Settings > Storage > Storage Devices > right-click on Controller: SATA > Add optical drive > Choose disk > navigate to the Guest Additions ISO > Open > OK. Then start the VM . . . .
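For the record, those last few ISO-attachment clicks have a one-line VBoxManage equivalent, which might have spared some of this spelunking. A sketch, with the VM powered off, assuming the controller is named SATA (as shown under Storage), that port 1 is free, and that the ISO sits in its default install location on a Windows host:

VBoxManage storageattach "Name of VM" --storagectl "SATA" --port 1 --device 0 --type dvddrive --medium "C:\Program Files\Oracle\VirtualBox\VBoxGuestAdditions.iso"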

But wait. Wait. We are not doing justice to the nature of the process I was experiencing by this point. The sense of bewilderment, indeed disbelief, may be more precisely approximated if I lurch, right here, to the Read-Me file that — well, I will just allow it to speak for itself:

Oracle VM VirtualBox Guest Additions

Where have the Windows drivers gone?

The Windows Guest Additions drivers were removed from this directory to save space on your hard drive. To get the files you have to extract them from the Windows Guest Additions installers:

To extract the 32-bit drivers to “C:\Drivers”, do the following:
VBoxWindowsAdditions-x86 /extract /D=C:\Drivers

For the 64-bit drivers:
VBoxWindowsAdditions-amd64 /extract /D=C:\Drivers

Note: The extraction routine will create an additional sub directory with the selected architecture (x86 or amd64) to prevent mixing up the drivers.

To get further help with the command line parameters of the installer, type: VBoxWindowsAdditions-<arch> /?

What? Are you high? Save space on my hard drive? Seriously. Guys, this is not 1986. I have much more space left on my disk than I have hours remaining in my life. Which do you suppose I would rather devote to this task?

Let us review. I came down this rabbit hole seeking nothing more than a way to run VirtualBox. The concept seemed to be that I had to run Guest Additions, itself something of a kludge, in order to get the VM to function normally. And, fine, I accepted that. But how did that translate into me figuring out how and where to run some command in order to install drivers that were evidently supposed to be a part of Guest Additions?

I mean, I think they left out a step. I think they meant to compose that command in some ancient or obscure language, like Fortran or maybe Sanskrit, so I could learn that in order to get the Guest Additions installed. Or maybe there was going to be an authentication process at the end of the tunnel. Because this really wasn’t confusing enough yet.

What I hoped was that maybe I wasn’t supposed to read that file, the one named Read-Me. Maybe I was just supposed to run VBoxWindowsAdditions.exe, and it would take care of everything. Or maybe VBoxWindowsAdditions-amd64.exe. Not sure. I found those by starting the VM and going into Windows Explorer > CD Drive (D:) VirtualBox Guest Additions. The User’s Manual (4.2.1.1) said I should just find VBoxWindowsAdditions.exe on my own, and run it manually, if it didn’t autostart.

So I did that. It paused with a question about components. Did I want to install Direct3D Support (Experimental)? A search led to a nice How-To Geek article, published in 2012. So, OK, maybe I was being unfair. Maybe the current state of the Guest Additions really did date from the dawn of the universe, when hard drive space was scarce. Geek warned, “3D acceleration and the driver are experimental in VirtualBox” — and that was true, they still were, six years later, here in 2018. I could tell because the Guest Additions installer now gave me the same “experimental” warning, confirmed by comments in a VirtualBox.org discussion. So, OK, I gave up and went with the non-experimental approach. No Direct3D. I wasn’t looking for (2012, or perhaps 1986) cutting edge here. I wanted a stable, well-behaved VM. I wanted to get some work done.

And that did it. Now I had Oracle VM VirtualBox Guest Additions 5.2.8 listed in Control Panel > Programs and Features. I re-ran the Windows Experience Index but, uh-oh, the graphics score was still 1.0. So — how did these other people manage to test graphics in VirtualBox? Apparently they didn’t. I got this from TekRevue (Tanous, 2015):

VirtualBox technically supports up to DirectX 11, and its graphics driver reports that compatibility to the guest OS, enabling access to the higher-end tests. Graphics performance, however, is so poor in VirtualBox at these higher levels that it is unusable for any practical purpose.

Similarly, Lifewire (Nelson, 2016) reported multiple testing issues with VirtualBox, including “a problem with . . . how it makes use of CPUs” and “a strange bump in rendering results,” and said “VirtualBox couldn’t reliably complete the rendering test” and in some tests “it was unable to start or finish.”

It seemed, in other words, that I had gotten the Guest Additions installed, and for some purposes that would help, and for the many tasks on which I didn’t need good graphics performance, VirtualBox would be fine, even if it did mess up some of the tests I was about to perform.

PassMark PerformanceTest 9

So now it was time to run benchmarking programs, and see what they told me about these VirtualBox and VMware VMs. I started with PassMark PerformanceTest 9. When I ran it on the host machine, it said,

Your evaluation period has expired. You may continue using PerformanceTest, however some additional features have been disabled.

I had just downloaded it. How could my evaluation period expire? Evidently I had used it on the host computer at some point in the past, and PassMark had not forgotten. I ran it anyway, and chose Run Benchmark. It said it would take “several minutes,” and it did. When it was done, I ran it on the two VMs. VirtualBox stopped when it got to the graphics test, with the errors noted in the previous section. The test ran in VMware with no errors, but it did include some notes (e.g., “Requested level of Anti-Aliasing not supported, rendering without”; “Unable to run at desired resolution of 1680×1050”). I tried again, the next day, with similar results on both machines — but this time I stood ready to click OK as soon as errors popped up in VirtualBox. So the first of my two tests did not run on both machines simultaneously, but the second of the two tests was roughly simultaneous.

I saw some irregularities in test results, and decided to re-run both the simultaneous and non-simultaneous tests. On my second run through the simultaneous tests, VMware produced an error, “Could not reset the Direct3D device. The dialog will now be closed.” Other than that, it ran without problems. Since that error appeared to be an anomaly, I re-ran the test and discarded those results. In VirtualBox, meanwhile, the same errors appeared in all tests.

My impression, at this point, was that I was getting maybe 5% variation in test results due to the other minor stuff happening on the computer. In other words, running the PassMark tests several times seemed to produce roughly similar results for the same scenario. For instance, the overall PassMark rating for VMware, running the tests while VirtualBox was inactive, was 1916.3 on the first try and 1990.9 on the second try, for a 4% variation. So when I saw that one score (i.e., VMware's 3D Graphics Mark) actually improved by 7% when I was running tests on both VMs at the same time (as compared to the scores when running them asynchronously), I attributed that to minor background variation.
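To spell out that arithmetic:

(1990.9 - 1916.3) / 1916.3 = 0.039, i.e., roughly a 4% difference between two runs of the same scenario.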

With that one exception, all scores dropped, as expected, when the two VMs were running simultaneously, as compared to when they had the host system’s resources all to themselves. For VirtualBox, that drop was 11% in the overall rating, with individual component score declines ranging from 7% (CPU Mark) to 18% (Memory Mark). For VMware Player, most score declines were minor, in the range of 4% to 9% (not counting the 7% increase noted in the preceding paragraph). But VMware’s Disk Mark scores dropped by 76%! In other words, when both VirtualBox and VMware Player were competing for hard disk access, VMware Player either yielded or was run over, depending on how you look at it. The average simultaneous Disk Mark score from two tests was 3173.6 for VirtualBox and only 609.7 for VMware Player.

Looking only at the average scores produced when the VMs were running asynchronously (i.e., not competing with each other for system resources), VirtualBox was still the overall performance winner. There were virtually no differences in scores on the CPU and Memory tests. VMware was 18% better on the 2D Graphics Mark test and, of course, was in a class by itself on the 3D Graphics Mark tests that VirtualBox could not complete. But it appeared that PassMark gave the Disk Mark test a lot of weight in calculating its overall rating: even when, as I say, the VMs were not competing for hard disk (or, in this case, SSD) access, VMware's Disk Mark score was still 39% slower than VirtualBox's. Those Disk Mark differences were quite similar in both of the asynchronous tests administered to each VM.

It was possible that the disk test results were due to some quirk of my system setup. As with all of these tests, other users would be best advised to try benchmarking their own installations. Tentatively, though, at least for purposes of this computer on which I proposed to run a VM, these scores — and my visual observation, as I watched the relative speeds at which the two VMs worked through the PassMark tests — suggested that VirtualBox offered better performance, assuming my tasks didn’t need the graphic capabilities that were not available for the PassMark tests.

There was also the question of how the relatively faster VirtualBox VM compared to the PassMark scores produced by the host machine. Somehow, according to PassMark, the VirtualBox VM was actually 11% faster than the host machine running it, using the average of two asynchronous tests of each. A search led to a StackOverflow discussion in which people explained that this was possible because VM software was now designed to operate at a lower level, bypassing bottlenecks in Windows. The math was a bit dubious, nonetheless: the only benchmark on which VirtualBox beat the host machine was Memory Mark, and even there it was only 4% faster; meanwhile, the host was 94% faster on CPU Mark, 16% faster on 2D Graphics, and 29% faster on Disk Mark. It still seemed that, generally, one would be best advised to use the host when possible.

Finally, PassMark offered a webpage briefly describing its database of PerformanceTest scores containing “thousands of PC benchmark results,” with links to several high-powered subgroups (e.g., Fastest Desktops). Unfortunately, my system’s average score of 2921 was far too low to appear among their “fastest” desktop machines — the slowest was at 9055 — and they did not seem to offer access to more detailed information from their database.

PCMark10

I turned to the next benchmarking tool. During installation, PCMark10 confronted me with an error message: “Sorry, unable to continue. You appears to be running this installation in Administrator mode.” I clicked Ignore. That worked, for purposes of finishing the installation.

Next, I got, “No compatible benchmarks found.” I clicked Benchmarks a couple times. This gave me an option of PCMark10, PCMark10 Express, or PCMark10 Extended. I chose PCMark10. That took another couple of clicks. In both VMs, PCMark10 said, “Your hardware may not be compatible. To run this test you need an OpenGL 4.3 compatible GPU.” I clicked Run anyway — a couple of times. It proceeded. It took maybe 35 minutes. At the end of each test, it said, “There was a problem running the benchmark.” The accompanying explanation was, “The benchmark run ended unexpectedly. Please try again or contact Support.”

But the test scores seemed to indicate that the tests had completed successfully, for the most part. There were summary scores titled Essentials, Productivity, and Digital Content Creation (DCC). Within those summary scores, Essentials included App Start-up Score, Video Conferencing Score, and Web Browsing Score; Productivity included Spreadsheets and Writing; and DCC included Photo Editing, Rendering and Visualization (R&V), and Video Editing. As the sole failure, both VirtualBox and VMware produced scores of zero for R&V. Possibly that explained the “problem” notation. For my purposes, I focused on the Essentials summary score and on the specific Spreadsheets, R&V, and Video Editing components.

I ran the test again, as needed, to produce asynchronous scores for the host system and synchronous scores for the VMs. Even when the host machine ran, PCMark10 had one error: “SHGetFolderPathAndSubDir failed: The system cannot find the path specified.” That error arose during the R&V phase, which neither VM was able to complete. I was not sure what that error might mean, but apparently it was sufficient to prevent the R&V test from producing a result. So I had no R&V score from any source. The host also concluded with the same “benchmark run ended unexpectedly” error.

In the PCMark10 scores, unlike the PassMark scores, there was little if any penalty for running tests in both VMs at the same time: scores for all categories (i.e., Essentials, Spreadsheets, and Video Editing) ranged within 8% of each other. The exception: VMware’s score on Essentials was actually 12% faster when the VMs ran simultaneously, possibly signaling that the VMs together (or VirtualBox in particular) tended to crowd out other programs that might be seeking system resources.

There was also no noticeable difference between VirtualBox and VMware scores on PCMark10: they were within 2% of each other on Essentials and Spreadsheets, and VMware was only 5% faster on Video Editing. The most dramatic difference in PCMark10 scores was that the physical machine was far faster than the VMs. Host scores exceeded VirtualBox scores by 38% in Spreadsheets, 64% in Essentials, and 167% in Video Editing.

Novabench

Like PassMark, Novabench 4.0.3 tested component speeds rather than performance in practical tasks (as PCMark10 did); and like PassMark, the Novabench tests completed quickly — very quickly, in fact, taking only a minute or so.

After running the first test, on the host machine, I clicked the option to View Performance Charts and Comparisons. That opened a Novabench webpage from which I could register with Novabench, so as to save my score to my free public profile. I took that offer. Their webpage created my new account! but lost my scores. I clicked the button in Novabench again. This time for sure: the result was added to my profile. The website offered a Performance Analysis and a Baseline Comparison. The Baseline Comparison allowed me to compare my computer’s scores to those of other profiles (e.g., Mid-level Gaming PC, Budget Laptop). The Performance Analysis informed me how my CPU and GPU scores compared to others.

The Novabench benchmark did not run successfully in VirtualBox; it reported, “Failed to initialize Direct 3D,” and concluded with a Novabench score of -2147483232. In VMware, it ran once but, on another try, reported, “The game crashed.”

For such reasons, Novabench provided limited useful information, offering contrasting data only for the host and VMware Player. Like PassMark, it found the host was faster in CPU, RAM, and disk scores, but Novabench found the host to be much faster than PassMark did. I doubted, however, that RAM was indeed 51% faster on the host than in VMware Player, or that the CPU was 166% faster. Since Novabench reached its radical conclusions on the basis of much less testing, I concluded that PassMark probably provided a more accurate calculation of VM component speeds.

SiSoftware Sandra Lite 2017

The last of my selected benchmarking tools was a version of the famous and venerable SiSoft Sandra, available since 1997. To get it to run, I went to its Benchmarks tab > Overall Score > Refresh the results by running all benchmarks > uncheck “Enable certification, aggregated results and rankings (recommended)” > click on the green checkmark. (Alternately, I could have left the certification option checked, and entered an email address and password.) Sandra did not show much action at first: for some minutes, it just said “Analysing.” Altogether, it took nearly an hour to complete its tests on each tested machine.

SiSoftware Sandra seemed to be designed to compare computers in terms of kPT. According to Mario F. in a Cprogramming discussion, kPT was “a unit of measurement designed by SiSoftware . . . meant to display an aggregate score of several performance tests” for the purpose of “comparing it to other people’s scores.” In the words of a participant in a Tom’s Hardware forum, “[N]o-one really seems to know what it means. It would appear to be a made up unit for the benchmarking software.”

It seemed, then, that I could use the Sandra findings in two ways: I could compare computers by using kPT as a summary measure, and also by using specific calculated values that were apparently included in the kPT value. As an example of those specific calculated values, Sandra found that the VirtualBox VM had a Processor Multi-Media aggregated score of 95.02MPix/s, which Imgtec translated as “million pixels per second.”

To see what Sandra calculated, after running it on the VirtualBox VM, I just used Ctrl-C to copy from Sandra and then Ctrl-V to paste into, say, Notepad or Excel. It didn’t seem to matter where I clicked on Sandra before hitting Ctrl-C; the command seemed to know that I wanted to copy the full statement of results. Alternately, I got the same results by clicking on its Copy to Clipboard icon (i.e., the icon showing two sheets of paper, near the bottom left corner of the Sandra window). After doing the same thing with the VMware and host system results, I got the following list of 18 tested components:

.NET Arithmetic
Cache & Memory Latency
File System Bandwidth
File System I/O
GP (GPU/CPU/APU) Cryptography
GP (GPU/CPU/APU) Financial Analysis
GP (GPU/CPU/APU) Memory Bandwidth
GP (GPU/CPU/APU) Memory Latency
GP (GPU/CPU/APU) Processing
GP (GPU/CPU/APU) Scientific Analysis
Media (Audio/Video) Transcode
Memory Bandwidth
Processor Cryptography
Processor Financial Analysis
Processor Multi-Media
Processor Scientific Analysis
Video Memory Bandwidth
Video Shader Compute

In the VMs, but not in the host, some components were tested twice, producing two dissimilar scores. I was not sure why. A search did not lead immediately to an explanation. I kept only one of the duplicative results, generally the latter.

Starting at this point, I stopped the practice of running benchmarks in both the VirtualBox and VMware VMs simultaneously. Thus, for this test, I only needed to run Sandra once on each of the VMs, and once on the host computer, to gather the data needed for my SiSoftware Sandra comparison.

Of the 18 components tested in at least one of these three machines, only ten were tested in all three; the results for those ten are summarized below.

Focusing on these ten components tested in all three machines, a number of VirtualBox scores differed from the VMware scores by less than 5%. But there were also some larger differences. VMware was 13% faster on Processor Multi-Media, and VirtualBox was markedly faster on File System Bandwidth (54%), File System I/O (84%), and Media Audio/Video Transcode (61%). The VirtualBox Overall Score was 19% higher. That is, VirtualBox was about one-fifth faster overall, according to Sandra’s Overall Score calculation.

Focusing on those same components, the host was 34% faster than VirtualBox overall, by Sandra’s calculation. But that calculation masked some profound differences. Unlike the comparison of the two VMs, there were only two components on which VirtualBox and the host were within 5-6% of each other. Aside from memory bandwidth, the host ranged from significantly to extremely superior in every other regard.

Review

The preceding sections report on the outcomes of tests of VirtualBox and VMware Player VMs, and of their Windows 10 host, using four highly recommended benchmarking programs.

Of those four, Novabench failed to run in the VirtualBox VM; and when comparing the VirtualBox VM against the host, Novabench produced dubious results inconsistent with other tests.

PassMark found that VMware was better than VirtualBox for 2D graphics, and that VirtualBox offered no 3D graphics capability. Otherwise, though, VirtualBox turned in a notably better performance, regardless of whether both VMs were running the PassMark test at the same time. VMware seemed to struggle especially in disk access speed. PassMark found the VirtualBox VM to be faster overall than the Windows 10 host, but this conclusion was dubious: the host was faster (sometimes much faster) on almost all individual component measures.

PCMark10 took a different approach, estimating system performance in real-world types of applications rather than system components. Among the measures provided by PCMark10, I focused particularly on Essentials (combining app startup, video conferencing, and web browsing scores), Spreadsheets, and Video Editing. On those items, PCMark10 found little difference between the two VMs, but found that the host was 38% faster than VirtualBox in Spreadsheets, 64% faster in Essentials, and 167% faster in Video Editing.

Finally, SiSoftware Sandra Lite 2017 tested 18 system components (e.g., Memory Bandwidth), but only 10 of those 18 were tested on all three machines. As with PassMark, the results from tests of specific components cast doubt on the “Overall Score” calculation. Among those components, Sandra found that VirtualBox was markedly faster than VMware on several components, and was not significantly slower on any component. Agreeing with PCMark10, Sandra found the host to be significantly faster than VirtualBox on almost all components. The extreme was a score 13x faster for Media (Audio/Video) Transcode — consistent with, but much larger than, the Video Editing advantage detected by PCMark10.

From these tests, a rough consensus seemed to emerge. Two of the three benchmark tools comparing the host against the VirtualBox VM agreed that the host would be very much faster in especially demanding tasks, such as video editing. In the comparison between VMs, all three tools seemed to agree that VirtualBox was at least not slower than VMware; two of the three said that, actually, VirtualBox was much faster. The latter conclusion was consistent with my own observations: when I ran the same test in both VirtualBox and VMware VMs simultaneously, I could see that VirtualBox was moving more quickly through the graphics tests, especially. VMware Player was apparently superior to VirtualBox, not in performance, but in certain aspects of usability: it had much better graphics support; and in my file operation experience (e.g., copying VMs), it seemed much easier to use. VirtualBox did offer other features (e.g., snapshots) that might offset those advantages.

By this point, my experimentation had provoked several additional questions. The following sections turn to those questions.

Basic Benchmarks for Virginal VMs

As described above, I had to install voluminous Windows updates, in the test VMs, in order to run the several benchmarking tools just discussed. Windows Experience Index (WEI) was the only one that would run on a computer that did not have all those updates, and various commenters questioned WEI’s accuracy and usefulness.

There were tools and techniques that could be useful for benchmarking Windows 7 VMs on which those voluminous updates had not yet been installed. I had hesitated to use such tools, favoring those that were highly recommended by reputable sources. But now it seemed fair to ask whether these simpler tools would produce results, in a relatively virginal VM, that were consistent with each other and with the results obtained from PassMark, PCMark10, and SiSoftware Sandra.

To run these simpler tests, I used the two VMs that I had just been discussing, with PassMark et al. already installed, and I also used basic VirtualBox and VMware VMs, built from a Windows 7 Service Pack 1 (SP1) installation DVD, that had not received any Windows updates since SP1. I would also run these tests on the host. So I would be testing four virtual machines and one physical host machine.

But first, a digression. By this point, it had occurred to me that my VMware VM might be having performance problems because it had not been created in the same manner as the VirtualBox VM. I had created the VirtualBox VM from scratch, starting with an empty VM into which I installed Windows 7. But the VMware VM was created as a conversion of that VirtualBox VM. Would it make a difference if I created my basic VMware VM from scratch? To find out, I did that, and revised its settings as described in another post. For Shared Folders, I shared only one drive on the host, sufficient to move files in and out of the VM. As I was about to see, though, this didn’t seem to make much difference.

MajorGeeks offered a long list of benchmarking tools, including many abandoned older tools that would not necessarily test important aspects of modern systems. I focused on those whose age seemed appropriate for Windows 7 systems. These were the programs I tried, first, on the VirtualBox VM:

  • AIDA64 Extreme 5.95. This was trial software. I started it and went into Benchmark. It did not appear to offer a general-purpose benchmarking tool: everything was focused specifically on memory, CPU, or FPU.
  • ASUS RealBench 2.5.6. The Benchmark option appeared to be focused on graphics. It ensured that I would do nothing else on the system, potentially affecting its calculations, by halting any time I moved the mouse. Eventually it died, as I suspected a graphics-oriented benchmark would, on the graphics-cursed VirtualBox VM.
  • CPUEater x64 9.0. The purpose of this tool seemed to be to demonstrate the value of a Process Lasso tool.
  • CPU-M Benchmark 1.6. This program quickly produced a CPU-specific benchmark value.
  • CrystalMark 2004R7 0.9.200. According to CrystalDewWorld, this was obsolete 32-bit software, updated in 2017 just for old times’ sake. I clicked the Mark button. Nothing seemed to be happening. Eventually I saw that it actually was working through the listed tests, one at a time. At the end, it produced an overall Mark score that seemed to be the simple sum of scores for individual components. This was the only benchmark program, in this list, that seemed as comprehensive as PassMark and the others discussed above.
  • QwikMark Portable 0.4. QwikMark offered a Run All Tests button to run its tests of CPU Speed, CPU FLOPS, Memory Bandwidth, and Disk Transfer. It finished in a matter of seconds.
  • UserBenchMark 2.9.2. This one, too, said that it would typically finish in less than a minute. It reported failure in Internet connectivity — not surprising, since these VMs were deliberately Internet-disabled; and with that, it said, “The benchmarks did not complete successfully.” I could have enabled Internet access temporarily, but I was pretty sure it would fail again, and quit, when it got to the VirtualBox VM’s graphics limitations.
  • x264 FHD Benchmark. Its initial screen said, “This application will test how fast your CPU is able to encode test sequence (SVT_1080p50.mkv) using default x264 settings.” In other words, it was a single-purpose tool focused solely on video encoding. It took four minutes and saved its results in “executable directory,” which meant the folder from which this portable program was running. The key point in those results seemed to be that it averaged 9.67 fps, 22399 kb/s. The Compare button showed me that, compared to faster systems, this 9.67 value was pretty slow.

On the basis of this preliminary look, I was inclined to start with CrystalMark and then, as needed, try the special-purpose tools offered by AIDA64, CPU-M, QwikMark, and/or x264 FHD Benchmark. Having verified that at least these tools worked in the VirtualBox VM, I tried them in the virginal VMware Player VM, where I had not installed Windows updates.

Unfortunately, CrystalMark crashed when I ran it, and crashed again when I retried. The crash happened during the OpenGL test; the specific application was CM09OGL.exe. It appeared this benchmarking program probably required one or more Windows updates in order to run successfully in Windows 7. (Later, I went back and ran it successfully on VirtualBox and VMware VMs that did have the necessary updates.)

Moving on to the other benchmarking tools, then, I was quickly able to determine that benchmarks ran successfully in AIDA64, QwikMark, and x264 FHD Benchmark. CPU-M was more of a problem. To install it, I had to use Shift-right-click > Run as Administrator; but then it failed to run, with this notice:

CPU-M Benchmark.exe – .NET Framework Initialization Error

To run this application, you must first install one of the following versions of the .NET framework:

v4.0.30319

Contact your application publisher for instructions about obtaining the appropriate version of the .NET Framework.

A search led to a Microsoft webpage offering the Microsoft .NET Framework 4.7 (Offline Installer) for Windows 7 SP1 and other versions of Windows. I assumed version 4.7 would cover the needs of the specified version 4.0. I downloaded it and tried to install it in the virginal VMware VM. It produced an error:

Setup has detected that this computer does not meet the requirements to complete this operation. The following blocking issues must be resolved before you can proceed.

Blocking Issues:

  1. The .NET Framework 4.7 is not supported on this operating system.
  2. Setup cannot continue because a dependent update needs to be installed before you can install this product on Windows 7, Windows Server 2008 R2, Windows 8 or Windows Servers 2012.

The second of those two issues pretty much confirmed that in fact, as the webpage said, this update was compatible with Windows 7. So I followed the link for the requisite update. The resulting webpage provided locations for the needed D3DCompiler_47.dll file — leading, for my Windows 7 x64 SP1, to a download of a file called Windows6.1-KB4019990-x64.msu. I tried to install that in the VM, but it said, “The update is not applicable to your computer.”

So, OK, maybe I should have searched for .NET Framework 4.0 instead of accepting 4.7. I tried that now. That search led to a different Microsoft page, resulting in a download of the Microsoft .NET Framework 4 (Standalone Installer) in the form of dotNetFx40_Full_x86_x64.exe. I brought that into the VMware VM and installed it. With that done, I ran CPU-M again. This time it worked. I wondered if that .NET Framework update was also the solution to the CrystalMark problem. But no, when I tried CrystalMark again, it still crashed at the same place. Proceeding with CPU-M, then, I was able to run its benchmark test.
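In hindsight, there was a quicker way to see which .NET Framework 4.x release (if any) a VM already had, before downloading installers: query the registry from a command prompt. I did not think of it at the time, but the sketch would be:

reg query "HKLM\SOFTWARE\Microsoft\NET Framework Setup\NDP\v4\Full" /v Release

If that key is missing, no 4.x framework is installed; if it exists, the Release value identifies the specific 4.x version, per Microsoft’s documentation.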

To sum up, the situation was that I did find a few benchmarking tools that would run in a Windows 7 VM with no Windows updates installed since SP1. None of these tools had the fancy graphics and other desirable features of the programs used earlier (e.g., PassMark). The benchmarking programs that I was able to run in this VM were Windows Experience Index (above), AIDA64, CPU-M, QwikMark, and x264 FHD Benchmark. To the extent that AIDA64, especially, simply reported various performance statistics, I might have supplemented or replaced it with other system information tools (e.g., Speccy).

So I did not find a general-purpose benchmarking program that would run in a virginal Win7 VM; but I did get enough information from these several tools to address the question of whether perhaps the VMware VM would have performed better without all those Windows updates that I had added in order to install PassMark and the other benchmarking suites discussed above. The information I got was as follows:

  • CPU-M. Scores: VirtualBox: 22909; VMware: 19673. Conclusion: the VirtualBox VM, cluttered with Windows updates and previous benchmark tool installations, was 16% faster.
  • QwikMark. The point of disagreement, here, was Disk Transfer. Specifying drive C transfers, in five tests, VirtualBox produced scores of 149, 243, 269, 347, and 412, while VMware scores were 417, 281, 425, 404, and 453, for averages of 284 vs. 396. Conclusion: on average, VMware’s Disk Transfer speed was 39% faster.
  • x264 FHD Benchmark. VirtualBox: 9.67 fps. VMware: 9.31 fps. Conclusion: VirtualBox was 4% faster.
  • AIDA64. Among this program’s 15 component-specific benchmarks, I selected a few examples, as follows: Memory Read: VirtualBox 14034 MB/s; VMware 13635; VirtualBox 3% faster. Memory Write: 15221 vs. 14714; VirtualBox 3% faster. CPU Queen: 18311 vs. 18270; VirtualBox <1% faster. CPU PhotoWorxx: 9095 vs. 7749; VirtualBox 17% faster. CPU ZLib: 127.8 vs. 127.6; little difference. CPU AES: 4722 vs. 6564; VMware 29% faster. CPU Hash: 1378 vs. 1489; VMware 7% faster. FPU Mandel: 3132 vs. 3113; VirtualBox 1% faster. Conclusion: VirtualBox and VMware seemed to be fairly close.

The conclusions from the previous section, comparing two fully updated Windows 7 VMs, were that VirtualBox seemed to be somewhat faster than VMware Player. The present section suggested that a minimal VMware VM might be somewhat faster than the VMware VM that was fully loaded with Windows updates, but still wouldn’t be noticeably faster than a VirtualBox VM similarly loaded. In an apples-to-apples comparison of minimal VMs, it seemed VirtualBox would probably pull ahead once more.

The big uncertainty, here, was Disk Transfer speed: PassMark had found the loaded VMware VM to be noticeably slower than the loaded VirtualBox VM, but now QwikMark was finding it to be noticeably faster. Maybe both were possible. The loaded VMware VM had five host partitions as shared folders, while the minimal VMware VM had only one. QwikMark had seemed to be looking at only drive C, so that shouldn’t affect the calculation, but maybe it did anyway.

There was one other noteworthy benchmarking tool that I had not explored: VMware’s VMmark 3.0, downloaded as vmmark3-template-053117.ova. A search suggested that VMmark had not often been used to compare VMware Player against VirtualBox. I did not use it initially because I suspected it would be designed to highlight VMware’s strengths. But I was curious as to what it might say. So after I had finished this post and was reviewing its contents, I decided to insert, here, a brief comment on VMmark. Its User’s Guide (2017) provided extensive notes on all kinds of preparations that I was not about to invest time in. Instead, in VMware Player, I tried Open a Virtual Machine > navigate to the downloaded .ova file > Open > call it VMmark > Import. It gave me an error and an option, “Click Retry to relax OVF specification and virtual hardware compliance checks.” Retry resulted in an error: “Failed to open virtual machine: Invalid target disk adapter type: pvscsi.” A search found nothing exactly on point, but did lead to a couple of suggestions for editing the .ova that might make it usable in VMware Player. I decided not to pursue that project at this time.

Overall, then, it seemed performance in VirtualBox and VMware Player VMs might be slightly improved by keeping them simple and limiting shared folders. It seemed relatively clear that the installation of updates had not profoundly impaired VMware. That is, I did not find support for the speculation that VMware would run much faster than VirtualBox if I did keep things simple. My rough sense was that, for any level of VM complexity, VirtualBox would probably tend to be somewhat faster.

Hands-On Exploration: Video Editing

At this writing, I was particularly interested in using a VM as an alternative to the Windows 10 host, which for some reason was having problems with some of my software. To gain an idea of what it would be like to use some of that software in a VM, I installed a trial version of Adobe Premiere Elements (APE) into the fully updated VirtualBox VM. I felt that video editing was among the more complex and demanding tasks a computer might attempt, and might thus provide at least some sense of whether the VM was ready to handle a workload.

VirtualBox did at least let me install that trial version and work with my sample video file, whereas, for some reason, the APE installation on the VMware VM insisted upon activation before it would go further. This was OK: I assumed the experience of working with the same software in the two VMs would be pretty similar, whereas there might be striking performance differences between the host and a VM.

One thing I noticed immediately: in VirtualBox > Windows > Control Panel > Display > Screen Resolution, the maximum was 1280 x 960, whereas the host went up to 1920 x 1080. A search for solutions found that the following suggestions did not help: drag the edge of the window to produce a larger window; use VirtualBox Manager > File > Preferences > Display > Maximum Guest Screen Size = None, or Hint > 1920×1080; enable 3D acceleration; make sure Guest Additions are installed; Host-F (using Right-Ctrl as the default Host key).

What did work was: with the VM running, enter a command in the folder where VBoxManage.exe was located. The command was as follows:

vboxmanage controlvm "Name of VM" setvideomodehint 1920 989 32

where “Name of VM” was the name shown in VirtualBox Manager; the ending 32 was the maximum color mode (i.e., 32-bit) allowed (where 24 might be a workable alternative if needed); and the resolution (i.e., 1920 x 989) was the result of trial and error. On my monitor, setting the vertical resolution even one pixel larger (i.e., 990) would add both horizontal and vertical scrollbars (i.e., not everything would fit onscreen; the Windows taskbar would be obscured). Tinkering was easy: the VM was immediately resized, so I could see what I was achieving, each time I re-ran the command with a different set of resolution values.
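According to the VirtualBox documentation, a custom resolution can also be registered as an extra video mode, so that it shows up in the guest’s own Screen Resolution list rather than being re-applied by hand. I did not test this, so treat it as a sketch:

VBoxManage setextradata "Name of VM" CustomVideoMode1 1920x989x32

The mode is read when the VM starts, so the command would be entered while the VM is powered off.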

In a brief look, I found that the APE menus lagged a bit in dropping down; they didn’t respond instantly. But there was no sluggishness in dragging elements around on the APE timeline, and playback was normal. So it seemed that APE was going to work in a VM. I expected it to be slow in some regards (e.g., rendering), but it seemed it would serve until I decided to fix Windows 10 on the host, or switched my video editing to another computer, or tried some other video editing software. Beyond that, I didn’t feel I would have much informed commentary on APE within the VM until I had spent some hours using it there.

Deciding on CPU Cores

My last question, for now, in configuring a VM as an alternative to a host computer, was whether it mattered how many CPU cores I allocated to the VM. I had settled on 8GB RAM as a good starting number, on a system with 24GB total; I would revise that figure as I gained further experience with the host or guest running out of RAM in various situations; but I did want to get a clearer sense of how many CPU cores I should dedicate to the VM.

VirtualBox itself provided a guideline. With the VM powered down, in VirtualBox Manager > Settings > System > Processor tab, the slider ran (on my machine) from 1 CPU to 8 CPUs. But the color bar under the slider changed from green to orange above 4 CPUs. So when I tried selecting 5, it suggested backing off to 4.

Not everyone seemed to agree with that VirtualBox guideline. For example, some participants in a StackExchange discussion observed higher performance when allocating all (in my case, eight) CPU threads. If I was going to do that, an accompanying suggestion was to reduce the Execution Cap slider to 80-90%, so as to reduce the risk of a crash.
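For anyone wanting to experiment with those suggestions, both the CPU count and the execution cap are scriptable. A sketch, with the VM powered off and “Name of VM” standing in for the name shown in VirtualBox Manager:

VBoxManage modifyvm "Name of VM" --cpus 8 --cpuexecutioncap 90

That would assign all eight threads but cap each virtual CPU at 90% of a host core, along the lines of the StackExchange suggestion.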

Lifewire (Nelson, 2016, item 06 of 07) reported overall GeekBench score improvements of 30% (when going from one CPU to two) and 34% (going from two CPUs to four) in VirtualBox. There did seem to be general consensus that going at least to the halfway mark (e.g., 4 CPUs of 8 maximum) would improve performance without increasing instability.

 


One Response to Choosing the Best Benchmarks: A Comparison of VirtualBox vs. VMware Virtual Machines

  1. dg1261 says:

    “Of course, there were non-performance-related tradeoffs. VMware was easier to use, for purposes of file operations (e.g., copying or renaming VMs); on the other hand, VirtualBox offered features (e.g., snapshots) that were not available in VMware’s free Player.”

    Wait . . . what?? VMware doesn’t support snapshots? Screeching halt. That makes VMware a non-starter, in my book.

    I use my VMs for websurfing and test-driving trial software before committing it to my main machine. If I can’t throw away a VM session, then it’s of no use to me.

    I realize Ray is talking about the free version of VMware Player, but it’s a fair comparison. There’s no reason to have to pay for a different version just to get features that are already in VirtualBox for free.

    I did try VMware Player a decade ago, but quickly discarded it. I found it installed and used more background resources–even when it wasn’t being used–than Virtual PC or VirtualBox. It had other limitations (back then, it could only play VMs, not create them) and no real advantages, so I ditched it and haven’t looked back.

    That said, I do agree VirtualBox makes it unnecessarily hard to manage virtual disks. I’ve gotten used to it, but there’s no reason for it to be so obtuse.
