Windows Server

Oil & Water: Hyper-V and 3D graphics adapters

For a variety of reasons -- access to more RAM, better performance, etc. -- many people have chosen to use Windows Server 2008 as a desktop solution. Hyper-V -- one of the newest products to arrive on the enterprise virtualization scene -- is a free add-on for this server product and might seem a logical choice for virtual testing. However, Hyper-V and higher-end video cards don't get along. Scott Lowe, with a little help from the Microsoft Knowledge Base, explains why.

Partly due to perceived shortcomings in Vista and partly due to need, there has been a great deal of interest in running Windows Server 2008 as a workstation.  Personally, I run the 64-bit edition of Windows Server 2008 on both my home and work machines.  The additional RAM possibilities and the ability to use some Windows Server 2008 features make this a desirable configuration.

If you do any amount of testing, installing the Hyper-V role might be on your to-do list.  Although there are plenty of virtualization options out there, at first glance, adding the Hyper-V role might seem to be the easiest path to take.

If your desktop is doubling as your lab environment, you might want to reconsider this choice and pick a different virtualization option.

A couple of days ago, I installed the Hyper-V role on my work Windows Server 2008 desktop.  The installation of the Hyper-V role requires a reboot.  Immediately upon reboot, my system slowed to an absolute crawl -- so bad that it was nearly unusable.  The mouse pointer was choppy, and application windows redrew slowly, if at all.  This was a weekend activity on my part, so I was also watching a Flash-based video on Hulu while I worked; the video performance was so bad that it wasn't watchable.  Of course, that alone wasn't a major problem, but the system slowing to an overall crawl was.

After I uninstalled the Hyper-V role, system performance returned to normal.

I was baffled and decided to do a little research into this weirdness.  I ran across a Microsoft knowledge base article that explains the reason for this behavior:

"This issue occurs when a device driver or other kernel mode component makes frequent memory allocations by using the PAGE_WRITECOMBINE protection flag set while the hypervisor is running. When the kernel memory manager allocates memory by using the WRITECOMBINE attribute, the kernel memory manager must flush the Translation Lookaside Buffer (TLB) and the cache for the specific page. However, when the Hyper-V role is enabled, the TLB is virtualized by the hypervisor. Therefore, every TLB flush sends an intercept into the hypervisor. This intercept instructs the hypervisor to flush the virtual TLB. This is an expensive operation that introduces a fixed overhead cost to virtualization. Usually, this is an infrequent event in supported virtualization scenarios. However, some video graphics drivers may cause this operation to occur very frequently during certain operations. This significantly magnifies the overhead in the hypervisor."

In other words, common tasks that use certain video functions require an inordinate amount of overhead when the Translation Lookaside Buffer is virtualized.  As a result, system performance is seriously impacted.  I'll admit that a lot of what is explained above is relative gibberish, but the last couple of sentences make it clear that video drivers are the culprit.
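To get a feel for why frequent flushes hurt so much, here is a toy back-of-the-envelope model.  The numbers are my own illustrative assumptions, not measurements from Hyper-V; the point is only that a fixed per-flush intercept cost, multiplied by a driver that flushes constantly, eats the whole CPU.

```python
# Toy model of virtualized-TLB flush overhead.
# Both per-flush costs below are assumed, illustrative numbers.

NATIVE_FLUSH_US = 0.1            # assumed cost of a TLB flush on bare metal
HYPERVISOR_INTERCEPT_US = 50.0   # assumed fixed cost of a flush intercept
                                 # into the hypervisor's virtual TLB

def flush_overhead_us(flushes_per_second: float, virtualized: bool) -> float:
    """Microseconds per second spent flushing the TLB."""
    per_flush = HYPERVISOR_INTERCEPT_US if virtualized else NATIVE_FLUSH_US
    return flushes_per_second * per_flush

# An ordinary kernel component that flushes rarely: negligible either way.
rare = flush_overhead_us(10, virtualized=True)

# A video driver allocating write-combined memory constantly:
# thousands of flushes per second, each one now an expensive intercept.
busy = flush_overhead_us(20_000, virtualized=True)

print(f"rare flusher: {rare:.0f} us/s, busy video driver: {busy:.0f} us/s")
```

With these made-up numbers, the "busy" driver burns roughly a full second of every second inside flush intercepts -- which lines up with the system-wide crawl described above.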

Microsoft's suggestion: Roll back to the stock Vga.sys or Vgapnp.sys generic video drivers that are included with Windows Server 2008, and uninstall your high-end video drivers.  The generic drivers avoid the allocation pattern that degrades performance, keeping your system stable and responsive.  Of course, without the high-performance drivers, a lot of other things won't work, and the system isn't of much use as a decent desktop.

My recommendation: If your Windows Server 2008 system has a good graphics adapter and splits its time between desktop and virtual lab duties, don't use Hyper-V.  Instead, use Virtual PC 2007, VMware Workstation or Server, Sun VirtualBox, or Parallels.

—————————————————————————————————————

Have a topic idea or question you’d like me to address or answer in a future post?  Email me directly right here at trfeedback@slowe.com.

About

Since 1994, Scott Lowe has been providing technology solutions to a variety of organizations. After spending 10 years in multiple CIO roles, Scott is now an independent consultant, blogger, author, owner of The 1610 Group, and a Senior IT Executive w...

6 comments
Alan Shortall

Well, if it has already been tested that during installation of Hyper V causes the computer to go into an almost stand still, then I am having second thoughts in installing it. I would not want my computer to crawl and get boring. - Unilife Alan Shortall

lastchip

VMware Server has the same (or similar) problem. It doesn't like Nvidia drivers on a Linux box. Strangely, VirtualBox doesn't seem to care!

Vandy-SJ

If I choose a platform with high-end graphics interface but install Hyper-V using 2008 Server Core (64-bit) as the host OS, does the same video performance problem occur if my desktop is running as a virtual machine - either 2008 Server (64-bit) or Vista (64-bit)?

bjswm

What happens if you use a different virtualization tool - such as vmware, virtualbox, xen, etc.?

seanferd

Why install as Server Core with a high-end graphics card? Server Core won't support anything but server roles and has nearly no GUI to speak of. I think Hyper-V is installed by default in the core installation as well. The problem would be there nonetheless, assuming you could even install the graphics drivers in the first place.

AstroCreep

No matter the setup on the host/hypervisor, the virtual machine/guest won't recognize your video card the same way. For example, if your physical system (w/ Server 2008 Core) has a GeForce GTX280 that's fine and all, but any guest VMs running on that hardware will use whatever the Hypervisor 'tells' the VM it has. I haven't used Hypervisor yet much, but I remember that Virtual PC tells the guest VMs that it has an S3 Trio. You won't be able to install the nVidia drivers on that guest VM because the guest thinks it has an S3 Trio; it'll say you don't have the appropriate hardware.
