High-Performance Computing

The GPU-HPC combo finally comes of age



NVIDIA's virtualized GPU eliminates VDI graphics performance barriers

VDI isn't known for delivering high-end graphics with great performance, but NVIDIA's virtualized GPU is about to transform virtual desktop graphics.

Many IT shops virtualize only a portion of their desktops because VDI doesn't deliver the type of graphics performance that end users get from a PC. NVIDIA changed that this week with the first virtualized GPU.


The NVIDIA VGX platform is a major step forward for the virtual desktop industry because it allows IT to virtualize desktops and apps that were previously off-limits and deliver them to low-cost thin clients and zero client devices, industry experts say.

"It takes VDI to the next level," said Gunnar Berger, a virtual desktop and applications analyst and blogger with Gartner, Inc.

Indeed, one of the biggest stumbling blocks for the desktop virtualization industry has been the inability to virtualize graphics-intensive applications, said Shannon Snowden, a virtualization advisor with New Age Technologies, an IT consultancy based in Louisville, Ky.

"It is a much tougher sell to organizations to introduce a completely different way to manage and deploy only a certain percentage of desktops, rather than having the capability to replace them all," Snowden said.

Solving VDI graphics performance problems

With virtual desktop infrastructure (VDI), graphics are typically rendered on the server CPU and then delivered to the end user. But rendering graphics is a parallel-computing problem, not the kind of serial problem CPUs are built to handle, so graphics performance has been subpar, said Jeff Brown, NVIDIA Corp.'s general manager.
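The parallel-versus-serial point can be made concrete with a small sketch. Everything here is invented for illustration (the function names and the toy gradient computation are not NVIDIA's rendering path): each pixel depends only on its own coordinates, so the work divides cleanly among workers, which is the shape of problem a GPU's many cores are built for.

```python
# Illustrative sketch: why pixel rendering parallelizes well.
# Each pixel's color depends only on its own coordinates, so the
# work can be split among many workers with no coordination.
from concurrent.futures import ThreadPoolExecutor

def shade_pixel(coords):
    """Toy per-pixel computation: a simple deterministic gradient."""
    x, y = coords
    return (x * 31 + y * 17) % 256

def render_serial(width, height):
    return [shade_pixel((x, y)) for y in range(height) for x in range(width)]

def render_parallel(width, height, workers=8):
    coords = [(x, y) for y in range(height) for x in range(width)]
    with ThreadPoolExecutor(workers) as pool:
        return list(pool.map(shade_pixel, coords))

# The two renderings agree pixel for pixel; only the scheduling differs.
assert render_serial(32, 32) == render_parallel(32, 32)
```

A CPU-bound serial loop and a fan-out to workers produce the same image; a GPU simply takes the fan-out to thousands of cores.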

By adding the virtualized GPU to the server, graphics processing bypasses the CPU and goes directly to the virtual GPU, then out to end users, Brown said. This means that graphics-intensive applications such as computer-aided design (CAD) can be delivered to remote employees using VDI -- with good performance.

With that, VDI shops should now be able to extend desktop virtualization to the types of desktops that weren't a good fit for virtualization before, including engineering desktops with resource-intensive apps.

"NVIDIA and other companies are certainly pushing [the percentage of applications that can be virtualized] higher," Snowden said. "Maybe one of these days it really will become the year of the virtual desktop."

The ability to virtualize more types of desktops translates into lower VDI costs, Berger said.

"In every VDI environment I have architected, Capex is never a play; it is always Opex," he said. "But if you can virtualize engineering desktops, you might be able to reduce the Capex."

The virtualized GPU can also be shared by many end users, lowering server hardware costs further.

GPU sharing has been available before, but scalability was limited. Citrix Systems Inc.'s XenApp HDX 3D, for example, supports up to 12 users per high-end graphics card.


With NVIDIA VGX, up to 100 virtual desktop users will be able to share a multi-GPU graphics card, improving user density on a single server, said NVIDIA partner Citrix in a blog post.
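To put the density claim in perspective, here is a quick back-of-the-envelope comparison. The per-card user counts (12 and 100) come from the article; the helper function and the 300-user example are ours:

```python
# Cards needed to serve a given user population at the two sharing
# ratios cited in the article: 12 users per high-end card versus
# up to 100 users per multi-GPU VGX board.
import math

def cards_needed(users, users_per_card):
    """Round up: a partially used card is still a card you must buy."""
    return math.ceil(users / users_per_card)

# Example: 300 virtual desktop users.
print(cards_needed(300, 12))   # 25 cards at 12 users each
print(cards_needed(300, 100))  # 3 boards at 100 users each
```

Roughly an eight-fold reduction in graphics hardware for the same user count, which is where the server-density and Capex arguments come from.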

In addition, the CPU offload benefit allows IT to add more end users per core.

"This way, the user gets a real [PC] experience, and from an IT perspective, you get CPU cycles back," NVIDIA's Brown said.

Another possibility that "could create big waves in the industry," Gartner's Berger said, is that VGX streams video using H.264 -- a different channel than the standard remote desktop protocols.

"You have this new [VGX] card, and you think, 'Great, now I can use CAD with VDI, but this could lead to a new shift in protocol,'" Berger said. "It is cross-platform, and [replacing protocols] could be a possibility in the future."

The nitty-gritty of virtualized GPU

NVIDIA VGX is based on three technologies:

  • VGX boards, which combine GPUs with the VGX hypervisor and can host a large number of users. The first NVIDIA VGX board is configured with four GPUs and 16 GB of memory, and it fits into the industry-standard PCI Express slot in servers.
  • NVIDIA VGX GPU Hypervisor, a software layer that integrates into server hypervisors such as Citrix XenServer.
  • NVIDIA User Selectable Machines, a manageability option that allows IT to configure the graphics capabilities delivered to individual end users on the network, based on their needs. IT can deliver a standard native PC experience or enhanced professional 3-D design and engineering capabilities with NVIDIA Quadro or NVIDIA NVS GPUs.

The GPU includes a memory management unit, and the virtual machine manager sees it as a virtual device. The memory management unit allows the GPU to be divided into dedicated channels per virtual machine.
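As a rough mental model only (this is not NVIDIA's actual interface, and the class and method names are invented), the per-VM channel idea can be sketched as a GPU whose memory is handed out in dedicated, isolated slices:

```python
# Toy model of a GPU memory management unit carving one physical
# board into dedicated per-VM channels: each VM gets its own slice
# of GPU memory, and the slices cannot oversubscribe the board.

class VirtualGpu:
    def __init__(self, total_memory_gb):
        self.total = total_memory_gb
        self.channels = {}  # vm_id -> allocated GB

    def allocated(self):
        return sum(self.channels.values())

    def attach_vm(self, vm_id, memory_gb):
        """Grant a VM a dedicated channel, refusing oversubscription."""
        if self.allocated() + memory_gb > self.total:
            raise MemoryError("GPU memory exhausted")
        self.channels[vm_id] = memory_gb

    def detach_vm(self, vm_id):
        """Release a VM's channel so its memory can be reassigned."""
        self.channels.pop(vm_id, None)

# A 16 GB board (the first VGX board's configuration) shared by VMs:
gpu = VirtualGpu(16)
gpu.attach_vm("vm-1", 4)
gpu.attach_vm("vm-2", 4)
```

The point of the dedicated channels is isolation: one virtual machine's graphics workload cannot consume memory assigned to another.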

IT shops can also use the VGX platform in heterogeneous hypervisor stacks. It will be part of initial hypervisor builds and will roll out in hypervisor patches as new features or bug fixes arrive, Brown said.

As of now, only Citrix XenServer officially supports VGX. Citrix XenDesktop HDX 3D Pro with VGX support is due in June as part of XenDesktop 5.6 Feature Pack 1.

NVIDIA is working on a certification program to gain more VGX original equipment manufacturers. The company expects VGX platform-based products to be available by the end of the year.

VGX "will be nominally priced," according to NVIDIA, though pricing and licensing terms were not available.

Let us know what you think about the story; email Bridget Botelho or follow @BridgetBotelho on Twitter.


