GPU vs. CPU: Three ways to run graphics-heavy apps on virtual desktops

It's a pain trying to accommodate graphics-intensive applications on a virtual desktop, but don't despair. There are ways to deliver these apps fast.

Some applications just don't play well with VDI.

In part one of this series on graphics-intensive applications, I explained the difficulties of giving virtual desktops to people, such as artists and designers, who work with 3-D graphics and video rather than words and numbers. In part two, let's look at the issues involved in getting those pixels to the desktop screen efficiently.

To understand the problem with graphics-heavy applications, you need to know how a graphics processing unit (GPU) differs from a central processing unit (CPU). Then I'll cover some solutions for providing GPU power to the applications that need it.

Understanding GPU vs. CPU

A GPU is a specialized processor that uses parallel processing to calculate many elements of an image at once, whereas a CPU works through its calculations serially, a few at a time. In a desktop PC, 3-D rendering is handled by a dedicated GPU on a graphics card, and high-performance 3-D applications need a high-performance GPU.
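
To make the contrast concrete, here is a minimal Python sketch of the same per-pixel brightness adjustment done two ways: serially, the way a single CPU core grinds through it, and as one bulk operation applied to every pixel at once. (NumPy's vectorized call merely stands in for the thousands of GPU cores; the frame data and brightness factor are made up for illustration.)

```python
import numpy as np

# A fake 1080p grayscale frame: one brightness value per pixel.
frame = np.random.rand(1080, 1920)

# CPU-style: visit each of the ~2 million pixels one after another.
def brighten_serial(img, factor=1.2):
    out = img.copy()
    for y in range(img.shape[0]):
        for x in range(img.shape[1]):
            out[y, x] = min(img[y, x] * factor, 1.0)
    return out

# GPU-style: one operation applied to every pixel "at once".
# (NumPy's vectorization stands in here for the GPU's parallel cores.)
def brighten_parallel(img, factor=1.2):
    return np.minimum(img * factor, 1.0)
```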

More and more applications require 3-D graphics today. For instance, Windows 7's Aero Glass interface requires a GPU, and Internet Explorer recently gained the ability to use the GPU alongside the CPU for better performance than the CPU alone can deliver. In addition, video production and computer-aided design (CAD) applications have spread into more organizations, making 3-D graphics a common requirement among customers deploying virtual desktop infrastructure (VDI).

The problem is that most VDI environments use virtual machines (VMs) on a shared virtualization host, but VMs haven't traditionally been able to use a GPU. Similarly, remote desktop sessions have been unable to use one. Those two gaps often rule 3-D workers out of the VDI equation.

Dedicated hardware can help

One way to get around this problem is to give each user a dedicated rack-mount or blade PC. Each machine provides its user with a dedicated GPU (along with a dedicated CPU, disk and RAM). This setup isolates users from one another and guarantees each one plenty of resources. Essentially, it places the user's workstation in the data center and simply remotes the display.

However, this approach to GPU applications can be expensive. Rack and blade workstations cost at least as much as their desktop counterparts, and you would need to buy one for each user. The workstations are used by only one user at a time, so there’s also the potential for a lot of wasted resources. For this to be a viable approach, you should have a very good reason for needing user isolation.
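
Some back-of-the-envelope arithmetic illustrates the gap. Every figure below is a hypothetical placeholder rather than a real price; the point is the shape of the comparison, not the numbers:

```python
# All prices are hypothetical placeholders, for illustration only.
users = 50
blade_cost = 2500                     # one dedicated blade per user
dedicated_total = blade_cost * users  # 50 blades: $125,000

# The same 50 users on shared virtualization hosts.
host_cost = 15000                        # hypothetical GPU-equipped host
hosts_needed = 2                         # assuming 25 users per host
shared_total = host_cost * hosts_needed  # $30,000

print(f"Dedicated: ${dedicated_total:,} (${dedicated_total // users:,} per user)")
print(f"Shared:    ${shared_total:,} (${shared_total // users:,} per user)")
```

And because each blade sits idle whenever its one user isn't working, the dedicated approach wastes capacity on top of costing more up front.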

What about shared hardware?

A while ago, Citrix's server virtualization hypervisor, XenServer, gained the ability to pass a PCI-e GPU card through to a VM. This feature allows 3-D graphics acceleration on as many VMs as there are PCI-e GPU cards in your virtualization hosts. Realistically, that means a handful of users per physical host -- a big step up from one user per host, but a long way from the 100 users per host you could achieve with users who seldom need 3-D applications.

This approach suits full-time 3-D users who do not need the level of isolation that dedicated hardware provides. VMware has announced an alliance with NVIDIA to bring the same capability to its VDI product, although there doesn't appear to be a shipping product yet -- perhaps it has been overtaken by the virtualized hardware approach.

Advancements in virtualized hardware

Over the last few years, a lot of engineering effort has gone into shared GPU architectures for VMs and Remote Desktop Services. Microsoft's RemoteFX display protocol allows multiple users to share a GPU on a Remote Desktop Session Host (Terminal Services) or a Remote Desktop Virtualization Host (VM-based VDI).

With multiple users per PCI-e GPU card, you can scale to many more users per physical server. This lets you run mainstream 3-D graphics, such as user interfaces, and support occasionally used 3-D applications without dedicated hardware. Both Citrix and VMware are working to bring this capability into their VDI products as well.
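
A rough density comparison makes the progression clear. The per-card session count below is an assumption for illustration -- the real figure depends on the card and the workload:

```python
# Hypothetical host with 4 PCI-e slots available for GPU cards.
gpu_cards_per_host = 4

# Dedicated hardware: one whole machine per user.
dedicated_users = 1

# PCI-e passthrough: each card is handed whole to exactly one VM.
passthrough_users = gpu_cards_per_host  # 4 users per host

# Virtualized (shared) GPU: assume each card can serve 16 light
# 3-D sessions -- an illustrative number, not a vendor spec.
sessions_per_card = 16
shared_users = gpu_cards_per_host * sessions_per_card  # 64 users per host

print(f"Dedicated:   {dedicated_users} user per box")
print(f"Passthrough: {passthrough_users} users per host")
print(f"Shared GPU:  {shared_users} users per host")
```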

CPU still matters

All these solutions to the graphics problem are about getting a GPU into a virtual desktop so it can draw screen elements fast. However, the job of determining which 3-D elements need to be drawn still sits with the CPU -- as does much of the math that CAD applications need to do. As a result, a lot of users running 3-D applications on a virtualization host may experience CPU bottlenecks even though the GPU eliminates display-rendering bottlenecks.

That means that 3-D applications are likely to reduce the average user count per host, because they generate heavy CPU load and typically also require a lot of RAM.
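
In other words, the achievable user count is set by whichever resource runs out first, not by the GPU alone. Here is a simple sizing sketch, with host capacity and per-user demands assumed purely for illustration:

```python
# Hypothetical host capacity.
host = {"cpu_cores": 32, "ram_gb": 256, "gpu_sessions": 64}

# Assumed per-user demand for a 3-D (CAD-style) workload.
per_user = {"cpu_cores": 2, "ram_gb": 8, "gpu_sessions": 1}

# The bottleneck resource sets the real user count.
limits = {res: host[res] // per_user[res] for res in host}
users_per_host = min(limits.values())

print(limits)  # {'cpu_cores': 16, 'ram_gb': 32, 'gpu_sessions': 64}
print(f"Users per host: {users_per_host}  (bottleneck: CPU)")
```

Even with GPU capacity for 64 sessions, this hypothetical host tops out at 16 users because the CPU runs out first.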

It's harder to satisfy users who have graphics-intensive applications with VDI than those with normal office duties, but with a good design there are ways to accommodate many types of desktop requirements.

This was last published in December 2012

Join the conversation

12 comments

What do you think is the best way to assist GPU-heavy application rendering?

Virtualized cloud processing frees processor-heavy functionality to reach a greater range of user interfaces.

Graphics-intensive software has always required capable hardware. I don't know how one changes that fact.

If the CPU is a heavy giant stone, the GPU is like sand: it can make a difference once we know how to cement it together and build with it, given suitable support from the hardware. How to get all the grains of sand (GPUs) to work together is up to the designer. I believe anyone who can work faster will eventually overtake the slower but smarter person or machine when learning is constant.

I don't believe in virtualized hardware, as in the end we'll need to use physical hardware... But we do need to deliver the best performance; just try watching a YouTube video in full screen...

Use IDV!

The problem with dedicated or shared hardware is that hardware is subject to Moore's Law and will need to be replaced frequently to keep the system current with evolving technology.

Develop a better codec than MPEG pixel-frame streaming, one designed for 3-D parametric rendering.

Currently, GPU solutions for virtualized environments are not cost-effective for most implementations. We are probably two years away from the right blend of technology at the right price.

Hardware assistance will remove reliance on the hypervisor's CPU.

Does HDX 3D Pro support CPU-based rendering? We cannot add a new GPU card to the host server, and we need to play web-based Unity3D games on virtual machines, which currently fail with a 3-D rendering error saying DirectX is not available. These games also require the Unity Web Player to be installed. Is there an option for CPU-based rendering, and would it help?

What is the difference between GPU-based compression and CPU-based compression? Is CPU-based compression good for online 3-D games?
