Everything you need to know about GPU virtualization
A comprehensive collection of articles, videos and more, hand-picked by our editors
GPU virtualization got a lot of attention in 2013, but organizations should take a hard look at their graphics needs before investing in the technology.
Virtualized graphics processing unit (GPU) cards offload graphics processing from the server's CPUs to dedicated graphics hardware, which improves application performance and brings virtual desktop infrastructure (VDI) into reach for more types of users. With all the hype surrounding this relatively new technology, it can be difficult for IT shops to determine whether they will actually benefit from it.
"In the use cases where GPU virtualization will make a difference, the hype is probably understated," said Jeff Wilhelm, chief technology officer (CTO) of Pawtucket, R.I.-based Envision Technology Advisors. "Everywhere else, it's probably very overstated."
The "sweet spot" for virtualized GPU adoption is among companies in the manufacturing, higher education, architecture and engineering industries, according to Nvidia Corp., the primary provider of virtualized GPU technology. Those organizations tend to require three-dimensional, video-intensive or gaming apps, as well as computer-aided design applications, which usually won't perform well in a VDI environment unless there is some kind of processing offload in place.
Graphics offload vs. protocol offload
Not everyone can solve performance issues by offloading workloads to the GPU, however. Some companies might think they need graphics offload when what they really need is protocol offload, Wilhelm said.
"People often speak about video offload broadly, although it's broken into two categories: protocol offload and graphics offload," he said. "The use case for each is dependent upon the workloads in the environment."
For instance, in an "e-learning" scenario with video-based lessons and multimedia presentations, a Teradici Hardware Accelerator card could offload most of the CPU and allow for about double the virtual machine density per host, Wilhelm said. Those cards reduce CPU overhead by offloading the processing of PC over IP (PCoIP) session rendering to a dedicated hardware board or mezzanine card.
While protocol offload might be enough to improve performance in that situation, a more demanding user doing photo-editing or 3-D rendering would benefit most from GPU offload.
The problem for IT is that protocol and graphics problems tend to present similar symptoms -- high CPU utilization and slow application performance -- to the end user, so it's tricky to diagnose what you need.
Florida Atlantic University's IT department uses a combination of GPU and CPU offload to get the best performance from the 3-D apps its students access for video editing, engineering and game programming classes. The department installed Nvidia K1 and K2 boards in a VMware environment, which allows 8 to 12 users to share GPU cores. It also gets about a 5% to 10% CPU reduction by offloading the PCoIP protocol with Teradici's Hardware Accelerator cards, according to Mahesh Neelakanta, director of technical services at FAU.
"It allows us to get better performance because the PCoIP protocol is being compressed and executed in the hardware," he said.
Consider GPU virtualization costs
Determining your offload needs before buying virtualized GPUs is especially important because GPU cards aren't free. Nvidia's K1 board costs about $2,775, and the higher-end K2 board comes closer to $4,000. Since a K1 board carries four GPUs and each one can support eight VDI sessions, installing a K1 board works out to roughly $87 per user at full density, before factoring in the server and virtualization platform in place.
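The per-user figure is simple division of the board price across the maximum number of concurrent sessions; a minimal sketch of that back-of-the-envelope math, using the figures cited above (the helper name is illustrative, and server and hypervisor costs are deliberately excluded):

```python
# Back-of-the-envelope per-user cost for a shared GPU board.
# Figures from the article: an Nvidia K1 board costs about $2,775,
# carries 4 GPUs, and each GPU supports up to 8 VDI sessions.

def cost_per_user(board_price: float, gpus_per_board: int, sessions_per_gpu: int) -> float:
    """Divide the board price across the maximum concurrent VDI sessions."""
    max_sessions = gpus_per_board * sessions_per_gpu
    return board_price / max_sessions

k1 = cost_per_user(board_price=2775, gpus_per_board=4, sessions_per_gpu=8)
print(f"~${k1:.0f} per user at full density")  # prints "~$87 per user at full density"
```

Actual per-user cost will be higher in practice, since few deployments run every GPU at its maximum session count around the clock.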
"It's definitely more expensive," Neelakanta said. "It's not a cheaper solution; it's a more flexible solution."
For FAU, that flexibility comes from the fact that users can work remotely, the boards can support more users at once, and IT could cut down on its physical footprint -- which also means less maintenance.
For other organizations, the cost comes out pretty even. Check-6 Inc., a software and services provider for the oil and gas industry, based in Tulsa, Okla., offers simulation-based training applications that use gaming technology. With Nvidia K1 GPUs running Citrix XenDesktop, the company centralized images and saw faster login times for its app developers accessing the applications from all over the U.S., according to Joshua "Doc" Lewis, CTO for Check-6's Training Systems division.
The cost of implementing GPU virtualization was about equal to buying a new laptop for all the developers that needed one, he said.
Still, there are plenty of IT shops that don't have graphics requirements at all -- call centers, for example, or any environment staffed by task workers with light computing needs. Mitch Crane, chief engineer at Desktop as a Service provider tuCloud Inc., said he has seen customers who were told by a graphics virtualization provider that they need GPU virtualization for Microsoft Office and Excel users, leaving the companies scratching their heads.
"If you're just running Office apps … you can't justify buying something like Nvidia," Crane said. "The amount of server spend that would go up based on that would really not be justified."
Those organizations have no need to jump on the virtualized GPU bandwagon -- yet.
But changes in app development and user expectations could mean more use cases for GPU virtualization down the road, an Nvidia spokesperson said.
"Companies may not need it today, but application developers are constantly adding new features and functionality to their applications," he said. "To deliver those rich experiences, you need a graphics processor to do the heavy lifting."
That could be true as more companies move from older versions of Windows to Windows 7 and 8, where graphics are more central to the user experience.