GPU use in VDI deployments is gaining traction thanks to technological advances, such as the ability to share a GPU among multiple virtual desktops, and to emerging use cases made practical by the CPU resources that GPUs free up.
In the early days of VDI, IT administrators used GPUs as a way to appease power users who would otherwise reject VDI. At the time, virtual desktops were generally a poor choice for users who spent their days working with graphically intensive applications, such as computer-aided design. Dedicating a physical GPU to a user's virtual desktop made running such applications more practical.
As of 2018, organizations have started using GPUs with general-purpose virtual desktops, too. In fact, provisioning virtual desktops with GPU resources has become the norm.
Factors affecting GPU adoption
One reason GPU use is more common than it once was is that administrators can now share GPU hardware among multiple virtual desktops. At one time, provisioning a virtual desktop with GPU resources meant dedicating a physical graphics card to the VM. If a host server had three physical GPUs, then it could support three GPU-enabled virtual desktops. Furthermore, the use of dedicated hardware meant that GPU-enabled virtual desktops typically couldn't be migrated to another host.
Now, not only can admins share GPU resources with multiple virtual desktops, but it's also no longer necessary to dedicate an entire GPU to a single virtual desktop, although some users might still benefit from doing so.
Nvidia provides virtual GPU software that works with the hypervisor to enable virtual desktops to share GPU resources. Hypervisor vendors have developed similar GPU-sharing technology. For example, VMware provides this ability with its Virtual Shared Graphics Acceleration feature, and Microsoft's RemoteFX vGPU makes GPU resources available across multiple VMs.
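GPU-sharing schemes like these typically partition each card's framebuffer into fixed-size slices, which changes the capacity math from one desktop per card to several per card. Here is a minimal sketch of that arithmetic; the function names and the GPU counts, framebuffer sizes and slice sizes are illustrative assumptions, not vendor specifications.

```python
# Illustrative capacity math for GPU-backed virtual desktops.
# All numbers below are assumptions for the example, not vendor specs.

def passthrough_capacity(physical_gpus: int) -> int:
    """With dedicated (passthrough) GPUs, each desktop consumes a whole card."""
    return physical_gpus

def shared_capacity(physical_gpus: int, framebuffer_gb: int, slice_gb: int) -> int:
    """With GPU sharing, each card is partitioned into fixed framebuffer slices."""
    return physical_gpus * (framebuffer_gb // slice_gb)

# A hypothetical host with three cards, each with 16 GB of framebuffer:
print(passthrough_capacity(3))    # 3 GPU-enabled desktops
print(shared_capacity(3, 16, 2))  # 24 desktops on 2 GB slices
```

The same three cards that supported three desktops under passthrough can, in this illustration, back 24 general-purpose desktops once the hardware is shared.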
GPUs also offload much of the graphical workload from the CPU, thereby freeing up CPU resources that would have otherwise been consumed by display rendering. Under the right circumstances, GPU use might actually help an organization increase its virtual desktop density.
Another factor driving GPU use is that guest OSes and applications have become far more graphically intensive over time. For example, Windows 10 uses animations, shading effects and other graphical elements that a GPU can render quickly.