GPU use in VDI deployments is gaining traction thanks to a number of technological advancements and emerging use cases, such as the ability to share GPUs and free up CPU resources.
In the early days of VDI, IT administrators used GPUs as a way to appease power users who would otherwise reject VDI. At the time, virtual desktops were generally a poor choice for users who spent their days working with graphically intensive applications, such as computer-aided design. Dedicating a physical GPU to a user's virtual desktop made running such applications more practical.
As of 2018, organizations have started using GPUs with general-purpose virtual desktops, too. In fact, provisioning virtual desktops with GPU resources has become the norm.
Factors affecting GPU adoption
One reason GPU use has become more common is that administrators can now share GPU hardware among multiple virtual desktops. At one time, provisioning a virtual desktop with GPU resources meant dedicating a physical graphics card to the VM. If a host server had three physical GPUs, then it could support only three GPU-enabled virtual desktops. Furthermore, the use of dedicated hardware typically meant that GPU-enabled virtual desktops couldn't be migrated to another host.
Now, not only can admins share GPU resources with multiple virtual desktops, but it's also no longer necessary to dedicate an entire GPU to a single virtual desktop, although some users might still benefit from doing so.
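The capacity difference between dedicating a physical GPU to each desktop and carving GPUs into shared profiles can be sketched with back-of-the-envelope arithmetic. The host configuration, framebuffer size and per-desktop profile size below are illustrative assumptions, not figures from any particular product:

```python
# Rough capacity comparison: dedicated GPU pass-through vs. shared
# GPU profiles. All numbers are illustrative assumptions.

PHYSICAL_GPUS_PER_HOST = 3    # assumed number of GPUs in the host
FRAMEBUFFER_PER_GPU_GB = 16   # assumed GPU memory per card
PROFILE_SIZE_GB = 2           # assumed framebuffer slice per desktop

# Dedicated pass-through: one desktop per physical GPU.
dedicated_capacity = PHYSICAL_GPUS_PER_HOST

# Shared model: each GPU's framebuffer is divided into fixed-size slices,
# and each slice backs one GPU-enabled virtual desktop.
shared_capacity = PHYSICAL_GPUS_PER_HOST * (
    FRAMEBUFFER_PER_GPU_GB // PROFILE_SIZE_GB
)

print(dedicated_capacity)  # 3 GPU-enabled desktops
print(shared_capacity)     # 24 GPU-enabled desktops
```

Under these assumed numbers, sharing raises the GPU-enabled desktop count per host from three to 24; real-world density depends on the profile sizes and scheduling behavior of the specific GPU-virtualization product in use.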
Nvidia provides virtual GPU software that works with the hypervisor to enable virtual desktops to share GPU resources. Similarly, hypervisor vendors have developed their own GPU-sharing technology. For example, VMware provides this ability with its Virtual Shared Graphics Acceleration feature, and Microsoft's RemoteFX vGPU makes GPU resources available across multiple VMs.
GPUs also offload much of the graphical workload from the CPU, thereby freeing up CPU resources that would have otherwise been consumed by display rendering. Under the right circumstances, GPU use might actually help an organization increase its virtual desktop density.
Another factor driving GPU use is that guest OSes and applications have become far more graphically intensive over time. For example, Windows 10 uses animations, shading effects and other graphical elements that a GPU can render quickly.