Desktop virtualization can be difficult to implement and does not always save money, so get to know the virtualization challenges you're facing before you dive in.
The methods in place today for image-based OS deployment, patching and personalization using Windows local profiles are simple, well-known and have changed little over the years. Desktop virtualization, on the other hand, increases the complexity of delivering Windows desktops and applications.
Fortunately, advancements in the virtualization industry, such as virtualized graphics processing units (GPUs) and workspace management, can help IT overcome some of these challenges.
Moving to a nonpersistent delivery model to reduce total cost of ownership increases virtual desktop complexity. For desktop virtualization to perform as intended, many technology layers have to work in harmony. Plus, the use of application virtualization in an effort to simplify application delivery can create its own problems, ranging from slow performance and app crashes to communication problems between virtualized and nonvirtualized applications.
Lack of personalization can lead users to refuse nonpersistent virtual desktops because they don't behave the way their physical desktops did. Adding products to properly capture personalization data further increases the complexity and cost of desktop virtualization.
For example, desktop virtualization can involve the following technology stack: storage, compute, hypervisor, virtual desktop operating system, virtual desktop applications, virtual desktop personalization, broker, remoting protocol, endpoint operating system, peripherals, endpoint applications and endpoint user personalization. All of these elements should work together smoothly, and different IT teams may be responsible for each of these layers, further increasing complexity.
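To make the dependency problem concrete, the stack above can be modeled as an ordered chain of layers in which a failure anywhere breaks desktop delivery end to end. The layer names below come from the article; the health-check logic itself is a hypothetical illustration, not any vendor's API:

```python
# Illustrative model of the desktop virtualization stack described above.
# A session works only if every layer in the chain is healthy.
STACK_LAYERS = [
    "storage", "compute", "hypervisor",
    "virtual desktop OS", "virtual desktop applications",
    "virtual desktop personalization", "broker", "remoting protocol",
    "endpoint OS", "peripherals", "endpoint applications",
    "endpoint user personalization",
]

def first_failing_layer(layer_status):
    """Return the first unhealthy layer, or None if the whole chain works."""
    for layer in STACK_LAYERS:
        if not layer_status.get(layer, False):
            return layer
    return None

# Example: one misbehaving layer (the remoting protocol) breaks delivery,
# even though the other eleven layers -- often owned by different IT
# teams -- are all fine.
status = {layer: True for layer in STACK_LAYERS}
status["remoting protocol"] = False
print(first_failing_layer(status))  # -> remoting protocol
```

The point of the sketch is that troubleshooting is serial: each team can report its own layer green while the user still has a broken desktop.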
Most desktop virtualization setups today move compute resources from the endpoint to the data center. Companies are trading the relatively low cost of a PC for a high-powered server that is stored in a power- and temperature-controlled data center.
In most deployments, enterprises still use shared storage, and this component alone reportedly accounts for about 40% of the cost of a desktop virtualization system on average. Most companies now expect capital expenses for desktop virtualization to roughly match those of a traditional PC deployment.
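As a back-of-the-envelope check, the roughly 40% storage share means the storage line item dominates the budget quickly. Only that share comes from the article; the dollar figures below are hypothetical:

```python
# Hypothetical Capex split: only the ~40% storage share is from the article.
def storage_capex(total_capex_per_desktop, storage_share=0.40):
    """Estimated shared-storage cost per virtual desktop."""
    return total_capex_per_desktop * storage_share

# If a deployment budgets $1,000 of Capex per virtual desktop,
# roughly $400 of it goes to shared storage alone.
print(storage_capex(1000))  # -> 400.0
```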
Vendors have made many promises about the Opex savings of desktop virtualization. These savings are also usually based on a nonpersistent desktop delivery model, the most difficult delivery model to achieve. Replacing lower-salary desktop technicians with tenured, highly skilled desktop virtualization administrators can erase some of those Opex savings.
In addition, the user experience is not easy for IT shops to predict and measure. Moving the desktop or applications from the endpoint and into the data center can undermine the user experience. Poor LAN or WAN conditions can harm the experience of remoting desktops and applications. Applications that used to run entirely locally using dedicated resources can be less responsive in some cases when centralized in the data center. Applications that require offline access can complicate the design of a virtual desktop infrastructure.
Some of these challenges may be mitigated by new technology and advances in the desktop virtualization industry. Here are a few things to look forward to in the future of desktop virtualization.
Future of desktop virtualization
Looking ahead at the future of desktop virtualization, the increasing use of NAND flash memory for storage not only reduces the cost of deploying server-hosted virtual desktops, but also improves customers' success rates. The IOPS that even small amounts of NAND flash can deliver reduce the likelihood that an undersized disk infrastructure will cause performance problems. As this technology becomes less expensive to deploy, the performance and overall cost of desktop virtualization should improve.
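The sizing argument behind that claim can be sketched as simple arithmetic: a boot or login "storm" multiplies per-desktop IOPS, and a single flash device can absorb a peak that would require a large spinning-disk array. All numbers here are illustrative assumptions, not vendor specifications:

```python
# Illustrative IOPS sizing; every figure below is an assumption for the
# example, not a measured or vendor-published value.
def required_iops(desktops, steady_iops_per_desktop=10, storm_multiplier=5):
    """Peak IOPS needed during a boot/login storm."""
    return desktops * steady_iops_per_desktop * storm_multiplier

HDD_IOPS = 150      # rough figure for one 10K spindle (assumption)
SSD_IOPS = 50_000   # rough figure for one enterprise NAND device (assumption)

peak = required_iops(500)   # 500 desktops -> 25,000 peak IOPS
print(peak)                 # -> 25000
print(peak / HDD_IOPS)      # ~167 spindles to keep up
print(peak / SSD_IOPS)      # less than one flash device, on paper
```

Under these assumed numbers, an array sized only for steady-state load would be undersized by a factor of five at peak, which is exactly the failure mode the article says NAND flash mitigates.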
Virtualized GPU. In the next few years, enterprises will likely use physical GPUs installed into VDI hosts to provide virtual GPUs (vGPUs) to the virtual desktops running on them. VGPU technology could deliver near-native user experiences with video while offloading more of the workload to the endpoint, further increasing scalability.
VGPUs would allow a virtual machine to use physical GPU resources the same way a physical desktop does. Video rendering and encoding could be done in hardware rather than on the CPU, as it is today. In addition, vGPUs could transcode video to the H.264 standard in hardware, allowing decoding to be offloaded to the client and making more efficient use of bandwidth by reducing bit rates.
User workspace management. Desktop virtualization offerings are evolving to include native application delivery, mobile application management, mobile device management, externally hosted Software as a Service (SaaS) applications and "follow-me data." Follow-me data lets users sync their files across platforms and devices for ubiquitous access to their data, commonly referred to as a "Dropbox-style" approach.
As vendors add these complementary technologies to virtual desktop products, the workspace is shifting from the Windows desktop to a vendor-provided workspace that aggregates these services into a single portal. With multiple devices and an increasing number of non-Windows devices in consumer hands, the market is shifting to support heterogeneous endpoints. IT should prepare for a future in which workers gain access to internal and external applications and data, regardless of device, using brokered authentication.
Layering and personalization. IT can use desktop virtualization to rapidly provision department- and user-installed applications. Personalization can provide user settings, preferences and so on across platforms, and it can be applied to any device on demand. These layers of technology can reduce the complexity and cost of using nonpersistent Windows systems.
A combination of these developing desktop virtualization technologies will allow the user experience to approach that of a native local experience or even exceed it. The more seamless the delivery of desktops, apps and data is, the more likely it is that users will accept virtualization and that IT will be successful in managing it.
Dan Brinkmann asks:
Which of these desktop virtualization challenges trips you up the most?