How to quantify VDI user experience

IT can take steps to quantify the virtual desktop user experience by recording data on factors such as latency and image quality, as well as conducting user surveys.

VDI user experience is ultimately subjective because it comes down to the user's emotions, attitudes and perceptions. Even so, IT professionals must be able to measure it in some way.

A combination of specific metrics and detailed user feedback can help administrators quantify VDI user experience. This week, in a webinar hosted by BrianMadden.com and Team Remote Graphics Experts, attendees learned about these methods.

"Finding a way to measure user experience is very important," said Magnar Johnsen, owner of Firstpoint AS, an end-user computing consultancy in Norway, in a webinar session. "If the user experience is bad, it doesn't matter how expensive your solution is, how fancy it is -- users are going to hate it."

Quantify the subjective with hard numbers

The goal in every deployment should be to deliver a VDI user experience that at least matches physical desktop performance. Achieving this goal is challenging because a physical PC has dedicated resources, often an integrated or dedicated graphics processing unit (GPU), and limited end-user latency -- the time it takes for a user's physical action, such as a mouse click, to appear on the screen.

A virtual desktop, on the other hand, has to share resources with other desktops, often does not have a GPU and suffers from increased latency. The remote display protocol also tends to trade frame quality against bandwidth and CPU consumption.

To measure the VDI user experience, there are five key metrics for IT to focus on, said Erik Bohnhorst, performance engineering lead architect at Nvidia, in another webinar session.

Remoted frames. The more frames a user receives, the better the experience. Without enough frames, applications can run into issues, such as stuttering video. A rough way to sample the delivered frame rate from the client side is sketched below.

End-user latency. IT pros can home in on end-user latency to find out how long certain actions take to complete, which helps determine whether the experience lags. They can measure lag by simulating how a user interacts with a virtual desktop and timing the gap between the mouse click and the moment the result appears on screen; a timing sketch appears below.

Image quality. IT can measure image quality based on the difference between what the image should look like and what it actually looks like on the user's endpoint. Discrepancies occur when the remote display protocol has to change the image to save on other resources, such as bandwidth. To identify the difference, IT can use the structural similarity (SSIM) index -- a model for evaluating perceived image quality -- as demonstrated in a sketch below.

Functionality. Virtual desktops must be able to support the same range of apps as physical desktops.

Consistency. Users are generally happier with mediocre but consistent performance than with large fluctuations. It might be better to have five-second login times every day, for example, than one-second times on some days and 15-second times on others. The final sketch below shows one way to quantify that spread.
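One rough way to sample remoted frames is to watch how often the picture on the client endpoint actually changes. The following is a minimal sketch, not a tool from the webinar, assuming the third-party mss screen-capture package and that the script runs on the endpoint viewing the remote session. The capture loop itself caps the measurable rate, so treat the result as a floor rather than an exact frame rate.

```python
# Rough remoted-frames proxy: count visible frame changes per second on the client.
import time
import mss

SAMPLE_SECONDS = 5.0

with mss.mss() as screen:
    monitor = screen.monitors[1]            # the primary monitor
    previous = screen.grab(monitor).rgb     # raw RGB bytes of the current frame
    changes = 0
    start = time.perf_counter()
    while time.perf_counter() - start < SAMPLE_SECONDS:
        current = screen.grab(monitor).rgb
        if current != previous:             # any pixel difference counts as a new frame
            changes += 1
            previous = current
    print(f"~{changes / SAMPLE_SECONDS:.1f} visible frame updates per second")
```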
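The click-to-screen measurement Bohnhorst described can be approximated with an input-automation script. Below is a minimal sketch, assuming the third-party pyautogui package; the button coordinates and the watched screen region are hypothetical placeholders for a spot that visibly changes when the action completes.

```python
# End-user latency sketch: click, then time how long the screen takes to react.
import time
import pyautogui

REGION = (100, 100, 200, 50)  # hypothetical (left, top, width, height) to watch

def measure_click_latency(x, y, timeout=5.0):
    """Click at (x, y) and return seconds until the watched region changes."""
    before = pyautogui.screenshot(region=REGION).tobytes()
    start = time.perf_counter()
    pyautogui.click(x, y)
    while time.perf_counter() - start < timeout:
        if pyautogui.screenshot(region=REGION).tobytes() != before:
            return time.perf_counter() - start
    return None  # no visible change within the timeout

latency = measure_click_latency(150, 125)  # hypothetical button location
print(f"End-user latency: {latency:.3f}s" if latency else "No change detected")
```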
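An SSIM comparison along these lines can be run with off-the-shelf tooling. The sketch below uses scikit-image's structural_similarity function; the file names are hypothetical placeholders for a host-rendered reference frame and a client-side capture of the same frame, which must share the same resolution.

```python
# Image-quality sketch: score the received frame against the reference with SSIM.
from skimage.io import imread
from skimage.metrics import structural_similarity

reference = imread("reference.png", as_gray=True)  # what the image should look like
received = imread("received.png", as_gray=True)    # what the endpoint displayed

# SSIM ranges from -1 to 1; 1.0 means the frames are structurally identical.
# as_gray=True yields floats in [0, 1], hence data_range=1.0.
score = structural_similarity(reference, received, data_range=1.0)
print(f"SSIM: {score:.3f}")
```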
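Consistency, finally, can be summarized with basic statistics: the spread of a metric such as login time relative to its mean. The samples below are hypothetical, echoing the steady-versus-fluctuating login-time example above; a lower coefficient of variation means a steadier experience.

```python
# Consistency sketch: compare the spread of two hypothetical login-time series.
import statistics

consistent = [5.1, 4.9, 5.0, 5.2, 4.8]     # roughly five seconds every day
fluctuating = [1.2, 15.3, 1.1, 14.8, 1.0]  # fast some days, slow others

for name, samples in [("consistent", consistent), ("fluctuating", fluctuating)]:
    mean = statistics.mean(samples)
    stdev = statistics.stdev(samples)
    # Coefficient of variation: spread relative to the mean; lower is steadier.
    print(f"{name}: mean={mean:.1f}s stdev={stdev:.1f}s cv={stdev / mean:.2f}")
```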

There are several tools IT can use to track VDI user experience. One example is benchmarking software such as Remote End-User Experience (REX) Analytics, which captures and analyzes the perceived VDI user experience. REX Analytics is protocol-independent and does not require databases or separate management software.

Take user feedback into account

VDI administrators should also combine the technical metrics with subjective information gleaned from users about their virtual desktops.

"If the user thinks it's slow, they will complain even though you say, 'Here are the charts; it's very fast,'" said Rody Kossen, senior systems engineer at AWL-Techniek, in another session. "If the user still says it's slow, you fail."

To go beyond the metrics, Kossen's organization, a global company based in the Netherlands that builds welding robots for the automotive industry, surveyed its users about UX aspects such as screen quality and text readability. The survey helped users understand and articulate their problems, and it let IT compare what users reported with what the technical data showed.

Combining this information enabled Kossen and his team to distinguish among heavy 3D workloads, in which mechanics use applications to build 3D models of equipment, for example; text-only workloads, such as everyday office apps; and combinations of the two.

This was useful because each of the company's offices has varying needs for each type of workload, Kossen said. The Czech Republic office has one engineer who needs only 3D workloads, 34 mechanics who need 3D and text workloads, and 10 other office users who need text only. The company's China office, on the other hand, has eight engineers, 12 mechanics and 10 office users. With the survey information and technical metrics, IT could optimize what resources each user type needs and deliver different codecs, for example, to employees based on those needs.
