What kinds of applications are bad candidates for virtualization, and why don't they work well when virtualized?
The biggest obstacle to a positive experience when virtualizing applications is resource consumption. Applications or services that use a lot of resources -- particularly local resources such as RAM or display access -- are usually bad candidates for virtualization. It's easy to come up with some compelling examples, including:
Graphics-intensive applications. Applications that demand heavy graphics processing -- immersive high-definition games, computer-aided design (CAD) and 3-D ray tracing -- generally fail outright or run painfully slowly when virtualized. Even virtualizing such applications on the same host where the graphics cards and RAM reside tends to lengthen code paths and slow things down; moving them to a remote virtual machine only makes matters worse.
I/O-intensive applications. Virtualizing applications such as big production databases is generally not a good call either. They already consume large amounts of CPU cycles, RAM and IOPS, and the virtualization layer adds overhead, so they end up consuming more of those resources rather than less.
Resource demand is what distinguishes a good virtualization candidate from a bad one. So, how do you determine the demand from a given application? Monitor it carefully over a typical production cycle (including end-of-period, end-of-quarter or other peak-demand situations), and measure both peak loads and average use of CPU cycles, RAM and disk I/O. Performance monitoring tools can tell you what you need to know if you give them enough time to do their jobs properly.
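The peak-versus-average comparison described above can be sketched in a few lines of Python. This is a minimal illustration, not a monitoring tool: the metric names and sample readings are hypothetical stand-ins for data you would collect from a real performance monitor (for example, psutil on Linux or Windows Performance Monitor).

```python
# Minimal sketch: summarize peak and average utilization from periodic
# samples. In practice, the readings would come from a real monitoring
# tool collected over a full production cycle; the values below are
# hypothetical stand-ins.
from statistics import mean

def summarize(samples):
    """Return peak and average values for each metric across all samples."""
    metrics = samples[0].keys()
    return {m: {"peak": max(s[m] for s in samples),
                "avg": round(mean(s[m] for s in samples), 1)}
            for m in metrics}

# Hypothetical readings taken at intervals over a busy period:
# CPU %, RAM %, and disk I/O operations per second.
readings = [
    {"cpu_pct": 35, "ram_pct": 60, "iops": 800},
    {"cpu_pct": 92, "ram_pct": 71, "iops": 4500},  # end-of-quarter spike
    {"cpu_pct": 48, "ram_pct": 64, "iops": 1200},
]

summary = summarize(readings)
print(summary["cpu_pct"])  # peak 92 vs. average ~58.3
```

The point of separating peak from average is that the spike, not the average, decides whether an app virtualizes well: an application averaging 58% CPU looks fine until its 92% end-of-quarter peak is taken into account.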
In the end, an application is a reasonable candidate for virtualization if your testing and observation show it doesn't impose heavy resource loads. Steer clear of those apps that max out from time to time (or all the time). You'll be doing yourself -- and your users -- a big favor.