When you think about it, a PC has three core components: the desktop operating system (OS), the applications required by the user, and the user's data. The desktop OS is relatively easy to prepare and work with, especially if you are working with Vista, since it supports image-based installation and the concept of a single worldwide image per organization. In fact, the OS layer consists of no more than the operating system itself, updates and any core utilities you require on every desktop. This is easily captured into a single image that can be applied to any desktop, whether physical or virtual.
Where the desktop construction process becomes more complicated is when you begin to work with the applications you need to deploy to support user workloads. Most organizations have several types of PC users -- basic productivity users, knowledge workers and specialized workers, among others -- each of which will require both common and specialized applications. Installing applications on each system changes the very nature of the system, because applications affect the OS at a deep level, modifying both core files and the Windows registry. And each time you install one application, you can never be sure it won't affect another. "DLL Hell" -- the possibility of one application breaking another during installation -- is still a fact of life, despite all of the effort Microsoft and others have expended on resolving this issue.
Application virtualization addresses this problem through two key features. The first is application isolation. Applications are isolated from the operating system and from each other through a protection layer provided by the virtualization agent (see Figure 1). The desktop OS is protected from changes, and application components are isolated from each other even while in memory, yet they work together normally, as any Windows application would. That's because applications are not installed on the OS. Instead, you capture the application's running state -- everything you need to actually run the application -- and copy that to the system. This feature eliminates DLL Hell once and for all.
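The core idea behind that protection layer can be sketched in a few lines: the agent intercepts an application's writes to shared locations and diverts them into a per-application overlay, so the real system files are never touched. The class and paths below are hypothetical illustrations, not any vendor's actual agent:

```python
import os

class IsolationLayer:
    """Hypothetical sketch of the redirection idea behind application
    isolation: writes an app makes to shared locations are diverted into
    a per-application overlay, so the real OS files are never modified."""

    def __init__(self, app_name, overlay_root):
        self.overlay = os.path.join(overlay_root, app_name)
        os.makedirs(self.overlay, exist_ok=True)

    def _redirect(self, path):
        # Map an absolute system path to its private overlay copy.
        return os.path.join(self.overlay, path.lstrip("/"))

    def write(self, path, data):
        # Copy-on-write: the app believes it changed the system file,
        # but the change lands only in its own overlay.
        target = self._redirect(path)
        os.makedirs(os.path.dirname(target), exist_ok=True)
        with open(target, "w") as f:
            f.write(data)

    def read(self, path):
        # Prefer the app's overlay copy; fall back to the real file.
        target = self._redirect(path)
        source = target if os.path.exists(target) else path
        with open(source) as f:
            return f.read()
```

Because each application sees only its own overlay plus the untouched base OS, two applications can "modify" the same shared file without ever colliding -- which is the essence of how isolation sidesteps DLL Hell.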
Figure 1. Application virtualization protects the OS and other applications at all times.
The second feature provided by application virtualization is software streaming. While traditional applications are pushed to target systems, sometimes requiring long delivery and installation windows, streamed applications are pulled by users when they need them. The application can start as soon as enough bits have been downloaded. The pulled bits are cached locally, and streaming can continue in the background as the user works. When the user requires a new function from the application, more bits are pulled or are called from the local cache.
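The pull-cache-continue cycle described above can be sketched as follows. This is a minimal illustration of the streaming idea, not any streaming product's actual protocol; the class, the block granularity and the launch threshold are all assumptions made for the example:

```python
class StreamedApplication:
    """Hypothetical sketch of software streaming: blocks of an application
    package are pulled from a server only when first needed, then served
    from a local cache on every later request."""

    def __init__(self, fetch_block, launch_threshold=2):
        self.fetch_block = fetch_block      # callable: block_id -> bytes
        self.cache = {}                     # local block cache
        self.launch_threshold = launch_threshold

    def get_block(self, block_id):
        # Pull on demand: download a block only the first time it is used;
        # afterward it is served from the local cache.
        if block_id not in self.cache:
            self.cache[block_id] = self.fetch_block(block_id)
        return self.cache[block_id]

    def can_start(self):
        # The app can launch once a small startup subset has arrived,
        # long before the full package is local.
        return len(self.cache) >= self.launch_threshold

# Usage: a stand-in "server" that records which blocks were downloaded.
downloads = []
def server_fetch(block_id):
    downloads.append(block_id)
    return b"data for block %d" % block_id

app = StreamedApplication(server_fetch)
app.get_block(0)
app.get_block(1)   # startup blocks pulled; the app can now launch
app.get_block(0)   # served from cache, no second download
```

Note that asking for block 0 twice triggers only one download; that local cache is what lets streaming continue in the background without re-fetching what the user is already working with.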
Streaming transforms the traditional application delivery model. For one thing, applications are delivered in real time, only when the user actually needs them. Microsoft Word, for example, will begin to work with less than 200 KB downloaded. Second, only those applications the user actually needs are delivered. If a user is mostly a typist, for example, they rely mainly on Word and rarely need the rest of Microsoft Office -- yet with traditional delivery mechanisms, you deliver the entire suite regardless of actual need.
Application virtualization delivers on a promise that Microsoft has not been able to keep until now through traditional Windows application installation mechanisms: any application will work all the time on any system, physical or virtual. Taking advantage of this technology when -- or even before -- you move to VDI will reduce costs and improve overall application availability in your network.
For a discussion on how to work with applications, look up Chapter 6: Preparing Applications from the free Definitive Guide to Vista Migration.
Table of Contents
- Tip 1: Verify device support in your hypervisor
- Tip 2: Identify desktop virtualization audiences
- Tip 3: Prepare and protect user profiles before virtualizing your desktop
- Tip 4: Use application virtualization before moving to VDI
- Tip 5: Lock down systems by switching to a VDI technology
ABOUT THE AUTHORS:
Danielle Ruest and Nelson Ruest are IT professionals focused on technology futures. Both are passionate about virtualization and continuous service delivery. They are authors of multiple books, including Windows Server 2008: The Complete Reference (McGraw-Hill Osborne), which is focused on building virtual workloads with this powerful new OS. They are currently writing Virtualization, A Beginner's Guide (McGraw-Hill Osborne). They are also performing a multi-city tour on Virtualization in the U.S. Feel free to contact them at firstname.lastname@example.org for any comments or suggestions.
This was first published in December 2008