Virtualization may be one of the hottest trends in IT, but plenty of organizations haven't adopted the technology. Despite the potential benefits, management often views server and desktop virtualization as an IT toy rather than something that can improve the bottom line.
Whether virtualization saves money on hardware or systems management costs is open for debate. However, virtualization can lower electric bills. Of course, if you pitch virtual desktop infrastructure (VDI) to your manager on this premise, you had better be prepared to estimate how much money can really be saved.
But calculating the amount of money that virtualization can save on power bills can be a wildly speculative process. If you enter the phrase "virtualization power consumption savings" (or something similar) into Google, you will find links to countless websites offering free power-savings calculators. However, these calculators have several problems.
First, many of the power-savings calculators are offered by organizations with agendas, such as virtualization product vendors or environmental groups. These conflicts of interest make the results that such calculators provide suspect.
Furthermore, power-consumption calculators are notoriously inaccurate. If you enter the same information into five calculators, you are almost certain to get five different answers. In some cases, the results won't even be close. For example, when I tried to calculate the savings of taking one server off of my network, one calculator predicted that I would save about $8 per year, while another estimated the savings to be closer to $300 per year. Several factors make it difficult to accurately predict the amount of power a computer will actually consume.
Determining power consumption in your organization
Calculating power consumption seems easy at first. Every computer has a power supply rated at a certain wattage. Multiply the number of watts a device consumes by the number of hours it is used, divide the answer by 1,000, and you get the number of kilowatt-hours used. Since the price of electricity is based on kilowatt-hours, you can translate this result directly into cost. For example, I have a computer with an 800-watt power supply that I run 24 hours a day, so in a year the server is powered on for 8,760 hours. Since I pay about 10 cents per kilowatt-hour, the math works out like this:
(800 watts * 8,760 hours per year) / 1,000 = 7,008 kilowatt hours
At 10 cents per kilowatt hour, the price per year works out to $700.80.
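The arithmetic above can be sketched as a small function. The 800-watt figure and 10-cent rate come from the article's example; as the next paragraphs explain, a machine's real draw is usually far below its power supply's rating, so the second call uses the roughly 200-watt figure discussed later as an assumed real-world draw.

```python
# Annual electricity cost using the article's formula:
# (watts * hours per year) / 1000 = kilowatt-hours, then kWh * rate.
def annual_power_cost(watts, hours_per_year=8760, rate_per_kwh=0.10):
    kwh = watts * hours_per_year / 1000
    return kwh * rate_per_kwh

# The article's example: an 800 W supply rating, running 24x7 at 10 cents/kWh.
print(round(annual_power_cost(800), 2))  # 700.8

# The same machine's likely actual draw (~200 W, an assumed figure):
print(round(annual_power_cost(200), 2))  # 175.2
```

Note that only the second number resembles reality; the first treats the power supply's rating as a constant draw, which is exactly the mistake the formula invites.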
But although the formula is accurate, the end result isn't.
A power supply's rating is the maximum power it can deliver, not what the computer actually draws (and it doesn't account for the monitor's consumption). Just because my computer has an 800-watt power supply doesn't mean it consumes 800 watts at all times. In fact, it probably draws closer to 200 watts.
Furthermore, the latest computer hardware includes all kinds of power-saving features, and the Windows 7 operating system has several power management enhancements. For example, Windows 7 uses a technique called Core Parking to shut down CPU cores that are not being used, thus saving power. If a desktop goes to sleep, that saves even more power.
So how can you determine exactly how much power your desktops use? Since most organizations have a variety of different makes and models of desktops, asking the manufacturer isn't an option. Besides, the manufacturer wouldn't provide a completely accurate answer because overall power consumption varies depending on how heavily a computer is used. Upgrades that have been made to the computer can also affect its power consumption.
Your best bet is to select a representative sampling of the desktops in your organization and plug a power meter in between each computer and the wall outlet. Power meters normally sell for about $20, and meters from vendors such as SmartHome are a popular option.
Once you get an idea of the average amount of electricity your users consume, you can begin to examine virtualization-related cost savings. The only way that VDI is going to reduce power consumption is if desktops are replaced by terminals. But rather than making a huge initial hardware investment, first purchase a terminal and use a power meter to find out how much power it consumes.
After you know how many kilowatt hours of power are being consumed by the PCs on your network and how much power the terminals will consume, you can translate the power consumption into electricity costs and determine how much money you can save on your utility bills by switching to virtualized desktops.
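The comparison described above can be sketched as follows. All of the wattages and the fleet size here are hypothetical placeholders; substitute your own power-meter readings and electricity rate. Note that this simple per-seat difference ignores the power drawn by the VDI host servers themselves, which belongs on the terminal side of a full comparison.

```python
# Sketch of the desktop-vs-terminal savings estimate. Wattages, fleet size,
# and rate are hypothetical placeholders for your own measured values.
def annual_kwh(watts, hours_per_year=8760):
    return watts * hours_per_year / 1000

def annual_savings(pc_watts, terminal_watts, num_desktops,
                   rate_per_kwh=0.10, hours_per_year=8760):
    # Per-seat difference in kilowatt-hours, scaled by fleet size and rate.
    # Caveat: does not include the VDI host servers' own consumption.
    delta_kwh = (annual_kwh(pc_watts, hours_per_year)
                 - annual_kwh(terminal_watts, hours_per_year))
    return delta_kwh * num_desktops * rate_per_kwh

# Example: 200 desktops measured at ~200 W each, replaced by ~15 W terminals.
print(round(annual_savings(200, 15, 200), 2))  # 32412.0
```

Even with placeholder numbers, running this against your own measurements gives a defensible figure to put in front of management, rather than the wildly varying output of an online calculator.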
Since it can be tough to get a copy of the power bills from accounts payable, consider using Google PowerMeter to monitor power consumption. This free utility can help you track exactly how much power is being used, and it can report cost savings based on decreased power usage.
ABOUT THE AUTHOR
Brien M. Posey, MCSE, has received Microsoft's Most Valuable Professional Award four times for his work with Windows Server, IIS and Exchange Server. He has served as CIO for a nationwide chain of hospitals and health care facilities and was once a network administrator for Fort Knox. You can visit his personal website at www.brienposey.com.