Modern Infrastructure

Application performance management sets new goals



VDI costs less, works better in 2014

Vendor TCO models inflate how much money VDI can save you, but VDI costs are finally coming down -- thanks in part to Moore's Law.

Contrary to what vendors would have you believe (and what many users have already found), VDI is not cheaper than traditional desktop computing. Of course, those selling VDI counter with complex cost models and TCO spreadsheets and the like, which they claim prove that VDI is cheaper.

My study of cost models (see the "How to Lie With Cost Models" chapter of my book, The VDI Delusion) shows that whoever is conducting the analysis can make the numbers show whatever they want.

I don't have the column inches to go into the details, but I can sum them up like this: In most cases, yes, companies can save money when they move to VDI. But the reason they save money is because they deliver an inferior computing experience with VDI. Thus the savings come from cutting the quality -- for example, giving each user one-eighth of a processor core versus two full cores on traditional desktops, not providing GPUs, etc.
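A quick back-of-envelope calculation makes the point concrete. The server cost and core count below are hypothetical, chosen only to illustrate the article's argument that the per-user savings come from the resource cut, not from VDI itself:

```python
# Hypothetical sizing exercise: compare per-user hardware cost when each
# user gets two dedicated cores (the desktop standard) versus one-eighth
# of a core (a typical VDI density). All numbers are made up for
# illustration, not vendor pricing.

cores_per_server = 16
server_cost = 8000  # hypothetical cost of one host, in dollars

# Traditional desktop equivalent: two full cores per user.
desktop_users = cores_per_server / 2      # 8 users per server
# Dense VDI sizing: one-eighth of a core per user.
vdi_users = cores_per_server / 0.125      # 128 users per server

print(server_cost / desktop_users)  # → 1000.0 per user
print(server_cost / vdi_users)      # → 62.5 per user
# The 16x gap comes entirely from the resource cut, not the delivery model.
```

The same 16x reduction would appear if you cut each traditional desktop user from two cores to one-eighth of a core, with no VDI involved at all.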

It's tricky. Companies make this cut at the same time they implement VDI, so at first glance it seems like the savings come from VDI. But there's no causality there. It's more of a coincidence, like, "We're going to VDI, and we're saving money by cutting the quality of what we're offering, so therefore the savings must be due to VDI!"

If a company really wants to save money on desktop computing by cutting down what it offers, then it should just do that. Why bring VDI into it at all?

Moore's Law to the rescue

While I've been talking about the farce of VDI cost savings since VDI's birth in 2006, we're actually starting to see some changes that affect this notion.


The effects of Moore's Law and technology trends in general cause computing power to double every few years for a given price. Desktop PCs have benefited from this trend for the past 20 years or so, with today's $800 desktop computers having multiple cores, GPUs, gigabytes of memory and hundreds of storage IOPS. So that's the "standard" for a desktop.

When VDI came onto the scene, it cost $5,000 per user to buy the equivalent power in a server that users already had on their desktops. That's what caused the people selling VDI to say, "Do users really need that much computing power? Let's trim that down to $500 worth instead of $5,000 worth." And boom! VDI was cheaper.

But $500 worth of server capacity in 2006 provided an awful user experience (which is part of the reason VDI never took off). Thanks to Moore's Law, however, that same $500 per user in the data center could buy double the capacity in 2008, and double again in 2010, 2012 and 2014. In other words, we can buy 16 times the data center computing capacity today as compared to 2006. Or put another way, the $5,000 per-user we needed in 2006 for a decent VDI experience can be bought for a mere $312 today.
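The repricing math above can be sketched in a few lines. This assumes the article's simplification that capacity doubles every two years at a fixed price, so the cost of a fixed amount of capacity halves every two years:

```python
# Sketch of the article's Moore's Law arithmetic: the price of a fixed
# amount of compute capacity halves with each doubling period. The
# $5,000 figure is the article's 2006 per-user estimate.

def repriced_cost(base_cost, start_year, end_year, doubling_period=2):
    """Cost of the same capacity after repeated price halvings."""
    doublings = (end_year - start_year) // doubling_period
    return base_cost / (2 ** doublings)

# 2006-2014 spans four doublings, i.e. 16x the capacity per dollar.
print(repriced_cost(5000, 2006, 2014))  # → 312.5
```

Rounding down gives the $312 figure in the text; the $500 of 2006 capacity likewise buys 16 times as much in 2014.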

Compare that to the cost of today's desktop computers, which still hovers around $500 to $1,000 -- the same price points as 2006. (Moore's Law doesn't have as much of an effect on desktop computers since most of those costs are in the metal chassis, the power supply, the shipping, the supply chain and the OS license.)

So when it comes to looking at VDI costs in 2014, I still don't believe that VDI is "cheaper" than desktop computing. But I do believe that thanks to 2014's technology costs, we can now deliver a decent user experience via VDI at a $500 price point, something that was flat-out not possible before.

About the author
Brian Madden is an opinionated, supertechnical, fiercely independent desktop virtualization and consumerization expert. Write to him at




Join the conversation


Yes, the server/processor costs are more in line with what we are experiencing with comparable mid-range desktops. Properly licensing VDI in a Windows environment will, at minimum, double your investment. As Brian's book points out, you will not save any money by implementing VDI. You have to be implementing it to solve another issue, be it security or mobility.
You've hit the nail on the head here, except for the part about sub-$500/seat being flat-out not possible before 2014. Regardless of the VDI platform, the demands of running a full virtual machine for each user are similar. Moore's Law may have reduced server horsepower costs, but not IT staff time, data center operating costs or Microsoft VDA licenses. There are four other ways to reduce VDI costs that have a much bigger impact: 1) only delivering full Windows VMs where required, with lighter-weight and/or no-license-cost centralized desktops for the remainder; 2) reducing endpoint device costs; 3) reducing dependence on sophisticated IT staff and data centers; and 4) enabling more types of endpoints to be delivered, to broaden the payback of your VDI platform investment. This is what we at Userful have been focused on for years; for us, delivering full VDI (including endpoint device, licensing, server costs and IT staff time to set up the solution) for $250-$350 per seat is very achievable.
Too bad server memory costs don't follow Moore's Law these days.
