Contrary to what vendors would have you believe (and what many users have already found), VDI is not cheaper than traditional desktop computing. Of course, those selling VDI counter with complex cost models and TCO spreadsheets and the like, which they claim prove that VDI is cheaper.
My study of cost models (see the "How to Lie With Cost Models" chapter of my book, The VDI Delusion) shows that whoever is conducting the analysis can make the numbers show whatever they want.
I don't have the column inches to go into the details, but I can sum them up like this: In most cases, yes, companies can save money when they move to VDI. But the reason they save money is because they deliver an inferior computing experience with VDI. Thus the savings come from cutting the quality -- for example, giving each user one-eighth of a processor core versus two full cores on traditional desktops, not providing GPUs, etc.
It's tricky. Companies make this cut at the same time they implement VDI, so at first glance it seems like the savings come from VDI. But there's no causality there. It's more of a coincidence, like, "We're going to VDI, and we're saving money by cutting the quality of what we're offering, so therefore the savings must be due to VDI!"
If a company really wants to save money on desktop computing by cutting down what it offers, then it should just do that. Why bring VDI into it at all?
Moore's Law to the rescue
While I've been talking about the farce of VDI cost savings since VDI's birth in 2006, we're actually starting to see some changes that affect this notion.
The effects of Moore's Law and technology trends in general cause computing power to double every few years at a given price. Desktop PCs have benefited from this trend for the past 20 years or so, with today's $800 desktop computers having multiple cores, GPUs, gigabytes of memory and hundreds of storage IOPS. So that's the "standard" for a desktop.
When VDI came onto the scene, it cost $5,000 per user to buy the equivalent power in a server that users already had on their desktops. That's what caused the people selling VDI to say, "Do users really need that much computing power? Let's trim that down to $500 worth instead of $5,000 worth." And boom! VDI was cheaper.
But $500 worth of server capacity in 2006 provided an awful user experience (which is part of the reason VDI never took off). Thanks to Moore's Law, however, that same $500 per user in the data center could buy double the capacity in 2008, and double again in 2010, 2012 and 2014. In other words, we can buy 16 times the data center computing capacity today as compared to 2006. Or put another way, the $5,000 per user we needed in 2006 for a decent VDI experience can be bought for a mere $312 today.
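The arithmetic above can be sketched in a few lines of Python. This is just an illustration of the back-of-the-envelope math, assuming capacity per dollar doubles every two years; the function name and parameters are mine, not anything from a real cost model.

```python
def equivalent_cost(base_cost, start_year, end_year, doubling_period=2):
    """Cost in end_year of the server capacity that cost base_cost in
    start_year, assuming capacity per dollar doubles each period."""
    doublings = (end_year - start_year) // doubling_period
    return base_cost / (2 ** doublings)

# 2006 -> 2014 is four doublings, i.e. 16x the capacity per dollar,
# so 2006's $5,000-per-user experience costs $312.50 in 2014.
print(equivalent_cost(5000, 2006, 2014))  # 312.5
```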
Compare that to the cost of today's desktop computers, which still hovers around $500 to $1,000 -- the same price points as 2006. (Moore's Law doesn't have as much of an effect on desktop computers since most of those costs are in the metal chassis, the power supply, the shipping, the supply chain and the OS license.)
So when it comes to looking at VDI costs in 2014, I still don't believe that VDI is "cheaper" than desktop computing. But I do believe that thanks to 2014's technology costs, we can now deliver a decent user experience via VDI at a $500 price point, something that was flat-out not possible before.
About the author
Brian Madden is an opinionated, supertechnical, fiercely independent desktop virtualization and consumerization expert. Write to him at firstname.lastname@example.org.