Organizations turn to VDI—Virtual Desktop Infrastructure—to save money, manage desktops more easily, increase security, and provide a way for employees to work from wherever they need to, whenever they need to. It sounds like a great idea…until the implementation begins.
At that point, the desktop aspect of VDI goes out the window because companies have to focus most of their resources on the infrastructure aspect. That’s because outdated technology makes the infrastructure ridiculously complex. However, it doesn’t have to be that way anymore.
This has been the typical scenario for more than a quarter-century: When the IT department buys desktops, they focus on how much CPU, memory and storage comes with each desktop. Why should a virtual desktop project be any different? IT staff should spend their time thinking about those same desktop attributes. Instead, they are forced to spend most of their time dealing with VDI “infrastructure”: servers, storage, layers, management tools and much more. Managing all this infrastructure is exhausting and expensive, and when the focus is on the “I” and not on the “D,” users end up unhappy and IT staff end up frustrated. How did this happen?
Over the last seven to ten years, IT teams have faced the challenges of changing workstyles, information security risks and constrained resources. They have been accustomed to spending about $1500 per physical desktop (or laptop) and amortizing that cost over three to four years. Additionally, they would take care of every aspect of managing those desktops and keeping them updated. But the overhead of dealing with physical desktops is unsustainable, and once the world went mobile, tethering your users to a desktop was a sure way to hand your competitors the advantage. In response, some IT teams decided to deploy virtual desktops and apps.
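To make that amortization concrete, here is a rough back-of-the-envelope sketch in Python. The $1500 price and the three-to-four-year amortization window are the illustrative figures quoted above, not actual hardware pricing:

```python
# Rough per-desktop amortization math (illustrative figures only).
purchase_price = 1500  # ~$1500 per physical desktop or laptop

for years in (3, 4):
    monthly = purchase_price / (years * 12)
    print(f"Amortized over {years} years: ${monthly:.2f}/month")
```

Note that this covers only the hardware; the management overhead described above comes on top of it.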
VDI promises a more productive mobile workforce, along with stronger information security and greater IT efficiency. But to implement VDI on-premises, IT has to translate desktop attributes into expensive and complex data center technologies. IT staff find themselves asking questions such as: If I have 1000 users, how many servers do I need? How much shared SAN/NAS storage do I need? In which data centers do I put this infrastructure? And many, many more.
The IT department’s first step in the VDI journey is to look at desktop usage in their organization to determine the number of servers needed. Which applications are used? What is the CPU and memory usage rate? How many users can I fit onto a certain class of server? Do I need 20, 30 or 50 servers for 1000 users? It all depends on usage.
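The sizing question above is really just a ceiling division over an assumed user density, and the density is the hard part. A minimal sketch, with per-server user counts chosen purely for illustration:

```python
# Back-of-the-envelope server count for a 1000-user VDI deployment.
# The density figures are illustrative assumptions: real numbers
# depend on the applications in use and per-user CPU/memory demand.

def servers_needed(num_users: int, users_per_server: int) -> int:
    """Round up: a partially filled server still has to be purchased."""
    return -(-num_users // users_per_server)  # ceiling division

for density in (20, 35, 50):
    print(f"{density} users/server -> {servers_needed(1000, density)} servers")
```

Light office workloads sit at the dense end of that range; CAD, video, or heavy analytics push a server's capacity down sharply, which is why the answer for 1000 users can swing between roughly 20 and 50 servers.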
Determining storage needs has been even more difficult. Local storage on PCs is the cheapest storage available – about $100/TB. SAN/NAS can cost 25 to 100 times as much. If each user had 1 TB of storage on their desktop, you would need 1000 TB of SAN/NAS. That is massively expensive.
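The cost gap is easy to see when you multiply it out. A quick sketch using the illustrative figures above ($100/TB local, a 25–100x SAN/NAS premium, 1 TB per user):

```python
# Illustrative storage cost comparison for 1000 VDI users.
# All figures are the rough estimates from the text, not real quotes.
users = 1000
tb_per_user = 1
local_cost_per_tb = 100        # ~$100/TB for local PC disks
san_multipliers = (25, 100)    # SAN/NAS premium range over local storage

local_total = users * tb_per_user * local_cost_per_tb
print(f"Local storage total:  ${local_total:,}")
for m in san_multipliers:
    print(f"SAN/NAS at {m}x:  ${local_total * m:,}")
```

In other words, the same 1000 TB that costs on the order of $100,000 in local PC disks lands somewhere between $2.5M and $10M on shared SAN/NAS, which is what drove the optimization gymnastics described next.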
To deal with this drawback, VDI vendors initially came up with various ways to optimize storage. The conversation went something like this:
“You can optimize with a single image so you don’t need to have 1000 copies of Windows OS. Now, let’s put in layers so you don’t need to have 1000 copies of each application. Wait, what about profile management tools to store end-user personalization? You need that, too. Oh, and you can no longer manage it with your existing PC management tools like SCCM and Altiris. So, your VDI infrastructure is a stand-alone management framework.”
Perhaps this seems reasonable to you, but the problem is that Windows wasn’t architected to operate in this manner. So, IT teams end up struggling with app compatibility, corrupted profiles, and application updates that blow away desktops. Storage vendors also tried optimization strategies such as de-duplication; this way, the 1000 copies of Windows and all those applications on each user’s desktop were automatically de-duped at the storage layer. Hyper-converged infrastructure (HCI) vendors ultimately adopted de-duplication, and while HCI did begin to reduce the cost of VDI implementations, it hasn’t gone far enough.
So, what’s the next hurdle? The answer is determining where all this infrastructure is going to reside. Which data center should it be in? How far away will your end users be from that data center? What does that mean for latency? What will users’ real-world experience be like? How much bandwidth will they require?
Refocus on the Desktop
As you can see, the virtual desktop discussion has been focused on infrastructure, not on the desktop, due to the technical complexities involved. IT departments have had to jump through complex infrastructural hoops to deliver a mission-critical workload to a class of users. We all know that IT teams have more important things to do and more value to add than dealing with all this complexity.
Fortunately, the cloud has dramatically changed the old-school way of dealing with VDI, and we have an opportunity to completely re-imagine what the phrase “virtual desktops” means. Indeed, we are seeing companies issue edicts that prohibit further spending on on-premises data center infrastructure. Why? Because now, the “data center” is any region of the public cloud you select. Essentially, the infrastructure becomes invisible in that region – at least in terms of having to worry about it. Desktops can be placed close to users so they have a great experience. All IT needs to do is determine the configuration of the desktop, just like they used to determine the configuration of a physical PC.
But now, it’s even simpler to buy a desktop cloud solution than it is to buy a PC. The IT team just chooses a desktop configuration running in Azure. They order the number of units needed for their end users. Then they use their corporate image to create copies of desktops in the various regions where the users are. But rather than shipping PCs to the end users, IT simply emails the links for the desktops to them.
Because of the cloud, a focus on the “D” in VDI has returned. Instead of sorting out layers of complex infrastructure, IT teams now simply determine what class of desktop an organization’s users need. Forget about all that infrastructure micro-management and start reallocating your IT resources to higher-value projects. Now you can have VDI delivered simply and efficiently as a service, at last fulfilling the promises that legacy solutions could never deliver.
Amitabh Sinha is co-founder of Workspot.