Edge computing technology has gained traction in recent years because of its ability to overcome some of the growing pains around processing large amounts of data.
Edge computing -- sometimes called fog computing -- uses spare processing power on users' endpoints and on satellite servers, rather than central data centers, to process information. The satellite servers sit near the endpoint devices they assist. As a result, the data users need immediate access to is stored close to -- or directly on -- the endpoint device, and the devices themselves can process it. Any data that can be archived goes to the cloud for long-term storage.
Edge computing technology can come in handy in many deployments, including the public cloud, which does not handle vast quantities of data as well as some people may think. Another place where edge processing can help is VDI, where explosive data growth is possible.
Edge computing withstands data downpours
Edge computing technology comes in many forms, but the basic idea is that each endpoint device has its own storage, memory and CPU resources. Because the satellite servers sit close to where the data is created -- or because the data lives on the device itself -- the technology can process time-sensitive data quickly without uploading the raw data to the cloud. Edge computing also lends itself to massive scalability because IT can distribute workloads across numerous physical devices and servers while reserving the cloud for stored data.
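The split between edge processing and cloud archiving can be pictured as a simple routing decision. The sketch below is illustrative only -- the `deadline_ms` field and the `LATENCY_BUDGET_MS` threshold are assumptions, not part of any real edge platform:

```python
# Hypothetical latency budget (ms): data needed faster than this is
# processed at the edge; everything else is archived to the cloud.
LATENCY_BUDGET_MS = 50

def route(record):
    """Decide where a data record should be handled.

    `record` is a dict with an illustrative 'deadline_ms' field
    indicating how quickly the result is needed.
    """
    if record["deadline_ms"] <= LATENCY_BUDGET_MS:
        return "edge"   # process on the endpoint or a nearby satellite server
    return "cloud"      # send to the cloud for long-term storage
```

A time-sensitive sensor reading with `deadline_ms=20` would stay at the edge, while a record that only feeds a nightly report would be archived.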
Will edge computing drift to VDI?
In a VDI deployment, the edge devices would create data and send it to virtual desktops for processing. The idea is that if the users' endpoint devices have any unused CPU capacity, then that excess capacity could process raw data on the devices themselves rather than having to send the data back to the data center.
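A desktop agent implementing this idea would check for CPU headroom before accepting edge work. This is a minimal Unix-only sketch; the `CPU_HEADROOM` threshold and the `process_locally`/`send_to_datacenter` helpers are hypothetical stand-ins for a real pipeline:

```python
import os

CPU_HEADROOM = 0.5  # accept edge work only if per-core load is below this (illustrative)

def process_locally(raw):
    # Stand-in for a local processing pipeline.
    return f"processed-at-edge:{raw}"

def send_to_datacenter(raw):
    # Stand-in for shipping raw data back to the data center.
    return f"queued-for-datacenter:{raw}"

def has_spare_capacity():
    """Return True when the desktop has unused CPU to lend to edge work.

    Uses the 1-minute load average normalized by core count; a real
    agent would use richer metrics. os.getloadavg() is Unix-only.
    """
    load_1min, _, _ = os.getloadavg()
    cores = os.cpu_count() or 1
    return (load_1min / cores) < CPU_HEADROOM

def handle(raw_data):
    if has_spare_capacity():
        return process_locally(raw_data)
    return send_to_datacenter(raw_data)
```

The same check applies whether the desktop is physical or virtual -- the difference, as discussed below, is where that spare capacity comes from.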
Edge computing works great with physical desktops. In theory, then, edge computing should work with virtual desktops because they are essentially virtual representations of physical desktops. The potential problem is that virtual desktops share a finite pool of hardware resources, and VDI shops often configure host servers to achieve the maximum possible virtual machine (VM) density. As a result, IT must consider the effect of edge computing on the available resources.
The practicality of using edge computing with VDI depends on the VDI deployment's available CPU capacity and the nature of the workloads at the edge. Fortunately, edge computing workloads tend to be CPU-intensive, and VDI deployments usually have plenty of extra CPU capacity to support edge workloads. If the workload produces a significant amount of storage I/O, however, it is less likely to be suitable for use in a VDI deployment.
Whatever the case, it's always a good idea to put throttling controls in place in a VDI deployment. These prevent any of the virtual desktops from consuming system resources to the point of negatively affecting the performance of other VMs.
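The control logic behind such throttling can be sketched as a token bucket that caps how much work any one desktop may consume per second. A real VDI platform would enforce limits at the hypervisor level; this class only illustrates the idea, and the `rate`/`burst` parameters are assumptions:

```python
import time

class Throttle:
    """Token-bucket limiter: each virtual desktop may consume at most
    `rate` units of work per second, with bursts up to `burst` units.
    Illustrative only -- real enforcement lives in the hypervisor."""

    def __init__(self, rate, burst):
        self.rate = rate          # tokens replenished per second
        self.burst = burst        # maximum tokens held at once
        self.tokens = burst
        self.last = time.monotonic()

    def allow(self, cost=1.0):
        """Return True if `cost` units of work may run now."""
        now = time.monotonic()
        # Refill tokens for the time elapsed, capped at the burst size.
        self.tokens = min(self.burst, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= cost:
            self.tokens -= cost
            return True
        return False
```

With `rate=0` and `burst=2`, the first two requests pass and the third is denied until tokens refill -- which is exactly the behavior that keeps one greedy VM from starving its neighbors.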