The term "server-based computing," or simply SBC, has been around for several decades. Fundamentally, SBC occurs when computer processing is done on a server, instead of a workstation or laptop. In the days of mainframes, everything was SBC, since the "green screen" dumb terminals were nothing more than remote keyboards and screens hooked into the central mainframe.
As desktop computers were popularized by Microsoft in the 1980s and 1990s, companies started moving to a "distributed computing" model, so called because each desktop computer ran its own programs, and thus the computing was "distributed" throughout a company. Distributed computing really took off, and SBC was given up for dead.
However, companies soon realized that distributed computing had some problems. Software updates were difficult because companies had to update software on dozens, hundreds, or even thousands of individual desktop computers instead of simply updating a central server. This also meant that employees had to get their software programs installed on their workstations before they could use them.
Distributed computing also led to performance problems. This was most evident in wide-area network (WAN) scenarios with client/server applications, because the client software would run on a user's workstation, but it would access server data or databases across a slow WAN connection. This would often leave users having to "click and wait" constantly as they used an application.
Laptop security is another issue that surfaced in the distributed computing model. For example, if you're receiving a letter in the mail every week from some company offering you a year of free credit monitoring service, it's most likely because one of their laptops containing personal customer information was lost.
These challenges of distributed computing that emerged in the 1990s and early 2000s led many companies to shift their focus away from distributed computing and back towards SBC.
Of course, the SBC of the 1990s and 2000s is very different from the mainframe days of the 1970s and 1980s. The "new" world was Windows-based (as opposed to the old single-color, text-based mainframe world), and so the "new" SBC would have to be Windows-based too.
A company called Citrix licensed the Windows source code from Microsoft in the mid-nineties and "converted" it so that it could be used in an SBC format. Windows would run in the data center, and users would connect via graphical dumb terminals (or "thin clients"). The users' keystrokes and mouse movements would be transmitted to the remote server, and the remote server's screen images would be transmitted down to the thin client.
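Conceptually, the remote display protocols behind this model boil down to a simple loop: input events travel up to the server, which runs the application, and only the screen regions that changed travel back down to the client. The toy Python sketch below models that exchange in-process; the class and method names are invented for illustration and do not reflect any real protocol such as ICA or RDP.

```python
# Toy model of a remote-display exchange: the "thin client" forwards input
# events to the "server," which runs the application and sends back only
# the screen cells that changed. All names here are illustrative.

class RemoteAppServer:
    """Runs the application and renders its screen in the data center."""
    def __init__(self):
        self.screen = [[" "] * 10 for _ in range(3)]  # tiny text "framebuffer"
        self.cursor = 0

    def handle_input(self, event):
        """Apply a keystroke; return the (row, col, char) cells that changed."""
        if event["type"] == "key":
            row, col = 0, self.cursor
            self.screen[row][col] = event["char"]
            self.cursor += 1
            return [(row, col, event["char"])]  # only the changed region
        return []

class ThinClient:
    """Holds no application logic -- just a local copy of the screen."""
    def __init__(self, server):
        self.server = server
        self.screen = [[" "] * 10 for _ in range(3)]

    def press_key(self, char):
        # 1. Send the input event "up" to the server.
        updates = self.server.handle_input({"type": "key", "char": char})
        # 2. Apply the screen updates that come back "down."
        for row, col, ch in updates:
            self.screen[row][col] = ch

client = ThinClient(RemoteAppServer())
for ch in "hi":
    client.press_key(ch)
print("".join(client.screen[0]).rstrip())  # prints "hi"
```

The key design point this illustrates is why the client can be "dumb": it never executes the application itself, and only enough bandwidth is needed to carry input events one way and screen deltas the other.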
From the user perspective, this new Windows-based SBC looked and felt just like the "normal" Windows they were familiar with. In many cases the user didn't even realize that their copy of Windows was not running on their desktop. Many companies transitioned from distributed computing to the "thin client"-based SBC at the same time that they transitioned from glass CRT monitors to LCD flat screens; therefore, many users thought the term "thin client" applied to the new, thin LCD display!
Many companies adopted this new Windows-based SBC gradually, rather than as a full-on transformation. These companies continued to provide full desktops or laptops running Windows locally (via distributed computing), but they also built some of these central Windows SBC servers to host certain key applications. Then, instead of having a thin client device, the users would run thin client software on their workstations alongside their normal Windows applications. This "blend" of thin client and traditional applications is what most companies use today. There are, however, still some companies who are all "thin" or all "fat."
With all the benefits of SBC, why isn't every company moving everything they have to SBC? Some of the reasons are as follows:
- SBC means your applications are running on a remote server; therefore, you MUST have a network connection in order to use those applications. This works in an office or at home, but it means that SBC just flat out doesn't function on a disconnected laptop.
- Since all processing happens on the remote server, a slow network connection might lead to a bad user experience as the users wait for their screen updates to be transmitted over the slow link.
- Some applications just do not lend themselves to running on a different computer than the one the user is sitting at. Video editing and other highly graphical applications are a good example of this today, as the SBC communication typically cannot happen fast enough to provide a good experience for the user.
A lot has happened since Citrix brought the concept of SBC to the Windows world in the mid-nineties. Ever since Windows 2000 Server, this SBC capability has been built right into the Windows product. Depending on the version of Windows you're using, SBC is either called "Terminal Services" or "Remote Desktop Services."
Citrix still exists, and about $1 billion of its annual revenue comes from products that enhance the out-of-the-box capabilities of Microsoft's baseline SBC functions. Other companies, such as Quest Software, Ericom, and Jetro Platforms, offer SBC products as well, and almost 100 million people use some form of SBC to do their jobs every day.
In the past few years, a new form of SBC, called "virtual desktop infrastructure," or "VDI," has started to pop up. VDI is just like the SBC that Citrix introduced fifteen years ago, except that instead of connecting to a remote server running Windows on physical hardware, each user connects to a remote desktop operating system (like Windows XP or Vista) running as a virtual machine in the datacenter.
At the end of the day, SBC is here to stay. It's not right for every user and every application, but most companies can find a blend where some applications are delivered to certain users via SBC, and others run locally via the traditional distributed model.