Hardware

Should users be allowed to supply their own computers?

In an attempt to show the power of virtualization, Citrix has implemented a policy to allow users to purchase any computer they want. Is a BYOC policy a good idea or a bad idea?

Deploying new systems in an organization always presents a challenge. As we've discussed before, there are issues surrounding who gets which PC and when you should replace old equipment, for starters. Additionally, there are the problems of getting the best price, deploying a consistent image, and choosing the best machine for a user's situation.

Citrix thinks that it has a solution: Give users a stipend and allow them to purchase whatever machine they want.

Eating its own dog food

According to an article in USA Today, Citrix has implemented a program whereby it gives each user a flat $2,100, and with that money, the user can purchase whatever machine they like and bring it into the office.

Although such a strategy may sound like a complete nightmare to anyone in IT who has ever had to support user-supplied equipment, Citrix has a trick up its sleeve. Rather than locking down the equipment via group policy and restricting access to the network, Citrix uses its own virtualization technology to make it work. The article doesn't go so far as to say which product it is, but it has to be some variation of Xen, probably XenDesktop.

As the article points out, Citrix enforces a minimum set of requirements on users. Linux users need not apply, because Citrix supports only Mac and Windows machines. Also, all users must have current virus protection. These requirements help ensure basic security and connectivity on the network.

Would it solve a problem or create more?

Naturally, it would be hard for Citrix to sell a virtualization system that it wouldn't be willing to use itself. Plus, if anyone could make such a system work, it would be the people who created it in the first place. However, would it work as well in a regular organization?

Virtualizing desktops has long been problematic. There's the issue of network bandwidth. Additionally, if there isn't enough server horsepower on the back end, desktop applications can run very slowly. Beyond the strength of individual servers, you have to have enough servers to support the number of desktops being virtualized. The investment in connectivity, along with the number and power of backend servers, can eat up any savings on the desktop if you don't plan properly.
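To make that cost trade-off concrete, here is a minimal back-of-envelope sizing sketch. Every figure in it (desktop count, sessions per host, server cost, per-session bandwidth) is a hypothetical assumption for illustration, not a Citrix recommendation; real numbers depend heavily on workload and display protocol.

```python
import math

# Hypothetical planning inputs -- adjust for your environment.
DESKTOPS = 500              # virtual desktops to host
DESKTOPS_PER_SERVER = 60    # session density per host (workload-dependent)
SERVER_COST = 8000          # cost per backend host, USD (assumed)
KBPS_PER_SESSION = 150      # display-protocol bandwidth per session (assumed)

# Hosts needed, rounded up so every desktop has a home.
servers = math.ceil(DESKTOPS / DESKTOPS_PER_SERVER)

# Backend capital cost and peak network load if all sessions are active.
server_capex = servers * SERVER_COST
peak_bandwidth_mbps = DESKTOPS * KBPS_PER_SESSION / 1000

print(f"Hosts needed: {servers}")                            # 9
print(f"Backend server capex: ${server_capex:,}")            # $72,000
print(f"Peak network load: {peak_bandwidth_mbps:.0f} Mbps")  # 75 Mbps
```

Even with these rough numbers, the backend spend is substantial before a single user's $2,100 stipend is counted, which is exactly why the savings can evaporate without planning.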

The bottom line for IT leaders

Virtualization has been all the rage these days. So far most of the talk has been about the server side, but more thought is now being given to doing the same thing on the desktop. Such technology has been around in various forms for a while now (think back to WinFrame and Terminal Services), yet it has never gotten much traction. Although XenDesktop, XenApp, and related products offer new technology, problems may still lie ahead. Approach with caution and plan ahead if you're tempted.

Do you think you could use desktop and application virtualization to reduce costs on the desktop and maybe allow users to purchase their own equipment? Or are you just asking for problems? Share your opinions in the Comment section below.
