
Sanity check: Is the cloud ready to handle desktop virtualization?

Start-ups and software vendors are tripping over each other to launch new products for desktop virtualization. While IT loves the idea, desktop virtualization is severely limited by the current state of LAN/WAN infrastructure -- with a couple interesting exceptions.

Virtualization has always been about servers -- improving server utilization, enabling server consolidation, and simplifying server management. Now, VMware, Microsoft, Citrix, and a host of startups want to use virtualization to solve a different set of problems -- ease desktop deployment, centralize desktop management, and provide portability of the user's desktop experience.

This is desktop virtualization and the big three virtualization players and a host of venture capitalists are betting that it is going to be one of the next major trends in enterprise IT.

"I think it's a huge opportunity," said VMware CEO Paul Maritz in his keynote address at VMworld 2008. "It's as big as VMware is today."

Maritz believes that because he's been hearing a simple message from IT leaders: "We need to get control of the desktop." As a result, he sees desktop virtualization taking hold over the next 12 to 24 months.

However, there are technical challenges that could inhibit that kind of growth.

The way desktop virtualization typically works is that the PCs employees use to get their work done are virtualized and hosted in the data center. The user can then access their virtual computer from a bare-bones terminal station -- usually about the size of a cable modem -- which saves the money and complexity of deploying full PCs.

Alternatively, a user can access the virtual PC from a traditional desktop or laptop -- even an older, underpowered one -- since the processing happens on servers in the data center and only images of the active screen are sent over the network and rendered locally. A diagram of this configuration is shown below.

[Diagram: virtual desktops hosted in the data center, accessed over the network from thin clients or repurposed PCs. Source: VMware]

The challenge with this type of topology is that it puts a lot more pressure on the network. Server-based desktop virtualization demands a network with low latency and a symmetrical connection of at least 5 Mbps. That's not a problem for most LANs, but as soon as you bring a WAN connection into the equation, latency and bandwidth become an issue. That matters because around 70% of employees work in an office other than the company headquarters.
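To see why a few megabits per second per user adds up quickly, here is a rough, illustrative back-of-the-envelope estimate of remote-display bandwidth. It is only a sketch: the resolution, update rate, compression ratio, and changed-screen fraction are assumptions chosen for the example, not figures from VMware or Teradici.

# Rough, illustrative estimate of remote-display bandwidth.
# All numbers below are assumptions, not vendor specifications.

def display_bandwidth_mbps(width=1280, height=1024, bits_per_pixel=24,
                           updates_per_second=10, compression_ratio=50,
                           changed_fraction=0.2):
    """Estimate bandwidth (Mbps) for sending compressed screen updates.

    changed_fraction: portion of the screen that changes per update.
    compression_ratio: how much the remote-display codec shrinks the pixels.
    """
    raw_bits = width * height * bits_per_pixel          # one full frame
    changed_bits = raw_bits * changed_fraction          # only dirty regions are sent
    sent_bits = changed_bits / compression_ratio        # after codec compression
    return sent_bits * updates_per_second / 1_000_000   # bits/s -> Mbps

if __name__ == "__main__":
    # Light office use vs. a full-screen video worst case.
    print(f"Office use: {display_bandwidth_mbps():.2f} Mbps")
    print(f"Full-screen video: "
          f"{display_bandwidth_mbps(updates_per_second=30, changed_fraction=1.0, compression_ratio=100):.2f} Mbps")

Under these assumptions, light office work fits in roughly 1 Mbps, while media-heavy use climbs toward 10 Mbps per user -- so even with optimistic compression, a handful of users can saturate a typical branch-office WAN link, which is why this model is comfortable on a LAN and strained over a WAN.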

Companies like Teradici, with its PC-over-IP system, have done a great job of improving the technology for running PCs hosted in a data center. Teradici has successfully made the virtual desktop feel just like a normal desktop, with all the same apps and accessories. Nevertheless, even Teradici will admit that its technology is completely dependent on a low-latency, high-bandwidth network connection.
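Before committing a remote office to hosted desktops, it is worth measuring what the WAN actually delivers. The snippet below is an illustrative latency probe, not part of any vendor's tooling; the broker host name and the 30 ms comfort threshold are assumptions made up for the example.

# Illustrative WAN latency probe: times TCP connections to a hypothetical
# connection broker and compares the median round trip to an assumed budget.
# TCP connect time is used here as a rough stand-in for network round trip.
import socket
import statistics
import time

BROKER_HOST = "desktops.example.com"   # hypothetical hosted-desktop broker
BROKER_PORT = 443
LATENCY_BUDGET_MS = 30                 # assumed comfort threshold for interactive use

def connect_time_ms(host, port, timeout=2.0):
    """Return the time taken to open one TCP connection, in milliseconds."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass
    return (time.perf_counter() - start) * 1000

if __name__ == "__main__":
    samples = [connect_time_ms(BROKER_HOST, BROKER_PORT) for _ in range(10)]
    median = statistics.median(samples)
    verdict = "OK for interactive use" if median <= LATENCY_BUDGET_MS else "likely too slow"
    print(f"median round trip: {median:.1f} ms -> {verdict}")

A check like this, run from each branch office, gives a quick first read on whether a hosted desktop would feel responsive there or whether the link itself rules the approach out.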

One of the most interesting things VMware did at VMworld 2008 last week was to hitch its wagon to "Cloud computing" -- one of the hottest and most overused buzzwords of the year. A Cisco executive at VMworld even joked that right now any startup that uses "Cloud computing" in its business plan is guaranteed to get funding from venture capitalists in Silicon Valley.

VMware told IT professionals that they could use its software to build their own internal cloud. And, of course, this also has implications for the larger "Cloud" since the eventual goal with desktop virtualization is to make a person's desktop available to them from almost any computer and over virtually any Internet connection.

In the meantime, there are a few desktop virtualization solutions that take a different approach, focusing on local processing and much better online/offline synchronization. The most notable are Kidaro and MokaFive.

Kidaro -- which Microsoft bought in March 2008 and will release in 2009 as Enterprise Desktop Virtualization -- provides a seamless experience in which a virtualized app is delivered from the data center, but all the processing is done on the local machine, and the app looks like any other app. These apps can even dial their own VPN connection if needed.

MokaFive also uses local processing for its desktop virtualization product, but it does so from a USB thumb drive (or any other USB storage device). MokaFive separates the user state from the system state and can back up both to its site, or you can set it up to back up to your own server. The end result is a highly available system that users can always reach by simply plugging in the storage device where their user state is saved.
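MokaFive's internals aren't documented here, but the general pattern -- keep the system image centrally managed and sync only the user's state -- can be sketched in a few lines. This is a minimal illustration under assumptions of my own: the paths, the backup host, and the use of rsync are hypothetical, not MokaFive's actual mechanism.

# Minimal sketch of the "separate user state from system state" idea.
# Paths, the backup host, and the rsync usage are illustrative assumptions,
# not MokaFive's implementation.
import subprocess
from pathlib import Path

SYSTEM_STATE = Path("/vm/base-image.vmdk")      # golden image, managed centrally; never synced here
USER_STATE   = Path("/media/usb/user-state")    # profile, documents, settings carried on the USB stick
BACKUP_HOST  = "backup.example.com"             # hypothetical company backup server

def back_up_user_state():
    """Push only the user's state to the backup server.

    The system image stays under central management, so it can be patched
    or replaced without ever touching user data."""
    subprocess.run(
        ["rsync", "-az", "--delete",
         str(USER_STATE) + "/",
         f"{BACKUP_HOST}:user-state-backups/"],
        check=True,
    )

if __name__ == "__main__":
    back_up_user_state()

The design point is that the golden system image can be updated or rebuilt centrally while the user's state stays portable and recoverable from wherever it was last backed up.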

Bottom line

Cloud computing demands a level of uptime and performance that simply is not universally available right now -- even in highly developed nations -- and won't be for at least a decade. That type of connectivity exists only in small pockets, such as the LAN at corporate headquarters.

In those places, desktop virtualization will have a chance to thrive. It could be especially useful in environments with lots of shared systems and a robust network -- such as health care and manufacturing. In environments with lots of knowledge workers, however, the Kidaro and MokaFive approaches make more sense because they build on the current infrastructure to produce a more usable system that works with today's diverse networks.

About

Jason Hiner is the Global Editor in Chief of TechRepublic and Global Long Form Editor of ZDNet. He is an award-winning journalist who writes about the people, products, and ideas that are revolutionizing the ways we live and work in the 21st century.

43 comments
michael.mckay

From mjmhelp.wordpress.com. I'm more confident in the success of virtualized desktops. I have, along with many others, been using them for years as remote desktops -- first with GoToMyPC and later with Microsoft Remote Desktop. I would leave my desktop computer running at work and then access the desktop remotely from home or on the road. My primary access method was a laptop connected to my home's wireless network or the hotel Internet. Never needing to transfer a file or install software, after two years my laptop was in the same condition it was in when I bought it.

Yes, there were limitations. Editing a PowerPoint presentation was annoying and multimedia files were poorly displayed. If the Internet was slow, the mouse, keyboard, and screen updates could be jerky. If I didn't have Internet, I couldn't work. But there were always work-arounds and compromises. For PowerPoint, I learned to turn off the background graphics or use a different template altogether; I spent more time on content than format. For surfing, it was often better to do the surfing locally -- but not always. I read on the plane and worked in the airport terminal. However, the benefits of a single working desktop, of not having to sync files, of always being able to get to the desktop no matter where I was or on what machine: these things outweighed the reduction in the "user experience". I was willing to put up with less in order to get these benefits. It was a classic cost-benefit trade-off that I think many people and companies will make in the cloud's favor.

I'd also like to point out that many companies are not using state-of-the-art multimedia machines as desktops. A quick walk through some local offices shows 14″ monitors, e-mail and word processing, two-tone text-based data entry screens that look like they were programmed in 1970, no multimedia capabilities -- basically bare-bones, corporate-only workstations. These are also the targets for virtualization.

What I am looking forward to with the virtualized desktop approach is being able to get rid of the corporate desktop altogether. Virtualization has been a key term in servers since it allows servers to be consolidated: if I had 10 servers, I might be able to get away with five or three or even one with the appropriate virtualization technology. If you can virtualize the desktop as well, consider the additional savings. How many computers are there out there in total? What is the ratio of desktop computers to servers? It's probably more than 2 to 1. If employees have a desktop at work and a laptop for home or the road, the number of non-server computers is even higher. Now consider that desktop virtualization can reduce the number of redundant computers by up to a 50:1 ratio (as claimed by Qumranet's president Rami Tamir in April). Yes, there will still need to be terminals with screens and keyboards, but these can be much cheaper than the multi-CPU, multi-core machine that currently sits on my desk. In all, this means a dramatic reduction in the amount of desktop hardware out there, with a concomitant reduction in the IT support requirements. The savings are too hard to ignore.

I agree with Jason that latency in the network will be important. But just as with working on remote desktops from a hotel, it is still possible to work even with tardy connections. In return, the bandwidth requirements are significantly reduced. A terminal for a virtual desktop will only need a fraction of the Internet bandwidth since it handles no files, transfers no data, and only displays updates to the graphics. The server in the cloud, on the other hand, has access to the Internet backbone and can deal with files over a high-bandwidth link. For many companies, the hosted server may have access to more Internet bandwidth than their own servers.

Jason's article also mentions a bridge approach from MokaFive. I realize the appeal of MokaFive's approach, but I have lost (and given away) more USB keys than I can count and would not want to be dependent on one in order to use my computer. I want the freedom of the web even with its restrictions. What would you be willing to sacrifice to implement virtual desktops in your company?

Chaz Chance#

Once upon a time there were mainframe computers, and mini computers, and... nothing else, except in the home game market. You youngsters may not have heard of mainframes: a big centralised processing unit, with thin clients, or dumb terminals as they were called then. Mini computers were smaller versions of the same thing, catering typically to around eight users. It used to be the only way computing was done. Then IBM released the microcomputer, which they called the "personal computer". They didn't expect much take-up, so they sold out before they realised what was happening. Then Amstrad demonstrated that open standards and competition meant that even quite small companies could make a good living, and suddenly there were more brands than you could shake a stick at, leading to the world we know and love today. ;) Now, history shows that IBM nearly lost it, because they wanted to stick to the big (where big = mainframe) computer market. And all the other big players surviving from those days want it too. After all, only the big companies can afford to make and market the big computers, so goodbye competition. So how can they win everyone back? By rebranding mainframe computing as "virtualisation". Come on suckers, lap it up.

ericswain

We've been running a Citrix MetaFrame terminal service for the last 6 years, with 6 remote offices, 2 still using a dial-up connection and 1 on satellite connecting at 256 kbps. We have found that even though bandwidth may be an issue, centralizing the data at our corporate office creates better fail-safes in security, data integrity, and overall control of what the end user has access to. With a thin client you cut out the ability to lose data due to disgruntled employees, hardware failure, or acts of God (whatever that might be). So if desktop virtualization is going to centralize the desktop and give the control back to the IT department, I'm all for it. If we are going to be in charge of all the data that is processed within an organization, and it's our necks on the line to make sure we have a fail-safe backup and contingency plan, then let us control the desktop and centralize it. WAN bandwidth is improving with the installation of more and more POPs. WAN technologies such as MPLS allow for a grid-like WAN connection, letting satellite offices talk to each other as well as with the corporate office. I see a big future in desktop virtualization, maybe even greater than server virtualization.

pjboyles

No. The bandwidth and reliability are just not there. I doubt they will be there even in the next decade. Virtualized desktops, with or without thin clients, are a great point solution. They are not a panacea. Only if you can tolerate the remote connection and have a compelling driving cause is this a good fit. Cloud computing is a great point solution. It cannot and will not ever replace the local computer and applications. No business that does a risk analysis will be willing to take that kind of exposure. Items to add: Printing, a huge bandwidth hog! Client management: you still have to manage, support, and update that thin client! No one wants to admit you are trading one set of headaches for a similar set of headaches. It will ALWAYS cost more. Having priced this out 3 of the last 4 years, there was a 20% to 30% premium to virtualize. Remember that these have their own licensing, management, patching, and support costs. Don't forget those licensing costs; they add up quickly!

tjohnston

This is a great post; we are currently investigating expanding our Citrix environment to encompass our Calgary branch office, and the more information I can get the better.

fungusAmongus

Jesus, I've been hearing about the takeover of the thin client since the late '80s, and I don't think it's going to happen. There are too many compelling reasons why 90% of users *should* have a real PC on their desktops, although the heavy-handed network guys always drool over this idea.

whalenkcj

Hasn't Sun already accomplished this entire concept?

jbgarver

This is the same thing that Citrix has been doing for years. It will probably have the same drawbacks as Citrix, and therefore I don't expect it to take hold any more than Citrix did.

Osiyo53

Interesting idea, and I can readily see where it'd be useful and advantageous. There are, of course, problems and issues. And there are situations where desktop virtualization and hosting apps on central servers is not appropriate. LAN connections are fairly reliable and speedy. However, they do have their failures and downtime. WAN connections are not nearly as reliable nor as speedy. If everybody and everything is virtualized and all apps and data are hosted on central servers, then an organization runs the risk that when something goes wrong, the whole organization comes to a grinding halt and nothing gets done. This could be very expensive. In some cases, disastrous.

For instance, in my work world I routinely do business with a certain commercial parts supplier. It's a nationwide organization with many outlets. It supplies electrical, mechanical, and electronic parts and other commercial/industrial materials. At one time they'd centralized all records at central servers in the home office. These were ONLY accessible via WAN. Each and every branch of the organization was utterly reliant on that WAN connection being up and running. Seemed to work fine. Until it didn't. I'm not sure exactly what caused it, but one day when we were having severe storms locally (multiple tornadoes, etc.) their system took a dump. No WAN connection. A bad time for that to happen. Because at precisely such a time, their local branches were flooded with repairmen, technicians, and the like, all clamoring for parts and materials that were needed NOW in order to make emergency repairs to the many things damaged by the storms. But the local branches could not process orders, could not even do an inventory check to see if they had the items on their own shelves (this type of supplier keeps nothing up front; everything is stored in a huge warehouse area in back not normally accessible to customers) or on shelves at other nearby branches. And they had no backup method of conducting business without that WAN connection. Heck, they couldn't even access customer data, account numbers, credit lines, etc.

End result: a LOT of needed emergency repairs got delayed. And a LOT of their customers stormed out and went to alternative suppliers to get the materials they needed. They could not and would not wait. As a local branch manager told me, they took a huge, multi-million dollar loss immediately, and a large hit that lasted who-knows-how-long because some customers decided the place could not be relied upon during an emergency, so they took their business elsewhere. Just an example. This organization, by the way, has since decided to keep everything local to each branch (the data and the apps) but implemented a system to update and synchronize the master database back in the home office. This way, during outages of the WAN, each branch can access at least most of the data it needs and conduct business. Obviously they'd not be able to do a system-wide stock check, etc., if the WAN was down.

Likewise, in my particular line of work I routinely and regularly need a laptop computer that can and does operate independently of anything and everything else, as during my work there is often neither LAN nor WAN connection available. At least not to me. On a personal and private level, I'd not use a system that would require the apps I use or the data I keep to be on a computer/server that is located somewhere else and controlled by someone else. Simply won't happen. The data I consider private, I consider private. Period. And all the assurances in the world by some stranger located who knows where in the world that he and his organization will respect my privacy mean nothing to me. Nor do I wish to HAVE to use one of a limited selection of word processors, for instance. Or have to relearn how to do certain operations every time someone at the central host for said app decides to make changes/updates to it. Gad, I get disgusted enough with Microsoft's regular updates and new versions of this and that. And I do not jump to installing and using the newest whatever. Any new version of whatever must contain significant enhancements that I actually care about and would use before I'll bother. It has to be actually worth my time to install and relearn. Or I'm not gonna bother. My time is too valuable to waste in that way. More than just a few others I know, privately and in my business dealings, feel the same way.

Now as far as portability goes, I do have some data and apps which I need, in my work, to be portable, as there are times when I need to be on somebody else's desktop or laptop -- a customer's, for instance. I could break out my laptop and swap data back and forth, but that's a pain. So certain things I keep on a thumb drive. Portable apps in some cases. In the case of data files only, I often keep those in a generic format that is readable by whatever apps are most likely to be installed on most any customer's computer.

I'm not knocking the fact that virtualized desktops will have their appropriate applications. I'm sure they will, and they will be used. But some thought will need to be given by each organization as to what the effects will be when a LAN or WAN goes down, and whether the consequences will be acceptable to that organization. It's one thing if an individual desktop dies; the loss is one worker's productivity for whatever amount of time. But what if a LAN or WAN outage causes a hundred, or a thousand, or ten thousand workers to be sitting around twiddling fingers? Contingency plans and alternative methods need to be considered and put into place. Just some thoughts. No more than that.

joseph.schweitzer

I would be interested to hear your thoughts on support, and whether the traditional service desk could handle this technology or whether you would have to overhaul the support model. Bearing in mind that more companies are offshoring IT, it will be interesting to see the impact on these organizations.

haroldthehors

Here at a primary school in the Netherlands, we have been using this concept for years. Network stations and a server bring all the desktop functions securely to the students. They all carry a personal chip card and password that enables them to check into their desktop from any network station in the building. It only takes a second to start up their work. And it will be taken to the next level in the coming year by moving the server into the data center once the glass fiber gets to the school door. No PCs to manage, no hassle, low energy, high ROI.

brianmilke

I was wondering if the USB user profile could be used by a doctor in his office to seamlessly move from exam room to exam room, not only accessing his own files, but pulling up his patient's records as well? I am working on a project for my Capstone class at ITT-Tech. It involves bringing together 5 previously unconnected doctors' offices that have merged. There will be a central/main office and 4 satellite offices, which must be able to send and receive files and patient data, have VoIP with the ability to transfer calls anywhere in the system, and video conferencing at any site for patient consultations as well as business meetings. It was suggested that tablet PCs be used for the doctors walking from room to room, but this USB roaming profile seems a lot more cost efficient, as well as much safer in terms of durability of the equipment and no longer needing to carry the computer with the doctor. Another added bonus I see is the ability of the doctor to go to any of the offices in the system and get logged into any computer. I would appreciate any suggestions as to how this could become a reality, as well as any other suggestions to make this a fully functional doctor's office system. HIPAA compliance would also be a factor, and I could use some suggestions for software that would allow for patient record access and give the doctor the most flexibility in serving their patients. It also seems to me that having Iron Mountain doing the data backup and security would be the strongest solution for that side of the picture. Thanks for your comments and suggestions!

p33d33

It makes sense. The only reason we all have our own computers is because it was cheaper than purchasing a mainframe & terminals. Now it's cheaper to purchase a server and virtualize the desktops. Full circle in effect? I'm all for it, my only concern is the current state of Cloud Computing security. Cheers Jason, nice article.

hlhowell

Thin clients, or "cloud" computing, are most certainly not new. And if you run a database, or if you have a lot of cash register terminals, or you run an insurance company, it is probably good enough PROVIDED that you have 24/7 support that really works, applications that don't fail in critical areas, sufficient worldwide redundancy and total UPS on all servers and backup servers, and provided you do not use Microsoft with their "single point of failure" issues. For all other cases, the bandwidth will never be sufficient, the flopwidth will be totally insufficient, and the network would collapse under the weight of the bit transfers. But of course if you were around during the initial network days with centralized servers you would know this. Since you appear enamored of this technology, I strongly suggest that you look deeply into the world of server-client software and the history that led everyone to independent computation. Think about the flopwidth of the entire network as it exists today, and name ANY company or government that could single-handedly process that much information. Good luck with that! Regards, Les H

fr

Are the cost savings worth it if the quality of service suffers? What are the real costs to an enterprise when you have poor response times? For instance, in a customer service environment, response time delays directly impact customers, who end up waiting far too long for simple tasks to get done. Response time delays beyond the norm also negatively impact the image and credibility of the company, because customers repeatedly hear the excuse of "a slow computer today" or "the system is very busy today"... Detailed response time metrics need to be outlined in a Service Level Agreement with an involved and knowledgeable user community, with teeth in the SLA and a follow-up process. Stress testing should precede implementations, with a fall-back if response times do not meet the mark. Without this kind of arrangement, the welfare of customers falls into the hands of accountants and pure technologists, with unpredictable results. Further, response times for specific situations should also be compared to non-virtual response times for the same situation. Suggestions: a local user station cache for non-data-bearing components (e.g. an empty small data entry window) should be explored as a solution to response time problems, as well as EVERY other opportunity to do processing locally while still maintaining some centralization benefits.

mike_patburgess

Well said. Once you have given the users freedom, it is damn near impossible to take it back. I am surprised that the datacenter guys have let this happen, though, given that their job security depended on huge numbers of servers... guess what guys, you have made your jobs redundant. Yeah, the big guy in your company has not caught on yet that they can "virtualize" your position.

nathan

How have they done this already?

despich

I think it's a bunch of hype about basically a variation on the same technology that has been around for years. Sure, you can now virtualize the entire desktop OS. Why would you want to do that instead of something simpler like Citrix or plain old Terminal Services? Great, now you have all these virtualized PCs running on a server, and each one will still need updates and some degree of maintenance (I guess they call that virtual maintenance). Contrary to what virtualization advocates would argue, thin clients do require at least some management. They still have a screen, network connection, keyboard, and mouse, and anyone who has worked for a while in IT knows that it's usually one of those things that users need physical on-site maintenance on. They also argue that you have better control of the virtualized PCs than real PCs. I don't see any real difference. I have complete control of all the workstations I manage, and they are just regular PCs. (Yes, that even includes power.) Don't get me wrong, I love Terminal Services and Citrix, and I love virtualization for some tasks, but with most of the talk I hear about virtualization I usually think to myself, "you don't need virtualization for that". Dan

mark.silvia

I had a similar issue with WAN connections back in the '90s, when DSL and cable were not readily available and T1 was too expensive. So we settled for frame relay: slower than DSL but faster than dial-up. I had over 45 locations that needed to connect to a SQL server. Due to the notoriously slow and unreliable connections over the frame relay, I set each location up with a server that basically mirrored the SQL server (in modern terms, replication). This gave users fast, immediate access in their application, and in the back end it trickled the data back and forth to headquarters. My point is that if latency and connectivity are an issue between sites, it is worth buying a server for each site. This gives you both centrality, in terms of replication, and independence, in terms of faster localized access, and the site can operate independently when the WAN goes down.

fredscomprepair

What's really wrong with the whole deal of cloud computing is several issues. Licensing: each computer must still have its own licenses. Cost to the company of hooking up all the servers and dumb terminals. Cost per month from the cloud itself. And last but not least, is it really what we need? When solid state hard drives become the norm, usually the only thing you would have to worry about on a typical computer would be the power supply going out. As cheap as they are making computers nowadays, there probably would not be such a big difference between a dumb terminal and an actual self-contained computing device at each desk. Certainly not at smaller companies, where the cost to switch and maintain the servers would overrun the business. I'm sorry, I'm just not impressed with this new IDEA.

it

The key to any server-based system is making sure the server is accessible. If you put all your eggs in one basket, then you protect that basket with all your might. If everyone is accessing that data from a WAN connection, you could just host it someplace that has really good redundancy, backups, etc. (e.g., Rackspace).

roxroe

I work for a school system and have suggested this technology for 2 years now, but no one is listening. Any resources for convincing him? I need a simple explanation, a PowerPoint, or a brief demo; what VMware sends me is over their heads.

btd

I'm working in a small hospital and we're seriously looking at SunRay systems. It's essentially a dumb terminal with a smart card reader. You insert your card and log into a Windows TS session once, do what you need to do, pull your card, and the session closes; walk to another room, insert your card into a different terminal, and you're right back to the screen you were on in the other room. Docs would have to carry a smart card and have a dumb terminal in each exam room. Between demos and seeing one in action, it's very impressive.

erik.miles

At our main office we use Centricity-EMR as the main patient database software. Our satellite offices connect to a terminal server via VPN for patient visits and updates. They access the Centricity-EMR interface on the terminal server and update patient data that way. If I remember correctly from HIPAA guidelines, any patient data that is physically moved must be tracked, logged, and signed for. I can only imagine that would be a very tedious process with USB thumb drives.

marcaccini

EPIC would be a perfect application and would solve all of your issues if your networks were all converged. It requires a doctor to log in with a username and password, and then allows access to all patient data and allows for client data entry, labs, x-rays, etc.

it

""HIPPA compliance would also be a factor, and I could use some suggestions for software that would allow for patient record access and give the doctor the most flexibility in servicing their patient."" Look into IronKey USB drives. that should be secure enough for you.

Scholia

But aren't you still buying desktop PCs for use as cheap terminals? If not, what do you suggest that's cheaper and that actually works as well? Does your "cheaper" include the cost of the expensive staff time wasted when using a virtual desktop, including start-up time? (Especially when a few hundred people start at the same time...) I can believe that a server drive can be faster than a local drive, but my experience of these things is that the "cheap" version is seen by IT as a solution and by users as a huge problem. And I've seen that problem "solved" by people plugging in their own laptops instead ;-)

mweallans

Whilst the concept of virtualisation is very commendable, I am having similar problems providing virtualisation in the non-Microsoft world. We have a number of systems deployed in disparate locations which we want to virtualise in a central location. We are definitely talking WAN. What solutions have been used elsewhere, and with what success?

wdewey@cityofsalem.net

Virtual desktops are separated logically by the VM software, whereas with terminal services they are not. To illustrate this: I have had problems with people getting other people's printers when connecting to a Citrix system. They can't print to them because they don't have permissions, but it causes confusion and time loss. With a virtual desktop this would not happen. Bill

mdeoliveira

If you don't have the network infrastructure with the fault tolerance to provide continuous availability, I can't see any IT manager willing to roll the dice on centralizing all desktop services. Server virtualization has reaped huge benefits in the speed of implementing new servers and maintaining those servers. I can see the same benefits for desktop deployment and maintenance, but if you lose connectivity, you're simply dead in the water.

SgtPappy

This is the correct way of doing it.

Osiyo53

Of course one would like really good redundancy, backups, etc. But no matter how good that redundancy is, there will be times when the WAN connection is down. That's just a matter of fact, for any of an endless list of reasons: lines accidentally cut by construction or repair crews, programming glitches, virus attacks, storms, earthquakes, you name it. Sooner or later it happens.

I worked for a major telecom for 10 years. One of the biggest. And despite having the best available minds and skill sets, and a BUNCH of available cash they could and did spend on safeguards, they had their outages. Of course their outages were limited by the nature of the way they did things: a lot of redundancy, plus each switch -- think of it as a local computer LAN system -- could in fact operate independently. An outage of the "WAN" simply meant that while locals could not make long distance calls, they could call others as normal as long as they were within the same area covered by the local switch. Larger switches, those covering denser population areas, had both battery backup and emergency generators. AND the telecom kept repair crews on hand and ready to respond to outages, with repair parts readily available. The local switches kept on doing their thing. And once the wider area network connection was restored, any data that was meant to be stored/transferred to a major network hub was then forwarded, and data in central servers was updated.

Likewise, in my current job, one of my customers is a large casino. They also have significant redundancy. Mirrored servers in separate locations, each location having its own commercial power supply, plus backup generators, plus redundant cooling systems, plus separate staff personnel at each location -- both IT personnel and security personnel. In each case, the organization also built the rooms or buildings for major hubs/server locations with extraordinary fire prevention and control means, earthquake proofing, so on and so forth. In each case the organization does its best to ensure that an outage does NOT cripple everyone, everywhere.

However, all this sort of thing is expensive. Is it worth it? Depends on the business. But such issues need to be considered in the planning and business modeling, at least. Assumption: the system WILL go down from time to time. So, can we live with it? Or do we spend some money to limit the damage (work stoppage)? If so, how much? And what methods should we employ? This was the point I was attempting to make. Sometimes over-reliance on always having centralized data access, control, and decision making is not a good thing. Whether we're talking about the IT biz or governments.

JimTeach

I am an IT networking instructor in a technical high school with 1000 PCs, in a very large school system. Central control did not work with the mainframes. Distributed processing and decentralized authority gave much flexibility to the classroom teacher. Now the IT department wants total central control again. Teachers had much more control and authority over their classroom content. Do we want IT telling teachers what to teach and how to teach it, limiting access to what they want us to have, not what we need? The IT department is not concerned with classroom success, only with how easy they can make their job. IT is not evaluated on how well its systems help teachers complete their mission of teaching students. They use the 99.999% uptime as their evaluative measure.

Examples: (1) I am limited in which articles on TechRepublic I can see because the central IT department is afraid my IT networking students may read the material and break their network. It hasn't happened in 15 years, but they want to control all. I still read the articles at home and give the info to my students. (2) This weekend we had a scheduled power outage in our school. When we came in this morning, much of the network was down. It is now 6 hours later and we are still not entirely up and running. Where was the quick responsiveness of the central IT department? We had to call the Help Desk to log a ticket before they could (would) have someone look at the network devices. The IT department turned off the network alarms for the power outage, so they did not get any alarms, and of course school starts before the IT department arrives at work.

Education is the critical mission of our business. Do we want to turn all control over to IT? Since our school was not completely down, their 99.999% was not affected. It's only affected when we are completely down. (This has happened twice in the past 30 days.) IT is supposed to support the needs of the organization so it can meet the critical mission of the business, educating students. Giving IT complete central control, by any means, hurts education.

wdewey@cityofsalem.net

If the doctor was not saving any patient data to the USB drive this would not be an issue. If the USB drive just kept his profile settings and all the customer data was handled through an application interface it would be a very good solution. Bill

pwoodctfl

I am working on a virtualization project for a company that has semi-autonomous offices that sell our product. We provide much of their software, but getting them to leave their desktops alone and not load anything onto them is a hassle for us as well as for them. Then there is the lost time to download patches, the costs associated with the service desk to maintain our machines because there is secure data on the desktop, not to mention the problems of specifying and supplying computers for those offices... Virtualization would permit us to create a desktop for the user when they need to connect to our programs and let the offices load whatever they want on whatever type of computers they wanted to use. No more scheduling 4000 deploys, no more service desk calls when they don't get all the upgrades and have to be walked through the upgrade process. Oh, and the maintenance on the hardware would become the responsibility of the computer owner... read that... not us. Every time they logged in, they would get all the software they need, with all the upgrades, totally independent of anything they have on their desktops. That is the cost savings.

scarville

http://www.nomachine.com/ I've used it more or less successfully with Linux on Xen and VMware, as well as with no virtualization. I'd recommend at least 384K in each direction, though I used it during testing on 56K dial-up.

jred

You don't say, but X Windows has been around for *nix forever & a day. Not sure, but I'm pretty sure it'd work on MacOS, too.

dmacleo

It's all fine and good to talk about redundancy and such for those in non-rural areas. I live in an area where 3 towns are serviced ONLY by 1 single telecom that does NOT offer business solutions and only offers residential DSL. 1 lightning strike takes a DSLAM down for 48 hours and cripples it for 5 months before the telecom has the money to replace it and the associated equipment. Power is also an issue: 1 snowstorm causes outages that last days. After restoration of power, the DNS servers (whose batteries ran out on day 2) take 2 days to get back up and running after bringing the building from -30F to 70F. At least with a normal PC I can run on a generator and get some stuff done.

Cincinnerdi

I'm in IT but used to be a classroom teacher, and I totally agree with what you say. IT departments only work when they realize they are a SERVICE department. The district administration must allow teachers to prepare an IT report card annually focusing on where IT gets an "A" and where it gets an "F". BUT, rather than blaming the IT staff, figure out a way to set reasonable goals and get the most from the often negligible budget. Then set up teacher-IT liaisons to get some positive communication going. That said, the LTSP is designed for schools and you may want to look into it.

Chaz Chance#

I would show you how to configure the computers so that software cannot be installed on them. It's a couple of registry settings, and it takes five minutes. Surely Linux and Macs have similar security features?
