Data Centers

Will Microsoft, Google, and Amazon talk you out of your data center?

Several big technology vendors are racing to build a fleet of big data centers that will enable them to offer more Internet-based services to consumers and enterprises in the next 5-10 years. See why they think they will be able to talk you out of running your own data center.

------------------------------------------------------------------------------------------------------------------------------------------------------

The race for your data center has already begun. Google, Microsoft, and Amazon are the leading players in a global data center build-out that has not been slowed by the current economic recession and that over the next decade will change the face of both consumer computing and IT departments.

The reason these three companies are building out data center capacity around the world at a breakneck pace is that they want to be ready with enough capacity to handle the two big developments poised to transform the technology world:

  1. Cloud computing: Applications and services delivered over the Internet
  2. Utility computing: On-demand server capacity powered by virtualization and delivered over the Internet

With both of these trends, the biggest target is private data centers. Cloud computing wants to run the big commoditized applications (mail, groupware, CRM, etc.) so that an IT department doesn't have to run them from a private data center.

Utility computing wants to simply take over server capacity for private services and applications, using virtualization to seamlessly scale up and scale down those services so that an organization only has to pay for the bandwidth and server capacity that it uses. What most IT departments do now is pay for maximum capacity at all times, with very low utilization, and also risk downtime at peak times if their systems get overloaded because they haven't planned for enough capacity at the high end.
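The arithmetic behind that argument can be sketched with a toy cost model. Every number below is invented (a hypothetical per-server-hour price and demand curve, not any vendor's actual rates); the point is only the gap between provisioning for the peak and paying for actual use.

```python
# Illustrative cost model only: every number here is invented, not a real
# vendor price. Compares paying for peak capacity around the clock with
# paying per server-hour for what each hour actually uses.

def fixed_cost(peak_servers, price_per_server_hour, hours):
    """Private data center model: provision for the peak, pay for it always."""
    return peak_servers * price_per_server_hour * hours

def utility_cost(hourly_demand, price_per_server_hour):
    """Utility model: pay only for the servers each hour really needs."""
    return sum(servers * price_per_server_hour for servers in hourly_demand)

# A day that idles at 10 servers but spikes to 100 for two hours.
demand = [10] * 22 + [100] * 2
print(fixed_cost(100, 0.50, 24))   # 1200.0 -- peak capacity, all day
print(utility_cost(demand, 0.50))  # 210.0  -- pay-as-you-go
```

With these made-up figures the pay-as-you-go bill is less than a fifth of the always-on bill, which is exactly the pitch the vendors are making.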

This is an aerial view of Microsoft's data center in San Antonio, Texas. Microsoft has been adding roughly 10,000 new servers per month as part of its ambitious data center build-out. Photo credit: Microsoft

It makes perfect sense that Microsoft and Google would want to go after this market. Microsoft already runs a lot of the server software that powers the back office, while Google already has lots of expertise in running data centers to power the Internet. Amazon's place here may appear odd to some, since the company started as an online book seller and has evolved into the Web's biggest mega-retailer.

However, Amazon has arguably become the current market leader in utility computing by taking the knowledge it gained in building the infrastructure for its e-commerce business and turning it into Amazon Web Services, through which it rents server capacity to other companies. In 2008, CEO Jeff Bezos even revealed that Amazon Web Services now uses more bandwidth than Amazon.com (see chart below).

I also expect IBM and Hewlett-Packard to join the party. While neither of those two is drawing notice for building new data centers the way Google, Microsoft, and Amazon are, both are in the midst of massive, multi-year data center consolidations, and it's certainly possible that they are quietly building lots of extra capacity as part of those projects. HP has been a long-time proponent of utility computing, and IBM recently gave a public endorsement of cloud computing.

What all of these vendors will argue is that they can save organizations from overprovisioning and overspending on server capacity while also adding 24/7/365 monitoring, scalable load management, and a high level of IT service management (ITSM). Of course, the trade-off is that IT departments give up some control, and usually some staff positions as well.

This is essentially an outsourcing arrangement in which IT turns over a chunk of its operations to a third party. Many companies will be fine with that since IT is probably not one of their core competencies. They will welcome the expertise from a third party and will be happy to find a new way to control IT costs.

However, other IT departments and organizations are going to be far more reluctant to turn over their services, applications, and company data to a vendor. Just last week when Google announced that Google Apps can now use Microsoft Outlook as a client, I asked TechRepublic members, "Would you trust Google with your company's Exchange Server data?"

"I'm not sure it wouldn't land you in prison in many countries for violation of various laws on privacy and data security," wrote Deadly Earnest, an IT consultant in Australia.

"In the UK the Data Protection Act states that you must be able to disclose the location of your data, i.e. at any given moment you at least know what country is," wrote Tom-Tech, a UK software developer. "I'm pretty sure Google doesn't do this as they aren't exactly forthcoming with the locations of their data centers and the data belonging to a specific company probably shifts between a few of them anyway."

Zeplenith, an IT manager in Virginia, asked, "Where is the data stored? Are they SAS-70 certified?"

As such, security, privacy, and compliance are major hurdles that cloud computing and utility computing still must overcome. Nevertheless, we should expect that vendors are well aware of these hurdles and will be working with governments, regulators, and standards agencies to develop services that are fully compliant.

We can also expect that vendors will trip over each other trying to prove which one has the stronger security and privacy policies, because they know those factors are deal-breakers. It's not going to happen overnight, but these obstacles will very likely be overcome. The companies involved have invested too much in the future -- and IT has too much to gain from a cost and management perspective -- for these issues to go unresolved.

For more insights on cloud computing and other tech topics, follow my Twitter stream at twitter.com/jasonhiner

Bottom line

For governments, large financial institutions, and other high-security environments, outsourcing the data center will probably never make sense. For virtually everyone else, it's going to become a very attractive option in the next 3-5 years. I suspect that a decade from now running your own data center will be the exception and not the rule, and IT departments will need a strong business case to justify the existence of a private data center.

The massive vendor build-outs currently underway are evidence that that day is coming sooner than you might think. IT leaders should prepare by experimenting with low-priority apps and workloads now, and by thinking ahead about which parts of their current infrastructure will translate best to utility computing and which will present the biggest challenges.

About

Jason Hiner is Editor in Chief of TechRepublic and Long Form Editor of ZDNet. He writes about the people, products, and ideas changing how we live and work in the 21st century. He's co-author of the book, Follow the Geeks.

41 comments
mashraqi

... Amazon is definitely talking companies out of their data centers. Data center management nowadays is an unneeded, time-consuming, and expensive distraction for most startups. Thanks to the availability of specialized and hybrid cloud technologies, multi-cloud deployments make building any kind of startup (real-time, data-intensive, low-latency) possible without blowing the budget or time to market. Frank Mashraqi http://bigdatalowlatency.com

DWRandolph

Not an issue with the content of the article, but the format. I cringe whenever someone uses the meaningless phrase "24/7/365". The base "24/7" is already perpetual: 24 hours a day by 7 days a week. How does 365 days a year extend the equation? Will the data centers shut down for a day in a leap year? Or is it just some poor marketing drone with no understanding of basic math trying to show how much more important their presentation is than folk who use the shorter phrase? Sorry Mr. Hiner, but I just lost a couple points of confidence in all your articles.

s31064

While compliance and government regulations are one consideration, my main problem is with vulnerability. Yeah, I know the common thought process is that moving to a hosted, cloud or utility environment is a positive step for DR/BC projects. Now stop thinking like a clone and take a step back. Let's not worry about the small stuff for a minute, you know, the fires, the local blackouts, etc, let's look at a biggie -- terrorism. Technically speaking, taking out the Twin Towers barely caused a hiccup on the national IT scene. It was a disaster on other fronts obviously, but none of the resident companies that had a DR plan really had an issue with continuing their business. Now if all those companies had their data in Amazon's data centers (as an example, I'm not singling out Amazon), and terrorists took out those centers, how many of those companies would still be around? Never put all your eggs in one basket, and never, ever let someone else hold the basket.

swilsonw

My company has created its own cloud facility out of open source tools. We have full redundancy for two sites at a third site. The knowledge of how to do this is out there. It is not yet formalized, but it will be within the next two years (a bit slow). Though costly, it allowed us to move aggressively against our competitors - and now that we are there, we realize that the costs of providing the levels of service we supply by doing it the old way would have been triple what we paid to develop and implement the cloud. Don't be cowed by the big boys. You can do it too and enjoy the benefits.

swilsonw

I absolutely agree that corporate IT will embrace cloud/utility computing. But it's not the best thing to do. From a cost perspective, yes. But for other reasons it is the wrong move. What's great for business is bad for society and for infrastructure.

Competition between the clouds will put pressure on the big utility computing vendors to reduce their costs even more to stay in business. This will be done by reducing excess capacity, which will result in lower quality service and service interruptions. Eliminating excess capacity creates instability, which requires additional staff to troubleshoot and maintain service. However, the move towards cloud/utility computing will have eliminated a large percentage of IT professionals formerly employed in corporate IT shops - they were unneeded. Companies that attempt to retake control of their IT will be faced with higher costs and a skills shortage.

Am I being unnecessarily pessimistic? No. There is more to business life than cost-based control. Cost-based control, by its nature, reduces or eliminates excess capacity in whatever system it is applied to. Unfortunately the evolutionary principle decrees that the organism or system which runs out of capacity or margin dies off. In a static environment you can afford to have little or no margin; you have reached the peak of evolutionary development. Like the dinosaur you are at the top - but if things change even a little bit, because you have no margin left, you die. Organisms, organizations, or systems with a greater capacity or margin will dominate. Evolutionary success means having a reasonable amount of excess capacity to survive changes to the environment. Cost-based control is a balance to having too much excess capacity - however, current IT trends are giving it an undue and dangerous dominant role.

pjboyles

Anyone considering this should have a thorough legal review. Not only are there privacy and regulatory considerations, if the data is held by a third party that third party can be served a subpoena to turn over your data and you need never be notified (in fact the third party could be ordered to not notify you of the subpoena). Letting your data out of your control is potentially a very bad thing. Think many times before you let your data escape your control.

jslick

No one even has the bandwidth available to pull this off properly. Companies of any size cannot afford the amount of bandwidth it would take to run this efficiently. To boot, think about the productivity losses of having employees sitting there waiting for their transactions (be it e-mail, database related, or whatever). Customers suffer then as well. Everybody loses. Honestly, if the datacenter is built correctly, the costs aren't overwhelming. We need to start concentrating on datacenter efficiency from an employee standpoint.

Craig_B

Cloud computing sounds good, however the key is the data. The data is what companies and people make; it is their product, without which they would lose money or go out of business. If you could somehow outsource the machines and technology but keep control of the data, maybe you'd have something. However, it will only take one incident for all this to come crashing down: one breach of data security, loss of access to data, wrong data, compromised data, etc. This article seems to state that cloud computing is inevitable, and that may be true to some extent; that is, data centers will appear to be more of a cloud. Though that may not mean that private data centers are all turned over to outsourced data centers. This same argument was made when everything was going to be outsourced to a different country. The future is hard to predict, as we all have free will to choose a different direction.

sales

So thought other big monopolist companies in Romania, like Romtelecom and RDS, and they invested huge amounts into bringing DSL and cable connections all over the place. They could not, however, outrun 1) fast Ethernet and, more recently, gigabit Ethernet connections offered by small ISPs (neighborhood networks), nor 2) 1-2 minute intervention times - since usually the tech department is in the same neighborhood as the customers, nor 3) the strong bond between clients and providers who used to play in the street together some 10-15 years ago, nor 4) confidentiality aspects, which tend to be not so confidential after all at a big-profile ISP. This is why nowadays Romtelecom has a huge monthly churn rate in favor of small ISPs on both residential and small business targets, and RDS carries millions of dollars in loans, has as a last resort guaranteed repayment to the bank with its general HQ, and will soon face bankruptcy. Mammoths now lie deep in Siberia's permafrost even though some 20,000 years ago they ruled the Earth. Squirrels, however, are still cute and driving us nuts (lol).

richard

The real arrival of cloud computing will be measured in market share: when enterprises are legitimately outsourcing 20% of IT, that's a potential tipping point; otherwise it's water vapor. By that point SMBs should already be outsourcing 50% or more of their infrastructure. Until then, I'm exhaling.

mattohare

I'm not saying I'll never do it. But for now, I want to keep all the steps of production in-house. I don't like to depend on my broadband connection to stay productive.

rduncan

This is interesting. I'm working in the education sector in Ireland, and a representative from MS is coming in to meet our IT dept this Wednesday to talk about moving our datacentre to an MS cloud, and also software as a service - for free. Google aren't nearly as proactive, even though both companies have their European HQ here... I can see more adware in our future somehow!

Deadly Ernest

or people I know, as it would put the business owners in prison for breach of too many laws regarding privacy. Once you find you need, by Australian law, to keep tight control of information covered by the privacy laws, you have a clear need for an internal data centre. So the addition of an external one (other than an off-site, in-country backup) is not sensible or really viable. I can see something like this getting by the privacy laws in corporate use - it still places the data at risk, but doesn't violate the privacy laws - only by splitting it into two databases: one internal, with all the privacy and identification information, which assigns a client code number that doesn't relate to the client in any way. This code is then attached to another database with all their other data, and that one can be stored off site. But this would leave all that data, and access to it, at the risk of problems with the entire connection process, Internet bandwidth levels, and the laws of the other site's country. Technically feasible, but a legal minefield and a major increase in operational risks. BTW: What I meant about non-client-related IDs: many places use account IDs that incorporate part of the user name, say an account number with the first three letters of the client surname - such a system would NOT be acceptable in the scenario above.
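The two-database split described in the comment above can be sketched in a few lines. All the names here (the two dictionaries, the fields, the helper) are hypothetical, and a real system would also need encryption, access controls, and audited key management:

```python
import secrets

# Sketch of the two-database split suggested above. All names are
# hypothetical; a real deployment would add encryption and access control.
internal_db = {}   # code -> identifying details, kept strictly in-house
external_db = {}   # code -> non-identifying data, safe to host off-site

def register_client(name, address, account_data):
    # A random code, unlike an ID built from the surname, carries no
    # trace of the client's identity.
    code = secrets.token_hex(8)
    internal_db[code] = {"name": name, "address": address}
    external_db[code] = dict(account_data)
    return code

code = register_client("J. Smith", "12 Example St", {"balance": 150.0})
assert "name" not in external_db[code]   # no identity leaves the building
print(len(code))                         # 16 -- an opaque hex token
```

Because the off-site store holds only the random code, a breach there exposes business data but not who it belongs to; the linking table never leaves the internal centre.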

george.hickey

... because management has asked us to, but given that we're a government, health sector body, I'm not sure we'll be able to do it since we've a lot of sensitive data. The best we could do, I think, is a local managed services centre (i.e. in-country, so governed by the same laws), which will preclude us from using Google, Amazon or Microsoft and, more than likely, negate any benefits of scale. Still, it would probably be cheaper than trying to replicate the same level of availability / redundancy ourselves - it sucks having to research outsourcing your own job! ;o(

M_Teixeira

The following must be true before adoption:

  1. The data center is underground - a nuclear-resistant bunker
  2. Redundant high-bandwidth cable access
  3. A third, wireless access path (microwave, whatever)
  4. Guaranteed backup in another country - as close to real time as possible
  5. After this there will still be the privacy issues - the most problematic

When all this is fixed, maybe big companies will use it... maybe small companies start using it, but a terrorist (or natural) event bringing several hundred thousand small companies to a halt is a very big problem.

Deadly Ernest

All along I've said this can be very good if you keep the full control inside your perimeter and do it as an in house option to thin client. But going outside puts you at risk.

millsy17

Hesitancy I understand, but a complete and utter disbelief that cloud computing will eventually be the norm seems very shortsighted to me. Companies like Amazon and Google have multiple data centers around the world. Their scale allows them to provide services at lower costs. It also enables them to provide uptime well beyond what any of us can do individually. To run their businesses they spend billions on developing the most efficient data centers possible. There is no way for an individual IT department to keep up in the long run. I highly recommend a book by Nicholas Carr called The Big Switch to get a better understanding of where this is heading. IT departments who believe they can compete on these points are the ones heading for an evolutionary dead end.

Deadly Ernest

Ever heard of a thing called 'just in time processing'? In this you set your logistics system up to have the minimal stock on hand possible and have stock reorders coming in almost daily, with all orders based on delivery times from the supplier; yes, you usually add a fudge factor of one or two days' supplies. The problem with this is when the supplier is in another state over a huge river. A flood comes in and a major bridge is washed away; delivery now goes from one day to four, due to having to reroute through two other states, as the only other local bridges can't take the heavy freight trucks. Worse if it comes by rail and the rail bridge went.

A few years ago I saw three companies go bust because of a strike at another company. Five companies manufactured goods that used a common component made by a sixth company. Each of the five catered to a particular part of the market. Three companies kept costs down by having minimal stock holdings and used just in time to order the components needed for manufacture. Two companies used an older stock management system and kept three months' supply of parts in stock. The sixth company, which supplied the common component (an electronic control system), also used just in time for its processes and ordered as required to fill orders on hand.

The electronic control company had a shipment of some essential components due in via sea freight, their usual method from this supplier. A strike at the wharf meant their shipment was left sitting on the wharf. The strike took four weeks to resolve and another two weeks to sort out the backlog of stuff on the wharf. After one week the sixth company could no longer make control systems, as they had no parts, and getting another order flown in from their supplier would be six weeks away due to the supplier's current list of orders. Another few days and three companies could no longer make their goods for their retail contacts.

By the end of the wharf strike's second week the control company had temporarily closed shop to reduce losses and gave everybody an unscheduled holiday - the agreement with the union was that only half the time would come off the staff's leave credits. Three of the manufacturers of the retail product had done a similar deal. The other two not only kept making gear from their stores on hand, they increased production to meet extra orders being placed by retailers to fill the demand gaps from the other three. By the end of the strike, three of the manufacturers had lost nearly all their clients and had to close down.

This is not an isolated incident, yet more and more company managers see switching to just in time as a way to save money, despite evidence it is way too fragile. Cloud computing is the same - it puts your business viability at the end of a very fragile piece of string, with many places it can be cut along it.
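The fragility the story illustrates can be modeled as a toy simulation. The day counts below are invented (a roughly six-week stoppage, a 60-day window); only the shape of the outcome matters:

```python
def days_of_production(stock_days, outage_days, horizon):
    """Days a factory keeps producing over `horizon` days when deliveries
    stop for the first `outage_days`. One day of stock feeds one day's
    output; after the outage, one day's worth of parts arrives per day."""
    stock, produced = stock_days, 0
    for day in range(horizon):
        if day >= outage_days:
            stock += 1            # deliveries resume
        if stock >= 1:
            stock -= 1
            produced += 1
    return produced

STRIKE = 42  # roughly the six-week wharf stoppage in the story
print(days_of_production(2, STRIKE, 60))   # 20 -- the JIT shop sits mostly idle
print(days_of_production(90, STRIKE, 60))  # 60 -- the buffered shop never stops
```

The shop carrying two days of stock produces on a third of the days; the shop carrying three months of stock never misses one, which is the story's point in miniature.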

Deadly Ernest

to the meeting and raise the issue of MS having, and maintaining, the servers in country - and watch how they respond. Make a point of raising the need for annual compliance audits to ensure the data is NOT being moved or made accessible outside of the country in any way. You may also wish to ask them to prove, and contract to ensure, that there is never a downgrade of access or usability should a user be accessing via a non-MS browser or system. That alone will make them flinch.

mattohare

I was on a bus ride through Co Donegal while reading a business biography of the Google boys. I realised that the outside air in the hills of Donegal is cool enough most of the year to be 'free air conditioning' for a data centre. Put up some windmills, and the energy could be free. I'm sure Harney and Gormley would purr with such an option.

Deadly Ernest

you could use the cloud technology to simply establish your own organisational data centre, and then use the cloud technology as the way to access it from anywhere in the country, generating some savings on VPN costs, as you'd just use the Internet to gain access. The system would need a good encryption and decryption process plus a very strong log-on ID process. You can also use your involvement in such a project to help mould it into one that stays within the organisation, and you can stay involved in maintaining it. Much as I hate these sorts of things, I can see some advantages in having a government-managed and -controlled centralised health service data centre easily accessible from anywhere in the country with an Internet connection. The centre would need to be located near a reliable broadband Internet service and a fairly reliable power source to ensure maximum uptime and usability. Once your organisation has one up and running, they can then sell the idea of using it to other government agencies in the health sector, and maybe get some private health organisations to sign on and help pay for it. It's worth floating such ideas with your management to see if they're receptive - it could allow you to retain your employment as well as improving things.
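As one possible shape for the "very strong log-on ID process" mentioned above, here is a minimal challenge-response sketch using an HMAC over a shared secret. This is an illustration only, not a complete authentication protocol: it omits replay protection, key management, and transport security, and all the function names are invented for the example.

```python
import hashlib
import hmac
import secrets

# Hypothetical challenge-response log-on: server and user share a secret
# provisioned out-of-band; each login signs a fresh server-issued nonce,
# so the secret itself never crosses the wire.

def make_challenge():
    return secrets.token_bytes(16)          # fresh nonce per login attempt

def sign(challenge, secret):
    return hmac.new(secret, challenge, hashlib.sha256).hexdigest()

def verify(challenge, response, secret):
    # Constant-time comparison avoids leaking how many bytes matched.
    return hmac.compare_digest(sign(challenge, secret), response)

shared_secret = secrets.token_bytes(32)     # per-user, provisioned out-of-band
challenge = make_challenge()                # server -> client
response = sign(challenge, shared_secret)   # computed on the client
print(verify(challenge, response, shared_secret))  # True
```

A wrong secret produces a different HMAC and verification fails, which is what makes this stronger than sending a reusable password over the link.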

dcolbert

So, we've all been over the pros and cons of the cloud - I'd like to say one thing, regarding my opinion on the cloud... ma.gnolia.com Which is all I am going to say about *that* for now. Regarding utility computing. In their patent application for on-demand consumer delivery methods, Microsoft even admits that the costs associated with "metered, use-what-you-need" application and internet access are likely to exceed the costs that a person today pays for their PC (which they OWN), their "perpetual" software license and their "unlimited" internet access. I think we're going to find this across the board. When you boil it down, right now the problem with the Cloud and with Utility Computing is that *both* only serve the benefits of those who want to see the industry move that way (Microsoft, Google, Amazon). The value-add just isn't there. (Unless you look forward to the idea of giving Microsoft, Google and Amazon the kind of Orwellian control of your digital life that Verizon and AT&T Wireless currently enjoy). All the risks, less control, and it is going to cost me MORE?!? That sounds *GREAT*! Where do I sign up? Can I also be assured that I will take ALL the blame for any downtime or loss of data, as well? Maybe once a year Steve Ballmer can fly out in his corporate jet to poke me in the eye with a sharp stick, too.

pivert

So in fact, we should all hand over all our data to global enterprises? How would you react if this was not a U.S. company but a Chinese or Russian one? We all freak out when some worm might phone home our credit cards, but we have absolutely no problem letting another commercial company store our data. Espionage becomes so easy (speech technology, e-terrorism, the NSA)... I'm not saying anything negative about the technology, only about the players, who we all know have a great record of running their business on a "neutral" basis...

masonm

I work for a school with 4 campuses that are all at least 30 miles apart. We house several applications (webservers and database servers) at our largest campus and serve the other campuses across our WAN. This model works well for us because it lowers cost (consolidation) and improves service by allowing our staff to access ALL records from any campus. We also host an exchange server and file server at our primary campus. We consider the benefits of this model to outweigh the downsides. The main downside is the issue of reliability because of the connection. (We do have a backup VPN that runs across our internet connections). The point I am making here is that even internally hosted clouds have their downside. As I said, if the connection is down our smaller campuses cannot access their email, files, and our critical applications. We consider this an acceptable situation because of the money we save with this model. However we are a government funded school that doesn't have to turn a profit. Some businesses cannot afford the possibility of downtime.

biancaluna

We have state legislation, Commonwealth legislation, record keeping acts (let's not forget record keeping, a different kettle of fish to privacy and IP), national research standards, and there are changes in the wind in the Privacy Act wrt transparency of where data is stored. One of the things I ask managers with angst is: so are we doing it well in house? Actually, the answer is typically no. In-house privacy, IP, record keeping, security, OSAC, SAS etc. is not done well at all. I reckon we are not comparing apples to apples by comparing cloud with in-house and in-country anyway. Managers make the assumption that the baseline (i.e. comparing to what is done today) is adequate. It ain't. The Privacy Act does not answer all questions and issues in relation to cloud computing; my experience is that it only raises more. The sad truth of the matter is that we have folks running around with perishable media holding critical data that could land someone in jail if lost or stolen. But we focus on picking holes in organisations that make it their business to provide some level of security. We know the data centre is not the major weakness in data theft, data loss and security. Layer 8 on the ISO reference model is typically the problem. You are right, we have Dell, IBM, EMC, MS techs with access to our corporate data. But we don't see the issue in that. And what about staff forwarding corporate data to their external mail accounts? These are interesting issues, Deadly :)

rduncan

I'm not advocating the 'simple life - brought to you by MS' in any way, though I can imagine it all looks quite appealing. O'Reilly Media putting the web 2.0 umbrella up over it as well - that particular catch-phrase has lost all meaning for me now!

george.hickey

'Tis a great idea but we're not set up to deliver it (we're way too small & overworked). The concept of a pan-government shared services platform has been kicking around here for a few years now but, of course, it was never acted upon while the government had loads of money to do a project like that. Also, we've had one or two well publicised incidents of losing (thankfully well-encrypted) customer data so it would be a very hard sell to go from an almost entirely closed system to one that is open to the Internet, no matter how strong the log-on process was. Thanks for the thoughts though - it's not all doom & gloom - I'll probably have a role here if I want to move away from the technical end and into vendor / service management & project management. Haven't made up my mind about that yet! ;o)

Lazarus439

It's not only the cost of using "their servers" vice "your own servers" - what will it cost to get pipes big enough to provide acceptable response times when someone needs a file that's, well, "elsewhere"? With internal LAN speeds routinely at 100 Mbps and 1 Gbps becoming less rare, even a couple of T1 lines between you and your data is likely not going to cut it. What will the MRC of a T3/DS-3, or maybe even an OC3, do to the cost equation?
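The gap the comment points at is simple arithmetic. Using nominal line rates (100 Mbps for the LAN, 1.544 Mbps for a T1, 44.736 Mbps for a T3/DS-3) and ignoring protocol overhead:

```python
def transfer_seconds(size_megabytes, link_megabits_per_sec):
    """Time to move a file over a link, ignoring protocol overhead."""
    return size_megabytes * 8 / link_megabits_per_sec

SIZE = 100  # a 100 MB file
print(round(transfer_seconds(SIZE, 100), 1))     # 8.0   -- 100 Mbps LAN
print(round(transfer_seconds(SIZE, 1.544), 1))   # 518.1 -- a single T1
print(round(transfer_seconds(SIZE, 44.736), 1))  # 17.9  -- a T3/DS-3
```

A fetch that takes seconds on the LAN takes most of nine minutes over a lone T1, which is why the recurring cost of fatter pipes belongs in any hosted-data-center equation.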

Deadly Ernest

with all data stored internally, more or less as an easier way to access the company server farm. But as to outside storage, better slow down real quick. As to where it's stored, another thread included news reports about a recent glitch with access to Google, as a lot of their traffic was being redirected through their China server from the US - that was a real WHAT THE BLOODY HELL?? moment for me when I read it. US services redirected through Google in CHINA? Wow.

Deadly Ernest

your case in terms they can understand and relate to.

masonm

Which I think is at least partly the result of having so many IT managers who have NO background in IT. Especially at the director level, most IT managers come from a management background and not IT (at least this has been true in my experience). I think this is often why the short-term "bottom line" takes precedence over the long-term stability of the network. The irony is that the stability of the network will affect the "bottom line" far more than the short-term savings they may or may not see.

Deadly Ernest

the aspect of downtime - what concerns me is the number of managers focussed on a bottom line only who do NOT consider the possibility of a connection going down.

Deadly Ernest

useful solution I've seen applied is that the relevant personal information does NOT leave the country without a specified signed approval from the person concerned being on file beforehand. When I was with immigration, the whole Privacy Act circus had just started, and it was quickly resolved for many by the establishment of a policy whereby all applications had to be lodged at the embassy by hand or post - thus they came to the department on Australian soil and were only covered by Australian laws. Yeah, a cop-out, but it worked.

biancaluna

Actually, a lot of states in Oz do not have separate privacy laws or acts that are not superseded by the Commonwealth act. They may have record-keeping acts, but that is a totally different matter. Also, not every agency (body, organisation) is subject to the Commonwealth Act with the NPPs. I've just been through a whole review with respect to a cloud computing service; it is a nightmare. So what you are saying is not quite covering the issues. You have not touched on NPP2 and NPP9, which mention consent. The record-keeping acts are much more explicit than the privacy act, and as such we must not confuse these and the impact they have on cloud computing (whether data centres or services or mail) decisions. Even if an organisation is commercial but was established for a public purpose, you must comply with those acts. Within the research arena, what you say is also not completely covering the complexities; on the contrary, a lot of research is funded by commercial institutions, govt, and defence. There are standards far beyond the privacy sphere, and the Commonwealth act is hopelessly inadequate. Interesting stuff, eh? What if an agency operates internationally? Mine does, my word!

Deadly Ernest

legal mix. Here in Australia the major privacy law you need to worry about is that of the state in which you operate and collect personal information, except for Commonwealth government departments, as they have to worry about the federal privacy laws. There are differences between state privacy laws, but the essentials are the same. Another bite-you-in-the-bum aspect: if your organisation operates across state borders and collects information in two states, you have to either store it in two states or make sure your system meets both states' laws.

One basic of the privacy laws in all states is that the information MUST be kept safely secured and NOT made available to anybody for any reason except for the purpose for which it was gathered, or by court order or other legislated requirement. The company officers are held responsible for the company's actions in this, and the critical part is that your storage is required, by law, to restrict access to only those with a need to know. And this covers personal information on clients and staff and anyone else you collect it on - even stuff related to survey data if it can identify individuals. In that regard, the moment you store the private data outside your organisation, you can no longer be sure it is NOT improperly accessed. The further away the storage, the harder it is to ensure proper security.

As to corporate data, as long as it is not data covered by the privacy act or other legislation, that is purely an internal company matter. For example: a senior Civil and Civic engineer sending the specs of a civilian contract bid to an external mail account, to have easy access elsewhere to review them while away from home, is a totally internal issue. The same engineer doing the same thing with the specs of a new Dept of Defence security centre could find themselves and the company in hot water for breach of national security if found out.

As to a company's data security, someone in the organisation should be checking all the laws that apply to their data and conducting a full security audit every year, to ensure they're compliant. After that, any access matters are internal for the company to decide.

One interesting aspect about techs and contractors: you hire an outside tech, and they are in a position to check company data and look up the company expansion plans out of idle interest. That's a purely internal matter and of no legal concern to anyone else. BUT, should that same tech look up the home address or home phone number of one member of staff or a client, bingo, a major breach of privacy laws. This is an aspect not often looked at, so you need to consider how the tech staff access the data as well. At one place I know of, only a few senior technical staff are allowed near certain data storage systems. If they need to get outside help, the discs with the live data are removed and others with test data put in. Remember, the biggest risk in any security system is your staff, be it deliberate or accidental.

edit to fix typos - I think I got them all.

Deadly Ernest

technical or security talk as much as legality and money. A few cost figures on how much is lost in productivity if the Internet connection to the US goes down for an hour can scare the pants off a bean-counter-mentality manager more than a tonne of papers on security.

masonm

My brother is the "IT Guy" for a small start-up that is providing remote managed services for private medical practices so they can get in compliance with the electronic records regulations. They will be a primarily local service (within 100 miles, which for Oklahoma is local) and will provide redundancy, data backup, management, etc. for the practices. Just curious if your employer would see something like that, a local company, as an option rather than a large "remote" company. As far as security goes, it's just like I used to tell the banks I worked for when I was in the consulting arena: I am your greatest security risk. It's strange how many businesses don't seem to understand that all the encrypted VPN, IDS, and firewall equipment in the world doesn't matter if someone inside is the threat. When I quit consulting, I actually sat down with the banks I worked for and talked to them about security for whoever they got to replace me. They gave me WAY more information (passwords etc.) than they should have. I hope they took me seriously and put some policies in place to limit what information their consultants had access to.

dcolbert

With burstable fiber rolling out throughout the country, we're going to see T3/DS-3s going the way of ISDN. The infrastructure for incredibly increased communications is rolling out even as we speak. The problem is that inevitably, between you and your cloud-hosted data center, you're dependent on several different layers of service provider - and any guy with a backhoe can put an impenetrable barrier between you and your data with just a few careless shovels-full of earth. How long that fiber cut puts you out of business could be anywhere from hours to days - but there is absolutely *nothing* you can do about it.

Business leaders do not understand this. They haven't gotten that yet. When you tell them this, they kind of glaze over and get that "far away" look that they're prone to get when you've told them the annoying loophole in their plans to save hundreds of thousands of dollars.

"I can't get to my WENIS report? What do you mean? I NEED my WENIS report! It is month end and without my last 4 WENIS reports I can't compile my MENIS!" Yeah, um, only you put all of your WENIS reports in the cloud, and AT&T is having a site outage that is affecting all of our network connectivity, and we don't have an alternate pipe as that would be prohibitively expensive, so, thanks for playing, but your MENIS is going to be late this month, sorry... bye-bye. "Well, you're I.T.! You need to FIX this"... Um... yeah... what part of "AT&T is having a site outage" did you not understand?!? I'll call them right up and tell them they better get cracking or they'll get a strong "what-have-you" from us, and we'll see how that goes.

By the time you figure out all the ways to increase redundancy to avoid these kinds of situations, guess what... it was cheaper to keep your data center ON-SITE and pay a bunch of propeller-heads to keep it running along. Increased external network traffic means a bigger pipe, which costs more monthly. Because all of your critical data is now offsite, that pipe is mission critical, so you need REDUNDANT pipes. Now you're talking a LOT of extra expense. When you find that your redundant pipe through two different telcos still relies on a single, common trunk, then what? Store those WENIS files locally just in case. Only, didn't we OUTSOURCE so that we wouldn't NEED to store locally?!? Well... that seems kinda stupid...

Mark my words. This happens *every* time some new variation of "thin client" computing comes along. It'll cost more, TCO, over the life of the business model - for almost ANY business model you can imagine.

Deadly Ernest

articles by the big boys. Even at places like TR, much of what you see will be commentary on the available products, not instructions on how to build your own. And that's because all most people do is choose which item for sale suits them, given speed of delivery and pressure to act.

toysarefun

Of course, our intranets and inside networks are still so crude, stifling real creativity, with power-monger CEOs and managers who don't trust employees. Still, building your own mini-datacenter should be as simple as one, two, three, but the industry tries to tell you no: buy into third-party software products, or give up the goat altogether and go all-in with the big boys who produce feature-rich software, storage, and the whole farm.
