In an age of on-demand computing, firms that buy their own hardware are sometimes portrayed as wasting money.
The argument goes that, as the price of cloud computing falls, businesses should be dumping their servers in pursuit of "datacentre zero".
This month Andy Caddy, CIO of Virgin Active, told the Interop conference in London that, in many cases, CIOs buying IT infrastructure were saddling the business with "instant legacy".
"The problem one faces as a CIO is that we're solving software problems quicker than hardware goes out of date," he said, referring to the prodigious rate at which business-oriented, cloud-based services are springing up.
The cost of these services and the cloud infrastructure they're based upon is falling, and will continue to do so as cloud providers pass on more of the savings that the scale and automation of their datacentres make possible.
Rob Fraser, CTO for cloud services at Microsoft UK, agrees that the price of cloud services will eventually fall to a point where it becomes near impossible for firms to compete with the public cloud on the cost of computing.
"Fundamentally, from an economic point of view, there must come a point at which the cost per unit of compute, unit of storage, unit of analysis becomes hard to compete with the scale of public cloud. Economically look at all the forces of commoditisation and that point will have to occur," he said.
But to believe that businesses will move wholesale to the cloud on the basis of cost alone, he said, is to ignore a swathe of issues beyond price.
"There are still going to be hugely valid reasons why on-premise infrastructure needs to run its own level of scale, even if it might be more pricey, because of other issues around the business."
Fraser points to regulatory restrictions, such as those that require data to reside within certain countries or continents.
Others point out the surrender of control that comes with cloud services: over who has access to your data, over the long-term availability of the service, over whether your data will be accessible when you need it.
That loss of control even extends to certainty over how systems will work. The technologies underpinning cloud services are frequently opaque, and that opacity can leave users subject to the whims of a third party. One development house found that, ahead of an expected spike in users for its application, it had to notify Amazon to "warm up" the Elastic Load Balancers that distribute its traffic - a requirement it discovered only after the balancers failed.
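One common defence against this kind of opacity is to assume upstream failures will happen and retry with backoff on the client side. The sketch below is a hypothetical illustration of that pattern, not anything the development house in question described; the function names and delay values are invented for the example:

```python
import random
import time

def retry_with_backoff(func, max_attempts=5, base_delay=0.1):
    """Call func(), retrying on ConnectionError with exponential backoff.

    A typical client-side hedge against an upstream component (such as a
    load balancer that has not yet scaled up) dropping requests.
    """
    for attempt in range(max_attempts):
        try:
            return func()
        except ConnectionError:
            if attempt == max_attempts - 1:
                raise  # give up after the final attempt
            # Sleep 0.1s, 0.2s, 0.4s, ... with jitter to avoid
            # many clients retrying in lockstep.
            time.sleep(base_delay * (2 ** attempt) * (1 + random.random()))

# Example: a flaky call that fails twice, then succeeds.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("upstream not yet scaled")
    return "ok"

print(retry_with_backoff(flaky))
```

The jitter matters as much as the backoff: without it, every client that saw the same failure retries at the same moment, recreating the original spike.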
Companies that have moved wholesale to the cloud, such as the AWS-hosted video-on-demand service Netflix, have engineered around some of this uncertainty by building architectures that can handle failures without disrupting service to the end user - regularly inducing failures to check how their systems cope.
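The idea of deliberately inducing failures can be illustrated with a toy harness. This is a minimal sketch of the general technique, not Netflix's actual tooling; the service names, `failure_rate` parameter, and fallback logic are invented for the example:

```python
import random

class FlakyService:
    """A toy service wrapper that randomly injects failures, in the spirit
    of deliberately killing components to prove the wider system copes."""

    def __init__(self, name, failure_rate=0.3, seed=None):
        self.name = name
        self.failure_rate = failure_rate
        self.rng = random.Random(seed)  # seeded for reproducible runs

    def call(self):
        if self.rng.random() < self.failure_rate:
            raise RuntimeError(f"{self.name} killed by failure injection")
        return f"{self.name}: ok"

def resilient_fetch(primary, fallback):
    """Degrade gracefully: try the primary service, fall back on failure,
    so the end user never sees the injected fault."""
    try:
        return primary.call()
    except RuntimeError:
        return fallback.call()

# Even with the primary failing half the time, every request succeeds.
primary = FlakyService("recommendations", failure_rate=0.5, seed=1)
fallback = FlakyService("static-top-10", failure_rate=0.0)
results = [resilient_fetch(primary, fallback) for _ in range(5)]
print(results)
```

Running the injection constantly, rather than only in test environments, is what turns this from a unit test into the kind of continuous verification the article describes.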
Ultimately, Microsoft's Fraser said that proponents of "datacentre zero" are elevating price above the complex mix of competing needs that many businesses have.
"There's never going to be a case of saying - it's always going to be cheaper in public cloud therefore everything sits in the cloud. There's going to be cases where this hybrid estate is going to be necessary and likely."
Nick Heath is chief reporter for TechRepublic. He writes about the technology that IT decision makers need to know about, and the latest happenings in the European tech scene.