Consumerization is one of the big words in IT these days. Technology trends have historically started with businesses and only later expanded into the general consumer market. This was true for everything from calculators and mobile phones to personal computers, for largely one reason: cost. The high cost of developing new technologies led to high price tags, which meant that only businesses could afford them at first. As the technology matured and prices dropped, it could then be offered to consumers at a reasonable price.
While thinking about consumerization usually involves hardware-related issues – inevitably leading us down the path of Bring Your Own Device (BYOD) – the fact is that this trend is much more present in the services and applications market. Social networks, for instance, are entirely consumer-driven, and yet there isn’t a single company in the world that isn’t worried about its reputation on social media. Even mapping services today are focused much more on the consumer than on business applications.
For many of these services, businesses are only a small part of the ecosystem. But how and why have companies, which traditionally footed the bill for the development of new technologies, been relegated to a secondary role?
Cost and speed
The development costs of new technology, be it in the form of hardware, software, or services, were historically very high. This had two main consequences: first, very large investments were necessary if a company wanted to develop anything new; second, in order to recover these investments, the resulting products needed to be sold at high prices. The former meant that only large companies – or whoever was backed by them – were able to develop new things, and the latter meant that only other large companies could purchase the newest technology.
While this remains somewhat true for the hardware market, where physical objects still need to be manufactured, several revolutions upended this relationship for applications and services. Skipping over the rise of the PC, which created the consumer market for applications, and going to the appearance of the World Wide Web, which allowed for easier distribution of these applications and services, it is easy to see how much development costs have dropped: developer tools that used to cost hundreds or thousands of dollars can now be had for free by any small company, and anyone can add an application to an app store and instantly start selling to millions of users.
Speed is another important factor. If your application needs to be customized to run in each customer’s environment, acquiring customers takes a long time. And the longer it takes to acquire customers, the longer it takes to recover your investment – unless you charge higher prices. With today’s distribution channels, however, anyone can create a useful app or service and market it to millions of people all over the world. Since distribution is so cheap, the price can be very low. Today’s channels allow developers to pursue a “high-volume, low-margin” strategy unlike ever before.
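The trade-off between the two strategies can be shown with a back-of-the-envelope calculation. All of the figures below are invented purely for illustration:

```python
# Illustrative (invented) figures: recovering a fixed development cost
# under a traditional enterprise model versus a consumer app-store model.
dev_cost = 100_000  # hypothetical up-front development cost, in dollars

# Enterprise model: few customers, high price, slow sales cycle.
enterprise_price = 5_000               # hypothetical per-customer license fee
enterprise_customers_per_year = 10     # long, customized sales cycles
years_to_break_even_enterprise = dev_cost / (
    enterprise_price * enterprise_customers_per_year
)

# Consumer model: tiny price, cheap distribution, huge volume.
consumer_price = 1.0                   # hypothetical app-store price
consumer_downloads_per_year = 200_000  # reach of a global store
years_to_break_even_consumer = dev_cost / (
    consumer_price * consumer_downloads_per_year
)

print(years_to_break_even_enterprise)  # 2.0 (years)
print(years_to_break_even_consumer)    # 0.5 (years)
```

With these made-up numbers, the low-margin consumer product recoups the same investment four times faster, which is the core of the “high-volume, low-margin” argument.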
The role of the cloud
Cloud computing is another big enabler of the consumerization of applications. While it is great to be able to distribute your application to millions of people, distribution will be useless unless you are prepared to handle the loads that may come from it. Before the cloud, a company would need to make a huge investment in server capacity to handle this. Huge investments mean getting someone to pay the bill (usually investors).
With the dynamic elasticity of the cloud, however, this initial investment is no longer necessary. This means that more people can develop more applications, increasing the chance that consumers find useful applications or services that they then take back to their business lives. Cloud-based applications and services are a side of cloud computing that is often overlooked, but they are the elements consumers take up most quickly.
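The difference between provisioning for peak load up front and paying for elastic capacity as it is used can also be sketched numerically. Again, every figure here is invented for illustration:

```python
# Illustrative (invented) numbers: buying servers for peak load up front
# versus renting elastic capacity per server-hour in the cloud.
peak_servers = 50            # capacity needed to survive a launch-day spike
server_purchase_cost = 2_000 # hypothetical cost to buy one server outright
hourly_rental = 0.10         # hypothetical cloud price per server-hour

# Fixed provisioning: the whole bill is due before a single user shows up.
up_front_cost = peak_servers * server_purchase_cost

# Elastic provisioning: a typical month of 5 servers around the clock,
# plus one 24-hour spike that briefly needs the full peak capacity.
baseline_hours = 5 * 24 * 30
spike_hours = peak_servers * 24
monthly_cloud_cost = (baseline_hours + spike_hours) * hourly_rental

print(up_front_cost)       # 100000 (dollars, before launch)
print(monthly_cloud_cost)  # 480.0 (dollars, paid as you go)
```

The point is not the specific numbers but the shape of the cost curve: the elastic model turns a large up-front investment into a small recurring expense that scales with actual demand.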
Cloud computing has created a “Bring Your Own App” movement (rather than “Bring Your Own Device”), in which IT is forced to support the multitude of applications that business users rely on to get their work done. Perhaps even more importantly, the rise and fall of applications and services are increasingly in the hands of the consumer market, so IT may not even have a choice about what it must support.