The energy crisis has made cost critical for consumers and businesses alike. Amidst the economic downturn, 81% of IT leaders say their C-suite has reduced or frozen cloud spending.
Every company today faces the imperative to modernize. For energy and utilities companies, operational resiliency across business functions, technology and service delivery has never been more important than it is today. To compete, or simply to survive, they must embrace hyper-digitized business capabilities that allow flexible work for critical operations. That means leveraging IoT, advanced analytics and orchestration platforms.
SEE: Hiring Kit: Cloud Engineer (TechRepublic Premium)
Artificial intelligence in particular will prove one of the most transformative technologies used in conjunction with the cloud. Companies that successfully leverage AI will gain an edge not only in their ability to innovate and remain competitive, but also in conserving power, becoming greener and reducing costs amid economic uncertainty.
AI in an energy-constrained crisis
Although some think AI is overhyped, the technology is built into almost every product and service we use; the smartphone and voice assistants are prime examples. AI is having a dramatic effect across industries and product types, speeding up the discovery of new chemical compounds that yield better materials, fuels, pesticides and other products with characteristics better for the environment.
AI can help monitor and control data center computing resources, including server utilization and energy consumption. Manufacturing floor equipment and processes also can be monitored and controlled by AI to optimize energy consumption while minimizing costs.
AI is being used in a similar manner to monitor and control cities, buildings and traffic routes. AI has given us more energy-efficient buildings, cut fuel consumption and planned safer routes for maritime shipping. In the years ahead, AI could help turn nuclear fusion into a reliably cheap and abundant carbon-neutral source of energy, providing another way to battle climate change.
Power grids also can benefit from AI. To operate a grid, you must balance demand and supply, and software is helping large grid operators monitor and manage load increases between areas of varying energy needs, such as highly industrialized urban areas versus sparsely populated rural areas.
SEE: Artificial Intelligence Ethics Policy (TechRepublic Premium)
Harnessing AI adds the layer of intelligence needed to adjust the power grid quickly and appropriately enough to prevent failures. Ahead of a heatwave or natural disaster, AI is already being used to anticipate electricity demand and orchestrate residential battery storage capacity to avoid blackouts.
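The battery-orchestration idea can be reduced to a toy sketch. The function below assumes a single regional grid limit and one shared battery pool (all figures are illustrative, not drawn from any real grid): given an hourly demand forecast, it discharges stored capacity whenever forecast demand exceeds what the grid can supply, and reports any remaining shortfall.

```python
def dispatch_batteries(forecast_mw, grid_capacity_mw, battery_mwh):
    """For each forecast interval, discharge stored battery capacity to
    cover demand above what the grid can supply; return the shortfall
    (unserved demand) per interval once storage is exhausted."""
    shortfalls = []
    remaining = battery_mwh
    for demand in forecast_mw:
        excess = max(0.0, demand - grid_capacity_mw)   # demand the grid can't meet
        covered = min(excess, remaining)               # draw down the battery pool
        remaining -= covered
        shortfalls.append(excess - covered)
    return shortfalls

# Heatwave forecast peaks above a 100 MW grid limit; 50 MWh of storage
# absorbs the first peaks, leaving a 5 MW shortfall at the worst hour.
print(dispatch_batteries([80, 110, 120, 125, 90], 100.0, 50.0))
# [0.0, 0.0, 0.0, 5.0, 0.0]
```

A real dispatcher would of course fold in AI-driven demand forecasts, charge scheduling and per-household battery state, but the balancing step it automates looks like this.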
To intelligently leverage AI and reduce compute resources when unneeded, you need automation by way of cloud-native platforms like Kubernetes, which already streamlines deployment and management of containerized cloud-native applications at scale to reduce operational costs. In the context of a power grid or a data center, although Kubernetes doesn’t inherently solve growing demand for data or power, it can help optimize resources.
Kubernetes is an ideal match for AI
In a worst-case scenario where the U.K. runs out of energy to power grids or data centers, Kubernetes automatically grows or shrinks compute capacity in the right place at the right time, based on what is needed. That is far more efficient than a human placing workloads on servers, which inevitably incurs waste. Combined with AI, the potential for optimizing power and cost is staggering.
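The scaling rule behind this is simple enough to sketch. Kubernetes' Horizontal Pod Autoscaler sizes a workload as desired = ceil(currentReplicas × currentMetric / targetMetric), clamped to configured bounds; the Python model below mirrors that core formula with illustrative utilization figures (the real controller also applies a tolerance and stabilization windows).

```python
import math

def desired_replicas(current_replicas: int,
                     current_utilization_pct: float,
                     target_utilization_pct: float,
                     min_replicas: int = 1,
                     max_replicas: int = 10) -> int:
    """Horizontal Pod Autoscaler core rule:
    desired = ceil(current * currentMetric / targetMetric),
    clamped to the configured replica bounds."""
    ratio = current_utilization_pct / target_utilization_pct
    desired = math.ceil(current_replicas * ratio)
    return max(min_replicas, min(max_replicas, desired))

# Demand spikes: 4 pods at 90% CPU against a 60% target -> scale out to 6.
print(desired_replicas(4, 90, 60))  # 6
# Demand falls: 6 pods at 20% CPU -> scale in to 2, saving power and cost.
print(desired_replicas(6, 20, 60))  # 2
```

The scale-in case is where the energy savings come from: idle replicas are reclaimed automatically instead of waiting for an operator to notice them.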
AI/ML workloads are taxing to run, and Kubernetes is a natural fit because it can scale to meet the resource needs of AI/ML training and production workloads, enabling continuous development of models. It also lets you share expensive and limited resources such as graphics processing units (GPUs) between developers to speed up development and lower costs.
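GPU sharing in practice comes down to declaring the accelerator as a schedulable resource, so the scheduler queues training jobs against a shared pool instead of leaving dedicated cards idle. Below is a sketch of such a pod manifest, expressed as the plain Python dict Kubernetes accepts as JSON/YAML; the job name and image are hypothetical, while `nvidia.com/gpu` is the extended resource name exposed by the NVIDIA device plugin.

```python
# A minimal pod spec requesting exactly one GPU. The scheduler will place it
# only on a node with a free GPU; GPU limits must be whole devices.
train_pod = {
    "apiVersion": "v1",
    "kind": "Pod",
    "metadata": {"name": "train-job"},                  # hypothetical name
    "spec": {
        "restartPolicy": "Never",
        "containers": [{
            "name": "trainer",
            "image": "example.com/ml-trainer:latest",   # hypothetical image
            "resources": {
                # Extended resource advertised by the NVIDIA device plugin.
                "limits": {"nvidia.com/gpu": "1"},
            },
        }],
    },
}
```

Because each job declares its GPU needs up front, finished jobs release the card back to the pool automatically, which is how Kubernetes turns a scarce resource into a shared one.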
Equally, it gives enterprises the agility to deploy AI/ML operations across disparate infrastructure in a variety of environments, whether public clouds, private clouds or on-premises. This allows deployments to be changed or migrated without incurring excess cost. Whatever components a business has running, whether microservices, data services or AI/ML pipelines, Kubernetes lets you run them all from a single platform.
The fact that Kubernetes is an open source, cloud-native platform makes it easy to apply cloud-native best practices and take advantage of continuous open-source innovation. Many modern AI/ML technologies are open source as well and come with native Kubernetes integration.
Overcoming the skills gap
The downside to Kubernetes is that the energy sector, like every other sector, faces a Kubernetes skills gap. In a recent survey, 56% of energy recruiters described an aging workforce and insufficient training as their biggest challenges.
Because Kubernetes is complex and unlike traditional IT environments, most organizations lack the DevOps skills needed for Kubernetes management. Likewise, a majority of AI projects fail because of complexity and skills issues.
ESG Research found that 67% of respondents are looking to hire IT generalists over IT specialists, a trend that raises concerns about the future of application development and deployment. To overcome the skills gap, energy and utilities organizations can devote time and resources to upskilling DevOps staff through dedicated expert training. Training, combined with platform automation and simplified user interfaces, can help DevOps teams master Kubernetes management.
Spend now to prosper later
Cost cutting is unavoidable for many companies today, including energy providers. But even in downturns, CIOs should balance technology investment spending with improved business outcomes, competitive demands and profitability that come from adopting cloud-native, Kubernetes, AI and edge technologies.
Gartner’s latest forecast projects worldwide IT spending will grow only 3% to $4.5 trillion in 2022 as IT leaders become more deliberate about investments. For long-term efficiency and cost savings on IT infrastructure, they would do well to invest in cloud-native platforms, which Gartner included in its annual Top Strategic Technology Trends report for 2022.
As Gartner distinguished vice president Milind Govekar put it: “There is no business strategy without a cloud strategy.”
Cutting back on cloud-native IT modernization initiatives might save money in the short term, but could seriously hurt long-term capabilities for innovation, growth and profitability.
Tobi Knaup is the CEO at D2iQ.