The serverless movement seeks to advance technology with new concepts. Learn how Red Hat approaches the topic.
The old-school “one server/one function” concept prevailed for decades in the technology realm, whereby a single server stood dedicated to authentication, file, print, web, messaging, or other services.
That’s the past. The future is moving toward a serverless model in which functions (that is, applications) matter more than the servers they run on.
SEE: Special report: Prepare for serverless computing (free PDF) (TechRepublic)
TechRepublic spoke with William Markito Oliveria, senior manager of product management at Red Hat, to find out how the open source enterprise solution company leverages serverless computing technology.
The serverless movement
Scott Matteson: What is the serverless movement?
William Markito Oliveria: It’s all about focus. Developers and system administrators want to focus on what really matters and deliver value faster. To achieve this speed they have to delegate certain concerns to a platform, and serverless platforms provide many of those benefits by abstracting and simplifying the workflow of building and running applications.
Among those abstractions, the capabilities most commonly associated with the term serverless are on-demand auto-scaling (up or down) and event-driven applications.
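The on-demand auto-scaling described above is typically expressed declaratively rather than coded by hand. As a minimal sketch (assuming a cluster with Knative Serving installed; the service name and image below are hypothetical placeholders), a Knative Service can bound its own scaling with annotations:

```yaml
# Hypothetical Knative Service illustrating on-demand auto-scaling.
# Assumes Knative Serving is installed; name and image are placeholders.
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: event-processor                          # hypothetical name
spec:
  template:
    metadata:
      annotations:
        autoscaling.knative.dev/min-scale: "0"   # scale down to zero when idle
        autoscaling.knative.dev/max-scale: "10"  # cap scale-up under load
    spec:
      containers:
        - image: registry.example.com/event-processor:latest  # placeholder image
```

With min-scale set to "0", the platform removes all instances when no requests or events arrive and spins them back up when traffic hits the service again, which is the auto-scaling behavior the term serverless usually implies.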
The name itself may not be the best, but we are probably used to that in this industry already: there is no actual cloud in cloud computing, and there are still servers in serverless.
Scott Matteson: What is driving this shift in technology trends?
William Markito Oliveria: The need to adapt to frequent changes in business needs and requirements forced technology leaders to adopt modern practices such as DevOps, containers, and serverless.
For example, there was a time when you could justify application downtime for maintenance, or releasing new features three times a year, but today that is unacceptable; you had better have weekly, if not daily, rollouts that won’t impact your current users. Since serverless can optimize the whole development and deployment lifecycle, it’s been seen as one of the trending solutions to a real business need.
Serverless advantages and disadvantages
Scott Matteson: What are the advantages/disadvantages of going serverless?
William Markito Oliveria: Some of the most tangible advantages are around developer productivity and the overall agility to deliver software by focusing on what developers do best (code), not on servers and infrastructure. Other advantages are around optimizing resource utilization, for example by auto-scaling applications down, or by providing a well-defined set of standards for building event-driven systems.
Disadvantages still center on portability (or lock-in) and the local developer experience, problems that are already being addressed by initiatives like Knative.
SEE: Serverless computing: A guide for IT leaders (TechRepublic Premium)
Future of traditional server systems
Scott Matteson: What will happen to traditional server systems?
William Markito Oliveria: They will continue to be around and used for many workloads, especially stateful applications and more specialized workloads, like ultra-low-latency systems.
Scott Matteson: What products does this tie into?
William Markito Oliveria: At Red Hat, we are working to integrate our entire portfolio with the serverless strategy, top to bottom. We are blending serverless functionality into Red Hat OpenShift and building multiple integrations with the Red Hat Middleware, Management, and Monitoring portfolios.
Red Hat’s vision
Scott Matteson: What is Red Hat’s vision here?
William Markito Oliveria: We believe that we are in the middle of an evolution of serverless technologies. When it started, it was all about running small snippets of code (functions) for a very short period, triggered by events, but it has evolved to become part of the platform, blended with normal Platform as a Service (PaaS) flows.
You don’t need to rewrite everything as functions to get the serverless benefits; microservices and almost any Linux container can run as a serverless workload with OpenShift Serverless, our offering in this space based on the Knative project. Functions can still deliver a lot of developer productivity, and they are a perfect fit for many use cases, but serverless has expanded beyond that.
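The point that almost any Linux container can run as a serverless workload can be sketched with a minimal Knative Service manifest. This is a hedged illustration, not an official example: it assumes a Knative (or OpenShift Serverless) installation, and the service name and image are placeholders.

```yaml
# Minimal Knative Service: an ordinary, stateless HTTP container image
# served as a serverless workload. Name and image are placeholders.
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: hello                                  # hypothetical name
spec:
  template:
    spec:
      containers:
        - image: quay.io/example/hello:latest  # any stateless HTTP container
          ports:
            - containerPort: 8080              # port the container listens on
```

Once applied (for example with `kubectl apply -f`), Knative gives the container a routable URL, revision tracking, and scale-to-zero, without the application being rewritten as functions.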
In 2019 we already see many initiatives, including from the cloud providers, offering serverless containers and serverless databases, and extending the term with complex orchestration capabilities with minimal state. For a customer, this complexity is hidden and managed through automation, which is where initiatives like Operators can play a big part.
Scott Matteson: How does this tie into other platforms like Kubernetes?
William Markito Oliveria: An already popular saying is that ‘Kubernetes is the platform for building platforms,’ which is why a project like Knative, targeting serverless specifically for Kubernetes, was created and launched by Google, Red Hat, IBM, SAP, and others.
Kubernetes offers the basis for hybrid cloud deployments and infrastructure abstraction, allowing for portability, but it was missing some of the developer-focused APIs and the ease of use promoted by serverless; those problems are now addressed by Knative. Since OpenShift is positioned as the enterprise Kubernetes, we are bringing the expertise and experience from running Kubernetes to this space, making sure that technologies such as Knative have the same enterprise readiness and maturity as Kubernetes, and making it an integral part of our story with OpenShift Serverless.