Why serverless computing is not ready for prime time—yet

Some surprising ideas make too much sense not to succeed. Take Reese’s Peanut Butter Cups. H.B. Reese went through a few years of experimenting and tweaking before he created the innovative delectable (90 years ago this year, incidentally) and saw it become the top candy brand in America.

And so it is with serverless computing—a concept that has mouth-watering appeal for business and IT leaders (It simplifies! It saves resources!), but whose supporting technology isn’t yet fully baked.

For those who have yet to catch wind of the serverless computing trend, it enables developers to do what they love most—write code—without having to be concerned with underlying infrastructure considerations such as the number of servers or the amount of storage. They merely upload code snippets to a serverless computing platform maintained and run by a cloud provider, which executes the functions only when needed (scaling to match the volume of events) and charges only for the time the code actually runs.
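The deliverable in this model can be as small as a single function. As a minimal sketch, an AWS Lambda-style handler in Python might look like the following (the event shape and field names here are illustrative, not a specific service contract):

```python
import json

def handler(event, context):
    """A minimal FaaS-style handler. The platform invokes this function
    only when an event arrives, and the developer never sees the server
    it runs on. `event` carries the request payload; `context` carries
    runtime metadata supplied by the platform."""
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```

The entire "deployment" consists of uploading this function; scaling, routing, and server provisioning are the provider's problem.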

Voilà! Organizations no longer have to pay for a fixed number of servers, and they are spared the chore of capacity planning and other aspects of managing, provisioning, and maintaining servers when deploying code. (The usual caveat here when talking about serverless computing: “Serverless” is a bit of a misnomer, since the applications, after all, still do run on servers.)

However, emerging platforms take time to mature—and require mature tools to manage them. Here's why you might want to put a pin in serverless for now.


Containerize your excitement

Serverless computing (a.k.a. function as a service, or FaaS) is just a few years old, born when Amazon Web Services launched Lambda at the 2014 AWS re:Invent conference. Microsoft Azure and Google Cloud have since announced their own FaaS offerings.

In many ways, serverless computing is the next logical step in the containerization craze, which has dispensed with traditional hypervisor-based virtualization in favor of lightweight bundles that insulate applications from the underlying infrastructure, such as differences between OS distributions.

Serverless eliminates key problems in today’s containerized and platform-as-a-service (PaaS) environments. While these systems have allowed enterprises to better focus on developing and deploying applications instead of managing and maintaining infrastructure, they come with pain points. 

PaaS has proved effective for relatively simple applications, ones that don’t have multiple components interacting with each other, but has had trouble with large, enterprise-grade apps. FaaS aims to solve this problem by enforcing microservices rather than simply encouraging them.

There is an air of inevitability around serverless computing because its main value proposition—less overhead in the application delivery process—is so strong at a time when companies must deliver new features to customers as rapidly as possible to stay competitive, while also watching operating costs.

Lock-in looms

It’s true that serverless computing could deepen a company’s lock-in to proprietary cloud environments such as AWS Lambda, compared with more infrastructure-agnostic platforms such as Cloud Foundry or OpenShift. But customer concerns over lock-in have eased across the industry as digitally native trendsetters such as Netflix and Airbnb focus on content over technology.

Serverless computing in all likelihood will become the ultimate solution for running microservices—the increasingly prevalent architecture that breaks large, monolithic applications into multiple loosely coupled services that are easier and faster to develop, test, and run.

Notice my use of the future tense. There’s a big “but” with serverless computing, and it’s that the available solutions haven’t yet caught up to the promise. Shortcomings include limitations in memory and CPU capacity and other bugaboos such as a short runway to timeouts. If code takes too long to run, some serverless platforms will simply kill it.

Emerging platforms that last take time

These kinds of issues are not unusual for emerging platforms. In the early days of PaaS, there were severe restrictions and API lock-in as well. Only as time and competition allowed were these restrictions slowly lifted.

Serverless computing offers enormous opportunities for a new wave of features from cloud providers, startups, and the open-source community (the Serverless Framework and OpenFaaS being two examples).

Serverless computing is a win-win for organizations. It promises a sort of nirvana where developers can do what they’re wired for—write code without having to worry about how it scales and deploys—while the operations team uses the new serverless platforms to pursue its mission of keeping applications unbreakable and secure.

It will happen—all that’s needed are better tools.

Share your experiences with managing serverless. Is your organization going all in? How will you cope?
