Watch out for serverless computing’s blind spot


Serverless computing is an increasingly popular model of public cloud computing: You no longer have to provision virtual servers in the cloud; that's done automatically to meet the exact needs of your application.

Although the value of serverless computing is not in dispute, it’s my job to find potential downsides in new technologies so that my clients—and you—can avoid them. In the case of serverless computing, we may find that cloud architecture as a discipline suffers. Here’s why.

When building applications for server-oriented architectures (where virtual servers, including storage and compute, must be provisioned), you have built-in policies around the use of resources, including the virtual servers themselves. After all, you have to provision servers before your workloads can access them. That means you're well aware that they're there, that they cost money, and that they're configured for your workloads.

The serverless approach means you get what you need when you need it, which exempts the cloud architect from thinking critically about the resources the application will require. There's no need for server sizing, and as a result budgets become a gray area: You're in a world where resources simply materialize behind a function call.
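To make the contrast concrete, here is a minimal toy sketch, not any real cloud API (the names `ProvisionedServer` and `serverless_invoke` are invented for illustration). It models the difference described above: with provisioned servers, cost is a number the architect sees before deployment; with serverless, spend only exists after the fact, accumulating one invocation at a time.

```python
class ProvisionedServer:
    """Server-oriented model: capacity is chosen up front, so cost is visible."""

    def __init__(self, vcpus: int, memory_gb: int, hourly_rate: float):
        self.vcpus = vcpus
        self.memory_gb = memory_gb
        self.hourly_rate = hourly_rate  # the architect sees this before deploying

    def monthly_cost(self, hours: int = 730) -> float:
        # Fixed and predictable: the budget exists before a single request runs.
        return self.hourly_rate * hours


def serverless_invoke(fn, *args, ledger, rate_per_call=0.0000002):
    """Serverless model: resources appear per call; spend is a side effect."""
    ledger.append(rate_per_call)  # nobody sized anything up front
    return fn(*args)


# Server-oriented: the monthly spend is known in advance.
server = ProvisionedServer(vcpus=4, memory_gb=16, hourly_rate=0.17)
print(f"planned monthly spend: ${server.monthly_cost():.2f}")

# Serverless: the spend is only discoverable after the calls have happened.
ledger = []
for n in range(1000):
    serverless_invoke(lambda x: x * x, n, ledger=ledger)
print(f"spend discovered afterward: ${sum(ledger):.6f}")
```

The per-call rate here is arbitrary; the point is structural. In the first model the cost question is forced on the architect at design time, while in the second it can only be answered by auditing the ledger after the workload has already run.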