Serverless computing is all the rage right now—and for several good reasons:

  • It frees you from having to provision servers yourself; you simply write functions, and the resources they need are allocated automatically (see the sketch after this list).
  • You pay only for the resources you use. No more leaving servers up and running, then getting a big cloud bill at the end of the month.
  • It can scale automatically, determining which cloud services need to scale with demand and then making that happen.

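To make the first point concrete, here is a minimal sketch of what “just write a function” looks like on AWS Lambda in Python. The handler signature is Lambda’s standard one; the function name and the event shape (an API Gateway proxy request) are illustrative assumptions, not a definitive implementation.

    import json

    def lambda_handler(event, context):
        # AWS Lambda invokes this with the triggering event and a runtime
        # context object; the platform allocates compute per invocation and
        # bills only for execution time. The event shape here (an API
        # Gateway proxy request) is an assumption for illustration.
        params = event.get("queryStringParameters") or {}
        name = params.get("name", "world")
        return {
            "statusCode": 200,
            "body": json.dumps({"message": f"Hello, {name}!"}),
        }

    if __name__ == "__main__":
        # Local smoke test; in production the platform supplies the event.
        print(lambda_handler({"queryStringParameters": {"name": "serverless"}}, None))

Note that there is no server to configure anywhere in that code: deployment consists of uploading the function, and the platform handles provisioning, scaling, and metering.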
Amazon Web Services’ Lambda and Microsoft’s Azure Functions are the best-known examples of serverless computing, and both have existed for a few years. Still, although we’ve had great success in some areas of serverless, others need work.

It’s not the technology that’s falling short with serverless computing, but how it’s used. You can’t blame this one on AWS or Microsoft—enterprise development shops simply picked the wrong applications for serverless computing.

For example, while new applications are a good fit for serverless computing, old applications often are not—and migrating them to serverless could be more work than enterprises bargained for. One big reason: serverless platforms don’t support every programming language, so a legacy application written in, say, C++ or COBOL won’t map cleanly onto platforms built around languages such as JavaScript, Python, Java, and C#.