A recurring argument about cloud application development suggests that “it’s someone else’s computer.” While there’s a kernel of truth in there, especially if you’re only lifting and shifting code from an on-premises datacenter to an infrastructure-as-a-service virtual machine, it’s an aphorism that ignores the economic benefits that come from adopting newer, cloud-first development models. Well-designed cloud apps can process hundreds of thousands of transactions for a few cents, operating costs that are hard to match in even the best-run on-premises datacenters.
Microsoft has been investing heavily in serverless computing models, with the new Service Fabric Mesh and its existing Azure Functions. They’re key elements of any new cloud development, especially if you need to respond to events sourced from other applications. Azure also includes tools for managing those events, among them Event Grid, the publish-and-subscribe framework that Microsoft uses to marshal and direct events to serverless microservices running as Azure Functions (or to its serverless-hosted, low-code Logic Apps and Azure Flows).
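The publish-and-subscribe pattern behind that routing is straightforward: handlers register an interest in an event type, and a broker delivers each published event to every matching subscriber. The sketch below illustrates the pattern in plain Python; the `EventBroker` class and its method names are illustrative, not the Event Grid API.

```python
# Minimal publish-and-subscribe sketch: subscribers register a handler for
# an event type, and the broker routes each published event to every
# matching subscriber. Illustrative only; not the Event Grid API.
from collections import defaultdict
from typing import Callable, Dict, List

class EventBroker:
    def __init__(self) -> None:
        self._subs: Dict[str, List[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, event_type: str, handler: Callable[[dict], None]) -> None:
        # Register a handler for one event type.
        self._subs[event_type].append(handler)

    def publish(self, event_type: str, payload: dict) -> int:
        # Deliver the payload to every subscriber of this event type.
        handlers = self._subs.get(event_type, [])
        for handler in handlers:
            handler(payload)
        return len(handlers)  # how many subscribers received the event

broker = EventBroker()
received = []
broker.subscribe("blob.created", lambda e: received.append(e["url"]))
delivered = broker.publish("blob.created", {"url": "https://example.blob/img.png"})
```

In a managed service like Event Grid the broker side is handled for you; your code only supplies the handlers.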
It’s easy for Microsoft to source events from its own first-party services. Drop a picture in Azure Blob Storage, and an Azure Function can fire and process the image through Azure’s Cognitive Services or any other API. But things get harder when you’re working with third-party event sources, either in your own code or from other applications and services. You could use a webhook to deliver an event, but that requires code that can handle asynchronous callbacks, with compute resources required to monitor the callback URL for events.
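The first-party case works because the function knows exactly what shape the event will take. A blob-triggered handler does little more than parse the envelope, check the event type, and pull out the blob URL, as in this sketch. The field names follow Event Grid’s event schema for storage events; the sample event body is hand-written for illustration, not captured from a live subscription.

```python
# Sketch of what a function does with a BlobCreated notification:
# parse the envelope, confirm the event type, extract the blob URL.
# Field names follow the Event Grid storage event schema; the sample
# payload below is an illustration, not a captured event.
import json
from typing import Optional

def extract_blob_url(event_json: str) -> Optional[str]:
    event = json.loads(event_json)
    if event.get("eventType") != "Microsoft.Storage.BlobCreated":
        return None  # not an event this handler cares about
    return event["data"]["url"]

sample = json.dumps({
    "eventType": "Microsoft.Storage.BlobCreated",
    "subject": "/blobServices/default/containers/images/blobs/photo.jpg",
    "data": {"url": "https://myaccount.blob.core.windows.net/images/photo.jpg"},
})
url = extract_blob_url(sample)
```

With a third-party source, none of those field names can be assumed, which is exactly the gap a common event format would close.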
The challenge is to find a common way of describing and delivering event information. If you can deliver that, you no longer need to learn a new way of working with each new event provider that your code needs to use. Instead of custom code, you can use common libraries, and your code becomes more portable and more resilient, making it easier to switch event providers when things change.
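A common description tends to take the shape of a small envelope: a few required metadata attributes for routing, wrapped around an opaque provider-specific payload. The sketch below shows one such envelope; its attribute names (id, source, type, specversion) loosely follow the CNCF CloudEvents specification, but the class itself is an assumption for illustration, not a conformant implementation.

```python
# A sketch of a common event envelope: required routing metadata plus an
# opaque payload. Attribute names loosely follow the CNCF CloudEvents
# specification; this class is illustrative, not a conformant library.
from dataclasses import dataclass, field
from typing import Any
import json
import uuid

@dataclass
class Event:
    source: str                 # who produced the event
    type: str                   # what happened, e.g. "com.example.blob.created"
    data: Any = None            # provider-specific payload, treated as opaque
    specversion: str = "1.0"
    id: str = field(default_factory=lambda: str(uuid.uuid4()))

    def to_json(self) -> str:
        # Serialize to a provider-neutral wire format.
        return json.dumps(self.__dict__)

evt = Event(source="/storage/images", type="com.example.blob.created",
            data={"url": "https://example.blob/photo.jpg"})
round_tripped = json.loads(evt.to_json())
```

Because consumers only depend on the envelope, swapping one event provider for another changes the `data` payload, not the plumbing around it.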