One of the more fascinating trends of this past year was the move to conversational computing. Instead of building complex apps, businesses can use bots over services like Facebook’s Messenger or Microsoft’s Skype to simplify customer interactions.

Using bots is a technique that also works over internal chat tools like Slack and Microsoft Teams, giving rise to self-service “chatops” tools that can manage common service-desk queries. Last week, Microsoft updated its Bot Framework with tools that take existing content and use it as the basis of conversational services.

Inside Microsoft’s Bot Framework

Microsoft’s Bot Framework is designed to help you build and deploy chat-based bots across a range of services, including non-Microsoft platforms and open web and SMS gateways, with minimal coding and with tools for delivering cross-platform conversations from one bot implementation. Like many of Microsoft’s recent development tools, the Bot Framework is intended to be cross-platform and cloud-based, building on Azure services and on the company’s machine learning-powered Cognitive Services APIs.

At the heart of the Bot Framework are two SDKs, one for use with .NET and one built on the open source, cross-platform Node.js JavaScript runtime. There’s also a set of RESTful APIs for building your own code in your choice of languages. Once built and tested, bots can be registered in any of the supported channels (with their own user names and passwords), before being listed in Microsoft’s Bot Directory. You don’t need to register a bot unless you’re planning on using it with Skype. However, in practice it’s a good idea for customer-facing services because it adds a layer of discoverability.
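To give a sense of how little code is involved, here is a minimal sketch of an echo bot using the Node.js SDK’s v3-era botbuilder package. It assumes restify is used as the web server and that the app ID and password were issued when the bot was registered with a channel; both are read from environment variables here.

```javascript
// Minimal echo bot built on the Node.js Bot Builder SDK (v3-era API).
// Assumes the `botbuilder` and `restify` packages are installed, and that
// MICROSOFT_APP_ID / MICROSOFT_APP_PASSWORD come from bot registration.
var restify = require('restify');
var builder = require('botbuilder');

// Stand up a small web server to receive messages from the connector service.
var server = restify.createServer();
server.listen(process.env.PORT || 3978, function () {
    console.log('%s listening at %s', server.name, server.url);
});

// The ChatConnector bridges the bot to its registered channels (Skype, web chat, and so on).
var connector = new builder.ChatConnector({
    appId: process.env.MICROSOFT_APP_ID,
    appPassword: process.env.MICROSOFT_APP_PASSWORD
});
server.post('/api/messages', connector.listen());

// A single root dialog that simply echoes whatever the user typed.
var bot = new builder.UniversalBot(connector, function (session) {
    session.send('You said: %s', session.message.text);
});
```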

Since Microsoft unveiled the Bot Framework at its Build 2016 developer event, it has continued to regularly update the service, adding new features and functions. That includes support for the Azure Bot Service, a cloud-hosted bot development platform, as well as open-sourcing its emulators and key controls. There’s also now support for more flexible conversations, allowing users to interrupt precomposed conversation streams, using their inputs as triggers to launch new actions and jump out of a dialog.
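In the Node.js SDK, that interruption pattern is expressed with trigger actions attached to dialogs. The sketch below continues the earlier example (same connector setup) and uses hypothetical dialog names: a scripted waterfall handles an order, but typing “help” at any point jumps out of it and into the help dialog.

```javascript
// Interruptible dialogs (Bot Builder Node.js SDK, v3-era API).
// Reuses the `connector` created in the earlier sketch; dialog names are hypothetical.
var builder = require('botbuilder');
var bot = new builder.UniversalBot(connector);

// A precomposed, two-step conversation stream started by saying "order".
bot.dialog('placeOrder', [
    function (session) {
        builder.Prompts.text(session, 'What would you like to order?');
    },
    function (session, results) {
        session.endDialog('Ordering %s for you now.', results.response);
    }
]).triggerAction({ matches: /^order/i });

// A global trigger: "help" interrupts whatever dialog is running and launches this one.
bot.dialog('help', function (session) {
    session.endDialog('Say "order" to start an order, or "help" to see this message.');
}).triggerAction({ matches: /^help$/i });
```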

One useful new feature is a QnA Maker service designed to take content and turn it into a bot. It’s an ideal tool for anyone building a customer service bot because it can work with your FAQs and support documentation to deliver conversational support to users. Once it has generated pairs of questions and answers from your documentation, you can then test and train the new knowledge base bot to deliver the answers to the questions you expect to get. The resulting service can be wrapped as an API for use with Cortana Intelligence Suite or used as the back end for a bot running on the Azure Bot Service.
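A published knowledge base is exposed as a simple REST endpoint that a bot can call with a user’s question. The sketch below shows that call from Node.js; the host name, API version, path, and response shape are assumptions based on the preview-era service, and the knowledge base ID and subscription key are placeholders.

```javascript
// Hedged sketch: querying a QnA Maker knowledge base over REST.
// The endpoint shape below is an assumption from the preview-era API;
// YOUR-KB-ID and YOUR-SUBSCRIPTION-KEY are placeholders.
var https = require('https');

function askKnowledgeBase(question, callback) {
    var body = JSON.stringify({ question: question });
    var options = {
        hostname: 'westus.api.cognitive.microsoft.com',
        path: '/qnamaker/v2.0/knowledgebases/YOUR-KB-ID/generateAnswer',
        method: 'POST',
        headers: {
            'Ocp-Apim-Subscription-Key': 'YOUR-SUBSCRIPTION-KEY',
            'Content-Type': 'application/json',
            'Content-Length': Buffer.byteLength(body)
        }
    };
    var req = https.request(options, function (res) {
        var data = '';
        res.on('data', function (chunk) { data += chunk; });
        res.on('end', function () { callback(null, JSON.parse(data)); });
    });
    req.on('error', callback);
    req.write(body);
    req.end();
}

// Relay a user question to the knowledge base and log whatever answer comes back.
askKnowledgeBase('How do I reset my password?', function (err, result) {
    if (err) { return console.error(err); }
    console.log(result);
});
```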

One advantage of hosting bots on a cloud platform like Azure with the Azure Bot Service is that you can use it with serverless compute resources. If you expect a bot to respond quickly to an unknown number of users, this approach is well worth considering. You won’t be spending money on unused virtual infrastructure, and at the same time you won’t risk losing a customer when a bot fails to respond to a query. As demand increases, serverless bots are spawned as needed, and you’re charged only for the cloud resources you use.
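The sketch below is a simplified illustration of that serverless pattern: an HTTP-triggered Azure Function that only consumes compute while it handles a message. It is not the actual Azure Bot Service wiring; a production bot would plug the Bot Framework connector in here, and the request and response shapes shown are placeholders.

```javascript
// Simplified serverless sketch: an HTTP-triggered Azure Function (Node.js model)
// that answers a message on demand. A real bot would hand the request to the
// Bot Framework connector instead of this hand-rolled reply.
module.exports = function (context, req) {
    var text = (req.body && req.body.text) || '';

    // Each incoming message spins up an execution on demand; there is no
    // always-on virtual machine to pay for between requests.
    context.res = {
        status: 200,
        body: { reply: 'You said: ' + text }
    };
    context.done();
};
```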