Microsoft smartens up Azure Cognitive Services
Microsoft is introducing new artificial intelligence capabilities for developers on the company's Azure cloud platform. An enhancement to Azure Cognitive Services, called "Decision," provides user-specific recommendations to support better decision-making.

Azure Cognitive Services is a collection of APIs to intelligent algorithms that developers can tap to perform image recognition, speech recognition, natural language processing, anomaly detection, and other intelligent tasks. Decision adds a service called Personalizer, which leverages reinforcement learning to offer users specific recommendations to assist with decisions.
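To make the reinforcement-learning loop concrete, here is a minimal sketch of how a developer might construct a Personalizer "Rank" request, which asks the service to pick the best action for a given context. The endpoint, feature names, and article IDs are placeholders of my own; the request shape follows the Personalizer Rank/Reward REST pattern, and with a real resource you would POST the body and later report a reward for the chosen action.

```python
"""Sketch of building a Personalizer Rank request body.

Feature names and action IDs are illustrative placeholders.
"""
import json
import uuid


def build_rank_request(context_features, actions, event_id=None):
    """Build the JSON body for a Personalizer Rank call."""
    return {
        "eventId": event_id or str(uuid.uuid4()),
        "contextFeatures": context_features,  # signals about the current user/session
        "actions": actions,                   # candidate items for the service to rank
        "excludedActions": [],
        "deferActivation": False,
    }


# Example: ask the service to choose an article for a mobile user in the morning.
body = build_rank_request(
    context_features=[{"timeOfDay": "morning"}, {"device": "mobile"}],
    actions=[
        {"id": "article-ai", "features": [{"topic": "ai"}]},
        {"id": "article-cloud", "features": [{"topic": "cloud"}]},
    ],
)

# With a real resource, this body would be POSTed to the Rank endpoint, e.g.
#   https://<resource>.cognitiveservices.azure.com/personalizer/v1.0/rank
# and the outcome later reported via the Reward endpoint:
#   POST .../personalizer/v1.0/events/{eventId}/reward  with {"value": 1.0}
print(json.dumps(body, indent=2))
```

The Rank/Reward split is the reinforcement-learning part: the reward you send back after observing user behavior is what trains the model toward better recommendations.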

Azure Search is also gaining AI capabilities, via a cognitive search capability that uses Cognitive Services algorithms to extract insights from structured and unstructured content. In addition, Microsoft is previewing a capability that allows developers to store the AI insights gained from cognitive search.
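As a rough illustration of how cognitive search enrichment is wired up, the sketch below builds a skillset definition as a Python dict. The skillset name, container name, and connection string are placeholders, and the schema shown here is an assumption based on the Azure Search skillset REST format; the `knowledgeStore` section is where the previewed capability to persist AI insights would be configured.

```python
"""Sketch of a cognitive search skillset with a knowledge store section.

All names and the connection string are placeholders; the dict shape is
an assumed approximation of the Azure Search skillset schema.
"""
skillset = {
    "name": "demo-skillset",
    "description": "Extract key phrases and persist the enriched output",
    "skills": [
        {
            # A text-enrichment skill that runs over each indexed document.
            "@odata.type": "#Microsoft.Skills.Text.KeyPhraseExtractionSkill",
            "context": "/document",
            "inputs": [{"name": "text", "source": "/document/content"}],
            "outputs": [{"name": "keyPhrases", "targetName": "keyPhrases"}],
        }
    ],
    # Knowledge store: persists the AI insights produced during enrichment
    # so they can be reused outside the search index.
    "knowledgeStore": {
        "storageConnectionString": "<azure-storage-connection-string>",
        "projections": [
            {
                "objects": [
                    {"storageContainer": "enriched-docs", "source": "/document"}
                ]
            }
        ],
    },
}

# With a real service, this definition would be PUT to the skillsets endpoint:
#   https://<service>.search.windows.net/skillsets/demo-skillset?api-version=...
print(skillset["name"])
```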

In other developments pertaining to AI capabilities on Azure:

  • Azure Machine Learning improvements intended to simplify building and deploying machine learning models, including MLOps capabilities with Azure DevOps integration. This provides developers with automation of the machine learning lifecycle.
  • Automated ML advancements and a UI to develop models.
  • A visual machine learning interface with no-code model creation and deployment with drag-and-drop functionality.
  • Hardware-accelerated models for low-latency inferencing on FPGAs (field programmable gate arrays).
  • Support for the Nvidia TensorRT deep learning inference platform and for a deep learning compiler. This support will provide high-speed inferencing on Nvidia and Intel chipsets.
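The MLOps idea in the first bullet, versioned models moving through an automated lifecycle with quality gates, can be sketched in plain Python. This is an illustrative toy registry, not the Azure Machine Learning SDK; it only shows the register-then-promote pattern that such services automate.

```python
"""Illustrative model-registry sketch (not the Azure ML SDK).

Shows the versioning and gated-promotion pattern behind MLOps pipelines.
"""
from dataclasses import dataclass, field
from typing import Dict, Optional


@dataclass
class ModelRegistry:
    """Tracks versioned models and which version is in production."""
    versions: Dict[int, dict] = field(default_factory=dict)
    production: Optional[int] = None

    def register(self, version: int, metrics: dict) -> None:
        """Record a trained model version with its evaluation metrics."""
        self.versions[version] = metrics

    def promote(self, version: int, min_accuracy: float = 0.9) -> bool:
        """Promote a version to production only if it clears the quality gate."""
        if self.versions.get(version, {}).get("accuracy", 0.0) >= min_accuracy:
            self.production = version
            return True
        return False


registry = ModelRegistry()
registry.register(1, {"accuracy": 0.87})
registry.register(2, {"accuracy": 0.93})
registry.promote(1)   # fails the 0.9 accuracy gate; production unchanged
registry.promote(2)   # clears the gate; version 2 goes to production
print(registry.production)  # → 2
```

In an Azure DevOps integration, the "promote" step would typically be a release-pipeline stage triggered after training and evaluation complete.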