
Speed Up AI Prototyping - How It's Done with Azure ML Designer

When developing intelligent products, transforming a company's domain knowledge into AI-enabled performance indicators, devising a suitable data strategy and implementing the software all rest on expert knowledge. This expertise is usually limited to a few employees with strong analytical and programming skills. That is why it is so important for companies to make effective use of existing in-house expertise.

Machine-learning-as-a-service (MLaaS) platforms are particularly well suited to providing new AI models for prototypical use cases. Most of the programming work is abstracted away by prefabricated, configurable visual building blocks that represent logical work steps and are linked together in a structured manner.

When PoCs (proofs of concept) are realized in the traditional way - without the aid of abstract, partially automated tools - the time required for software implementation becomes a problem. If, for example, a large number of models are trained, evaluated and transferred into the operational process, overhead arises not only at that point, but again in a new training and tuning cycle whenever new training data is fed in.

Here, MLaaS platforms offer a solution. The visual building blocks represent large parts of the machine learning lifecycle - data preparation, feature engineering, feature selection, model training, model deployment and model monitoring - as a single process. This form of representation and interaction spares experts the tedious task of writing code line by line, and the time saved allows the team to invest more effort in solving complex problems.
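What such visual building blocks encapsulate can be sketched in plain scikit-learn. This is an illustration of the lifecycle stages only, not Azure ML code; the dataset and the concrete step choices are arbitrary examples:

```python
from sklearn.datasets import load_iris
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Stand-in data; in a visual designer, each step below is one block.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

pipe = Pipeline([
    ("prepare", StandardScaler()),                # data preparation block
    ("select", SelectKBest(f_classif, k=2)),      # feature selection block
    ("train", LogisticRegression(max_iter=200)),  # model training block
])
pipe.fit(X_train, y_train)
accuracy = pipe.score(X_test, y_test)             # evaluation / scoring block
```

Each named step corresponds to one visual block; the platform additionally handles deployment and monitoring, which a code-only sketch like this leaves out.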

With Azure Machine Learning (Azure ML, AML), Microsoft offers enterprises a collection of cloud services and tools to train, deploy, automate and manage machine learning models. Its central element is Azure Machine Learning Studio (AMLS), a web-based development environment with all the functionality needed to create and manage AI models. Trained Azure ML models can also be transferred quickly and effectively into PoCs of intelligent products and AI-powered services, and integrated with tools such as Power BI, Synapse Analytics, Data Factory or Databricks.

Realizing an AI PoC requires an AML workspace, which organizes shared compute resources, data, pipelines, registered models, published pipelines and real-time endpoints. The basic building block of model training is a pipeline consisting of data assets and analytical components - visual blocks that are linked on a canvas. The analytical components include data input functions, training, scoring, tuning and validation steps, as well as the AI models themselves, along with the ability to make adjustments directly in Python code.
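The Python adjustments mentioned above are made in the Designer's "Execute Python Script" component, whose entry function receives up to two pandas DataFrames and returns a tuple of DataFrames for the downstream blocks. A minimal sketch (the column names `price` and `quantity` are hypothetical examples):

```python
import pandas as pd

def azureml_main(dataframe1=None, dataframe2=None):
    """Entry point of a Designer "Execute Python Script" component.

    Receives the component's input ports as DataFrames and returns a
    tuple of DataFrames that feed the component's output ports.
    """
    # Example adjustment: drop incomplete rows and derive a new feature.
    df = dataframe1.dropna().copy()
    df["total"] = df["price"] * df["quantity"]  # hypothetical columns
    return (df,)
```

Inside the pipeline, the component calls this function automatically; the code shown is what the user edits in the block's settings pane.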


Pipelines have several properties that go beyond pure project organization and reusability. It is possible to create dedicated pipelines for data cleansing in the context of data quality, or to prepare pipelines for real-time predictions. For monitoring and troubleshooting, users can return to stored pipeline jobs, which record configuration and results. Cloning a pipeline job to run a new pipeline takes little effort. A further convenience is the grouping of pipeline jobs into experiments, which keeps the job history traceable.

For real-time predictions, a pipeline can be deployed as an online endpoint in an Azure Kubernetes Service (AKS) cluster at the click of a button.

From our point of view, using the ML Designer makes sense when models of moderate complexity are to be realized for PoCs. For specific application tests in safety-critical areas, pure code solutions - e.g. via notebooks in AMLS - are recommended.


If you are interested in learning more about the possible applications as well as the strengths and weaknesses of AML, please feel free to contact our team of experts.

About the author

Rainer Duda is a Data & AI Consultant at M&M Software and supports companies in the development of data-driven business models and the realization of AI-supported applications. For many years, he worked as a data scientist at the renowned Institute for Telematics (TECO) of the Karlsruhe Institute of Technology (KIT) on the Smart Data Solution Center Baden-Württemberg (SDSC BW) project, among others, and holds lectureships in multivariate statistics and applied data science.
