In the future, Artificial Intelligence will play an increasingly central role in all industries. It has long done so in the consumer sector: parking aids, face and image recognition and smart homes are firmly established in our everyday lives. Interacting with an AI such as Siri or Alexa has become a matter of course for most of us.

The research and advisory company Gartner recently presented its Top Technology Trends for 2020.

One of them: Hyperautomation

Hyperautomation deals with the use of state-of-the-art technologies, including Artificial Intelligence (AI) and Machine Learning (ML), to increasingly automate processes and assist people.

Since no single tool can replace humans, hyperautomation combines several tools, including Robotic Process Automation (RPA), intelligent business process management suites (iBPMS) and Artificial Intelligence. The goal is to make more AI-driven decisions in a company.

Source: https://www.gartner.com/smarterwithgartner/gartner-top-10-strategic-technology-trends-for-2020/

What does the future look like?

Forecast: The Business Value of Artificial Intelligence
Source: "Forecast: The Business Value of Artificial Intelligence, Worldwide, 2017-2025", Gartner, April 2018

In a study published in April 2018, Gartner predicts that the global business value derived from AI will reach $3.9 trillion in 2022.

Artificial Intelligence

The term "Artificial Intelligence" ("AI") denotes a field of computer science that studies human thinking, decision-making and problem-solving behavior in order to mimic it using computer-aided processes.

Definition from (translated): https://wirtschaftslexikon.gabler.de/definition/kuenstliche-intelligenz-54119

 

Where are we today - the baseline?

Currently, we live in a world in which every single intelligent device generates data almost continuously. As soon as these devices ("things") are connected, they produce enormous streams of data, which IoT solutions put to use.

However, the degree to which IoT solutions have been realized differs greatly between companies. While some still process their existing data in the traditional way, others are already evaluating IoT solutions or using them productively.

However, all companies have the same starting point: they collect, sort and analyze their data.

What are the challenges?

AI Process

All components involved in a production process generate huge amounts of (real-time) data almost non-stop. This data should be evaluated intelligently to gain insights.

The aim is to derive the greatest possible benefit from the information collected. This requires powerful, secure and intelligent algorithms that can extract, from the flood of data they are fed, exactly the information a company needs - without room for error.

Out in the field

Research Project: Electricity Grid Forensics

M&M has started a small research project aimed at the practical testing of various data science and AI methods for pattern recognition in an industrial environment. The field of application is the detection of traces left by an electrical consumer in the power grid.

Power consumption, active and apparent power, generated harmonics, etc. are the equivalent of fingerprints and DNA traces. The challenge: the consumer (the "suspect") must still be recognized even when other consumers are switched on or off.
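Such electrical "fingerprints" can be computed directly from sampled waveforms. The following sketch is purely illustrative (the sampling rate, 50 Hz mains frequency and function names are our own assumptions, not the project's actual code); it derives active power, apparent power and the first current harmonics with NumPy:

```python
import numpy as np

def fingerprint(voltage, current, fs=10_000, f0=50, n_harmonics=5):
    """Feature vector: active power, apparent power and the magnitudes
    of the first n harmonics of the current waveform."""
    v = np.asarray(voltage, dtype=float)
    i = np.asarray(current, dtype=float)
    p_active = np.mean(v * i)                                         # W
    s_apparent = np.sqrt(np.mean(v ** 2)) * np.sqrt(np.mean(i ** 2))  # VA
    spectrum = np.abs(np.fft.rfft(i)) / len(i)       # half-amplitude per bin
    freqs = np.fft.rfftfreq(len(i), d=1.0 / fs)
    harmonics = [spectrum[np.argmin(np.abs(freqs - k * f0))]  # nearest bin
                 for k in range(1, n_harmonics + 1)]
    return np.array([p_active, s_apparent, *harmonics])

# Example: a purely resistive 230 V / 2 A load at 50 Hz draws 460 W and
# shows a strong fundamental but almost no higher harmonics.
t = np.arange(0, 0.2, 1.0 / 10_000)
v = 230 * np.sqrt(2) * np.sin(2 * np.pi * 50 * t)
i = 2 * np.sqrt(2) * np.sin(2 * np.pi * 50 * t)
fp = fingerprint(v, i)
```

A non-linear consumer (e.g. a switched-mode power supply) would show distinct higher harmonics in the same feature vector, which is what makes the "fingerprint" discriminative.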

We used a WAGO controller with 3-phase power measurement to record the current patterns. It also serves as a gateway, transferring the data to the Cloud, where the patterns are evaluated and matched to the consumers (the "perpetrators").

Consumer recognition is not the only application of these methods. We also see potential in using indirect measurements to derive knowledge about the status and usage of machines and systems, or of a production process as a whole.

The right "preservation of evidence" - or: Work before play

It is often claimed that data acquisition is the simplest part of an AI project and that the real art lies in the AI itself. This project, however, showed that the compulsory exercises must be completed properly first - and that they take up a large share of the time.

The selection and correctness of the data - and thus a good understanding of the measuring hardware and the measured variables - are the foundation for all further steps. Inconsistencies such as data gaps and anomalies were already noticed during the first attempts.
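Consistency checks of this kind can be automated early on. A minimal sketch (thresholds and names are illustrative, not the project's actual tooling) that flags gaps in a series of measurement timestamps:

```python
def find_gaps(timestamps, expected_dt, tolerance=0.5):
    """Return (index, actual_dt) pairs where the interval between two
    consecutive timestamps exceeds the expected interval by more than
    the given tolerance fraction."""
    gaps = []
    for k in range(1, len(timestamps)):
        dt = timestamps[k] - timestamps[k - 1]
        if dt > expected_dt * (1 + tolerance):
            gaps.append((k, dt))
    return gaps

# One missing stretch between 0.2 s and 0.5 s in a 10 Hz recording:
gaps = find_gaps([0.0, 0.1, 0.2, 0.5, 0.6], expected_dt=0.1)
```

Running such checks before any modeling step makes data gaps visible immediately instead of surfacing later as unexplained detection errors.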

It also became clear early on that too much noise in the data and strong similarities between different devices have a negative impact on detection. Better results could only be achieved by increasing the sampling rate.

Profiling: Let’s play - Selection and Training of Methods

There are more data science and AI methods for pattern recognition than stars in the sky, and each delivers better or worse results depending on the task at hand. Choosing the right method is complex and often has to be done exploratively.

The methodological research in this project produced a flood of academic approaches. It quickly became clear that the desired results could not be achieved with pre-engineered services. For lack of extensive data sets, machine learning algorithms also failed for the time being: they require "big data" for training, which is simply not available with a few measuring points - not even if these are recorded at millisecond intervals over a longer period.

For this task, classical pattern matching algorithms that cope with our comparatively small data sets achieve good results: the consumers are reliably detected.
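As an illustration of such a classical approach - normalized cross-correlation template matching is one common choice, not necessarily the exact algorithm used in the project, and the device names and signals below are invented - a small sketch:

```python
import numpy as np

def match_score(signal, template):
    """Best normalized cross-correlation of the template against the
    signal; values close to 1 indicate a strong match."""
    t = (template - template.mean()) / (template.std() + 1e-12)
    best = -1.0
    for k in range(len(signal) - len(template) + 1):
        w = signal[k:k + len(template)]
        w = (w - w.mean()) / (w.std() + 1e-12)
        best = max(best, float(np.dot(w, t)) / len(t))
    return best

def identify(signal, templates):
    """Return the name of the reference pattern ("suspect") that
    matches the recorded signal best."""
    return max(templates, key=lambda name: match_score(signal, templates[name]))

# Example: a noisy square-wave load is identified among two references.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 200)
templates = {"kettle": np.sin(2 * np.pi * 5 * t),
             "drill": np.sign(np.sin(2 * np.pi * 5 * t))}
signal = np.concatenate([rng.normal(0, 0.1, 50),
                         templates["drill"] + rng.normal(0, 0.1, 200)])
best = identify(signal, templates)
```

Because the template is normalized per window, the method needs only one clean reference recording per device rather than a large training set - the property that made classical matching attractive here.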

The next step was to search for - and find - similar, publicly available data sets with which we could train machine learning methods. These can also identify the consumers, but the results are not as good as with pattern matching. The best results were achieved by linking the two approaches, i.e. combining classical mathematical methods with AI - a setup commonly referred to as transfer learning.
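One simple way to picture such a combination (purely illustrative, not the project's actual model): classical features such as power and harmonic magnitudes feed a learned classifier whose reference vectors were fitted on a larger, public data set. Here a minimal nearest-neighbour variant with invented reference values:

```python
import numpy as np

def nearest_class(features, reference_vectors, labels):
    """Assign the label of the closest reference feature vector."""
    d = np.linalg.norm(reference_vectors - features, axis=1)
    return labels[int(np.argmin(d))]

# Reference vectors (active power in W, fundamental harmonic magnitude)
# as they might come from training on a public data set:
refs = np.array([[460.0, 1.4],
                 [1200.0, 5.2]])
labels = ["kettle", "drill"]

result = nearest_class(np.array([450.0, 1.5]), refs, labels)  # → kettle
```

The classical feature extraction does the heavy lifting on the small local data, while the learned part contributes knowledge from the larger external data set.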

We use the following tools:

We work closely together with Microsoft and preferably implement solutions based on Microsoft Azure. M&M Software is a Cloud Solution Provider and Partner of Microsoft with the competencies Gold Application Development, Gold Cloud Platform and Silver Data Analytics for the Azure Cloud.

Time Series Insights:

Azure Time Series Insights is a fully managed analysis, storage and visualization service that makes it easy to examine and analyze billions of events simultaneously.

Stream Analytics:

Azure Stream Analytics is a fully managed, real-time analytics service that analyzes and processes fast data streams and triggers alerts and actions.

Azure Cognitive Services:

Azure Cognitive Services are APIs, SDKs and services that help developers create intelligent applications without direct AI or data science skills or knowledge.

Why AI needs Edge Computing

Edge Computing forms the basis for the upcoming Machine Learning/AI approaches. The typical approach of machine learning is to train models in the Cloud (using almost unlimited resources). However, high costs are often incurred both for the transmission of large amounts of data and for the use of Cloud storage.

This results in two essential requirements for practical solutions:

  • Data sent to the Cloud must be carefully selected.
  • Large amounts of data are best processed on site, as transmission to the Cloud is too cost-intensive.

This is where Edge Computing comes in and becomes an enabler for machine learning and other AI approaches.
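A minimal sketch of this idea (placeholder thresholds, and a generic send() callback standing in for the real Cloud upload): raw samples stay on the device, and only a small summary plus an alarm flag are forwarded.

```python
def edge_filter(samples, limit, send):
    """Aggregate a window of raw samples locally; upload only the
    summary and a flag for whether any sample crossed the limit."""
    summary = {
        "count": len(samples),
        "mean": sum(samples) / len(samples),
        "max": max(samples),
        "alarm": max(samples) > limit,
    }
    send(summary)   # one small message instead of the raw stream
    return summary

# Example: four raw samples collapse into one message for the Cloud.
sent = []
edge_filter([3.1, 3.0, 9.8, 3.2], limit=5.0, send=sent.append)
```

In a real deployment the window would cover thousands of high-rate samples, so the ratio between raw data kept at the edge and data transmitted becomes dramatic - which is exactly the cost argument made above.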

Realize your AI project:

We are experts in IoT, Cloud and AI. With our broad experience in the automation industry, we support you in implementing your AI project.

Our services:

  • Collection of data from different data sources
  • Preparation of data for analysis
  • Use of algorithms for further analysis
  • Evaluation of results
  • Integration of the results into the process

Modeling data:

  • Visual and statistical representation of the data
  • Interactive data browsing
  • Comparison of data
  • Summarization / Aggregation of Data
  • Identification and visualization of any kind of data relationships
  • Identification of unknown correlations and patterns
  • Detection of load peaks
  • Detection of anomalies
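For example, detecting load peaks can be as simple as comparing each sample against a trailing-window baseline; the window size and threshold factor below are illustrative, not a recommendation.

```python
def peak_indices(series, window=3, factor=2.0):
    """Flag samples that exceed the mean of the preceding window by
    more than the given factor."""
    peaks = []
    for k in range(window, len(series)):
        baseline = sum(series[k - window:k]) / window
        if series[k] > factor * baseline:
            peaks.append(k)
    return peaks

# Example: a consumption series with one obvious load peak at index 4.
load = [10, 11, 10, 10, 35, 11, 10]
peaks = peak_indices(load)  # → [4]
```

The same trailing-baseline idea extends to anomaly detection by also flagging values far below the baseline or by replacing the mean with a robust statistic such as the median.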

Get ready

The WAGO Cloud is the ready-to-use IoT system for data collection.

Find out more!

 
