Supervised vs. Unsupervised Machine Learning

Machine learning is a branch of artificial intelligence that encompasses a number of algorithmic approaches. In manufacturing, the two most common are supervised and unsupervised learning.

There are endless application opportunities in industry for machine learning. Here are some examples:

  • Predicting process disturbance in chemical production
  • Predicting quality failures in a production line
  • Predicting production waste in food or PCB manufacturing
  • Predicting asset failure in a power plant
  • Determining the parameters of a “golden batch” for optimal production throughput

The topic of supervised vs. unsupervised machine learning is actually a somewhat contested one in the Industry 4.0 domain. The reality is that there is no one-size-fits-all machine learning technique that can meet the requirements of every type of manufacturing application.

So, what’s the plan of action?

Process engineers should aim to improve their understanding of machine learning techniques to support decisions regarding the use of AI in production optimization for their specific manufacturing challenges.

While this topic cannot possibly be covered completely in a single blog post, this article aims to touch upon the basics of common ML approaches as they pertain to manufacturing, and to describe when each might be appropriate.

Supervised Machine Learning

“Supervised Learning” describes a relatively didactic process by which predictive machine learning models are developed. For this type of machine learning, historical input and output data are made available to the model.

The method used to create an algorithm from a training dataset resembles a teacher guiding a student to reach a specific goal. The “student” algorithm progresses by making iterative predictions based on the training data, and is corrected by the “teacher”.

Supervised learning problems can be split into two main types:

a. Classification – used when the output is categorical, such as “normal” or “warning”.

An example of a classification algorithm is one that receives sensor information as input (e.g. pressure, flow rate, vibration velocity, acceleration, and displacement) and determines the asset health of a machine.
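As an illustration, here is a minimal sketch of such a classifier using scikit-learn; the sensor features, the placeholder training data, and the “normal”/“warning” labels are all hypothetical:

```python
# A minimal supervised classification sketch: historical sensor inputs and
# health labels (the "teacher") train a model to judge asset health.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Each row: [pressure, flow_rate, vibration_velocity, acceleration, displacement]
X = np.random.rand(500, 5)                        # placeholder sensor readings
y = np.random.choice(["normal", "warning"], 500)  # placeholder health labels

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)

model = RandomForestClassifier(n_estimators=100)
model.fit(X_train, y_train)           # the labels "correct" the model as it learns

print(model.predict(X_test[:3]))      # e.g. ['normal' 'warning' 'normal']
print(model.score(X_test, y_test))    # accuracy on held-out data
```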

b. Regression – used when the output is a continuous value such as temperature, voltage, or rpm.

An example of a regression algorithm could be one that receives a component’s code number and performance history as input, and predicts the component’s next malfunction. (An algorithm like this could be used to inform maintenance scheduling.)
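A comparable regression sketch, again with scikit-learn and purely hypothetical data; note that in practice a categorical code number would be one-hot encoded rather than used as a raw numeric feature:

```python
# A minimal supervised regression sketch: predict hours until the next
# malfunction from a component's history. All numbers are placeholders.
import numpy as np
from sklearn.linear_model import LinearRegression

# Each row: [component_code, hours_in_service, failures_to_date]
# (component_code is kept numeric here only to keep the sketch short)
X = np.array([[101, 1200, 2],
              [101, 3400, 5],
              [205,  800, 1],
              [205, 2900, 4]])
y = np.array([650.0, 210.0, 900.0, 340.0])   # hours until next malfunction

model = LinearRegression().fit(X, y)
print(model.predict([[101, 2000, 3]]))       # predicted hours for a new case
```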


Unsupervised Machine Learning

In unsupervised learning, only input data is required. The goal is for the algorithm to do the work and discover the innate structure of the dataset – to model the distribution of the data and automatically provide insight into correlations.

Like supervised machine learning, unsupervised machine learning problems can be split into two main types:

a. Clustering – used to discover groupings found in the input data.

In manufacturing, clustering is used to detect behavior anomalies in the production process and equipment. Using measurements from sensors on a production line, clustering can detect and analyze anomalies/outliers, in turn identifying the root causes of process malfunctions or equipment failure.
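As a sketch of this idea, the following uses scikit-learn’s DBSCAN, a clustering algorithm that marks points fitting no cluster as outliers (label -1); the sensor readings are synthetic:

```python
# A minimal unsupervised clustering sketch: no labels are provided, and
# readings that belong to no cluster are flagged as anomalies.
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.preprocessing import StandardScaler

readings = np.vstack([
    np.random.normal([50.0, 1.0], 0.5, (200, 2)),  # normal operating region
    [[70.0, 3.0], [20.0, 0.1]],                    # two anomalous readings
])

X = StandardScaler().fit_transform(readings)       # input data only
labels = DBSCAN(eps=0.5, min_samples=5).fit_predict(X)

print(readings[labels == -1])                      # the detected outliers
```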


b. Association – used to discover rules that can describe relations in the distribution of the input data.

An example of association is pattern or behavior detection, such as a rise in a pump’s pressure as a result of a temperature increase in a cooling vessel earlier in the process.
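As a sketch of this kind of rule discovery, the following uses the apriori implementation from the mlxtend library; the process events and their co-occurrences are hypothetical:

```python
# A minimal association rule mining sketch: each row records which process
# events occurred within the same time window (hypothetical data).
import pandas as pd
from mlxtend.frequent_patterns import apriori, association_rules

events = pd.DataFrame({
    "cooling_temp_rise":  [1, 1, 0, 1, 1, 0],
    "pump_pressure_rise": [1, 1, 0, 1, 0, 0],
    "valve_alarm":        [0, 1, 0, 0, 1, 1],
}, dtype=bool)

frequent = apriori(events, min_support=0.3, use_colnames=True)
rules = association_rules(frequent, metric="confidence", min_threshold=0.7)

# e.g. a rule like {cooling_temp_rise} -> {pump_pressure_rise}
print(rules[["antecedents", "consequents", "confidence"]])
```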


Semi-Supervised Machine Learning

In semi-supervised machine learning, labelled and unlabelled data are used together to train the algorithm.

Labelled data significantly improves the learning process of an algorithm. The problem is that large labelled datasets are labor-intensive to create.

This is why semi-supervised machine learning can be very advantageous. Data scientists have found that even when a small group of labelled data is used for training in conjunction with a large unlabelled group, learning accuracy is greatly improved.
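A minimal sketch of this approach using scikit-learn’s SelfTrainingClassifier, in which unlabelled samples are marked with -1; the data is synthetic:

```python
# A minimal semi-supervised sketch: a small labelled group plus a large
# unlabelled group (marked -1) train the model together.
import numpy as np
from sklearn.semi_supervised import SelfTrainingClassifier
from sklearn.svm import SVC

X = np.random.rand(300, 4)             # sensor features (placeholder)
y = np.random.choice([0, 1], 300)      # true classes (placeholder)

y_partial = y.copy()
y_partial[30:] = -1                    # keep only 30 labels; the rest unlabelled

model = SelfTrainingClassifier(SVC(probability=True))
model.fit(X, y_partial)                # learns from both groups together
print(model.predict(X[:5]))
```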

 

Reinforcement Learning

Unlike cases where input data is formally fed to an algorithm, in reinforcement learning the algorithm gathers its input through experience.

For example, a robot (agent) can be given the task of learning how to connect two components together (reward). The robot can start off without any data about the task, but through experimentation (actions), will start to collect data about its movement, surroundings, and how the two components interact (observations).

When an action is taken that leads to the two components connecting, or coming close, the data related to that action is labelled accordingly and analyzed. As the robot continues to take more actions and record more data, it improves its knowledge about its task.
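A toy tabular Q-learning sketch makes the action/observation/reward loop concrete; the five-position environment with a reward at the far end is purely illustrative and far simpler than the robot example above:

```python
# A minimal Q-learning sketch: the agent starts with no data and improves
# its action-value estimates (Q) purely through experience.
import numpy as np

n_states, n_actions = 5, 2            # actions: 0 = left, 1 = right
Q = np.zeros((n_states, n_actions))
alpha, gamma, epsilon = 0.1, 0.9, 0.2

for episode in range(500):
    state = 0
    while state != n_states - 1:      # episode ends at the rewarded state
        if np.random.rand() < epsilon:
            action = np.random.randint(n_actions)   # explore
        else:
            action = int(np.argmax(Q[state]))       # exploit best known action
        next_state = max(0, state - 1) if action == 0 else state + 1
        reward = 1.0 if next_state == n_states - 1 else 0.0
        # update the estimate from this experience
        Q[state, action] += alpha * (
            reward + gamma * Q[next_state].max() - Q[state, action])
        state = next_state

print(Q)  # learned action values; "right" should dominate in every state
```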

Human in the Loop

Unfortunately, the above scenario can be very challenging when it comes to real-world problems like the ones we see in manufacturing.

An algorithm can only perform at the level of our input definitions – how we define the reward, the methods of analysis, and other feature engineering attributes. In manufacturing, with an abundance of parameters affecting one another, it’s extremely difficult to account for everything when building this type of model.

Sometimes, it’s easy to see what an algorithm is doing wrong when you’re observing it objectively. This is the idea behind Human in the Loop (HITL).

With HITL, machine learning applications leverage human knowledge to rule out the obvious “bad ideas”. Instead of investing endless time in attempting to perfect a model, human experience can be incorporated into the algorithm’s process. This leads to improved results and more efficient learning.

 

Choosing the Right Machine Learning Algorithm

In manufacturing, a large number of factors affect which machine learning approach is best for any given task. And, since every machine learning problem is different, deciding on which technique to use is a complex process.

In general, a good strategy for homing in on the right machine learning approach is to:

1. Evaluate the data. Is it labelled or unlabelled? Is there expert knowledge available to support additional labelling? This helps determine whether a supervised, unsupervised, semi-supervised, or reinforcement learning approach should be used.
2. Define the goal. Is the problem a recurring, well-defined one? Or will the algorithm be expected to predict new problems?
3. Review available algorithms that may suit the problem with regard to dimensionality (the number of features, attributes, or characteristics). Candidate algorithms should also suit the overall volume of data and its structure.
4. Study successful applications of the algorithm type on similar problems.

 

Interpretation is Key

How we interpret the algorithm’s output is crucial to how that algorithm helps us solve real-world manufacturing problems.

It’s important to keep in mind that the output is the result of how the algorithm was defined, how the data was collected and aggregated, and how the output is presented.

The interpretation stage holds a number of risks such as overfitting, which can distort our understanding of the results.

 

Machine Learning Approaches Used in Manufacturing

As in many cases of applied mathematical theory, the answer to the question of which machine learning algorithm to use in manufacturing is the unsatisfactory “it depends”.

Every industry, facility, production line, and problem has its own characteristics. Accounting for as many of these factors as possible will improve the chances of building a system that can provide the desired results.

The decision is also affected by business factors, industry regulations, and the availability of expertise. Keeping sight of all of these parameters, and being able to come up with a machine learning solution that can meet the respective demands, will generate the most value.

 

Getting Started with Machine Learning in Manufacturing

For many manufacturers, the diversity of machine learning – the variety of theories, algorithms, methods and platforms – actually presents a barrier in the path to adoption.

It’s important to note that taking advantage of the benefits of machine learning doesn’t necessarily require a huge investment or major changes to the production floor.

The fact is that in many plants and factories, such as in the chemical processing industry, data is already being captured and stored in a well-structured way. Simply by gaining a better understanding of the type of problems machine learning can solve, manufacturers can begin to explore how their data can drive significant improvements.

 

Cut your production losses with machine learning built for manufacturing:

Get a 1-on-1 demo of the Seebo platform and see how accurate and timely alerts can significantly improve maintenance, product quality, and profitability in manufacturing.

 


Why Overall Line Efficiency is a Necessary Metric for Industry 4.0

Overall Equipment Effectiveness (OEE) is a widely accepted and utilized manufacturing evaluation method, but Industry 4.0 is raising the standards of production, and OEE is limited in its ability to account for more complex systems.

To respond to this need for a better-suited metric, a technique known as Overall Line Efficiency (OLE) is being adopted, largely because of its ability to describe multiple production lines and the interaction of the various sub-processes within a larger production process.

A complete performance evaluation approach will incorporate both OEE and OLE methods with appropriate modifications to suit the operation.

 

Efficiency Vs. Effectiveness

To differentiate between OEE and OLE, it helps to start by clarifying the difference between effectiveness (in OEE) and efficiency (in OLE), two terms often misused in the context of manufacturing.

 

[Diagram: differentiating between effectiveness (OEE) and efficiency (OLE)]

 

The simple diagram above demonstrates that effectiveness focuses on performing the right tasks and aiming for the right goals while efficiency is about performing tasks in an optimal way.


Overall Equipment Effectiveness (OEE)

Overall Equipment Effectiveness is a fundamental KPI used to improve manufacturing processes; it applies benchmarking and analysis to pinpoint inefficiencies and categorize them.

OEE seeks to describe the overall utilization of materials, equipment, and time in a production process. It is calculated according to the equation below, although there are a number of ways of defining the three contributing parameters:

 

OEE = Availability × Performance × Quality
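A minimal sketch of the formula in code; the definitions in the comments are one common set of choices for the three factors:

```python
# OEE as the product of three fractions between 0 and 1.
def oee(availability: float, performance: float, quality: float) -> float:
    # availability = run time / planned production time
    # performance  = (ideal cycle time * total count) / run time
    # quality      = good count / total count
    return availability * performance * quality

print(oee(0.90, 0.95, 0.99))  # ~0.846, i.e. 84.6% OEE
```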

OEE is not ideal for production lines consisting of a number of unbalanced (unpaced) machines; it is better suited to evaluating individual assets.

 

Overall Line Efficiency (OLE)

Overall Line Efficiency is a fairly new metric in manufacturing that builds on OEE to compare the current performance of a production line with how well it could be performing.

OLE also takes into account the personnel involved in the various processes, seeking to optimize the synchronization between the output rates of machines and the use of human resources.

 

OLE = (OEE of Machine A + OEE of Machine B + OEE of Machine C) / 3


The above calculation assumes that Machines A, B, and C have the same importance, i.e. the same “weight”. In most manufacturing scenarios this will not be the case; different processing stages will have different weights, resulting in a more complex OLE calculation.
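A minimal sketch of such a weighted OLE calculation; the per-machine OEE values and the weights are hypothetical:

```python
# OLE as a weighted average of per-machine OEE values; with no weights
# given, this reduces to the simple average shown above.
def ole(oees, weights=None):
    if weights is None:
        weights = [1.0] * len(oees)
    return sum(o * w for o, w in zip(oees, weights)) / sum(weights)

print(ole([0.85, 0.78, 0.92]))                    # equal weights: 0.85
print(ole([0.85, 0.78, 0.92], [1.0, 2.0, 0.5]))   # bottleneck stage weighted higher
```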

Overall Line Efficiency can be expanded further to include a calculation for each production line (taking into account the bottleneck for each), and to formulate a calculation that incorporates a number of production lines.

 

New Methods for Calculating OEE and OLE

The use of artificial intelligence is steadily growing within the manufacturing sector and can be applied to both OEE and OLE calculation. AI’s advantage here is its ability to adapt to different manufacturing scenarios thanks to the algorithms’ flexibility.

In other words, an AI algorithm used to calculate OLE isn’t affected by whether the operation is in the aerospace or food processing sector; the meaningful differences between those sectors can be reflected in the algorithm by setting specific weight values for critical parameters.

 

Using Artificial Neural Networks for Overall Line Efficiency

Artificial Neural Networks can handle the complexity of OLE calculation and can lead to far more accurate results than those achievable through more traditional calculation methods.

Implementing ANNs to calculate Overall Line Efficiency is not an immediate process - the algorithm needs to be trained. This is done by feeding the ANN existing historical data categorized as input (OEE) or output (OLE) along with other relevant data from the machines and production floor. ANNs can also be fed data from observations made by operators, enhancing the training through additional information layers.
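A minimal sketch of this idea using scikit-learn’s MLPRegressor; the historical data here is synthetic, with a hidden weighting standing in for the real relationship the network would have to learn:

```python
# Train a small neural network to estimate OLE from per-machine OEE values.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.uniform(0.5, 1.0, (1000, 3))      # historical OEE of three machines
y = X @ np.array([0.2, 0.5, 0.3])         # historical OLE (hidden weighting)

model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=2000)
model.fit(X, y)

print(model.predict([[0.9, 0.7, 0.8]]))   # estimated OLE for current conditions
```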

 

Better Manufacturing Management with Overall Line Efficiency

Using a combination of OEE and OLE calculations to monitor the performance of a manufacturing operation can be extremely useful for management. Due to the high number of variables involved, artificial intelligence, in the form of Artificial Neural Networks and other techniques, is very well suited to this field and can offer actionable insights for better management decisions and greater impact.

 

 



How Manufacturers are Leveraging IoT to Dominate Their Markets

For manufacturers, adopting new digital technologies can be a real challenge - one that requires significant planning, resources, and time. IoT connectivity offers such a wide range of advantages that the abundance of options can actually deter decision makers from jumping on board with this technology.

Despite this, the implementation of Internet of Things functionality is steadily making its way to the top of more and more to-do lists within the manufacturing sector. IoT hardware, software, platforms, and services are all being improved constantly, helping manufacturers take on the task of IoT integration with more confidence than ever before.

IoT - The Gift that Keeps on Giving

Companies that have successfully introduced IoT technology into their manufacturing operations are experiencing a host of benefits, from improved energy efficiency to better-quality products.

Connected packaging machinery is one area in which manufacturers are taking advantage of new business opportunities with IoT.

And the advantages don’t stop there. Because IoT offers a level of connectivity that we haven’t experienced before, it opens up the potential for new channels of income. In fact, IoT can prompt a renewed evaluation of a company’s business plan, with the knowledge that this new connectivity, access to data, and analysis capability can completely reinvent how a company earns revenue.

The Bottom Line

Here it is: manufacturers need to make a conscious decision to be proactive about IoT and digitization in general. These technologies have so much to offer that avoiding this opportunity, or procrastinating for too long, could mean losing business to competitors, and even becoming completely irrelevant as a manufacturing entity.

In an attempt to shed some light on the specifics of what manufacturers have to gain with IoT connectivity, we’ve put together a white paper that clearly discusses the matter.

Get the free whitepaper - Leveraging IoT in Manufacturing


This whitepaper offers insight on 6 of the main reasons why your company, factory or plant should consider the move to IoT sooner rather than later.



Is Model-Based Systems Engineering the Guiding Light to Successful IoT Delivery?

As IoT adoption and functionality advance, systems become more feature-rich, the number of connection points increases exponentially, and managing physical devices, data, and digital networks becomes more complex. Organizations will have to meet this complexity head-on in order to be able to leverage the benefits of IoT.

The pairing of IoT development with Model-Based Systems Engineering (MBSE) is growing in popularity as a way to successfully deliver IoT projects. MBSE is proving itself a viable route to IoT implementation because of its “systems of systems” perspective.

Spending Quality Time with your IoT Project

At first glance, Model-Based Systems Engineering might seem like an overly complex methodology for product teams aiming to deliver their IoT projects as fast as possible. And yes, initially this approach might take longer than a more informal one. The catch is that rushing a product to market can often lead to its ultimate demise through unforeseen defects and customer disappointment.

Moreover, while a first project using MBSE might require more time, subsequent projects are likely to progress faster than before. Deployment times improve thanks to a better understanding of how to apply the principles, and to the avoidance of unnecessary iterations, wasted resources, and surprise malfunctions.

It’s important to remember that IoT projects aren’t limited to gadgets that make our lives a little easier. Autonomous travel, smart medical devices and connected machinery could all have life-threatening consequences should errors occur in the IoT system and network. Taking the extra time to perfect and evaluate a system before launch is crucial in such cases.

What MBSE has to Offer IoT

By employing Model-Based Systems Engineering, an IoT system can be built, evaluated, maintained, and improved by addressing questions like these:

  • Has a model-based engineering approach been used to specify the IoT system's requirements?
  • Does the IoT system functionality answer the user requirements?
  • Have the requirements been tested with a user group?
  • Is the IoT system's testing automated? And, what is the best way to manage the IoT testing process?
  • Are design and functionality decisions guided by analysis and simulations?
  • Are the network architecture and overall design being verified using system-level analysis?

The above questions demonstrate that MBSE points to a data-driven evaluation and decision-making process.

This approach places a lot of emphasis on verifying a project across a number of domains by utilizing software, hardware, and cyber-physical IoT simulations.

Creating a Digital Twin

An exciting outcome of using Model-Based Systems Engineering for IoT is the creation of a “Digital Twin”. A digital twin is a customized high-resolution digital model that works in parallel to an actual system.

For example, a manufacturing plant could have a digital twin that precisely reproduces the dynamics of all the systems within the plant. Every process in the plant from raw materials processing and production to quality control and delivery can be monitored and repeated in the digital twin in real time. This allows for risk-free experimentation - through simulation, processes can be modified to any extent, and the results can be measured and compared.


As with all new technologies, there is no golden rule or single tried-and-true method for integrating IoT into a product or manufacturing process. Each use case is different, and different approaches will have to be considered along with a good amount of customization and testing.

Model-Based Systems Engineering is an established engineering discipline, which means it offers a defined pool of professionals with a specific skill set. As a framework with its own principles and solutions, and because of its data-driven approach, Model-Based Systems Engineering could be extremely useful in successfully delivering complex IoT systems.