
Slack has gotten a lot of attention as a new way of collaborating in teams, especially in the tech industry. When I first saw it, I have to admit I wasn't that impressed. I didn't see much more in it than a chat app with a news feed. It is often used in software development, and it does make life easier when you interface it with TFS and get updates about new check-ins. But other than that I asked myself: what is so different compared to other apps?

I stumbled across Slack a second time when I was searching for a new delivery channel for the AF system. Email and SMS don't seem to be a good fit: they look dated and are more appropriate for very urgent messages (otherwise you create a lot of spam). But not all messages in manufacturing are urgent; some are just information that you would like to receive to stay up to date. Some messages might be warnings, and some messages need acknowledgement.

The other problem is that you need a system that allows real-time collaboration. If an operator receives a message that a quality parameter is off-spec, he might need to communicate with several people in order to proceed:

QC: Do you need to test a second sample to confirm the result?
Field Operator: Hold off on the transfer until a decision has been made.
Supervisor: Needs to decide what to do with the product.

It is clear that email or SMS are not the right tools to allow for quick decision making.

Slack does allow collaboration and provides an easy-to-use API to connect to message streams. But just streaming messages or events wouldn't make it a great tool; it also needs context. In the following example I used the ISA-95 and ISA-88 data models to structure the AF database. Now I can map the equipment structure in AF to the channels in Slack:

     Area + Unit = Slack Channels


The idea is to get a unique combination that helps to identify the channel.
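As a minimal sketch of that mapping, the snippet below builds the channel name from area and unit and posts an event through a Slack incoming webhook. The webhook URL and the event text are placeholders, not part of the actual implementation:

    library(httr)

    webhook_url <- "https://hooks.slack.com/services/T000/B000/XXXX"  # placeholder

    # Build the channel name from the ISA-95 equipment context: area + unit
    slack_channel <- function(area, unit) {
      tolower(paste(gsub(" ", "_", area), gsub(" ", "_", unit), sep = "-"))
    }

    post_event <- function(area, unit, text) {
      POST(webhook_url,
           body = list(channel = paste0("#", slack_channel(area, unit)),
                       text = text),
           encode = "json")
    }

    post_event("Biotech", "Bio Reactor 3", "Batch 1234: phase 'Ferment' started")
    # -> posts to #biotech-bio_reactor_3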


Now every batch event can be traced back to the equipment and posted in the right channel. The benefit is that people can subscribe to the channels that they are interested in.



[Screenshot: IMG_1658.PNG]

Each event also has a set of icons (check mark, question mark, cancel) to react to the event. These reactions can be traced and sent back to the AF system. Now it's possible to collaboratively decide whether an event needs follow-up or passes.
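Reading the reactions back can be done with the standard Slack Web API method reactions.get. A sketch, where the token, channel ID and message timestamp are placeholders:

    library(httr)

    get_reactions <- function(token, channel_id, ts) {
      resp <- GET("https://slack.com/api/reactions.get",
                  query = list(token = token, channel = channel_id, timestamp = ts))
      msg <- content(resp)$message
      # Reaction names such as "white_check_mark", "question" or "x"
      vapply(msg$reactions, function(r) r$name, character(1))
    }

    # get_reactions(token, "C0XXXXXXX", "1466000000.000001")
    # The returned names can then be mapped to an acknowledgement in AF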


But there are often situations where you need additional information to make a decision. This is where Slack bots come in: these are programmable users that respond to messages. In the first version, the OsiBot can pull attribute information, which again is context specific. So in the channel biotech-bio_reactor_3 a query for attributes will create a list of just the attributes for this reactor:

 

[Screenshot: IMG_1659.PNG]

[Screenshot: IMG_1660.PNG]
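Under the hood, the bot has to translate the channel name back into an AF element path before it can query attributes. A minimal sketch, assuming a hypothetical AF tree layout:

    # Parse "area-unit" channel names back into an element path
    channel_to_af_path <- function(channel) {
      parts <- strsplit(channel, "-", fixed = TRUE)[[1]]
      area <- parts[1]
      unit <- paste(parts[-1], collapse = "-")
      sprintf("\\\\AFServer\\Plant\\%s\\%s", area, unit)  # hypothetical tree
    }

    cat(channel_to_af_path("biotech-bio_reactor_3"))
    # \\AFServer\Plant\biotech\bio_reactor_3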


The next steps in this project are to make the bot a bit more interactive:

     add a function to write back to attributes
     get time series
     filter attributes by static and dynamic
     add charts and plots


Generally, I think Slack and PI are a great combination that will improve communication and decision making in manufacturing. The key, I believe, is to properly contextualize your data so that it is easier for people to find their information.

This post is a follow-up to Marco's R blog:


Manufacturing facilities are creating large amounts of real-time data that are often historized for visualization, analytics and reporting. The amount of data is staggering, and some companies spend considerable resources on the upkeep of the software and hardware and on securing the data. The driving force behind all these efforts is the common belief that data are valuable and contain the information for future process improvements.


The problem has been that advanced data analytics requires tools that go beyond the capabilities of spreadsheet programs such as Excel, which is still the preferred option for calculations in manufacturing. The alternatives are software packages that specialize in statistical computation and differ in price, capability and steepness of the learning curve.

One of these solutions, the R language, has become increasingly popular and is now actively supported by Microsoft [1][2][3][4]. R has been open source from the beginning, and the fact that it is freely available has drawn a wide community to use it as their primary statistical tool. But there are more important reasons why it will also be very successful in manufacturing data analytics:


  1. R works with .NET: there are two projects that allow interoperability between R and .NET, called R.NET [5] and rClr [6] (see the sketch after this list).
  2. R provides a huge number of packages (6,789 on June 18th, 2015; 8,433 Windows packages as of June 6th, 2016), which are function libraries with a specific focus. The package 'qcc' [7], for example, is an excellent library for univariate and multivariate process control.
  3. According to the 2015 Rexer Data Miner Survey [8], 76% of analytic professionals use R and 36% use it as their primary analysis tool, which makes R by far the most used analytical tool.
  4. Visual Studio now supports R, including debugging and IntelliSense [9]. Visual Studio is a very popular Integrated Development Environment (IDE) for .NET programmers and will make it easier for developers to start programming in R.
  5. R's large user base helps to review and validate packages.
  6. The large number of users in academia leads to fast releases of cutting-edge algorithms.
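To illustrate point 1, here is a minimal rClr sketch. The assembly "AfBridge.dll" and its static class are hypothetical stand-ins for the thin .NET layer over the AF SDK mentioned in the comments below:

    library(rClr)

    clrLoadAssembly("C:/libs/AfBridge.dll")   # hypothetical wrapper over the AF SDK

    # Call a static .NET method; values come back flattened to a double array
    values <- clrCallStatic("AfBridge.Reader", "GetRecordedValues",
                            "\\\\AFServer\\Plant\\biotech\\bio_reactor_3|Temperature",
                            "*-8h", "*")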


The following are three examples of using R for analysis in combination with the OSIsoft PI historian (plus the Asset Framework and Event Frames).

Example 1: Process Capabilities


Fig. 1      Process capability of a QC parameter
Data were loaded using rClr + the OSIsoft Asset Framework SDK
The analysis shows that the lower control limit will lead to a rejection rate of 0.5% (CpL < 1.0)
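A sketch of the Fig. 1 analysis with the qcc package; the QC values here are simulated and the specification limits are made up for illustration:

    library(qcc)

    set.seed(42)
    qc_values <- rnorm(100, mean = 10, sd = 0.5)   # stand-in for data from AF

    q <- qcc(qc_values, type = "xbar.one", plot = FALSE)
    process.capability(q, spec.limits = c(8.5, 11.5))
    # Reports Cp, Cp_l, Cp_u, Cp_k and the expected fraction outside the limits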


Example 2: Principal Component Analysis of Batch Temperature Profiles

Fig. 2      PCA biplot
Data and batch event frames were loaded using rClr + the OSIsoft Asset Framework SDK
Only 3 underlying factors account for 85% of the variance. The biplot shows the high correlation between the different variables, which is typical for batch processes.
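The corresponding analysis can be sketched with base R. Each row is one batch and each column a time slice; simulated data stand in for the event frame profiles:

    set.seed(1)
    profiles <- matrix(rnorm(30 * 20), nrow = 30, ncol = 20)  # 30 batches, 20 slices

    pca <- prcomp(profiles, center = TRUE, scale. = TRUE)
    summary(pca)   # cumulative proportion shows how many PCs are needed
    biplot(pca)    # scores and loadings in one plot, as in Fig. 2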


Fig. 3  Predicted batch profile using PCA
The blue line shows the measured values and the green line the predicted remainder of the batch
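One way to produce such a prediction is to estimate the scores of the running batch from the samples observed so far and reconstruct the rest from the loadings. A sketch, reusing the pca and profiles objects from the previous snippet (the exact method behind Fig. 3 may differ):

    predict_remainder <- function(pca, x_obs, n_pc = 3) {
      k <- length(x_obs)                               # samples seen so far
      P <- pca$rotation[, 1:n_pc, drop = FALSE]
      x_c <- (x_obs - pca$center[1:k]) / pca$scale[1:k]
      scores <- qr.solve(P[1:k, , drop = FALSE], x_c)  # least-squares scores
      as.vector(P %*% scores) * pca$scale + pca$center # full predicted profile
    }

    running <- profiles[1, 1:10]              # first 10 of 20 time slices
    pred <- predict_remainder(pca, running)
    # pred[11:20] is the predicted remainder (the green line in Fig. 3)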


Example 3: Real-Time Process Control Limits

The results of the R analysis can also be used in real time for process analysis. In general, the process of model development and deployment is structured as follows:

In the model development phase, models such as SPC, MSPC, MPC, PCA or PLS are developed, validated and finally stored in a data file. During the real-time application (model deployment) phase, new data are sent to R and the same model is used for prediction.
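In R this split can be as simple as serializing the fitted model once and reloading it in the real-time service, for example:

    # Development phase: persist the validated model
    saveRDS(pca, "batch_pca_model.rds")

    # Deployment phase: reload it and only run the prediction on new data
    model <- readRDS("batch_pca_model.rds")
    # pred <- predict_remainder(model, new_data_from_historian)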


Fig. 4      Single and trend process control limits
Control limits are fed back to the historian; the dotted vertical line represents the current time
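Writing the limits back can again go through the hypothetical AfBridge layer from the earlier sketch; the method name and signature are assumptions, not part of the AF SDK itself:

    library(rClr)

    write_limits <- function(attribute_path, timestamps, values) {
      clrCallStatic("AfBridge.Writer", "WriteValues",   # hypothetical helper
                    attribute_path, as.character(timestamps), as.numeric(values))
    }

    # write_limits("\\\\AFServer\\Plant\\biotech\\bio_reactor_3|Temp.UCL",
    #              Sys.time() + (1:12) * 3600, rep(75.2, 12))
    # Future-dated values make the forecast available over longer periods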


Summary

There is an increasing gap in manufacturing between the amount of data stored and the level of analysis being performed. The R statistical software package can close that gap by providing high-level analysis of the production data held in historians such as OSIsoft PI. It provides a rich library of statistical packages that perform univariate and multivariate analysis and allows real-time analytics.


Some Comments

  1. I was very surprised by the Rexer survey to see how popular R has become.
  2. Although R provides a lot of different packages, some tools for manufacturing analysis are still missing, most notably real-time alignment and optimization for batch processes. One reason is that chemical engineers often use Matlab, for which a large code base is available. Also, some key journals only provide Matlab examples (e.g. the Journal of Chemometrics).
  3. R.NET is single-threaded. This isn't a problem during model development, but at run time it could become a bottleneck. I used a producer-consumer queue to enforce sequential calculation.
  4. rClr works fine and I didn't encounter any problems. Calling the AF SDK from R still requires an additional layer in .NET to flatten some of the objects to string or double arrays, plus some scripting in R to make the calls consistent with other libraries.
  5. Writing future values/forecasts into PI tags worked perfectly; now you can work with forecasts over longer periods.
  6. I used elements as containers for the model parameters, but this might not be the right way of organizing data.
  7. The same article was also published here.


References:

[1] http://www.zdnet.com/article/microsofts-r-strategy/
[2] http://www.revolutionanalytics.com/
[3] https://www.microsoft.com/en-us/server-cloud/products/r-server/
[4] https://mran.microsoft.com/open/
[5] https://www.nuget.org/packages?q=R.NET.Community
[6] https://rclr.codeplex.com/
[7] https://cran.r-project.org/web/packages/qcc/qcc.pdf
[8] http://www.rexeranalytics.com/Data-Miner-Survey-2015-Intro2.html
[9] https://www.visualstudio.com/en-us/features/rtvs-vs.aspx