
The following is from the lab notes for the hands-on lab "CBM and Condition Monitoring – Process vis-a-vis Equipment Reliability" at PI World 2018, San Francisco, CA. Lab manual and slides are attached.

 

Condition-based maintenance (CBM) is a strategy where you monitor the actual condition of an asset to decide what maintenance needs to be done – see the wiki for a broader definition. This is in contrast to a break-fix strategy (reactive maintenance) and to calendar-scheduled maintenance (clean and lube every 3 months, laser align every 6 months, etc.) performed regardless of the asset's condition and whether it was used or not.

 

An eBook titled A Guidebook to Implementing Condition-Based Maintenance (CBM) Using Real-time Data is here.

 

This lab provides hands-on exercises to illustrate how to execute CBM in your plant using process and condition monitoring data.

 

A popular approach is a layered implementation:

  • Usage-based Maintenance (UbM)
  • Condition-based Maintenance (CbM)
  • Predictive Maintenance (PdM)

 

[Figure: Layers of Analytics – Maintenance (LayersOfAnalytics_Maintenance.PNG)]

 

Other useful links:

Getting started with IIoT sensor deployment

 

 



 

Data quality is a foundational prerequisite for data engineering in your digital transformation and data science/machine learning initiatives. This is a "how-to" for getting started with a layered approach to data quality in the PI System.

 

 

With streaming industrial sensor/IoT time-series data, a layered approach to data quality gives you the flexibility to apply fit-for-purpose techniques mapped to use case requirements. 

 

Whether it is an individual sensor (pressure, temperature, flow, vibration, etc.), an individual piece of equipment such as a pump (with a few sensors), a functional equipment group (FEG) such as a feed water system (made up of a group of equipment – motor/pump/valve/filter/pipe assembly), or an entire process unit or plant, the scope of the data quality requirement has to be understood in its usage context.

 

 

At the individual sensor level, data quality validation checks are basic and easily done: checks for missing data (I/O timeout), flat-line data (sensor malfunction), out-of-range data, and similar. Often, gaps in the data are due to issues in the source instrumentation/control system, network connectivity, faulty data collection configuration (scan rate, unit-of-measure, exception/compression specs), etc. The recommended practice is to use a monitoring app (shown below as PI System Monitoring) to proactively detect and correct these issues at the source, instead of imputing values for the missed/erroneous readings with post-processing logic.

 

Below are some screens from PI System Monitoring (PSM):

[Figure: PI System Monitoring (PSM) screenshots]

Sensor data quality issues (flat line, bad data, stale data...) that are beyond the logic in PI System Monitoring can still be trapped with simple analytics - see Chapter 9.  

 

The figure below shows the analytics related to flat-line, bad, and stale data:

[Figure: flat-line, bad, and stale data analytics]
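To make these checks concrete, here is a minimal sketch of such simple analytics in Python (pandas assumed; the function name, thresholds, and window size are hypothetical illustrations, not part of the PI System):

import pandas as pd

def basic_sensor_checks(series: pd.Series,
                        lo: float, hi: float,
                        max_gap: str = "15min",
                        flat_window: int = 12) -> pd.DataFrame:
    """Flag common single-sensor data quality issues.

    series      -- time-indexed sensor readings
    lo, hi      -- plausible engineering range for this sensor
    max_gap     -- longest acceptable interval between readings
    flat_window -- consecutive identical readings suggesting a stuck sensor
    """
    flags = pd.DataFrame(index=series.index)
    # Missing/stale data: gap to the previous reading exceeds the expected scan interval
    flags["missing"] = series.index.to_series().diff() > pd.Timedelta(max_gap)
    # Out-of-range data: reading outside the plausible engineering range
    flags["out_of_range"] = (series < lo) | (series > hi)
    # Flat-line data: no change at all over flat_window consecutive readings
    flags["flat_line"] = series.diff().abs().rolling(flat_window).sum() == 0
    return flags

For example, basic_sensor_checks(temp, lo=-40, hi=150) on a temperature stream returns one boolean column per check, which can then drive notifications the same way the analytics in the figure do.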

 

Next, for related sensors in an individual equipment (motor, pump, heat exchanger, ...), simple math/physics-based analytics/correlations and inferred metrics such as chiller kW/ton, pump head/flow, etc. can be applied to cross-validate data quality.
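As a sketch of this kind of cross-validation, the snippet below compares the pump head inferred from suction/discharge pressures against a vendor pump curve evaluated at the measured flow. The names and the pump_curve callable are hypothetical stand-ins, not actual PI System functions:

# Hypothetical pump cross-check: head computed from pressure sensors
# should agree with the head the vendor curve predicts at the measured flow.
RHO_G = 998.0 * 9.81  # water density [kg/m^3] * gravity [m/s^2] -> Pa per m of head

def pump_head_residual(p_suction_pa, p_discharge_pa, flow_m3h, pump_curve):
    """Residual (in metres of head) between measurement and pump curve.

    pump_curve -- callable mapping flow [m3/h] -> expected head [m]
    A persistently large residual points to a bad sensor (or a degraded
    pump) even when each individual reading looks plausible on its own.
    """
    measured_head = (p_discharge_pa - p_suction_pa) / RHO_G
    expected_head = pump_curve(flow_m3h)
    return measured_head - expected_head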

Also, using PctGood (percent good) in time-weighted analytics provides a way to ensure that bad or missing data is not propagated to dependent calculations – most commonly, aggregated metrics such as totals, hourly averages, and similar statistics. And you can use simple display features to visually differentiate between good (black-on-white) and not-good (white-on-black) data – see the example from the NRC (Nuclear Regulatory Commission) below.
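The PctGood idea can be mimicked outside the PI System too. Below is a sketch (Python/pandas; names hypothetical, and simplified to evenly sampled data rather than true time-weighting) that suppresses an hourly average whenever too few of that hour's samples are good:

import numpy as np
import pandas as pd

def gated_hourly_average(values: pd.Series, good: pd.Series,
                         pct_good_min: float = 95.0) -> pd.Series:
    """Hourly average that is suppressed when too much data is bad.

    values -- time-indexed readings
    good   -- boolean series, True where the reading is usable
    Mirrors the PctGood gate: if fewer than pct_good_min percent of an
    hour's samples are good, emit NaN so bad data never propagates
    into the aggregate.
    """
    avg = values.where(good).resample("1h").mean()   # mean over good samples only
    pct_good = good.resample("1h").mean() * 100.0    # share of good samples per hour
    return avg.where(pct_good >= pct_good_min, np.nan)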

  

 

For FEGs (functional equipment groups) such as a feed water system or an air handler, with tens or hundreds of sensors – or an entire process unit/plant with thousands of continuous measurements – data quality validation requires multivariate logic. Even with healthy source systems and data collection infrastructure reporting apparently "good" sensor readings, inconsistencies across multiple sensors can only be inferred at a system level using analytical data models and, when required, connected flow models with mass/energy balance.
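One common multivariate technique for this – an illustrative choice here, not necessarily the model used in the linked examples – is PCA: fit it on known-good history, then score live data by reconstruction error, which spikes when the sensors drift out of their usual correlation structure:

import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def fit_consistency_model(X_train, n_components=3):
    """Fit scaler + PCA on a stretch of known-good multivariate history."""
    model = make_pipeline(StandardScaler(), PCA(n_components=n_components))
    model.fit(X_train)
    return model

def inconsistency_score(model, X):
    """Reconstruction error per row: small while the sensors move together
    as they did historically, large when one sensor breaks the pattern."""
    X_scaled = model.named_steps["standardscaler"].transform(X)
    X_hat = model.named_steps["pca"].inverse_transform(model.transform(X))
    return np.sqrt(((X_scaled - X_hat) ** 2).sum(axis=1))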

 

To illustrate an FEG, consider an air handler – part of the HVAC (heating, ventilation, and air conditioning) in a building management system (BMS). Sensor data for outside air temperature, relative humidity, mixed air temperature, supply air temperature, damper position, chilled water flow, supply air flow, supply air fan power, etc. are available, and we walk through a data-driven (machine learning) analytical model to cull out "bad sensor" data ...more.
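A minimal sketch of such a data-driven model (scikit-learn assumed; the tag names for the air-handler sensors above are hypothetical): predict one sensor from the others on known-good history, then flag live readings whose residual is too large:

import pandas as pd
from sklearn.ensemble import RandomForestRegressor

# Hypothetical column names for the air-handler sensors described above.
FEATURES = ["OutsideAirTemp", "RelativeHumidity", "MixedAirTemp",
            "DamperPosition", "ChilledWaterFlow", "SupplyAirFlow",
            "SupplyAirFanPower"]
TARGET = "SupplyAirTemp"

def flag_bad_supply_air_temp(train: pd.DataFrame, live: pd.DataFrame,
                             tolerance: float = 2.0) -> pd.Series:
    """Train on known-good history, then flag live readings whose
    prediction from the *other* sensors deviates beyond tolerance."""
    model = RandomForestRegressor(n_estimators=100, random_state=0)
    model.fit(train[FEATURES], train[TARGET])
    residual = live[TARGET] - model.predict(live[FEATURES])
    return residual.abs() > tolerance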

 

In another use case, a pre-heater example illustrates the need for a connected flow model. The stack flue gas O2 (oxygen) measurements are unusable until reconciled with mass/energy balance corrections.

 

A steam cycle is another use case with a connected flow model to validate the numerous sensor readings via mass and heat balance. 
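Both of these connected-flow use cases rest on classical weighted least-squares data reconciliation: adjust the measurements as little as possible, weighted by their uncertainty, so the balance constraints close exactly. A toy sketch (NumPy; the single-node flow split is an invented example, not the pre-heater or steam cycle model):

import numpy as np

def reconcile(measured: np.ndarray, sigma: np.ndarray,
              A: np.ndarray) -> np.ndarray:
    """Weighted least-squares data reconciliation.

    Finds adjusted values x closest to the measurements (weighted by
    measurement uncertainty sigma) that satisfy the linear balance
    constraints A @ x = 0, e.g. mass flow in = flows out at each node
    of a connected flow model.
    """
    V = np.diag(sigma ** 2)                      # measurement covariance
    K = V @ A.T @ np.linalg.inv(A @ V @ A.T)     # gain matrix
    return measured - K @ (A @ measured)

# Toy example: one node, flow_in = flow_out_1 + flow_out_2
A = np.array([[1.0, -1.0, -1.0]])
m = np.array([100.0, 60.0, 45.0])   # raw readings violate the balance by 5
s = np.array([2.0, 1.0, 1.0])       # the inlet meter is the least trusted
print(reconcile(m, s, A))            # adjusted flows that close the balance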

 

 

To recap, a layered approach to data quality includes:

  • PI System Monitoring ...more
  • Simple analytics to validate (individual) sensor data streams ...more
  • Simple analytics to validate at an equipment level (several related sensors)
  • Advanced (multivariate/machine learning) analytics to validate at a FEG (functional equipment group) level - tens or hundreds of sensors ...more
  • Advanced (connectivity model based) analytics - data reconciliation with mass/energy balance (one or more process units) ...more

 

Data quality is a foundational prerequisite for data engineering in your digital transformation and data science/machine learning initiatives, and a layered approach as discussed in this blog can be helpful.

 

Also see  Industrial IoT time-series data engineering - a layered approach to data quality - YouTube 

 

 

For the last couple of years I have been using the OSIsoft tool to update all Word document field properties from an Excel sheet. This was used to create all the design specifications and installation/operational qualification documentation.

 

The main challenge was getting the Excel add-in to work reliably every time... and I finally found the issue and how to SOLVE it. With this short blog I want to share my findings...

 

The add-in can be downloaded via a previous post on PI Square:

This was the last version of the Excel add-in we produced – it's posted in the template archives category in this group.

 

But again, this unsupported add-in has been deprecated and replaced by Word macros, so you're welcome to try it... but plan on moving away from its use. My suspicion is that the computers you're having trouble with may be running 64-bit Office, so you could try this version and make sure you use the 64-bit add-in in Excel if that copy of Office is 64-bit.

 

ERROR: somehow, after you open the Excel sheet and try to update the properties, you get an error message.

CAUSE: the root cause (in my case) was the AutoSave function in combination with Office 365/OneDrive, which starts to upload the file as soon as the Excel add-in tries to update the Word property fields.

 

FIX:

1)   Copy all relevant documents (Word templates and the Excel sheet) to a location that is not synced by OneDrive

2)   Disable the AutoSave feature

3)   Open the Excel sheet and select the correct tab

4)   Open the add-in menu, select Export, and select the correct Word document... and wait...

5)   Close the Excel sheet (this will make sure that MS Word is also closed)

6)   Go to step 3 for the next document

 

 

Hope this helps you... it did the trick for me.

 

Robin
