
What is the Optimal Method for Calculating Data Over a Range of Time?

Question asked by SvenBatalla on Jun 30, 2015
Latest reply on Jun 30, 2015 by SvenBatalla

Using the AFSDK, we have written an application that performs calculations to determine the amount of time a value is outside an operating window.  Conceptually, the principle is quite simple...

 

For example, assume the following image:

In the image above, the high limit is 7.5.  We can see that at the 6th time unit, the value rises above the high limit.  Not considering time-weighting, the value is above the limit for 5 of the 10 known time units (so 50%).  Using the AFSDK, we can calculate the amount of time above the limit fairly easily:

 

// TimeGE returns the number of seconds in the range that
// 'Source Tag' was at or above 'High Limit'.
AFValues values = AFCalculation.CalculateAtTimes(
    myelement,
    "TimeGE('Source Tag', '*-10m', '*', 'High Limit')",
    new[] { new AFTime("*") });

 

In the example above, the calculation looks at all the values of the "Source Tag" attribute of my element for the past 10 minutes (just to match my time-units example) and compares them to the value of the "High Limit" attribute.  It returns the amount of time "Source Tag" was at or above "High Limit", which matches the result of our thought experiment above.

 

Except...

 

Now consider the following image:

In the image above, the high limit changes: at the 8th time unit it becomes 12.5.  (This value is stored in PI, so we have the high limit's history and can trend exactly this.)  That means the value is considered back below the high limit as of the 8th time unit.  Still not considering time-weighting, the value is now above the high limit for only 2 time units (the 6th and 7th).  Using the same code above, the result of the calculation would be ZERO!  This is because the "TimeGE" function uses the value of the "High Limit" attribute at the times passed in the 3rd parameter of CalculateAtTimes, or possibly at the given end-time of the query (I did not test which).  In any case, this is bad news for my application, because 0 is very different from 2 in our world.
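
As an aside, since the limit history is stored in PI, it is easy to confirm that the limit changed mid-range by pulling the recorded values of the "High Limit" attribute directly.  A minimal sketch, assuming the same myelement and attribute names as the code above (the OSIsoft.AF.Asset, OSIsoft.AF.Data, and OSIsoft.AF.Time namespaces are needed):

// Pull the recorded history of "High Limit" over the same window
// to confirm the limit actually changed partway through the range.
AFAttribute limitAttribute = myelement.Attributes["High Limit"];
AFTimeRange range = new AFTimeRange(new AFTime("*-10m"), new AFTime("*"));

AFValues limitHistory = limitAttribute.Data.RecordedValues(
    range,
    AFBoundaryType.Inside,
    null,    // desired UOM (use the attribute's default)
    null,    // no filter expression
    false,   // do not include filtered values
    0);      // maxCount = 0 means no limit

foreach (AFValue v in limitHistory)
    Console.WriteLine("{0}: {1}", v.Timestamp, v.Value);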

 

To work around this problem, the proposed solution is to retrieve interpolated minute-by-minute data for all the tags in question, then iterate through and compare each value pair individually (see the sketch after the table).  In other words:

Value   High Limit   Is Out-of-Range?
5       7.5          No
5       7.5          No
5       7.5          No
5       7.5          No
5       7.5          No
10      7.5          Yes
10      7.5          Yes
10      12.5         No
10      12.5         No
10      12.5         No
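
A minimal sketch of that iteration, assuming the same myelement and attribute names as above; the one-minute interval and the skip-bad-samples handling are illustrative only:

AFTimeRange range = new AFTimeRange(new AFTime("*-10m"), new AFTime("*"));
AFTimeSpan interval = new AFTimeSpan(TimeSpan.FromMinutes(1));

// Interpolated, evenly spaced samples for both the value and its limit.
AFValues sourceValues = myelement.Attributes["Source Tag"].Data
    .InterpolatedValues(range, interval, null, null, false);
AFValues limitValues = myelement.Attributes["High Limit"].Data
    .InterpolatedValues(range, interval, null, null, false);

int outOfRange = 0, total = 0;
for (int i = 0; i < sourceValues.Count && i < limitValues.Count; i++)
{
    // Skip bad samples; custom bad-data handling would go here.
    if (!sourceValues[i].IsGood || !limitValues[i].IsGood)
        continue;

    total++;
    if (sourceValues[i].ValueAsDouble() >= limitValues[i].ValueAsDouble())
        outOfRange++;
}

Console.WriteLine("Out of range: {0} of {1} samples", outOfRange, total);

For the 10-sample window in the table, this reproduces the 2-of-10 result.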

 

This solution yields me exactly the correct answer and allows me to deal with bad data in a custom manner.  The two downsides are:

  1. The performance may get hairy over longer time periods (e.g. 1 year) or many elements (e.g. 20,000).
  2. The calculation is easy to prove this way, but because it is not time-weighted it is not precisely correct.  It assumes that if a value is above the limit at a given minute, it was above that limit for the entire previous minute (like a step chart).  (See the sketch below.)
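
On the second point, one possible refinement, sketched here reusing sourceValues and limitValues from the loop above, is stepped time-weighting: assume each sample holds until the next timestamp and accumulate seconds rather than counting samples.  With evenly spaced interpolated data this gives the same answer as the simple count; it matters when the samples are unevenly spaced (aligning two separately timestamped event series on a merged timeline is beyond this sketch):

// Stepped time-weighting: each sample is assumed to hold until the
// next timestamp; accumulate the seconds spent at or above the limit.
double secondsAbove = 0, secondsTotal = 0;
for (int i = 0; i < sourceValues.Count - 1 && i < limitValues.Count; i++)
{
    if (!sourceValues[i].IsGood || !limitValues[i].IsGood)
        continue;

    double seconds =
        (sourceValues[i + 1].Timestamp.UtcTime -
         sourceValues[i].Timestamp.UtcTime).TotalSeconds;

    secondsTotal += seconds;
    if (sourceValues[i].ValueAsDouble() >= limitValues[i].ValueAsDouble())
        secondsAbove += seconds;
}

double percentAbove = secondsTotal > 0 ? 100.0 * secondsAbove / secondsTotal : 0;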

 

So the question to you knowledgeable folks is:  is there a better way to do this?

 

Thanks!
