
Is this a good use case for a Custom Data Reference?

Discussion created by gfmalek on May 22, 2020
Latest reply on May 29, 2020 by skwan

We have a PI tag that records an emissions rate (values are roughly 1 minute apart). The 30-day average of the emissions rate has to stay under a limit, say "x". I am trying to build an AF analysis that estimates how long it will take for the 30-day average to exceed the limit "x", assuming the current emissions rate continues indefinitely. Of course, this is only relevant when the current 30-day average is below the limit but the current rate is above it.

 

The math for this is rather simple. However, the analysis I've created (outlined below) takes far too long to evaluate over 30 days' worth of data. I fear that when I backfill this calculation it will produce faulty results, or will cause problems when displaying or updating in real time.

 

When I tried this using only an hour of data, on the other hand, it evaluated almost instantaneously. That makes me believe the problem with my approach is how computationally expensive it is. As an amateur "programmer", I have little sense of how computationally expensive calculations are in general. I hope someone here has an idea of how to make this analysis less expensive, or can say whether building a CDR instead is the better approach.

 

Here is the analysis:

 

EndTime                 :=  '*'
StartTime               :=  EndTime - 3600*720        (3600 seconds in 1 hour, 720 hours in 30 days)
Limit                   :=  .12                       (The "x" as outlined above)
CurValue                :=  TagVal('Tag', '*')
CompressedValues        :=  RecordedValues('Tag', StartTime, EndTime)

TWACurValPerpetuity     :=  MapData(CompressedValues,
                                ((TimeStamp($val) - StartTime)*CurValue
                                 + (EndTime - TimeStamp($val))*TagAvg('Tag', TimeStamp($val), EndTime))
                                / (EndTime - StartTime))

FilterOnlyExceedTimes   :=  FilterData(TWACurValPerpetuity, $val > Limit)
TimeUntilExceed         :=  TimeStamp(FilterOnlyExceedTimes[1]) - StartTime

The variable "TWACurValPerpetuity" calculates, at every emissions-rate data point's timestamp, a projected rolling 30-day average: the values that would roll off the back end of the time frame are replaced with the current value at the front end. The last two variables then identify the first timestamp at which this extrapolated, time-weighted average exceeds the Limit variable. Sorry if this is poorly described; I will try to elaborate if asked.
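
To make the math concrete, here is a rough Python sketch of what "TWACurValPerpetuity" computes for each archived timestamp. This is only for illustration (it is not AF syntax), and the stepped time-weighted average below is just a stand-in for TagAvg():

from datetime import datetime, timedelta

def time_weighted_average(points, start, end):
    # Stepped time-weighted average of (timestamp, value) pairs over [start, end];
    # a rough stand-in for TagAvg('Tag', start, end).
    total, weight = 0.0, 0.0
    for (ts, val), (next_ts, _) in zip(points, points[1:] + [(end, None)]):
        lo, hi = max(ts, start), min(next_ts, end)
        if hi > lo:
            dt = (hi - lo).total_seconds()
            total += val * dt
            weight += dt
    return total / weight if weight else float("nan")

def projected_averages(points, cur_value, start_time, end_time):
    # For each archived point at time t, project what the 30-day average will be
    # (t - start_time) into the future, assuming cur_value persists from now on:
    # the data from t..now stays in the window, and the part that rolls off the
    # back is replaced by cur_value -- the same math as the MapData expression.
    window = (end_time - start_time).total_seconds()
    out = []
    for ts, _ in points:
        elapsed = (ts - start_time).total_seconds()    # seconds covered by the assumed current rate
        remaining = (end_time - ts).total_seconds()    # seconds still covered by historical data
        hist_avg = time_weighted_average(points, ts, end_time)
        out.append((ts, (elapsed * cur_value + remaining * hist_avg) / window))
    return out

if __name__ == "__main__":
    # Made-up numbers: a flat 0.10 history, current rate 0.15, limit 0.12.
    now = datetime(2020, 5, 22)
    start = now - timedelta(days=30)
    pts = [(start + timedelta(hours=h), 0.10) for h in range(720)]   # hourly, to keep the demo quick
    for ts, proj in projected_averages(pts, 0.15, start, now):
        if proj > 0.12:
            print("Projected 30-day average first exceeds the limit", ts - start, "from now")
            break

Written this way, the inner average gets recomputed once per archived point (roughly 43,200 points for 30 days of 1-minute data), which I assume is the main source of the expense.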


I also tried replacing the TagAvg() piece inside the "TWACurValPerpetuity" variable with a simple Avg() of the "CompressedValues" variable. I thought this might reduce the number of times the analysis had to pull the tag's archive values. Unfortunately, even with that adjustment the analysis still had not finished evaluating after ~10 minutes.

My idea for the CDR approach would be to first pull the compressed tag values, then replace each value in the array, one by one, with the current value until the average of the new array surpasses the limit.
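
As a rough illustration of that loop (plain Python rather than actual data reference / AF SDK code, and the numbers are made up), it would look something like this:

def samples_until_exceed(values, cur_value, limit):
    # Replace the archived values one by one (oldest first) with the current value
    # and return how many replacements it takes before the simple average of the
    # array surpasses the limit, or None if it never does.
    #
    # values    : the ~43,200 compressed values from the last 30 days (oldest first)
    # cur_value : the tag's current value, assumed to continue going forward
    # limit     : the 30-day average limit ("x", e.g. 0.12)
    total = sum(values)
    n = len(values)
    for i, old in enumerate(values, start=1):
        total += cur_value - old      # swap the next-oldest value for the current value
        if total / n > limit:
            return i                  # number of replaced samples (~minutes at 1-minute data)
    return None

# e.g. (made-up numbers): samples_until_exceed([0.10] * 43200, 0.15, 0.12) returns 17281,
# i.e. roughly 12 days until the 30-day average would cross the limit.

This uses a simple average of the array values rather than the time-weighted average the analysis uses; since the points are roughly a minute apart I'm assuming that is close enough, but a real CDR would probably need to weight by time in the same way.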


Any thoughts or advice would be greatly appreciated.  Thanks.
