6 Replies Latest reply on Jul 8, 2016 12:18 PM by Roger Palmen

    KPI calculation by AF Analytics


      Hi all,

      We are currently trying to implement some technical KPIs. In some reports, these KPIs will be compared across different sites in order to determine the sites on which an intervention has to be performed with high priority.

      As a first approach, using AF Analytics (especially because of its template support) seems like a good idea.


      For example, two temperatures (Temp A and Temp B) are recorded in two PI tags on an hourly basis.

      And KPI = Min( (number of readouts with Temp A < 50°C / number of Temp A readouts), (number of readouts with Temp B < 32°C / number of Temp B readouts) ) * 100%
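To make the formula concrete, here is the arithmetic sketched in Python on hypothetical readout values (this is not the AF Analytics implementation, just the calculation the formula describes; the data and the `kpi` helper name are made up for illustration):

```python
# Sketch of the KPI arithmetic on hypothetical hourly readouts.
def kpi(temps_a, temps_b, limit_a=50.0, limit_b=32.0):
    # Fraction of readouts below each temperature's threshold.
    frac_a = sum(1 for t in temps_a if t < limit_a) / len(temps_a)
    frac_b = sum(1 for t in temps_b if t < limit_b) / len(temps_b)
    # The KPI is the worse (smaller) of the two fractions, as a percentage.
    return min(frac_a, frac_b) * 100.0

# Four readouts each: 3 of 4 below 50 for Temp A, 2 of 4 below 32 for Temp B.
print(kpi([45, 48, 52, 49], [30, 35, 31, 33]))  # -> 50.0
```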


      In the reports (which probably won't be implemented with PI tools), this KPI will be visualized at least on a daily basis, but the user will be able to change the period.

      Three questions:

      - I didn't find a function in PI Analytics that counts the number of readouts below a threshold. Is there another way to implement this KPI than adding a new PI tag containing only the temperatures below these thresholds?

      - To calculate this KPI on a daily basis, I guess the calculation has to be triggered each day at midnight for the previous day. If, for some reason, the temperatures for the previous day are stored in PI with some delay (for example, a break in communication with the telemetry system), is there a way to automatically backfill the results of this KPI?

      - If we want to visualize the value of this KPI over, for example, one week, the average of the daily KPI values will be wrong (if the number of readouts differs from day to day). So should we implement this calculation directly in the reports (for example via PI OLEDB)? In that case, for performance reasons, maybe it's better to store the readout counts in PI tags (for Temp A, Temp B, Temp A below threshold, Temp B below threshold). What do you think?
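The third point can be shown with a small worked example (hypothetical numbers, one temperature only for brevity): averaging daily KPI fractions differs from the count-weighted result when the number of readouts varies per day, which is exactly why storing the raw counts is attractive.

```python
# Two hypothetical days with different numbers of readouts for Temp A.
# Day 1: 24 readouts, 12 below threshold -> daily fraction 0.5
# Day 2:  6 readouts,  6 below threshold -> daily fraction 1.0
days = [(12, 24), (6, 6)]  # (count below threshold, total count) per day

# Naive average of the daily fractions overweights the short day:
naive = sum(below / total for below, total in days) / len(days)

# Count-weighted aggregation from the stored counts is the true weekly fraction:
weighted = sum(b for b, _ in days) / sum(t for _, t in days)

print(naive, weighted)  # -> 0.75 0.6
```

Storing the four counts per day lets the report recompute `weighted` for any user-chosen period with simple sums.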


      Thanks in advance for your answers,




        • Re: KPI calculation by AF Analytics
          Dan Fishman


          There is no function to filter the count below a threshold; however, I would review this KB article: KB01120 - Filtered calculations in Asset Analytics. Essentially, you have to use a second tag to obtain an event count.
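The two-step pattern the KB article describes can be mimicked in plain Python to show the idea (this is an illustration of the logic, not Asset Analytics syntax; the readout values are made up): one analysis writes a 0/1 flag per readout to a second tag, and an EventCount-style aggregation then sums the flags.

```python
# Step 1: a helper output ("second tag") holding 1 when the readout is
# below the threshold, 0 otherwise.
readouts = [45.0, 48.0, 52.0, 49.0]               # hypothetical Temp A values
flags = [1 if t < 50.0 else 0 for t in readouts]  # the flag tag's values

# Step 2: an EventCount-style aggregation over the flag tag.
below = sum(flags)
print(below, len(readouts))  # -> 3 4
```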


          Currently, it is not possible to automatically backfill the results of the KPI if the data stream is delayed.  This is a familiar question and I don't know where this fits into our road map for AF Analytics.  If I hear anything about our plans to implement such a feature I'll let you know or have Steve Kwan (the AF product manager) post.


          Regarding your performance-related question: if you are already filtering the values because of the first issue, you might as well use those counts. It is probably easier to do it with two tags via PI OLEDB Enterprise. I wish there were a better way to solve your first issue; on the other hand, filtering isn't that expensive, and it lets you change your limits dynamically if you need to. The filtering you need is relatively straightforward using the AF SDK with PE syntax.




          • Re: KPI calculation by AF Analytics
            Marcos Vainer Loeff

            Hi Jan,


            Here are my answers:


            1) I couldn't find such a function either; I don't think this is possible yet in version 2.8. Ideally, there would be a new overload of the EventCount expression, something like:


            "EventCount(attribute attname, time startTime, time endTime, string filterExpression)"


            Is this what you are looking for? I can forward it to the product manager.


            2) There is no out-of-the-box solution to automatically backfill the results, but triggering a backfill programmatically should be possible soon. Please refer to the following thread for more information:

            Re: Automate AF Analysis Backfill



            3) For creating reports there are several strategies and PI Developer Technologies available, for instance PI AF SDK, PI Web API, and the PI OLEDB providers. Yes, you can get the recorded values and calculate the KPIs on the fly. For performance reasons, I prefer PI Web API and PI AF SDK, since they allow bulk calls to retrieve data from multiple streams.


            Please let me know if this helps you!

            • Re: KPI calculation by AF Analytics
              Lonnie Bowling

              It seems to me that you are trying to determine the time below a certain threshold as a percentage of the total time range. The counts you are talking about just happen to be hourly, so I view them as a representation of time here. The readings in this case are hourly, but what if they were not? Have you looked at event frames? You could trigger an event frame when the temperature drops below the threshold and close it when the temperature goes back above it. To get the percentage, you query the event frames over the period you are interested in. This won't solve the Min part of the equation, but you could do that in the client application.
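The event-frame approach boils down to summing interval durations and dividing by the reporting period. A minimal sketch in Python, assuming hypothetical (start, end) intervals standing in for event frames returned by a query:

```python
from datetime import datetime, timedelta

# Hypothetical event frames: intervals where the temperature was below the
# threshold, within a one-day reporting period.
day_start = datetime(2016, 7, 1)
day_end = day_start + timedelta(days=1)
frames = [
    (datetime(2016, 7, 1, 2, 0), datetime(2016, 7, 1, 8, 0)),   # 6 h below
    (datetime(2016, 7, 1, 20, 0), datetime(2016, 7, 1, 23, 0)), # 3 h below
]

# Percentage of the period spent below the threshold: 9 h of 24 h.
below = sum((end - start).total_seconds() for start, end in frames)
pct = below / (day_end - day_start).total_seconds() * 100.0
print(pct)  # -> 37.5
```

Because the result is time-weighted rather than count-weighted, it stays correct even if the readings were not evenly spaced, which is the strength of this approach.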


              This would be a good use case for the requested feature that Roger Palmen talks about: allowing rollups to be calculated over event frames.





              • Re: KPI calculation by AF Analytics
                Roger Palmen


                In this case the readings are hourly, so you could make use of that by creating 24 checks, one for each of the past 24 hours, and summing those. It ain't pretty, but it works. With approximately 51 analysis expressions you should be able to get your results.
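The structure of that workaround, sketched in Python (hypothetical hourly values; in AF Analytics each check would be its own expression): 24 per-hour checks plus a sum, per temperature, plus the final Min, which is roughly where the 51-expression estimate comes from (2 × (24 + 1) + 1).

```python
# One hypothetical temperature, sampled once per hour for the past day:
hourly_temps = [49.0] * 18 + [55.0] * 6   # 18 of 24 hours below 50 C

# 24 separate checks, one per hour of the past day, then a sum.
checks = [1 if hourly_temps[h] < 50.0 else 0 for h in range(24)]
print(sum(checks), len(checks))  # -> 18 24
```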


                But using EventFrames as Lonnie suggested could be another option.

                Another idea: we should be able to filter the data of a PI point before using that data in another attribute. I see these kinds of requests, where we need to perform aggregations on filtered data, far too often; currently there is no separation between the logic that filters a dataset and the logic that aggregates it.