5 Replies Latest reply on Aug 26, 2015 6:58 PM by skwan

    "Attach to analysis" in Abacus

    Rhys Kirk

I would very much appreciate a feature where I can somehow attach to an existing, running analysis and see all of its details from the PI Analysis Processor, including an execution of the analysis: what the inputs were, where they were read from (cache or source), etc.

There have been so many occasions where a particular analysis stops behaving as expected and I have to have blind faith that the PI Analysis Processor is executing correctly. Using the Evaluate button in PSE (or its programmatic equivalent) runs in a different context than the PI Analysis Processor, so it only proves that a calculation "should" be executing correctly. I need to be able to debug a particular analysis in detail. Next to Evaluate in PSE it would be great to have an "Attach to Analysis" button, so that the PI Analysis Manager requests debug information from the PI Analysis Processor for that analysis.

The usual response from TS (no disrespect to TS) is to restart the PI Analysis Processor, which simply masks the problem until it resurfaces.


Before I go digging (and we know I like to go digging), is there already an undocumented way to do this? What is the likelihood of such functionality being added?


      [Starts digging]


        • Re: "Attach to analysis" in Abacus
          Mike Zboray

          We generally don't build "secret" features that have no manifestation in the UI, certainly not something as powerful and complex as an analysis debugging capability. Programmatic exposure is another story, but if the capability is there, we would have exposed it to the user in some fashion.

          • Re: "Attach to analysis" in Abacus

+1 to Rhys on this. I have around 300+ analyses configured for a client, and this is the feature I have missed the most. If an analysis suddenly goes bonkers and starts giving unexpected results, I have no way to troubleshoot or debug it directly other than going through each and every input and checking its value. There were cases where an analysis was writing to 5 output attributes, but on a particular day it wrote to only 3 of the 5 for the current date, and the remaining 2 did not get written at all. And guess what worked to resolve this? Stopping the analysis and restarting it on a periodic 1-minute schedule to test during the next minute. After that, it wrote to all 5.


In our case we had the correct results in another historian against which we could validate our PI results. But now that I think about it, what would happen in a greenfield implementation, for example? We wouldn't have any validation point and would have to put blind faith in the analysis processor; if things suddenly started acting weirdly in the production/live environment (wrong results, no executions, etc.), that would be scary.


So a debugging procedure or a debugging add-in for analyses would be really helpful. Not sure if I made sense, Rhys, but that's what I guess I understood from your post above, so I thought of spamming your post with my thoughts! Sorry for that!