
    Backfill Sampling Density Different than Data Capturing?

    Seedubs

      Hi there!

       

      I am wondering why my sample density does not stay consistent after backfilling.

      [Attached screenshot: 92.PNG]

       

      I then re-backfill my data for the last 5 days.

       

       

      I currently have my trigger scheduled for every 2 minutes. When backfilling I get the 2-minute density, but not for real-time capturing.

       

      Why does this happen? I would ideally want the 2-minute sampling all the time.

       

      Thanks,

       

      Corey

        • Re: Backfill Sampling Density Different than Data Capturing?
          tramachandran

          This could be caused by backfilled/recalculated data for the analysis.

          For analyses writing outputs to attributes configured as PI points, backfilling or recalculation results in writing out-of-order events.

          Such events are written to the PI Data Archive without compression: out-of-order events are not filtered by the compression algorithm, so backfilled data results in more events being stored in the archives than if the same events had arrived in order and been filtered by compression.
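
          As a rough illustration of why this changes the density, here is a small Python sketch. It uses a simple deadband filter as a stand-in for the PI Data Archive's actual swinging-door compression, and the signal and CompDev value are invented for the example:

```python
# Toy sketch, NOT the real PI Data Archive algorithm (PI uses swinging-door
# compression): in-order events pass through a deadband-style filter, while
# backfilled (out-of-order) events are written to the archive unfiltered.

COMP_DEV = 0.5  # stand-in for the tag's CompDev (compression deviation)

def write_in_order(events, comp_dev=COMP_DEV):
    """Archive an event only if it moved far enough from the last archived value."""
    archived = []
    for timestamp, value in events:
        if not archived or abs(value - archived[-1][1]) > comp_dev:
            archived.append((timestamp, value))
    return archived

def write_backfill(events):
    """Backfilled events are out of order, so compression never filters them."""
    return list(events)

# A slowly drifting signal sampled every 2 minutes for an hour.
events = [(t, 100.0 + 0.1 * t) for t in range(0, 60, 2)]

print(len(write_in_order(events)))  # only a few events survive compression
print(len(write_backfill(events)))  # all 30 two-minute events are archived
```

          With the same 2-minute input, the in-order path keeps only a handful of events while the backfilled path keeps all of them, which is the density difference between real-time capture and backfill.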

            • Re: Backfill Sampling Density Different than Data Capturing?
              Seedubs

              Thanks for the reply, Thyagarajan.

               

              OK, I understand. Yes, this is an analysis writing outputs to attributes.

               

              If my analysis depends on PI points X1, X2, and X3, do the archives store uncompressed data for those three points? How can the archive have uncompressed data ready just in case of backfilling out-of-order events?

               

              I'm just trying to further my understanding. Thanks!

                • Re: Backfill Sampling Density Different than Data Capturing?
                  tramachandran

                  I'm not sure I understood your question correctly. Values are stored uncompressed only for the analysis outputs that are backfilled. Writing events prior to a tag's current snapshot value makes those events out of order. For the input attributes/PI points, there is only one set of values in the archive (usually compressed, since that data arrived in order).

                  Note: Compression in the PI Data Archive context is an irreversible process.

                  • Re: Backfill Sampling Density Different than Data Capturing?
                    Dan Fishman

                    The compression that is applied is lossy (irreversible), so you can't reproduce the raw data that came in. If compression is applied for points X1, X2, and X3, you will be recalculating your analysis using lower-fidelity data. In most cases, this is not an issue. When you recalculate, if the Step point attribute is turned off for the inputs, the data is interpolated whenever a value does not exist in the archive at the sample time.
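
                    As a rough sketch of that interpolation step (plain Python rather than anything PI-specific; the archived values are invented for the example), a recalculation at a sample time with no archived event either interpolates between the neighbouring events or holds the previous value, depending on Step:

```python
# Toy sketch: how a recalculation gets an input value at a sample time when
# compression removed the original event. Timestamps are in minutes and the
# archived values are invented for the example.
archive = [(0, 100.0), (10, 105.0), (20, 103.0)]

def value_at(archive, t, step=False):
    """Value of a tag at time t from its sparse archived events.

    step=False -> linear interpolation between the surrounding events.
    step=True  -> hold the previous archived value (stepped behaviour).
    """
    prev = next((e for e in reversed(archive) if e[0] <= t), None)
    nxt = next((e for e in archive if e[0] >= t), None)
    if prev is None or nxt is None:
        return (prev or nxt)[1]
    if step or prev[0] == nxt[0]:
        return prev[1]
    frac = (t - prev[0]) / (nxt[0] - prev[0])
    return prev[1] + frac * (nxt[1] - prev[1])

# Recalculating an analysis at minute 4, where no archived event exists:
print(value_at(archive, 4))             # 102.0  (Step off: interpolated)
print(value_at(archive, 4, step=True))  # 100.0  (Step on: previous value held)
```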

                     

                    In your case, I'd recommend disabling compression on the calculation output tag so that a value is always stored at two-minute intervals.