This could be caused by backfilled or recalculated data for the analysis.
For analyses that write outputs to attributes configured as PI points, backfilling or recalculation results in out-of-order events being written.
Such events are written to the PI Data Archive without compression, because out-of-order events are not filtered by the compression test. As a result, out-of-order data produces more events in the archives
than if the same events had arrived in order and been filtered by compression.
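To make the distinction concrete, here is a minimal sketch (not the actual PI Data Archive algorithm, which uses a swinging-door test) of a simple deadband filter. The key point it illustrates is that the filter only works on a stream arriving in time order against the last archived value; backfilled events with earlier timestamps bypass this path entirely and are archived as-is. The function name and `comp_dev` parameter are illustrative, not real PI API names.

```python
def compress_in_order(events, comp_dev):
    """Simplified deadband filter over an in-order stream of
    (timestamp, value) pairs: drop an event if it deviates from the
    last archived value by less than comp_dev.

    Illustration only -- the real PI Data Archive applies a
    swinging-door compression test, and backfilled (out-of-order)
    events skip this filtering entirely.
    """
    archived = []
    for ts, value in events:
        if not archived or abs(value - archived[-1][1]) >= comp_dev:
            archived.append((ts, value))
    return archived

# An in-order stream: small wiggles are dropped, the jump is kept.
stream = [(0, 10.0), (1, 10.1), (2, 10.2), (3, 12.0)]
print(compress_in_order(stream, comp_dev=0.5))
# -> [(0, 10.0), (3, 12.0)]
```

A backfilled event at, say, timestamp 1.5 would never pass through this function; it would be written straight to the archive, which is why backfilled spans hold every event.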
Thanks for the reply, Thyagarajan.
OK, I understand. Yes, this is an analysis writing outputs to attributes.
If my analysis depends on PI points X1, X2, and X3, do the archives store uncompressed data for those three points? How can the archive have uncompressed data ready just in case backfilling produces out-of-order events?
I'm just trying to further my understanding! Thanks.
I'm not sure I understood your question correctly. Values are stored uncompressed only for the analysis outputs that are backfilled. Writing events with timestamps earlier than a tag's current snapshot value makes those events out of order. For the input attributes/PI points, only one set of values exists in the archive (usually compressed, if the data arrived in order).
Note: Compression in the PI Data Archive context is an irreversible process.
The compression applied is lossy (irreversible compression), so you can't reproduce the raw data as it originally came in. If compression is applied to points X1, X2, and X3, you will be recalculating your data using lower-fidelity data. In most cases this is not an issue. When you recalculate, if the Step point attribute is turned off for the inputs, the data is interpolated whenever no archived value exists at the sample time.
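For anyone curious what that interpolation looks like: with Step off, retrieval between two archived events behaves like linear interpolation. Below is a small sketch under that assumption (the function name is made up for illustration; it is not a PI SDK call), which also shows why recalculating from compressed inputs works with reconstructed rather than raw values.

```python
def interpolate(archive, t):
    """Return the value at time t from a sorted list of archived
    (timestamp, value) pairs, using linear interpolation between
    neighbouring events -- mimicking retrieval for a non-Step tag.
    Illustration only; a Step tag would instead hold the previous
    value until the next event.
    """
    for (t0, v0), (t1, v1) in zip(archive, archive[1:]):
        if t0 <= t <= t1:
            return v0 + (v1 - v0) * (t - t0) / (t1 - t0)
    raise ValueError("t is outside the archived time range")

# If compression kept only the endpoints of a ramp, the sample at
# t=5 is reconstructed, not the raw value that was originally sent.
archive = [(0, 10.0), (10, 20.0)]
print(interpolate(archive, 5))  # -> 15.0
```

So a recalculation sampling at t=5 sees 15.0 regardless of what the raw (compressed-away) value at that time actually was; as noted above, this is usually acceptable.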
In your case, I'd recommend disabling compression on the calculation output tag, so that a value is always stored at two-minute intervals.