Well, the best way is of course to prevent this data from being captured in the first place. I'd suggest adjusting the compression settings (or the interface settings, for interfaces like UFL) for the tag so that these duplicate values are never stored.
The second-best option I can think of, if this really is an issue, is to copy the data to a second tag using a PE tag or Abacus, and lose the duplicates there.
Thank you Roger,
I forgot to say that I cannot manipulate server data; I'm just a client pulling values.
As a client just pulling data, you have discovered an issue that should be addressed on the server. Please consider reporting it to the PI Data Archive administrator.
Well, I thought I could perform a remote procedure call to filter out all duplicated values directly on the server side.
Just like summaries, which are calculated remotely.
All you need to do is pass the filter expression parameter in your point.Data.RecordedValues call. The filter expression is the fourth parameter and it is optional.
The expression that would generally fit your needs should be something like
'tag' <> prevval('tag','*')
Use the tag name of the current PIPoint object in place of where I show 'tag', and make sure that you keep the single quotes. This will remove repeating values and return only the first value of each repeating sequence.
You may also want to change your boundary type to Outside (the third parameter) so that repeating values leading into your time window aren't completely filtered out.
FYI: the AF SDK makes this even easier because you can use '.' in place of 'tag'; the SDK infers that you mean the referenced point. That way you don't need to format a custom filter string for every point you wish to filter.
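If you do need to format the filter string per tag (for example, when you're not on the AF SDK and can't use the '.' shorthand), the pattern above is easy to generate from the client. A minimal sketch in Python, with a hypothetical helper name (`repeat_filter` is my own naming, not a PI API):

```python
def repeat_filter(tag: str) -> str:
    """Build the performance-equation filter that drops consecutive
    repeating values, i.e.  'tag' <> prevval('tag','*')  with the
    tag name substituted in. The single quotes must be preserved."""
    return f"'{tag}' <> prevval('{tag}','*')"

# Example for a tag named "sinusoid" (assumed tag name):
repeat_filter("sinusoid")  # → "'sinusoid' <> prevval('sinusoid','*')"
```

The returned string would then be passed as the filter expression parameter of the RecordedValues call.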
I hope this helps.
Your solution worked fine, but the evaluation time (the metric I wanted to improve with my filter) increased drastically (in some cases by a factor of 10).
It's probably due to the server-side expression evaluation required to obtain the filtered values.
I don't see any alternative way to do what I need, so I think I will keep filtering my values client-side and hope for the best.
Thanks anyway for the support.