
Best way to retrieve the whole PI AF element hierarchy with non-PI Point data reference attributes.

Question asked by AbhayP on Nov 27, 2016
Latest reply on Nov 29, 2016 by Mike Zboray

I am building a job-scheduler Windows service that runs once per day. The scheduler uses the AF SDK to retrieve:


1. All elements with their associated attributes (non-PI Point data references)

2. All attribute templates (if any exist)

3. All element templates (if any exist)
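For items 2 and 3, the templates can be enumerated directly from the database's template collections; a minimal sketch, assuming `database` is the target AFDatabase:

            // Walk every element template in the database, then each of its
            // attribute templates (collections exposed by the AF SDK).
            foreach (AFElementTemplate elementTemplate in database.ElementTemplates)
            {
                foreach (AFAttributeTemplate attributeTemplate in elementTemplate.AttributeTemplates)
                {
                    // export template metadata here
                }
            }

This avoids any dependency on whether individual elements were created from a template.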


To retrieve all elements I am using the static method of the AFElement class, "public static AFNamedCollectionList<AFElement> LoadElementsToDepth(IList<AFElement> elements, bool fullLoad, int depth, int maxCount)", and then iterating over the elements to get their associated attributes:


            AFAttributes attributes = element.Attributes;
            foreach (AFAttribute attribute in attributes)
            {
                if (attribute.DataReference == null ||
                    !attribute.DataReference.Name.Equals("PI Point", StringComparison.OrdinalIgnoreCase))
                {
                    AFValues afValues = attribute.GetValues(timeRange, int.MaxValue, null);
                }
            }



Question 1: I can see a problem with "LoadElementsToDepth" when there is a huge list of elements: it may increase the memory footprint, and I don't know the depth in advance. Is there a better way to get all elements with non-PI Point attributes, one with better performance and a smaller memory footprint? I am also not sure whether the client installation has templates (element and/or attribute), so I don't know if I can use the overloads that take template parameters.
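One option that may reduce the memory footprint is the AFElementSearch class (AF SDK 2.8+), which pages results from the server instead of loading the whole hierarchy in one call. A rough sketch, assuming `database` is the target AFDatabase; the query string and page size here are just illustrative:

            using (AFElementSearch search = new AFElementSearch(database, "AllElements", "Name:'*'"))
            {
                search.CacheTimeout = TimeSpan.FromMinutes(10);
                // fullLoad: true loads attributes with each page;
                // pageSize controls how many elements come back per server round trip.
                foreach (AFElement element in search.FindElements(fullLoad: true, pageSize: 1000))
                {
                    // Process the element here, then let it go out of scope
                    // so the garbage collector can reclaim it before the next page.
                }
            }

Because the search enumerates page by page, memory stays roughly proportional to the page size rather than to the total element count, and no depth parameter is needed.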


Question 2: I am planning to load a full snapshot only once at service start, and then subscribe to the Changed event of AFDatabase to retrieve changed elements/attributes. Is that a good implementation choice? Also, if something fails while handling the Changed event, I will reload the whole snapshot (data duplication is not a problem for me). What is the difference between AFDataPipe and the Changed event of AFDatabase?
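For context on the two mechanisms: as I understand the AF SDK, database change notifications report metadata changes (elements, attributes, templates created/modified/deleted), while AFDataPipe streams new data values for attributes you have signed up. The metadata side can also be polled with AFDatabase.FindChangedItems, which returns changes since a cookie; a rough sketch (names from the AF SDK, error handling omitted):

            object cookie = null;
            // Poll on a timer; each call returns AF object changes since the last cookie.
            List<AFChangeInfo> changes = database.FindChangedItems(false, 1000, cookie, out cookie);
            foreach (AFChangeInfo change in changes)
            {
                // Resolve the change record back to the AF object it describes,
                // then re-export that element/attribute to the data lake.
                AFObject changedObject = change.FindObject(database.PISystem);
            }

So the Changed event/FindChangedItems path fits your "hierarchy changed" case, and AFDataPipe would only matter if you also needed value updates, which your non-PI Point scenario may not.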


My end goal is to push the whole hierarchy into a data lake once per day.


Thanks in advance.