
AF Community Library


I recently wrote up a short introduction to using the URI Builder data reference to create links to Google Maps. Here, I want to do the same to build a PI Coresight Ad Hoc display.


Ad Hoc displays are limited to displaying trends, so this particular example might not have wide appeal, but I believe the techniques shown below deserve to be more widely known.


In this example, we will show how to display the URL to open the following PI Coresight page.


We typically recommend storing the name of a PI Data Archive in a top-level element, and I did the same to store the PI Coresight server information.


Let's now look at how I configured the URI Builder part by part.


First, the address field uses the value: https://'\PI Coresight|Name'/Coresight/. The '\PI Coresight|Name' reference is what allows me to get the value of the PI Coresight server name.

I can see that this is working out well so far by looking at the preview to the left of the OK button.



The Display field is now simply the fixed string "/Display/AdHoc".


To build an Ad Hoc display, we need to get the path of the attribute we want to show in that display.

We can do this using a String Builder data reference. What we need is to build a string that contains the name of the AF server, the database, the path of all elements, and finally the name of the attribute, in this case "Temperature".


The string builder configuration is thus: \\;"%System%";\;"%Database%";\;%ElementPath%;|;Temperature;
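With hypothetical server, database, and element names, the String Builder above would resolve to something like:

```
\\MyAFServer\MyDatabase\Plant1\Tank1|Temperature
```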


We use this String Builder data reference attribute in the URI Builder as the following key/value pair.


The ".|" in ".|path" simply refers to accessing a child attribute of the URI builder attribute.


This makes a complete example, but to flesh things out more, we can add other parameters. Thus, I add EndTime and StartTime as child attributes of my URI Builder attribute. (I also set these child attributes to hidden, as they do not typically need to be seen.)



I can then add them alongside other URL parameters, such as kiosk mode and hidetoolbar.
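Assembled, the final URL takes roughly the following shape. The server name, attribute path, and exact parameter spellings here are illustrative assumptions, not the authoritative Coresight URL syntax:

```
https://coresight01/Coresight/#/Displays/AdHoc?DataItems=\\MyAFServer\MyDatabase\Plant1\Tank1|Temperature&StartTime=*-8h&EndTime=*&mode=kiosk&hidetoolbar=true
```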


This gives me a quick way to see this information on PI Coresight.

The following document describes an example of using AF and Event Frames to perform and report on a simple volume balance.  This can also be applied to a mass or energy balance.

This is an ongoing series introducing the various code examples I and others have created with the goal of localizing the various AF example kits.


Part 0: URI Builder data reference in AF 2016

Part 1: Localizing the AF Example kits - Part 1 - Exporting a database

Part 2: This entry

Part 3: Localizing the AF Example kits - Part 3 - Reset to Template


Objects in AF tend to have a configuration string; it is a way to serialize the object and is what allows you to export your favorite AF template and share it with friends, family, and grateful coworkers.


As you progress on your path towards AF mastery, your knowledge of each of these configuration strings will surely grow, and I will talk more about them in future entries in this blog series. This time, I want to highlight one configuration string that gave me pause at the beginning of this project to localize the AF Example Kits: the one underlying the AFVariableMap object.


The AFVariableMap object is what allows the analysis service to know where to save the output of expressions. That is, it is what keeps track of the left and right sides of the lines in an expression calculation.



When you look at the configuration string of an AFVariableMap, by, say, exporting a database to an XML file, you can see it looks like the following.



I can see the names of my variables, TotalFlowToday and ConstantNumber, as well as my mapped attributes, TotalFlow and SixtyFour. The mapped attributes don't just have their names stored; their attribute ids are stored too. This is good, as it allows you to rename the attributes and the mapping will still work; if analysis tries an attribute lookup by name and fails, it will then attempt a lookup via attribute id.

The issue is that those ids are only valid in the database in which the elements were created. Those ids will be totally different if you were to recreate this configuration in any other database. Thus, if you export the element which contains this analysis and import it into a new database, everything will look correct, as the lookup by attribute name will continue to work, but internally these cached ids will all be wrong. This explains why renaming no longer preserves the mapping for imported databases.


Happily, the AF SDK provides a way to resolve this issue: the AFAnalysisRule.RefreshConfigurationAndVariableMapping method.

It will look at the AFVariableMap object and update the GUIDs to whatever they now happen to be. Making use of it is quite simple: all you need to do is loop over all analyses in your freshly imported database and call this method on them.


foreach (var analysis in db.Analyses)
    analysis.AnalysisRule.RefreshConfigurationAndVariableMapping();


The full code can be found here:

Create calculations that are not stored in a PI Tag and understand which functions are supported for this

Sometimes it is desirable to perform calculations but not to store them in a PI tag.  This is done by creating an Expression Analytic and mapping it to an Attribute that does not have a PI Data Reference.  This Analytic is only performed when a PI AF Client tool asks for the value of that Attribute.  When you do this, the Data Reference for the output Attribute is set to Analysis and the Settings describes which Analysis is involved.


There are some considerations and limitations that you need to think about before creating On Demand calculations:


  • The calculation of the Expression is performed on the PI AF Client.
  • What is the frequency of the calculation, e.g. less than one second frequency?
  • Are the calculations daisy-chained, i.e. one On Demand Calculation uses Attributes that are also On Demand calculations?
  • How many of these calculations will be performed during one data request?
  • What are the typical time ranges for the data requests?

All of these will have an impact on performance of retrieving values for these Attributes.  Therefore, performing On Demand calculations should be used sparingly.


Because some functions are very computationally expensive, they have been either "restricted" or "banned" from use in On Demand calculations.


"Restricted" means that the functions listed below are allowed in an On Demand calculation, but they do not support summary data calls. This applies to the following function list:

  • EventCount()
  • PctGood()
  • Range()
  • StDev()
  • TagAvg()
  • TagMax()
  • TagMean()
  • TagMin()
  • TagTot()


In this example, we take the Flow Attribute and create a rolling 5-minute average, and map it to the Attribute Average Flow.  This Attribute is not mapped to a PI tag, but we can evaluate it over a time range.  See the Figures below.
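The rolling average expression itself is a one-liner; mapped to the Average Flow Attribute, it might look like this sketch:

```
TagAvg('Flow', '*-5m', '*')
```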




However, if you try to use the Average Flow Attribute in another Expression where you want to use it in a Time Series function you will get the following error because data summary calls are not supported for On Demand calculated Attributes:


If we create a new Attribute, Average Flow Historized, and map it to a PI tag, we can then map the average calculation to this new Attribute.


Now if we perform the TagTot() on this Average Flow Historized Attribute all is ok.




"Banned" means that the functions listed below are not allowed in an On Demand calculation.  If you try to use them in that way, you will get an error when the Analyses try to start up for the Elements:


This applies to the following function list:

  • DeltaValue()
  • FindEq()
  • FindGE()
  • FindGT()
  • FindLE()
  • FindLT()
  • FindNE()
  • HasValueChanged()
  • NumOfChanges()
  • TimeEq()
  • TimeGE()
  • TimeGT()
  • TimeLE()
  • TimeLT()
  • TimeNE()
  • NoOutput()

Understand how to use date-times in your expressions

Using date-times in Analytic Expressions looks straightforward, but you can get yourself in a jam if you do not know the tricks.  Suppose you want to calculate the rate of change of an Attribute and also the duration over which this rate of change was computed, and store these results as well.


In this example, we want to calculate the rate of change for the Volume1 Attribute and also the duration for this rate of change, and store these in other Attributes.  You would be tempted to use the following approach, see the Figure below.


The Variables st and et store the date-times of the archived values of the Volume1 Attribute.  The DeltaTime Variable is then just the difference between these two Variables.  The result is in dd:hh:mm:ss format.  We can use it in the ROC Variable calculation and we get a result.  Behind the scenes, the DeltaTime Variable was converted to seconds, so the resulting ROC Variable is in bbl/sec. This is fine, but we also wanted to store the DeltaTime Variable to an Attribute.  When you try this, you get an error on the Element Attribute.


You might try to change the unit of measure for the Raw Delta Time Attribute to DateTime, but then you get an error that the DateTime value type does not support UOMs.  Ay caramba.

No worries, the trick is to use the Float() function.  Use Float() to change the DeltaTime Variable to a double.  Then use the Convert() function to assign units of measure of hours to the DeltaTimeInHours Variable and perform the rate of change calculation.  See Figure below for details. You can now map the DeltaTimeInHours Variable to an Attribute with no problem.
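The Variables described above might be sketched as follows. The exact Float()/Convert() combination is an assumption based on the description, and each line corresponds to a Variable in the analysis:

```
// et and st hold the timestamps of two archived Volume1 values
DeltaTime := et - st                                  // dd:hh:mm:ss, cannot be stored directly
DeltaTimeInHours := Convert(Float(DeltaTime), "h")    // Float() yields a double; Convert() assigns hours
ROC := ('Volume1' - TagVal('Volume1', st)) / DeltaTimeInHours
```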


Write the results of your expressions to a time other than the trigger time

A new feature in AF 2.7 (and later versions) is the ability to put the results of an analysis at a time other than the execution time.  This is very useful in cases where some data, like lab information, gets entered but refers to a previous time.  The calculation is done when the data comes in, but the results need to be written at some previous time.  Another scenario is writing to a future point in time, such as calculating new targets for the end of the month based on current production.  In this case, the value is calculated now and is timestamped at the end of the current month, relative to the current evaluation or trigger time.


Writing back in time or to future time:

The laboratory density data comes in at 7 am, but it refers to the previous day period, defined in this case as 4 am to 4 am. The mass of material for the previous day period is calculated by the total of the volume of material for the previous day period times the density that was recorded at 7 am (but refers to the sample taken for the 4 am to 4 am period).  The calculation is triggered at 7 am and we now want to write the result back to 4 am.  So we use the Advanced… button and set the Output Time Stamp to be Relative to Trigger Time, typing t+4h, as shown in the Figure above.  Presto, problem solved.


Note: you can use the same technique to write to future PI Tags, for example a new calculated target production.

How to create filtered averages, totals, and rollups

Calculations like an average of a set of data, but only when the values are above a certain threshold, cannot be done directly with the built-in functions in Analytics.  The examples below show how you can perform filtered calculations for Expressions and for Rollups. (There is a Knowledge Base article on this subject, KB01120.)


Suppose you want to get an average flow rate for a pump over a time range, but only when the pump was pumping. 


In this example, the time-weighted and event-weighted averages are calculated for a one day period.  The flowrate during this time period is 0 on several occasions.  Since we want to know the average flowrate of the pump only when it is running (flowrate > 0), using the built-in TagAvg() and TagMean() functions will give us the wrong results.  The Figures below show the data set, the Expressions, and the wrong results.



Filtered Time-Weighted Average

To get the correct time-weighted average we need to first create another Attribute, Filtered Flowrate in this example, and use the Formula Data Reference as shown in the Figure below.


Then create an expression that first uses the TimeGT() function to calculate the amount of time that the Flowrate Attribute was greater than 0 over the one day period.  This result is in seconds; therefore, we need to convert it into days, which is the PumpRunningInDays Variable.  The TagTot() function is then applied to the Filtered Flowrate Attribute to sum the values over the one day period.  TagTot() returns the total value over a day, so we then divide this by the PumpRunningInDays Variable.
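Put together, the two steps might be sketched as follows (each line is a Variable; 86400 converts seconds to days):

```
PumpRunningInDays := TimeGT('Flowrate', '*-1d', '*', 0) / 86400
FilteredAverage := TagTot('Filtered Flowrate', '*-1d', '*') / PumpRunningInDays
```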



Filtered Event-Weighted Average

To get the correct event-weighted average we need to create another Attribute and a PI Point.  Use an Expression to filter out the values that are 0 and write only the nonzero values to the new Attribute.


Then use the built in function TagMean() to compute the event-weighted average on this new Attribute.
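A sketch of the two pieces; the filter expression is mapped to the new PI Point-backed Attribute, here given the placeholder name Filtered Flowrate Events:

```
// Filter expression mapped to the new Attribute
If 'Flowrate' > 0 Then 'Flowrate' Else NoOutput()

// Event-weighted average over the filtered values
TagMean('Filtered Flowrate Events', '*-1d', '*')
```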





We want to perform a rollup of data from child-elements, but only include the data where some other conditions are met; in other words, a conditional rollup.  The Rollup Analytic can perform aggregations from child-elements, but cannot filter out unwanted values.  To achieve this, we need to follow the procedure presented in the example below.


Suppose we want to get the total production for all gas wells in a region, but not count flowrates that are below 25 MM scf/d.  Therefore, for the wells we create a new Attribute, Filtered Production, mapped to a PI tag.  The Expression for the filtering is shown in the Figure below.
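The per-well filter expression, mapped to the Filtered Production Attribute, might be sketched like this. The source attribute name is an assumption, and NoOutput() suppresses values below the 25 MM scf/d threshold:

```
If 'Production' >= 25 Then 'Production' Else NoOutput()
```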



The wells are child-elements of the Well Padd1 element.


The Well Padd1 element then has a Rollup Analytic on the Filtered Production Attribute.


Insert comments into your Variables so you and others can understand what you did later

Have you ever written an equation or a piece of code that made sense to you at the time, only to come back a few weeks later (or, in my case, the next day) and wonder what you were thinking?  Worse yet, looking over somebody else's effort and trying to decipher it?  Then you might find this section valuable.  A good practice is to always document and/or reference your calculations.  You may not know it, but you can add comments to the equations you write, and stop having to spend time reinterpreting calculations you or someone else wrote some time ago.  The sections below show you how to create single and multiple line comments, and how to make your equations more readable.


Single Line Comments

To add a comment to the expression in a Variable you need two things: use // to preface the comment, and at the end of the comment press the Shift-Enter key combination.  You can add the comment at the beginning or at the end of the equation; personally, it makes more sense to me at the beginning.  You can also add comments at the end of a line in the expression.
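For example (hypothetical names; press Shift-Enter after the comment to start the new line):

```
// Percent grade change relative to the previous grade
Pct := 100 * ('Current Grade' - 'Previous Grade') / 'Previous Grade'
```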


The set of equations in the Figure below calculate the percent grade change during casting of steel I-beams.  As you can see, without comments it is not easy to understand the calculation.



Adding comments makes the code more understandable, see Figure below.  Note that some comments are interspersed in the expression, like the Pct Variable in the Figure below.


Multiple Line Comments

You can add multi-line comments in one of two ways.  You can use // to preface the comment and press the Shift-Enter key combination at the end of the line, doing this on several lines.  Alternately, you can use /* at the beginning of the first line and */ at the end of the last line of the comment.  When you want to go to a new line, press the Shift-Enter key combination.
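Both styles look like this (the comment text is illustrative):

```
// First comment line
// Second comment line, each ended with Shift-Enter

/* A block comment
   spanning multiple lines */
```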



Make sure you get the correct result by using UOMs in your calculations

In some situations, the expressions will contain calculations that are dependent on a specific unit of measure for each Attribute used in the calculation.  We may also want to make sure that if the
output of the expression is in a certain unit of measure, and we map it to an Attribute, that it will be independent of the Attribute’s default unit of measure setting.  For example, let’s say we have an expression result in lb and it is mapped to an Attribute whose Default UOM setting is also lb.  Then later, we change the Attribute’s Default UOM setting to kg, but the expression result is still in lb and it will not be converted to kg unless you do what is in the example below.


In this example, I want to calculate the mass from the Volume and Density Attributes and assign it to the Mass Attribute.  To make this bullet proof so that I do not get any surprises in the future I make sure that the units of measure for Volume and Density are in the units that will give me the resulting mass in lb.


When I look at the result of the analytic in the Element all is good.


Sometime later, someone has changed the default unit of measure for the Mass Attribute from lb to kg.  I now get the wrong answer.  This is a bit of a sticky wicket as the English say.


If I had been smarter when I first created the Analysis, this would not have been a problem.  If you put the following additional Variable in the expression and map that to the Mass Attribute, then all will be correct.
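The additional Variable might be sketched as follows, assuming the raw result of 'Volume' * 'Density' comes out in lb:

```
Mass := 'Volume' * 'Density'
MassOut := Convert(Mass, "lb")   // explicit UOM survives later changes to the Attribute's Default UOM
```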


The Element now shows the correct value; all is well.


(Note: Knowledge Base article KB01366 gives more details on UOM behavior in expressions.)

Use Attributes from other Elements in your expressions

In some situations, there is a need to include Attributes from other Elements in the expression.  A typical example is a Parent-Element that contains information the Child-Elements need for a calculation.  Therefore, it would be great if we could reference Attributes from other Elements in the expressions.
You may also want to perform calculations (other than Rollups) that involve using values from parent and child-elements.  One way is to create additional Attributes in the current element and then reference the values from the parent and/or child-element.  This is not the best approach and not very flexible.  The better way is to reference them inside the Analyses themselves.  We'll say no more.

In this example, we have well pads that have production targets, and each well is to perform a calculation to determine the well's production as a percent of the target production.  In the Figure below, Well Pad 1 is the Parent-Element and has the Attribute Target Production.  Well Pad 1 also has Well Child-Elements.


The Wells then have an expression configured to retrieve the Target Production Attribute from the Parent-Element and perform the percent of target calculation, as shown in the Figure below.  The relative reference, ..\|Target Production, means go up to my Parent-Element and give me the value of the Attribute Target Production. (The Guide to AF and EF Substitution Syntax has more details on this type of syntax.)
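Such an expression might be sketched as follows (the well's own Production attribute name is an assumption):

```
PctOfTarget := 100 * 'Production' / '..\|Target Production'
```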



If you are going to use Attributes from other Elements, then it is a very good idea to only use relative references.  This also underscores the importance of following a standard when building PI AF structures.


This example shows the syntax needed for referencing parent-element values and child-element values.  The following hierarchy is used for this example.


The Tank1 element needs to use the Total Inventory Attribute from the Well Pad1 Element.  In the Template for Tank, you can create an Expression and retrieve the value of Total Inventory
using the syntax below.  The name of the parent-element can change and your calculation will still be correct.


From the Well Pad1 Element you can write an Analysis that uses the Inventory Attribute from the child-element Tank1.  However, I have many child-elements and I do not want to specify the
name of the child-element because it will be different in other Well Pad Elements.  No worries, use the syntax below, which uses filters and collections to find the Tank Element based on its Template, no matter what its name is.  Note: There is another document called Guide to Substitution Syntax in AF-EF Data References which shows other ways to search for a child-element.


Create the same triggers for Notifications and Event Frames

If you would like to create Notifications when certain conditions are met, it is also a good idea to create Event Frames for the same condition.  The reason is that Event Frames provide a history of the Notifications for easier reporting, and other scenarios like creating Pareto charts and BI analysis.  In addition, Event Frames give us the duration of the event in question, which you cannot get from the Notifications.  The best approach is to use an expression in Analytics to write results to a PI Tag that is used as the trigger for the Notifications and Event Frames.  The example below shows how to set up Notifications and Event Frames for uptime/downtime of pumps.  This example also shows a very good use of the NoOutput() function.

(Note: for the upcoming PI AF 2.8.5 release in Q4 2016, Notifications will become an extension to Event Frames, so the approach outlined below will no longer be needed.)


In this example, we will create events and notifications when a pump stops and then when it turns on again.  The first thing is to create the Event Frame Templates and the Notification Templates.  You should create separate ones, because you will want different Attributes and different content for the two pump conditions, as well as the ability to assign different names to the notifications and events.


The Pump Running template has the Attribute FlowRate in addition to what the Pump Stopped template has.


The Figures below show the triggering for the two Notification templates Pump Off Status and Pump On Status.



The next thing to do is to create the trigger that will trigger the events and notifications for all pumps based on the templates we set up above.  So for the Pump Template create an Expression to write the value of On or Off to the Pump Status attribute depending on whether the flow rate is greater than 0.

(Note the use of the NoOutput() function, so that we only write a value to Pump Status when the condition has changed.  This makes our event and notification generation much more efficient.)
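The status expression might be sketched as follows (attribute names follow the example; NoOutput() avoids rewriting an unchanged status):

```
If 'FlowRate' > 0 And 'Pump Status' <> "On" Then "On"
Else If 'FlowRate' <= 0 And 'Pump Status' <> "Off" Then "Off"
Else NoOutput()
```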


The last thing to do is to create Event Frame Generation expressions to create the pump running and stopped events.  The expressions are now very simple just checking the Pump Status value, and Bob’s your uncle.



Prevent writing the same value to a PI Tag

We used the NoOutput() function in several examples above.  The main thing to be aware of about this function is that it is only supported for expressions where the result of the expression is mapped to an Attribute with a PI Point Data Reference.  It is extremely useful in situations where you want to write values to a PI Tag only when certain conditions are met, and nothing otherwise.


(Note: There is an instance where using NoOutput() does not produce the expected results.  See Knowledge Base article KB01127.)


If you don’t use NoOutput():

Imagine you are creating a trigger tag for events and notifications, writing a value of 1 when the Variable is below 100 and 0 when it is above, as in the example below.  This is very simple and you would be tempted to do it as in the Figure below.


However, this is not a very good idea, since either 1 or 0 will be written to the PI Tag every time this expression is triggered, regardless of what the previous value was.  This comes back to bite you when other expressions or notifications use this PI Tag as their natural trigger for evaluation.  Now all these expressions evaluate many more times than they need to, consuming far more resources and making backfilling much slower (also, trending the resulting Attribute will be slower).  What we really want is to only write a 1 if the previous value was 0, and a 0 if the previous value was 1.  This is where the NoOutput() function comes in very handy.  The Figure below shows how to rewrite the above expression.
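One way to sketch the rewritten expression (PrevVal() reads the previous archived value of the output tag; the Variable and tag names are placeholders):

```
If 'Variable' < 100 And PrevVal('TriggerTag', '*') <> 1 Then 1
Else If 'Variable' >= 100 And PrevVal('TriggerTag', '*') <> 0 Then 0
Else NoOutput()
```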


Make your calculations more robust by using Attributes instead of hard coding values in the expressions

When writing an expression it is not a good idea to hard code things like upper/lower limits, time ranges, and other parameters.  This is a great use for Child-Attributes.  Hardcoding values in expressions only achieves two things: it is faster now and gets you in potential trouble later.  It also does not provide for easy visibility and ability to change parameters based on different assets.  For example, there might be different control limits on different pumps.  The examples below show how to pass time ranges, and upper/lower control limits to expressions.

Passing time parameters:

In this example, I want to calculate the average value of the Volume Attribute over a time range that will be different for different Elements.  To do this, create Attributes in the Template
to retrieve the EndTime, StartTime, and OffSet from a PI AF Table, which will return different values based on the Element in question.


Below are two equivalent expressions to calculate the average value of the Volume Attribute; one uses the EndTime and OffSet Attributes, and the other uses  the EndTime and StartTime Attributes. 

Note: you need to use the ParseTime() function to convert the Attribute values, which are of value type String, to a time format.


If you were to hardcode the times it would look like the HardCoded Variable in the Figure below.

Note: you do not use the ParseTime() function in this case; you just type the strings between single quotation marks.  As I stated before, I strongly discourage this approach unless you are absolutely certain that you will not want to change the time range and that it will not vary between different Elements.
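The three Variables might be sketched as follows; how the OffSet value is applied, and the hardcoded time strings, are assumptions for illustration:

```
AvgFromRange := TagAvg('Volume', ParseTime('StartTime'), ParseTime('EndTime'))
AvgFromOffset := TagAvg('Volume', ParseTime('EndTime') - 'OffSet', ParseTime('EndTime'))
HardCoded := TagAvg('Volume', 'y', 't')   // discouraged: hardcoded yesterday-to-today range
```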


Passing control limits parameters:

The Child-Attributes are best used for items like PI Tag name, control limits, specifications, etc. that relate to the Parent-Attribute.  To do this create Child-Attributes called Maximum and Minimum under the Mass Attribute, and configure them to retrieve the Maximum and Minimum from a PI AF Table, which will return different values
based on the Element in question.


The easiest thing to do when writing the Analysis is to create Variables that get the values from the Child-Attributes and then use these Variables in the evaluation of the operational status of the Element.  When you start typing the name of the Parent-Attribute, IntelliSense does not display the Child-Attributes, so you need to type them yourself using the | symbol, in this case 'Mass|Maximum' and 'Mass|Minimum'.
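A sketch of such Variables (the status logic and labels are illustrative):

```
MaxLimit := 'Mass|Maximum'
MinLimit := 'Mass|Minimum'
Status := If 'Mass' > MaxLimit Or 'Mass' < MinLimit Then "Out of Spec" Else "OK"
```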


You can also use the Attributes at the bottom right of the analysis by selecting the Attribute and clicking Relative.  This inserts the Child-Attribute name into the variable as shown below.


Check for system digital states before using them in calculations.  Validate whether the value is stale.

When dealing with real life systems, it is highly unlikely that you will always have good data for your calculations at all times.  Therefore, it is important to filter the values that are used in calculations, to prevent your calculations from failing and providing the wrong information to the end user.
There are two scenarios that are the most common: Bad Values (System Digital States) and Stale Values (the value has not changed in a certain period of time).  Neither scenario is a desirable one.

The first scenario will cause your calculation to fail, as seen in the Figure below, where the Attribute Gas Flow has a digital state for part of the time range in question and the results of the expression are Calc Failed.


The second scenario is worse, since we will have results for the calculation, but they are bogus and we would not know.  (I have seen one example where a value was being used by the end user in analyses, but, unknown to the user, it had not changed in several days.)  The examples below give some ideas on how to handle these situations.

Bad Value:

The following example just filters out the No Data (or any other System Digital State) and only writes back the results of the calculation when there are actual values for the Gas Flow Attribute.


When the Variable BadGasFlow is TRUE then no output is written to the Adjusted Flow Attribute.  Previewing the results shows what will be written, as shown in the Figure below.
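The pattern can be sketched as follows (BadVal() tests for a system digital state; the adjustment applied to the good values is a placeholder):

```
BadGasFlow := BadVal('Gas Flow')
AdjustedFlow := If BadGasFlow Then NoOutput() Else 'Gas Flow' * 1.02
```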


Stale Value:

This scenario is often talked about and just as often overlooked.  In PI AF 2015, there is a new function, HasChanged(), that makes it easy to validate Attributes used in expressions in certain circumstances.  The examples below demonstrate when to use this function to find a stale value and when to use a different approach.

When to use HasChanged()

This function should be used with caution and can give incorrect results if the expression is not configured correctly.  Take the following example.  The Attribute Trigger has values in
the PI Archive as shown in the figure below. Note the last value archived is the same as the previous value and is more than 10 minutes later.


So now we would like to see if the value of Trigger has changed in the last 10 minutes; in other words, every time the expression is triggered, evaluate whether the value has changed in the last 10 minutes.  Writing the expression using the HasChanged() function and previewing the results tells us that the value of Trigger has changed for every evaluation, see figure below.
This is clearly not true for the last two values.  The IsStale Variable value should be False at 8/3/2015 12:57:53 pm.
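The expression under discussion might look like this sketch; following the Figure, the Variable holds the HasChanged() result directly, and the exact argument form is an assumption based on the time string used later in this section:

```
IsStale := HasChanged('Trigger', '*-10m')
```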


This is because the Attribute Trigger was used in scheduling the expression as shown in the Figure below.  So when the expression is triggered, there is a new value in the PI Archive and so the function returns True even though the actual value has not changed.


If we modify the expression to include another Attribute, in this case Pressure, which is referenced to a PI Tag, then we can trigger it only on the Pressure Attribute.  Previewing results, we see that we get the behavior we expect, as you can see in the Figures below.


We can also revert to our original expression and schedule the expression evaluation on a Periodic basis.  In the Figure below, the periodic time evaluation is 10 minutes.


When NOT to use HasChanged()

The approaches above work fine except in the case where we want to use the change in the value of the attribute Trigger in the scheduling of the expression.  In cases like this we cannot use the HasChanged() function.


Note: Because different instruments will have different periods of time for the stale value test, it is a good idea to pass the time string above, ‘*-10m’, as a parameter from
an Attribute of the Element. (See the AF Analytics - Pass parameters using Attributes blog.)


AF Analytics - Preview Results

Posted by asoudek Aug 31, 2016

Check your calculations to validate them before checking them in

I find this feature one of the most useful ones for debugging expressions.  Before you start your analyses and/or backfill results, it is always a good idea to validate that your expressions are doing what you expected.  To do this, use the Preview Results selection and export the results to Excel if need be.  The example below shows the details. (Note: To be able to Preview Results, the expression must not contain any errors, and if you are in the Template, you must select an Example Element.)


The first thing to do is to split up the expression into multiple expressions and Variables.  In this example, the expression is split into four Variables; PumpOn, PumpOff, Flowing, and Status (see Make AF Analytics more readable - Break up your calculations blog).  The first three are True/False checks and the last one is the determination of the status.  This is a simple example, but it illustrates the benefit. When we preview the results, we get the trigger timestamps and the values of the above four Variables, as well as the values for all the Attributes in this Analysis, at the trigger times.


Contrary to popular belief, you do not need to check in anything to preview the results.  (So I would advise you to refrain from checking in all the time; it is a waste of time unless you want to make the specific changes you just made available to other PI AF client tools.  Also, be aware that if the analyses created from this Template are running, every time you check in, these analyses will restart.)  To preview results, we right-click on the Analysis Rule, Pump Status in this example, and select Preview Results.


Next, we select a start time and end time of interest and click the Generate Results button.  The results will be generated as shown in the above Figure.  Notice that the FlowRate
Attribute values are also added to the result set.  Any Attributes that are used in the expressions will also be added to the result set.  The next thing to do is to export the data to a csv file by clicking the Export Results button.  We can then open it in Excel and use filters or sorting to check that the conditions for the expression are working as expected for a historical data set.