We have about 900 table lookup attributes in our PI AF system fetching data from a remote SQL database, and via event-triggered analyses the data is stored in corresponding PI Points. The values in the SQL database are updated roughly once every two or three weeks (they are manually entered data in a rounding system). The reason for storing the data in PI Points is that we want it historised in the PI Data Archive, and we have manually backfilled 15 to 20 years of history.
This works pretty well, but intermittently some data is not fetched when it changes in the SQL database. I would like to understand better how this functionality works and whether there is some tuning we can do to make it 100% reliable.
Or perhaps there is a smarter way to do this? The RDBMS interface could be one solution, but on the other hand we are only talking about 900 readings here, each changing every fortnight or so.
Below is an example with nine table lookup attributes and the corresponding nine PI Point attributes.
The event-triggered analysis simply writes the value of the table lookup attribute to the PI Point attribute whenever the value changes.
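For comparison, the alternative of a small scheduled script that polls the SQL table and writes only changed values could be sketched as below. This is a hypothetical illustration, not our current setup: the table name `readings`, its columns, and the in-memory SQLite database are stand-ins for the real remote SQL source, and the actual write to the PI Point (e.g. via PI Web API or AF SDK) is left out.

```python
import sqlite3

def fetch_readings(conn):
    """Read the current value for each tag from the lookup table."""
    return dict(conn.execute("SELECT tag, value FROM readings"))

def detect_changes(previous, current):
    """Return {tag: value} for readings that are new or have changed."""
    return {tag: val for tag, val in current.items() if previous.get(tag) != val}

# Demo with an in-memory database standing in for the remote SQL server.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (tag TEXT PRIMARY KEY, value REAL)")
conn.executemany("INSERT INTO readings VALUES (?, ?)",
                 [("Reading1", 1.0), ("Reading2", 2.0)])

snapshot = fetch_readings(conn)          # values as of the last scan
conn.execute("UPDATE readings SET value = 2.5 WHERE tag = 'Reading2'")

changed = detect_changes(snapshot, fetch_readings(conn))
print(changed)  # only Reading2 changed; write these to their PI Points
```

Run on a schedule, a loop like this never misses a change between scans, whereas the event-triggered analysis depends on the lookup attribute being re-evaluated at the right moment.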