I understand CI as the practice of frequently merging work into a single repository, and the "testing" part as the tests you run against your codebase to check that it conforms to a predefined set of rules. That said, the term "continuous integration testing" sounds a little off to me when talking about AF databases.
So, for the sake of argument, let's consider CIT as the general idea of continually testing the AF database against a set of rules. If that's the case, we first have to define what CIT means for an AF database. I can think of three different scenarios where users might consider it:
- Continually monitor whether the AF model conforms to their reality/needs;
- Continually monitor whether the AF model has valid data references;
- Continually monitor whether the data flow is valid.
For the last entry, I think CIT is worthless. That's what the backend of the PI System is all about, and we already provide several tools to monitor data health. We could even say that PI itself is the CI platform in this situation, and I think it would be a waste of time to create custom tools for this specific need.
The second one is tricky, as it's pretty hard to define what "valid" means, and that opens the door to a myriad of scenarios: is the data within a certain range? Is it pointing to the correct data source? Is it arriving as frequently as necessary? Each of these requires a specific strategy, and most of them can be monitored with simple AF Analyses and Notifications.
The first one, in my opinion, is the only valid scenario for CIT: we frequently see customers modeling their AF tree after an existing structure, and checking its integrity/conformity is a common task. So, let me answer your questions with this scenario in mind:
1) Do you use, or have you ever used Continuous Integration testing for your PI AF data models?
I have seen customers do that. They model their tree after an existing SAP or Maximo structure, and they develop tools to make sure that both trees stay the same.
2) What kind of automated tools/frameworks/scripts have you used for CI testing of your AF models?
Most of them create a service that pulls data from AF using the AF SDK and compares it against its counterpart in the source system. Once I even saw this strategy implemented entirely with SQL queries.
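The core of that pull-and-compare strategy can be sketched in a few lines. This is a minimal illustration, assuming both hierarchies have already been exported to nested dicts; the actual AF SDK calls and the SAP/Maximo extraction are out of scope here:

```python
def flatten(tree, path=""):
    """Flatten a nested {name: {children...}} hierarchy into a set of element paths."""
    paths = set()
    for name, children in tree.items():
        p = f"{path}/{name}"
        paths.add(p)
        paths |= flatten(children, p)
    return paths

def diff_hierarchies(af_tree, source_tree):
    """Return (elements missing from AF, elements present only in AF)."""
    af, src = flatten(af_tree), flatten(source_tree)
    return sorted(src - af), sorted(af - src)

# Example: the source system has a pump that the AF model lacks.
source = {"Site": {"Unit1": {"PumpA": {}, "PumpB": {}}}}
af     = {"Site": {"Unit1": {"PumpA": {}}}}
missing, extra = diff_hierarchies(af, source)
print(missing)  # ['/Site/Unit1/PumpB']
print(extra)    # []
```

A service like the ones described would run this comparison on a schedule and raise an alert (or a Notification) whenever either list is non-empty.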
3) Did you find any value in this kind of testing for your AF data models?
Unless your business model depends on the AF structure you are using, I don't see much value in testing your data model. Most of the time it will be a lot of effort for little return, as a wrong model will be noticed by a user who will then ask for a fix.
Well, here's my two cents.
Thanks for your reply. In many respects, your comments align fairly well with my own thoughts on this topic. I agree that, of the three scenarios you identified, only the first seems to make sense for this kind of testing. I would say, though, that creating a service to synchronise asset structure between a source system such as SAP or Maximo and AF doesn't fit my idea of CIT. I have implemented such a service myself for a customer, and it was more of a synchronisation service than a CIT framework, with the external system considered the system of record for assets.
On the second scenario you identified - continually monitoring whether the AF model has valid data references - I have another customer that has implemented a simple AF SDK application to do this, but it is considered more of a data quality exercise: if PI Point DRs can't resolve to PI tags, they are flagged for exclusion or for having the DR set to None.
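The shape of that data quality check is simple enough to sketch. The attribute mappings and tag list below are placeholders; a real implementation would enumerate attributes via the AF SDK and query the Data Archive for tag existence:

```python
def find_broken_references(attributes, existing_tags):
    """Flag PI Point data references whose target tag does not exist.

    attributes: iterable of (element_path, attribute_name, tag_name) tuples
    existing_tags: set of tag names known to exist in the Data Archive
    """
    return [(elem, attr, tag)
            for elem, attr, tag in attributes
            if tag not in existing_tags]

# Placeholder data standing in for an AF SDK attribute enumeration.
attrs = [
    ("/Site/Unit1/PumpA", "Flow", "U1.PA.FLOW"),
    ("/Site/Unit1/PumpA", "Speed", "U1.PA.SPEED"),
]
tags = {"U1.PA.FLOW"}
for elem, attr, tag in find_broken_references(attrs, tags):
    # Flag for exclusion, or for having the DR set to None.
    print(f"{elem}|{attr}: tag '{tag}' not found")
```

Whether the flagged attributes are then excluded or just reported is a policy decision, exactly as your customer treats it.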
My own feeling is that CIT doesn't really fit with asset modelling in PI AF, which is why I had never really thought about it before my customer asked me last week.
I appreciate your considered response to this. I am still hoping to get some feedback from other users and/or SI's as well - let's see what happens.
This is a bit like the old dev issue where SQL Server was never considered a first-class citizen, yet if someone changes the SQL Server schema it can screw up your code base.
In terms of AF, often it is the model definition that is put under source control and then deployed to AF during CI. If you change an AF template or the relationships, that has the potential to cause checked-in code to fail, so those changes to the model should be under source control. You can use orchestration tools (Chef, Puppet, ...) to automate deployment of the AF model definition to an AF instance during a code check-in (e.g. in VS or VS Code), and have your code tests run against a specific version of your AF hierarchy. I've never gone near AF's internal versioning of elements etc., as I don't fully see the point in taking on that overhead.
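One cheap way to wire a source-controlled model definition into that check-in pipeline is a CI test that validates the definition before deployment, e.g. asserting that templates the code base depends on are still present. The JSON layout and the "af_model.json" filename here are assumptions for illustration, not an AF export format:

```python
import json

# Templates the checked-in code base is known to depend on (hypothetical).
REQUIRED_TEMPLATES = {"Pump", "Compressor"}

def validate_model(definition):
    """Fail fast if the model definition drops a template the code relies on."""
    templates = {t["name"] for t in definition.get("templates", [])}
    missing = REQUIRED_TEMPLATES - templates
    if missing:
        raise ValueError(f"Model definition missing templates: {sorted(missing)}")
    return True

# In CI this would load the file under source control, e.g.:
#   validate_model(json.load(open("af_model.json")))
definition = {"templates": [{"name": "Pump"}, {"name": "Compressor"}]}
print(validate_model(definition))  # True
```

Run as part of the check-in build, this catches a breaking template change before the orchestration step pushes it to a shared AF instance.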
What normally happens is that everyone shares a central AF database - in segregated environments (dev, test, prod) if you're lucky. Each developer works against the shared instance, where anybody can come along and alter the structure/metadata. Then someone pushes some of those changes to the test environment, where the code base is tested together with other changes, and hopefully someone eventually rolls the changes into prod. Nobody ever really looks in detail at AF, or to an extent the PI Data Archive, in that development process.
I do know of a couple of clients who take care to version their AF hierarchy from a master data management tool set. They have different issues, but you can be sure that the AF database that gets deployed is the correct one.
Thanks for sharing – I forwarded this to the TDM team.
Thanks for your comments and insight Rhys. A large part of this customer's issue is that they don't have properly segregated environments to start with, and the dependencies on what they have built and deployed in AF aren't necessarily well documented or understood. Until those situations change, CI testing probably won't add a lot of value. What I also take from your comments is that CI makes sense for AF when there is custom code that uses the AF data model, since changes to the model definition have the potential to break that code. Where there is no such dependency, perhaps sound governance for version management is all that is required.
I think more discussion with this customer is needed to determine what their actual business requirements are in this area.