
Gabriel's Blog


Even though the first release of the OPC UA communication protocol standard came more than a decade ago (2008), client and server applications leveraging OPC UA have only started becoming the new standard in recent years.

OPC UA is a welcome change that addresses the main pain points of OPC Classic. Here are some of the main changes that directly impact data collection into OSIsoft's PI System:

  • Platform Independence
    • OPC UA Servers are no longer tied to Windows (DCOM) and can run on devices with a much smaller footprint across a variety of platforms (PCs, cloud-based servers, PLCs, microcontrollers) and operating systems (Windows, iOS, Android, Linux, etc.)
  • Security
    • There is a reason why every firewall engineer has nightmares about OPC Classic and its underlying DCOM technology, and why many referred to OPC as "Oh Please Connect". Since remote connections were not always trivial, many decided to use tunnelers or to open up DCOM settings to an absurd extent, which can have serious consequences from a cybersecurity perspective
    • OPC UA adds session encryption, message signing, sequenced packets, authentication, user control, and auditing capabilities
    • No need to open up thousands of ports in the firewall to allow DCOM communication. OPC UA clients (such as the PI Connector for OPC UA) and servers initiate their session over one user-defined port.
  • Asset Modeling and Address Space
    • Similar to some of the functionality that the AF Server adds to the PI Server, an OPC UA Server can now contain a vast array of metadata, static attributes, and references between nodes.


Most of the differences above were compiled using information made available by the OPC Foundation.


If this is all new to you, I also recommend watching Webinar—Introduction to OPC UA and Migration from Classic OPC to OPC UA (1 hour) from Unified Automation, which describes the process from the OPC server side. Once your server uses OPC UA, you can leverage all the functionality above together with the PI Connector for OPC UA.


If you are currently using an OPC Classic technology (OPC DA, OPC HDA) for data collection into the PI System, now is the best time to see if your source system (SCADA, DCS, PLCs, etc.) supports OPC UA or can be upgraded to do so.

Some users will decide to use an OPC UA to OPC DA wrapper to stick with what they are comfortable with (OPC Classic), but I believe it is important to learn the new technology and leverage the additional functionality of OPC UA, rather than stick with a technology designed around the needs of 1996.


Many DCS and SCADA vendors now have embedded OPC UA Servers to expose the data from their system. This means that if one is to modernize their control system platform, there will be a need to migrate client applications like PI Interfaces from OPC DA to OPC UA.


In 2016, we released the first version of the PI Connector for OPC UA, allowing direct data collection from an OPC UA Server. However, migrations from OPC DA to OPC UA were definitely a headache: they required renaming all PI Points on the PI Data Archive to match the new OPC UA structure, which was very difficult and went against many customers' PI Point naming conventions.


In 2019, we released a new version of the PI Connector for OPC UA, allowing much more granular data selection, PI Point renaming, and a distributed architecture that lets the PI Connector sit two network zones away from the PI Server, eliminating the need for an additional PI Server in the DMZ with a unidirectional PI-to-PI interface sending the data to the business PI Server.



OPC UA Concepts

In order to transition from OPC DA to OPC UA, it is important to have a mapping file that relates OPC DA ItemIds to OPC UA NodeIds.


In the old days, classic DA Servers used simple string identifiers. The so-called "ItemID" was a fully qualified name that was unique throughout the whole server (there was only one namespace). Furthermore, classic DA Servers only had capabilities for a simple hierarchy, i.e. a tree-like structure with branches and leaves. Hence, many vendors used the full folder hierarchy to create unique ItemIDs (e.g. "Folder1.Folder2.Folder3.MyTemperature"). This led to massively redundant strings, wasting memory and slowing down performance when looking up or searching for individual Items. With OPC UA, this concept has been abandoned and nodes are uniquely identified by their NodeId.


For existing PI Points, the OPC DA ItemId is stored in the InstrumentTag attribute of the PI Point. For each of those ItemIds, one would need to obtain the corresponding NodeId on the OPC UA Server. In OPC UA, every entity in the address space is a node. To uniquely identify a node, each node has a NodeId, which is always composed of three elements:

  • NamespaceIndex
  • IdentifierType
  • Identifier
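For reference, most OPC UA clients (UaExpert included) display NodeIds in the standard string notation from the UA specification, which packs those three elements into one string. The namespace indices and identifiers below are made-up examples:

```
ns=2;s=Folder1.MyTemperature                 (string identifier in namespace 2)
ns=3;i=1234                                  (numeric identifier)
ns=2;g=09087e75-8e5e-499b-954f-f2a9603db28a  (GUID identifier)
ns=2;b=M/RbKBsRVkePCePcx24oRA==              (opaque/base64 identifier)
```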



To define a node, the PI Connector for OPC UA uses XML notation.




In the following migration section, I assume that the PI Connector for OPC UA, PI Connector Relay, and PI Data Collection Manager (DCM) are installed and that both the PI Connector Relay and the PI Connector for OPC UA are registered and routed in the DCM. See What is the recommended procedure for configuring a new PI Connector with the PI Connector Relay and DCM for more details.


Additionally, I assume that the PI Connector for OPC UA is connected to the OPC UA Server and that a secure session between the client and the server can be established - X.509 certificate from the server is trusted on the connector, and X.509 certificate from the connector is trusted on the server. When deciding what 'Security Policy' and 'Message Security Mode' to use for the endpoint, one should always use the most secure one (please, stop using None:None, unless you are troubleshooting an issue that requires inspecting the network traffic in an unencrypted fashion). A good practice is also to only expose the most secure endpoint on the OPC Server itself.


As discussed earlier, one would also need to obtain an OPC DA ItemId -> OPC UA NodeId mapping file.
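If your vendor does not provide such a mapping, a simple two-column file is enough to drive the rest of the migration. The ItemIds and NodeIds below are made-up examples, not values from any real server:

```
OPC DA ItemId,OPC UA NodeId
Folder1.Philadelphia.Temp,ns=2;s=Philadelphia.Temp
Folder1.Montreal.Temp,ns=2;s=Montreal.Temp
```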


Since the PI Connector for OPC UA does not ship with a generic browsing client, I strongly recommend installing a robust OPC UA client on the PI Connector for OPC UA machine. I personally recommend UaExpert, Unified Automation's reference client and by far the best one I have personally used.



1- Data Selection

Once the data source has been added, the first step is to perform data selection. In this example, we will connect to the OPC UA module of an Ignition SCADA Gateway.

On the DCM, select the OPC UA Connector, then navigate to the Data tab, select the data source, and perform the data source content discovery.


Once data source discovery has completed (which can take a long time if the source OPC server is large), we are ready for the data selection process.


To perform the data selection, one can either manually create rules or use the data selection UI to select which nodes should be brought into PI as PI Points / AF objects.


2- OPC UA to existing OPC DA PI Point mapping

Export the Tag Naming Worksheet


Open the exported spreadsheet alongside the existing PI Points and their InstrumentTag values (exported with PI Builder), and use VLOOKUP to match the new OPC UA points to their existing OPC DA PI Points.
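If you would rather script the lookup than do it in Excel, the same join can be sketched in Python. The column names, file contents, tag names, and NodeIds below are all made-up for illustration; they are not the connector's actual worksheet schema:

```python
# Sketch: the VLOOKUP step done in Python instead of Excel.
import csv
import io

# Pretend contents of the PI Builder export (existing OPC DA PI Points)
pi_builder_csv = """tag,instrumenttag
PHL.Temperature,Folder1.Philadelphia.Temp
MTL.Temperature,Folder1.Montreal.Temp
"""

# Pretend contents of the connector's Tag Naming Worksheet, already joined
# to the old DA ItemId via the mapping file; custom_name is still empty
worksheet_csv = """nodeid,da_itemid,custom_name
ns=2;s=Philadelphia.Temp,Folder1.Philadelphia.Temp,
ns=2;s=Montreal.Temp,Folder1.Montreal.Temp,
"""

# Build a lookup from OPC DA ItemId (InstrumentTag) -> existing PI Point name
da_to_point = {row["instrumenttag"]: row["tag"]
               for row in csv.DictReader(io.StringIO(pi_builder_csv))}

# Fill the custom_name column with the matching existing PI Point name
rows = list(csv.DictReader(io.StringIO(worksheet_csv)))
for row in rows:
    row["custom_name"] = da_to_point.get(row["da_itemid"], "")
    print(row["nodeid"], "->", row["custom_name"])
```

A row whose ItemId has no match keeps an empty custom name, which makes the leftovers easy to review before importing the worksheet back.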


Save the Tag Naming Worksheet and Import it back to the connector using the Import function and verify that the connector accepts the custom name.


3- Remove PI Point Prefix

In order to match the custom names to the existing OPC DA PI Points, it will likely be necessary not to use a PI Point prefix on the destination PI Data Archive(s).

4- Define the behavior for automatically detected model changes on the OPC UA Server

NOTE: Your OPC UA server must implement the GeneralModelChangeEventType so that the connector can react to these notifications!

See OPC UA - Model Changes and impact on the target PI Server for more details.


5- Stop the PI Interface for OPC DA and start the PI Connector for OPC UA

Note that data density may be slightly different since the PI Connector for OPC UA leverages subscriptions. For more details on how this works, one of the best resources is Unified Automation's OPC UA Subscription Concept.


You are now all set to use OPC UA. Once you have (hopefully) successfully migrated, you can go back and tighten up your DCOM settings if the old OPC DA nodes (servers and clients) will not be decommissioned.

In the past, in order to capture real-time weather data, I always had to use PowerShell or Python to pull the data from an API and reformat it, before feeding it to a configuration file (INI) that would never end. With the new functionality added to the PI Connector for UFL, I can now parse the JSON directly from the API call and create my dynamic AF structure.


For this example, I have used OpenWeatherMap to collect real-time data. The same could be adapted for different calls (e.g. forecast) or another API. Just note that at the time of writing, the UFL connector cannot create future data PI Points, so those PI Points would need to be created prior to doing the steps below. Once the points are created, the future data will flow normally.


Get an API key

Create an account on OpenWeatherMap and get your free API key. The free API key gives you access to Current Weather data and the 5-day/3-hour forecast, with up to 60 calls per minute.

All your calls will need to include &APPID=<YourAPIKey>


Define the call you want to make

For example, if I want to query the current weather in Philadelphia, San Leandro, Montreal, and Johnson City in metric units: https://api.openweathermap.org/data/2.5/group?id=6077243,4560349,5392263,4633419&units=metric&appid=<apikey>


Below is an example data file:

{"cnt":4,"list":[{"coord":{"lon":-73.59,"lat":45.51},"sys":{"type":1,"id":943,"message":0.005,"country":"CA","sunrise":1549972798,"sunset":1550009869},"weather":[{"id":804,"main":"Clouds","description":"overcast clouds","icon":"04d"}],"main":{"temp":-15.7,"pressure":1031,"humidity":65,"temp_min":-17,"temp_max":-15},"visibility":24140,"wind":{"speed":7.7,"deg":40},"clouds":{"all":90},"dt":1549989754,"id":6077243,"name":"Montreal"},{"coord":{"lon":-75.16,"lat":39.95},"sys":{"type":1,"id":5344,"message":0.0042,"country":"US","sunrise":1549972594,"sunset":1550010820},"weather":[{"id":501,"main":"Rain","description":"moderate rain","icon":"10d"},{"id":601,"main":"Snow","description":"snow","icon":"13d"},{"id":701,"main":"Mist","description":"mist","icon":"50d"}],"main":{"temp":-0.22,"pressure":1023,"humidity":88,"temp_min":-3,"temp_max":1.7},"visibility":11265,"wind":{"speed":6.2,"deg":70},"clouds":{"all":90},"dt":1549989754,"id":4560349,"name":"Philadelphia"},{"coord":{"lon":-122.16,"lat":37.72},"sys":{"type":1,"id":5154,"message":0.0041,"country":"US","sunrise":1549983659,"sunset":1550022312},"weather":[{"id":804,"main":"Clouds","description":"overcast clouds","icon":"04d"}],"main":{"temp":5.28,"pressure":1019,"humidity":76,"temp_min":2.7,"temp_max":8},"visibility":16093,"wind":{"speed":3.1,"deg":130},"clouds":{"all":90},"dt":1549989754,"id":5392263,"name":"San Leandro"},{"coord":{"lon":-82.35,"lat":36.31},"sys":{"type":1,"id":2666,"message":0.0046,"country":"US","sunrise":1549973989,"sunset":1550012873},"weather":[{"id":701,"main":"Mist","description":"mist","icon":"50d"}],"main":{"temp":12.72,"pressure":1010,"humidity":77,"temp_min":5,"temp_max":17.2},"visibility":16093,"wind":{"speed":2.1,"deg":90},"clouds":{"all":90},"dt":1549989754,"id":4633419,"name":"Johnson City"}]}

Parse the JSON file from the API

With version 1.2 of the PI Connector for UFL, there is native support for JSON-formatted files using the new FOREACH(), JsonGetValue(), and JsonGetItem() functions.


The JsonGetItem() function is used as part of the FOREACH() statement and allows capturing the objects within an array. For those unfamiliar with JSON formatting, objects are enclosed in curly brackets {} and contain unordered name/value pairs. Arrays, on the other hand, are enclosed in square brackets [] and are used to store ordered values.

As the name indicates, JsonGetValue() is used to obtain a value from a name/value pair within an object.


Since the data file contains an array with the 4 cities we are interested in, we will use the FOREACH() function with JsonGetItem() to loop through each object (all the information for each city).


The first step is to capture the message using JsonGetItem("Json_input","Selector"). In this example, we are grabbing the content of "list", so the selector is simply "list[]", and we read the entire message using __MESSAGE.


Once we are within an object of the array, we simply use a series of JsonGetValue() calls to drill down to the level where the value is stored.
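In plain Python terms, against a trimmed version of the sample payload above, the looping and drill-down logic amounts to the following. Only the structure mirrors the connector; the syntax is Python's, not UFL's:

```python
import json

# A trimmed, one-city version of the example data file above
message = json.loads("""
{"cnt": 1, "list": [
  {"sys": {"sunrise": 1549972798, "sunset": 1550009869},
   "main": {"temp": -15.7, "pressure": 1031, "humidity": 65},
   "wind": {"speed": 7.7, "deg": 40},
   "name": "Montreal"}
]}
""")

# FOREACH (JsonGetItem(__MESSAGE, "list[]")) DO  ->  iterate the "list" array
for item in message["list"]:
    city = item["name"]                # City = JsonGetValue(__ITEM, "name")
    main = item["main"]                # main = JsonGetValue(__ITEM, "main")
    temp = main["temp"]                # temp = JsonGetValue(main, "temp")
    windspeed = item["wind"]["speed"]  # drill down one more level for wind speed
    print(city, temp, windspeed)      # -> Montreal -15.7 7.7
```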


Take advantage of the Collection data type to store the values in PI and AF

In the [FIELD] section, we created TagNames, Values, and AttributeNames. These three collections are simply arrays of the tag names, values, and attribute names, which allows us to make a single StoreEvents() call for each object. The Add() function is used to add an item to a collection. One of the things I really like about collections is that they can store multiple data types at once; in this example, the Values collection stores both DateTime values and numbers. You simply have to define the data type in the [FIELD] section for the values to be added to the collection.
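For context, those collections and fields have to be declared up front. Below is a minimal sketch of what that [FIELD] section might look like; the exact declarations and field numbering in my final file may differ:

```ini
[FIELD]
FIELD(1).NAME = "TagNames"
FIELD(1).TYPE = "Collection"
FIELD(2).NAME = "Values"
FIELD(2).TYPE = "Collection"
FIELD(3).NAME = "AttributeNames"
FIELD(3).TYPE = "Collection"
FIELD(4).NAME = "City"
FIELD(4).TYPE = "String"
FIELD(5).NAME = "ElementName"
FIELD(5).TYPE = "String"
```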


Create the AF Structure:

In order to create the AF structure, it is important to create each level of the hierarchy. Since I wanted my attributes under "\Weather_Monitoring\<CityName>", I had to create the "\Weather_Monitoring" element first. Since the UFL Connector does not read the existing AF structure, this is necessary even if the parent element already exists.


Final INI file:



'Go through each object in the array
FOREACH (JsonGetItem(__MESSAGE, "list[]")) DO
'Initialize the collection variables
TagNames = Clear()
Values = Clear()
AttributeNames = Clear()
City = JsonGetValue(__ITEM, "name")

'Get the variables of interest under main{}
main = JsonGetValue(__ITEM, "main")
temp = JsonGetValue(main, "temp")
TagNames = Add(CONCAT(City, "_temp"))
Values = Add(temp)
AttributeNames = Add("Temperature")
pressure = JsonGetValue(main, "pressure")
TagNames = Add(CONCAT(City, "_pressure"))
Values = Add(pressure)
AttributeNames = Add("Pressure")
humidity = JsonGetValue(main, "humidity")
TagNames = Add(CONCAT(City, "_humidity"))
Values = Add(humidity)
AttributeNames = Add("Humidity")
'Get the variables of interest under wind{}
wind = JsonGetValue(__ITEM, "wind")
windspeed = JsonGetValue(wind, "speed")
TagNames = Add(CONCAT(City, "_windspeed"))
Values = Add(windspeed)
AttributeNames = Add("Wind Speed")
'Get the variables of interest under sys{}
sys = JsonGetValue(__ITEM, "sys")
sunset = JsonGetValue(sys, "sunset")
TagNames = Add(CONCAT(City, "_sunset"))
Values = Add(sunset)
AttributeNames = Add("Sunset Time")
sunrise = JsonGetValue(sys, "sunrise")
TagNames = Add(CONCAT(City, "_sunrise"))
Values = Add(sunrise)
AttributeNames = Add("Sunrise Time")

'Store the values in PI
StoreEvents(TagNames, AttributeNames, Values)
'Create the AF Element to store the attributes in
ElementName = CONCAT("Weather_Monitoring\", City)
StoreElement("Weather_Monitoring") 'Parent Element
'We use the TagNames collection as the dynamic attributes collection
StoreElement(ElementName, "WeatherTemplate", TagNames) 'Child Element
ENDFOR



Configuration in the connector admin page:

  • Configuration File: The .ini file built previously
  • Data Source Type: REST Client
  • Address: <YourAPIcall>
  • Scan Time: 600 (I am getting a new value every 10 minutes)
  • Word Wrap: -1; This is very important. In order to read JSON, we need to use "-1" so that the connector interprets the formatted file as one line



An element was created for each city, including our attributes of interest. Since they all share the same template, the UOMs can be defined and further calculations/displays can be built at the template level.

Since the API call defines the entire structure, we could add any other location and the structure would automatically update on the next scan.