All People > gmichaud-verreault > Gabriel's Blog

Gabriel's Blog


In the last OPC UA blog post I wrote, Migrating from OPC DA to OPC UA, Roger Palmen mentioned the 'security and management puzzle' that many have used as a reason to delay the move from OPC DA to OPC UA. Hopefully, this helps clarify a few things. Also, from my experience, setting up an OPC UA connection is much simpler than configuring DCOM, and it provides additional flexibility and far better security features. It also makes it relatively easy to install the connector in a DMZ instead of needing it on the control network (where OPC DA interfaces generally reside), since OPC UA connections are MUCH more firewall friendly.

I will admit that it does involve a bit more management - certificates expire at some point, and some security policies and hash algorithms are eventually deprecated - but that is a small price for how much it brings.

Obviously, as it is security-related, the material here is pretty dense, so I will try to answer as many questions as possible.

 

OPC UA provides a number of different security features for different levels of security between the OPC UA Server and Client. In this post, the terms 'OPC UA Client' and 'PI Connector' may be used interchangeably.

Since there are many more security features (thankfully) than in OPC DA, it may seem more complicated at first, but it actually offers a high degree of security with limited complexity, especially when compared with DCOM and OPC DA.

 

In this post, I will focus on the following features and walk through the initial connector setup when most features are used:

  • Application Authentication based on Certificates
  • Secure communication channel with message signing and encryption based on Security Policies
  • User Authentication and Authorization
  • Access control (node and attribute level)

I am purposefully not going over auditing mechanisms, as they do not involve the PI Connector, but the OPC UA Server itself.

 

Endpoint

To connect to a server, a client needs information like network address, protocol, and security settings. For this purpose, OPC UA defines a set of discovery features. All information which is required to establish a connection between client and server is stored in a so-called endpoint. A server can provide several endpoints, each containing:

  • Endpoint URL (protocol and network address)
    • As compared to OPC DA (DCOM), all connections are established against the one OPC UA Server port. No need to leave the firewalls wide open or use a tunneler for DCOM anymore
  • Security Policy (defines the algorithms for signing and encryption, the algorithm for key derivation and the key lengths used in the algorithms)
  • Message Security Mode (security level for exchanged messages)
  • User Token Type (types of user authentication supported by the server)

 

Thus, when configuring the PI Connector, one only needs to know the endpoint URL of the server or the discovery server. The PI Connector will perform the discovery of the available endpoints:
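Endpoint URLs follow the familiar "opc.tcp://host:port" form, so the single port the firewall needs to allow can be read straight out of the URL. A minimal Python sketch, using only the standard library (the host name and port below are placeholders, not a real server):

```python
from urllib.parse import urlparse

# Hypothetical OPC UA endpoint URL; host and port are placeholders.
endpoint_url = "opc.tcp://opcua-server.example.com:4840"

parsed = urlparse(endpoint_url)
print(parsed.scheme)    # opc.tcp
print(parsed.hostname)  # opcua-server.example.com
print(parsed.port)      # 4840 -- the single port to allow through the firewall
```

Unlike DCOM, this one port is all the network path needs to permit.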

 

Network Level Security

As mentioned above, OPC UA is much more firewall friendly than OPC DA and its underlying DCOM technology. The client connection will always perform its initial 'Hello' from an ephemeral port to the target OPC UA Server port defined in the endpoint. The session will be established in the same TCP stream. The server will not establish a new independent connection like it would in OPC DA.

 

Transport Level Security

In the endpoint list presented, one can see all the SecurityModes available on the server and their associated SecurityPolicy.

 

SecurityMode

The security mode defines the type of security that applies to all messages. Other than for testing, troubleshooting, or development work, one should always use SignAndEncrypt.

  • None
    • For testing only
    • Supports connecting to servers without certificate (not very common)
    • Connector does not perform Server Certificate Validation
  • Sign
    • Ensures the message integrity and authenticity
  • SignAndEncrypt
    • Ensures the message integrity and authenticity
    • Prevents eavesdropping (messages are encrypted)
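To illustrate what 'Sign' buys: the sender computes a signature over each message and the receiver recomputes it, so any in-flight tampering is detected. The sketch below uses a symmetric HMAC from the Python standard library purely as an illustration of the integrity idea; actual OPC UA signing uses keys derived during the secure-channel handshake, backed by the applications' X.509 certificates, not a hard-coded shared secret.

```python
import hashlib
import hmac

# Simplified illustration of the 'Sign' idea using a shared key (HMAC).
# Real OPC UA signing is asymmetric (X.509); this only shows the concept.
key = b"session-signing-key"  # stands in for the derived signing key
message = b"WriteRequest: node=ns=2;s=Valve01, value=closed"

signature = hmac.new(key, message, hashlib.sha256).digest()

# The receiver recomputes the signature over what it received.
assert hmac.compare_digest(signature, hmac.new(key, message, hashlib.sha256).digest())

# Any tampering with the message changes the signature, so it is rejected.
tampered = b"WriteRequest: node=ns=2;s=Valve01, value=open"
assert not hmac.compare_digest(signature, hmac.new(key, tampered, hashlib.sha256).digest())
```

SignAndEncrypt adds a second step on top of this: the signed message is also encrypted, so an eavesdropper cannot read it at all.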

SecurityPolicy

A well-managed OPC UA Server should only return secure Security Policies. At minimum, one should use ‘Basic256Sha256’.

Weaker security policies using outdated algorithms such as SHA-1 should not be used.

 

Application Level Security

When the endpoint has been selected, before any session can be established, both applications (server and client) need to trust each other.

OPC UA requires bidirectional authentication of the client and server applications with X.509 application instance certificates during the establishment of a secure communication connection. Application instance certificates are required to uniquely identify each installation or instance of an application. All activities in the application layer are based on a secure channel that is created in the communication layer. Applications rely upon it for secure communication in addition to application authentication. The secure channel is responsible for message integrity, confidentiality, and application authentication. The application layer manages user authentication and user authorization: clients may pass a user identity token to the OPC UA Server, and the server verifies that this user is allowed access and what resources it is authorized to use.

SecurityLayers.jpg

Source: OPC Foundation

 

To identify itself to communication partners, each installed OPC UA application or device needs an Application Instance Certificate and an associated public/private key pair. The public key is distributed with the certificate. The private key has to remain secret and is used to sign and/or decrypt messages. A communication partner can use the public key to verify the trust relation, check the signature of messages, and encrypt messages. The Application Instance Certificate (self-signed) for the OPC UA Connector is located in %PIHOME64%\Connectors\OPCUA\pkiclient\own\certs.

 

The OPC UA Connector uses a file-based certificate store with the following format. OPC Servers may follow a similar structure, or may handle certificates within the application itself.

 

Own

Application Instance Certificate and private key of the application. For the PI Connector, this contains the certificate that will need to be trusted by the OPC Server

Trusted

Self-signed certificates of trusted OPC UA applications or CA certificates for trusted CAs. Each CA certificate comes with a CRL that requires frequent updates.

Rejected

Certificates from OPC UA applications that tried to connect but were not trusted. Administrators can move certificates from Rejected to Trusted if the application is allowed to connect.

Issuers

CA certificates that are not directly trusted but required to verify a chain of CA certificates. Each CA certificate comes with a CRL that requires frequent updates.

 

To create a secure channel the server needs to trust the client (connector), and the connector needs to trust the server. Only after they both trust each other can a secure channel request be established.

secure_connection.png

Source: https://documentation.unified-automation.com/uasdkdotnet/2.5.2/html/L2UaDiscoveryConnect.html

 

Trusting the client certificate on the server

If the OPC Server rejects the Connector certificate, the following error will be recorded in the message logs:

Please verify that the server trusts the certificate which is provided by the connector. Error: BadSecureChannelClosed Message: Socket was closed gracefully

 

Depending on the certificate store configuration on the OPC Server side, the connector (client) certificate can either be trusted by adding it to the trusted list folder or using the OPC Server application.

Example: Trusting the client (connector) certificate using the OPC Server application (vendor specific)

[Screenshot: trust/reject of the client certificate in the OPC UA Server application. The server's Certificates view lists the connector's certificate (PI Connector for OPC UA, SHA256withRSA signature, 2048-bit key) with its validity period, thumbprint, and Trust/Reject actions.]

Trusting the server certificate on the client (connector)

If the PI Connector for OPC UA rejects the OPC UA Server certificate, the following error will be recorded in the message logs:

Connection failed. Please verify that the connector trusts the certificate provided by the server. Error: BadSecurityChecksFailed Message: Error received from remote host: Bad_SecurityChecksFailed (code=0x80130000, description="An error occurred verifying security.")

After rejecting the certificate, the connector places a copy of the rejected certificate in the rejected folder of its file-based certificate store. To trust the server certificate, simply move the certificate to the trusted store.

NOTE: Make sure to inspect the certificate to make sure that it belongs to the OPC Server you want to grant access to.

[Screenshot: the rejected server certificate (a .der file) in ...\PIPC\Connectors\OpcUa\pkiclient\rejected is moved to the ...\pkiclient\trusted folder to allow a secure channel to be established.]
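In practice, trusting the server certificate is just a file move within the connector's file-based store. A sketch of the operation (the folder layout mirrors the table above but is recreated in a temporary sandbox here, and the file name and contents are placeholders):

```python
import shutil
import tempfile
from pathlib import Path

# Sandbox standing in for the connector's file-based store; the real root
# is %PIHOME64%\Connectors\OPCUA\pkiclient (layout assumed from the table above).
pki = Path(tempfile.mkdtemp())
(pki / "rejected").mkdir()
(pki / "trusted" / "certs").mkdir(parents=True)

# Pretend the connector just rejected the server certificate.
rejected = pki / "rejected" / "SimulationServer.der"
rejected.write_bytes(b"placeholder DER bytes")

# After inspecting the certificate and confirming it belongs to the
# OPC Server you want to grant access to, move it to the trusted store.
trusted = pki / "trusted" / "certs" / rejected.name
shutil.move(str(rejected), str(trusted))
```

On the next connection attempt, the connector will find the certificate in its trusted store and accept the server.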

User Level Security

Depending on the OPC UA Server implementation, role-based or user-based access control can be implemented. This means that when connecting with the PI Connector, a username and password will need to be provided. Note that this still occurs after everything else mentioned above, so it is not something that is done instead of Application or Transport level security, but in addition to them.

Similar to PI Data Archive security, identities on the OPC UA Server define what 'user' can do what at a node and attribute level. Since the PI Connector for OPC UA is read-only, it should only be granted a read-only (CurrentRead, HistoryRead) role for the nodes that it will be monitoring.

Keep in mind that the username is not a Windows nor a local account. User management is handled in the OPC UA Server application.

If the username and password fields are left blank, the connector will only be able to connect as an Anonymous user.

 

Recap

To recap, once one selects the most secure endpoint, it is expected that the connection will fail twice:

  1. Attempt a connection (wait 1 min, save the data source configuration or force a discovery in PI Connector) -> Fails
  2. The PI Connector will move the OPC UA certificate to its rejected folder. Inspect the certificate information and move it to the trusted folder if it is the OPC UA certificate.
  3. Attempt a connection (wait 1 min, save the data source configuration or force a discovery in PI Connector) -> Fails
  4. The OPC UA Server should reject the client certificate. Trust the connector certificate on the UA Server (the process here will vary widely depending on the vendor)
  5. Attempt a connection (wait 1 min, save the data source configuration or force a discovery in PI Connector) -> Success, provided the user is successfully authenticated and granted a role on the UA Server
  6. Good to go with the next steps (discovery, data selection, stream naming, etc.)

 

 

Source: https://readthedocs.web.cern.ch/download/attachments/21178021/OPC-UA-Secure-Channel.JPG 

 

Further Reading/Watching

Even though the first release of the OPC UA communication protocol standard was more than a decade ago (2008), client and server applications leveraging OPC UA have only started becoming the new standard in recent years.

OPC UA is a welcome change that addresses the main pain points of OPC Classic. Some of the main changes that directly impact the data collection into OSIsoft's PI System:

  • Platform Independence
    • OPC UA Servers are not tied to Windows (DCOM) anymore and can run on devices with a much smaller footprint, across a variety of platforms (PCs, cloud-based servers, PLCs, microcontrollers) and operating systems (Windows, iOS, Android, Linux, etc.)
  • Security
    • There is a reason why every firewall engineer has nightmares about OPC Classic and its underlying DCOM technology, and why many referred to OPC as "Oh Please Connect". Since remote connections were not always trivial, many decided to use tunnelers or to open up DCOM settings to an absurd extent, which can have serious consequences from a cybersecurity perspective
    • Session encryption, message signing, sequenced packets, authentication, user control, and auditing capabilities
    • No need to open up thousands of ports in the firewall to allow DCOM communication. OPC UA Client (PI Connector for OPC UA) and servers initiate their session over one user-defined port.
  • Asset Modeling and Address Space
    • Similar to some of the functionality that the AF Server adds to the PI Server, OPC UA Servers can now contain a vast array of metadata, static attributes, and references between nodes.

 

Most of the differences were compiled using information available from the OPC Foundation:

https://opcfoundation.org/about/opc-technologies/opc-ua/

 

If this is all new to you, I also recommend watching Webinar—Introduction to OPC UA and Migration from Classic OPC to OPC UA (1 hour) from Unified Automation that describes the process from the OPC server side. Once your server uses OPC UA, you can leverage all the functionalities above and the PI Connector for OPC UA.

 

If one is currently using an OPC Classic technology (OPC DA, OPC HDA) for data collection into the PI System, now is the best time to see if your source system (SCADA, DCS, PLCs, etc.) supports OPC UA or can be upgraded to do so.

Some users will decide to use an OPC UA to OPC DA wrapper to stick with what they are comfortable with (OPC Classic), but I believe it is important to learn the new technology and leverage the additional functionalities of OPC UA rather than stick with a technology designed around what the needs were in 1996.

 

Many DCS and SCADA vendors now have embedded OPC UA Servers to expose the data from their system. This means that if one is to modernize their control system platform, there will be a need to migrate client applications like PI Interfaces from OPC DA to OPC UA.

 

In 2016, we released the first version of the PI Connector for OPC UA, allowing direct data collection from an OPC UA Server. However, migrations from OPC DA to OPC UA were definitely a headache: they required renaming all PI Points on the PI Data Archive to match the new OPC UA structure, which was very difficult and went against many customers' PI Point naming conventions.

 

In 2019, we released version 2.0.1.33 of the PI Connector for OPC UA, allowing much more granular data selection, PI Point renaming, and a distributed architecture allowing the PI Connector to be 2 network zones away from the PI Server, eliminating the need for an additional PI Server in the DMZ with a unidirectional PI-to-PI interface sending the data to the business PI Server.

 

 

OPC UA Concepts

In order to transition from OPC DA to OPC UA, it is important to have a mapping file that relates OPC DA ItemIds to OPC UA NodeIds.

 

In the old days, classic DA Servers used simple string identifiers. The so-called ItemID was a fully qualified name that was unique throughout the whole server (there was only one namespace). Furthermore, classic DA Servers only had capabilities for a simple hierarchy, i.e. a tree-like structure with branches and leaves. Hence, many vendors used the full folder hierarchy to create unique ItemIDs (e.g. "Folder1.Folder2.Folder3.MyTemperature"). This led to massively redundant strings, wasting memory and slowing down lookups and searches for individual items. With OPC UA, this concept has been abandoned and nodes are uniquely identified by their NodeId.

 

For existing PI Points, the OPC DA ItemId is stored in the InstrumentTag attribute of the PI Point. For each of those ItemIds, one would need to obtain the corresponding NodeId on the OPC UA Server. In OPC UA, every entity in the address space is a node. To uniquely identify a node, each node has a NodeId, which is always composed of three elements:

  • NamespaceIndex
  • IdentifierType
  • Identifier

nodeid_concept_1.png

 

To define a node, the OPC UA Connector uses the XML notation.

ns=<namespaceIndex>;<identifiertype>=<identifier>
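As an illustration (not connector code), the notation can be split back into its three elements. The identifier types 'i', 's', 'g', and 'b' denote numeric, string, GUID, and opaque identifiers respectively, and NodeIds in namespace 0 may omit the 'ns=' part entirely:

```python
def parse_node_id(node_id: str):
    """Split 'ns=<namespaceIndex>;<identifiertype>=<identifier>' into parts.

    Identifier types: i (numeric), s (string), g (GUID), b (opaque).
    Nodes in namespace 0 may omit the 'ns=' prefix.
    """
    namespace_index = 0
    if node_id.startswith("ns="):
        ns_part, node_id = node_id.split(";", 1)
        namespace_index = int(ns_part[3:])
    identifier_type, identifier = node_id.split("=", 1)
    return namespace_index, identifier_type, identifier

print(parse_node_id("ns=2;s=Folder1.MyTemperature"))  # (2, 's', 'Folder1.MyTemperature')
print(parse_node_id("i=2253"))                        # (0, 'i', '2253')
```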

 

Prerequisites

In the following migration section, I assume that the PI Connector for OPC UA, PI Connector Relay, and PI Data Collection Manager (DCM) are installed and that both the PI Connector Relay and the PI Connector for OPC UA are registered and routed in the DCM. See What is the recommended procedure for configuring a new PI Connector with the PI Connector Relay and DCM for more details.

 

Additionally, I assume that the PI Connector for OPC UA is connected to the OPC UA Server and that a secure session between the client and the server can be established - X.509 certificate from the server is trusted on the connector, and X.509 certificate from the connector is trusted on the server. When deciding what 'Security Policy' and 'Message Security Mode' to use for the endpoint, one should always use the most secure one (please, stop using None:None, unless you are troubleshooting an issue that requires inspecting the network traffic in an unencrypted fashion). A good practice is also to only expose the most secure endpoint on the OPC Server itself.

 

As discussed earlier, one would also need to obtain an OPC DA ItemId -> OPC UA NodeId mapping file.
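There is no standard format for such a mapping file; as a sketch, assuming a simple two-column CSV (ItemId, NodeId), the lookup could be scripted like this in Python:

```python
import csv
import io

# Hypothetical mapping file format (use whatever format you and your
# OPC UA server vendor agree on); here, a simple two-column CSV.
mapping_csv = """ItemId,NodeId
Folder1.Folder2.MyTemperature,ns=2;s=MyTemperature
Folder1.Folder2.MyPressure,ns=2;s=MyPressure
"""

itemid_to_nodeid = {row["ItemId"]: row["NodeId"]
                    for row in csv.DictReader(io.StringIO(mapping_csv))}

# Look up the NodeId for an existing PI Point's InstrumentTag (OPC DA ItemId).
print(itemid_to_nodeid["Folder1.Folder2.MyTemperature"])  # ns=2;s=MyTemperature
```

The same dictionary can then drive the VLOOKUP-style matching described in the migration steps below.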

 

Since the PI Connector for OPC UA does not ship with a generic OPC UA browsing client, I strongly recommend installing a robust OPC UA client on the PI Connector for OPC UA machine. I personally recommend UaExpert, which is the UA reference client and by far the best one I have personally used.

 

Migration

1- Data Selection

Once the data source has been added, the first step is to perform data selection. In this example, we will connect to the OPC UA module of an Ignition SCADA Gateway.

On the DCM, select the OPC UA Connector, then navigate to the Data tab, select the data source, and perform the data source content discovery.

 

Once data source discovery has completed (this can take a long time if the source OPC Server is large), we are ready for the data selection process.

 

To perform the data selection, one can either manually create rules or use the data selection UI to select which nodes should be brought into PI as PI Point / AF objects.

 

2- OPC UA to existing OPC DA PI Point mapping

Export the Tag Naming Worksheet

 

Open the exported spreadsheet alongside the existing PI Points with their InstrumentTag (exported with PI Builder), and use VLOOKUP to match the new OPC UA points to their existing OPC DA PI Points.

 

Save the Tag Naming Worksheet and Import it back to the connector using the Import function and verify that the connector accepts the custom name.

 

3- Remove PI Point Prefix

In order to match the custom names to the existing OPC DA PI Points, it will likely be necessary not to use a PI Point Prefix on the destination PI Data Archive(s).

4- Define the behavior for automatically detected model changes on the OPC UA Server

NOTE: Your OPC UA server must implement the GeneralModelChangeEventType so that the connector can react to these notifications!

See OPC UA - Model Changes and impact on the target PI Server for more details.

 

5- Stop the PI Interface for OPC DA and start the PI Connector for OPC UA

Note that data density may be slightly different since the PI Connector for OPC UA leverages subscriptions. For more details on how it works, one of the best resources is Unified Automation's OPC UA Subscription Concept.

 

You are now all set to use OPC UA and have (hopefully) successfully migrated. If the old OPC DA nodes (servers and clients) will not be decommissioned, you can go back and tighten up their DCOM settings.

In the past, in order to capture real-time weather data, I always had to use PowerShell or Python to pull the data from an API and reformat it, before feeding it to a configuration file (INI) that would never end. With the new functionalities added to the PI Connector for UFL, I can now parse the JSON file directly from the API call and create my dynamic AF structure.

 

For this example, I have used OpenWeatherMap to collect real-time data. The same could be adapted for different calls (e.g. forecast) or another API. Just note that at the time of writing, the UFL connector cannot create future-data PI Points, so the PI Points would need to be created prior to following the steps below. Once the points are created, the future data will flow normally.

 

Get an API key

Create an account on https://openweathermap.org/api and get your free API key. The free API key allows you to access Current Weather data and 5d/3h forecast and you get up to 60 calls per minute.

All your calls will need to include &APPID=<YourAPIKey>

 

Define the call you want to make

For example, if I want to query the current weather in Philadelphia, San Leandro, Montreal, and Johnson City in metric units:

http://api.openweathermap.org/data/2.5/group?id=6077243,4560349,5392263,4633419&units=metric&appid=<apikey>
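If you script the call, assembling the query string programmatically avoids typos in the comma-separated city ID list. A small sketch with Python's standard library (the API key value is a placeholder):

```python
from urllib.parse import urlencode

# City IDs: Montreal, Philadelphia, San Leandro, Johnson City
city_ids = [6077243, 4560349, 5392263, 4633419]
params = {
    "id": ",".join(str(c) for c in city_ids),
    "units": "metric",
    "appid": "YourAPIKey",  # placeholder -- substitute the key from your account
}
# safe="," keeps the comma-separated ID list readable in the final URL
url = "http://api.openweathermap.org/data/2.5/group?" + urlencode(params, safe=",")
print(url)
```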

 

Below is an example data file:

{"cnt":4,"list":[{"coord":{"lon":-73.59,"lat":45.51},"sys":{"type":1,"id":943,"message":0.005,"country":"CA","sunrise":1549972798,"sunset":1550009869},"weather":[{"id":804,"main":"Clouds","description":"overcast clouds","icon":"04d"}],"main":{"temp":-15.7,"pressure":1031,"humidity":65,"temp_min":-17,"temp_max":-15},"visibility":24140,"wind":{"speed":7.7,"deg":40},"clouds":{"all":90},"dt":1549989754,"id":6077243,"name":"Montreal"},{"coord":{"lon":-75.16,"lat":39.95},"sys":{"type":1,"id":5344,"message":0.0042,"country":"US","sunrise":1549972594,"sunset":1550010820},"weather":[{"id":501,"main":"Rain","description":"moderate rain","icon":"10d"},{"id":601,"main":"Snow","description":"snow","icon":"13d"},{"id":701,"main":"Mist","description":"mist","icon":"50d"}],"main":{"temp":-0.22,"pressure":1023,"humidity":88,"temp_min":-3,"temp_max":1.7},"visibility":11265,"wind":{"speed":6.2,"deg":70},"clouds":{"all":90},"dt":1549989754,"id":4560349,"name":"Philadelphia"},{"coord":{"lon":-122.16,"lat":37.72},"sys":{"type":1,"id":5154,"message":0.0041,"country":"US","sunrise":1549983659,"sunset":1550022312},"weather":[{"id":804,"main":"Clouds","description":"overcast clouds","icon":"04d"}],"main":{"temp":5.28,"pressure":1019,"humidity":76,"temp_min":2.7,"temp_max":8},"visibility":16093,"wind":{"speed":3.1,"deg":130},"clouds":{"all":90},"dt":1549989754,"id":5392263,"name":"San Leandro"},{"coord":{"lon":-82.35,"lat":36.31},"sys":{"type":1,"id":2666,"message":0.0046,"country":"US","sunrise":1549973989,"sunset":1550012873},"weather":[{"id":701,"main":"Mist","description":"mist","icon":"50d"}],"main":{"temp":12.72,"pressure":1010,"humidity":77,"temp_min":5,"temp_max":17.2},"visibility":16093,"wind":{"speed":2.1,"deg":90},"clouds":{"all":90},"dt":1549989754,"id":4633419,"name":"Johnson City"}]}

Parse the JSON file from the API

With version 1.2 of the PI Connector for UFL, there is native support for JSON-formatted files using the new FOREACH(), JsonGetValue(), and JsonGetItem() functions.

 

The JsonGetItem() function is used as part of the FOREACH statement and allows capturing the objects within an array. For those unfamiliar with JSON formatting, objects are enclosed in curly brackets {} and contain unordered name/value pairs. On the other hand, arrays are enclosed in square brackets [] and are used to store ordered values.

As the name indicates, JSONGetValue() is used to obtain a value from a name/value pair within an object.

 

Since the data file contains an array for the 4 cities we are interested in, we will use the FOREACH() function with JsonGetItem to loop through each object (all the information for each city).

 

The first step is to capture the message using JsonGetItem("Json_input","Selector"). In this example, we are grabbing the content of "list", so the Selector is simply "list[]" and we will read the entire message using __MESSAGE.

 

Once we are within an object of the array, we simply have to use a series of JsonGetValue() calls to drill down to the level where the value is stored.
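For readers more familiar with general-purpose languages, the same navigation in plain Python looks like this (using a trimmed, single-city version of the data file above): FOREACH(JsonGetItem(__MESSAGE, "list[]")) corresponds to looping over the parsed "list" array, and each JsonGetValue() call corresponds to one dictionary lookup.

```python
import json

# A trimmed version of the data file shown above (one city only).
payload = json.loads("""{
  "cnt": 1,
  "list": [
    {"main": {"temp": -15.7, "pressure": 1031, "humidity": 65},
     "wind": {"speed": 7.7, "deg": 40},
     "sys":  {"sunrise": 1549972798, "sunset": 1550009869},
     "name": "Montreal"}
  ]
}""")

# FOREACH over "list[]" == looping over payload["list"];
# JsonGetValue(__ITEM, "main") == item["main"], and so on down the nesting.
for item in payload["list"]:
    main = item["main"]
    print(item["name"], main["temp"], main["pressure"], main["humidity"])
```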

 

Take advantage of the Collection data type to store the values in PI and AF

In the [FIELD] section, we created TagNames, Values, and AttributeNames. Those three collections will simply be an array of the tagnames, values, and attribute names. This will allow us to simply make one StoreEvents() call for each object. The ADD() function is used to add an item to the collection. One of the things I really like about collections is that they can store multiple data types at once. In this example, the Values collection stores DateTime and Numbers. You simply have to define the data type in [FIELD] section for the values to be added to the collection.
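Conceptually, the three collections behave like parallel lists: index i of each holds the tag name, attribute name, and value of the same measurement, and one StoreEvents() call flushes them together. A Python sketch of the pattern (store_events below is a stand-in stub, not the connector's function):

```python
# Stub standing in for the UFL connector's StoreEvents(); it just shows
# how the three parallel collections line up index by index.
def store_events(tag_names, attribute_names, values):
    for tag, attr, val in zip(tag_names, attribute_names, values):
        print(f"{tag} ({attr}) <- {val}")

city = "Montreal"
tag_names, attribute_names, values = [], [], []   # TagNames = Clear(), ...

tag_names.append(f"{city}_temp")                  # TagNames = Add(CONCAT(...))
attribute_names.append("Temperature")
values.append(-15.7)                              # a Number...

tag_names.append(f"{city}_sunrise")
attribute_names.append("Sunrise Time")
values.append("12-Feb-2019 07:00:00")             # ...and a DateTime: mixed types

store_events(tag_names, attribute_names, values)  # one call per city object
```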

 

Create the AF Structure:

In order to create the AF Structure, it is important to create each level of the hierarchy. Since I wanted my attributes to be under "\Weather_Monitoring\<CityName>", I had to create the "\Weather_Monitoring" element first. Since the UFL Connector does not read the AF structure, it is necessary to do so even if the parent element already exists.

 

Final INI file:

[FIELD]
FIELD(1).NAME="TagNames"
TagNames.TYPE="Collection"
FIELD(2).NAME="Values"
Values.TYPE="Collection"
FIELD(3).NAME="AttributeNames"
AttributeNames.TYPE="Collection"
FIELD(4).NAME="temp"
temp.TYPE="Number"
FIELD(5).NAME="pressure"
pressure.TYPE="Number"
FIELD(6).NAME="humidity"
humidity.TYPE="Number"
FIELD(7).NAME="windspeed"
windspeed.TYPE="Number"
FIELD(8).NAME="sunrise"
sunrise.TYPE="DateTime"
sunrise.FORMAT="SECONDS_LOCAL"
FIELD(9).NAME="sunset"
sunset.TYPE="DateTime"
sunset.FORMAT="SECONDS_LOCAL"
FIELD(10).NAME="main"
FIELD(11).NAME="City"
FIELD(12).NAME="wind"
FIELD(13).NAME="sys"
FIELD(14).NAME="ElementName"


[MSG]
MSG(1).NAME="Data"

[Data]
Data.FILTER=C1=="*"
'Going through each object in the array

FOREACH (JsonGetItem(__MESSAGE, "list[]")) DO
'Initialize the Collection variables
TagNames = Clear()
Values = Clear()
AttributeNames = Clear()
City = JsonGetValue(__ITEM, "name")

'Getting the variables of interest under main{}
main = JsonGetValue(__ITEM, "main")
temp = JsonGetValue(main, "temp")
TagNames = Add(CONCAT(City,"_temp"))
Values = Add(temp)
AttributeNames = Add("Temperature")
pressure = JsonGetValue(main, "pressure")
TagNames = Add(CONCAT(City,"_pressure"))
Values = Add(pressure)
AttributeNames = Add("Pressure")
humidity = JsonGetValue(main, "humidity")
TagNames = Add(CONCAT(City,"_humidity"))
Values = Add(humidity)
AttributeNames = Add("Humidity")
'Getting the variables of interest under wind{}
wind = JsonGetValue(__ITEM, "wind")
windspeed = JsonGetValue(wind, "speed")
TagNames = Add(CONCAT(City,"_windspeed"))
Values = Add(windspeed)
AttributeNames = Add("Wind Speed")
'Getting the variables of interest under sys{}
sys = JsonGetValue(__ITEM, "sys")
sunset = JsonGetValue(sys, "sunset")
TagNames = Add(CONCAT(City,"_sunset"))
Values = Add(sunset)
AttributeNames = Add("Sunset Time")
sunrise = JsonGetValue(sys, "sunrise")
TagNames = Add(CONCAT(City,"_sunrise"))
Values = Add(sunrise)
AttributeNames = Add("Sunrise Time")

'Store the values in PI
StoreEvents(TagNames,AttributeNames,,Values)
'Create the AF Element to store the attributes in
ElementName = CONCAT("Weather_Monitoring\", City)
StoreElement("Weather_Monitoring") 'Parent Element
'We use the TagNames collection as the dynamic attributes collection
StoreElement(ElementName, "WeatherTemplate", TagNames) 'Child Element
ENDFOR

 

 

Configuration in the connector admin page:

  • Configuration File: The .ini file built previously
  • Data Source Type: REST Client
  • Address: <YourAPIcall>
  • Scan Time: 600 (I am getting a new value every 10 minutes)
  • Word Wrap: -1. This is very important: in order to read the JSON format, we need to use "-1" so that the connector interprets the formatted file as one line

 

Results:

An element was created for each city, including our attributes of interest. Since they all share the same template, the UOMs can be defined and further calculations/displays can be built at the template level.

Since the API call defines the entire structure, we could add any other location and the structure would automatically update on the next scan.