
A new version of PI Connector for UFL is now available. The highlight of this release is added support for Unicode (UTF-8) input.


What’s new in PI Connector for UFL 1.3?

- Unicode (UTF-8) support

- Status in StoreEvent (StoreEvents) functions is now mapped to the System Digital Set

- HTTP GET requests now support custom HTTP headers in the REST Client configuration


Unicode (UTF-8) support

An Encoding parameter has been added to the Data Source configuration options.

It can be set to:

- Extended ASCII - processes an ASCII-encoded data stream. This is the default setting and preserves the original connector behavior.

- Unicode (UTF-8) - processes a UTF-8-encoded data stream.


PI and UTF-8 support - UTF-8 characters are supported by the PI AF Server. Therefore, UTF-8 strings can be used in Element names, Attribute names, Event Frame names, and static attribute values.
Limitations - UTF-8 characters cannot be used in the connector configuration file (.INI), Template names, PI Tag names, or data stored in PI. Furthermore, UTF-8 encoded strings cannot be manipulated using the string functions (CHAR(), REPLACE(), SUBSTR(), etc.).
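Since tag names and INI content must stay ASCII, it can be worth validating names before sending them to the connector. A minimal Python sketch (the helper name is ours, not part of the connector):

```python
def safe_for_tag_name(name: str) -> bool:
    """Return True if a string is plain ASCII and therefore safe to use
    where the connector does not accept UTF-8 (PI tag names, INI content)."""
    return name.isascii()

print(safe_for_tag_name("Temperature"))   # True
print(safe_for_tag_name("Température"))   # False - contains a non-ASCII character
```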





Enhanced support of Status values

With UFL Connector version 1.2 or older, every non-zero Status value specified in the StoreEvent or StoreEvents function was represented as a "Bad" status in the PI System. In version 1.3+, any status from the System Digital Set can be used.

Custom Status values can be specified by a string (for example, "Manual") or by a number (for example, 224 resolves to "Bad Quality").




And just as before, if the Status is Good (status value equals 0), the actual value obtained from the data source is stored in PI.
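To make the mapping concrete, here is a small Python sketch of the resolution logic described above. The lookup table is a stand-in: only status 0 (Good) and 224 ("Bad Quality") are taken from this post; the real System Digital Set lives on the PI Data Archive.

```python
# Illustrative stand-in for the System Digital Set lookup; not the real set.
SYSTEM_DIGITAL_SET = {224: "Bad Quality"}

def resolve_status(value, status):
    """Mimic the v1.3 behavior: status 0 stores the source value,
    a non-zero status stores the matching digital state instead."""
    if status == 0:
        return value
    return SYSTEM_DIGITAL_SET.get(status, "Bad")

print(resolve_status(42.5, 0))    # 42.5 - the actual value is stored
print(resolve_status(42.5, 224))  # Bad Quality
```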


HTTP Headers in GET requests
Some servers require custom HTTP headers in the incoming GET request. These can be specified in the %PIHOME64%\Connectors\UFL\Configuration\Datasource.config.json file.
The HttpHeaders list is present for each data source and it's empty by default.


In the example below, two custom HTTP headers are specified. Each HTTP header needs to be enclosed in square brackets. Multiple headers are separated by a comma.

"HttpHeaders": "[User-Agent:OSISoft UFL Connector/1.2.1.x], [Accept:application/vnd.noaa.dwml+json;version=1]"
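If you want to sanity-check such a string before editing the JSON file, the bracketed, comma-separated format can be parsed with a short Python sketch (the helper is illustrative, not part of the connector):

```python
import re

def parse_http_headers(value: str) -> dict:
    """Parse the connector's HttpHeaders string, where each header is
    enclosed in square brackets and multiple headers are comma-separated."""
    headers = {}
    for match in re.finditer(r"\[([^:\]]+):([^\]]*)\]", value):
        headers[match.group(1).strip()] = match.group(2).strip()
    return headers

example = ("[User-Agent:OSISoft UFL Connector/1.2.1.x], "
           "[Accept:application/vnd.noaa.dwml+json;version=1]")
print(parse_http_headers(example))
```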





    WI# 189070 - UTF-8 encoding support of the textual input.

    WI# 190253 - Updated JSON parser.

    WI# 203682 - The status parameter in StoreEvent() is now mapped to the PI System Digital Set.

    WI# 195701 - Option to specify custom HTTP headers has been added to REST Client channel configuration.

Bug Fixes:

    WI# 197293 - In certain rare cases, if the PI Connector received a stream name with a trailing space, the data flow could stop after a restart of the process with the error message "The given key was not present in the dictionary." This is now fixed.

    WI# 193952 - The AF buffer is no longer blocked when a message with an unsupported character in an AF attribute name gets posted. Such a message is discarded and the AF buffer drain continues.

    WI# 186556 - The IsNumber() function now recognizes the Locale setting.

    WI# 194120 - SECONDS_GMT and SECONDS_LOCAL are now independent of the Incoming TimeStamps option on the connector configuration page.

    WI# 196204 - NULL values in the Collection-type variables are now ignored.

    WI# 196693 - Performance of FOREACH() construct has been improved.


Do you have ideas for enhancements or new features? We'd like to hear from you!
Please let us know here on PI Square or on UserVoice. The most up-voted item has been an interactive editor for configuration files (INI), so that's our priority for the next release. If you need inspiration for your INI file, take a look at our GitHub repository.

Sometimes the data stream doesn't provide all the information you need. For example, the timestamp or asset name may be in the file name (e.g., "Device1_20180605.dat"), or with the REST Server channel you may want information about the sender node/application and its IP address. This article describes what metadata you can get in the PI Connector for UFL.


Table of all the keywords for meta information:

__DSNAME - The name of the configured data source (on the connector administration page)
__DSDESCRIPTION - Data source description (on the connector administration page)
__DSADDRESS - Data source address (on the connector administration page)
__MESSAGE - The content of the current message (line)
__STREAMINFO - Stream information (described below)



For the File channel, this variable has the following format: filename|modification date|creation date, for example: a.txt|07-Jun-2016 04:35:39.676|03-Jun-2016 02:51:31.173 (it replaces the PFN parameter in the UFL Interface).

For the REST Server channel, this variable has the following format: Source IP Address|Port, for example:|5687

For the REST Client channel, the format is: Endpoint address|Server, for example:|nginx
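Since these are simple pipe-delimited strings, they can be split with ordinary string handling. A quick Python illustration for the File-channel format (the helper name is ours):

```python
def parse_file_streaminfo(info: str) -> dict:
    """Split the File-channel __STREAMINFO string
    'filename|modification date|creation date' into its parts."""
    filename, modified, created = info.split("|")
    return {"filename": filename, "modified": modified, "created": created}

print(parse_file_streaminfo(
    "a.txt|07-Jun-2016 04:35:39.676|03-Jun-2016 02:51:31.173"))
```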


How to parse meta information?

In the following example for File channel we parse all the meta information as static attributes and create an asset.

FIELD(1).Name = "FileName"
FIELD(2).Name = "TimestampCreated"
FIELD(2).Type = "DateTime"
FIELD(2).Format = "dd-MMM-yyyy hh:mm:ss.nnn"
FIELD(3).Name = "TimestampModified"
FIELD(3).Type = "DateTime"
FIELD(3).Format = "dd-MMM-yyyy hh:mm:ss.nnn"
FIELD(4).Name = "StatAttrCol"
FIELD(4).Type = "Collection"
FIELD(5).Name = "Temp"


Data.FILTER = C1=="*"

Temp = __MESSAGE
FileName = ["(*)|*"]
TimestampModified = ["*|(*)|*"]
TimestampCreated = ["*|*|(*)"]
__MESSAGE = Temp
StatAttrCol = Clear()

StatAttrCol = Add("Data Source name",__DSNAME)
StatAttrCol = Add("Data Source description",__DSDESCRIPTION)
StatAttrCol = Add("Data source address",__DSADDRESS)
StatAttrCol = Add("StreamInfo string",__STREAMINFO)
StatAttrCol = Add("Streamdata string",__MESSAGE)
StatAttrCol = Add("File name",FileName)
StatAttrCol = Add("File created",TimestampCreated)
StatAttrCol = Add("File modified",TimestampModified)
StatAttrCol = Add("Processing started",NOW())



The result:


We’ve been working on a new version of PI Connector for UFL for the last couple of months. The new version brings better performance, more efficient resource management, and a couple of highly requested features (native JSON and FOREACH construct support). Feel free to download and try it out!


What’s new in PI Connector for UFL 1.2?

  • More efficient processor consumption & faster creation of AF structure
  • REST Client Data Source supports bulk calls via URL parameters in a pipe-separated list
  • ForEach() construct for stepping through collections in JSON and CSV formats
  • Native support for parsing JSON
  • Bulk function for sending events – StoreEvents()
  • New ToString() function


Below I would like to show & explain the main new features. For the full reference, please read the User Guide and take a look at the GitHub examples.


How to process JSON data with PI Connector for UFL?

The following JSON contains information such as “timestamp”, “rowType”, etc. at the root level. These key values can be easily parsed using the function JsonGetValue([JSON_AS_STRING], [KEY]). But tag names and values are inner JSON objects in the “channels” array[], so we need to iterate through it using FOREACH(JsonGetItem([JSON_AS_STRING], [KEY[]])). The new “__ITEM” field is then populated with the inner JSON objects (e.g., {"tag":"controller.stats.idleTime_hours","value":"194.806808"}) one by one.









TimeStamp = JsonGetValue(__MESSAGE, "timestamp")

FOREACH (JsonGetItem(__MESSAGE, "channels[]")) DO

     TagName = JsonGetValue(__ITEM, "tag")
     Value = JsonGetValue(__ITEM, "value")

     StoreEvent(TagName, ,TimeStamp, Value)
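The same iteration can be sketched in plain Python with the standard json module, which may help clarify what the FOREACH is doing (the second channel entry below is a made-up sample):

```python
import json

# A sample message mirroring the structure described above; the second
# channel is invented for illustration.
message = json.dumps({
    "timestamp": "2018-06-05T12:00:00Z",
    "channels": [
        {"tag": "controller.stats.idleTime_hours", "value": "194.806808"},
        {"tag": "controller.stats.runTime_hours", "value": "12.5"},
    ],
})

def parse_channels(msg: str):
    """Equivalent of the FOREACH above: read the root timestamp,
    then produce one (tag, timestamp, value) event per inner JSON."""
    doc = json.loads(msg)
    ts = doc["timestamp"]
    return [(item["tag"], ts, item["value"]) for item in doc["channels"]]

for event in parse_channels(message):
    print(event)
```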


How to process CSV data with PI Connector for UFL?

Parsing CSV with the current UFL can be quite difficult, especially when there's one line of tag names and several lines of values - I've seen INI files with thousands of lines. Since we want to make configuration as easy as possible, we have added FOREACH and Collection support to the INI capabilities. Take a look at the data below: the easiest way to process such a CSV is to have a collection of tag names, a collection of values, and one field for a timestamp. That's exactly how it works now. With those collections it's possible to call the StoreEvents method, where the strings in the tag-names collection match the order and number of items in the values collection. Thanks to this, the UFL Connector can pair them together with the timestamp to create events.



7/12/2017 6:10,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20
7/12/2017 6:20,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20
7/12/2017 6:30,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20
7/12/2017 6:40,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20



FIELD(3).FORMAT="M/dd/yyyy h:mm"


Tags.FILTER = C1=="T*"

Counter = 0
TagNames = Clear()

FOREACH (CsvGetItem(__MESSAGE, ",")) DO

     IF(Counter > 0) THEN
          TagNames = Add(__ITEM)

     Counter = Counter + 1

Data.FILTER = C1=="*"

Counter = 0
Values = Clear()

FOREACH (CsvGetItem(__MESSAGE, ",")) DO

     IF(Counter == 0) THEN
          TimeStamp = __ITEM
     ELSE
          Value = __ITEM
          Values = Add(Value)

     Counter = Counter + 1

StoreEvents(TagNames, ,TimeStamp, Values)
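To see what this pairing amounts to, here is a Python sketch of the same logic (the header tag names are invented, since the sample above shows only value rows; the header starts with "T" to match the Tags.FILTER):

```python
# Invented header names; the data row is taken from the sample above
# (shortened to three values for brevity).
csv_lines = [
    "TagNames,Tag1,Tag2,Tag3",
    "7/12/2017 6:10,1,2,3",
]

def parse_csv(lines):
    """Mirror the INI logic: collect tag names from the header (skipping
    the first cell), then pair each data row's values with those names."""
    tag_names = lines[0].split(",")[1:]
    events = []
    for line in lines[1:]:
        cells = line.split(",")
        timestamp, values = cells[0], cells[1:]
        events.extend(zip(tag_names, [timestamp] * len(values), values))
    return events

print(parse_csv(csv_lines))
```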


REST Client Data Source BULK calls

With the REST Client, an HTTP GET can be executed to bring in content from a remote server. It can be a REST Server or, for example, an IoT device. But what if there are multiple IoT devices? How do you get data from all of them with just one data source configured in the PI Connector for UFL? New support for the “UFL_PLACEHOLDER” keyword makes it easy. The following string could be used as a DS address:


     UFL_PLACEHOLDER:6894/RESTServiceImpl.svc/json/ |||


As a result, three GET requests are executed each time. The responses are processed by your INI file, and tags and the AF structure are subsequently created for all three listed devices.
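Conceptually, the placeholder expansion works like this Python sketch (the device names and the http scheme are hypothetical):

```python
def expand_placeholder(template: str, items: str) -> list:
    """Sketch of the UFL_PLACEHOLDER idea: substitute each entry of a
    pipe-separated list into the endpoint address, producing one GET
    URL per device."""
    return [template.replace("UFL_PLACEHOLDER", item)
            for item in items.split("|")]

urls = expand_placeholder(
    "http://UFL_PLACEHOLDER:6894/RESTServiceImpl.svc/json/",
    "device1|device2|device3")   # hypothetical device names
for u in urls:
    print(u)
```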


There is much more - just try it and let us know!




    WI# 147909 - Stepping through collections in Json and Csv formats via the ForEach() construct.

    WI# 149795 - Implementation of the ToString() function.

    WI# 149933 - Implementation of the StoreEvents() function.

    WI# 153597 - REST Server Data Source Type performance optimization for frequent inputs coming from many clients (IoT scenario).

    WI# 164127 - REST Client Data Source Type now supports replacing parts of the end point address by items in a pipe separated list.

    WI# 164148 - The structured input (json) can now be read as one line when Word Wrap = -1.

Bug Fixes:

    WI# 162882 - The CHAR(#) function converted numbers to ASCII for numbers between 0 and 127. This has been changed and now the CHAR(#) function recognizes every whole number from 0 to 255.

    WI# 162883 - The PRINT() function now correctly handles sub-milliseconds.

    WI# 176099 - For performance reasons, data files larger than 256 MB will not be zipped after processing.

We are excited to announce that today we have posted a beta for version 1.2 of the PI Connector for UFL.  New in version 1.2 are native consumption of JSON files, a ForEach construct, a way to include parameters in your REST URL to make multiple queries, and many behind the scenes improvements.


You will find all the details on our beta site: OSIsoft Beta Program.


We're looking for feedback from you, our customers and partners, so that we build the best, most robust product for you.  We encourage you to join the beta program today!

I'm happy to announce that the OPC Foundation has certified the PI Interface for OPC DA for compliance with the OPC DA specification.  We go through certification every few years to make sure we haven't strayed from any portion of the specification.  We were certified with the latest release ( with no need for any changes to the product.


We have posted a reference to the certification on our TechSupport site on the PI Interface for OPC DA details page.  The certification is also posted on the OPC Foundation website.

We are excited to announce that today we have posted a beta for version 1.1 of the PI Connector for UFL.  This connector now has a major new channel for getting data: the PI Connector for UFL can now retrieve data by calling an external REST web service!  Simply set up the parsing logic as always, point it to the external web service, and you are off and collecting data.  You will find all the details on our beta site: OSIsoft Beta Program.


We're looking for feedback from you, our customers and partners, so that we build the best, most robust product for you.  We encourage you to apply for the beta program today!

If a source data file sometimes contains tabs, spaces, or a combination of tabs and spaces between the different fields as separators, you can use the same syntax as below, where instead of ',' ';' ':' you type a space and a tab (see below):

FIELD(4) = C1 - C1("[;,:]") > FIELD(4) = C1 - C1("[      ]")

( the 'white space' between the '[ ]' contains a type 'space' and a 'tab')

Regrettably, this blog service changes a tab to spaces...



So the data lines look like this; the first two lines contain only 'spaces', while the third line contains a 'tab' and 'spaces':

123XYZ01PQ122       6.99

123XYZ01PQ123        1.03

123XYZ01PQ125     0.04


With the UFL Designer it looks like you can configure it for 'spaces' and 'tabs';

But, as you can see below, it will NOT recognize any data;


But with the syntax as mentioned in line 2 above, it does nicely recognize the tags and values;

Have you ever wanted to have a PI Connector be able to call a RESTful web service and take the resultant file and parse it into data sent to your PI Servers?  Well that time is coming soon.


In version 1.1 of the PI Connector for UFL, we will introduce a new feature called a REST client.  It will be able to make a simple REST web service call and then use the UFL parsing logic to transform that data into assets, event frames, PI Points, and the time series data stored in PI Points.


Also, there will be numerous other enhancements including improved performance and resource usage.


Details will be forthcoming soon.  We expect the beta period to run from about mid-October to mid-November, roughly 4 weeks.  So get your test resources ready and look for my next post on PI Square telling you how to get the beta.


Meanwhile, if you would like to see a demo, check out our PI Connectors and PI Interfaces presentation at the Berlin UC last week.  Here's a link:


More soon...



What are PI Connectors?

Posted by ccoen Aug 24, 2016

Hi all,

Curious as to what these new products called PI Connectors do?  How are they different from PI Interfaces?  Want to know how to explain this to a colleague?


Our Learning team has produced a new video called "What are PI Connectors?" that should help answer your questions.  Check it out and let us know what you think!




This post is based on a summer internship project, undertaken at the OSIsoft Montreal office. For more details, check out the PI Square blog Monitoring Smart City Assets with the PI System.


For this post, we shall take a look at a concrete example.


Below we have data from bike trips in a CSV file, published on Montreal’s bike sharing system’s website.



   Historical Bike Data (2015) - Montreal's BIXI



As we can see, we have information regarding the following: When was the bike picked up? From where? When was the bike returned? To where? Additional information such as the account type and the total trip duration are also given.


Eventually, we decided to store these trips not as PI points, but rather as event frames, since the data is static and has a start and an end time.

We also considered modeling them as transfers, but transfers are not supported by many client tools.


We will need to assign the Event Frames to a parent element in PI AF, thus allowing us to consume this trip data in client tools like PI Coresight.


Given the large number of Event Frames created (more than 5,000,000!), another great way of consuming the data will be to create Event Views using the PI Integrator for BA, then build dashboards in Power BI.

We did not transform those CSV trips into event frames just to view them in Power BI - we could have simply loaded the CSV directly into Power BI, right?


It becomes more interesting, however, when we combine the event frame’s static data (static attributes) with other relevant PI points from the same time frame. This is where the PI System comes into play.


For this example, since biking is an outdoor activity, weather data might explain why some days see more trips than others, would you agree? This is exactly where the PI System comes in handy - the ability to integrate multiple datasets and PI points in a single scene!


Fortunately, we were already collecting live weather data using the connector. Therefore, we simply used the connector to store the historical weather parameters to the same PI points the live weather is writing to, and backfilled.


Let’s take a look at the historical weather data we’ve brought into our PI server:


      Historical Weather Data (2015)





Indications of the weather status, temperature, humidity, and wind speed will hopefully help explain the observations made for the 2015 bike trips.




Using PI Connector for UFL to create an AF structure, event frames with attributes, and referencing the event frames to their corresponding element:


One of the many reasons we decided to use the PI Connector for UFL, rather than the PI Interface for UFL, is its ability to store event frames using the built-in StoreEventFrame() function.


First, let us see how the data is structured in the comma separated values (csv) file:




In this case, we know that the content before the first comma represents the “start date”, the content after the first comma represents the “start station number”, and so on. Therefore, we can delete the first CSV line (the table’s column headers) to avoid storing it.


Since we’re more interested in looking at the demand at certain stations, we’re going to be referencing to the “start station name” to analyze how many trips were made from a station rather than to a station.


Let us take a closer look at how to parse the 2015 trips in the CSV file into event frames.


We must first define our variables in our INI configuration file:


The Att_Col variable will be used to allow us to store multiple attributes, under one single variable!



Finally, we run this configuration file in our connector’s admin page with the data file (csv) we’re trying to parse!






This is the AF structure we’ve created with the connector for UFL in order to reference the event frames (bike trips) to their corresponding starting station name:










As we can see, simply with one INI configuration file, the PI connector for UFL was able to:

  • Create an AF structure
  • Store the trips as Event frames (with a start and end time)
  • Store four attributes for the event frame template




We have used the PI Connector for UFL to store the weather parameters to existing PI points and backfilled.


Next, we simply import these weather PI points in our original event frame template to have them in a single scene.




Now, we need to bring this data into our business intelligence tool. Fortunately, this is where the PI integrator for BA shines.


In our case, we published “Event Views” as we were dealing with event frames. If our AF elements had attributes, we could have published “Asset Views”.


Below, we can see how the final published “Event View” looks:



All of our Event Views were stored in SQL Server, so we can now go to Power BI and load them from our SQL Server:




Once that’s loaded, we can use the various features in Power BI to create multiple advanced analyses, all thanks to the Integrator for BA.



It is worth mentioning that once we loaded our “Event View” in Power BI, we added another counter column and gave it the value of 1. That way, whenever we slice through parameters, we have a value to determine how many items match the filter (or slicer).


Attached, you will find the INI file and a sample data set for you to try!



Please click here to view the analysis done on Power BI!



P.S.: If you happen to download a CSV file from BIXI's open data website for other months, for example, make sure you open the CSV in MS Excel and then save it (still as CSV), simply to have clean rows that the connector can easily read.


Sending public JSON data to UFL Connector Silently

This post will discuss sending public JSON data to the PI Connector for UFL, and the steps which can be taken to run this process continuously in the background. Please have a look at Jerome Lefebvre’s post on sending public JSON data to the PI Connector for UFL.


This post is based on a summer internship project, undertaken at the OSIsoft Montreal office. For more details, check out "Monitoring Smart City Assets with the PI System"



Running the GET/ PUT script continuously

When reading data from a live source, there is often a need to read data from the source at a regular interval. This means the Python script must be called at a specified interval, such as every 10 minutes. I found this was best done by creating a batch file which calls the Python script repeatedly, and then adding a 10-minute sleep command within the Python script. The result is a script which is executed once every 10 minutes continuously.

The Python script is called within a batch file in order to pass necessary arguments (Connector’s REST endpoint location and data page URL).
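The fetch-and-forward cycle described above can be sketched in Python as follows. The fetch and send callables are injected so the sketch stays testable offline; in practice they would wrap requests.get and requests.put against the URLs passed by the batch file (function names are ours):

```python
import time

def forward_once(source_url, connector_url, fetch, send):
    """One cycle of the script: read JSON from the public source and
    forward it to the connector's REST endpoint."""
    payload = fetch(source_url)
    return send(connector_url, payload)

def run_forever(source_url, connector_url, fetch, send, interval_seconds=600):
    """Repeat the cycle with a 10-minute sleep, as described above."""
    while True:
        forward_once(source_url, connector_url, fetch, send)
        time.sleep(interval_seconds)
```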


The batch file is as follows:


:begin
py "https://YUL-SRV-INT01:5460/connectordata/REST/" ""
GOTO begin


Once the batch file is launched, a command prompt window will open showing the status of each time the Python script was run. The Python script is set to print a message to the terminal window confirming if the script was run successfully, or if any error occurred.  It may show “data sent successfully over https” or “sending data failed due to error 4XX”.


Running the Python script as a windows service

When we wanted to migrate this script from our development to production environment, one of the requirements was that the script had to run in the background and not under a single user account. This was first accomplished by running the Python script as a Windows service. NSSM (the Non-Sucking Service Manager) was used to take the launcher batch file and implement it as a Windows service.  NSSM is a service helper created by Iain Patterson - more information on it can be found here.


Calling NSSM from the command line will bring up a GUI allowing us to configure the service. We can customize our service to run under a specific user account, add startup dependencies and more.


Writing the output to the Windows Event Logs

Since our Python script is now running as a Windows service in the background, we can’t rely on the command prompt window to see whether it is functioning properly.  We created a workaround by writing to the Windows event logs. This is possible thanks to the pywin32 (Win32 API) Python library, which allows direct access to the Windows event logs.  A basic outline of the script is available here.


We can include statements to open communication with the Windows event logs before executing GET/PUT requests:

import win32api
import win32con
import win32security

ph = win32api.GetCurrentProcess()
th = win32security.OpenProcessToken(ph, win32con.TOKEN_READ)
my_sid = win32security.GetTokenInformation(th, win32security.TokenUser)[0]
applicationName = "UFL_Service"
eventID = 1
category = 5    # Shell


Then, in the section of our script where we handle printing the status to the terminal, we can also write to the Windows event logs. This method must be embedded in the Python GET/PUT script, which can be found in Jerome Lefebvre's blog post. The print message is hard-coded in each Python script; in our case, each Python script dealt with a different data source, so the message had to be changed for each. The one below deals with data from Philadelphia's public bike sharing system, Indego.


import win32evtlog
import win32evtlogutil

if response.status_code != 200:
    print("Sending data to the UFL connector failed due to error {0} {1}".format(response.status_code, response.reason))
    desc = ["Sending PHL Bike data to the UFL connector failed due to error {0} {1}".format(response.status_code, response.reason)]
    data = "Application\0Data".encode("ascii")
    myType = win32evtlog.EVENTLOG_WARNING_TYPE
    win32evtlogutil.ReportEvent(applicationName, eventID, eventCategory=category, eventType=myType, strings=desc, data=data, sid=my_sid)
else:
    print('The data was sent successfully over https.')
    print('Check the PI Connectors event logs for any further information.')
    desc = ['The PHL Bike data was sent successfully over https.']
    data = "Application\0Data".encode("ascii")
    myType = win32evtlog.EVENTLOG_INFORMATION_TYPE
    win32evtlogutil.ReportEvent(applicationName, eventID, eventCategory=category, eventType=myType, strings=desc, data=data, sid=my_sid)


Instead of printing status to a command prompt window, the script will now print to the windows event logs. The logging level has been set to WARNING ( myType = win32evtlog.EVENTLOG_WARNING_TYPE) when the script encounters an error, and INFORMATION (myType = win32evtlog.EVENTLOG_INFORMATION_TYPE) when the script functions properly.




Alternative: running the Python script using Windows Task Scheduler:

Instead of running the Python script as a Windows service, we can call it at a regular interval using Windows Task Scheduler. In our case, certain arguments must be passed to the Python script every time it is run, so the call is included in a batch file that invokes the Python script with the necessary arguments.


There are several advantages to using Windows Task Scheduler over a Windows service. We can set the application to run at a specified interval or on a specific event - on computer shutdown, sleep, etc. In addition to the event logging present in the Python script, Windows Task Scheduler will give us a confirmation of whether the script ran successfully or what error it encountered. Windows Task Scheduler can also be configured to retry the script a specified number of times if an error is encountered.


Below is an example of our use case: we call a batch file which reads data from six different websites and sends them all to the PI Connector for UFL via its REST endpoint.



py "https://YUL-SRV-INT01:5460/connectordata/REST/"  "" 
py "https://YUL-SRV-INT01:5460/connectordata/Bos/" "" 
py "https://YUL-SRV-INT01:5460/connectordata/NYC_Live/" "" 
py "https://YUL-SRV-INT01:5460/connectordata/PHL_Live_Bike/" "" 
py "https://YUL-SRV-INT01:5460/connectordata/SF_Live_Bike/" "" 
py "https://YUL-SRV-INT01:5460/connectordata/Toronto_Live_Bike/" ""

This post is based on a summer internship project, undertaken at the OSIsoft Montreal office. For more details, check out "Monitoring Smart City Assets with the PI System"


Building a PI AF hierarchy using the PI Connector for UFL

In this post, we will present a sample of the PI Connector for UFL’s ability to interact with a PI AF structure. Unlike the PI Interface for UFL, the PI Connector for UFL can communicate with the PI AF server and create PI AF elements. These PI AF elements can be created based on templates to include static attributes and PI point data references.


Why create PI AF elements using the PI connector for UFL?

In the past, if there was a need to roll out a large number of AF elements, it had to be done either manually or using Excel/PI Builder. With the PI Connector for UFL, we can create AF elements, assign AF attributes, and create an AF hierarchy automatically. The process remains the same for 10 elements or 10,000 elements.  Should a new PI AF element appear in your data file, the PI Connector for UFL will add it to the PI AF hierarchy automatically. Under certain conditions, the Connector can therefore operate in a “set and forget” mode, where the user configures the Connector and walks away.


Building an element hierarchy based on data available in the input file:

When building an element hierarchy, the PI Connector for UFL will primarily rely on hierarchy data present in the data file being read. Anything not present in the data must be hardcoded in the connector’s configuration INI file, which will be demonstrated in the next section.

For example, take this excerpt from a live data feed showing a bike rental station in San Francisco. Each public bike rental station will have live, dynamic attributes such as the number of bikes available and number of docks available, and static attributes such as station ID, station name, and latitude/longitude positioning.


{      "id": 2,
      "stationName": "San Jose Diridon Caltrain Station",
      "availableDocks": 16,
      "totalDocks": 27,
      "latitude": 37.329732,
      "longitude": -121.901782,
      "statusValue": "In Service",
      "statusKey": 1,
      "availableBikes": 11,
      "stAddress1": "San Jose Diridon Caltrain Station",
      "stAddress2": "",
      "city": "San Jose",
      "postalCode": "",
      "location": "Crandall Street",
      "altitude": "",
      "testStation": false,
      "lastCommunicationTime": null,
      "landMark": "San Jose",
      "renting": true,
      "is_renting": true
}



Say we wanted to build a PI AF hierarchy that looked something like this:

  • San Francisco Bike Share
    • CITY A
      • Station A
      • Station B
    • CITY B
      • Station C
      • Station D



Using our Connector’s message statements, we can extract the variables we’re interested in. These include the longitude/latitude, station name, station ID, city location, available docks and available bikes.

We can then call the StoreEvent function to store our PI points and add them to a dynamic attribute collection. Similarly, we can store static data in our static attribute collection.


StoreEvent("SF." + StationID + ".AvailDocks", "Docks Available", UpdateTime, DocksAvail)
StoreEvent("SF." + StationID + ".AvailBikes", "Bikes Available", UpdateTime, BikesAvail)
DynAttrCol = Add("SF." + StationID + ".AvailDocks")
DynAttrCol = Add("SF." + StationID + ".AvailBikes")
StatAttrCol = Add("Longitude", Long)
StatAttrCol = Add("Latitude", Lat)
StatAttrCol = Add("Station ID", StationID)
StatAttrCol = Add("City", City)



Now that our collections are fully set up, we can move on to storing our AF elements with the desired hierarchy. We have to store the root of our hierarchy first and then drill down to each level. We can use CHAR(92) to produce the backslash “\” character in the path name.


StoreElement("Bay Area Bike Share (SF)")
StoreElement("Bay Area Bike Share (SF)" + CHAR(92) + City, "SF_Borough")
StoreElement("Bay Area Bike Share (SF)" + CHAR(92) + City + CHAR(92) + StationName, "SF_Station", DynAttrCol, StatAttrCol)


Notice that on the final line, we store the individual AF element with the desired static and dynamic attribute collections (StatAttrCol, DynAttrCol).
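Since CHAR(92) is just the backslash, the three paths being stored are simply backslash-joined level names, as this small Python sketch shows (the city and station values come from the sample JSON above):

```python
def element_path(*levels):
    """Join AF hierarchy levels with the backslash that CHAR(92) produces."""
    return "\\".join(levels)

print(element_path("Bay Area Bike Share (SF)"))
print(element_path("Bay Area Bike Share (SF)", "San Jose"))
print(element_path("Bay Area Bike Share (SF)", "San Jose",
                   "San Jose Diridon Caltrain Station"))
```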


This is the resulting PI AF structure, fully built using the PI Connector for UFL.



The INI configuration file for this method is included below, named "Config file - Automatically build hierarchy.ini"


Building an element hierarchy based on data not available in the input file:


In this case, we would like to build a detailed hierarchy based on information which is not present in the source data file. We have two options: either build the hierarchy using PI Builder in Excel and then use the Connector to populate the PI points, or hardcode logic in the Connector’s INI configuration file to create the desired hierarchy.


Say we wanted to build a similar hierarchy to the one listed in the section above, but we did not have access to a station’s city location in the data file. We still had all the other variables, just not the city in which each station is located. We have access to another file containing this information, so the INI configuration file can be hardcoded with it.


Similar to the procedure above, we have to first store the individual PI points and add any attributes we want to their respective collection. We start off by storing the parent element for every city:


StoreElement("Toronto Bike Share Stations" + CHAR(92) + "Brockton", "TO_Borough")
StoreElement("Toronto Bike Share Stations" + CHAR(92) + "Cabbagetown", "TO_Borough")
StoreElement("Toronto Bike Share Stations" + CHAR(92) + "Chinatown", "TO_Borough")


We then need logic statements linking stations to their cities, based on the station ID.


IF(StationID == 7056 OR StationID == 7113) THEN
StoreElement("Toronto Bike Share Stations" + CHAR(92) + "Cabbagetown" + CHAR(92) + StationName, "TO_Stations", DynAttrCol, StatAttrCol)


We repeat these IF statements until we have assigned all stations to their respective cities. In our case, this is the only way to perform the assignment. If the station IDs had fallen into a known range for each city, we could have stored them based on that logic instead.
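The chain of IF statements amounts to a lookup table; in Python, the same hardcoded assignment could be sketched as a dictionary from station ID to city (only the two IDs shown above come from the post, the rest would be filled in the same way):

```python
# Hardcoded station-to-city lookup; 7056 and 7113 are from the example above.
STATION_CITY = {
    7056: "Cabbagetown",
    7113: "Cabbagetown",
    # ...one entry per remaining station ID
}

def city_for_station(station_id):
    """Return the hardcoded city for a station, mirroring the IF chain."""
    return STATION_CITY.get(station_id)

print(city_for_station(7056))  # Cabbagetown
```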


The INI configuration file for this method is included below, named "Config file - manually build hierarchy.ini"