All People > Jerome Lefebvre > Jerome Lefebvre のブログ > 2020 > March

Creating event frames with okish names using analysis

Analysis has come a long way in creating event frames that we are happier with, but as always, please vote in our feedback system for all the ways you think it should keep getting better!

In this article, I want to give two tips on improving the event frame names created by the analysis service.

The goal is to create event frames with good names, using AF configuration and four lines of AF SDK:

 

Current state of naming

As you may know, to help with the naming of event frames, analysis allows you to save the name of the start trigger to an event frame attribute. We can then use that attribute value in the naming pattern of the event frame.

An issue arises when multiple start triggers are defined in order to create child event frames. In this case, the event frames are created as follows:

  1. A start trigger becomes true and an event frame A is created.
  2. A second start trigger becomes true, then the following occurs
    1. Event frame A is closed
    2. Event frame B is created and event frame A is moved to be a child of event frame B. Event frame B also has the same name and start time as event frame A.
    3. Event frame C is created with a start time equal to the end time of event frame A.

As the parent and the first event frame will have the same name, one name has to be sacrificed; typically the compromise is that all the child names end up terrible.

PSE has the ability to re-evaluate names. This is a manual process, but if we could automate it and create a naming scheme that is good for both parents and children, we would have a solution.

Automatically re-evaluating event frame names

We will need a bit of AF SDK here (as far as I've looked, this is not exposed via PI Web API or PowerShell yet).

This is done via the AFNameSubstitution.ResolveName method.

Here is sample code that loops over all the recent event frames and re-evaluates their names.

 

var af = (new PISystems()).DefaultPISystem;
var db = af.Databases["Test"];

// Find event frames from the last 3 days created from the template,
// then re-resolve each name against its naming pattern
using (var search = new AFEventFrameSearch(db, "FindRecentEvent", @"Start:>='*-3d' Template:'Event With Names'"))
{
    foreach (AFEventFrame item in search.FindObjects(fullLoad: true))
        AFNameSubstitution.ResolveName(item);
}
db.CheckIn();

 

You would have to play around with the Start and End parameters to grab just the event frames you want to update. Or you could create an event watcher that grabs all event frames as they are created or closed and re-evaluates their names.

Small tricks to build a good name

To build a good name, we need the ability to distinguish whether we are a child or a parent event frame and, based on that, select the correct name.

Sadly, we don't have many tools to do this. Formula attributes only work with numerical values, and String Builder doesn't have if statements. Output expressions in the analysis work with both types, but the closing values of the last event frame and the first will be the same. Here is what I came up with; hopefully somebody in the comments can point out a better way.

  1. Add a "_" at the start of each trigger name in the analysis and store that trigger name (note that underscores don't show up in the analysis pane)

  2. Now, as the parent event frame will have a blank string for this Trigger attribute and child event frames will start with an underscore, we can build a 0/1 flag to distinguish the two. This is simply a String Builder attribute of type Int32 that replaces the first character, if it is an underscore, with a 1. Note that an empty string gets evaluated to 0

 

  3. Create the name depending on whether we have a parent or a child event frame.

 

  4. Now you can create your actual name attribute

 

And we are done!

Bonus

Another issue with event analyses is that child event frames are created back to back. In the real world, there is often a rest or preparation period between two sub-batches. If you have gone this far to fix the names, create child event frames for all sub-batches and rest periods, and add a few lines to the above script to delete all unneeded event frames.

You would then have your events look like this in PI Vision

 

Sample code to delete event frames:

 

List<Guid> ids = null;

// Delete the unneeded "Blank" event frames in pages of 2000
using (var search = new AFEventFrameSearch(db, "FindBlankRecentEvent", @"Start:>='*-3d' Name:'Blank'"))
{
    while ((ids = search.FindObjectIds(pageSize: 2000).Take(2000).ToList()).Count > 0)
        AFEventFrame.DeleteEventFrames(af, ids);
}

Linking Densowave's IoT Data Share with a PI System

Denso Wave built IoT Data Share as a middleware translation layer between industrial protocols. It is used a lot in Japan, particularly due to our so-called Galápagos syndrome, in which solution vendors love to reinvent the wheel.

The list of devices already supported is quite long; here is a subset of it:

 

I want to show a sample project that helps push this data to a PI System via OMF.

To keep the project simple, we will only send values to a single tag and not create any AF structure. A sample project is attached; it was only tested with version 1.8.0 of IoT Data Share and PI Web API 2019. I am also assuming that you are familiar with IoT Data Share, so I will only point out gotchas and general OMF concepts.

The requests involved

To send data to a single tag we need to use 3 different requests:

  1. Create a type
  2. Create a container
  3. Send the time series data

The last step is the one that is repeated continuously. The first two steps are only done in two cases: initialization (the first time the tags are created) and recovery (something bad happened to the created tags or to PI Web API). In this example, we will flip the order to the following:

  1. Send the time series data
  2. If step 1 returns an error regarding the container, send the container message again and then resend the data
  3. If step 2 returns an error regarding the type, send the type message again and then resend the container message

With this setup, we don't need to keep track of whether we are still in the initialization steps; recovery from a bad state and initialization are both handled the same way.
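The flipped order above can be sketched in a few lines. Here `post` is a stand-in for whatever sends an OMF message and returns the HTTP status plus an OMF event code; all the names are placeholders for illustration, not IoT Data Share or PI Web API APIs:

```python
def send_with_recovery(post, type_msg, container_msg, data_msg):
    """Try the data message first; recreate container/type only on failure."""
    status, code = post("data", data_msg)
    if status == 200:
        return "ok"
    if code == 5002:  # OMF event code: the container does not exist
        status, _ = post("container", container_msg)
        if status != 200:          # container creation failed: type missing too
            post("type", type_msg)
            post("container", container_msg)
        post("data", data_msg)     # finally resend the data
        return "recovered"
    return "error"
```

Because the data message is attempted first every time, the steady state costs one request, and the type/container messages are only ever sent when something is actually missing.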

The connection

There are no dedicated OMF plugins in IoT Data Share as of today, but there is a JSON output option that we can configure.

 

In this example, I want to send tire pressure data, so my containerid will contain the name of the tire (e.g. left front tire) and the data will be a pair of a timestamp and a pressure value. For example, it will look like this:

[{ "containerid": "left-front", "values": [ { "time": "2020-03-23T00:00:00Z", "pressure": 32 } ] }]

Two things to notice

  1. The timestamp is in a specific ISO 8601 format and in UTC. In particular, OMF doesn't support timezone offsets, so any timezone shifting must be done by the OMF client application
  2. Even if you send a single value, you must still send a list
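Both points can be checked with a few lines of Python that assemble the same payload (a sketch; the container name and the "pressure" property simply mirror the example above):

```python
import json
from datetime import datetime, timezone

# Sketch: build the OMF data message from the example above.
def data_message(containerid, timestamp, pressure):
    return json.dumps([{                 # the outer list is mandatory,
        "containerid": containerid,      # even for a single value
        "values": [{
            "time": timestamp.strftime("%Y-%m-%dT%H:%M:%SZ"),  # UTC, no offset
            "pressure": pressure,
        }],
    }])

print(data_message("left-front", datetime(2020, 3, 23, tzinfo=timezone.utc), 32))
```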

The time format

To configure the timestamp, we will use the execution time. Luckily, IoT Data Share allows us to grab the current time in UTC via the @GLOBAL_TIME variable.

 

 

All we need to do is format it correctly:
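As a quick illustration of the format itself (a Python sketch, not the IoT Data Share syntax), the whole job is one format string, plus shifting any local offset to UTC:

```python
from datetime import datetime, timezone, timedelta

def omf_timestamp(dt):
    """Format an aware datetime in the ISO 8601 form OMF expects (UTC, trailing Z)."""
    return dt.astimezone(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")

# A local time with an offset must be shifted to UTC by the client:
local = datetime(2020, 3, 23, 9, 0, tzinfo=timezone(timedelta(hours=9)))
print(omf_timestamp(local))  # 2020-03-23T00:00:00Z
```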

 

Create the JSON format

The JSON utility that IoT Data Share provides allows you to build a JSON object via a GUI editor, made up of the usual keys and values, with values of string, number, or array type. Thus, we can build up our object this way:

 

 

The GUI doesn't allow sending an array of JSON objects, so we will use a string operation to create our list of one JSON object.

 

 

Sending data to PI

We are now able to send data to PI via the HTTP Request object.

 

Note that IoT Data Share will not work against an untrusted certificate, so make sure PI Web API has a valid trusted certificate.

Adding in the container information

The above request sends data; it will fail if the container (and the type) has not been created ahead of time. In that case, PI Web API returns two pieces of information: the request comes back with a status of 400 ("Bad Request", i.e. there is an issue with the request), and it contains a further error code of 5002, which states that the particular reason the request is bad is that the container mentioned does not exist. It is the duty of your IoT Data Share project to handle these various OMF event codes; this sample only deals with the lack of a container.
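The exact shape of the error body can vary, so rather than assuming a fixed schema, a client can scan the response for event codes. The response structure in this Python sketch is an assumption for illustration, not the documented PI Web API schema:

```python
import json

def omf_event_codes(body):
    """Collect any 'EventCode' values found anywhere in an OMF error body."""
    codes = []
    def walk(node):
        if isinstance(node, dict):
            for key, value in node.items():
                if key == "EventCode":
                    codes.append(value)
                walk(value)
        elif isinstance(node, list):
            for item in node:
                walk(item)
    walk(json.loads(body))
    return codes

# e.g. resend the container message when 5002 shows up after a 400
# (this body is a made-up example of what an error response might contain):
body = '{"Messages": [{"Events": [{"EventCode": 5002}]}]}'
needs_container = 5002 in omf_event_codes(body)
print(needs_container)  # True
```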

We can now see the completed project. 

 

Note that we didn't add a step that handles a missing type; adding that is left up to the reader.

Note: The format of the data source that I am using for the INI file below changes quite a bit from day to day, either the names of countries or the relationships between countries and regions. Thus the INI file no longer works after the 10th of this month... if anybody wants to update it, please post a newer version below in the comments.

 

As with many I assume, I anxiously look at how Covid-19 is spreading around the world.

Other than limiting my personal risk, there is little I can do that is constructive. But, as a guy with PI, I can at least play with numbers as a way to deal with that anxiety.

 

I first want to write up how to get some data into PI as there might be some UFL tricks to be learned by doing so.

The data I am using is the data underlying this dashboard, Operations Dashboard for ArcGIS, and can be found on GitHub: GitHub - CSSEGISandData/COVID-19: Novel Coronavirus (COVID-19) Cases, provided by JHU CSSE

 

The main thing I want to be able to visualize is when the spread will slow down, not the sadder aspects of this. Thus, I will only be looking at confirmed cases, which we can grab from this file:

https://raw.githubusercontent.com/CSSEGISandData/COVID-19/master/csse_covid_19_data/csse_covid_19_time_series/time_serie… 

 

The location data

Bringing up the data in Excel for a closer look, we can see that the first two columns give various granularities for the areas affected: either City -> State -> Country as in the US, Province -> Country as in China, or simply Country as in the case of South Korea. For countries, we also see that there are two cases: either the province is listed with the country name, as in Taiwan, or nothing is listed in the province column, as in Japan. UFL can certainly handle that with a few IF statements.

 

The CSV data itself is: 

Zhejiang,Mainland China
,South Korea
Taiwan,Taiwan
"King County, WA",US

Thus, we need to be careful about using commas to delimit fields. I won't go over all the details, but the interesting steps are:

1. How to detect that the line starts with a double quote. To do so, I use the following IF statement:

If (["\"(*)"] is Null) then

In other words, if grabbing the entry that starts with a double quote returns null, then the line does not start with a double quote.

2. Once I deal with the somewhat messy first two columns, I don't want to keep being reminded that there might be an extra comma to avoid, so I re-assign the __MESSAGE variable.

When I am not dealing with an extra comma:

__MESSAGE = ["*,*,(*)"]

And when I have one:

__MESSAGE = ["\"*\",*,(*)"]
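The two UFL patterns roughly correspond to the following regular expressions; this Python sketch (with made-up coordinate and count columns for illustration) shows what each wildcard and the `(*)` capture pick up:

```python
import re

# Unquoted first column: skip the first two comma-separated fields.
UNQUOTED = re.compile(r'^[^,"]*,[^,]*,(.*)$')
# Quoted first column: the field itself may contain commas.
QUOTED = re.compile(r'^"[^"]*",[^,]*,(.*)$')

def remaining_columns(line):
    """Return everything after the two location columns, like the re-assigned __MESSAGE."""
    pattern = QUOTED if line.startswith('"') else UNQUOTED
    return pattern.match(line).group(1)

print(remaining_columns('Zhejiang,Mainland China,30.5,120.1,1,2'))
print(remaining_columns('"King County, WA",US,47.6,-122.3,1,2'))
```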

 

Reassigning __MESSAGE means that the Latitude and Longitude columns are very easy to get:

NumValue = ["(*),*"]
StaticAttributes = Add("Latitude", NumValue)
NumValue = ["*,(*),*"]
StaticAttributes = Add("Longitude", NumValue)

I'm using a Number field to cast the values to a Float.

 

Timestamps in the header

Looking at the image above, we see that the timestamps are in the header. The UFL StoreEvents function has the following definition:

StoreEvents( TagNames, ElementAttributes, TimeStamp(s), Values[, Statuses, Questionables] )

While you can submit either one timestamp for all values or one timestamp per value, you cannot submit only one tag for all values. This means we will have to add the same tag name for each value column in the row:

FOREACH (CsvGetItem(["*,*,(*)"], ",")) DO
  IntValue = __ITEM
  Values = Add(IntValue)
  TagNames = Add(TagName)
  AttributeNames = Add("Confirmed")
ENDFOR

 

This allows us to store all the values contained in a single row with one call to StoreEvents.
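In other terms, the FOREACH builds parallel, equal-length lists by repeating the tag name once per value column; a Python sketch of the same idea (the tag name here is made up):

```python
def fan_out(tag_name, timestamps, row_values):
    """Repeat the tag name once per value so a StoreEvents-style call
    receives equal-length tag, timestamp, and value lists."""
    tags, values = [], []
    for value in row_values:
        tags.append(tag_name)   # same tag for every column in the row
        values.append(int(value))
    return tags, timestamps, values

tags, times, values = fan_out("Japan.Confirmed", ["2020-03-20", "2020-03-21"], ["963", "1007"])
print(tags)    # ['Japan.Confirmed', 'Japan.Confirmed']
print(values)  # [963, 1007]
```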

 

With UFL's ability to create an AF hierarchy, we can get the following structure:

And with countries with more layers, we can have the following:

 

With just this structure and no additional work, we can get going and visualize things in Esri using our integrator. And with the historical data in PI, we can go back in time and see how things have changed.

 

 

Doing a bit of AF work, we can see how the disease grows in individual regions once it gets a foothold in the country itself.

Customers making heavy use of PI Vision will soon end up with more than one PI Vision server, whether for production/testing or for different business units. At a quick glance, it may be difficult to tell which of those particular instances you find yourself on; hence the need to somehow tweak the page to indicate to users where they are.

Before we dive into this, some warnings:

1. This is not a supported edit! Tweaking the source code of PI Vision is an easy way to find yourself without a working PI Vision site. Even if it works, there are no guarantees that the upgrade process will be able to deal with these edits.

2. This is not an endorsement to go wild on editing the front page. We take a lot of pride in PI Vision; removing the PI Vision logo weakens our brand, and adding your own logos would confuse users as to what was created by OSIsoft.

Something that is currently supported is to simply add this information, along with logos, etc., to each individual display. This can be done by first creating a template display and creating all other displays by editing this template and then using "Save As".

And as always, if this is a need for you, please vote to make such edits a standard feature of PI Vision by registering your needs in UserVoice:

Configure the Color Theme for the PI Vision Chrome – User Feedback for OSIsoft Products and Services 

 

Here is the result we are going after:

 

 

Now onto this dangerous editing.

First, we need to know which file to edit. The Chrome web debugger allows us to easily see what files are loaded when you open the homepage or other displays. The file "pv-header-template.html" is the one we want.

 

We can also see where in the HTML file we want to add this content.

 

Knowing this, to get the above appearance, I added the code below to line 11 of "pv-header-template.html".

 

    <div style="background-color:white;"><h4 style="color:blue;">Production</h4><h4 style="color:red;margin-left:2em; display:inline-block;">Server</h4></div>