
PI Developers Club


A customer recently asked about filtering on multiple values of an attribute in AFSearch.  This is easily addressed with the IN search operator in a query string, or the equivalent AFSearchToken constructor that accepts an array of values.  There is one major caveat: the IN operator is not allowed for floating point types such as Single or Double, since binary floating point values are considered approximations rather than exact numbers.

 

There are a few tips to get developers pointed in the right direction.  First, values within the IN operator are delimited by semi-colons.  If you (like me) accidentally use a comma-delimited list, you will receive an odd error about a missing closing parenthesis.  If you then carefully inspect the query string, you (like me) will start pulling your hair out because there is no missing closing parenthesis to be found!  Hopefully you can benefit from my pain and quickly switch your delimiter to a semi-colon.

 

Another tip: while blanks are allowed within the IN operator - or more specifically, around the delimiters - any value that contains embedded blanks must be wrapped in quotes (single or double).  Consider this list of cities:

 

  • London
  • Paris
  • San Francisco

 

Obviously, San Francisco needs to be wrapped in quotes due to the blank inside the name.  If the attribute path is “|City”, then you could have this C# filter in your query string:

 

string uglyQuery = "|City:IN(London; Paris; \"San Francisco\")";

 

Though ... that is a wee bit ugly.  Besides being ugly, it might also confuse someone using a different language, for instance VB.NET.  A couple of weeks ago, I taught a class where a VB coder thought the unsightly \" was a delimiter for AFSearch.  I explained that it is simply how C# escapes double quotes.  The query looks much easier on the eyes if you use single quotes instead:

 

string prettyQuery = "|City:IN(London; Paris; 'San Francisco')";

 

And it avoids any needless confusion that \" is an AFSearch delimiter.

 

This is all fine and dandy, but how often do you hard-code values?  It keeps the example simple and straightforward, but what if the items you wish to filter on are in a list or array?  Let’s look at some code that helps us out.  After all, this is PI Developers Club, which demands code.  Strictly speaking, all we need is for the values to be in an enumerable collection.  Your source collection doesn’t have to contain strings, but keep in mind that we will eventually need to pass a string to the AFSearch query, or, if we use an AFSearchToken, an array of strings.

 

For our cities example, it makes sense that the city names are strings.  Let’s not take the easy way out and make it an array of strings.  Instead we will use a list so that we can see what we must do differently to make it all work together:

 

List<string> cities = new List<string>() { "London", "Paris", "San Francisco" };

 

A tiny bit of code is needed to put that inside a query string:

 

// A little LINQ to wrap single quotes around each city, and separate them with "; "
string delimitedCities = string.Join("; ", cities.Select(x => $"'{x}'"));

// An Interpolated String to substitute the delimitedCities.
string query = $"|City:IN({delimitedCities})";

 

The resulting value in query would be:

 

"|City:IN('London'; 'Paris'; 'San Francisco')"

 

Or if you prefer working with search tokens, this would be the equivalent:

 

// Tip: here you must pass a string array, so we must ToArray() on the cities list.
AFSearchToken token = new AFSearchToken(AFSearchFilter.Value, AFSearchOperator.In, cities.ToArray(), "|City");

 

If we had defined cities to be a string array, we would not need the ToArray(), but then this example would be boring and less educational.

 

What if our enumerable collection isn’t a bunch of strings?  Let’s say we have a bunch of GUIDs.  (Exactly how you got these GUIDs is an interesting question not addressed here; suffice it to say this example takes a collection of things that aren't strings and converts them to one that is.)  There is now an extra step: we must convert each value to a string.  Once we have a collection of strings, we can apply code similar to the previous examples.  Let’s imagine we have something like this:

 

IEnumerable<Guid> ids = GetListOfGuids();  // magically get a list of GUIDs

 

Or it could maybe have been:

 

IList<Guid> ids = GetListOfGuids();

 

Let's say we want to filter on an attribute path of “|ReferenceID”.  First let’s tackle the problem of converting a GUID into a string that is compatible with AFSearch.  This is easy enough thanks to LINQ:

 

// Nicest way to convert a GUID to a string compatible with AFSearch is to use Guid.ToString("B").
IEnumerable<string> idStrings = ids.Select(x => x.ToString("B"));
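For reference, the "B" format specifier wraps the GUID in braces.  A quick Python sketch (using the standard uuid module) of the same output shape:

```python
# Sketch of the string shape produced by .NET's Guid.ToString("B"):
# the GUID's hex digits wrapped in braces.
import uuid

def to_b_format(guid: uuid.UUID) -> str:
    """Mimic .NET's "B" format specifier: hex digits wrapped in braces."""
    return "{" + str(guid) + "}"

g = uuid.UUID("6b29fc40-ca47-1067-b31d-00dd010662da")
print(to_b_format(g))  # {6b29fc40-ca47-1067-b31d-00dd010662da}
```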

 

Okay, so now we have an enumerable collection of strings.  Using what we learned in previous examples, we can knock this out:

 

// A little LINQ to wrap single quotes around each string item, and separate them with "; "
string delimitedIds = string.Join("; ", idStrings.Select(x => $"'{x}'"));

// An Interpolated String to substitute the items.
string query = $"|ReferenceID:IN({delimitedIds})";

 

Fantastic.  If you prefer AFSearchTokens, that’s easy enough as well, but we do need to turn idStrings into a string array.

 

// Tip: here you must pass a string array, so we ToArray() on the idStrings collection.
AFSearchToken token = new AFSearchToken(AFSearchFilter.Value, AFSearchOperator.In, idStrings.ToArray(), "|ReferenceID");

 

Granted our example would have been simplified if we defined idStrings to be an array in the first place, but what fun would there be in that?

 

 

VB.NET Examples

 

Some of us supporting the PI Developers Club think there should be more VB.NET examples.  Towards that end, here are code snippets for the VB coders out there:

 

    Dim uglyQuery As String = "|City:IN(London; Paris; ""San Francisco"")"

    Dim prettyQuery As String = "|City:IN(London; Paris; 'San Francisco')"

    Dim cities As List(Of String) = New List(Of String) From {"London", "Paris", "San Francisco"}

 

Cities Query Example:

 

Query String

        ' A little LINQ to wrap single quotes around each city, and separate them with "; "
        Dim delimitedCities As String = String.Join("; ", cities.Select(Function(x) $"'{x}'"))

        ' An Interpolated String to substitute the delimitedCities.
        Dim query As String = $"|City:IN({delimitedCities})"

 

AFSearchToken

' Tip: here you must pass a string array, so we ToArray() on the cities list.
Dim token As AFSearchToken = New AFSearchToken(AFSearchFilter.Value, AFSearchOperator.In, cities.ToArray(), "|City")

 

Guids Example (or something that is not a collection of strings)

 

Query String

        Dim ids As IEnumerable(Of Guid) = GetListOfGuids()   ' magically get a list of GUID

        ' Nicest way to convert a GUID to a string compatible with AFSearch is to use Guid.ToString("B").
        Dim idStrings As IEnumerable(Of String) = ids.Select(Function(x) x.ToString("B"))

        ' A little LINQ to wrap single quotes around each string item, and separate them with "; "
        Dim delimitedIds As String = String.Join("; ", idStrings.Select(Function(x) $"'{x}'"))

        ' An Interpolated String to substitute the items.
        Dim query As String = $"|ReferenceID:IN({delimitedIds})"

 

AFSearchToken

        ' Tip: here you must pass a string array, so we ToArray() on the idStrings collection.
        Dim token As AFSearchToken = New AFSearchToken(AFSearchFilter.Value, AFSearchOperator.In, idStrings.ToArray(), "|ReferenceID")

 

Summary

 

If you want to filter on multiple values of an attribute, you need the IN search operator.

  • You may use IN in a query string or an AFSearchToken.
  • Values must be exact.  Therefore, IN does not work on Single or Double floating point data types.
  • In query strings
    • The semi-colon is the delimiter between IN values.  Example: "|City:IN(London; Paris; 'San Francisco')"
    • Values containing blanks must be enclosed in single or double quotes.  Example: "|City:IN(London; Paris; 'San Francisco')"
  • AFSearchToken
    • Values must be passed as strings in an array
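The query-string recipe above boils down to quote-and-join.  Here is a minimal Python sketch of the same string-building (build_in_query is a hypothetical helper, not part of the AF SDK):

```python
# Quote each value and join with "; " so the result is valid inside
# AFSearch's IN(...) operator; single quotes handle embedded blanks.
def build_in_query(attribute_path: str, values) -> str:
    delimited = "; ".join(f"'{v}'" for v in values)
    return f"{attribute_path}:IN({delimited})"

query = build_in_query("|City", ["London", "Paris", "San Francisco"])
print(query)  # |City:IN('London'; 'Paris'; 'San Francisco')
```

The same helper works for the GUID case: pass "|ReferenceID" and the braced GUID strings.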

I have been asked on many occasions “How can I find the first recorded event for a tag?”  The direct answer to this may be as brief as the question.  However, usually there is a lurking question-behind-the-question about what they really want to do with that first event, and if you dig even slightly deeper you will uncover the overall task they are trying to accomplish.  What you may ultimately discover is there is no need to find the first event.

 

Let’s start off with the direct, simple answer in the form of a C# extension method:

 

public static AFValue BeginningOfStream(this PIPoint tag) => tag.RecordedValue(AFTime.MinValue, AFRetrievalMode.After);

 

This works because the PI Point data reference implements the Rich Data Access (RDA) RecordedValue method.  The earliest timestamp to query is AFTime.MinValue, that is, midnight January 1, 1970 UTC.  Thanks to the AFRetrievalMode, you ask for the first value after January 1, 1970.  If it’s only the earliest recorded timestamp you are concerned with, you can use this extension method:

 

public static AFTime BeginningTimestamp(this PIPoint tag) => BeginningOfStream(tag).Timestamp;

 

For PI points, this would give you the BeginningOfStream method to go along with the built-in EndOfStream.  Before the advent of future data, the EndOfStream was simply the Snapshot.  But there are oddities related to future data, which required different handling of data compared to the traditional historical data.  Hence, Snapshot was replaced by CurrentValue, and EndOfStream was added.

 

An AFAttribute could have a BeginningOfStream method too, but it lacks the nice guarantees of PIPoint.  Everything depends upon the data reference being used and whether it supports the RDA Data.RecordedValue method, which is why you should check that it is supported before attempting to call it:

 

public static AFValue BeginningOfStream(this AFAttribute attribute)
{
    if (attribute.DataReferencePlugIn == null) // static attribute
    {
        return attribute.GetValue();
    }
    if (attribute.SupportedDataMethods.HasFlag(AFDataMethods.RecordedValue))
    {
        // Depends on how well the data reference PlugIn handles AFRetrievalMode.After.
        return attribute.Data.RecordedValue(AFTime.MinValue, AFRetrievalMode.After, desiredUOM: null);
    }
    // Fabricated answer.  Not the exact answer one is hoping for.
    return AFValue.CreateSystemStateValue(AFSystemStateCode.NoData, AFTime.MinValue);
}

 

Since the value being returned may be fabricated, I would be hesitant to include a BeginningTimestamp method, as it would mask the inaccuracies.  To compensate, further inspection of the returned value is needed, i.e. checking for “No Data”.  Such are the difficulties of trying to create a BeginningOfStream method within your code libraries.  This is why we begin to probe more and ask about your use case, or simply, “What are you really wanting to do?”

 

Virtually 100% of the people asking me how to find the first value in a stream want to find it for historical PI points only.  This greatly simplifies part of the problem because there is no need to be concerned with attributes or tags with future data.  Which brings us right back to the direct answer at the top, where you may be inclined to stop.  But if you take time to dig just a little deeper into what they are really doing, the true mission is revealed: they want to copy all historical data from an old tag to a new tag.  And trust me, there are several legitimate use-cases for doing this.

 

When I hear anything about copying of historical data, the first thought I have is “How much data is in the old tag?”  There are two particular answers that require the same special handling: (a) I don’t know, or (b) a lot.

 

The real problem they need to solve isn’t finding the earliest recorded timestamp.  Rather, they may have so much data that they bump into the ArcMaxCollect limitation (typically 1.5 million data values in a single data request).  There are programming ways around ArcMaxCollect (more below); they rely upon the PIPoint.RecordedValues method with a maxCount > 0 (for instance, 100K works well).  The perceived issue of knowing the earliest timestamp becomes moot.  The more important date is the end time, that is, the switch-over date from the old tag to the new tag.  Depending upon how the old tag was populated, this may very well be the EndOfStream.  But if there is a chance that the old tag could still be receiving “InterfaceShut” or “IOTimeout” messages, you will need to explicitly specify the end time.  Worrying about the earliest recorded date has been a distraction from solving the real problem.
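The paging idea is simple: request at most maxCount events at a time, then restart the next request just after the last timestamp received.  A hedged Python sketch with a stand-in fetch function (fetch_recorded is hypothetical, not an AF SDK call; timestamps are plain integers here, whereas real PI timestamps can repeat, so production code pages "at or after" the last timestamp and de-duplicates):

```python
# Sketch of paging around a per-call event limit such as ArcMaxCollect.
def copy_in_pages(fetch_recorded, start, end, max_count=100_000):
    """Pull events in pages of at most max_count, restarting each request
    just after the last timestamp already received."""
    copied = []
    cursor = start
    while True:
        page = fetch_recorded(cursor, end, max_count)  # list of (ts, value)
        if not page:
            break
        copied.extend(page)
        if len(page) < max_count:
            break  # the source had no more than a page left
        cursor = page[-1][0] + 1  # advance past the last event
    return copied

# A fake archive with one event per integer timestamp (value = ts * 10):
def fake_fetch(start, end, max_count):
    return [(t, t * 10) for t in range(start, end + 1)][:max_count]

events = copy_in_pages(fake_fetch, 0, 9, max_count=4)
print(len(events))  # 10
```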

 

What of your start time?  An in-house developer should know the earliest start of their company's archive files.  A contracted developer could use AFTime.MinValue, or go with a later but still much safer date such as “1/1/1980”.  Which brings us back to what they really want to do: copy large or unknown amounts of data.  This has been blogged about many times before:

 

Extracting large event counts from the PI Data Archive

 

PI DataPipe Events Subscription and Data Access Utility using AF SDK - PIEventsNovo

 

GetLargeRecordedValues - working around ArcMaxCollect

The 2.10 release of AF SDK unveiled DisplayDigits as a property you can investigate and set on an AF attribute. Let’s do a quick investigation of what this property entails.

 

Further, there is a new .DisplayValue() method on the AFValue object for rendering single and double-precision floating point numbers.

 

What DisplayDigits Is

DisplayDigits is a setting you can place on floating-point number tags to convey how much precision downstream applications should display on-screen.

 

pse_example.png

Valid settings are any integer from -20 up to 10.

 

What it’s used for

Most of your floating-point data will be expressed in either 32-bit or 64-bit floating point numbers. This allows a large variety of decimal expression with varying degrees of precision. Controlling how much of that precision you get back is what DisplayDigits is for.

Numerical precision

Consider the floating-point data types in the PI Data Archive…

Type       Float16        Float32                  Float64
Minimum    Zero of tag    1.175494351 E-38         2.2250738585072014 E-308
Maximum    Span of tag    3.402823466 E+38         1.7976931348623158 E+308
Exponent   -              8 bits                   11 bits
Mantissa   -              23 bits                  52 bits
Accuracy   n/a            7 digits                 15 digits

Floating-point data types represent a decimal number with increasing levels of precision given the number of bits used.¹

Let’s experiment:

The SINUSOID tag on your Data Archive by default is stored as a 32-bit floating-point number. Let’s investigate what all the possible DisplayDigits settings exposed in AF SDK reveal.

First you will need an AF database with an attribute mapped to the SINUSOID tag on your PI Data Archive. The DisplayDigits property hangs off the AFAttribute class, not off PIPoint.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using OSIsoft.AF;
using OSIsoft.AF.Asset;
using OSIsoft.AF.Data;
using OSIsoft.AF.PI;


namespace DisplayDigitsExample
{
    class Program
    {
        static void Main(string[] args)
        {


            // Let's get the server
            PISystems pisystems = new PISystems();


            // Get an injection well asset
            AFElement well = pisystems["CLSAF"].Databases["TransformedWells"]
                .Elements["Injection Wells By Contractor"]
                .Elements["Paisano Transmission And Service Inc"]
                .Elements["Site1.F100_IW01"];


            // Get the line pressure
            AFAttribute linepressure = well.Attributes["Line Pressure"];
            
            Console.WriteLine($"Line Pressure uses PI tag {linepressure.DataReference.ConfigString}");
            Console.WriteLine($"Present Value: {linepressure.GetValue().ValueAsSingle()}");
            Console.WriteLine($"Display Digits Setting: {linepressure.DisplayDigits}");


            // Display all the DisplayDigits settings for this PI value
            for (int i = 10; i >= -20; i--)
            {
                Console.WriteLine($"Value with DisplayDigits Set to {i}: {linepressure.GetValue().DisplayValue(i, null, true)}");
            }


            Console.ReadLine();


        }
    }
}

 

Here's the same example in VB.Net...

 

Imports OSIsoft.AF
Imports OSIsoft.AF.Data


Module Module1
    Sub Main()
        Dim piServers = New PISystems


        Dim wellsDB = piServers("CLSAF").Databases("TransformedWells")


        Dim well = wellsDB.Elements("Injection Wells By Contractor") _
            .Elements("Paisano Transmission And Service Inc") _
            .Elements("Site1.F100_IW01")


        Dim linePressure = well.Attributes("Line Pressure")


        Console.WriteLine("Line pressure uses PItag " & linePressure.DataReference.ConfigString)
        Console.WriteLine("Present value: " & linePressure.GetValue().ValueAsSingle())
        Console.WriteLine("Display Digits Setting: " & linePressure.DisplayDigits)


        For i = 10 To -20 Step -1
            Console.WriteLine($"Value with DisplayDigits Set to {i}: {linePressure.GetValue().DisplayValue(i, Nothing, True)}")
        Next
        Console.ReadLine()
    End Sub
End Module

 

This yields the following:

img

 

So what’s with the negative and positive DisplayDigits?

After you run some example code against your own floating point data, you will notice how DisplayDigits applies zero-padding and rounding.

Positive DisplayDigits settings force a floating point number to be returned with a fixed number of digits to the right of the decimal point (up to 10), including padding zeroes (0) where necessary.

If the number has more significant digits than what you specified, it may be rounded or truncated. For instance, if the stored value is 63.90804 but DisplayDigits is set to 2, the number is returned as 63.91.

A negative DisplayDigits setting instead determines the number of significant digits to display, with no zero padding.  If rounding occurs, the number of digits displayed may be fewer than the setting.
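The two behaviors can be imitated with ordinary format specifiers. A Python sketch of the same arithmetic (illustration only, not the AF SDK's implementation):

```python
# Positive settings: fixed decimal places, zero-padded.
# Negative settings: significant digits, no padding.
def display_value(value: float, display_digits: int) -> str:
    """Mimic the described DisplayDigits behavior with format specifiers."""
    if display_digits >= 0:
        return f"{value:.{display_digits}f}"
    return f"{value:.{-display_digits}g}"

print(display_value(63.90804, 2))   # 63.91   (rounded to 2 decimal places)
print(display_value(63.90804, -3))  # 63.9    (3 significant digits)
print(display_value(63.9, 4))       # 63.9000 (zero-padded)
```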

Why do I see more digits when I inspect .Value than when I use -7 or -15 DisplayDigits?

Floating-point types are inherently imprecise: they reflect how your CPU represents floating-point values, and the binary representation of a number is usually not exact.

That’s why you should use the new DisplayValue() method that’s been added to the AFValue object.

Footnotes

1 The PI Data Archive uses 32-bit single precision and 64-bit double precision, as well as a 16-bit type (float16) whose range is determined by the tag's zero and span.

Note: Development and Testing purposes only. Not supported in production environments.

 

Link to other containerization articles

Containerization Hub

 

Introduction

During PI World 2018, there was a request for a PI Analysis Service container. The user wanted to be able to spin up multiple PI Analysis Service containers to balance the load during periods with a lot of backfilling to do. Unfortunately, this is limited by the fact that each AF Server can have exactly one PI Analysis Service instance running its analytics. But this has not discouraged me from making a PI Analysis Service container to add to our PI System compose architecture!

 

Features of this container include:

1. Ability to test the presence of AF Server so that set up won't fail

2. Simple configuration. The only thing you need to change is the host name of the AF Server container that you will be using.

3. Speed. Build and set up takes less than 4 minutes in total.

4. Buffering ability. Data will be held in the buffer when connection to target PI Data Archive goes down. (Added 13 Jun 2018)

 

Prerequisite

You will need to be running the AF Server container since PI Analysis Service stores its run-time settings in the AF Server. You can get one from Spin up AF Server container (SQL Server included).

 

Procedure

1. Gather the install kits from the Techsupport website. AF Services

2. Gather the scripts and files from GitHub - elee3/PI-Analysis-Service-container-build.

3. Your folder should now look like this.

4. Run build.bat with the hostname of your AF Server container.

build.bat <AF Server container hostname>

5. Now you can execute the following to create the container.

docker run -it -h <DNS hostname> --name <container name> pias

 

That's all you need to do! Now when you connect to the AF Server container with PI System Explorer, you will notice that the AF Server is now enabled for asset analysis. (originally, it wasn't enabled)

 

Conclusion

By running this PI Analysis Service container, you can now configure asset analytics for your AF Server container to produce value added calculated streams from your raw data streams. I will be including this service in the Docker Compose PI System architecture so that you can run everything with just one command.

1. Introduction

Every day, more and more customers contact us asking how PI could be used to leverage their GIS data and how their geospatial information could be used in PI. Our answer is the PI Integrator for Esri ArcGIS. If your operation involves any sort of georeferenced data, geofencing, or any kind of geospatial data, I encourage you to take a look at what the PI Integrator for Esri ArcGIS is capable of. But this is PI Developers Club, a haven for DIY PI nerds and curious data-driven minds. So, is it possible to create a custom data reference that provides access to some GIS data and functionality? Let's do it using an almost-real-life example.

 

2018-06-07 11_38_41-pisquare - QGIS.png

1.1 Scenario

The manager of a mining company has to monitor some trucks that operate at the northmost deposit of their open-pit mine. Due to recent rains, their geotechnical engineering team has mapped an unsafe area that should have no more than three trucks inside of it. They have also provided a shapefile with a polygon delimiting a control zone (you can download the shapefile at the end of this post). The manager wants to be notified whenever the number of trucks inside the control area is above a given limit.

 

relationship.png

Caveat lector: I'm not a mining engineer, so please excuse any inaccuracy or misrepresentation of the operations at a zinc mine. It's also important to state that the mine I'm using as an example has no relation to this blog post or to the data I'm using.

 

1.2 Premises

If you are familiar with GIS data, you know it's an endless world of file formats, coordinate systems, and geodetic models. Unless you have a full-featured GIS platform, it's very complicated to handle all possible combinations of data characteristics. So, for the sake of simplicity, this article uses Esri's Shapefile as its data repository and EPSG:4326 as our coordinate system.

 

1.3 A Note on CDRs

As the name implies, a CDR should be used to get data from an external data source that we don't support out of the box. Simple calculations can be performed, but exercise caution: depending on how intensive your mathematical operations are, a CDR can degrade the performance of any analysis that uses it. For our example, shapefiles, GeoJSON files, and GeoPackages can be treated as standalone data source files (they contain the geographic information within them), and the math behind our methods is simple enough that it won't affect server performance.

 

1.4 The AF Structure

Following the diagram in 1.1, our AF structure is pretty simple: a Zinc Mine element with Trucks as children. The mine element has three attributes: (a) the number of trucks inside the control area (a roll-up analysis), (b) the maximum number of trucks allowed in the control area (a static attribute), and (c) the control area itself.

 

2018-06-06 09_37_25-__RBORGES-AF_GISToolBox - PI System Explorer.png

 

The control area is a static attribute with several supporting child attributes holding the files that make up the shapefile. Per the shapefile specification, the SHP file must be accompanied by its three companion files.

 

2018-06-06 09_39_00-__RBORGES-AF_GISToolBox - PI System Explorer.png

 

Finally, the truck element has two historical attributes for its position, and one using our CDR to tell whether it is currently inside the control area (this is the attribute used by the roll-up analysis at the Zinc Mine element). Here I'm using latitude and longitude as separate attributes, but if you have AF 2017 R2 or newer, I encourage you to store this data as a Location attribute trait.

 

2018-06-06 16_55_42-__RBORGES-AF_GISToolBox - PI System Explorer.png

 

 

2. The GISToolBox Data Reference

The best way to present a new CDR is by showing its config string:

 

Shape=..\|control area;Method=IsInside;Latitude=Latitude;Longitude=Longitude

 

Breaking it down: Shape is the attribute that holds the shapefile and its supporting files. It's actually just a string with an arbitrary name; what matters are the child attributes underneath it, which are file attributes holding the shape files. Method is the method we want to execute. Latitude and Longitude are self-explanatory, and they should also point to attributes. If you don't provide lat/long attributes, the CDR will use the Location attribute trait defined for the element. There are also two other parameters that I will present later.
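For clarity, the config string is just semicolon-separated key=value pairs. A quick Python sketch of how such a string breaks down (illustration only, not the CDR's actual parser):

```python
# Split a "Key=Value;Key=Value" config string into a dictionary.
def parse_config(config: str) -> dict:
    return dict(pair.split("=", 1) for pair in config.split(";"))

cfg = parse_config("Shape=..\\|control area;Method=IsInside;"
                   "Latitude=Latitude;Longitude=Longitude")
print(cfg["Method"])  # IsInside
```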

 

The code is available here and I encourage you to go through it and read the comments. If you want to learn how to create a custom data reference, please check the useful links section at the end of this post.

 

2.1 Dataflow

The CDR starts by overriding the GetInputs method, where we use the values passed in the config string to get the proper attributes. Pay close attention to the way the shapefile is organized: there are child attributes holding the files (these child attributes are AFFiles). Once this is done, GetValue is called. It starts by downloading the shapefile from the AF Server to a local temporary folder and creating a Shapefile object. Although Esri's specification is open, I'm using DotSpatial to handle the files and all the spatial analysis we do. Once we have the shapefile, it goes through some verifications, and we finally call the method that gets the data we want: GISGelper.EvaluateFunction(). For performance reasons, I'm also overriding the GetValues method, because we don't need to recreate the files for every iteration over the AFValues array sent by the SDK.

 

2.2 Available Methods

Taking into account what I mentioned in 1.3, we should not create sophisticated calculations, so the CDR doesn't slow down the analysis engine. To keep things simple and fast, I have implemented the following methods:

  • IsInside: Determines whether a coordinate is inside a polygon in the shapefile. If your shapefile contains several polygons, it will check all of them. Output: 1 if inside, 0 if outside. (inside.png)
  • IsOutside: Determines whether a coordinate is outside a polygon in the shapefile. If your shapefile contains several polygons, it will check all of them. Output: 1 if outside, 0 if inside. (outside.png)
  • MinDistance: Determines the minimum distance from a coordinate to a polygon in the shapefile. If your shapefile contains several polygons, it will check all of them and return the shortest distance. Output: a double with the distance in the units defined by the CRS in use. (mindist.png)
  • CentroidDistance: Determines the distance from a coordinate to a polygon's centroid in the shapefile. If your shapefile contains several polygons, it will check all of them and return the shortest distance. Output: a double with the distance in the units defined by the CRS in use. (centdist.png)
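The geometry behind IsInside/IsOutside is the classic point-in-polygon test. A hedged Python sketch of the ray-casting version (in the CDR itself, DotSpatial performs this check):

```python
# Ray-casting point-in-polygon test: a point is inside if a horizontal
# ray from it crosses the polygon boundary an odd number of times.
def is_inside(lon: float, lat: float, polygon) -> bool:
    """polygon is a list of (x, y) vertices in order."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > lat) != (y2 > lat):  # edge straddles the ray's latitude
            x_cross = x1 + (lat - y1) * (x2 - x1) / (y2 - y1)
            if lon < x_cross:
                inside = not inside
    return inside

square = [(0, 0), (4, 0), (4, 4), (0, 4)]
print(is_inside(2, 2, square))  # True
print(is_inside(5, 2, square))  # False
```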

 

2.3 CRS Conversion

The GISToolbox assumes that the lat/long coordinates and the shapefile use the same CRS. If your coordinates use a different base from your shapefile, you can use two other configuration parameters (FromEPSGCode and ToEPSGCode) to convert the coordinates to the CRS used by the shapefile.

 

Let's say you have a shapefile using EPSG:4326, but your GPS data comes on EPSG:3857. For this case, you can use:

Shape=..\|control area;Method=IsInside;Latitude=Latitude;Longitude=Longitude;FromEPSGCode=3857;ToEPSGCode=4326
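For intuition, converting EPSG:3857 (Web Mercator meters) to EPSG:4326 (degrees) is the inverse spherical Mercator projection. A Python sketch of that one conversion (the CDR delegates real reprojection to DotSpatial):

```python
import math

R = 6378137.0  # spherical Mercator earth radius in meters

def mercator_to_wgs84(x: float, y: float):
    """Convert EPSG:3857 meters to EPSG:4326 (longitude, latitude) degrees."""
    lon = math.degrees(x / R)
    lat = math.degrees(2 * math.atan(math.exp(y / R)) - math.pi / 2)
    return lon, lat

print(mercator_to_wgs84(0, 0))  # (0.0, 0.0)
```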

 

2.4 Limitations

  • It doesn't implement an AF data pipe, so it can only be used with periodic (not event-triggered) analyses.
  • It handles temporary files, so the user running your AF services must have read/write permissions on the temporary folder.
  • It only supports EPSG coordinate reference systems.
  • It only supports shapefiles.

 

3. Demo

Let's go back to our manager who needs to monitor the trucks inside that specific control area.

 

3.2 Truck Simulation

In order to make our demo more realistic, I have created a small simulation. You can download its files at the end of this post (Trucks_4326.zip). Here's a gif showing the vehicles' positions:

 

gismap.gif

 

The trucks start outside of the control area and they slowly move towards it. Here's a table showing if a given truck is inside the polygon at a specific timestamp:

TS   Truck 001   Truck 002   Truck 003   Truck 004   Total
0    0           0           0           0           0
1    0           1           0           0           1
2    0           1           1           0           2
3    0           1           1           0           2
4    1           1           1           0           3
5    1           1           1           1           4
6    1           1           1           1           4
7    1           1           0           1           3
8    1           1           0           0           2

 

The simulation continues until the 14ᵗʰ iteration, but note how the limit is exceeded at timestamp 5, so we should get a notification right after entering the 5ᵗʰ iteration.

 

3.3 Notification

The notification is dead simple: every 4 seconds, I check the Active Trucks attribute against the maximum allowed. As I mentioned before, Active Trucks is a roll-up counting the IsInside attribute of each truck.

 

2018-06-07 16_05_09-__RBORGES-AF_GISToolBox - PI System Explorer.png

 

Shall we see it in action?

 

notif.gif

Et voilà!

 

The simulation files are available at the end of this post. Feel free to download and explore it.

 

4. Conclusion

This proof of concept demonstrates how powerful a custom data reference can be. Of course, it doesn't come close to what the PI Integrator for Esri ArcGIS is capable of, but it shows that for simple tasks we can mimic functionality from bigger platforms, and it can serve as an alternative while a more robust platform is not available.

 

If you like this topic and think that AF should have some basic support to GIS, please chime in on the user voice entry I've created to collect ideas from you.

Introduction

 

This is a MATLAB toolbox that integrates the PI System with MATLAB through PI Web API. With this toolbox you can retrieve PI data without having to generate the URL for each request. This version was developed on top of the PI Web API 2018 Swagger specification.

 

In the upcoming 2018 release, PI Asset Analytics will introduce native connectivity to MATLAB, enabling users to schedule and run their MATLAB functions fully integrated into their analyses. In other words, you will be able to integrate the PI System with MATLAB in production using a model that you have already built. This toolbox was developed so you can create new models with PI System data before using them in production.

 

Requirements

 

  • PI Web API 2018 installed within your domain using Kerberos or Basic Authentication. If you are using an older version, some methods might not work.
  • MATLAB 2018a+

 

GitHub repository

 

Please visit the GitHub repository for this project where you can find its source code, download the toolbox and read about the new features added.

 

Installation

 

This MATLAB toolbox is not available on the MATLAB Central servers. You should download it directly from the releases section of this GitHub repository.

 

Please use the command below to install the toolbox:

 

matlab.addons.toolbox.installToolbox('piwebapi.mltbx')

 

If the installation is successful, you should see this toolbox inside matlab.addons.toolbox.installedToolboxes:

 

toolboxes = matlab.addons.toolbox.installedToolboxes;

 

If you want to uninstall this toolbox, use the command below:

 

matlab.addons.toolbox.uninstallToolbox(toolboxes(i))

 

 

Documentation

All the methods and classes from this MATLAB Toolbox are described in its documentation, which can be opened by typing in the console:

 

demo toolbox 'PI Web API client library for Matlab'

 

Notes

 

It is highly recommended to turn debug mode on if you are using PI Web API 2017 R2+, in order to receive more detailed exception errors. You can do this by creating or editing the DebugMode attribute under the System Configuration element and setting its value to TRUE.

 

Examples

 

Please refer to the following examples to understand how to use this library:

 

Create an instance of the piwebapi top level object using Basic authentication.

 

username = 'webapiuser';
useKerberos = false;
password = 'password';
baseUrl = 'https://devdata.osisoft.com/piwebapi';
verifySsl = false;
client = piwebapi(baseUrl, useKerberos, username, password, verifySsl);

 

Only Basic authentication is available in this initial version. Please make sure to set up PI Web API properly to make it compatible with this authentication method.

If you are having issues with your SSL certificate and you want to ignore this error, set verifySsl to false.

 

Retrieve data from the home PI Web API endpoint

 

pilanding = client.home.get();

 

Get the PI Data Archive object

 

dataServer = client.dataServer.getByName(serverName);

 

Get the PI Point, AF Element and AF Attribute objects

 

point = client.point.getByPath("\\PISRV1\sinusoid");
attribute = client.attribute.getByPath("\\PISRV1\Universities\UC Davis\Buildings\Academic Surge Building\Electricity|Demand");
element = client.element.getByPath("\\PISRV1\Universities\UC Davis\Buildings\Academic Surge Building\Electricity"); 

 

Get recorded, interpolated and plot values from a stream

 

webId = point.WebId;
startTime = "*-10d";
endTime = "*";
interval = "1h";
intervals = 30;
maxCount = 100;
desiredUnits = '';
selectedFields = '';
associations = '';
boundaryType = '';
filterExpression = '';
includeFilteredValues = '';


recordedValues = client.stream.getRecorded(webId, associations, boundaryType, desiredUnits, endTime, filterExpression, includeFilteredValues, maxCount, selectedFields, startTime);
interpolatedValues = client.stream.getInterpolated(webId, desiredUnits, endTime, filterExpression, includeFilteredValues, interval, selectedFields, startTime);
plotValues = client.stream.getPlot(webId, desiredUnits, endTime, intervals, selectedFields, startTime);

 

Get recorded, interpolated and plot values from a streamset in bulk

 

sortField = '';
sortOrder= '';

webIds = { point1.WebId, point2.WebId, point3.WebId, attribute.WebId};

recordedValuesInBulk = client.streamSet.getRecordedAdHoc(webIds, boundaryType, endTime, filterExpression, includeFilteredValues, maxCount, selectedFields, sortField, sortOrder, startTime);
interpolatedValuesInBulk = client.streamSet.getInterpolatedAdHoc(webIds, endTime, filterExpression, includeFilteredValues, interval, selectedFields, sortField, sortOrder, startTime);
plotValuesInBulk = client.streamSet.getPlotAdHoc(webIds, endTime, intervals, selectedFields, sortField, sortOrder, startTime);

 

 

Conclusion

 

If you want to create models to be used in production, please use this library and let me know if it works fine. You can send me an e-mail directly (mloeff@osisoft.com) or leave a comment below.

Note: Development and Testing purposes only. Not supported in production environments.

 

Link to other containerization articles

Containerization Hub

 

Introduction

In one of my previous blog posts, I spun up an AF Server container using local accounts for authentication. For non-production purposes, this is fine. But since Kerberos is the authentication method that we recommend, I would like to show you that it is also possible to use Kerberos authentication with the AF Server container. To do this, you will have to involve a domain administrator, since a Group Managed Service Account (gMSA) will need to be created. Think of a gMSA as a usable version of the Managed Service Account; a single gMSA can be used on multiple hosts. For more details about gMSAs, you can refer to this article: Group Managed Service Accounts Overview

 

Prerequisite

You will need the AF Server image from this blog post.

Spin up AF Server container (SQL Server included)

 

Procedure

1. Request a gMSA from your domain administrator. The steps are listed here.

Add-KDSRootKey -EffectiveTime (Get-Date).AddHours(-10) #Best is to wait 10 hours after running this command to make sure that all domain controllers have replicated before proceeding
Add-WindowsFeature RSAT-AD-PowerShell
New-ADServiceAccount -name <name> -DNSHostName <dnshostname> -PrincipalsAllowedToRetrieveManagedPassword <containerhostname> -ServicePrincipalNames "AFServer/<name>"

2. Once you have the gMSA, you can proceed to install it on your container host.

Install-ADServiceAccount <name>

3. Test that the gMSA is working. You should get a return value of True.

Test-ADServiceAccount <name>

4. Get script to create AF Server container with Kerberos.

Invoke-WebRequest "https://raw.githubusercontent.com/elee3/AF-Server-container-build/master/New-KerberosAFServer.ps1" -UseBasicParsing -OutFile New-KerberosAFServer.ps1

5. Create a new AF Server container

.\New-KerberosAFServer.ps1 -ContainerName <containername> -AccountName <name>

 

Usage

Now you can open up PI System Explorer on your container host and connect to your containerized AF Server using the <name> parameter from the Procedure section. On the very first connection, you should connect as the afadmin user (password: qwert123!) so that you can set up mappings for your domain accounts; otherwise, your domain accounts will only have 'World' permissions. After you set up your mappings, you can either delete the afadmin user or keep it. With the mappings for your domain account created, you can now disconnect from your AF Server and reconnect with Kerberos authentication. From now on, you do not need explicit logins for your AF Server anymore!

 

Conclusion

We can see that security is not a limitation when it comes to using an AF Server container. It is just more involved to set up and requires the intervention of a domain administrator. However, it removes the need for local accounts for authentication, which is definitely a step towards using the AF Server container in production. In future posts I will show how to overcome other container limitations, such as giving containers a static IP and the ability to communicate outside the host.

Enhancing the Hello World example from Part 1

 

Now we're going to take the Line Pressure and Tubing Pressure AF Elements that were under the gas wells from our example database and convert those Elements into Attributes that live directly under the wells.  This is pretty easy!

 

 

The additions to the AF Transformer XML file are straightforward. First, the Measures template defines the two AF Elements that hold the values for each sensor. Since these elements are always found underneath each well, we can enhance the search for the wells to bring in the Line Pressure and Tubing Pressure child elements, like this:

 

...
<SearchShapes>
  <!-- Specify the elements and attributes of the search pattern that AF Transformer uses to search the source PI AF database -->
  <Shape ID="1001" Required="true" FilterMatchType="Any" ShapeWalkType="TopBottom">
    <ShapeElements>
      <ShapeElement ID="1" Required="true" FilterMatchType="Any" MaxDepthFromParent="1" IsEntryPoint="false">
        <Filters>
          <ElementFilter Category="" Template="Facility" Name="*" />
        </Filters>
        <Attributes>
          <ShapeAttribute ID="10" Required="true" FilterMatchType="Any">
            <Filters>
              <AttributeFilter Category="" Template="" Name="Contractor Name" />
            </Filters>
          </ShapeAttribute>
          <ShapeAttribute ID="11" Required="true" FilterMatchType="Any">
            <Filters>
              <AttributeFilter Category="" Template="" Name="Description" />
            </Filters>
          </ShapeAttribute>
        </Attributes>
        <ShapeElements>
          <!-- Line Pressure AF Element -->
          <ShapeElement ID="2" Required="true" FilterMatchType="Any" MaxDepthFromParent="1" IsEntryPoint="false">
            <Filters>
              <ElementFilter Category="" Template="Measures" Name="*PLN" />
            </Filters>
            <Attributes>
              <ShapeAttribute ID="20" Required="true" FilterMatchType="Any">
                <Filters>
                  <AttributeFilter Category="" Template="" Name="Value" />
                </Filters>
              </ShapeAttribute>
            </Attributes>
          </ShapeElement>
          <!-- End of Line Pressure AF Element -->
          <!-- Tubing Pressure AF Element -->
          <ShapeElement ID="3" Required="true" FilterMatchType="Any" MaxDepthFromParent="1" IsEntryPoint="false">
            <Filters>
              <ElementFilter Category="" Template="Measures" Name="*PTUB" />
            </Filters>
            <Attributes>
              <ShapeAttribute ID="30" Required="true" FilterMatchType="Any">
                <Filters>
                  <AttributeFilter Category="" Template="" Name="Value" />
                </Filters>
              </ShapeAttribute>
            </Attributes>
          </ShapeElement>
          <!-- End of Tubing Pressure AF Element -->
        </ShapeElements>
      </ShapeElement>
    </ShapeElements>
  </Shape>
</SearchShapes>
...

 

 

And then we just add the line pressure and tubing pressure attributes, referring back to the two elements:

 

...
<OutputElementShapes>
  <!-- Create and populate the asset model in the destination database -->
  <OutputElementShape ID="1003">
    <Elements>
      <Element Name="Injection Wells by Contractor" Guid="" ReadOnly="false" Template="">
        <Elements>
          <Element Name="[10.Value]" Guid="" Description="Contractor" ReadOnly="false" Template="">
            <Elements>
              <Element Name="[1.Name]">
                <Attributes>
                  <Attribute Name="Description" Value="[11.Value]" />
                  <Attribute Name="Line Pressure" Value="[20.DataReference]" />
                  <Attribute Name="Tubing Pressure" Value="[30.DataReference]" />
                </Attributes>
              </Element>
            </Elements>
          </Element>
        </Elements>
      </Element>
    </Elements>
  </OutputElementShape>
</OutputElementShapes>
...

 

Make sure that when you're using PI Point data references (as well as AF table lookups) you use the .DataReference reference and not the .Value reference; otherwise you'll be pulling across the snapshot values of those attributes at the time you run AF Transformer.

Note: Development and Testing purposes only. Not supported in production environments.

 

Link to other containerization articles

Containerization Hub

 

Introduction

In this blog post, I will be giving an overview of how to use Docker Compose to create a PI System compose architecture that you can use for

 

1. Learning PI System development

2. Running your unit tests with a clean PI System

3. Compiling your AF Client code

4. Exploring PI Web API structure

5. Testing out Asset Analytics syntax

6. Other use cases that I haven't thought of (post in the comments!)

 

What is Compose?

It is a tool for defining and running multi-container Docker applications. With Compose, you use a single file to configure your application’s services. Then, with a single command, you create and start all the services from your configuration. It is both easy and convenient.

 

Setup images

The setup involved is simple. You can refer to my previous blog posts to set up these images. Docker setup instructions can be found in the Containerization Hub link above.

Spin up PI Web API container (AF Server included)

Spin up PI Data Archive container

Spin up AF Client container

Spin up PI Analysis Service container

 

Compose setup

In PowerShell (run as administrator), execute these commands:

 

[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
Invoke-WebRequest "https://github.com/docker/compose/releases/download/1.21.2/docker-compose-Windows-x86_64.exe" -UseBasicParsing -OutFile $Env:ProgramFiles\docker\docker-compose.exe

 

Obtain Compose file from docker-compose.yml. Place it on your desktop.

 

Deployment

Open a command prompt and navigate to your desktop. Enter

docker-compose up

 

Wait until the screen shows

Once you see that, you can close the window. Your PI System architecture is now up and running!

 

Usage

There are various things you can try out. If you are experiencing networking issues between the containers, turn off the firewall for the Public Profile on your container host.

 

1. You can try browsing the PI Web API structure by using this URL (https://eleeaf/piwebapi) in your web browser. When prompted for credentials, you can use

username: afadmin

password: qwert123!
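If you prefer to hit the endpoint from code instead of the browser, the same Basic-auth request can be sketched in Python using only the standard library. The hostname and credentials are the ones from this walkthrough; the request is built but not sent, since sending it requires the container to be reachable and its self-signed certificate to be accepted:

```python
import base64
import urllib.request

# Build (but do not send) a Basic-auth request against the containerized
# PI Web API endpoint used in this walkthrough.
url = "https://eleeaf/piwebapi"
credentials = base64.b64encode(b"afadmin:qwert123!").decode("ascii")
req = urllib.request.Request(url, headers={"Authorization": "Basic " + credentials})

print(req.get_full_url())
print(req.get_header("Authorization"))

# To actually send it, you would need an SSL context that tolerates the
# container's self-signed certificate, e.g.:
# import ssl
# resp = urllib.request.urlopen(req, context=ssl._create_unverified_context())
```

The same header is what your browser constructs for you when you type the username and password into the prompt.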

 

2. Test network connectivity from client container to the PI Data Archive and AF Server by running

docker exec -it desktop_client_1 afs

The hostname of the AF Server is eleeaf. When prompted to use NTLM, enter q. The hostname of the PI Data Archive is eleepi. You should see the following results.

 

3. You can install PI System Management Tools on your container host and connect to the PI Data Archive via the IP address of the container. For some reason, PI SMT doesn't let you connect by hostname.
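One way to find that IP address is `docker inspect`, which prints the container's network settings as JSON. The small Python helper below pulls the address out (a sketch; the container name and the `nat` network in the sample payload are examples, not values from this walkthrough):

```python
import json
import subprocess

def container_ip(inspect_json):
    """Extract the first IP address from `docker inspect <container>` output."""
    data = json.loads(inspect_json)  # docker inspect returns a JSON array
    networks = data[0]["NetworkSettings"]["Networks"]
    return next(net["IPAddress"] for net in networks.values())

def inspect_and_get_ip(container_name):
    # Shells out to docker; requires the docker CLI on the container host.
    out = subprocess.check_output(["docker", "inspect", container_name])
    return container_ip(out)

# Example with a trimmed-down inspect payload:
sample = '[{"NetworkSettings": {"Networks": {"nat": {"IPAddress": "172.28.1.5"}}}}]'
print(container_ip(sample))  # 172.28.1.5
```

You would call `inspect_and_get_ip("desktop_pi_1")` (or whatever name docker-compose assigned) and paste the result into PI SMT.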

 

4. You can also install PI System Explorer and connect to the AF Server to create new databases.

 

5. You can try compiling some open source AF SDK code found in our Github repository using the AF Client container. (so that you do not have to install Visual Studio)

 

6. You can use PI System Explorer to experiment with some Asset Analytics equations that you have in mind to check if they are valid.

 

Destroy

Once you are done with the environment, you can destroy it with

docker-compose down

 

Limitations

This example does not persist data or configuration between runs of the container.

These applications do not yet support upgrade of container without re-initialization of the data.

This example relies on PI Data Archive trusts and local accounts for authentication.

AF Server, PI Web API, and SQL Express are all combined in a single container. 

 

Conclusion

Notice how easy it is to set up a PI System compose architecture. You can do this in less than 10 minutes; no more waiting hours to install a PI System for testing and development.

The current environment contains the PI Data Archive, AF Server, AF Client, PI Web API, an AF SDK sample application (called afs) and PI Analysis Service. More services will be added in the future!

Eugene Lee

Spin up AF Client container

Posted by Eugene Lee Employee May 21, 2018

Note: Development and Testing purposes only. Not supported in production environments.

 

Link to other containerization articles

Containerization Hub

 

In this blog post, the instructions for building an AF Client image will be shown. For instructions on how to install Docker, please see the link above.

 

1. Please clone this git repository. GitHub - elee3/AF-Client-container-build

2. Download AF Client 2017R2 from the Techsupport website. AF Client 2017 R2

3. Extract AF Client into the cloned folder.

4. Run build.bat

 

If you would prefer us to build the image for you so that you can docker pull it immediately (less hassle), please post in the comments!

 

Usage

This container can be used to compile your AF SDK code (so that you do not have to install Visual Studio), and to pack an AF SDK application with its AF Client dependency for easier distribution. An AF SDK sample application (called afs) is included in the image for you to try compiling.

 

Limitations

Containers cannot run applications with GUI such as WPF and Windows Forms applications.

Eugene Lee

Containerization Hub

Posted by Eugene Lee Employee May 21, 2018

Good day everyone! I am creating this blog post as a convenient way for users to find the containerization articles that have already been published, and also to list those that have yet to be published (subject to change). You will just need to bookmark this page rather than all the individual articles.

 

Spin up AF Server container (SQL Server included)

Spin up PI Web API container (AF Server included)

Spin up PI Data Archive container

Spin up AF Client container

Compose PI System container architecture

Spin up AF Server container (Kerberos enabled)

Spin up PI Analysis Service container

Spin up AF Server container (without SQL Server)

Spin up PI Web API container (without AF Server)

Spin up PI Web API website container

Spin up PI Interface container

AF Server container network options

 

Let me know if you have any requests!

 

To avoid repeating the same thing in every containerization article, I will include the steps to set up Docker here.

 

Install Docker

For Windows 10,

You can install Docker for Windows. Please follow the instructions here

 

For Windows Server 2016,

You can use the OneGet provider PowerShell module. Open an elevated PowerShell session and run the commands below.

 

Install-Module -Name DockerMsftProvider -Repository PSGallery -Force    
Install-Package -Name docker -ProviderName DockerMsftProvider    
Restart-Computer -Force    

Introduction

 

After publishing the PI Web API client libraries on GitHub, I have received several enhancement requests (for .NET Standard, Java and Python) from our customers and partners. Some of them were added to the libraries!

 

Enhancements for the client library for .NET Standard

 

Migrated from RestSharp to HttpClient

 

Although there is no change for the end user, RestSharp was replaced by the native HttpClient. The main reason is to use CancellationTokenSource, which is discussed in the next item. Also, HttpClient is available natively in .NET Standard, so there is no need to download an extra NuGet package.

 

CancellationToken added for Async requests

 

Using the CancellationTokenSource allows you to cancel HTTP requests during a running operation. Below you can find an example:

 

Stopwatch watch = Stopwatch.StartNew();
CancellationTokenSource cancellationTokenSource = new CancellationTokenSource();
PIItemsStreamValues bulkValues = null;
try
{
     Task t = Task.Run(async () =>
     {
          bulkValues = await client.StreamSet.GetRecordedAdHocAsync(webId: webIds, startTime: "*-1800d", endTime: "*", maxCount: 50000, cancellationToken: cancellationTokenSource.Token);
     });
     //Cancel the request after 1s
     System.Threading.Thread.Sleep(1000);
     cancellationTokenSource.Cancel();
     t.Wait();
     Console.WriteLine("Completed task: Time elapsed: {0}s", 0.001 * watch.ElapsedMilliseconds);
}
catch (Exception)
{
     Console.WriteLine("Cancelled task: Time elapsed: {0}s", 0.001 * watch.ElapsedMilliseconds);
};

 

Fixed known issue on the Calculation controller

 

There was a known issue reported on GitHub when calling the Calculation.GetAtTimes() method with expressions containing commas. This has been fixed, so the code below works successfully!

 

string expression = "'sinusoid'*2 + 'cdt158'";
PITimedValues values = client.Calculation.GetAtTimes(webId: dataServer.WebId, expression: expression, time: new List<string>() { "*-1d" });

string expression2 = "'cdt158'+tagval('sinusoid','*-1d')";
PITimedValues values2 = client.Calculation.GetAtTimes(webId: dataServer.WebId, expression: expression2, time: new List<string>() { "*-1d" });

 

Enhancements for the client library for Java

 

PI Web API Batch was added in order to make more complex requests with better performance. You can find more information about PI Web API Batch here.

 

Added PI Web API Batch

 

Map<String, PIRequest> batch = new HashMap<String, PIRequest>();
PIRequest req1 = new PIRequest();
PIRequest req2 = new PIRequest();
PIRequest req3 = new PIRequest();
req1.setMethod("GET");
req1.setResource("https://marc-web-sql.marc.net/piwebapi/points?path=\\\\MARC-PI2016\\sinusoid");
req2.setMethod("GET");
req2.setResource("https://marc-web-sql.marc.net/piwebapi/points?path=\\\\MARC-PI2016\\cdt158");
req3.setMethod("GET");
req3.setResource("https://marc-web-sql.marc.net/piwebapi/streamsets/value?webid={0}&webid={1}");

List<String> parameters = new ArrayList<>();
parameters.add("$.1.Content.WebId");
parameters.add("$.2.Content.WebId" );
req3.setParameters(parameters);


List<String> parentIds = new ArrayList<>();
parentIds.add("1");
parentIds.add("2");
req3.setParentIds(parentIds);

batch.put("1", req1);
batch.put("2", req2);
batch.put("3", req3);
Map<String, PIResponse> batchResponse = client.getBatch().execute(batch);

Object content1 = batchResponse.get("1").getContent();
Object content2 = batchResponse.get("2").getContent();
Object content3 = batchResponse.get("3").getContent();

JSON json = new JSON(client.getApiClient());
PIPoint pointBatch1 = json.deserialize(json.serialize(content1), new TypeToken<PIPoint>(){}.getType());
PIPoint pointBatch2 = json.deserialize(json.serialize(content2), new TypeToken<PIPoint>(){}.getType());
PIItemsStreamValue batchStreamValues = json.deserialize(json.serialize(content3), new TypeToken<PIItemsStreamValue>(){}.getType());

 

 

Added Web ID 2.0 client generation

 

Now it is also possible to generate Web ID 2.0 identifiers without making an HTTP request against PI Web API. The library also provides a way to get information about a particular Web ID. Remember that this only works with PI Web API 2017 R2+.

 

PIDataServer dataServer = client.getDataServer().getByPath("\\\\MARC-PI2016", null, null);
PIPoint point = client.getPoint().getByPath("\\\\marc-pi2016\\sinusoid",null, null);
PIElement element = client.getElement().getByPath("\\\\MARC-PI2016\\CrossPlatformLab\\marc.adm",null, null);
PIAttribute attribute = client.getAttribute().getByPath( "\\\\MARC-PI2016\\CrossPlatformLab\\marc.adm|Heading",null,null);

WebIdInfo webIdInfo2 = client.getWebIdHelper().getWebIdInfo(attribute.getWebId());
WebIdInfo webIdInfo = client.getWebIdHelper().getWebIdInfo(element.getWebId());
WebIdInfo webIdInfo4 = client.getWebIdHelper().getWebIdInfo(point.getWebId());
WebIdInfo webIdInfo3 = client.getWebIdHelper().getWebIdInfo(dataServer.getWebId());

String web_id1 = client.getWebIdHelper().generateWebIdByPath("\\\\PISRV1\\CDF144_Repeated24h_forward", PIPoint.class, null);
String web_id2 = client.getWebIdHelper().generateWebIdByPath("\\\\PISRV1\\Universities\\UC Davis\\Buildings\\Academic Surge Building|Electricity Totalizer", PIAttribute.class, PIElement.class);

 

Available for downloading through JitPack

 

I've received a request on GitHub to publish the library on Maven Central. Since that is not an easy process, especially if you are not familiar with it, I've decided to publish it through JitPack.

 

If you want to use the Java library, please read the instructions here about how to retrieve the library without having to compile it locally.

 

 

Enhancements for the client library for Python

 

 

Added Kerberos as an authentication method

 

Robert Raesemann asked me in this blog post to make the client library for Python compatible with Kerberos authentication. Now it is possible to instantiate the PI Web API top level object as:

 

from osisoft.pidevclub.piwebapi.pi_web_api_client import PIWebApiClient
client = PIWebApiClient("https://test.osisoft.com/piwebapi", useKerberos=True, verifySsl=False)

 

 

Added PI Web API Batch

 

PI Web API Batch was also added to Python.

 

req1 = PIRequest()
req2 = PIRequest()
req3 = PIRequest()
req1.method = "GET"
req1.resource = "https://localhost/piwebapi/points?path=\\\\MARC-PI2016\\sinusoid"
req2.method = "GET"
req2.resource = "https://localhost/piwebapi/points?path=\\\\MARC-PI2016\\cdt158"
req3.method = "GET"
req3.resource = "https://localhost/piwebapi/streamsets/value?webid={0}&webid={1}"
req3.parameters = ["$.1.Content.WebId", "$.2.Content.WebId"]
req3.parent_ids = ["1", "2"]

batch = {
    "1": req1,
    "2": req2,
    "3": req3
}

batchResponse = client.batch.execute(batch)
point1 = client.api_client.deserialize_object(batchResponse["1"].content, 'PIPoint')
point2 = client.api_client.deserialize_object(batchResponse["2"].content, 'PIPoint')
itemsStreamValue = client.api_client.deserialize_object(batchResponse["3"].content, 'PIItemsStreamValue')

 

Thanks to Rafael Borges for helping me with this task!

 

 

Optional parameters with default values

 

In this new version, you don't need to define all parameters of each method. Optional parameters have default values which are going to be used if they are not defined. Let's see an example:

 

piItemsStreamValues = client.streamSet.get_recorded_ad_hoc(webIds, start_time="*-3d", end_time="*",
                                                           include_filtered_values=True, max_count=1000)

 

 

Added Web ID 2.0 client generation

 

Web ID 2.0 client generation was also added to the library. Here is an example:

 

pi_data_server_web_id = client.webIdHelper.generate_web_id_by_path("\\\\PISRV1", type(PIDataServer()), None)
point1_web_id = client.webIdHelper.generate_web_id_by_path("\\\\PISRV1\\SINUSOID", type(PIPoint()))
point2_web_id = client.webIdHelper.generate_web_id_by_path("\\\\PISRV1\\CDT158", type(PIPoint()))
point3_web_id = client.webIdHelper.generate_web_id_by_path("\\\\PISRV1\\SINUSOIDU", type(PIPoint()))
pi_attribute_web_id = client.webIdHelper.generate_web_id_by_path(
    "\\\\PISRV1\\Universities\\UC Davis\\Buildings\\Academic Surge Building|Electricity Totalizer",
    type(PIAttribute()), type(PIElement()))

pi_element_web_id = client.webIdHelper.generate_web_id_by_path(
    "\\\\PISRV1\\Universities\\UC Davis\\Buildings\\Academic Surge Building", type(PIElement()), None)

 

Available for downloading through PyPI (Python Package Index)

 

Just run the code below to download it:

 

pip install osisoft.pidevclub.piwebapi

 

You can find more information on the PyPI library page.

 

Conclusion

 

I hope you find value in those improvements. If you have an enhancement request concerning one of the client libraries, please let me know!

 

It is almost time to update to 2018!

 

Stay tuned for new updates and releases!

Sometimes you wish your AF database was reorganized differently

 

I would hope your AF database models follow a plant hierarchy and look nothing like the tags in the Data Archive. But you might not be so lucky. Or, when using a PI Connector, AF assets are created for you, and they tend to follow the networking pattern of how the sensor assets are organized. Often, though, there are projects in your organization that would benefit greatly if your asset model were organized differently. By reorganizing your AF database you could save yourself from expensive lookup queries or, worse, traversing complicated parent/child chains using AF SDK or PI Web API. It can make a lot more sense to structure your asset model in a way that fits your present needs.

 

Since AF SDK went public, users have taken it upon themselves to build AF SDK programs that do this sort of (re) "mapping".

 

AF Transformer helps you accomplish this remapping task without the need to build, compile, and test C# code using AF SDK. By editing an XML configuration file (with a good text editor ;-) and running the tool, you can transform one or more AF databases into a new model that can greatly assist your projects and your downstream users who want to see assets in a way they understand.

 

 

Presented April 25, 2018 at PI World

 

Why might I use AF Transformer?

 

  • A downstream application is being built that needs to see/traverse your AF model in a way that would require a large number of lookups.  It might make a lot more sense to produce a new AF model that's pre-sorted to meet the needs of your downstream application--boosting performance and decreasing the amount of time it takes to develop.
  • You need to build a segmented AF model that limits the scope of assets to a particular group of users
  • You need to flatten, expand or pivot an AF model so it makes more sense to the various different business consumers in your organization

 

Getting Started - Let's pivot an AF database based on an attribute value

 

Inside the AF Transformer kit are examples that include two AF databases you can import.  I have an enhanced version of WellsDb.xml attached to this blog post (called WellsDBWithAFTable.xml) that includes an additional AF table of well maintenance contractors who are assigned to a well.  I have also added an attribute to link each well to a contractor and resolve the contractor's name.

 

2018-05-11_14-54-17.png

 

Import WellsDBWithAFTable.xml into a new AF database in PI System Explorer (under File->Import From File...). I prefer to call it WellsDb, but you can name it whatever you like.

 

Next, create a second empty AF Database.  Let's call it TransformedWells.  This is where we're going to deposit the output from the transform.

 

Now, let's transform this. Write an XML file with the following sections. Be careful to update the <DataSource> and <Writers> tags with your correct source and target AF database host names and database names. (CLSAF is the name of my own AF Server, which is likely not the name of yours.)

 

<?xml version="1.0" encoding="UTF-8"?>
<CASTDataSet xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
   <LogFilePath>c:\Program Files\PIPC\AFTransformer\AF Transformer Examples\example.log</LogFilePath>
   <StatInfo />
   <ProcessProperties>
      <!-- Default values for process properties are used -->
   </ProcessProperties>
   <Tasks>
      <Task Id="1000">
         <SearchShapes>
            <SearchShape Id="1001" ReaderId="1002" FilterMatchType="Any">
               <DataFilters />
            </SearchShape>
         </SearchShapes>
         <OutputElementShapes>
            <OutputElement Id="1003" />
         </OutputElementShapes>
         <TaskWriters>
            <TaskWriter Id="1004" />
         </TaskWriters>
      </Task>
   </Tasks>
   <DataSources>
      <DataSource MEFClass="AF" HighID="0" ID="1002">
         <Properties>
            <Properties Key="Host" Value="CLSAF" />
            <Properties Key="Database" Value="WellsDb" />
            <Properties Key="UserName" Value="" />
            <Properties Key="Password" Value="" />
            <Properties Key="DefaultPIServer" Value="" />
         </Properties>
      </DataSource>
   </DataSources>
   <Writers>
      <Writer MEFClass="AF" Enabled="true" Encoding="UTF8" ID="1004">
         <Properties>
            <Properties Key="Host" Value="CLSAF" />
            <Properties Key="Database" Value="TransformedWells" />
         </Properties>
      </Writer>
   </Writers>
   <SearchShapes>
      <!-- Specify the elements and attributes of the search pattern that AF Transformer uses to search the source PI AF database -->
      <Shape ID="1001" Required="true" FilterMatchType="Any" ShapeWalkType="TopBottom">
         <ShapeElements>
            <ShapeElement ID="1" Required="true" FilterMatchType="Any" MaxDepthFromParent="1" IsEntryPoint="false">
               <Filters>
                  <ElementFilter Category="" Template="Facility" Name="*" />
               </Filters>
               <Attributes>
                  <ShapeAttribute ID="10" Required="true" FilterMatchType="Any">
                     <Filters>
                        <AttributeFilter Category="" Template="" Name="Contractor Name" />
                     </Filters>
                  </ShapeAttribute>
                  <ShapeAttribute ID="11" Required="true" FilterMatchType="Any">
                     <Filters>
                        <AttributeFilter Category="" Template="" Name="Description" />
                     </Filters>
                  </ShapeAttribute>
               </Attributes>
            </ShapeElement>
         </ShapeElements>
      </Shape>
   </SearchShapes>
   <OutputElementShapes>
      <!-- Create and populate the asset model in the destination database -->
      <OutputElementShape ID="1003">
         <Elements>
            <!--Here is where the new AF Database model begins-->
            <Element Name="Injection Wells by Contractor" Guid="" ReadOnly="false" Template="">
               <Elements>
                  <Element Name="[10.Value]" Guid="" Description="Contractor" ReadOnly="false" Template="">
                     <Elements>
                        <Element Name="[1.Name]">
                           <Attributes>
                              <Attribute Name="Description" Value="[11.Value]" />
                           </Attributes>
                        </Element>
                     </Elements>
                  </Element>
               </Elements>
            </Element>
         </Elements>
      </OutputElementShape>
   </OutputElementShapes>
</CASTDataSet>

 

Now, let's run this.

 

In a command prompt window, call the utility directly with the /configxml="{path to your xml transform file here}" option.
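If you prefer to script the run instead of typing it at the prompt, here is a minimal Python sketch that builds the command line. The executable name and both paths are assumptions based on the install folder used in this post, so adjust them to your installation.

```python
import subprocess

# Hypothetical paths -- adjust both to match your installation.
exe = r"C:\Program Files\PIPC\AFTransformer\AFTransformer.exe"
config = r"C:\Program Files\PIPC\AFTransformer\AF Transformer Examples\example.xml"

# Build the argument list; the utility takes /configxml="{path}".
cmd = [exe, f"/configxml={config}"]

# Uncomment to actually run the transform on your machine:
# subprocess.run(cmd, check=True)
print(" ".join(cmd))
```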

 

2018-05-11_16-08-49.png

 

Go check your output

 

Take a look at the console output.  Make sure the AF Reader and the AF Writer both report that they have found their respective AF databases.  If so, the rest of the orchestration will complete and transform the AF objects.  Let's check the TransformedWells AF database to be sure.

 

2018-05-11_16-10-59.png

 

Yup, they're there.

 

This is the simplest working example of AF Transformer, meant to show you how grouping is done.  In Part 2, we will build on this example and convert the Line Pressure and Tubing Pressure AF Elements into Attributes directly on these transformed well elements.

 

>> Let's continue on to Part 2

Note: Development and Testing purposes only. Not supported in production environments.

 

Link to other containerization articles

Containerization Hub

 

Introduction

I now present to you another blog post in the containerization series on spinning up PI Web API in less than 3 minutes (My test came out to be 2 min 44 sec!).

 

For your convenience, I will repeat the steps for setting up Docker here. If you have already done so while using the AF Server image, you do not need to repeat them. The PI Web API image offered here is fully self-contained; in other words, you do not have to worry about any dependencies, such as where to store your PI Web API configuration. In a later blog post, I will cover a PI Web API image that contains only the application service, for those of you who want the application service separate from the database service. In that image, you will need to furnish your own AF Server. For now, you do not need to worry about that.

 

Set up

Install Docker

For Windows 10,

You can install Docker for Windows. Please follow the instructions here

 

For Windows Server 2016,

You can use the OneGet provider PowerShell module. Open an elevated PowerShell session and run the below commands.

 

Install-Module -Name DockerMsftProvider -Repository PSGallery -Force  
Install-Package -Name docker -ProviderName DockerMsftProvider  
Restart-Computer -Force  

 

Install PI Web API image

Run the following commands at a console. When prompted for a username and password during login, please contact me (elee@osisoft.com) for them. Currently, this image is offered only to users who already have a PI Server license or are PI Developers Club members (try it now for free!). You will have to log in before doing the pull; otherwise, the pull will fail.

docker login  
docker pull elee3/afserver:piwebapi  
docker logout  

Remember to check the digest of the image to make sure it has not been tampered with.

Deployment

Now that the setup is complete, you can proceed to run the container image. To do so, use the following command, replacing <DNS hostname> and <containername> with names of your own choosing. Remember to pick a DNS hostname that is unique.

 

docker run -it --hostname <DNS hostname> --name <containername> elee3/afserver:piwebapi  

 

After about 3 minutes, you will see that the command prompt indicates that both the PI Web API and AF Server are Ready.

This indicates that your PI Web API is ready for use. At this point, you can close the window.

 

Usage

Now you can open a browser on your container host and connect to it with the DNS hostname that you chose earlier.

https://<DNS hostname>/piwebapi

 

When prompted for credentials, you can use

User name: afadmin

Password: qwert123!

 

Browsing your PI Data Archive

You can use a URL of the form

https://<DNS hostname>/piwebapi/dataservers?path=\\<PI Data Archive hostname>

to access your PI Data Archive. Of course, you need to grant access permissions by creating a local user on the PI Data Archive machine with the same username and password as above, and creating a PI mapping for that user.
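When you call this endpoint from code rather than a browser, the backslashes in the path must be percent-encoded. Here is a small Python sketch of building the URL; the host names are placeholders.

```python
from urllib.parse import quote

def dataservers_url(webapi_host: str, archive_host: str) -> str:
    """Build the PI Web API 'dataservers by path' URL.

    The path component (\\hostname) must be percent-encoded
    when the request is issued programmatically.
    """
    path = "\\\\" + archive_host  # literal \\<PI Data Archive hostname>
    return f"https://{webapi_host}/piwebapi/dataservers?path={quote(path)}"

print(dataservers_url("mycontainer", "MYARCHIVE"))
# -> https://mycontainer/piwebapi/dataservers?path=%5C%5CMYARCHIVE
```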

 

Browsing your AF Server

You can use a URL of the form

https://<DNS hostname>/piwebapi/assetservers?path=\\<AF Server hostname>

to access your AF Server. Again, you need to grant access permissions by creating a local user on the AF Server machine with the same username and password as above. By default, everyone has the World identity on the AF Server, so no special AF mapping is needed.

 

Multiple PI Web API instances

You can spin up several PI Web API instances by running the docker run command multiple times with a different hostname and container name.

You can see above that I have spun up several instances on my container host.

 

Destroy PI Web API instance

If you no longer need the PI Web API instance, you can destroy it using

docker stop <containername>  
docker rm <containername>  

 

Limitations

AF Server, PI Web API, and SQL Express are all combined in a single container. There will be an upcoming blog post for a container with just PI Web API in it.

This example relies on local accounts for authentication.

 

Conclusion

Observe that the steps to deploy the AF Server and PI Web API containers are quite similar and can easily be scripted. This helps you provision testing environments quickly and efficiently, which is valuable for DevOps.
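As a sketch of that scripting idea, the following Python snippet generates the docker run commands for a batch of containers; the naming scheme is just an example, and each string could be passed to your shell or to subprocess.

```python
def docker_run_cmd(image: str, hostname: str, name: str) -> str:
    # Mirrors the docker run form used in this post.
    return f"docker run -it --hostname {hostname} --name {name} {image}"

# Example: three PI Web API containers with a simple naming scheme.
cmds = [
    docker_run_cmd("elee3/afserver:piwebapi", f"webapi{i}", f"piwebapi{i}")
    for i in range(1, 4)
]
for c in cmds:
    print(c)
```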

 

New updates (12 Jun 2018)

In the never-ending quest for speed and productivity, every minute and second we save waiting for applications to boot can be better spent elsewhere, such as taking a nap or watching that cat video your friend sent you. Therefore, I present a new PI Web API container image that boots more than 60% faster than the original one.

 

docker pull elee3/afserver:webapifast

Remember to check the digest of the image to make sure it has not been tampered with.

 

Three test runs were performed to compare the boot-up times.

 

Run 1

Start time was 13:48:00 for both. The original image finished in 2 min 36 sec while the new one finished in 55 sec.

 

Run 2

Start time was 13:58:00 for both. The original image finished in 2 min 27 sec while the new one finished in 55 sec.

 

Run 3

Start time was 14:29:00 for both. The original image finished in 2 min 28 sec while the new one finished in 57 sec.

 

Summary of results

Run      Original (s)  New (s)
1        156           55
2        147           55
3        148           57
Average  150           55

 

The results show that the new image is about 63% faster than the original one.
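The arithmetic behind that figure can be checked directly from the three runs above:

```python
# Boot times in seconds from the three runs above.
original = [156, 147, 148]
new = [55, 55, 57]

avg_original = sum(original) / len(original)  # ~150 s
avg_new = sum(new) / len(new)                 # ~56 s
reduction = (avg_original - avg_new) / avg_original * 100

print(f"{reduction:.0f}% faster")  # -> 63% faster
```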

 

New updates (18 Jun 2018)

1. Added a reminder to check the digest of the image to make sure it has not been tampered with.

Note: Development and Testing purposes only. Not supported in production environments.

 

Link to other containerization articles

Containerization Hub

 

Introduction

Currently, in order to set up an AF Server for testing/development purposes, you have two choices.

 

1. Install SQL Server and AF Server on your local machine

The problem with this method is that there is no isolation from the host operating system. Therefore, you risk the stability of the host computer if something goes wrong. You also can't spin up multiple AF Servers this way.

 

2. Provision a VM and then install SQL Server and AF Server on it

While this method provides isolation, the problem lies in the time it takes to set up, and in the size of the VM, which includes many unnecessary components.

 

There is a better way!

Today, I will be teaching you how to spin up AF Server instances in less than 2 minutes (after performing the initial setup, which might take a bit longer). This is made possible by containerization technology.

 

Requirements

Windows Server build 1709, Windows Server 2016 (Core or with Desktop Experience), or Windows 10 Professional or Enterprise (Anniversary Edition). Ensure that your system is up to date with Windows Update.

 

Benefits

1. Portability. Easy to transfer containers to other container hosts that meet the prerequisites. No need to do tedious migrations.

2. Side by side versioning. Ability to run multiple versions of AF Server on the same container host for compatibility testing and debugging purposes.

3. Speed. Very fast to deploy.

4. Resource efficiency and density. More AF Servers can run on the same bare metal machine compared to virtualization.

5. Isolation. If you no longer need the AF Server, you can remove it easily. It won't leave any temporary or configuration files on your container host.

6. Works with container orchestration systems.

 

Set up

Install Docker

For Windows 10,

You can install Docker for Windows. Please follow the instructions here

 

For Windows Server 2016,

You can use the OneGet provider PowerShell module. Open an elevated PowerShell session and run the below commands.

Install-Module -Name DockerMsftProvider -Repository PSGallery -Force
Install-Package -Name docker -ProviderName DockerMsftProvider
Restart-Computer -Force

 

Install AF Server image

Run the following commands at a console. When prompted for a username and password during login, please contact me (elee@osisoft.com) for them. Currently, this image is offered only to users who already have a PI Server license or are PI Developers Club members (try it now for free!). You will have to log in before doing the pull; otherwise, the pull will fail.

docker login
docker pull elee3/afserver:2017R2
docker logout

Remember to check the digest of the image to make sure it has not been tampered with.

 

Deployment

Now that the setup is complete, you can proceed to run the container image. To do so, use the following command, replacing <DNS hostname> and <containername> with names of your own choosing. This will take less than 2 minutes. Remember to pick a DNS hostname that is unique.

docker run -di --hostname <DNS hostname> --name <containername> elee3/afserver:2017R2 cmd

 

You can now open up PI System Explorer on your local machine and connect to the AF Server by specifying the DNS Hostname that you chose earlier. When prompted for credentials, use

User name: afadmin

Password: qwert123!

Check the box to remember the credentials so that you won't have to enter them every time.

 

You can choose to rename the AF Server if you wish.

 

And you are done! Enjoy the new AF Server instance that you have created!

 

Using with AF SDK

To connect to the AF Server from code using AF SDK, the following Connect overload can be utilized with the same credentials as above.

PISystem.Connect Method (NetworkCredential)

 

Multiple AF Servers

In order to spin up another AF Server instance, follow the steps above. Once the new container is running, you have to change the ServerID. You can do this via

docker exec -i <containername> cmd /c "cd %pihome64%\af&afdiag.exe /cid:<guid>"

 

You can generate a new GUID with any GUID generator.
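If you'd rather generate the GUID from a script, Python's uuid module works; this is just a sketch, and any GUID generator is fine.

```python
import uuid

# A random version-4 GUID suitable for passing to afdiag.exe /cid:<guid>.
new_id = str(uuid.uuid4())
print(new_id)  # e.g. 1b9d6bcd-bbfd-4b2d-9b5d-ab8dfbbd4bed (yours will differ)
```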

 

Destroy AF Server

If you no longer need the AF Server, you can destroy it using

docker stop <containername>
docker rm <containername>

 

Limitations

This example uses a public SQL Express container image which is currently not available for use in a production environment.

This example relies on local accounts for authentication. Refer to the following article if you want to use Kerberos. Spin up AF Server container (Kerberos enabled)

 

New updates (14 Feb 2018)

1. 2017R2 tag is now available. Commands have been updated in the blog.

2. The image has been updated with the ability to import an existing AF Server backup in the form of a PIFD.bak file. To do this, run

docker run -di --hostname <DNS hostname> --name <containername> -v <path to folder containing PIFD.bak>:c:\db elee3/afserver:2017R2 migrate.bat

 

New updates (30 May 2018)

1. Local account is no longer in the administrators group. Only a mapping to an AF Identity is done (better security).

 

New updates (18 Jun 2018)

1. Added a reminder to check the digest of the image to make sure it has not been tampered with.
