
In this blog post I will show how to write machine learning output that was produced in Python back to the PI System.

 

This blog post is preceded by: Machine Learning Pipeline 1: Importing PI System Data into Python

 

Output of machine learning

The output of machine learning is expected to be a numpy array. The output features and output length determine the dimension of this numpy array.

 

Dimension: (output length, output features)

 

With

output length = number of predicted timesteps

output features = number of predicted features (for example: Temperature, Humidity, ...)

 

Example:

The dimensions of this numpy array are (192, 9): 192 rows and 9 columns.
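As a quick illustration, a placeholder array with these dimensions could look like this (the values are dummies, not an actual model output):

import numpy as np

# Dummy stand-in for a model prediction:
# 192 predicted timesteps x 9 predicted features
predict = np.zeros((192, 9))

print(predict.shape)  # (192, 9)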

 

These values do not yet have a timestamp.

 

Generating a timestamp

Depending on how these predicted values were generated, timestamps must be generated for them before they can be written to the PI System.

The timestamp format for the PI System is:

 

"YYYY-MM-DDThh:mm:ssZ"

 

Python's datetime module can be used to generate timestamps:

 

from datetime import datetime
from datetime import timedelta

timestamp = datetime.now()

print(timestamp)

now = datetime.now() # current date and time

year = now.strftime("%Y")
print("year:", year)

month = now.strftime("%m")
print("month:", month)

day = now.strftime("%d")
print("day:", day)

time = now.strftime("%H:%M:%S")
print("time:", time)

date_time = now.strftime("%Y-%m-%dT%H:%M:%SZ")
print("date and time:",date_time)

(This is just an example of how the individual parts of the timestamp can be generated.)

 

Python's timedelta can be used to add time to a timestamp. We will use timedelta to generate the timestamps for our predicted values. In our case we know that the sampling interval of our values is 1 h. (This is by design, as we earlier imported events with the same sampling frequency.)
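For illustration, a minimal sketch of how a list of hourly timestamps could be generated for the 192 predicted events (the start time is assumed here to be the current time):

from datetime import datetime, timedelta

start = datetime.now()  # assumed start of the prediction horizon

# one timestamp per predicted event, 1 h apart, in PI Web API format
pi_timestamps = [(start + timedelta(hours=i + 1)).strftime("%Y-%m-%dT%H:%M:%SZ") for i in range(192)]

print(pi_timestamps[:3])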

 

Posting output to the PI System

 

The following code uses the Python requests library to send an HTTP POST request to the PI Web API endpoint:

Requests: HTTP for Humans™ — Requests 2.21.0 documentation

 

for event in predict:

    # build timestamp of format "YYYY-MM-DDThh:mm:ssZ"
    timestamp = timestamp + timedelta(hours=1)  # 1 h delta in between each predicted event
    pi_timestamp = timestamp.strftime("%Y-%m-%dT%H:%M:%SZ")

    # take only the first column and convert the numpy scalar to a plain Python float
    value = float(event[0])

    # Writing back to the PI System (the body is sent as JSON)
    response = requests.post('https://<PIWebAPI_host>/piwebapi/streams/<webID_of_target_PIPoint>/value?updateOption=InsertNoCompression',
                             json={'Timestamp': pi_timestamp, 'UnitsAbbreviation': '', 'Good': True, 'Questionable': False, 'Value': value},
                             headers={"Authorization": "Basic %s" % b64Val},
                             verify=True)


 

Here the UpdateValue method of PI Web API is used:

UpdateValue POST streams/{webId}/value

 

Efficiency can be improved by first creating all JSON objects for the events that are to be posted to the PI System, per PI Point, and sending them in bulk using the UpdateValues method:

UpdateValues POST streams/{webId}/recorded
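A minimal sketch of what such a bulk request could look like (predict, b64Val and the placeholders in the URL are taken from the examples above; it assumes the body is a JSON array of value objects, as described in the UpdateValues documentation):

import requests
from datetime import datetime, timedelta

items = []
timestamp = datetime.now()

# build one JSON object per predicted event
for event in predict:
    timestamp = timestamp + timedelta(hours=1)
    items.append({'Timestamp': timestamp.strftime("%Y-%m-%dT%H:%M:%SZ"),
                  'Value': float(event[0]),
                  'Good': True,
                  'Questionable': False})

# send all events for this PI Point in a single request
response = requests.post('https://<PIWebAPI_host>/piwebapi/streams/<webID_of_target_PIPoint>/recorded?updateOption=InsertNoCompression',
                         json=items,
                         headers={"Authorization": "Basic %s" % b64Val},
                         verify=True)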

With this blog post series I want to enable data scientists to quickly get started doing Data Science in Python, without worrying about how to get the data out of the PI System.

 

Specifically, I want to highlight two options for getting PI System data into Python for use in data science:

 

  1. Writing PI System Data into a .csv file and using the .csv file as data source in Python.
  2. Directly accessing the PI System using HTTP requests in Python.

 

Approach 1: Extracting PI System Data into a .csv file

Please check out these 3 ways to extract PI System data into .csv files:

 

Extracting PI System data in C# with AFSDK:

Extracting PI System Data to file using AFSDK in .NET

 

Extracting PI System data in C# using PI SQL Client OLEDB

Extracting PI System Data to file using PI SQL Client OLEDB via PI SQL DAS RTQP in .NET

 

Extracting PI System Data in Python using PI Web API

Extracting PI System Data to file using PI Web API in Python

 

In each of the above approaches, all events for the requested PI Points are extracted, no matter how far apart the events are in time.

This may not be desired, especially when using the data for time series prediction. In that case you would have to replace the "RecordedValues" method with the "Interpolated" method, so that a sampling frequency can be defined:

 

PI Web API:

GetInterpolated GET streams/{webId}/interpolated

 

AFSDK:

AFData.InterpolatedValues Method

 

  • PI DataLink can also be used to create the .csv file, but the focus here is on programmatic approaches.

 

Reading data from a .csv file in Python

Sample .csv file:

The events are stripped of their timestamps; since the events have a fixed sampling frequency, an explicit timestamp is not needed.

 

 

import numpy as np
import csv

# 'filepath_csv' is a placeholder for the path to the .csv file
dataset = np.loadtxt(open('filepath_csv', "rb"), delimiter=",", skiprows=1)

 

skiprows=1 will skip the first row of the .csv file. This is useful when the header of the file contains column descriptions.

The columns of the .csv file are stored in a numpy array, which can be further used for machine learning.

 

Approach 2: Directly accessing the PI System using HTTP requests in Python.

For this approach we make use of the requests library in Python.

Requests: HTTP for Humans™ — Requests 2.21.0 documentation

 

The PI Web API GetInterpolated method is used to extract constantly sampled values of a desired PI Point:

GetInterpolated GET streams/{webId}/interpolated

 

In order to retrieve data for a certain PI Point we need the WebID as reference. It can be retrieved by the built-in search of PI Web API.
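Besides the built-in search, the WebID can also be looked up programmatically; a minimal sketch using the PI Web API points resource (the host, server name and the Base64 credential string b64Val are placeholders):

import requests

response = requests.get('https://<PIWebAPI_host>/piwebapi/points?path=\\\\<PIDataArchive>\\Sinusoid',
                        headers={"Authorization": "Basic %s" % b64Val},
                        verify=True)

webid = response.json()["WebId"]
print(webid)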

In this case the WebID can be found here:

 

 

 

Using the requests library of Python and the GetInterpolated method of PI Web API, we retrieve the sampled events of the desired PI Point as a JSON HTTP response:

 

import requests

response = requests.get('https://<PIWebAPI_host>/piwebapi/streams/<webID_of_PIPoint>/interpolated?startTime=T-10d&endTime=T&Interval=1h', headers={"Authorization": "Basic %s" % b64Val}, verify=True)
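The b64Val used in the Authorization header is the Base64-encoded "user:password" string for basic authentication; a minimal sketch of how it could be built:

import base64

# assumption: PI Web API is configured for basic authentication
b64Val = base64.b64encode(b"<user>:<password>").decode("ascii")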

 

The response is in JSON format and will look something like this:

 

 

Parsing the JSON HTTP response:

We only need the values of the events. As they are interpolated, we do not care about quality. The timestamp information is implicitly contained in the sampling interval that we specified earlier in the GetInterpolated method of PI Web API.

We assume that we have two JSON responses, r1 and r2, for two different PI Points, both generated with the GetInterpolated method, with the same sampling interval, over the same time range.

 

 

import json
import numpy as np

json1_data = r1.json()
json2_data = r2.json()

data_list_1 = list()

for j_object in json1_data["Items"]:
    value = j_object["Value"]
    if type(value) is float:  # important: skip items whose value is not a number (e.g. of type "dict")
        data_list_1 = np.append(data_list_1, float(value))

data_list_2 = list()

for j_object in json2_data["Items"]:
    value = j_object["Value"]
    if type(value) is float:
        data_list_2 = np.append(data_list_2, float(value))

# Stack both 1-D arrays into a 2-D array:
array_request_values = np.array(np.column_stack((data_list_1, data_list_2)))


 

This Python code parses the JSON HTTP responses and writes them into two separate lists. These are then stacked into a numpy array:

 

Example:

 

 

This numpy array can be used as input for machine learning.

 

Please check out Machine Learning Pipeline 2, for an easy way to write back machine learning output to the PI System.

In this post I will be leveraging OSIsoft's PI Web API to extract PI System data to a flat file.

To keep things simple and easy to reproduce, this post will focus on how to extract data with this technology.

 

Prerequisites:

 

Remote PI System:

PI Data Archive 2018

PI AF Server 2018

 

Client:

Python 3.7.2

 

For simplicity, 7 days of data for just the PI Point "Sinusoid" will be queried and written to a .txt file.

 

In order to retrieve data for a certain PI Point we need the WebID as reference. It can be retrieved by the built-in search of PI Web API.

In this case the WebID can be found here:

 

 

 

Given the WebID of the PI Point "Sinusoid", the following code will request historical data for the previous 7 days. It will parse the JSON response and write "Timestamp, Value, isGood" to the specified data file.

 

Python Code:

import requests
import json

url = "https://<piwebapi_endpoint>/piwebapi/streams/<WebID_of_Sinusoid>/recorded?startTime=*-7d&endTime=*&boundaryType=Inside&maxCount=150000"  # maxCount sets the upper limit of values to be returned
filepath = "<filepath>"

response = requests.get(str(url), auth=('<user>', '<password>'), verify=False)  # verify=False disables the certificate verification check
json_data = response.json()

timestamp = []
value = []
isGood = []

# Parsing the JSON response
for j_object in json_data["Items"]:
    timestamp.append(j_object["Timestamp"])
    value.append(j_object["Value"])
    isGood.append(j_object["Good"])

event_array = zip(timestamp, value, isGood)

# Writing to file (the with-statement closes the file automatically)
with open(str(filepath), "w") as f:
    for item in event_array:
        try:
            writestring = "Timestamp: " + str(item[0]) + " , Value: " + str(item[1]) + " , isGood: " + str(item[2]) + " \n"
        except:
            try:
                writestring = "" + str(item[0]) + " \n"
            except:
                writestring = "" + " \n"
        f.write(writestring)


 

Result:

 

Timestamp, value and the quality for this time range were successfully written to the file.

In this post I will be leveraging OSIsoft's AFSDK to extract PI System data to a flat file.

To keep things simple and easy to reproduce, this post will focus on how to extract data with this technology.

 

Prerequisites:

PI Data Archive 2018

PI AF Server 2018

PI AF Client 2018 SP1

Microsoft Visual Studio 2017

 

For simplicity, 7 days of data for just the PI Point "Sinusoid" will be queried and written to a .txt file.

 

The following code will establish an AFSDK connection to the default PI Data Archive server specified in the local Known Servers Table. A query for PI Points is launched to find the PI Point "Sinusoid".

The method PIPointList.RecordedValues is used to retrieve a list of AFValues. Their properties Timestamp, Value and IsGood are then written to a flat file.

 

C# Code:

// using directives assumed for the types referenced below
using System.Collections.Generic;
using System.Linq;
using OSIsoft.AF.Asset;
using OSIsoft.AF.Data;
using OSIsoft.AF.PI;
using OSIsoft.AF.Search;
using OSIsoft.AF.Time;

namespace Data_Access_AFSDK
{
    class Program
    {
        static void Main(string[] args)
        {
            PIServer myPIserver = null;
            string tagMask = "";
            string startTime = "";
            string endTime = "";
            string fltrExp = "";
            bool filtered = true;

            //connection to PI server
            if (myPIserver == null)
                myPIserver = new PIServers().DefaultPIServer;

            //Query for PI Point
            tagMask = "Sinusoid";
            List<PIPointQuery> ptQuery = new List<PIPointQuery>();
            ptQuery.Add(new PIPointQuery("tag", AFSearchOperator.Equal, tagMask));
            PIPointList myPointList = new PIPointList(PIPoint.FindPIPoints(myPIserver, ptQuery));

            startTime = "*-7d";
            endTime = "*";

            //Retrieve events using PIPointList.RecordedValues into list 'myAFvalues'
            List<AFValues> myAFvalues = myPointList.RecordedValues(new AFTimeRange(startTime, endTime), AFBoundaryType.Inside, fltrExp, filtered, new PIPagingConfiguration(PIPageType.EventCount, 10000)).ToList();

            //Convert the AFValues to string[]
            string[] query_result_string_timestamp_value = new string[myAFvalues[0].Count];
            string value_to_write;
            string quality_value;
            int i = 0;

            foreach (AFValue query_event in myAFvalues[0])
            {
                value_to_write = query_event.Value.ToString();
                quality_value = query_event.IsGood.ToString();
                query_result_string_timestamp_value[i] = "Timestamp: " + query_event.Timestamp.LocalTime + ", " + "Value: " + value_to_write + ", " + "IsGood: " + quality_value;
                i += 1;
            }

            //Writing data into file
            System.IO.File.WriteAllLines(@"<FilePath>", query_result_string_timestamp_value);
        }
    }
}

 

 

Result:

Timestamp, value and the quality for this time range were successfully written to the file.

In this post I will be leveraging OSIsoft's PI SQL Client OLEDB to extract PI System data via the PI SQL Data Access Server (RTQP).

To keep things simple and easy to reproduce, this post will focus on how to extract data with this technology.

 

Prerequisites:

PI SQL Client OLEDB 2018

PI Data Archive 2018

PI AF Server 2018

PI SQL Data Access Server (RTQP Engine) 2018

Microsoft Visual Studio 2017

 

For simplicity, a test AF database "testDB" was created with a single element "Element1" that has a single attribute "Attribute1".

This attribute references the PIPoint "Sinusoid".

 

 

SQL Query used to extract 7 days of events from \\<AFServer>\testDB\Element1|Attribute1

 

SELECT av.Value, av.TimeStamp, av.IsValueGood
FROM [Master].[Element].[Archive] av
INNER JOIN [Master].[Element].[Attribute] ea ON av.AttributeID = ea.ID
WHERE ea.Element = 'Element1' AND ea.Name = 'Attribute1'
AND av.TimeStamp BETWEEN N't-7d' AND N't'

 

Example C# Code:

 

using System;
using System.Data;
using System.Data.OleDb;
using System.IO;

namespace Data_Access_PI_SQL_Client_OLEDB
{
    class Program
    {
        static void Main(string[] args)
        {
            DataTable dataTable = new DataTable();
            using (var connection = new OleDbConnection())
            using (var command = connection.CreateCommand())
            {
                connection.ConnectionString = "Provider=PISQLClient; Data Source=<AFServer>\\<AF_DB>; Integrated Security=SSPI;";
                connection.Open();

                string SQL_query = "SELECT av.Value, av.TimeStamp, av.IsValueGood ";
                SQL_query += "FROM [Master].[Element].[Archive] av ";
                SQL_query += "INNER JOIN [Master].[Element].[Attribute] ea ON av.AttributeID = ea.ID ";
                SQL_query += "WHERE ea.Element = 'Element1' AND ea.Name = 'Attribute1' ";
                SQL_query += "AND av.TimeStamp BETWEEN N't-7d' AND N't' ";

                command.CommandText = SQL_query;
                var reader = command.ExecuteReader();
                using (StreamWriter writer = new StreamWriter("<outputfilepath>"))
                {
                    while (reader.Read())
                    {
                        writer.WriteLine("Timestamp: {0}, Value : {1}, isGood : {2}",
                            reader["Timestamp"], reader["Value"], reader["IsValueGood"]);
                    }
                }
            }

            Console.WriteLine("Completed Successfully!");
            Console.ReadKey();
        }
    }
}

 

 

Result:

Events were successfully written to flat file:

 

I just needed to merge some data from one PI Point into another PI Point on the same PI Server.

 

Background:

The PI OPC UA Connector had been running for a while before the PI Connector Mapping Guide was followed and the output of the OPC UA Connector was routed to the PI Points previously fed by OPC DA.

 

This PowerShell script uses the AFSDK to get the events of the last hour from PI Point "sinusoid" and write them into PI Point "testtag".

 

# Load the AF SDK assembly
[Reflection.Assembly]::LoadWithPartialName("OSIsoft.AFSDK") | Out-Null

[OSIsoft.AF.PI.PIServers] $piSrvs = New-Object OSIsoft.AF.PI.PIServers
[OSIsoft.AF.PI.PIServer] $piSrv = $piSrvs.DefaultPIServer
[OSIsoft.AF.PI.PIPoint] $piPoint = [OSIsoft.AF.PI.PIPoint]::FindPIPoint($piSrv, "SINUSOID")
[OSIsoft.AF.PI.PIPoint] $piPoint2 = [OSIsoft.AF.PI.PIPoint]::FindPIPoint($piSrv, "testag")
[OSIsoft.AF.Time.AFTimeRange] $timeRange = New-Object OSIsoft.AF.Time.AFTimeRange("*-1h", "*")
[OSIsoft.AF.Asset.AFValues] $piValues = $piPoint.RecordedValues($timeRange, [OSIsoft.AF.Data.AFBoundaryType]::Inside, $null, $true, 0)

foreach ($val in $piValues)
{
    Write-Host $val.Value " at " $val.Timestamp
    $piPoint2.UpdateValue($val, 1)   # second argument is the AFUpdateOption, passed here as its numeric value
}
