Introduction

This blog post describes how to retrieve predictions from a Microsoft Azure Machine Learning (Azure ML) web service into the PI System. It assumes that the Azure ML web service has already been deployed from an Azure ML model. It does not cover exporting PI data to an Azure SQL database with the PI Integrator for Azure, or building the Azure ML model and its web service; those steps are assumed to be in place already.

There are several ways to import the results from your Azure prediction model into the PI Server as predicted data and store them in future PI tags. The web service you have created can be invoked by any application capable of calling web services. To help you get started, Azure ML Studio provides code snippets, specific to your web service, for several common programming languages; you can find them by clicking “API help page” on the web service dashboard.
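
To give a concrete idea, a Request/Response call can be made from .NET with HttpClient and the web service’s API key. The sketch below is only an illustration, not the attached application’s exact code: the URL, the API key, and the input column names (AirTemperature, WindSpeed) are placeholders to be replaced with the values shown on your own service’s API help page.

    // Minimal sketch of calling an Azure ML Request/Response web service from .NET.
    // URL, API key, and column names are placeholders; your API help page shows
    // the exact values (and ready-made code) for your own web service.
    using System;
    using System.Net.Http;
    using System.Net.Http.Headers;
    using System.Text;
    using System.Threading.Tasks;

    class AzureMlCallSketch
    {
        static async Task Main()
        {
            const string apiKey = "your-web-service-api-key";       // from the web service dashboard
            const string url = "https://your-web-service-post-url"; // from the REQUEST/RESPONSE page

            using (var client = new HttpClient())
            {
                client.DefaultRequestHeaders.Authorization =
                    new AuthenticationHeaderValue("Bearer", apiKey);

                // Column names must match the inputs defined by your model
                // (AirTemperature and WindSpeed are assumed for this example).
                string requestBody =
                    "{ \"Inputs\": { \"input1\": { \"ColumnNames\": [ \"AirTemperature\", \"WindSpeed\" ], " +
                    "\"Values\": [ [ \"12.5\", \"7.3\" ] ] } }, \"GlobalParameters\": {} }";

                HttpResponseMessage response = await client.PostAsync(
                    url, new StringContent(requestBody, Encoding.UTF8, "application/json"));

                // The reply is JSON containing the inputs plus the predicted value.
                Console.WriteLine(await response.Content.ReadAsStringAsync());
            }
        }
    }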

This post will focus on using a custom-developed .NET Windows application to bring Azure ML predictions into the PI System.

The example in this blog post predicts the Active Power of wind turbines based on their forecasted and historical Wind Speeds and Air Temperatures. The same concepts apply to your own Azure ML web service model, even if it uses different logic and other parameters; minor changes to the attached code are enough to adapt it to your needs.

The screenshots below display the PI Asset Framework (PI AF) structure for this Wind Turbines example:

  Wind Farm Attributes

Wind Turbine attributes

You can use this application either to bring in predicted future values, or to bring in historical predicted values from the model and compare them with the actual historical values (to check whether the prediction model is working properly).

To get a feel for what to expect at the end of this blog post, see the image below, which shows a PI Coresight display comparing the actual historical “Active Power” with the predicted Active Power from the Azure ML model (imported into PI using the custom Windows application).

PI Coresight display showing the Actual Active Power vs the Predicted Active Power


The steps in this blog post are:

  1. Examine the attached application and code
  2. Examine and test the Azure ML Prediction Web Service
  3. Examine and Update the Application’s Configuration File
  4. Run the Application in historical mode to check the accuracy of the prediction model
  5. Run the Application in future mode to get the predicted values


Examine the attached application and code

You can download the attached “AzureML_To_PI.zip” file to retrieve the Windows .NET application that contains the code for retrieving Azure ML predictions into PI. The code is commented, and you can modify it as necessary to work with your own Azure ML models and their specific web service parameters.


Examine and test the Azure ML Prediction Web Service

  1. Open your Azure ML Workspace
  2. Navigate to the Web Services icon on the left and then choose the predictions web service which you have already deployed


  3. After the web service dashboard opens, click the blue “Test” button to test the web service

  4. Enter values for your parameters and click the check mark button to run the test

  5. Examine the result at the bottom of the screen: the first few values displayed are the inputs you specified, and the last value is the predicted value (a sketch of extracting it in code follows this list)
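
The same result can be read programmatically. Below is a hedged sketch of pulling the predicted value out of the Request/Response reply, assuming the classic Azure ML Studio response shape and the Newtonsoft.Json package (the output name “output1” depends on your experiment):

    // Sketch of extracting the prediction from the Request/Response JSON reply.
    using Newtonsoft.Json.Linq;

    static class PredictionParserSketch
    {
        public static double ExtractPrediction(string responseJson)
        {
            JObject parsed = JObject.Parse(responseJson);

            // First (and only) result row: the input columns come first and the
            // scored/predicted value is the last entry in the row.
            var firstRow = (JArray)parsed["Results"]["output1"]["value"]["Values"][0];
            return (double)firstRow[firstRow.Count - 1];
        }
    }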


Examine and Update the Application’s Configuration File

The custom-developed application uses a configuration file for a number of global parameters that can be changed by the user if needed; they were placed in the configuration file rather than hardcoded so that the application remains generic and reusable.

Follow the steps below to open the configuration file:

  1. Open the “AzureML_To_PI.sln” file in Visual Studio
  2. Open the App.config file
  3. Check the different parameters and their values and update them if necessary. The table below defines each parameter.


Parameter | Description
PIAFServer | The PI AF Server name.
PIAFServerDB | The PI AF database name.
PromptLogin | [1] = Yes (opens a dialog box to log on to the AF Server when the application starts); [0] = No (logs on automatically).
AFTemplate | The name of the Wind Turbines AF template.
WSParam_AirTemp | The name of the Air Temperature parameter as defined in the predictions web service.
WSParam_WindSpeed | The name of the Wind Speed parameter as defined in the predictions web service.
AFAttributeName_AirTemp | The name of the Air Temperature attribute in AF for the wind turbines.
AFAttributeName_WindSpeed | The name of the Wind Speed attribute in AF for the wind turbines.
AFAttributeName_ForecastAirTemp | The name of the Temperature Forecast attribute in AF for the wind farm.
AFAttributeName_ForecastWindSpeed | The name of the Wind Speed Forecast attribute in AF for the wind farm.
AFAttributeName_RatedPower | The name of the Rated Power attribute in AF for the wind turbines.
AFAttributeName_ActivePowerPredicted | The name of the Predicted Power attribute in AF for the wind turbines.
AF_WindSpeedCutOffValue | The maximum wind speed above which the turbine’s Rated Power is used instead of calling the web service.
WSAPIKey | The API key of the web service, used for authentication; it can be found on the main dashboard of the web service.
WSURL | The POST URL of the web service; it can be found by navigating to the web service and clicking “REQUEST/RESPONSE”.
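
As a point of reference, an App.config holding these keys in a plain appSettings section might look roughly like the sketch below. Whether the attached project stores them exactly this way is an assumption, and every value shown is a placeholder to be replaced with your own environment’s names:

    <?xml version="1.0" encoding="utf-8"?>
    <configuration>
      <appSettings>
        <!-- All values below are placeholders -->
        <add key="PIAFServer" value="MyAFServer" />
        <add key="PIAFServerDB" value="WindFarmDB" />
        <add key="PromptLogin" value="0" />
        <add key="AFTemplate" value="Wind Turbine" />
        <add key="WSParam_AirTemp" value="AirTemperature" />
        <add key="WSParam_WindSpeed" value="WindSpeed" />
        <add key="AFAttributeName_AirTemp" value="Air Temperature" />
        <add key="AFAttributeName_WindSpeed" value="Wind Speed" />
        <add key="AFAttributeName_ForecastAirTemp" value="Forecast Air Temperature" />
        <add key="AFAttributeName_ForecastWindSpeed" value="Forecast Wind Speed" />
        <add key="AFAttributeName_RatedPower" value="Rated Power" />
        <add key="AFAttributeName_ActivePowerPredicted" value="Active Power Predicted" />
        <add key="AF_WindSpeedCutOffValue" value="25" />
        <add key="WSAPIKey" value="your-web-service-api-key" />
        <add key="WSURL" value="https://your-web-service-post-url" />
      </appSettings>
    </configuration>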


Rebuild the solution after you update the parameters.
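
For reference, keys in an appSettings section are typically read at run time through the standard .NET ConfigurationManager. How the attached application actually reads them may differ, but a minimal sketch (requiring a reference to System.Configuration) looks like this:

    // Minimal sketch of reading the configuration keys listed in the table above.
    using System.Configuration;

    static class ConfigSketch
    {
        public static string AFServer => ConfigurationManager.AppSettings["PIAFServer"];
        public static string AFDatabase => ConfigurationManager.AppSettings["PIAFServerDB"];
        public static double WindSpeedCutOff =>
            double.Parse(ConfigurationManager.AppSettings["AF_WindSpeedCutOffValue"]);
    }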


Run the Application in historical mode to check the accuracy of the prediction model

We are now ready to use the application to import results from the Azure ML web service into PI. In this section we will run the application in historical mode (with From and To dates set in the past) in order to compare the historical Active Power with the predicted Active Power.

Follow the steps below to run the application:

  1. Open the file “AzureML_To_PI.exe”; the application shown below should open:

  2. Enter From and To dates in the past and then click Submit

  3. The application loops through all the turbines and, for each one, does the following (a simplified sketch of this loop appears after this list):

    1. Gets the Air Temperature values from PI for that turbine between the specified dates
    2. Gets the Wind Speed values from PI for that turbine between the specified dates
    3. Sends these values as inputs to the Azure ML web service, receives the predicted Active Power for each “Air Temperature / Wind Speed” pair, and stores those values in the Predicted Active Power tag for that turbine

  4. Wait for the application to complete the predictions for all the turbines

  5. Test the results: use any PI client tool (such as PI Coresight or PI ProcessBook) to compare the Active Power with the Predicted Active Power
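
To make the flow of that loop concrete, here is a simplified AF SDK sketch. It is an illustration rather than the attached application’s code: the server, database, element, and attribute names are placeholders, the value 25 stands in for AF_WindSpeedCutOffValue, the turbine elements are assumed to sit under a “Wind Farm” element, and CallWebService is a hypothetical stand-in for the web service call shown earlier.

    // Simplified per-turbine historical loop using the AF SDK (OSIsoft.AF):
    // read Air Temperature and Wind Speed history, score each pair, and write
    // the results to the Predicted Active Power attribute.
    using System;
    using System.Collections.Generic;
    using OSIsoft.AF;
    using OSIsoft.AF.Asset;
    using OSIsoft.AF.Data;
    using OSIsoft.AF.Time;

    static class HistoricalPredictionSketch
    {
        // Hypothetical stand-in for the Azure ML Request/Response call.
        static double CallWebService(double airTemp, double windSpeed) => 0.0;

        public static void Run(string fromTime, string toTime)
        {
            AFDatabase db = new PISystems()["MyAFServer"].Databases["WindFarmDB"];
            AFTimeRange range = new AFTimeRange(new AFTime(fromTime), new AFTime(toTime));

            foreach (AFElement turbine in db.Elements["Wind Farm"].Elements)
            {
                // Historical inputs for this turbine over the chosen period.
                AFValues airTemps = turbine.Attributes["Air Temperature"].Data
                    .RecordedValues(range, AFBoundaryType.Inside, null, null, false, 0);
                AFValues windSpeeds = turbine.Attributes["Wind Speed"].Data
                    .RecordedValues(range, AFBoundaryType.Inside, null, null, false, 0);

                AFAttribute predicted = turbine.Attributes["Active Power Predicted"];
                double ratedPower = Convert.ToDouble(turbine.Attributes["Rated Power"].GetValue().Value);

                var predictions = new List<AFValue>();
                for (int i = 0; i < Math.Min(airTemps.Count, windSpeeds.Count); i++)
                {
                    double temp = Convert.ToDouble(airTemps[i].Value);
                    double speed = Convert.ToDouble(windSpeeds[i].Value);

                    // Above the configured wind speed cutoff, use the turbine's
                    // Rated Power instead of calling the web service.
                    double power = speed > 25.0 ? ratedPower : CallWebService(temp, speed);

                    predictions.Add(new AFValue(predicted, power, airTemps[i].Timestamp));
                }

                // Store the predicted series in the Predicted Active Power tag.
                predicted.Data.UpdateValues(predictions, AFUpdateOption.Replace);
            }
        }
    }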


Run the Application in future mode to get the predicted values

We will now use the Azure ML web service to write future predictions into PI. These predictions are based on forecasted Temperatures and Wind Speeds at the farm level; because the forecasts in this example are only available at the farm level, all the turbines will have the same predicted Active Power. Also note that the forecasted Temperatures and Wind Speeds span the next 24 hours, with values available every 15 minutes.

To get the predicted future Active Power for all the turbines based on the farm-level forecasted Temperatures and Wind Speeds, go through the same steps as in the previous section, except for Step 2: instead of using dates in the past, use the current date in the From field and tomorrow’s date in the To field.
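
Continuing from the previous sketch, future mode only changes where the inputs come from and the time range they cover. Below is a hedged fragment under the same assumptions (same usings and placeholder names; the forecast attribute names come from the configuration table):

    // Future mode: same scoring logic as the historical loop, but the inputs
    // are the farm-level forecast attributes over the next 24 hours.
    static void RunFutureMode()
    {
        AFDatabase db = new PISystems()["MyAFServer"].Databases["WindFarmDB"];
        AFTimeRange futureRange = new AFTimeRange(new AFTime("*"), new AFTime("*+24h"));

        AFElement farm = db.Elements["Wind Farm"];
        AFValues forecastTemps = farm.Attributes["Forecast Air Temperature"].Data
            .RecordedValues(futureRange, AFBoundaryType.Inside, null, null, false, 0);
        AFValues forecastSpeeds = farm.Attributes["Forecast Wind Speed"].Data
            .RecordedValues(futureRange, AFBoundaryType.Inside, null, null, false, 0);

        // Each temperature / wind speed pair is then scored exactly as in the
        // historical loop, and the result is written to every turbine's
        // Predicted Active Power attribute with its future timestamp
        // (which requires those attributes to map to future PI points).
    }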