
Purpose

I have seen quite a few questions and some confusion over the last year regarding the timestamps returned when querying data with the PowerShell Tools for the PI System. Specifically, the timestamps returned do not match the client's time-zone settings the way they do in other PI client tools (PI DataLink, PI ProcessBook, PI Coresight, PI SMT, etc.). It also came to my attention that there was no clear explanation for this behavior in the documentation or here on PI Square.

 

Current time on test machine:

In PI SMT - values for SINUSOID over the last two hours:

Using PowerShell tools for the PI System - values for SINUSOID over the last two hours:

# connect to PI Data Archive
$con = Connect-PIDataArchive -PIDataArchiveMachineName "afink-pi1" 

# set start time to two hours ago
$starttime = Get-Date
$starttime = $starttime.AddHours(-2)
# set end time of now
$endtime = Get-Date

# query values of SINUSOID for last two hours
$values = Get-PIValue -PointName Sinusoid -StartTime $starttime -EndTime $endtime -Connection $con

# write values to console
Write-Output $values


Background/Explanation

The PI Data Archive stores all data using UTC time.  Therefore, when a PI client tool requests data for a tag, the values are returned with UTC timestamps.  It is the responsibility of the PI client to convert these times from UTC to local time based on the Windows time-zone settings on the client machine.  When a client requests data from the PI Data Archive, the timestamps on the returned values arrive at the client as System.DateTime .NET objects in UTC.  The client then calls the ToLocalTime() method on the returned values and populates the spreadsheet, trend, or other symbol with the timestamps in the client's local time zone.  This is the behavior of OSIsoft's most widely used client tools, such as PI DataLink, PI Coresight, and PI ProcessBook.
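
As a quick illustration of the conversion those clients perform (plain .NET, nothing PI-specific; the offset you see depends on your machine's Windows time-zone settings):

# A UTC System.DateTime and the same instant expressed in local time
$utc = [DateTime]::UtcNow
$utc.Kind               # Utc
$utc.ToLocalTime()      # converted using the client machine's time zone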

 

The PowerShell Tools for the PI System, however, do not automatically call ToLocalTime() on the returned values, so all timestamps are still shown in UTC by default.  This is what raises the questions and confusion described above.

 

Solution

Luckily for us, since PowerShell is also built on .NET, we simply need to apply the same steps the other PI client tools use, as described above.  The results of my SINUSOID query are stored in the variable $values.  If we select the "Timestamp" property of our resultant events and pipe it to the Get-Member cmdlet (aliased as "gm"), we can see the object type is indeed System.DateTime.  We can also see the various methods and properties associated with that object type.
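
In script form, the inspection described above looks something like this (using the $values variable from the query):

# Pipe the Timestamp property of the returned events to Get-Member (gm)
$values | Select-Object -ExpandProperty Timestamp | Get-Member
# TypeName: System.DateTime -- note the ToLocalTime() method in the member list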

 

 

Specifically, notice the ToLocalTime() method.  Using this method, we can verify that the values can now be returned with timestamps in the proper client time zone.
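
For example, a quick spot check along these lines shows the converted timestamps (a sketch, reusing $values from above):

# Convert each returned UTC timestamp to the client's local time zone
$values | ForEach-Object { $_.Timestamp.ToLocalTime() }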

 

Finally, since the ultimate goal is often to generate a .csv or .txt file with the resultant archive data, we can use a PowerShell trick, calculated properties, to replace our UTC timestamp column with a client-local-time column.

 

In fact, thanks to the PowerShell pipeline, we can select our properties, build our custom output column, and write the result to a .csv file all in a one-liner:

# selects the value, and creates a custom column for client tz timestamps and outputs to file C:\values.csv
$values | Select-Object -Property value,@{Name='Timestamp (Client Time Zone)';Expression={$_.timestamp.tolocaltime()}}  | Export-Csv -Path C:\values.csv -NoTypeInformation

 

Let's break this down a little bit... this one-liner can be broken down into three parts separated by the "|" character.  The pipe allows us to take the output of one cmdlet or script block and pass it as the input to the next cmdlet or script block.
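
The same pipeline idea works with any cmdlets, not just the PI ones; for example:

# Each cmdlet's output becomes the next cmdlet's input
Get-Process | Sort-Object -Property CPU -Descending | Select-Object -First 5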

 

Part 1:

$values |

 

Here we are simply setting our starting point.  We currently have our $values variable, which contains our SINUSOID values and UTC timestamps.  Keep in mind this variable also contains other properties, such as isGood, isAnnotated, and WriteMode.  We take all of this information and pass it to the next step:

 

Part 2:

Select-Object -Property value,@{Name='Timestamp (Client Time Zone)';Expression={$_.timestamp.tolocaltime()}}  |

 

In this case, we are only interested in the value and the new client-time-zone timestamp for our resultant .csv file.  Therefore, from our query results we select only the "value" column and a new column, "Timestamp (Client Time Zone)", which we create by specifying the Name of the new column and the Expression to evaluate for it.  In PowerShell, we add the new column as a calculated property defined by a hash table, which is why we must use the @{Name='';Expression={}} syntax.  The operation we want to perform to generate the data for the "Timestamp (Client Time Zone)" column is the ToLocalTime() call discussed previously.  Inside the expression, "$_" refers to the current object coming down the pipeline, which here is one of the events stored in $values.  Therefore, for each event we take its "timestamp" property and call ToLocalTime() on it, producing the new "Timestamp (Client Time Zone)" column.
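
Calculated properties are not specific to PI data; the same pattern works for any object coming down the pipeline, for example:

# Add a calculated "SizeKB" column next to each file name
Get-ChildItem -File | Select-Object -Property Name, @{Name='SizeKB';Expression={[math]::Round($_.Length / 1KB, 1)}}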

 

For more information on calculated properties in PowerShell, check out the following TechNet article:

< Windows PowerShell Tip: Using Calculated Properties >

 

Part 3:

Export-Csv -Path C:\values.csv -NoTypeInformation

 

Now that we have selected only the columns we are interested in from the previous step ("value" and the calculated "Timestamp (Client Time Zone)" column), we can export the result to a .csv file using Export-Csv and specifying a path for the output file.  The -NoTypeInformation flag prevents object type information from being prepended to the .csv file header.  This extra line in the .csv file is often cumbersome, so we remove it this way.  Without the -NoTypeInformation flag, the output file will look like this:
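
Roughly, the top of the file would then start like this (the exact type name depends on the object type returned by Get-PIValue, so it is shown here only as a placeholder):

# First lines of C:\values.csv when -NoTypeInformation is omitted
#TYPE Selected.<TypeNameOfTheReturnedEventObjects>
"value","Timestamp (Client Time Zone)"
...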

 

Now we have a .csv export of our desired PI data, with timestamps properly expressed in our client time zone, which we can use for further analysis in other programs.  Furthermore, since we were able to do this in PowerShell, we can run our queries in parallel, against multiple servers, and in a fully automated fashion.

 

The full script would look something like this:

# connect to PI Data Archive
$con = Connect-PIDataArchive -PIDataArchiveMachineName "afink-pi1" 

# set start time to two hours ago
$starttime = Get-Date
$starttime = $starttime.AddHours(-2)
# set end time of now
$endtime = Get-Date

# query values of SINUSOID for last two hours
$values = Get-PIValue -PointName Sinusoid -StartTime $starttime -EndTime $endtime -Connection $con

# write values to console
Write-Output $values

# selects the value, and creates a custom column for client tz timestamps and outputs to file C:\values.csv
$values | Select-Object -Property value,@{Name='Timestamp (Client Time Zone)';Expression={$_.timestamp.tolocaltime()}}  | Export-Csv -Path C:\values.csv -NoTypeInformation


 

Happy slinging!

 

- Adam Fink


Osi to the cloud

Posted by KristofDM Dec 15, 2016

So, we are moving our entire PI infrastructure to the cloud, or at least we are trying to...

We are looking at IaaS and PaaS where possible.

I'll write down my successes (and setbacks) here to help community members who might have the same idea.

 

Please note that not everything we do here is 'supported' by OSIsoft... (yet!).

 

First, some info on the environment.

We have a PI Data Archive 3.4.935.72 (2015 R2) server,

PI Coresight,

PI AF,

a PI SQL Data Access Server,

a 'heavy consumer app' server,

and a SQL Server for Coresight and AF.

 

Testing:

To make the most out of it, I wanted to have the AF and Coresight databases hosted in PaaS.

This is not yet supported and was only tested on an older PaaS SQL edition (pre-v12).

We have a VPN with Azure and are able to join Azure VMs to our local AD.

 

PI Data Archive

I went ahead and installed the PI Data Archive on an Azure VM with SSD storage.

No problems here.

 

PI Coresight

Installed Coresight on an Azure VM and adapted the connection strings to use PaaS.

Migrated the on-premises database to PaaS using SQL Server Management Studio.

Issues:

1: The DvService user (in the Coresight SQL database) is linked to a SQL login from the domain!

=> Not supported in Azure SQL Database; dropped and recreated the user on PaaS (see the sketch after the connection string below).

2: The Coresight connection string to SQL in web.config was adapted so that an Azure database can be used.

code:

<add name="CoresightConnectionString"  connectionString="Data Source=yourPaaSSQlServer.database.windows.net;Initial Catalog=YourPaasDatabaseName;User ID=DatabaseUSer;Password=DatabaseUSerPassword;Encrypt=true;Trusted_Connection=false; Application Name='Coresight Web Service'" />
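
Going back to issue 1: a minimal sketch of recreating the DvService user as a contained database user, assuming that is acceptable in your environment (the admin login, password, and server/database names are placeholders; the Coresight database roles/permissions for the user still have to be re-granted afterwards):

# Recreate DvService as a contained database user inside the Azure SQL database
$sql = @"
DROP USER IF EXISTS [DvService];
CREATE USER [DvService] WITH PASSWORD = 'StrongPasswordHere';
"@
Invoke-Sqlcmd -ServerInstance "yourPaaSSQlServer.database.windows.net" -Database "YourPaasDatabaseName" `
    -Username "YourAzureSqlAdmin" -Password "YourAdminPassword" -Query $sql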

 

It works! The Coresight overview page no longer shows any errors.

 

PI AF Server

Moving the AF database to cloud SQL PaaS.

I probably have an advantage here since I'm a classic database admin, so if anything is not clear, shoot!

 

The AF database in SQL Server uses some stored procedures/functions to create a replica, get the status of the SQL Agent, get the sysadmin name, etc.

Since these are all calls to the database instance (the SQL engine), they will no longer work in a PaaS database; for obvious reasons, Microsoft does not allow these kinds of queries.

The instance catalog is not the traditional SQL Server kind and thus is unusable.

 

Since this is a tricky part, we made a copy of the database on our local SQL Server (PiAfCopy) and made the following changes (a sketch of the clean-up script follows the lists below):

 

Changed the stored function that returns the sa user name so that it always returns 'SA' as a string value.

Changed the SQL Agent check to always return 1.

Dropped the procedures for the replica (OSIsoft discourages using it anyway).

Changed all Windows logins to SQL logins.

 

Adapted Functions:

fn_getSaName

IsSqlAgentEnabled

 

Dropped Stored Procedures:

[dbo].[USP_AUDIT_ENABLED]

[dbo].[usp_replicate_primary_createdistributor]

[dbo].[USPC_CDC_ENABLE]
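
A rough sketch of the clean-up script for the dropped procedures, run against the local PiAfCopy copy before deployment (this assumes the SqlServer PowerShell module and uses "localhost" as a placeholder for your local instance; the two functions listed above still have to be ALTERed by hand, since their exact definitions vary by AF version):

# Drop the AF stored procedures that rely on instance-level features not
# available in Azure SQL Database; fn_getSaName and IsSqlAgentEnabled are
# altered separately so they return a fixed 'sa' and 1 respectively
$cleanup = @"
DROP PROCEDURE [dbo].[USP_AUDIT_ENABLED];
DROP PROCEDURE [dbo].[usp_replicate_primary_createdistributor];
DROP PROCEDURE [dbo].[USPC_CDC_ENABLE];
"@
Invoke-Sqlcmd -ServerInstance "localhost" -Database "PiAfCopy" -Query $cleanup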

 

Now, using SQL Server Management Studio 2016, I deployed the database to the PaaS SQL server.

Issue: out of memory.

This can be quite intensive on the client computer, so having enough RAM is essential!

If you don't have enough, try closing as many apps as possible.

You could also do this on the database server itself, but it needs an internet connection!

 

After some time (it takes a long time) the database should be deployed in Azure PaaS.

 

Installing AF Server + System Explorer!

 

We've set up an IaaS server and installed AF as we normally do. We disabled the database execution utility and disabled the connection check while specifying the database server.

We just specified the PaaS SQL server.

 

Installation went smoothly.

 

The connection string needed to be changed, this time in AFService.exe.config.

It is located in C:\Program Files\PIPC\AF. It is somewhat different from the Coresight one...

 

    <add key="connectString" value="Persist Security Info=False;server=YourPaasSqlServer.database.windows.net,1433;database=YourPaasDatabase;USER id=DatabseUser;PASSWORD=DatabaseUserPassword; Application Name=AF Application Server;" />

 

Restarted AF and...

 

It worked!

Initially I had an error while creating new elements:

"cannot check in model with unique id {Some very long SID} because there is another model with an identical name"

But after retrying a few times, it vanished!

 

Next steps:

Analysis Service, Notifications, PI SQL Data Access Server.

To be continued!

 

 

SQL DATA ACCESS (PIOLEDB)

After installing the PI data access layer on an IaaS server, running PI SQL Commander fails.

Under the hood, PI SQL Commander tries to use the (undocumented) system stored procedure sp_oledb_ro_usrname.

The sp_oledb_ro_usrname stored procedure returns the OLE DB database read-only status.

These kinds of queries are not supported on Microsoft Azure SQL PaaS.

 

Here ends the SQL PaaS try-out.

 

We'll look at hosting the SQL on-premises...


Excel Is Our Friend – Part V

Posted by rkoonce Employee Dec 14, 2016



Congratulations! You have escaped Interface Island! And if you are ever caught on another similar island, you have developed the skills to survive and eventually escape. The bad news is that some of your toolbox was left behind on that island. When you find yourself in need of the tools there, you could always swing by and get a “copy.” You left your shelter behind, though, and it will have to be rebuilt: on a new island, project references will need to be re-added and functions re-created in the new island’s project. But what if there were a way to take the recyclables with you? Like that user-defined function? That might come in handy anywhere. So, the good news is that you can have your cake and eat it too.

How to: Making Your Custom Spreadsheet Functions Available Anywhere

To use a custom function, the workbook containing the module in which you created the function must be open. If that workbook is not open, you get a #NAME? error when you try to use the function. Even if the workbook is open, if you use the function in a different workbook, you must precede the function name with the name of the workbook in which the function resides. For example, if you create a function called DISCOUNT in a workbook called Personal.xlsb and you call that function from another workbook, you must type =personal.xlsb!discount(), not simply =discount().

You can save yourself some keystrokes (and possible typing errors) by selecting your custom functions from the Insert Function dialog box. Your custom functions appear in the User Defined category:

ExcelFig7.png

In the instructions above, an .xlsb workbook is referenced, but the screenshot shows an .xlsm… my workbook was created long ago, originally as an .xls (I am a dinosaur).

* Tip: An even easier way to make your custom functions available at all times is to store them in a separate workbook and then save that workbook as an add-in. You can then make the add-in available whenever you run Excel. Detailed instructions are found at the office.microsoft.com site linked here. This varies slightly from my own “old skool” methods, but I agree with the authors that the new way seems easier. I’ll be converting to an add-in when I get the chance.


Note: An .xlsb extension indicates the Excel Binary Format. The .xlsm extension denotes a macro-enabled workbook. An .xlam extension indicates an add-in. All of these formats require a trusted location and/or a much more complicated trust for the publisher.

  • How to: Let Excel Teach You VBA


All of us can learn a lot of VBA by using the “Record Macro” function. After recording your actions within Excel, just examine the VBA code afterward and learn. It’s like pulling the curtain back on the Wizard of Oz. Prior to VBA, I was using Excel 4’s macro language, and the macro recorder was instrumental for me, personally, when first learning VBA. I still use it on occasion.


* On the Developer Ribbon, click “Record Macro”.
* Note the Macro Name and location.
* Perform some actions within the workbook, select a range of cells, change the font, change the number format, type some text or a formula.
* Click the Stop Recording button (same location as above).
* Examine the code that was generated.

ExcelFig8.png

  • FizzBlog Exercise:


  • Create a new macro-enabled workbook.
  • Copy the mySplitter function into this new workbook (the functions that we created earlier in this blog series).
  • Save this workbook as an Excel Add-In > “MyFunctions.xlam”
  • In Excel > File > Options > Add-ins, load the newly created add-in.
  • Create a new workbook. Click Insert Function
  • FizzBlog Challenge:


  • In the dialog to select the user defined functions, is the function name prepended with the add-in worksheet name? workbook name? Or none? Prove it with a screenshot.
  • Were you able to make it work with a regular workbook? Or does it only work in macro-enabled workbooks?
  • Did you have to do anything special to make it work? (Did I leave out any significant steps in the FizzBlog Exercise?) If so, please explain.
  • Sharing your recyclables:


  • If you have a useful, reusable function (or several), please share it below.
  • Perhaps you just have an idea for one; please post it as a comment! (Santa might be reading this thread?)


This entry concludes my blog series for "Excel Is Our Friend" (perhaps I will append it someday). I hope that some of you have learned a few things along the way. Regardless, I really hope you enjoyed the series.

I plan to begin another series soon, but first I will have to choose my next destination. I have several ideas that I have been kicking around, but I would welcome any suggestions regarding any "islands" you would like to explore.


Cheers! (from the grass hut)
