
Hardware LifeCycles (OpenVMS)

Posted by chuck Employee Mar 29, 2016

A couple of years ago I posted articles asking about the state of OpenVMS and PI System dependencies on OpenVMS within our PI System user community.  Now that a couple of years have passed, I found myself thinking about this again just the other day.  There still remain a number of PI Systems, mostly PI3 servers, which depend on interface nodes or application nodes running OpenVMS.  A good number of these OpenVMS systems, based on VAX and Alpha platforms, are considered mission or business critical.

 

Just how long can we expect this hardware we type on and archive PI data with to keep running?  Let's think about just the disk drives for a moment…

Disk drives for "modern" OpenVMS systems advertised very long lifetimes and smart hardware.  Disk drives in OpenVMS systems that were new in the late 1990s and early 2000s were quite capable, with advanced features implementing RAID and caching in the drive hardware itself; other computing systems of the same era implemented such features in disk controllers, in processor-board microcode, or in operating system disk device drivers.

 

A typical Wide SCSI drive from this era might have advertised 800,000 hours MTBF (nearly 100 years!).  However… that was a very big "however": MTBF was affected by handling, storage, power-up method (and power failures), error rate, and so on.

  • Nearly 20% of storage capacity of such drives was reserved for system/hardware functions such as engineering partition, dedicated head landing zone, sectors reserved for error replacement, and auto calibration/head alignment functions.
  • These disks anticipated an average of as many as 100 sectors failing per year of operation. 
  • The way DC power was applied to the drive and the manner of application of the first SCSI controller commands after power up affected drive life and lifecycle.

As a consequence of these features, disk life was expected to be only 5 years in normal use.  Here at OSIsoft we have seen the average lifetime of these disks range from 5 to 8 years.

 

Similarly limited lifetimes apply to our OpenVMS server's disk controllers, memory boards, processor boards and so on.  The last VAX/VMS system was built in September and shipped in December of 2000.

And we know the last Alpha OpenVMS system was ordered in April 2007 and shipped later that year.  We can expect nearly all VAX and Alpha based OpenVMS systems to be past end of life at this point in time.

 

Reference:
Seagate Barracuda ST15150N specification sheet and reference manual

Hi all,

 

I have created a .Net application that can help in the maintenance of PI Datalink Reports in Excel.   Some of you may find this extremely useful.

 

This program was developed out of the need for a project that I was working on where we are renaming around 100,000 PI tags as part of a PI tag standardization effort.  Renaming PI tags has a detrimental effect on PI DataLink reports and can also impact VBA code that may contain PI tags.

 

I have a screen shot of the GUI below that shows where you select an input folder which contains all the Excel files that you want to convert, a conversion file which contains all the old and new PI tag names, and an output folder to store the converted files.  I have tested this with 100,000 tag names.  You can easily handle all of the Excel reports in your system in a single batch, test them, and then reload them to your production folder afterwards.

 

You can get a trial version of the software at http://www.exceltextswitcher.com

 

Enjoy...

 

 

Brad

The LazyPI project was created to provide an API that feels more like the native AF SDK. The project has come along nicely and is ready for testing. Testing requires the development of unit tests and access to a PI Web API server. The currently implemented pieces should allow for basic management of AFElements, AFEventFrames, and AFAttributes. LazyPI will simplify interactions with the PI Web API and reduce the required knowledge of web APIs.

 

Contributors Needed

If you might be interested in helping to test or expand the library, please see the GitHub page. If you would like to contribute in any way, please fill out this super short application.

 

Outstanding Tasks

  • Find and report bugs
  • Create Unit Tests for all implemented pieces
  • Implement "Controller" objects that are missing (ex. AFTable, Calculations)
  • Iron out design flaws

 

Disclaimer

This code is still not ready for production as of Feb. 15, 2016. Please do not use LazyPI in its current state for important projects.

There have been a few posts about getting data into the PI System from devices such as the Raspberry Pi or Arduino using the UFL interface and PI Web API (see here, here and here).  In Carlos's post he mentions projects using the HTML interface too, though I couldn't find any details on it, so I thought I would share some of my experiences using the HTML interface to bring data into the PI System from a Raspberry Pi.

 

Overview

 

A few months ago I had some free time and wanted to learn more about both Python and SQL, so I started playing around with both on my Raspberry Pi at home while I was waiting for a temperature sensor to arrive (postage to Australia can be slow!).  I started by using Python to read some performance statistics from my Raspberry Pi and write them to a MySQL database on the device.  I then used PHP to access the data in MySQL from a webpage, see below:

 

Fast forward a little while to the new year period: I again had some free time on my hands, so I decided to configure an HTML interface to read this data into my PI Server. The data flow now looks like this:

 

 

The Raspberry Pi is a Model B running Raspbian, the HTML interface is version 2.3.0.71, and the PI Server (AF and Data Archive) are 2015 R2.

 

Implementation

 

Once MySQL and Apache were installed, I used the following Python script to read the performance statistics and write them to the MySQL database. I used crontab to make the script run every minute.

 

#! /usr/bin/env python

import subprocess, os, datetime, sys, statvfs
import MySQLdb as mdb


def get_ram():
    "Returns a tuple (total ram, available ram) in megabytes"
    try:
        s = subprocess.check_output(["free", "-m"])
        lines = s.split('\n')
        return (int(lines[1].split()[1]), int(lines[2].split()[3]))
    except:
        return 0


def get_process_count():
    "Returns the number of processes"
    try:
        s = subprocess.check_output(["ps", "-e"])
        return len(s.split('\n'))
    except:
        return 0


def get_up_stats():
    "Returns a tuple (uptime, 5 min loading average)"
    try:
        s = subprocess.check_output(["uptime"])
        load_split = s.split('load average: ')
        load_five = float(load_split[1].split(',')[1])
        up = load_split[0]
        up_pos = up.rfind(',', 0, len(up) - 4)
        up = up[:up_pos].split('up ')[1]
        return (up, load_five)
    except:
        return ('', 0)


def get_connections():
    "Returns the number of network connections"
    try:
        s = subprocess.check_output(["netstat", "-tun"])
        return len([x for x in s.split() if x == 'ESTABLISHED'])
    except:
        return 0


def get_temperature():
    "Returns the temperature in DegC"
    try:
        s = subprocess.check_output(["/opt/vc/bin/vcgencmd", "measure_temp"])
        return float(s.split('=')[1][:-3])
    except:
        return 0


def get_ipaddress():
    "Returns the current IP address"
    arg = 'ip route list'
    p = subprocess.Popen(arg, shell=True, stdout=subprocess.PIPE)
    data = p.communicate()
    split_data = data[0].split()
    ipaddr = split_data[split_data.index('src') + 1]
    return ipaddr


def get_cpu_speed():
    "Returns the current CPU speed"
    f = os.popen('/opt/vc/bin/vcgencmd get_config arm_freq')
    cpu = f.read()
    return cpu


def get_current_time():
    "Returns the current time as a string"
    ctime = datetime.datetime.now().strftime('%Y-%m-%d %H:%M:%S')
    return ctime


def get_free_MB():
    "Returns the number of available MB on the SD card and % free"
    x = os.statvfs(".")
    freeMbytes = x[statvfs.F_BSIZE] * x[statvfs.F_BAVAIL]
    totalMbytes = x[statvfs.F_BSIZE] * x[statvfs.F_BLOCKS]
    # convert bytes to MB
    freeMbytes = freeMbytes / (1024 * 1024)
    totalMbytes = totalMbytes / (1024 * 1024)
    PCfree = 100 * (float(freeMbytes) / float(totalMbytes))
    return (freeMbytes, PCfree)


FreeRam = get_ram()[1]
Processes = get_process_count()
UpTime = get_up_stats()[0]
Connections = get_connections()
Temp = get_temperature()
Time = get_current_time()
FMB = get_free_MB()
FreeMB = FMB[0]
PCFreeMB = FMB[1]

# Insert values into SQL
con = None
try:
    con = mdb.connect('localhost', 'XXXXXX', 'XXXXXX', 'perfmon')
    cur = con.cursor()
    cur.execute("INSERT INTO NalaStats (id,timestamp,freeram,processes,uptime,connections,temp,freeMB,PCfreeMB) VALUES (NULL,%s,%s,%s,%s,%s,%s,%s,%s);", (Time, FreeRam, Processes, UpTime, Connections, Temp, FreeMB, PCFreeMB))
    con.commit()
except mdb.Error, e:
    print "Error %d: %s" % (e.args[0], e.args[1])
    sys.exit(1)
finally:
    if con:
        con.close()
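For reference, the once-a-minute schedule mentioned above can be set up with a crontab entry along these lines (the script path here is only an assumed example):

```
# crontab -e : run the collector every minute
* * * * * /usr/bin/python /home/pi/perfmon.py
```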

 

I also set up the following script to run every week to delete data in MySQL which was more than 14 days old so as not to fill up all the space on the SD card.

 

#! /usr/bin/env python

import datetime, sys
import MySQLdb as mdb

# Number of days to store data
TimeRange = 14

# Delete historical data from SQL
con = None
try:
    con = mdb.connect('localhost', 'XXXXXX', 'XXXXXX', 'perfmon')
    cur = con.cursor()
    cur.execute("DELETE FROM NalaStats WHERE TIMESTAMPDIFF(DAY,timestamp,NOW()) > %s;", (TimeRange,))
    con.commit()
except mdb.Error, e:
    print "Error %d: %s" % (e.args[0], e.args[1])
    sys.exit(1)
finally:
    if con:
        con.close()

 

The code for the webpage is below. It grabs the data from MySQL and also does some aggregate calculations (though I don't read those into PI).

 

<html>
  <body>
  <h1>Nala Performance Monitor</h1>
  <?php
  // Connection information
  $server ="localhost";
  $username = "XXXXXX";
  $password = "XXXXXX";
  $dbname = "perfmon";


  // Create connection and select database
  try {
      $DBH = new PDO("mysql:host=$server;dbname=$dbname",$username,$password);
  } catch(Exception $e){
      echo $e->getMessage();
  }


  // Prepare to query MySQL


  // Get most recent values from MySQL
  $STH_Rec = $DBH->query("SELECT * FROM NalaStats ORDER BY id DESC LIMIT 1");
  // Get averages from MySQL
  $STH_1hrAvg = $DBH->query("SELECT AVG(temp) FROM (SELECT temp FROM NalaStats ORDER BY id DESC LIMIT 60) AS temperature");
  $STH_24hrAvg = $DBH->query("SELECT AVG(temp) FROM (SELECT temp FROM NalaStats ORDER BY id DESC LIMIT 1440) AS temperature");


  // Set the fetch mode
  $STH_Rec->setFetchMode(PDO::FETCH_ASSOC);
  $STH_1hrAvg->setFetchMode(PDO::FETCH_ASSOC);
  $STH_24hrAvg->setFetchMode(PDO::FETCH_ASSOC);


  // Fetch data
  $Rec = $STH_Rec->fetch();
  $Temp1hrAvg = $STH_1hrAvg->fetch();
  $Temp24hrAvg = $STH_24hrAvg->fetch();


  // Print values
  print "<P>Measurement #: <B>" . $Rec['id'] . "</B></P>";
  print "<P>Timestamp: <B>" . $Rec['timestamp'] . "</B></P>";
  print "<P>Free RAM: <B>" . $Rec['freeram'] . "</B> MB</P>";
  print "<P>Free disk space: <B>" . $Rec['freeMB'] . "</B> MB</P>";
  print "<P>Free disk space: <B>" . round($Rec['PCfreeMB'],1) . "</B> %</P>";
  print "<P>Processes: <B>" . $Rec['processes'] . "</B></P>";
  print "<P>System Uptime: <B>" . $Rec['uptime'] . "</B></P>";
  print "<P>Connections: <B>" . $Rec['connections'] . "</B></P>";
  print "<P>Temperature: <B>" . $Rec['temp'] . "</B> DegC</P>";
  print "<P>Temperature 1 hour average: <B>" . round($Temp1hrAvg['AVG(temp)'],1) . "</B> DegC</P>";
  print "<P>Temperature 24 hour average: <B>" . round($Temp24hrAvg['AVG(temp)'],1) . "</B> DegC</P>";


  // Cleanup
  $DBH = null;
  ?>
  </body>
</html>

 

Results

 

Here is an AF element template I created for my Raspberry Pi performance monitor. It is configured so that the tags can be created automatically: to roll out an element from this template, the user just needs to enter the data in the 'Tag Configuration' attributes and press 'Create or Update Data Reference', and all tags will be created.

Here are the current tag values. Note that 226,693 measurements at 1 measurement/minute is around 157 days.  The longest uptime I have had so far is around 100 days, showing that the platform is fairly stable (being in my living room, the biggest threats it faces are me doing the vacuuming and my cat's curiosity).

An interesting thing I found was that it is possible to determine when I had the air conditioning on by looking at the CPU temperature of the Raspberry Pi; you can see in the plot below that the temperature drops markedly (more than 3 degrees in about 15 minutes) when the air conditioning is turned on.

 

Another interesting observation is the behaviour of the free disk space on the Raspberry Pi: data is inserted into a SQL table every minute and old data is deleted once a week, and you can see this pattern reflected in the free space available.

 

Final Thoughts

  1. It was interesting to see how I could accurately determine whether the air conditioning was on based on a measurement of the core temperature rather than measuring the air temperature directly.  It goes to show how much information you can get from data if it is available.  I think my next steps may be to use analyses and event frames to see what else I can glean from this.
  2. The timestamp of my measurements is consistently 2 seconds past the minute even though the script is scheduled to run on the minute.  This probably reflects a combination of the relatively inefficient way I have implemented this and the limitations of the Raspberry Pi.
  3. Configuring the HTML interface was really easy because I had complete control of both the interface configuration and the data source (the web page).  Having this degree of control is relatively uncommon in my experience, but it helped a lot in this case.  The reason the values on the webpage are in bold (first figure above) is that I used the bold HTML markers to locate the values with the HTML interface.

There is no way to send a negative number directly to a digital PI tag. Or more accurately, sending a negative integer to a digital state tag will result in a system digital state being written to the tag. For more details on why that happens, see this KB article.

 

Scott Robertson recently introduced me to a clever work-around for this issue. Instead of sending the negative number directly to the digital state tag you can instead do the following:

  1. Store the value in an integer PI Tag. We will call this PI Tag JANET.
  2. Reference JANET in an AF attribute. We will call the AF attribute CHRISSY.
  3. Use an enumeration set for CHRISSY that translates the integer into a string, and then you have two options:
    1. Write the string from CHRISSY back to a second digital PI Tag using a PI Analysis Service Expression. We will call the second PI Tag JACK.
    2. Use CHRISSY directly and save yourself a PI Tag.

 

If you do not have PI AF Server at your site, then you have one option:

  1. Store the integer value in JANET like before.
  2. Create JACK as a digital Performance Equation (PE) tag that uses a long, nested IF THEN ELSE statement to perform the translation from the integer to the digital state.
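As a rough illustration of option 2, the nested Performance Equation might look something like the sketch below. This is an untested, assumption-laden sketch: the state names ("Reverse", "Stopped", "Forward") are made up, and the exact PE syntax should be verified against your PI Server documentation.

```
IF 'JANET' = -1 THEN "Reverse"
ELSE IF 'JANET' = 0 THEN "Stopped"
ELSE IF 'JANET' = 1 THEN "Forward"
ELSE "Unknown"
```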

 

If you are new to enumeration sets, then these videos will help you get started:

Most users assume that setting compression deviation to 0 will mean that even the smallest change in a tag's value is still archived and that repeated values are discarded. This assumption is valid for the vast majority of data. However, there are two special cases where that assumption is not true. In an earlier blog post I described why a user can see repeated values in his or her archive even when compression deviation is set to 0. In this post I will address the other special case where changes in a tag's value are not archived even when the compression deviation is set to 0.

 

Let's say we have incoming data from an integer tag that looks like this:

Timestamp   Value
00:00:01    1
00:00:02    2
00:00:03    3
00:00:04    4
00:00:05    5
00:00:06    5

We would expect to see each value in the archive because the value is changing. What we actually see in the archive is the following:

Timestamp   Value
00:00:01    1
00:00:05    5
00:00:06    5

The values 2, 3, and 4 are compressed out because they fall exactly on the line between 1 and 5. In other words, their deviation from the slope of the line is 0, which is equal to the specified compression deviation; therefore, they do not pass compression. This is the key difference between turning compression off and setting the compression deviation to 0. With compression off, all values are archived. If compression is on and the compression deviation is set to 0, then all values whose deviation* is greater than the compression deviation are kept.
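To make the arithmetic concrete, here is a minimal Python sketch of this zero-deviation test. It is a simplification for illustration only, not OSIsoft's actual compression algorithm: a held value is discarded when its deviation from the line between the last archived value and the newest value does not exceed the compression deviation.

```python
def compress(points, comp_dev=0.0):
    """points: list of (time, value); returns the values kept in the archive."""
    archived = [points[0]]          # the first value is always archived
    held = points[1]                # the current 'snapshot' candidate
    for new in points[2:]:
        t0, v0 = archived[-1]
        tn, vn = new
        slope = (vn - v0) / float(tn - t0)   # line from last archived to newest
        th, vh = held
        deviation = abs(vh - (v0 + slope * (th - t0)))
        if deviation > comp_dev:    # strictly greater: the held value is kept
            archived.append(held)
        held = new                  # the newest value becomes the held snapshot
    archived.append(held)           # the final snapshot is archived
    return archived

# The integer data from the table above (timestamps as seconds)
data = [(1, 1), (2, 2), (3, 3), (4, 4), (5, 5), (6, 5)]
print(compress(data))               # [(1, 1), (5, 5), (6, 5)]
```

Running it on the table above reproduces the archive shown: 2, 3, and 4 sit exactly on the line and are dropped, while 5 at 00:00:05 deviates from the line to (6, 5) and is kept.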

 

Note that we have not lost any information after compression. If we were to interpolate for the value at 00:00:02, then we would get a value of 2, which is exactly the value that entered the snapshot and was compressed out.

 

*that is the deviation from the line drawn between the most recent archive value and the newest value

First, if you haven't seen OSIsoft: Exception and Compression Full Details - YouTube, then go watch it now because the rest of this post assumes you have seen it.

 

I have gotten this question a few times from customers, "I see repeated values in my archive even though I have compression turned on. I thought compression prevented repeated values from occurring. What is wrong with my compression settings?" Chances are, there isn't anything wrong with your compression settings.

 

There are two possible reasons you are seeing repeated values:

  1. Your compression maximum value is taking effect.
  2. The values in the archive are necessary to avoid losing information.

 

The CompMax value for a tag determines the maximum amount of time between data points in the archive. By default it is set to eight hours. If no events pass the compression test for eight hours, then the current snapshot will be archived as soon as the next value is received. This means that the time between values in the archive will always be less than or equal to eight hours because the timestamp of the current snapshot must be at or before the eight hour mark. If you are looking in your archive and seeing identical values whose timestamps are almost but not quite 8 hours apart, then you are probably exceeding CompMax.

 

If the spacing between repeated values is much less than your CompMax setting, then you are probably running into the second scenario. As an example, let's say you have a tag writing a value of 1 or 0 once a second, and your archive data looks like this:

Timestamp   Value
00:00:01    1
01:00:00    1
01:00:01    0
02:00:00    0
02:00:01    1
03:00:00    1

And your question is, "Why is the data 1,1,0,0,1,1 instead of 1,0,1?" If we plot the first data set that has the repeated values, and the Step attribute for the tag is set to OFF, then we should see a plot that looks like this:

Note that I have altered the spacing on the x-axis to highlight the transitions between 1's and 0's. If we plot the second data set that does not have repeated values, then we should see a plot that looks like this:

Now we have lost information. The first plot showed that the value held steady for an hour, changed suddenly, held steady at the new value, and then changed suddenly back. In the second plot, the values changed gradually over the course of an hour, and then gradually changed back.
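A quick way to see the information loss is to interpolate midway along the first plateau under both archives. This is an illustrative sketch (timestamps converted to minutes, data from the tables above):

```python
def interp(points, t):
    """Linear interpolation over a list of (time, value) pairs."""
    for (t0, v0), (t1, v1) in zip(points, points[1:]):
        if t0 <= t <= t1:
            return v0 + (v1 - v0) * float(t - t0) / (t1 - t0)

# times in minutes past 00:00
with_repeats    = [(0, 1), (60, 1), (61, 0), (120, 0), (121, 1), (180, 1)]
without_repeats = [(0, 1), (61, 0), (121, 1)]

print(interp(with_repeats, 30))     # 1.0 : the value held steady
print(interp(without_repeats, 30))  # roughly 0.5 : looks like a gradual ramp
```

With the repeated endpoints kept, the 00:30 value is correctly 1; without them, interpolation invents a gradual ramp that never happened.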

 

Many users think that setting compression deviation to 0 and leaving compression on will eliminate all repeated values.  This is not without cause: at 10:03 in the video mentioned at the beginning of this article, the narrator states, "[Setting] compression to on [and setting] the compression deviation to zero... means that successive identical values will not be archived." This statement is true the vast majority of the time, but not always. In our first plot above, only two values in an entire hour's worth of data were archived; the majority of repeated values were discarded. We had to keep the two values at the start and end of each horizontal line in order to show that the value did not change for the whole hour. As we saw in the second plot, discarding the end point of each horizontal section completely alters the information stored in the archive.

 

And that is why you see repeated values in your archive even when compression deviation is set to zero.

I've seen a lot of questions regarding URLs from notifications to Coresight and decided to write a blog post on the topic.

During my tests I found two main ways of doing this, but I have to say I prefer the built-in "Web Link" feature in notifications.

The second option is to specify the URL in an attribute and add the attribute value to the notification message. This works if you don't have any spaces in your server or database name, but that wasn't the case in my environment, so this is how it turned out:

How it’s done

First of all, I created a ProcessBook ERD display. As you may be familiar with, it is possible to manipulate the element of interest in Coresight by specifying it in the URL with "?CurrentElement=\\SERVER\DATABASE\PathToElement\". (If you don't want to use a ProcessBook ERD display, just use "?Asset=\\SERVER\DATABASE\PathToElement\".)

 

First, publish your ProcessBook display to Coresight.

Then you'll need to navigate to that display in order to get the correct URL.

Copy the URL and start configuring the notification.

On your right hand side click “Add->Web Link->Other”

Display name: Open in Coresight

Link address: the complete URL to the Coresight display, for example "https://democor/coresight/#/PBDisplays/55/"

 

Then add a parameter.
For ProcessBook display the parameter name is “CurrentElement”
For Coresight display the parameter name is “asset”
The value is simply the Target (path)

Just click OK and drag the link into the notification message
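Put together, the final link the recipient clicks might look something like this (the server, display ID, and element path below are placeholders based on the examples above):

```
https://democor/coresight/#/PBDisplays/55/?CurrentElement=\\SERVER\DATABASE\PathToElement\
```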

 

Verification


 

 

Using these kinds of URL parameters in notifications could also open up possibilities to specify the start and end time for Coresight (parameters: StartTime=*-1h&EndTime=*). You could for example use start time of the notification or create an analysis that writes a start-end time to an attribute before sending the e-mail.

A customer stopped by the PI ProcessBook pod on Wednesday asking how to grab just the M/DD part of a timestamp from a Value symbol with VBA. Here's a quick example when you have a value symbol on your display called Value1.

 

Sub Test()
    Debug.Print showMonthAndDayOnly(Value1)
End Sub

'takes a value symbol and returns M/DD string for the timestamp
Function showMonthAndDayOnly(valueSymbol As Value) As String

    Dim vValue As Variant
    Dim vTime As Variant
    Dim vStatus As Variant

    vValue = valueSymbol.GetValue(vTime, vStatus)

    Dim sDate As String
    Dim i As Integer

    'finds location of 2nd "/" in the timestamp's date
    i = InStr(1, vTime, "/")
    i = InStr(i + 1, vTime, "/")

    showMonthAndDayOnly = Mid(vTime, 1, i - 1)

End Function

We have made our PI Developer Technologies more easily available. Anyone with an account on the Tech Support website or PI Square is now able to download the PI AF SDK, PI Web API, the PI OPC DA and HDA Servers, and all products in the PI SQL Framework (PI OLEDB Provider, PI OLEDB Enterprise, PI JDBC Driver and PI ODBC Driver). We are doing this to make it easier for you to build applications that leverage PI System data. See this blog post for details.


Wednesday, March 4th

Session 1: 6 am PT / 9 am ET / 3 pm CET
Session 2: 10 am PT / 1 pm ET / 7 pm CET

Presenters:
Keith Pierce, OSIsoft
Global Solutions Group
Chris Crosby
Industry Principal

Discover:

  • Why condition monitoring and CBM are high-value initiatives.
  • Best practices for designing a condition-based maintenance (CBM) strategy.
  • What PI System customers have to say about their implementations.

This complimentary webinar will deliver content to those interested in:

  • Preventing equipment failures, improving uptime, and optimizing operations.
  • Implementing a real-time data infrastructure for asset health and CBM.
  • Integration with computerized maintenance management systems.

Register now.

Over the decades we’ve seen PI Systems used in many creative and valuable ways. Sharing tips and tricks discovered along the way is kind of fun.  So welcome to PI Square and let the fun begin!

 

One of the 'geekier' (is that a word?) use cases involves mining system log data. OT systems have become more complex than ever. Monitoring the associated performance indicators and log events can provide an edge for operational mission assurance.

 

Like process data, system logs come in a variety of formats and access methods. There are some generally applicable tips and tricks for handling log data. This post explores the Windows firewall as representative of nuances related to logging based on clear text flat files. A PI System with at least one UFL interface node is a classic approach for processing file based data sources.

 

First we notice that although the Windows firewall is enabled by default, logging is disabled by default. Right or wrong, having to enable ‘extra’ or ‘verbose’ logging is actually fairly common for all kinds of systems. In this case, logging is enabled with a few clicks of the mouse; here are the equivalent console commands:

 

     netsh advfirewall set allprofiles logging droppedconnections enable

     netsh advfirewall set allprofiles logging allowedconnections enable

 

The default setting creates two files in the %systemroot%\system32\logfiles\firewall folder: 'pfirewall.log' and 'pfirewall.old'.  Current events are appended to the log file. When the log file reaches its size limit (4 MB by default), it rolls over to the old file.

 

You might have guessed a fairly common 'monkey wrench' with this pattern: Windows firewall keeps the current log open, which blocks UFL from processing the file. Getting the events from the old log is viable but adds significant delay waiting for the current log to fill.

 

A small script can generate files for UFL. The trick is to copy the current log so it's easy to measure the number of lines in the file.  The script selects only the lines that were appended (including checking for a rollover) and outputs them to a unique file for processing by UFL. Extending the script to copy logs from remote systems is left to a future post.
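The original script is PowerShell (see the cmdlets mentioned below); as a language-neutral illustration of the same pattern, here is a hedged Python sketch. The file and state paths are placeholders, and the rollover check is simplified:

```python
import os
import shutil

def new_log_lines(log_path, state_path):
    """Return the lines appended to log_path since the previous call."""
    copy_path = log_path + ".copy"
    shutil.copyfile(log_path, copy_path)   # work on a copy; the live log stays open
    with open(copy_path) as f:
        lines = f.readlines()
    os.remove(copy_path)
    seen = 0
    if os.path.exists(state_path):
        with open(state_path) as f:
            seen = int(f.read() or 0)
    if len(lines) < seen:                  # fewer lines than last run: log rolled over
        seen = 0
    with open(state_path, "w") as f:
        f.write(str(len(lines)))
    return lines[seen:]                    # only the newly appended lines
```

Each batch of returned lines would then be written to a uniquely named file in the UFL interface's input folder.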

 

The Measure-Object -Line and Select-Object -Last PowerShell cmdlets made this script kind of fun.  The biggest trick was Out-File -Encoding "Default": without specifying "Default", the file will be Unicode, which isn't compatible with UFL.

 

Firewall log entries have a simple space delimited structure as shown in the file header:

#Version: 1.5

#Software: Microsoft Windows Firewall

#Time Format: Local

#Fields: date time action protocol src-ip dst-ip src-port dst-port size tcpflags tcpsyn tcpack tcpwin icmptype icmpcode info path

 

The action field notation is "ALLOW" or "DROP" and the path field notation is "SEND" or "RECEIVE". Four corresponding UFL message filters are as follows:

 

InboundAllow.FILTER=C1=="*ALLOW*RECEIVE*"

InboundDrop.FILTER=C1=="*DROP*RECEIVE*"

OutboundAllow.FILTER=C1=="*ALLOW*SEND*"

OutboundDrop.FILTER=C1=="*DROP*SEND*"

       

UFL message filters provide a natural way to store data in different PI points depending on the type of event (i.e. protocol, source IP address, source port, destination port and size in points named for inbound dropped traffic).

 

StoreInPI("NODE:PROTOCOL-DROP.IN",,TimeStamp,Protocol,,)

StoreInPI("NODE:SRCIP-DROP.IN",,TimeStamp,SrcIP,,) 

StoreInPI("NODE:DSTIP-DROP.IN",,TimeStamp,DstIP,,) 

StoreInPI("NODE:SRCPORT-DROP.IN",,TimeStamp,SrcPort,,)

StoreInPI("NODE:DSTPORT-DROP.IN",,TimeStamp,DstPort,,) 

StoreInPI("NODE:SIZE-DROP.IN",,TimeStamp,Size,,)

 

PI points are configured as follows: ports and size as Int32 points; IP addresses and protocol as string points; optionally, protocols may be set up as a digital state set (e.g. using IANA codes per %systemroot%\system32\drivers\etc\protocol).

 

It's especially handy to monitor dropped traffic, both for troubleshooting and as an indicator of attempted security breaches. Even with the limitations in this example, one typically discovers there is more activity affecting a system than is immediately obvious.

 

In summary, monitoring the host firewall on a mission critical system makes sense but we are only getting started – there is more fun to come!

 

TL;DR  System logs come in many sizes and shapes. There can be subtle nuances in mining events from file based logging systems. This post introduced a general pattern for copying from open log files and processing with UFL. A future post in this series extends the script used in this pattern to gather log files from remote systems.

The fundamental goal of PI Square is to allow for better information sharing between PI Users, Partners, and OSIsoft employees. With this in mind, OSIsoft support will be monitoring content, answering questions, and participating in discussions in PI Square from offices all around the world.


We look forward to your questions, answers, suggestions, ideas, product discussions, PI jokes, tips and tricks, videos, and anything else you want to share with us.

 


Will you only take questions in English?

 

Not at all. You can post in your local language allowing community members and support to answer you in that language.


Will you answer all questions in PI Square?


The same OSIsoft Tech Support that answers calls and emails today will be monitoring the All Things PI - Ask, Discuss, Connect space and the product sub-spaces there, but we hope that you and other community members answer questions as well. If a PI Square member outside of support answers first, we will take a look and if we have something to add, we will definitely add to it. (For questions: Dirk Veenema)

 

OSIsoft Development Support teams will be monitoring questions in the PI Developers Club space and sub-spaces there to help provide suggestions and answers to development questions. (For questions: Ahmad Fattahi)

 

OSIsoft Learning will be monitoring the Learn PI space and class content and will be replying to questions, comments, ideas, and suggestions within that area.(For questions: Buck Bard)

 

OSIsoft PI Square team will be monitoring the Welcome to PI Square! space and will reply to your questions, comments, ideas, and suggestions within that area.(For questions: Jarita Sirois)

 

What if we need to do a remote session?

 

Not a problem! If we can't get to the solution in PI Square, someone in support will contact you to figure out the best way to get your issue resolved. Then one of us can go back to your PI Square post to make sure we record the solution in the question/discussion you started. We want PI Square to be as helpful to others as it can be.

 

Additional questions about how we will support you here?


Please comment on this blog and let us know! Our goal is to continue to provide you with the best support possible and make sure you are successful in all your PI System endeavors.

 

 

Take me to the ALL THINGS PI space now!

Excel Is Our Friend – Part IV


While writing this blog series, I became consciously aware… my Dad knew that I didn’t know the names of those tools. I bet he was smiling to himself under the truck… It was a sneaky way of teaching me something useful, without my consent.

  • How to: VBA and FSO


Aside from importing the MDB, and getting current values with DataLink, we have confined ourselves to the boundaries of the workbook. But with column G, we will access the File System Object and extract a file property (the version) from external files. This same library has other useful methods as well, such as testing for file and folder existence and reading and writing files, to name a few. The following will be limited to getting the file versions, but you can find plenty of information on the web should you need those other methods.


Note: The following code/example will not always work. It depends on a number of things, most particularly security: it will fail if you are accessing files on a remote machine and your currently logged-in credentials are not an administrator on that machine, or if a firewall or policy prevents access. The code utilizes the hidden administrative shares of the remote machine, and the remote machine could be in a different domain or workgroup, causing Windows Authentication to fail as well. A quick test of accessibility, using Start > Run:
\\YOURinterfacemachinenameORipaddress\c$

Enough with the disclaimers!

  • In the Visual Basic Editor (VBE), add a reference for Microsoft Scripting Runtime (scrrun.dll) via Tools menu > References (see the “Foundation” entry in Part II if you skipped it)
  • Create an ActiveX CommandButton on your worksheet, whose Click event will execute the following VBA code (in the screenshot, the button is labeled “GetVersion” and has the name “cmdGetVers”)
  • The following code belongs in the worksheet module for the sheet that is the “parent” container for our button:


Note: Regarding the second bullet, I use an ActiveX CommandButton, but Excel’s Form Control Button will work too (with some adjustments that I won’t cover here).

 

 

Private Sub cmdGetVers_Click()  'The name is unimportant, but with ActiveX, it should match the object name
    Dim strPath As String
    Dim c As Range
    If Selection.Columns.Count > 1 Or Selection.Column <> 7 Then
        MsgBox "Please select cell(s) in column G before executing GetVersions"
        Exit Sub
    End If
    For Each c In Selection
        'Columns B (node), R (install path, ":" replaced with "$"), and E (binary) build the UNC path
        strPath = "\\" & c.Offset(0, -5).Value & "\" & Replace(c.Offset(0, 11).Value, ":", "$", 1) & c.Offset(0, -2).Value & ".exe"
        c.Value = GetVersion(strPath)
    Next c
End Sub

 

The code above calls the function below, passing the path\filename:


4. The following code belongs in Module1, created previously (in Part III) to hold our User Defined Function "mySplitter".
Public Function GetVersion(strFilePath As String) As String
   Const ForReading = 1, ForWriting = 2, ForAppending = 3
   Const TristateUseDefault = -2, TristateTrue = -1, TristateFalse = 0
   Dim fs As Scripting.FileSystemObject, f As File, ts, s
   
  On Error Resume Next
     Set fs = CreateObject("Scripting.FileSystemObject")
     GetVersion = fs.GetFileVersion(strFilePath)
      If GetVersion = "" Then    'if no version, use the mod date instead
       Set f = fs.GetFile(strFilePath)
       GetVersion = f.DateLastModified
  End If
   If GetVersion = "" Then    'just in case it ain't there
       GetVersion = "no data or file"
  End If
        
End Function 
Note: This function could also live in the worksheet module above, though in my opinion it is good practice to keep functions in a module; and Excel requires that User Defined Functions used on a worksheet, like “mySplitter” above, be in a module (they won’t work otherwise). Notice the use of “Public” in the declaration. This allows the function to be called by the procedure in the worksheet module. Declaring it as Private would result in an error when we clicked the button above, “Sub or Function not defined”. Translation: I don’t “see” anything named “GetVersion”. (A Private function is only available to functions and subroutines that also reside within Module1; Public functions are available globally at the Project level.)


5. Select a range of cells with your mouse, like cells “G28:G30”, and click the button! Hopefully, you will see versions popping into those cells in the blink of an eye.




You may notice that I have several unused variables in the function. These are standard variables that I copied from another routine. I use FSO a lot, for creating, reading, writing, and renaming files, etc. More details can be found here:

Visual Basic for Applications Reference for FileSystemObject Object
http://msdn.microsoft.com/en-us/library/aa242706(v=vs.60).aspx
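As a small taste of those other FSO methods (existence tests, writing, reading), here is a hedged sketch. The C:\Temp paths are placeholders, not anything from this workbook:

```vba
' Sketch of other FileSystemObject methods: existence tests, writing, reading.
' The C:\Temp paths below are placeholders -- substitute your own.
Sub FsoExamples()
    Dim fs As Object, ts As Object
    Set fs = CreateObject("Scripting.FileSystemObject")

    ' Test for folder existence before writing into it
    If Not fs.FolderExists("C:\Temp") Then Exit Sub

    ' Write a small text file (True = overwrite if it exists)
    Set ts = fs.CreateTextFile("C:\Temp\sample.txt", True)
    ts.WriteLine "Hello from FSO"
    ts.Close

    ' Read it back (1 = ForReading)
    If fs.FileExists("C:\Temp\sample.txt") Then
        Set ts = fs.OpenTextFile("C:\Temp\sample.txt", 1)
        MsgBox ts.ReadAll
        ts.Close
    End If
End Sub
```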

When exploring and understanding any block of code, stepping through the code one line at a time in debug mode is an invaluable learning technique. (see this link http://www.cpearson.com/excel/DebuggingVBA.aspx)

Hopefully, if you have followed along with no major issues, you have a working document. One that saves you a bit of time, monitoring and accessing your own interfaces. But more importantly, you have added new things to your “toolbox”. We have covered quite a bit of ground here: navigation, relative vs absolute addresses, PI MDB, VBA, project references, VB modules, Public vs Private declarations, FSO, IF statements, FOR loops, User Defined Functions, built-in functions (VLOOKUP, MID, LEN, FIND, HYPERLINK, SUBSTITUTE, ISNA), and the use of nested functions. Hopefully, you don’t feel like you have been drinking from a fire hose. And even if you are not fully versed in the usage of each and every one of these, just knowing that they exist, and having some idea of when to use them, is a huge step in itself. I hope you can use this information to escape from an island…

Part V, which concludes this series and contains a FizzBlog Challenge (a bit of a contest), will be posted soon. Winners will receive restroom breaks, automatic adjustments (for DST) to their system clocks, and an assortment of free junk mail!

Excel Is Our Friend – Part III


Toolbox

As a child, I remember my Dad doing most of the auto repair himself. And if anyone was outside doing anything, I was there (a simpler time, before Xbox, even before cable TV). So, when I use the word toolbox, I remember asking Dad… What is this one called? And what is this one for? And I remember him asking, hand me the so-and-so… my standard clueless response was “this one?”

While escaping the island has the appearance of being our main objective, the true focus is on learning the tools.

  • How to: VLOOKUP – built-in function

 

This built-in Excel function is something you definitely want in your “toolbox”.


The VLOOKUP function, which stands for vertical lookup, can help you find specific information in large data tables. In addition, and equally useful, it can determine what is not in the table. In the example below, I use VLOOKUP to gather external information into a list formatted to my preferences. After creating so many PI Interfaces here, I was having trouble remembering the specifics without having to look for them. So, I built a spreadsheet for a quick reference. But I did not want to manually populate this listing. So with a few items from the toolbox, I was able to semi-automate the gathering of the data, as you will see below:


First, I imported the Module Database from my PI Server, into a separate sheet in my workbook (renamed that sheet “R2-Int1MDB”). I used the PI-SMT > Module Database Builder Add-in > Import Items. I just imported the tree where the interfaces reside (“\\R2-PI\%OSI\Interfaces”), including all subitems, values, etc., everything.

 

After determining which values were of use, I created column headings for my list, some of my columns are visible in the example below.

 

ExcelFig6.png


The next step was to get a listing of the interfaces, for column A. To get this, I went to the sheet “R2-Int1MDB”. After selecting any cell in the column headings of the “list”, I clicked Excel’s Data Menu, and clicked the “Filter” button. (Excel has a help topic on “Quick start: Filter data by using an AutoFilter”, You can point at the icon in the ribbon and press F1).

 

Using “AutoFilter”: for my “Type” column, I selected “Module”; for the “Name” filter, I used the Text Filter > “contains”, and entered a filter value of “\\*\*\*\*\*” (without the quotes). This filtered my list to machine\interface instance. Placing my active cell on the first item in the filtered list (clicking on that cell), I used the shortcuts mentioned above (Ctrl+Shift+DownArrow) to select the entire list. Then Ctrl+G, followed by Alt+S (or click “Special”), Alt+Y (or click the “Visible cells only” radio button), then OK, and copied the selection. Next, I navigated back to the list I was building, pasting the values that you see in the column labeled “ModulePath”.
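For what it’s worth, that select-visible-cells-and-copy sequence can also be scripted. A minimal sketch, assuming the import sheet is named “R2-Int1MDB”, the destination sheet is “InterfaceList”, and the filtered list begins at A1 (adjust to your layout):

```vba
' Copy only the visible (filtered) cells of the import list to the sheet
' being built. Sheet names and cell addresses here are assumptions.
Sub CopyVisibleFiltered()
    Dim src As Range
    Set src = Worksheets("R2-Int1MDB").Range("A1").CurrentRegion
    ' SpecialCells(xlCellTypeVisible) is the VBA equivalent of
    ' Ctrl+G > Special > Visible cells only
    src.SpecialCells(xlCellTypeVisible).Copy _
        Destination:=Worksheets("InterfaceList").Range("A28")
End Sub
```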


So at this point, I have a complete listing in column A of all of the Interfaces used for this PI Server Collective, “R2-PI”. Not very useful, by itself, but this provides the basis for gathering data for our list.


So, let’s utilize the various tools available to us to populate our columns. Our first victim is InterfaceNode, in column B. There are actually several ways to get that tidbit, since it is present in the ModulePath in column A: the first segment after the fixed root, when parsed with “\”.
  • How to: Using Nested Functions and User Defined Functions
  • Since the root path is always the same length (24), we could use this formula in B28: =MID(A28,25,FIND("\",MID(A28,25,99),1)-1). <– the Find function here is a “Nested Function” within the third argument of the Mid function
    • Explanation of arguments: Since the values of interest all begin with a root of fixed length, LEN("\\R2-PI\%OSI\Interfaces\") = 24, we can begin our search at character 25, seeking the position of the next "\" and subtracting 1 from the length to exclude it (for A28 = \\R2-PI\%OSI\Interfaces\R2-INT1\RDBMSPI1, FIND returns 8, so MID(A28,25,7) yields R2-INT1). If the root were not of fixed length, we would have to nest additional LEN functions where we presently use “25”.
  • Alternatively, we could create a User Defined Function, outlined in this post
    • By creating a module for our workbook, in VBE, creating “Module1”, and pasting the UDF’s custom code from the linked post, we can use the following in cell B28: “=mySplitter(A28,"\",5)”
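The actual UDF lives in the linked post; in case that link does not survive, here is a hypothetical reconstruction consistent with the call =mySplitter(A28,"\",5). It assumes a zero-based piece index: the two leading backslashes in \\R2-PI\%OSI\Interfaces\R2-INT1\RDBMSPI1 produce empty pieces 0 and 1, so piece 5 is “R2-INT1”.

```vba
' Hypothetical reconstruction of the "mySplitter" UDF from the linked post.
' Splits strText on strDelim and returns the piece at lngIndex (zero-based).
Public Function mySplitter(strText As String, strDelim As String, _
                           lngIndex As Long) As String
    Dim arr() As String
    arr = Split(strText, strDelim)      ' VBA's built-in Split, zero-based
    If lngIndex >= 0 And lngIndex <= UBound(arr) Then
        mySplitter = arr(lngIndex)
    Else
        mySplitter = ""                 ' index out of range
    End If
End Function
```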
Continuing on, while skipping the “UFOTags” column for now, we move to the “Binary” column. Here we will use a combination of Concatenation and VLOOKUP.
So, looking at E1, you see that I have typed “||InterfaceBinary”. This is the attribute that we want to extract from the MDB Import, for each of the interfaces in column A.

So, we need the contents of A28 appended with the contents of E1, using the & operator, like so: $A28&E$1. Note the position of the $ signs in the concatenation. $ indicates an absolute reference, as in “absolutely column A”, and the row is relative to the formula location. Meaning that we can fill right and the formulae will always use column A. Likewise, we can fill down, and the formulae will always reference $1, row 1.
We will nest this function inside an argument of the VLOOKUP function.
So, selecting E28 with our mouse, typing =VLOOKUP(, and then clicking “fx” on Excel’s formula bar will present the dialog below (fx is circled in red in the screenshot).

Lookup_value = what to “find” in the Table_Array (it searches only in the 1st column of the Table_Array, returning only the first matching entry). You can see the use of & in the first argument in the screenshot’s dialog box.
So, in the screenshot, our Lookup_value = \\R2-PI\%OSI\Interfaces\R2-INT1\RDBMSPI1||InterfaceBinary
Table_Array = where to look for the value. Note the format of this argument; it can be drag-selected if you click the icon at the right edge of the field. Note that you will have to manually add the $ so that you can filldown and fillright: the rows from the MDB import have an absolute address, and filling down without absolute cell references would omit rows from the top of the list… You can also toggle between absolute and relative references by selecting them, in the formula bar or a dialog, and pressing the F4 key repeatedly.
Note that the first column of the array is B, where we would find our Lookup_value = \\R2-PI\%OSI\Interfaces\R2-INT1\RDBMSPI1||InterfaceBinary
Col_index_num = which column of the Table_array do you wish to return if Lookup_Value is found in column 1 of Table_array
This is the column in the array, not the sheet. In my array, column 19 is the “PropertyValue” column.
Range_lookup = False means find an exact match. I almost never use True for this argument, but it can be useful on occasion.
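Putting the four arguments together, the finished formula in E28 might look like the following. The Table_Array range ('R2-Int1MDB'!$B$1:$T$500) is an assumption; the actual extent depends on the size of your MDB import, and column 19 counts from column B, landing on the PropertyValue column:

```
=VLOOKUP($A28&E$1, 'R2-Int1MDB'!$B$1:$T$500, 19, FALSE)
```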



After completing the dialog with OK, the formula is usable in the other columns/rows with the use of filldown and fillright, with exceptions for columns C, D, G, and I. Note that row 1 of these columns is blank; other methods are used for these.


In the interest of brevity, I won’t go as deeply into the details of columns C, D, and I, other than providing my formulae here:
C28 is =IF(ISNA(O28),"none",E28&K28&"_UFO2_State_"&O28)
O28 is out of view in the screenshot, but it is a lookup for “||Arguments|/UFO_ID”
D28 is =IF(C28="none","none",PICurrVal(InterfaceList!$C28,0,InterfaceList!$H28))
So, if the UFO tags are specified in the MDB, I use DataLink to show me the current values… is the interface running?
I28 is =HYPERLINK("\\"&B28&"\"&SUBSTITUTE(R28,":","$",1),"Explore Folder")
R28 contains the “||CurrentInstallationPath”, and since I am an admin on the remote machine, I can access this through the administrative share, replacing the “:” with a “$”


  • Tip: System overload

Sometimes I find a spreadsheet behaving sluggishly, due to the calculation burden that I have created. With larger lists, and more and more functions, the spreadsheet will tax the CPU as it recalculates. As a lot of the functions above have already served their purpose, they can be replaced with the values they have already calculated or retrieved. So, in these cases, copy those cells containing functions and replace them using Copy > Paste Special > Values. This will replace the functions with their static values, relieving your CPU of the unnecessary recalculation.
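If you do this often, the same Copy > Paste Special > Values step can be scripted; a minimal sketch that operates on whatever cells are currently selected:

```vba
' Replace formulas in the current selection with their calculated values.
Sub FreezeSelectionToValues()
    Selection.Copy
    Selection.PasteSpecial Paste:=xlPasteValues
    Application.CutCopyMode = False   ' clear the clipboard "marching ants"
End Sub
```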


For those of you that have been eyeballing the Version column and the CommandButton… I saved it for Part IV.
