
Lonnie Bowling's Blog

April 2016

A couple of weeks ago I participated in the OSIsoft UC2016 Hackathon. We had 22 hours to create an application based on a “Smart City” theme. We were given access to a copy of a production PI System for the San Diego International Airport, along with representatives from the Airport to help explain their goals and the challenges of running such a facility. Overall, the event was extremely well organized, and I felt like we were able to get to work right away without the usual time lost trying to connect to and understand a system you have never seen.


Lisa Slaughter, Andrew Pong, and myself formed a team called MOAR BOTS!!!1 We all work at DST Controls and had the idea to design a natural language interface to a PI System. Andrew actually came up with the idea while watching the Microsoft Build Conference keynote, which was the week before our event. As luck would have it, Microsoft had just released a beta Bot Framework that we thought would be great to try out.


We spent most of the morning and afternoon learning how the bot technology works and what we could do. As evening rolled around, we spent a few hours spinning up a bot and tying it to the PI System. We kept our sights low: our goal was to be able to ask the bot, in a natural way, what the energy usage was for an area of the airport at a certain time. For example, we could say, “What was the energy usage for terminal 1 last Saturday?”, or “Tell me how much energy was used in the commuter terminal yesterday.” Either way, the bot would need to know three things: the KPI of interest (in this case energy usage), the area (terminal 1 or the commuter terminal), and the date. As you can guess, there are countless ways to ask even a simple question like this, so I was skeptical that we would get good results.

Our application followed this flow:


  1. Users ask a question within our application
  2. The question is sent to our bot service via an HTTP post
  3. The bot gets the message payload and sends the message to Microsoft Language Understanding Intelligent Service (LUIS)
  4. LUIS returns a JSON object that gives us a consistent structure of intents and entities
  5. We examine the intents and entities to structure a PI Web API query
  6. We query the PI System and return the results to the bot
  7. The bot sends back the response to the application
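The steps above can be sketched end to end. This is a Python sketch rather than our actual C# code, and names like `call_luis` and `query_pi_web_api` are hypothetical stand-ins with the external calls stubbed out:

```python
# End-to-end sketch of the bot's message flow. In the real application,
# both call_luis and query_pi_web_api are HTTP requests; here they are
# stubbed with canned data for illustration.

def call_luis(question):
    # Steps 3-4: LUIS turns free text into structured intents and entities.
    return {
        "intents": [{"intent": "GetKPI", "score": 0.92}],
        "entities": [
            {"type": "Asset", "entity": "terminal 1"},
            {"type": "KPI", "entity": "energy usage"},
            {"type": "builtin.datetime.date", "entity": "yesterday"},
        ],
    }

def query_pi_web_api(asset, kpi, time):
    # Step 6: in reality this issues a PI Web API request for the KPI value.
    return {"AssetName": asset, "Value": "1234.5", "UnitsAbbreviation": "kWh"}

def handle_message(question):
    # Steps 2-7: receive the message, call LUIS, examine the intents and
    # entities, run the PI query, and return a human-readable reply.
    luis = call_luis(question)
    if luis["intents"][0]["intent"] != "GetKPI":
        return "I did not understand..."
    entities = {e["type"]: e["entity"] for e in luis["entities"]}
    data = query_pi_web_api(entities.get("Asset", ""),
                            entities.get("KPI", ""),
                            entities.get("builtin.datetime.date", "*"))
    return f'{data["AssetName"]} energy usage: {data["Value"]} {data["UnitsAbbreviation"]}'

print(handle_message("What was the energy usage for terminal 1 yesterday?"))
```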


To get started we developed a bot within Visual Studio. The bot is an HTTP service that applications can use to hold a conversation with some kind of system. Bots handle the connection I/O to applications: they keep track of who the user is, translate languages, and manage conversation state. I think of the bot as the tour guide you have on a vacation in a foreign country. Here is a diagram from Microsoft:


After the bot application is developed, it is deployed to Azure, and from then on it is a simple matter of posting HTTP requests to the endpoint. From there the bot does all the hard work.

For our initial bot service we are just doing the basics and not really using the bot's capabilities, like tracking user or state information. We just grab the message and process it. This could have been accomplished with a simple API, but remember, this was just a start and our time was limited!


The first interesting thing happens with the HTTP request to LUIS: we send the bot message to our LUIS service, which makes sense of the question and returns a JSON object. That object is then deserialized and passed to our PIService.GetKPI method. The code that does this looks like this:


public async Task&lt;Message&gt; Post([FromBody]Message message)
{
    if (message.Type == "Message")
    {
        string appId = @"9c1d7df5-92be-4ade-ab29-7affaa91b797";
        string subKey = @"d1a9a95fc7b5400cb7996db63ed26f66";

        // LUIS endpoint base URL omitted here; the query string carries the
        // application id, subscription key, and the user's question
        string lroot = @"" + appId + "&subscription-key=" + subKey + "&q=";

        string uri = lroot + Uri.EscapeDataString(message.Text);
        string val = "I did not understand...";
        using (var client = new HttpClient())
        {
            HttpResponseMessage msg = await client.GetAsync(uri);

            if (msg.IsSuccessStatusCode)
            {
                var response = await msg.Content.ReadAsStringAsync();
                var data = JsonConvert.DeserializeObject&lt;piluis&gt;(response);
                if (data.intents[0].intent == "GetKPI")
                {
                    var piService = new PIServices(serverUrl, baseElement, userName, password);
                    val = await piService.GetKPI(data);
                }
            }
        }

        // return our reply to the user
        return message.CreateReplyMessage(val);
    }
    return HandleSystemMessage(message);
}


This is just an API call to LUIS. At this point, you might be wondering what LUIS is. I think this is where the really cool part happens. LUIS (Language Understanding Intelligent Service) is the same technology that Microsoft’s Cortana uses. With a little training, which I will get to in a second, you can send the API some text and it will return a JSON object that breaks the information down into intents and entities. These intents and entities allow us to further process the request with a typical PI Web API request.
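To make that concrete, here is roughly what a LUIS reply looks like and how it can be picked apart. This is a Python sketch, not our C# code, and the field names are based on the LUIS v1 beta we used, so treat the exact shape as illustrative:

```python
import json

# A LUIS-style response for "What was the energy usage for terminal 1 yesterday?"
# The field names follow the LUIS v1 beta; the scores and entities are made up.
response = json.loads("""
{
  "query": "What was the energy usage for terminal 1 yesterday?",
  "intents": [
    {"intent": "GetKPI", "score": 0.94},
    {"intent": "None", "score": 0.03}
  ],
  "entities": [
    {"entity": "energy usage", "type": "KPI", "score": 0.91},
    {"entity": "terminal 1", "type": "Asset", "score": 0.88},
    {"entity": "yesterday", "type": "builtin.datetime.date", "score": 0.97}
  ]
}
""")

# Intents come back sorted by confidence; act only when the top score is high enough.
top = response["intents"][0]
if top["intent"] == "GetKPI" and top["score"] > 0.5:
    entities = {e["type"]: e["entity"] for e in response["entities"]}
    print(entities["Asset"], entities["KPI"], entities["builtin.datetime.date"])
```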


To train LUIS, we enter some typical requests that a user would make and then tell the system what the intents are (like getting a KPI) and what the entities are: the type of KPI, the Asset (an area of the airport), and the date/time. We were able to use a pre-defined entity for date and time, and added KPI and Asset ourselves. When LUIS returns a result, it assigns each item a confidence score (0 to 100%) so you can judge how you want to handle it. We can examine a response from LUIS and pull out the information needed to go find the data:


public async Task&lt;string&gt; GetKPI(piluis message)
{
    try
    {
        string time = "*";
        string element = "";
        string kpi = "";
        string kpiDescription = "";
        string asset = "";
        foreach (var e in message.entities)
        {
            switch (e.type)
            {
                case "Asset":
                    asset = e.entity;
                    break;
                case "KPI":
                    if (e.entity.Contains("energy"))
                    {
                        kpi = "Real Power";
                        kpiDescription = "Energy Usage";
                    }
                    break;
                case "builtin.datetime.date":
                case "builtin.datetime.time":
                    time = LUISParse.ParseDateTime(e);
                    break;
            }
        }
        if (time != "")
        {
            var data = await GetKPIData(asset, kpi, time);
            string results = data["AssetName"].Value&lt;string&gt;();
            results += " " + kpiDescription + ": ";
            double x;
            var t = double.TryParse(data["Value"].Value&lt;string&gt;(), out x);
            results += x.ToString("F2", CultureInfo.InvariantCulture);
            results += " " + data["UnitsAbbreviation"].Value&lt;string&gt;();
            results += " " + data["Timestamp"].Value&lt;string&gt;();
            return results;
        }
        return "I did not understand your request...";
    }
    catch (Exception)
    {
        throw;
    }
}

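The `LUISParse.ParseDateTime` and `GetKPIData` helpers above are our own code and aren't shown here. A hedged sketch of the two ideas, mapping LUIS's relative-date entities to PI time syntax and building a PI Web API value query, might look like this (Python rather than C#, with the URL route assumed from PI Web API conventions and the server/WebId values made up):

```python
from urllib.parse import quote

# Map a LUIS relative-date phrase to PI time syntax ("y" = yesterday,
# "t" = today, "*" = now). Hypothetical stand-in for LUISParse.ParseDateTime;
# a real version would handle many more phrases, like "last Saturday".
def parse_datetime(entity_text):
    mapping = {"yesterday": "y", "today": "t", "now": "*"}
    return mapping.get(entity_text.lower(), entity_text)

# Build a PI Web API URL that reads a single value at a given time.
# The /streams/{webId}/value route follows PI Web API conventions;
# the base URL and WebId here are illustrative only.
def build_value_url(base_url, web_id, time):
    return f"{base_url}/streams/{web_id}/value?time={quote(time)}"

print(parse_datetime("yesterday"))  # "y"
print(build_value_url("https://server/piwebapi", "W1ABC", parse_datetime("today")))
```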

If all goes well, we can make a call to the PI System using the PI Web API to get the information. Once that is done, we take the data and give a good response back. Here is a sample conversation:


Note how I varied the way the question was asked, and in each case the GetKPI intent, Asset, and time were successfully determined. Training the LUIS application is fairly simple, at least for this example. I defined my intents and entities, then gave sample utterances, and finally highlighted the text that mapped to my intents and entities. Here is a screenshot:



The colors show what is mapped. The cool thing is that you can monitor this after the application is published and continue training based on what users are asking. That makes it possible to improve the bot over time based on actual use.


I think this technology is very interesting, and I can see a lot of useful applications for it. My biggest surprise was how well it actually worked without much training. I’m pretty certain that we will all be interacting with bots much more in the coming years, and most likely we won’t even realize that there is a bot on the other side!



We will be presenting during the PI Dev Club Webinar “The Best of the Best: Smart Cities Programming Hackathon 2016”, on May 4th, 9:00 am PT.

I also want to do a detailed YouTube video that will step through the process. I will update this blog when I have that done.


I would love to hear what everyone thinks about this and how this could be used, please post a comment if you have time!



While it is still fresh in my head, I wanted to do a quick recap of my experiences at the UC.


#1 Attendance

During the keynote it was mentioned that there were 2,500 attendees, up from about 1,800 last year. I think this is validation that OSIsoft is doing very well and continues to be the leader in the time-series database market. I met several new attendees who were completely blown away by the capabilities of PI and all the associated products. For people like me, who have been to many UCs, the excitement of people just being introduced to PI never gets old. It is how I get new energy and ideas to make my year productive.


#2 Hackathon

The theme of the Hackathon was working with actual San Diego Airport data to see what kinds of applications and data analysis could be designed, developed, and performed. In past years there were usually issues with getting access to the platform or the data, but this year it was very well organized and most teams did not have those problems. This gave us more time to work on our ideas. Also, information was “leaked” out over the couple of weeks preceding the event to give us a chance to really think up a good idea.

I was on one of two teams from DST. My team was just edged out from placing in the top three. We designed a natural language PI Bot that you can ask questions and get answers about data in a PI System. It was really interesting and got me thinking about how we will interact with machines in the future.

They also invited five teams to do a pitch session. We pitched in front of a panel that included Pat Kennedy, a couple of executives from the San Diego Airport, and a real-life Silicon Valley VC exec. Our team was able to do the business pitch, which I thought was awesome. It felt like a real-life Shark Tank! Although I did not land a big deal, it was fun to dream. I feel like hackathons are a great way to release the creativity that is bottled up in your average developer. Lots of teamwork and ideas exchanged.


#3 Coresight 2016

Last year’s addition of ProcessBook displays was just a preview of what was in store for Coresight. It has lost all of its Silverlight dependencies and can run in all modern browsers. With the next release we will be able to build displays within Coresight and take advantage of extensibility, which will enable you to develop custom widgets. This is due to be released in May. There are many more features to be rolled out over the next year and a half. Big kudos to Chris Nelson, Tom LeBay, and Eugene Resnick! You and your team are doing a great job!


#4 Emerging Technologies

If you feel like OSIsoft might just be kicking back and enjoying the fruits of software developed 30 years ago, don’t be fooled. Over the last couple of years they have developed Qi, a cloud-based data storage platform. They were also demoing Nova, the first visualization product designed to work with Qi. It is a fully web-based, Coresight-like product that can be used to design displays, dashboards, and trends. All early stuff, in various stages of preview.

There is also much going on with data ingress. The OSIsoft Message Format (OMF) was announced; it is a new standard being developed to allow writing data to a PI System in a standard way. This will allow new connectors and interfaces to be developed by third parties; in the past you were 100% on your own if you wanted to do this. I think this will create interesting opportunities, especially in the IoT space.



Overall, at the end of the five days, I was exhausted, had far too little time to attend presentations and demos, and was only able to talk to a fraction of the people I wanted to. I was very happy with the overall enthusiasm and energy of those within OSIsoft, those who have been using PI for years, and those who are just learning about PI for the first time. In a funny way, I feel like this is a new beginning, kind of like a PI 2.0 transition. The PI System is becoming much more open and more scalable, and it is being used to solve an ever-growing list of business challenges.


Lonnie A. Bowling
PI Enthusiast

My Personal Blog:

My PI YouTube Videos: Lonnie Bowling - YouTube