
I just wanted to take this opportunity to welcome everyone who is coming to vCampus Live! 2011. I'm already at the Palace Hotel, and I can see people arriving. Today is the pre-conference day, and I expect a lot of us will be coming in today.


We will have a welcome reception today, so make sure you join us there. It is a great opportunity to meet your fellow vCampus members and get to know some people before the event starts.


For everyone who could not make it to the event: we are trying to keep an eye on the forums to answer questions and engage in discussions. But as you may expect, this can be difficult with our full vCampus Live! schedule. We will do our best, but chances are it will take longer than usual to get a response from us.


To communicate about the event on Twitter, use the #VCL11 tag so everyone can keep an eye out for new messages about the event. You can follow the team at @OSIsoftvCampus.


Looking forward to meeting you (again)!


PS: Twitter is 'over capacity' at the moment, so you may get error pages when requesting status feeds or using the search.


Edit: the Welcome Reception will be at 5:00 PM in the Gold Ballroom. Please join us there!


C++ and PI - the PI SDK

Posted by andreas Nov 23, 2011

Recently we have seen increasing interest in sample code written in C++.


While C++ is a very powerful language, you should be aware that .NET provides many little things that make your life as a developer much easier.


In the following post I will create a simple command-line application that gets the snapshot value of one or more tags. The usage is:


PISDK_GetSnapshot <SERVERNAME> <Tagname1> <Tagname2> <…>


So as you may guess, we start with an empty C++ command line project:




Now let’s go to the code. First some includes:


#include "stdafx.h"
#include <ATLComTime.h> // for COleDateTime

Then we import the PI SDK libraries.


// need to import the PI SDK type libraries
#import "E:\PIPC\PISDK\PISDKCommon.dll"    no_namespace
#import "E:\PIPC\LIBRARY\PITimeServer.dll" no_namespace
#import "E:\PIPC\PISDK\PISDK.dll"          rename("Connected", "PISDKConnected") no_namespace

One of the things that constantly drives me crazy when switching between C++ and C#.NET is handling the PIValue, especially the variant PIValue->Value – so the very first thing I'd like to do is introduce a simple class that does this for me:


class MyPIValue {
    _PIValuePtr spPIValue;
public:
    MyPIValue (_PIValuePtr);
    VARTYPE vt;
    double  dblValue;
    int     intValue;
    _bstr_t bstrValue;
    _bstr_t bstrTimeStamp;
    COleDateTime codtTimeStamp;
};

The purpose of this simple class is to provide variables that are much easier to handle than the variant (at least in my simple sample code snippets).


Now let’s go to the constructor. First we need to figure out where to find the data – and that is determined by the variant type. The following code first gets a COleDateTime and a string representation of it – this is the timestamp – and then the value part as an integer, double or string representation:


MyPIValue::MyPIValue (_PIValuePtr pv) {
    spPIValue = pv;
    codtTimeStamp = pv->TimeStamp->LocalDate;
    bstrTimeStamp = (_bstr_t)codtTimeStamp.Format(_T("%d-%b-%Y %H:%M:%S"));
    DigitalStatePtr tmpDigitalState = NULL;
    IDispatchPtr    tmpDispatch = NULL;
    _PITimePtr      tmpPITime = NULL;
    COleDateTime    tmpTS;
    HRESULT         hr = E_FAIL;

    _variant_t vT = pv->Value;
    vt = vT.vt;

    switch (vT.vt) {
    case VT_I4:
        // Int32
        intValue = vT.lVal;
        dblValue = intValue;
        bstrValue = (_bstr_t)intValue;
        break;
    case VT_I2:
        // Int16
        intValue = vT.iVal;
        dblValue = intValue;
        bstrValue = (_bstr_t)intValue;
        break;
    case VT_R8:
        // Float64
        dblValue = vT.dblVal;
        intValue = (int)dblValue;
        bstrValue = (_bstr_t)dblValue;
        break;
    case VT_R4:
        // Float16/Float32
        dblValue = vT.fltVal;
        intValue = (int)dblValue;
        bstrValue = (_bstr_t)dblValue;
        break;
    case VT_BSTR:
        // String
        bstrValue = vT.bstrVal;
        dblValue = 0;
        intValue = 0;
        break;
    case VT_DISPATCH:
        // Digital?
        tmpDispatch = vT.pdispVal;
        hr = tmpDispatch.QueryInterface(__uuidof(DigitalState), &tmpDigitalState);
        if (hr == S_OK) {
            bstrValue = tmpDigitalState->Name;
            intValue = tmpDigitalState->Code;
            dblValue = intValue;
            break;
        }
        // Timestamp?
        hr = tmpDispatch.QueryInterface(__uuidof(_PITime), &tmpPITime);
        if (hr == S_OK) {
            tmpTS = tmpPITime->LocalDate;
            bstrValue = (_bstr_t)tmpTS.Format(_T("%d %B %Y %H:%M:%S"));
            intValue = 0;
            dblValue = 0;
            break;
        }
        // neither interface matched - fall through to the default
    default:
        dblValue = 0.0;
        intValue = 0;
        bstrValue = "n/a";
    }
}


Preparation done!


As mentioned in the beginning, I want to get the snapshot – so what do I need? The PISDK, the Server, the Point, the Value and, just for fun, the PI SDK version:


IPISDKPtr       spPISDK = NULL;            /* The PISDK */
PISDKVersionPtr spSDKVersion = NULL;       /* PI SDK Version */

ServerPtr       spServer = NULL;           /* The Server */
PIPointPtr      spPIPoint = NULL;          /* The PI Point */
_PIValuePtr     spPIValue = NULL;          /* The PI value */

Now the code – we initialize COM, check the command-line parameters and finally create the PISDK object. After this has been done, we print out the PI SDK version, connect to PI and print out the snapshot of all tags provided as command-line arguments:


int _tmain(int argc, _TCHAR* argv[])
{
    // Initialize COM
    CoInitialize(NULL);
    // Check the command line switches
    if (argc < 3) {
        std::cout << "Command Line:" << std::endl
                  << (_bstr_t)argv[0] << " SERVERNAME TAGNAME(s)";
        return (1);
    }
    try {
        // Create an instance of the PI SDK
        spPISDK.CreateInstance(__uuidof(PISDK));
        // Print out the PI SDK version
        spSDKVersion = spPISDK->PISDKVersion;
        std::cout << std::endl << "PI-SDK Version "
                  << spSDKVersion->Version << " Build "
                  << spSDKVersion->BuildID << std::endl;
        // get the PI Server
        spServer = spPISDK->GetServers()->GetItem((_bstr_t)argv[1]);
        // You can use more than just one tagname
        for (int ii = 2; ii < argc; ii++) {
            // Tagname
            std::cout << (_bstr_t)argv[ii] << std::endl;
            spPIPoint = spServer->PIPoints->GetItem((_bstr_t)argv[ii]);
            // Snapshot
            spPIValue = spPIPoint->Data->Snapshot;
            MyPIValue mPV(spPIValue);
            std::cout << mPV.bstrTimeStamp << " ";
            std::cout << mPV.bstrValue << std::endl;
        }
    }
    catch (_com_error Err) {
        std::cout << "Error: "
                  << Err.Description()
                  << " : "
                  << Err.Error()
                  << std::endl;
        return (1);
    }
    return 0;
}


And here is the result:




This was the first post - stay tuned for more examples using C++!

Today marks the 3rd anniversary of our beloved community! On this day in 2008, OSIsoft vCampus was officially born, paving the way for numerous collaboration opportunities, discussions, webinars, and other PI System development as well as social activities. After 3 years we are proud to count:

  • 1800 members
  • 1800 discussion threads - 11000 posts
  • 250 blog posts
  • 40 exclusive white papers and tutorials
  • 33 webinars
  • Thousands of software downloads
  • Two Live! events with the third one approaching in 2 weeks
  • the vCampus All-Star program since 2010

Many thanks to all of you and everyone who contributed to the success of our community.


Our next vCampus Live! event is happening on Nov 30 - Dec 1, 2011 in San Francisco. It should be a great opportunity for all of us to come together, discuss geeky as well as nontechnical matters, learn, and have fun! Also, expect a number of new announcements from OSIsoft as well as the vCampus community during the event. Whether or not you are planning to attend, you can stay tuned by following us on Twitter at @OSIsoftvCampus and following the hashtag #VCL11.


We are, as always, eager to hear from you how and what we can do better. Please share your ideas with us on the forums or contact us directly. Looking forward to another bright and fruitful year for our community!



Evaluating code in .NET and building your own Calculation Engine

This is the second post in a series about Project Roslyn. You can find the first blog post in this series here.


To provide a short recap of the first post: Project Roslyn is essentially a re-imagined C# and VB.NET compiler. In this series I'm focusing mainly on the C# language. Roslyn provides an API into the C#/VB.NET compilation process, which opens up a lot of new possibilities. In the first blog post we already saw a prime example of this new technology: with Roslyn it is possible to create an interactive C# or VB.NET console. If you download and install the Roslyn CTP, you get such a console (with IntelliSense!) in Visual Studio. If you want to get started with Roslyn, please go to the first blog post for the instructions.


In this post we are going to have a look at code evaluation with Roslyn. Please remember that C# and VB.NET are compiled languages, so the code is compiled 'on the fly' rather than interpreted. This is a big difference between using Roslyn and using an eval function in Python, JavaScript or PHP.


Why is code evaluation a 'big thing', you might ask? Well, you may have encountered a situation where you simply wanted to calculate the result of the string '100 / 5 + 25'. This seems a very simple task, but it is very cumbersome in a compiled language. There are a few options, though. The most difficult would be to write a parser and interpreter for this syntax (for instance using the Interpreter design pattern). Numerous papers have been written about writing parsers, but it is still a difficult, big and risky task, and it will eat up a lot of project time for something that seems so trivial.


The previous example only covers simple arithmetic. What if you wanted to enhance your expressions further, so that you could calculate something like '100 + Sin(5) + PIValue('cdt158')'? That would mean a big change to your custom interpreter. And what about '100 + Sin(100 + 4 + PIValue('cdt158'))'? Now you would have to build a full-blown language interpreter!


There is also the option of using the CSharpCodeProvider for C# or the VBCodeProvider for VB.NET. These classes have the ability to compile C# and VB.NET code and return output to your application. (I have used the CSharpCodeProvider a lot when building extensible applications where the goal was to let users 'script' some application behavior.) The issues with this technique are performance, overhead and transparency.


Here is an example of using the CSharpCodeProvider to evaluate the expression '100 + 5 + Sin(2)'.




And the output would be




The CSharpCodeProvider is basically a wrapper around the C# command-line compiler csc.exe. It is not a compiler (API) in itself. Performance is meager, and assemblies are always created on disk. (Don't let the 'GenerateInMemory' option of the CompilerParameters fool you: it only means that a temporary assembly is created on disk.) This means that performance is slower than expected.


There is also much overhead: you cannot simply tell the CSharpCodeProvider to compile the string '100 + 5 / 2'. You have to generate a wrapper class with a method that returns the result of your code. You then have to use Reflection to get the type and a MethodInfo to call the method. This creates a lot of overhead in your application. My guess is that Roslyn also generates some wrapper classes and methods when you evaluate a simple expression, but this is done 'under the hood'.


When you create an object with the CSharpCodeProvider, you get a CLR object back, just like you would in 'normal' code. Short of using Reflection, there is no way of knowing what the object is or what the statements in a method are. With Roslyn, you have transparency: you have access to the entire syntax tree (this will be discussed in a future blog post).


So, where does Roslyn come in? It solves all the issues described above: it is fast, it has no overhead for the developer (that's us), and it is very transparent!


Let's have a look at how we can achieve the same thing with Roslyn.




And the result is exactly the same




This seems clear: we can easily evaluate (C#) expressions with only three lines of code! With the CSharpCodeProvider it took about five times as much.


Let's take a look at a more real-world example on how we can use Roslyn. We are going to create an application that can calculate expressions and use PI values. To configure these calculations we will be using a simple XML file.


As mentioned in the previous blogpost, you will need the following to get started with Roslyn:


Microsoft Visual Studio 2010 SP1 (download here)
Microsoft Visual Studio 2010 SDK SP1 (download here)
Microsoft ‘Roslyn’ CTP (download here)


Once you have everything installed, let's fire up Visual Studio 2010 and start a new Console project.




After that, add references to the following Roslyn libraries; they are typically located in:

  • C:\Program Files (x86)\Reference Assemblies\Microsoft\Roslyn\v1.0 for x64 systems
  • C:\Program Files\Reference Assemblies\Microsoft\Roslyn\v1.0 for x86 systems



First, we will add our XML configuration file to the project; let's call it 'Calculations.xml'. In this XML file we have configured two calculations: 'Sinusoid Delta' and 'Athmospheric Tower Vapor Test'. For the first calculation there are two parameters defined, called 'Current' and 'Before'. These parameters refer to PI tags (in this case 'sinusoid') with a timestamp. The expression is simple: subtract the 'Before' parameter from the 'Current' parameter. The same concept applies to the second calculation.
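As a rough sketch, such a configuration file could look like the following (the element and attribute names here are assumptions for illustration, not the attached sample):

```xml
<?xml version="1.0" encoding="utf-8"?>
<Calculations>
  <!-- Subtract the value of an earlier timestamp from the current value of 'sinusoid' -->
  <Calculation Name="Sinusoid Delta" Expression="Current - Before">
    <Parameter Name="Current" Tag="sinusoid" Timestamp="*" />
    <Parameter Name="Before"  Tag="sinusoid" Timestamp="*-1h" />
  </Calculation>
  <!-- The second calculation follows the same pattern -->
  <Calculation Name="Athmospheric Tower Vapor Test" Expression="...">
    ...
  </Calculation>
</Calculations>
```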




In our project we will create two classes that represent the information from the XML document. A class named 'SimpleCalculation' to store the expression and the parameters, and a class called 'SimpleParameter' to store parameter settings.






As you can see, these are very simple classes that hold the information from the XML document. Now let's go back to our 'Program' file and create a method to read the XML file into a collection of 'SimpleCalculation' objects. The goal is a static method that reads our 'Calculations.xml' file and creates an easy-to-use list of our calculations. We will use LINQ to XML for this example:




If you are unsure of what the 'yield' keyword does, have a look at this blog post, which explains it in detail. So, this method will provide us with a collection of SimpleCalculation objects. Let's add a RunCalculations() method to the Program class and call it from our Main method.




We have not implemented anything yet that really calculates anything, but just to make sure we are on the right track: set a breakpoint in RunCalculations() and hit F5 (Run). If you step through, you will see the process of reading the calculations from the XML file at work. If everything runs OK, let's continue and actually run these calculations.


Let's first make sure we can get values out of our PI System. Add a reference to OSIsoft.PISDK, and let's create a 'Helper' class to retrieve and write values from the PI System.




This class will help us with some of the logic of reading and writing data with the PI SDK. Of course you will want to build in better exception handling and value checking, but to keep this example simple, let's go with something easy.


Let's modify our SimpleParameter class to make use of this PIHelper class, so that it will be able to retrieve the parameter value from the PI System. Your SimpleParameter class should look like this:




Whenever we call GetValue() on one of our parameters, it will use the PIHelper class to retrieve that value.


Now on to the Roslyn 'magic'. We want to be able to evaluate the Expression part of our SimpleCalculation class. For this we need to modify the SimpleCalculation class to look like the following code snippet. A description of the statements is provided with the source code.




The last thing we need to do now is modify the RunCalculations() method in the Program class to actually run the calculations.




We are done! Our little calculation engine is completed. Run the application (F5) and see your calculations with PI data being performed.




The full source code is attached to this blog post. You can now add some calculations and try out different expressions and options with the ScriptEngine.


I would really value some feedback on this topic. Is there interest in this? If there is, I would be happy to dive deeper into Roslyn and we can have a look at the Expression Trees, creating Visual Studio plugins and Refactorings, etc. Let me know!


Please note that this tutorial and the provided source code are for educational purposes only; do not use this in any production situation.

Of the 'community of interest' conferences out there, Microsoft's BlueHat feels most like vCampus Live!


Don't let the spotlight on security fool you. It's more about sharing among the elite in their disciplines... folks who instinctively push the limits of technology. Some are experts in finding seams between networked systems, while others are masters of reliable system implementation and management.


For the 1st BlueHat in 2005, the idea of inviting external security researchers on campus was quite 'edgy'.  Today it's clear this quest to build bridges between Microsoft developers and executives, key security program partners, and members of the security research community is resulting in more reliable products.


Here are my favorites from this year's conference by category:

  • Most shocking: John Walton, Principal Security Manager, described Microsoft's cyber 'wargaming' approach on the production cloud for Office 365. I doubt we'll see this trend anytime soon for critical infrastructure, but it piqued my interest when he started talking about MTTR metrics and findings they would never have identified using a test environment.
  • Most enlightening: Marcus Niemietz and Mario Heiderich, both from Ruhr-University, with a deep dive into clickjacking and XSS defenses. These topics were a big eye-opener on web security. While HTML5 brings a lot of functionality, it comes with complexity; these experts suggest it will be very difficult for developers to get security right. Continuing to be very wary of web technology inside the most critical layers of automation systems seems to make sense.
  • Most sobering: a tie. Jeremiah Grossman of WhiteHat Security on recent web vulnerability trends and statistics: on average, 230 vulnerabilities per Internet website are observed, with banking the best-performing sector at an average of 30 vulnerabilities per site. Joe McCray of Strategic Security reinforced this message with his entertaining "You Spent All That Money And You Still Got Owned?" presentation from DefCon.

There were many other great presentations and plenty of opportunities to talk to Microsoft developers.


In turning attention to vCampus Live! 2011, I hope we offer a similar vibe for the community of PI System experts. vCampus Live! is a conference where you can really learn about new technology, what works well, and what can be done should functionality fall short of expectations.


As a performance domain defined by your most knowledgeable people (and, conversely, your worst practitioner), information sharing is especially important for cyber security. This year our security focus includes presentations from Joel Langill. Joel is passionate about critical infrastructure protection and has many years of practical industrial experience.


His research will go into detail on how the infamous Stuxnet worm spreads; indeed, almost all facilities are exposed in similar ways. In a second session we'll describe important security practices related to network architecture and Active Directory. In particular, we'll highlight what hackers call 'pivot attacks', why you should care, and what can be done to mitigate them.


I look forward to seeing you at vCampus Live! 2011!



vCampus Live! 2011 is right around the corner: in less than a month (on Nov. 30th and Dec. 1st) we will have our event at the Palace Hotel in San Francisco.


Many of you have already registered; if you have not, check out the agenda and the abstracts. This year we will have three tracks of hands-on sessions and one track of presentations. We also have Vox Pop sessions, roundtable discussions and the Developers Lounge.


We are very happy to have two new security presentations in track 4 by Joel Langill. Check out his website and blog, which aim to bring security information to those involved in industrial control systems in a simple and easy to understand manner.


Joel has worked for more than 25 years in the industrial automation and control industry. His unique approach to security emphasizes the processes and people used to implement security programs, rather than relying solely on technology or "products": the best strategy for comprehensive security balances people, processes and products. His perspective has been sought and cited by numerous industry publications focused on both industrial automation and information security. Most recently he has played a central role in the analysis and implications of the Stuxnet worm, including new methods of mitigating current and future attacks on critical infrastructure.


Joel is also the Director of Critical Infrastructure and SCADA representative for the Cyber Security Forum Initiative, where he was a lead contributor to a report on the use of cloud computing in cyber warfare. He is a Certified Ethical Hacker, Certified Penetration Tester, Cisco Certified Network Associate, and TüV Functional Safety Engineer.


He will be presenting two sessions on Track 4:

How Stuxnet Spreads (30 mins. Track 4, Day 1 04:15 pm - 06:00 pm block)

The Stuxnet worm is a sophisticated piece of computer malware designed to sabotage industrial assets. The worm used both known and previously unknown vulnerabilities to install, infect and propagate, and was powerful enough to evade state-of-the-practice security technologies and procedures, including firewalls, authentication, and anti-virus software to name a few.


Since the discovery of Stuxnet, there has been extensive analysis of Stuxnet’s internal workings. What has not been discussed is how the worm might have migrated from the outside world to supposedly isolated and secure industrial control systems (ICS). Understanding the routes that a directed worm takes as it targets an ICS is critical if these vulnerable pathways are to be closed for future worms.


This presentation is meant to provide a summary of how modern day cyber threats may work their way through even the most protected networks. It also takes a look at what can be learned from the analysis of pathways in order to prevent infection from future worms - whether targeted or not. If the systems that control critical infrastructure are to remain safe and secure, then owners, operators, integrators, and vendors need to recognize that their control systems are now the target of sophisticated attacks. Improved defense-in-depth postures for industrial control systems are needed urgently. Waiting for the next worm may be too late.

Network Architecture and Active Directory Considerations for the PI System (30 mins. Day 1 Track 4, 04:15 pm - 06:00 pm block)



Security standards for industrial control systems (ICS) generally emphasize network segregation between corporate information and automation networks. Typical PI System information flow requires connection with data sources and potentially users residing on automation networks. Careful consideration should be given to network design and Active Directory implementation.


Active Directory is very flexible and scalable but can be quite complex in a large enterprise. While there may not be a one size fits all approach this presentation will highlight common do’s and don’ts related to PI System deployment with Active Directory. It will also provide insight into new features that can help improve user authentication throughout the architecture without compromising security within any particular network zone or communication segment.


Please do not forget to register; seating in the hands-on sessions is limited!




The IDataProvider3

Posted by andreas Nov 1, 2011

Have you ever thought about an IDataProvider3 add-in that exposes more than just a single trace? In this blog post we will fill in some example code to help you achieve this.


First we start using the AddInPB_IDP3_CS template:




Note that I am using Visual Studio 2010 with the project targeting .NET Framework 2.0.


As a good practice let's rename the menu item. We open IDP3Shell.cs and expand the IDataProvider3_configuration_methods region. We locate PBObjLib.IDataProvider3.GetMenuItem and change the string to something meaningful:

string PBObjLib.IDataProvider3.GetMenuItem()
{
    return "MyMultiTrace";
}

Now, as we want to get multiple traces, we need to add them to the cols collection in PBObjLib.IDataProvider3.ShowColumnConfiguration:

bool PBObjLib.IDataProvider3.ShowColumnConfiguration(ref PBObjLib.Columns cols, PBObjLib.DataPoint dp)
{
    cols.Add(this.dataSetName + ".RandomData");
    cols.Add(this.dataSetName + ".SinusData");
    return true;
}

Let's collapse the IDataProvider3_configuration_methods region and expand the IDataProvider3_optional_methods region. We are going to add a check on the two columns here as well:

bool PBObjLib.IDataProvider3.IsColumnValid(PBObjLib.DataPoint dp)
{
    if (dp == null)
        return false;
    if (String.Compare(dp.ColumnName, "RandomData", true) == 0)
        return true;
    if (String.Compare(dp.ColumnName, "SinusData", true) == 0)
        return true;
    return false;
}

We take care of the configuration later and focus on the data now. Let us collapse the IDataProvider3_optional_methods region and expand the IDataProvider3_data_methods region.


To give PI ProcessBook some necessary information, we need to visit PBObjLib.IDataProvider3.GetColumnAttributes. To match our data, we change it to:

// Set type to Float32
type.Value = PISDK.PointTypeConstants.pttypFloat32;
// Check external data provider to determine what the attributes should be
// for the shell dataset, using constants
if (String.Compare(dp.ColumnName, "RandomData", true) == 0)
{
    zero.Value = 3.0;
    span.Value = 0.3;
}
if (String.Compare(dp.ColumnName, "SinusData", true) == 0)
{
    zero.Value = -1.0;
    span.Value = 2.0;
}

Keeping it simple, we return some random data and a sine wave. So let's go to PBObjLib.IDataProvider3.GetData and do some checks:

// Limit the amount of values to 100
if (maxvalues > 100)
    maxvalues = 100;

// Limit the amount of values to 1 value per second
if (maxvalues > (endtime - starttime))
    maxvalues = (endtime - starttime);

Then we add a Random instance and a double:

Random random = new Random();
double num = random.Next(100) / 1000.0;

and we extend the for loop:

// for the random column
if (String.Compare(dp.ColumnName, "RandomData", true) == 0)
    num = 3.1415 +  0.001 * random.Next(100);

// for the sinus column
if (String.Compare(dp.ColumnName, "SinusData", true) == 0)
    num = Math.Sin(4 * 3.1415 * i / maxvalues);
pivalues.Add(timePT, num, attrs);

Now - to debug that thing we need to set the path to PI ProcessBook by a right click on the project:




Ready to hit F5 and build a trend!




While this works fine on a trend, a value symbol will always show only the first trace. So let us add a Windows Form to choose which trace we want to show. Two CheckBoxes and two Buttons should be enough:




Let us define some variables for later usage:

// the columns
public  PBObjLib.Columns mCols { get; set; }
// the dataset name
public  String mDataSetName { get; set; }
// the application object
public PBObjLib.Application mApp { get; set; }

When activating the form, we update the text on the checkboxes:

DS_random.Text = mDataSetName + ".RandomData";
DS_sinus.Text = mDataSetName + ".SinusData";

and also check which datasets we are already using:

if (mApp.ActiveDisplay.SelectedSymbols.Count > 0)
{
    // get the active symbol
    PBObjLib.Symbol mSymbol = mApp.ActiveDisplay.SelectedSymbols.Item(1);
    // iterate through all tags
    for (int i = 1; i <= mSymbol.PtCount; i++)
    {
        if (String.Compare(mSymbol.GetTagName(i), DS_random.Text, true) == 0)
            DS_random.Enabled = false;
        if (String.Compare(mSymbol.GetTagName(i), DS_sinus.Text, true) == 0)
            DS_sinus.Enabled = false;
    }
}

This is necessary as the column collection only contains the selected column when opened from an existing trend definition.


Now we have to fill in the OK and the Cancel button handlers:

private void btnOK_Click(object sender, EventArgs e)
{
    // here we add the columns to the data set
    if (DS_random.Checked)
        mCols.Add(mDataSetName + ".RandomData");
    if (DS_sinus.Checked)
        mCols.Add(mDataSetName + ".SinusData");
    this.Close();
}

private void btnCancel_Click(object sender, EventArgs e)
{
    this.Close();
}

To make use of this, let's go back to PBObjLib.IDataProvider3.ShowColumnConfiguration and use the form:

bool PBObjLib.IDataProvider3.ShowColumnConfiguration(ref PBObjLib.Columns cols, PBObjLib.DataPoint dp)
{
    // This is the configuration form. It is used to choose the columns
    AS_IDP3_MT.ConfigDS MyForm = new AS_IDP3_MT.ConfigDS();
    MyForm.mCols = cols;
    MyForm.mApp =;   // the Application reference held by the shell (name may differ in your template)
    MyForm.mDataSetName = this.dataSetName;
    MyForm.ShowDialog();
    cols = MyForm.mCols;
    return true;
}

Time to hit F5 again:




And we can select the two different datasets in our Add-In!
