I would like to import process data from an Excel spreadsheet into the PI Archive, so that the data can be stored like any other PI tag, for example the demo tag SINUSOID.
Is there any way to do that?
On your client computer you should find an example file called piexam32.xls; it contains a simple example of how to accomplish this. It is usually located at C:\Program Files (x86)\PIPC\Excel.
There are functions like PIPutVal in PI DataLink, and you can also call the PI SDK from VBA, but you'll have to consider fanning the data if you're working with a PI Collective.
I would be tempted to just use PI UFL: have your Excel sheet create a CSV file and drop it in the configured UFL folder.
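To make the UFL idea concrete, here is a minimal Python sketch of the kind of CSV a macro (or any script) could drop into the UFL scan folder. The tag names, timestamps, and column layout are purely illustrative assumptions; whatever you generate must match the format defined in your PI UFL INI file.

```python
import csv
from datetime import datetime, timedelta

# Hypothetical rows as they might come out of a spreadsheet:
# (tag name, timestamp, value). These names are made up; the
# columns must match whatever your PI UFL INI file is set to parse.
start = datetime(2024, 1, 1, 8, 0, 0)
rows = [
    ("Unit1.Temperature", start + timedelta(minutes=10 * i), 20.0 + i)
    for i in range(3)
]

# Write a "tag,timestamp,value" CSV that a UFL interface could be
# configured to pick up from its scan folder.
with open("manual_data.csv", "w", newline="") as f:
    writer = csv.writer(f)
    for tag, ts, value in rows:
        writer.writerow([tag, ts.strftime("%Y-%m-%d %H:%M:%S"), value])
```

The nice part of this route is that UFL handles the PI connection, buffering, and fan-out to a collective for you; your side of the job is only producing a well-formed file.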
Is this a regular process of updating values, or sporadic one-offs?
I would like to update the values on a regular basis.
I agree with what Rhys said. If you are doing it on a regular basis then UFL or a custom solution using the SDK would probably work. You could also use ACE, if you have that running, to process the data on a regular interval.
Here is some code to get you started if you go that route; it uses the PI SDK. PIValue is just a simple class that has two properties, a timestamp and a value. You would populate a list of them from your Excel file.
public static int InsertValues(string PISystem, string tag, List<PIValue> data)
{
    // Connect to the PI Server through the PI SDK
    var pisdk = new PISDK.PISDK();
    PISDK.Server srv = pisdk.Servers[PISystem];

    // Build a writable PIValues collection from the spreadsheet rows
    var pValues = new PISDK.PIValues();
    pValues.ReadOnly = false;
    foreach (var item in data)
        pValues.Add(item.TimeStamp, item.Value, null);
    pValues.ReadOnly = true;

    // Push the values into the tag's archive
    srv.PIPoints[tag].Data.UpdateValues(pValues, PISDK.DataMergeConstants.dmReplaceDuplicates, null);
    return data.Count;
}
Thank you Lonnie for the code! I do have ACE installed.
Another alternative, particularly if you are doing this on a regular basis, is to use something like SSIS (you can also use a stored proc) to load the Excel data into a simple SQL table, and then use the PI RDBMS interface to get the data from the SQL table into PI. The advantage is that it is all out of the box (no coding required), plus you can use SSIS to do some basic data clean-up (e.g. remove values in the future). The disadvantage is that you now have a SQL database in the mix. I've personally done this and it works reasonably well.

There is, however, a caveat with Excel and SQL. The ACE or JET driver (the data provider for Excel; ACE is the newer one) was created by Satan. The driver performs a type guess (by default using the first 8 rows of data) to determine each column's data type, and you can't specify the type yourself. So if your workbooks are all well formed, never change, and the users who enter the data follow the rules (mine never do), then you're in luck. If not, you're going to have a problem; there are plenty of resources on the Internet related to this issue. The best way I've found to get around it is to export the data from Excel to CSV files using a macro (I hook the macro into the save event, so the user saves a CSV version automatically whenever they save the file).
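The future-timestamp clean-up mentioned above is simple to sketch. Here is an illustrative Python version; the row shape and function name are assumptions of mine, and in the SSIS setup described this logic would live in the package (or the stored proc) rather than in code like this.

```python
from datetime import datetime

def drop_future_values(rows, now=None):
    """Filter out (timestamp, value) rows stamped in the future.

    Manual-entry spreadsheets often contain fat-fingered dates, and
    writing those to PI creates confusing future archive events.
    """
    now = now or datetime.now()
    return [(ts, val) for ts, val in rows if ts <= now]

rows = [
    (datetime(2024, 1, 1, 8, 0), 20.0),
    (datetime(2099, 1, 1, 0, 0), 21.0),  # bad entry: year 2099
]
# With "now" pinned for the example, only the 2024 row survives.
clean = drop_future_values(rows, now=datetime(2024, 6, 1))
```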
If you are going down the ACE route then you need to look at how to get the data out of Excel. Most people automatically reach for COM Interop. I would strongly recommend against going this route, chiefly because it requires Excel to be installed wherever the code runs, and Microsoft does not support automating Office from unattended or server-side code.
So now what? Well, if you are using Excel 2007+ files (xlsx or xlsm, the OpenXML document format) then you're in luck. These files are really nothing more than a zipped directory of XML files. Even better, MS has released an SDK (the Open XML SDK) to deal with these files. Alternatively you can make use of EPPlus, an open-source project that wraps the OpenXML spreadsheet format with some useful classes.
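To see that an .xlsx really is just a zip of XML parts, here is a stdlib-only Python toy that writes a bare-bones worksheet part and reads a cell value back with nothing but zipfile and ElementTree. The file and part names below build only a fragment of a real workbook, which needs several more parts ([Content_Types].xml, relationships, workbook.xml, shared strings); managing all of that correctly is exactly what the Open XML SDK and EPPlus are for.

```python
import zipfile
import xml.etree.ElementTree as ET

# SpreadsheetML namespace used by worksheet parts in .xlsx files.
NS = "http://schemas.openxmlformats.org/spreadsheetml/2006/main"

# Hand-build a minimal worksheet part: one row, one numeric cell.
sheet_xml = (
    f'<worksheet xmlns="{NS}"><sheetData>'
    '<row r="1"><c r="A1"><v>42.5</v></c></row>'
    '</sheetData></worksheet>'
)
with zipfile.ZipFile("toy.xlsx", "w") as z:
    z.writestr("xl/worksheets/sheet1.xml", sheet_xml)

# Reading it back is plain zip + XML work: no Excel, no COM Interop.
with zipfile.ZipFile("toy.xlsx") as z:
    root = ET.fromstring(z.read("xl/worksheets/sheet1.xml"))
cell = root.find(f"{{{NS}}}sheetData/{{{NS}}}row/{{{NS}}}c/{{{NS}}}v")
print(cell.text)  # prints 42.5
```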
As a side note, I'm working on a similar problem where we get a fair amount of manually entered data in Excel that isn't measured (things like stockpile levels). I have data going all the way back to 2007 to deal with. I've gone the SSIS route using CSV files, although I'm looking at using EPPlus to do away with the macro.
Michael brings up some great points. I have been using OpenXML for years now and it is most definitely the way to go vs. COM. To make it even easier for the coding solution, you could just save your file as a .csv and go from there.
I'm going to check out EPPlus, I have never used it, but sounds interesting. This is turning out to be a nice thread!
My advice here is "keep it simple". Reading files into PI is usually recommended when the data source gives you little flexibility to extract data any other way. For example, some energy meters for solar panels publish XML files with time-series information to an FTP folder. When a device (or data source) is a "black box" like this, the easiest way to read its data is through PI UFL. So, in summary, reading files is the quick solution for dealing with inflexible data sources.
Assuming that you want to build a solution that allows multiple users to enter manual data into PI (with more flexibility than PI Manual Logger), I would stay away from the file-reading approach. You could build a simple web app that stores data in PI using PI OLEDB, for example. Of course, other solutions are possible (combining some of the concepts already mentioned), but without knowing more about Konstantino's application requirements it would be like shooting in the dark. If Konstantino wants to put more energy into building something right and scalable from scratch for entering manual data into PI, so much the better.
Take care folks!
I agree with Fabiano wholeheartedly. Excel is actually an awful tool for manual data entry. However, there are times when it cannot be avoided, and we typically use it as an interim step toward a more formal solution.
Thank you Michael for the detailed response! It seems that there are several different ways of importing Excel data into PI with various levels of complexity.
Again, thank you all for your help and your ideas! Since I was looking for a quick and simple solution, I ended up modifying the VBA code in the example workbook under C:\Program Files (x86)\PIPC\Excel that Fabiano suggested, and I successfully wrote data to PI!
Thank you Rhys!
I will check it out. Thank you Fabiano!