We just created tags in the PI server; how do we fill in historical data for those tags? The historical data exists in a Wonderware Historian application.
If the Wonderware data is being saved in a SQL database, then the RDBMS interface might be an option. If the Wonderware Historian supports OPC HDA, that might be an option too (it may only support OPC DA). Also of interest might be some of the third-party OPC HDA servers for the Wonderware Historian, from companies like Integration Objects (see OSIsoft Partners). If you have the ability to dump the data to a CSV file in a format like Tag, Timestamp, Value, then the UFL interface might be an option.
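To illustrate the UFL route, here is a minimal sketch of turning extracted history into Tag, Timestamp, Value CSV lines. The tag name, timestamp format, and column order here are assumptions for illustration only; whatever you produce has to match the field definitions in your UFL interface's INI configuration.

```python
import csv
import io
from datetime import datetime

def rows_to_ufl_csv(rows):
    """Format (tag, timestamp, value) tuples as CSV lines suitable for a
    UFL interface configured to parse Tag,Timestamp,Value.
    The timestamp format string is an assumption -- adjust it to match
    the format declared in your UFL INI file."""
    buf = io.StringIO()
    writer = csv.writer(buf, lineterminator="\n")
    for tag, ts, value in rows:
        writer.writerow([tag, ts.strftime("%d-%b-%Y %H:%M:%S"), value])
    return buf.getvalue()

# Example with a hypothetical tag name:
sample = [
    ("FIC101.PV", datetime(2013, 5, 1, 8, 0, 0), 42.7),
    ("FIC101.PV", datetime(2013, 5, 1, 8, 1, 0), 43.1),
]
print(rows_to_ufl_csv(sample))
```

The extraction side (querying the Wonderware Runtime database, or reading an exported report) would feed the `rows` list; that part depends entirely on how your historian stores the data.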
Hope this helps
Thanks a lot Jim. In our case the UFL probably will be a good option.
Just remember that archives need to be present for the whole period you will be loading (by the way, I only have experience with this on somewhat older PI 3.4.375.x versions). You need to make sure that:
1. archives exist to cover the entire period
2. the archives have enough space to store the data you will backload. (either you make static archives that are sufficiently large, or you make all archives dynamic)
3. when you use dynamic archives, make sure you create all the tags you need before you convert your archives to dynamic.
4. when in doubt about the number of tags you need, create a lot of "spare" tags which you can rename when needed.
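Point 1 above is easy to get wrong over a multi-year backfill. As a sanity check, here is a small sketch that finds the gaps between a set of archive time ranges and the period you intend to load. It is purely illustrative: in practice you would get the archive list from piartool or SMT, not hard-code it.

```python
from datetime import datetime

def coverage_gaps(archives, start, end):
    """Given (archive_start, archive_end) datetime pairs, return the
    sub-periods of [start, end] not covered by any archive.
    Illustrative only -- feed it the real archive list from your server."""
    # Clip each archive to the backfill window and sort by start time.
    spans = sorted(
        (max(a, start), min(b, end))
        for a, b in archives
        if b > start and a < end
    )
    gaps = []
    cursor = start
    for a, b in spans:
        if a > cursor:
            gaps.append((cursor, a))  # uncovered stretch before this archive
        cursor = max(cursor, b)
    if cursor < end:
        gaps.append((cursor, end))    # uncovered tail after the last archive
    return gaps

# Example: two archives, with April missing from the backfill window.
archives = [
    (datetime(2010, 1, 1), datetime(2010, 3, 1)),
    (datetime(2010, 4, 1), datetime(2010, 6, 1)),
]
print(coverage_gaps(archives, datetime(2010, 2, 1), datetime(2010, 5, 1)))
```

Any gap it reports is a period where you would need to create (or reprocess) an archive before loading data.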
I imported 10 years' worth of PI data from Wonderware when I was working in the Gulf of Mexico about 5 years ago. Be warned that pulling data out of Wonderware can be a CPU killer. I ran 3 processes (which consumed 75% CPU) 24/7 for 3 months to backfill 10 years of data.
Thanks Peter. In our case, we only backfill 100 to 1,000 tags at a time when asked. We haven't been asked to backfill all the tags yet.
In your earlier post you mentioned that you backfilled 10 years of data from Wonderware to PI. What were the high-level steps you took to cover the whole process? We have about 5 years of historian data that needs to be loaded into the PI archive.
Don't forget that if you add 'new' tags to PI that you need to populate and backfill, you will need to reprocess any existing archives to provide space in them for tags that did not previously exist in those archives.
Thanks for the tip!
Reprocessing for new tags is only necessary for PI Data Archive versions 3.4.385 and earlier. If you are running PI Data Archive 2012 (3.4.390) or later, you do not need to reprocess.
Also, if you wish for the data to go through compression, make sure there isn't a snapshot value for the tags that is more recent than the data being backfilled. Usually for newly created tags this will be the "pt created" value. Deleting this value will allow data to go through compression normally. For 2012 and later, the tuning parameter Snapshot_DoNotReplacePTCreatedOnOOO controls the behavior of automatically deleting the "pt created" value. If it is enabled (i.e. do not replace), then the event will remain and backfilled events will be considered out-of-order and will not go through compression.
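To make the out-of-order point above concrete, here is a deliberately simplified toy model (not actual PI Data Archive code): events with timestamps older than the current snapshot are treated as out-of-order and skip the compression path, while events at or after the snapshot follow the normal path.

```python
def classify_events(snapshot_time, events):
    """Toy illustration of the backfill/compression interaction described
    above. Events older than the snapshot are out-of-order and bypass
    compression; newer events take the normal exception/compression path.
    This is a simplification, not the real PI Data Archive logic."""
    in_order, out_of_order = [], []
    for ts, value in events:
        if ts < snapshot_time:
            out_of_order.append((ts, value))  # behind the snapshot: no compression
        else:
            in_order.append((ts, value))      # normal path: compression applies
    return in_order, out_of_order

# If a "pt created" snapshot exists at t=100, backfilled events at t<100
# all land in the out-of-order bucket:
print(classify_events(100, [(50, 1.0), (99, 2.0), (150, 3.0)]))
```

This is why deleting the "pt created" snapshot value (or disabling the tuning parameter mentioned above) matters: it moves the effective snapshot out of the way so backfilled events are no longer out-of-order.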
I must be showing my age! Things are moving on apace, and I'll be getting left behind if I'm not careful!...