Yes, upgrading a PI Server without data loss is standard practice. The key is ensuring that all your data is buffered at the source, and buffering should be tested before any upgrade or move. The PI Server 2016 R2 documentation outlines the procedure for testing buffering.
When the PI server is upgraded, it goes into stand-alone mode, which closes/blocks external connections and causes incoming events to accumulate in the buffer(s). When the upgrade is complete, the buffer will start sending the queued events to the newly upgraded PI server without a break in the data. The existing archives may be converted during the upgrade process, but again, there shouldn't be any data loss.
If you are concerned about the integrity of the data on the secondary node, you can use PI to PI in history recovery mode for the period of the upgrade to verify that the data matches the source system.
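To illustrate the kind of consistency check history recovery addresses, here is a minimal, generic sketch that compares archived events from the source and the secondary for the upgrade window. This is not part of any PI SDK; how you retrieve the events (PI Web API, AF SDK, etc.) is up to your setup, and here they are just plain (timestamp, value) pairs:

```python
def compare_archives(source_events, secondary_events):
    """Return events present in the source archive but missing from the
    secondary, and vice versa. Events are (timestamp, value) tuples."""
    source_set = set(source_events)
    secondary_set = set(secondary_events)
    return {
        "missing_on_secondary": sorted(source_set - secondary_set),
        "extra_on_secondary": sorted(secondary_set - source_set),
    }

# Made-up data for a hypothetical one-hour upgrade window:
source = [("2016-10-01T10:00:00Z", 41.2), ("2016-10-01T10:30:00Z", 41.5)]
secondary = [("2016-10-01T10:00:00Z", 41.2)]
gaps = compare_archives(source, secondary)
# gaps["missing_on_secondary"] lists the events history recovery should fill.
```

Anything showing up in "missing_on_secondary" after the upgrade is exactly what a PI to PI history recovery pass for that period would backfill.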
Please don't hesitate to contact TechSupport if you have concerns at any step of the process. They will be able to work with you to ensure the process goes smoothly!
Excellent, thanks for the quick response!
I appreciate the clarification, I assumed it was possible but wanted to get input from the experts....much appreciated!
If you use AF Analytics, that is another item to consider: you might need to do manual backfilling depending on your setup. Event-triggered analyses are no problem, but scheduled analyses rely on data that isn't available while the server is in stand-alone mode. Event frames generated during that window are another thing to check.