It's an open question; what would you do? At 2 AM the server would probably be saturated, and the third option is probably valid as well.
First off, I would explore this a little more to understand what the current bottlenecks are and what kind of behavior is acceptable. Using an attribute that does not store history (on-demand style) for this daily emissions calculation could work, but what if users need 30 days of emissions values for all 200 units? How long would that calculation take? This solution is really the same as the PI DataLink calculated data function, except that you now expose the data to client tools as an AF attribute.
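To make the "how long does it take" concern concrete, here is a rough sizing sketch. The 200 units and 30 days come from the thread; everything else is an illustrative assumption, since an on-demand attribute recalculates for every day requested rather than reading a stored value.

```python
# Rough sizing sketch (hypothetical except the numbers from the thread):
# an on-demand daily-total attribute recalculates for every requested day,
# so a 30-day table for all 200 units triggers many summary calculations.
units = 200            # unit count from the thread
days = 30              # report window users might request
calls = units * days   # one daily summary per unit per day

print(calls)           # 6000 summary calculations per report refresh
```

Even at a few milliseconds per summary, thousands of calculations per refresh add up, which is why stored daily values start to look attractive.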
Running all the daily calculations just once a day might not be too taxing. As a best practice, this should be template-based. You could also look at the analysis statistics to see whether you could offset the calculations. In some solutions I have offset these summary calculations several minutes past the hour, since many emissions calculations run on the hour. In this case, perhaps schedule them a few minutes after midnight to let the data settle and account for late-arriving data.
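The offset idea above can be sketched in a few lines. This is a minimal illustration, not AF's actual scheduler; the five-minute offset is an example value.

```python
from datetime import datetime, timedelta

# Sketch of the scheduling offset: trigger the daily rollup a few minutes
# after midnight so late-arriving data has settled. The offset is illustrative.
def next_run(now, offset_minutes=5):
    midnight = (now + timedelta(days=1)).replace(
        hour=0, minute=0, second=0, microsecond=0)
    return midnight + timedelta(minutes=offset_minutes)

print(next_run(datetime(2024, 1, 1, 14, 30)))  # 2024-01-02 00:05:00
```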
For all DataLink usage, be sure to use a single PI DataLink array formula so that any of these solutions issues "bulk" queries, which reduces the number of round trips to the servers.
The largest load on the network and client software would come from bringing all of the compressed values into Excel. Instead of the PI Data Archive sending just one value per unit, you would be sending 8,640 values per unit (assuming no compression).
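For context on where 8,640 comes from: it is the event count for one value every 10 seconds over a full day (the scan interval is inferred from the figure, not stated in the thread). A quick check, and the multiplier across all 200 units:

```python
# Where the 8,640 figure comes from: one event every 10 seconds for a full
# day, with no compression filtering anything out. The 10-second interval
# is inferred from 8,640 events/day; it is not stated in the thread.
seconds_per_day = 24 * 60 * 60          # 86,400
scan_interval_s = 10                    # assumed scan rate
events_per_unit = seconds_per_day // scan_interval_s
print(events_per_unit)                  # 8640

# Pulling raw data for all 200 units multiplies that:
total_events = events_per_unit * 200
print(total_events)                     # 1,728,000 values into Excel
```

Moving 1.7 million values into a spreadsheet every morning is a very different load profile from retrieving 200 precomputed daily totals.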
I would also question whether this daily report really needs to be a PI DataLink report. Could you email out a single report with the daily emissions, or let users download it from a file server? Could a PI Vision display be used, perhaps with a collection that shows all of the emissions values?
Thanks for your time, Dan; your response was so helpful. Best regards!
You are welcome! Best of luck and let us know if you discover anything interesting.
Is the Excel strategy the only way to create the report? Why make the same archive calls 200+ times using DataLink? Could one user run a macro that saves a new copy with static values? Could the result be exposed in PI Vision? Could a summary analysis in AF publish its results on a schedule, with users retrieving the output tag? There are probably several valid approaches here depending on which tools you have available and on what is easy, robust, and fast. I usually default to the strategy that offers the easiest, cleanest, and fastest user experience.
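The "summary analysis publishes on a schedule" option boils down to rolling raw rate samples into one daily total per unit. The sketch below illustrates that rollup in plain Python with hypothetical sample data; in a real deployment an AF analysis (or a scheduled script) would write this total to an output tag that DataLink or PI Vision then reads as a single value.

```python
from datetime import datetime, timedelta

# Hypothetical rollup: sum emission-rate samples (units/s) into a daily
# total. A real AF analysis would time-weight irregular data; this assumes
# a fixed scan interval purely for illustration.
def daily_total(samples, interval_s=10):
    """Sum (timestamp, rate) samples into a daily total at a fixed interval."""
    return sum(rate * interval_s for _, rate in samples)

day = datetime(2024, 1, 1)
# 8,640 samples at a constant 0.5 units/s (made-up data for the sketch)
samples = [(day + timedelta(seconds=10 * i), 0.5) for i in range(8640)]
print(daily_total(samples))  # 43200.0 -> one stored value instead of 8,640
```

The payoff is the same in every client tool: each report row becomes one archived value rather than thousands of raw events.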