Note: Development and Testing purposes only. Not supported in production environments.
Link to other containerization articles
During PI World 2018, there was a request for a PI Analysis Service container. The user wanted to be able to spin up multiple PI Analysis Service containers to balance the load during periods of heavy backfilling. Unfortunately, this is not possible: each AF Server can have exactly one instance of PI Analysis Service running its analytics. But this has not discouraged me from making a PI Analysis Service container to add to our PI System compose architecture!
Features of this container include:
1. Ability to test for the presence of the AF Server so that setup won't fail
2. Simple configuration. The only thing you need to change is the hostname of the AF Server container that you will be using.
3. Speed. Build and setup take less than 4 minutes in total.
4. Buffering ability. Data will be held in the buffer when connection to target PI Data Archive goes down. (Added 13 Jun 2018)
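The AF Server presence check in feature 1 can be sketched as a simple retry loop. The sketch below is an illustration only, assuming a Bash-style shell and the default AF Server port 5457; the actual build script may test connectivity differently, and the hostname passed in is whatever you gave your AF Server container.

```shell
# Sketch only: wait_for_af polls a host until it accepts TCP connections
# on port 5457 (the default AF Server port), retrying up to a given
# number of times. This is an assumed approach, not the real script.
wait_for_af() {
  local host=$1 tries=${2:-30}
  for attempt in $(seq 1 "$tries"); do
    # /dev/tcp is a Bash feature: the redirect succeeds only if the
    # TCP connection can be opened within the 2-second timeout.
    if timeout 2 bash -c "echo > /dev/tcp/$host/5457" 2>/dev/null; then
      echo "reachable"
      return 0
    fi
    sleep 1
  done
  echo "unreachable"
  return 1
}

# Example: wait_for_af <AF Server container hostname> 30
```

Polling like this matters in a compose setup because the AF Server container may still be initializing when the PI Analysis Service container starts.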
You will need a running AF Server container, since PI Analysis Service stores its run-time settings in the AF Server. You can get one from Spin up AF Server container (SQL Server included).
1. Gather the AF Services install kit from the Techsupport website.
2. Gather the scripts and files from GitHub - elee3/PI-Analysis-Service-container-build.
3. Your folder should now look like this.
4. Run build.bat with the hostname of your AF Server container.
build.bat <AF Server container hostname>
5. Now you can execute the following to create the container.
docker run -it -h <DNS hostname> --name <container name> pias
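As a concrete example, assuming your AF Server container's hostname is af1 and you want to call the new container pias1 (both names are placeholders for illustration), steps 4 and 5 would look like this:

```shell
# Build the image, pointing it at the AF Server container "af1".
build.bat af1

# Create and start the PI Analysis Service container from the
# "pias" image, giving it the DNS hostname and name "pias1".
docker run -it -h pias1 --name pias1 pias
```

These commands require Docker and the install kits from the earlier steps, so they are shown here only to make the placeholder syntax concrete.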
That's all you need to do! Now when you connect to the AF Server container with PI System Explorer, you will notice that the AF Server is enabled for asset analysis. (Originally, it wasn't.)
By running this PI Analysis Service container, you can now configure asset analytics for your AF Server container to produce value-added calculated streams from your raw data streams. I will be including this service in the Docker Compose PI System architecture so that you can run everything with just one command.
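A compose file wiring the two containers together might look roughly like the sketch below. The service names, hostnames, and image names here are assumptions for illustration, not the actual compose file:

```yaml
# Hypothetical sketch: an AF Server service plus the PI Analysis
# Service container built above. Image names are placeholders.
version: "3"
services:
  af1:
    image: afserver
    hostname: af1
  pias1:
    image: pias
    hostname: pias1
    depends_on:
      - af1
```

depends_on only controls start order, which is why a readiness check like the retry loop described earlier is still useful inside the PI Analysis Service container.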