6 Replies Latest reply on Sep 27, 2017 12:52 PM by Wissam

    PI Web API Batch Requests Deadlock

    Wissam

      Working with batch requests is a real improvement from a performance perspective; however, it seems you cannot perform two or more batch requests in parallel: the second batch waits until the first is served before it starts. This makes performance slow when multiple users are connected.

      Any idea if this is configurable, or is there an alternative service I can use to get the values of multiple attributes in one call without blocking the PI Web API thread? (streamsets does not fit here because I cannot select the target attributes.)

        • Re: PI Web API Batch Requests Deadlock
          Marcos Vainer Loeff

          Hello Wissam,

           

          This is not true. Batch sub-requests can run in parallel, depending on the value of each sub-request's ParentIds property. If ParentIds is null or undefined, PI Web API will process the requests in parallel, as you expect. Otherwise, a request starts only after all of its parent requests finish.
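
          To make the ParentIds semantics concrete, here is a rough sketch (not PI Web API's actual scheduler) that groups a batch body into "waves" of sub-requests that could run in parallel: requests with no ParentIds start immediately, and every other request waits for all of its parents. The resources below are placeholders.

          ```python
          # Sketch: group batch sub-requests into parallel "waves" based on ParentIds.
          # This only illustrates the dependency rule; it is not the server's code.
          def parallel_waves(batch):
              done, waves = set(), []
              while len(done) < len(batch):
                  wave = [rid for rid, req in batch.items()
                          if rid not in done
                          and all(p in done for p in req.get("ParentIds", []))]
                  if not wave:
                      raise ValueError("cyclic ParentIds")
                  waves.append(sorted(wave))
                  done.update(wave)
              return waves

          # Hypothetical batch: "1" and "2" are independent, "3" depends on "1".
          batch = {
              "1": {"Method": "GET", "Resource": "https://localhost/piwebapi/assetservers"},
              "2": {"Method": "GET", "Resource": "https://localhost/piwebapi/dataservers"},
              "3": {"Method": "GET", "Resource": "$.1.Content.Links.Databases",
                    "ParentIds": ["1"]},
          }
          print(parallel_waves(batch))  # [['1', '2'], ['3']]
          ```

          So "1" and "2" can be served concurrently, while "3" must wait for "1".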

           

          Hope this helps!

            • Re: PI Web API Batch Requests Deadlock
              Wissam

              Thanks Marcos,

              That's true, but my question addresses the case where two different batch requests are called at the same time. It seems that PI Web API is queuing these calls.

              Below an example:

              Batch Request A

              {
                "1": {
                  "Method": "POST",
                  "Resource": "https://localhost/piwebapi/assetdatabases/D0NxzXSxtlKkGzAaZhKOB-KABJ2buwfWrkye3YhdL2FOUAUEhMQUZTMDRcQgYUUEVSRk9STUFOQ0UgVEVTVElORw/elements",
                  "Content": "{\"Name\":\"New Element\"}",
                  "Headers": {
                    "Cache-Control": "no-cache"
                  }
                },
                "2": {
                  "Method": "GET",
                  "Resource": "$.1.Headers.Location",
                  "ParentIds": ["1"]
                },
                "3": {
                  "Method": "POST",
                  "Resource": "$.2.Content.Links.Attributes",
                  "Content": "{\"Name\":\"New Attribute\"}",
                  "ParentIds": ["2"]
                }
              }

              Batch Request B

              {
                "1": {
                  "Method": "POST",
                  "Resource": "https://localhost/piwebapi/assetdatabases/D0NxzXSxtlKkGzAaZhKOB-KABJ2buwfWrkye3YhdL2FOUAUEhMQUZTMDRcQgYUUEVSRk9STUFOQ0UgVEVTVElORw/elements",
                  "Content": "{\"Name\":\"New Element\"}",
                  "Headers": {
                    "Cache-Control": "no-cache"
                  }
                },
                "2": {
                  "Method": "GET",
                  "Resource": "$.1.Headers.Location",
                  "ParentIds": ["1"]
                },
                "3": {
                  "Method": "POST",
                  "Resource": "$.2.Content.Links.Attributes",
                  "Content": "{\"Name\":\"New Attribute\"}",
                  "ParentIds": ["2"]
                }
              }

               

              If A and B are called at the same time, PI Web API serves B only after A is done. So if each batch takes 2 seconds to complete, B finishes after 4 seconds, since it was waiting for A to complete.

              This behavior doesn't occur in other services, even though PI Web API supports multi-threading.
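
              One way to check whether two batch calls are really serialized is to fire them concurrently and compare the wall-clock time against a single call. Below is a minimal sketch of that measurement; a `time.sleep` stands in for the HTTP round trip (in a real test it would be a `requests.post` to the hypothetical `https://localhost/piwebapi/batch` endpoint with credentials).

              ```python
              import time
              from concurrent.futures import ThreadPoolExecutor

              def batch_call(payload):
                  # In a real test this would be something like:
                  #   requests.post("https://localhost/piwebapi/batch", json=payload, ...)
                  # Here a short sleep stands in for the batch round trip.
                  time.sleep(0.2)
                  return payload

              start = time.monotonic()
              with ThreadPoolExecutor(max_workers=2) as pool:
                  results = list(pool.map(batch_call, ["A", "B"]))
              elapsed = time.monotonic() - start

              # If the server truly parallelizes, elapsed is close to one call's
              # duration; if it serializes, elapsed is close to the sum of both.
              print(results, elapsed)
              ```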

                • Re: PI Web API Batch Requests Deadlock
                  ashaw

                  Hi Wissam Youssef

                   

                  First, I would hesitate to call this behavior a deadlock, which implies that each batch request would be waiting on the other, so that neither could complete.

                  Instead, it looks like the requests are being executed serially: the second request waits for the first to complete before it can run.

                   

                  Second, are you submitting the exact same Batch Request twice?

                  Based on your example, it looks like your batch requests are creating an element (and then an attribute).

                  Are you intending to submit parallel requests to create the same element twice?

                   

                  Finally, in general, calls are parallelized as much as possible by PI Web API, but there are many reasons this might not happen.

                  • One example is that there are many machines potentially involved in a simple request (PI Web API server, AF Server, PI Data Archive), and all of these are subject to processor core contention, so at any given time one of these machines may not have any cores available to serve requests in parallel.
                  • The more likely reason is that state-altering requests (creations, deletions, etc.) are limited in how much they can be parallelized. In general, when you make a state-altering request you can expect that it might not execute in parallel. I notice that your example batch request is state-altering (creating elements/attributes). Do you still see a linear scale in your response times if you execute only GET requests within your batch requests?
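                  As a concrete way to run that test, a read-only batch body can be built like this. The WebIds below are placeholders, and this only constructs the body (it is not a claim about how the server will schedule it).

                  ```python
                  # Build a hypothetical GET-only batch body (no state-altering requests),
                  # to test whether response times still scale linearly. WebIds are
                  # placeholders for real stream WebIds.
                  import json

                  get_only_batch = {
                      str(i): {
                          "Method": "GET",
                          "Resource": f"https://localhost/piwebapi/streams/WEBID_{i}/value",
                      }
                      for i in range(1, 4)
                  }
                  print(json.dumps(get_only_batch, indent=2))
                  ```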
                    • Re: PI Web API Batch Requests Deadlock
                      Wissam

                      Thanks Adam,

                      The example I showed is just a sample; the real batch request I am using is a list of streams/{webId}/interpolated GET requests for attributes, which could be called more than once when multiple users are logged in. However, it seems that PI Web API is handling multiple calls linearly.

                      I investigated this by executing a long batch request and, while it was pending, triggering a smaller one (also via Batch) to get a single value at a specific time; it wasn't completed until the first one was done.

                        • Re: PI Web API Batch Requests Deadlock
                          ashaw

                          Hi Wissam,

                           

                          Are either of your requests (the Batch request or the smaller request) sending a "Cache-Control: no-cache" header along with the requests?

                          The Help file for the Batch controller has many sample requests with this header.

                           

                          If this header is present, any requests that include that header cannot be executed in parallel.

                          This is because they instruct the web server to refresh any cached data, which means all currently running requests must finish, then a cache refresh occurs, then the request that sent the no-cache header can be served.
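
                          For illustration, here is what the same hypothetical sub-request looks like with and without that header; per the explanation above, the version carrying "Cache-Control: no-cache" cannot be executed in parallel with other in-flight requests. The WebId is a placeholder.

                          ```python
                          # A plain sub-request, and the same one with the no-cache header
                          # that the Batch help-file samples include.
                          plain = {
                              "Method": "GET",
                              "Resource": "https://localhost/piwebapi/streams/WEBID_PLACEHOLDER/value",
                          }
                          no_cache = dict(plain, Headers={"Cache-Control": "no-cache"})

                          # Dropping the Headers block (left) avoids the forced cache
                          # refresh that serializes execution (right).
                          print("Headers" in plain, "Headers" in no_cache)
                          ```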

                            • Re: PI Web API Batch Requests Deadlock
                              Wissam

                              No, I am not passing the Cache-Control header in the requests. After investigating the issue further, it appears that the delay is in the streams/{webId}/summary service, which is part of the batch request body: it returns "Value": "Out of sequence data events in summary calculation." for date ranges longer than one month, and takes more than a minute to respond.
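
                              One way to profile that slow call is to reproduce it outside the batch as a plain GET with a narrower time range and see where the slowdown begins. The WebId, time range, and summary type below are placeholders, and authentication is omitted.

                              ```python
                              # Sketch: build the summary URL in isolation so the slow call
                              # can be timed outside the batch. All values are placeholders.
                              from urllib.parse import urlencode

                              web_id = "WEBID_PLACEHOLDER"
                              params = {
                                  "startTime": "*-1mo",   # shrink/grow to find where it slows down
                                  "endTime": "*",
                                  "summaryType": "Average",
                              }
                              url = (f"https://localhost/piwebapi/streams/{web_id}/summary"
                                     f"?{urlencode(params)}")
                              print(url)
                              ```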

                              I'll dig more into this; hopefully fixing it will solve my issue.

                               

                              Thanks.