Most of the integrators/connectors use AF, and I would recommend the same. Based on your requirement, I would suggest the PI Integrator for Business Analytics with an Azure target. With conventional methods you need to define your own implementation and maintain custom code; creating asset views, publishing, and scheduling are not available by default, so you may need to depend on Windows tasks or services. With the PI Integrator for BA, all of this is available out of the box. Regarding the license, it is tag based (i.e., priced by the number of tags); you can contact the sales/tech support team for exact details.
1500-2000 tags taking more time to extract and transfer might be due to the data itself. It depends on the tag frequency (data collection rate), and if you are using summary calculations like total, average, etc., it will take more time than expected. If your archive subsystem is busy, you will see a performance impact.
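To give a feel for why summaries cost more than a plain read: a PI-style average is time weighted, so it has to walk every archived event in the range rather than fetch a single value. A minimal Python sketch of my own (not PI code), assuming simple trapezoidal weighting between consecutive events:

```python
def time_weighted_average(events):
    """Time-weighted average over (timestamp_seconds, value) pairs.

    Illustration only: a real PI summary also handles stepped vs.
    interpolated tags, bad values, and boundary types.
    """
    if len(events) < 2:
        raise ValueError("need at least two events")
    total_area = 0.0
    for (t0, v0), (t1, v1) in zip(events, events[1:]):
        # trapezoidal area between consecutive events
        total_area += (v0 + v1) / 2.0 * (t1 - t0)
    return total_area / (events[-1][0] - events[0][0])

# Irregularly spaced samples: the long flat stretch dominates the result,
# unlike a plain arithmetic mean of the three values (13.33).
samples = [(0, 10.0), (10, 10.0), (60, 20.0)]
print(time_weighted_average(samples))  # ~14.17
```

Every event in the time range has to be read from the archive to compute this, which is why a busy archive subsystem hurts summary extraction most.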
Please find the details in the KB article above related to scheduling: trigger based and continuous.
The PI Integrator for Business Analytics requires AF contexts for your time series data. We believe this is the only manageable way to deal with data from numerous assets: keep the metadata in AF and let users access the data through AF. AF provides that framework.
Therefore, I'm afraid you will not be able to use the PI Integrator for BA if you have no AF configured, but I strongly recommend that you configure AF.
Also, the data extracted by the PI Integrator will differ from the compressed raw data in your PI System: there will be interpolation.
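To see why the extracted values can differ from the archive, consider sampling on a fixed interval between two recorded events. A minimal sketch of linear interpolation (my own illustration, not Integrator code):

```python
def interpolate_at(t, events):
    """Linearly interpolate a value at time t from (time, value) events.

    Illustration only; PI tags can also be configured for stepped
    (hold-last-value) interpolation rather than linear.
    """
    for (t0, v0), (t1, v1) in zip(events, events[1:]):
        if t0 <= t <= t1:
            frac = (t - t0) / (t1 - t0)
            return v0 + frac * (v1 - v0)
    raise ValueError("t outside recorded range")

archive = [(0, 100.0), (30, 160.0)]  # compressed/raw archive events
samples = [interpolate_at(t, archive) for t in (0, 10, 20, 30)]
print(samples)  # [100.0, 120.0, 140.0, 160.0]
```

Here 120.0 and 140.0 were never archived; they exist only in the interpolated output, which is exactly the difference you will see between Integrator output and raw archive data.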
To answer your questions:
2 i) The PI Integrator to Azure Data Lake target is based on a time interval. Only the supported streaming targets can be streamed, and only with the Advanced edition (Apache Kafka, Azure IoT Hub, and Event Hub).
2 ii) No technical limitations that I am aware of, unless it's a licensing limit.
2 iii) Licensing is done via the number of streams to your target.
2 iv) The PI Integrator is a product (install, configure, and use), while the PI ODBC Driver and PI Web API are both members of the PI Developer Technologies. You get both more freedom and more pitfalls with the PI Developer Technologies, and you have to come up with your own way to connect and send the data to Azure Data Lake Store.
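For example, with PI Web API you would build and issue the REST calls yourself, and the upload to Azure Data Lake Store would also be entirely your code. A rough sketch of building a recorded-values request (the server URL and WebId below are placeholders, not real values):

```python
from urllib.parse import urlencode

def recorded_values_url(base_url, web_id, start_time, end_time):
    """Build a PI Web API 'recorded values' request URL.

    The /streams/{webId}/recorded endpoint is part of PI Web API;
    base_url and web_id here are placeholder assumptions.
    """
    query = urlencode({"startTime": start_time, "endTime": end_time})
    return f"{base_url}/streams/{web_id}/recorded?{query}"

url = recorded_values_url(
    "https://myserver/piwebapi",  # placeholder server
    "F1DPExampleWebId",           # placeholder WebId
    "*-1d", "*",                  # PI time syntax: last 24 hours
)
print(url)

# Issuing the call and pushing the JSON on to Data Lake is then up to you,
# e.g. (not run here):
# import urllib.request, json
# with urllib.request.urlopen(url) as resp:
#     payload = json.load(resp)
```

That do-it-yourself step is the trade-off versus the Integrator, which handles the target connection, scheduling, and publishing for you.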
Thank you Lal for the KB!