Outputting query results to a comma-separated values (CSV) file is a valuable technique for porting data from one application to another, particularly when a direct connection is not an option. This could include sending the data as an email attachment, populating a folder with CSV files from one application that another application digests from that folder, or anything in between. Most applications that consume data can read a well-formatted CSV file, so the portability of this approach is very high.
To output a PI SQL query result to a CSV file, the first question to ask is whether this is a one-time output or a regularly scheduled task.
For a one-time output, the easiest approach is to use PI SQL Commander and direct the output to a file rather than the grid. With a query window open, navigate to Query > Result To > Result to File:
Now when the query is executed, you'll be prompted for a save location, and the output will be written to that CSV file instead of the grid:
The output in PI SQL Commander will show the number of rows, and the file will contain the data:
For a regularly occurring scheduled task, there are a number of options depending on what exactly you have available to you and what features you're looking for when generating this CSV file. The general workflow of these approaches is the same, however: there needs to be a connection to the PI System, logic to execute the query and handle the data, and a scheduler that kicks off the task at the specified interval.
1. The first option is to not use PI SQL at all, but to use the PI Integrator for Business Analytics and set the Target to be a text file. Even though this is an online course regarding the PI SQL Framework, PI SQL does not have to be the answer to every use case. The PI Integrator for Business Analytics is designed for use cases such as this, and it's a very useful plug-and-play solution.
For further information on the Integrator, visit the LiveLibrary page on this topic: PI Integrator for Business Analytics - Configure Text File Targets, or contact Tech Support for further assistance.
2. A PowerShell script can be created and the job executed by the Windows Task Scheduler. The PowerShell script could use ADODB recordsets, as described in this TechNet blog. The provider string can be pulled from the PI SQL Client, PI OLEDB Provider, or PI OLEDB Enterprise user guides (or from How to Quickly Obtain an OLEDB Connection String with a UDL File). The output to CSV can be handled by a PowerShell cmdlet such as Out-File or Export-Csv, once the recordset has been shaped into objects the cmdlet can format. The Windows Task Scheduler task can then be set to launch PowerShell, passing it the parameter that causes it to execute the script.
This approach has the benefit of being very lightweight, using software that's already installed on any Windows OS, including a Server Core installation.
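The script described above can be sketched roughly as follows. This is a minimal sketch, not a tested implementation: the provider name, server name, query, and file paths are placeholder assumptions that must be replaced with values from your own environment (and the appropriate user guide), and it requires a Windows host with the PI OLEDB provider installed.

```powershell
# Placeholder connection string -- pull the real provider string from the
# PI OLEDB Enterprise (or PI SQL Client) user guide, or from a UDL file.
$connectionString = "Provider=PIOLEDBENT; Data Source=MyAFServer; Integrated Security=SSPI"

# Placeholder query -- substitute your own PI SQL query here.
$query = "SELECT Name, Value, TimeStamp FROM [MyDatabase].[Asset].[ElementAttribute]"

# Open the connection and execute the query through ADODB COM objects.
$connection = New-Object -ComObject ADODB.Connection
$connection.Open($connectionString)
$recordset = $connection.Execute($query)

# Walk the recordset, shaping each row into a PSCustomObject so that
# Export-Csv can handle the formatting.
$rows = while (-not $recordset.EOF) {
    $row = [ordered]@{}
    foreach ($field in $recordset.Fields) { $row[$field.Name] = $field.Value }
    [pscustomobject]$row
    $recordset.MoveNext()
}

$rows | Export-Csv -Path "C:\Exports\PIQueryResult.csv" -NoTypeInformation

$recordset.Close()
$connection.Close()
```

The corresponding Task Scheduler action would launch PowerShell with the script path as its argument, for example: program `powershell.exe` with arguments `-NoProfile -File C:\Scripts\Export-PIQuery.ps1` (hypothetical path).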
3. A Microsoft SQL Server Integration Services (SSIS) package can be created and the job executed by the SQL Server Agent. The SSIS package could read from the PI System using an OLE DB Source and write to a CSV file using a Flat File Destination. For further information, see Microsoft's documentation on the Flat File Destination, and see this video in the Guided Project for the OLE DB Source.
This approach is a good fit if multiple data sources are being combined, heavy processing or shaping is being done on the data, or your SQL DBAs already monitor a number of SQL Server Agent jobs and SSIS packages.