Has anyone, OSI or partner, ever looked into a mechanism for storing video streams alongside process data in PI? My vision is to store the video stream directly in a PI tag archive, not via annotations: at the actual time each frame of video is captured, that frame of data is sent to and stored in a PI tag at that timestamp. You would then have a video stream viewer control to watch the stream directly from the PI tag data, with all the usual video controls (play, rewind, pause), via the usual PI SDK methods. Just as a slider on a trend lets you view the data at a given time, the video control would display the frame(s) at that exact time. Playing the video stream, you would see the slider move along a trend displaying the associated process data as traces.
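To make the idea concrete, here is a minimal conceptual sketch (plain Python, not actual PI SDK code - the class and method names are made up for illustration). It treats the tag as a list of timestamped frame blobs and answers a slider query with "the frame captured at or just before this time", which is the same previous-value retrieval a trend slider would use:

```python
import bisect
from datetime import datetime, timedelta

class FrameArchive:
    """Toy stand-in for a PI tag whose values are video frames stored as blobs."""
    def __init__(self):
        self._times = []   # sorted capture timestamps
        self._frames = []  # frame payloads (bytes), parallel to _times

    def store(self, ts: datetime, frame: bytes):
        # Insert in timestamp order (frames normally arrive in capture order,
        # so this is effectively an append).
        i = bisect.bisect_right(self._times, ts)
        self._times.insert(i, ts)
        self._frames.insert(i, frame)

    def frame_at(self, ts: datetime) -> bytes:
        # "At or before" retrieval: the frame whose capture time is the latest
        # one not after ts - what the viewer shows as the slider moves.
        i = bisect.bisect_right(self._times, ts)
        if i == 0:
            raise LookupError("no frame at or before requested time")
        return self._frames[i - 1]

# Simulate a 4 fps stream, then query between two frame times as a slider would.
arc = FrameArchive()
t0 = datetime(2024, 1, 1, 12, 0, 0)
for n in range(8):
    arc.store(t0 + timedelta(milliseconds=250 * n), f"frame-{n}".encode())

print(arc.frame_at(t0 + timedelta(milliseconds=700)))  # frame captured at 500 ms
```

Playing the stream forward is then just stepping the query time and pulling successive frames, while the same timestamps index the process data traces.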
Imagine you are investigating flaring and see spikes in a flow meter to the flare. If the video stream were stored at the same time, you could watch the actual footage of the flare while reviewing the flow meter data alongside it. (Just one example - the possibilities are huge, e.g. production line quality monitoring.)
Now data storage is an obvious issue when you talk about video streams, but with some clever compression and buffering, and the fact that storage is relatively cheap (separate PI archives for video streams via PI Point partitioning), it could work.
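A quick back-of-envelope calculation suggests the storage is manageable per camera. The bitrate below is an assumption for illustration (roughly H.264-class compression at modest resolution), not a PI or codec figure:

```python
# Rough archive size for one continuously recorded, compressed camera stream.
bitrate_mbps = 1.0  # assumed ~1 Mbit/s compressed stream (illustrative only)

bytes_per_day = bitrate_mbps * 1e6 / 8 * 86_400  # bits/s -> bytes/s -> per day
gb_per_day = bytes_per_day / 1e9
tb_per_year = gb_per_day * 365 / 1e3

print(f"{gb_per_day:.1f} GB/day, {tb_per_year:.1f} TB/year")
```

At that rate a camera costs on the order of 10 GB a day, so a dedicated archive partition per stream, plus motion-triggered or event-windowed recording instead of 24/7 capture, would keep it well within reach of cheap disk.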