pyqt5 multiple video streams
-
Hello,
I have a question regarding streaming video to my GUI. I currently have two threads that manage two different cameras, and they run at different speeds: one at 120fps and the other at 40fps. In reality, while the first camera will be capturing at 120fps, every three frames will be co-added, so its display output will actually be only 40fps.
I laid out my GUI with two QLabels side by side, one for each of the two camera streams. I set the first one up with an emitter that signals for each frame of the video stream. It works perfectly:
rawImage = QtGui.QImage(img.data, img.shape[1], img.shape[0], QtGui.QImage.Format_Grayscale8)
self.utcPixmap.emit(rawImage)
...
self.utcThread.utcPixmap.connect(lambda p: self.setUTCPixMap(p))
...
def setUTCPixMap(self, p):
    p = QtGui.QPixmap.fromImage(p)
    p = p.scaled(self.sysV.XDim, self.sysV.YDim, QtCore.Qt.KeepAspectRatio)
    self.UTCImage.setPixmap(p)
I then set up the second camera with its own thread and the same logic as above.
However, the emitters are triggering so fast and so often that the two streams conflict, and I'm getting very choppy images on both displays. Both emitters are delivering to the same main window. How do I go about reducing the contention between the two streams?
-
Hi,
One very important thing: to the best of my knowledge there's currently no consumer graphics card that outputs at 120fps, and you would also need an adequate screen.
The technique you are currently using is also not optimized for speed or fast rendering.
If you need to go fast, you are going to have to consider OpenGL, and likely C++, to move things around faster.
What kind of camera are you using?
-
The two cameras are similar to these; the high-speed camera is connected to a framegrabber.
http://www.sensorsinc.com/products/detail/microswir-camera
https://en.ids-imaging.com/store/ui-3271le.html
This is being used in an R&D environment. I'm only working with fast-exposure, low-resolution images, 600x300 for example. Nothing that is movie quality.
-
I'm assuming that you're suggesting a single queue that I can insert into from either camera A or B, in their respective threads. Then, perhaps using a QTimer, I could set up the main window to read the queue every 10 ms, say, popping each frame and inserting it into the QLabel it's associated with?
To be fair, the top speed is actually going to be more like 40fps. The camera will be collecting at 120fps, but I'll be co-adding frames in groups of three: frames 1, 2, and 3 will be co-added to reduce noise, and that combined frame is what would be displayed on the UI.
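To make the co-adding concrete, here is a tiny sketch, assuming 8-bit grayscale numpy frames (coadd is an illustrative name, not my actual code):

import numpy as np

def coadd(frames):
    # Average a group of three consecutive uint8 frames to reduce noise,
    # turning a 120fps capture into a 40fps display stream.
    stacked = np.stack(frames).astype(np.uint16)   # widen to avoid uint8 overflow
    return (stacked.sum(axis=0) // len(frames)).astype(np.uint8)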
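And here is a minimal sketch of the single-queue + QTimer idea I described above, again with illustrative names (CameraThread, frameQueue, labelA/labelB, and the grab callables are placeholders, not my real capture code):

import queue
from PyQt5 import QtCore, QtGui, QtWidgets

frameQueue = queue.Queue()

class CameraThread(QtCore.QThread):
    def __init__(self, cam_id, grab_frame, parent=None):
        super().__init__(parent)
        self.cam_id = cam_id
        self.grab_frame = grab_frame   # callable returning one grayscale numpy frame

    def run(self):
        while not self.isInterruptionRequested():
            # tag each frame with its camera ID so the consumer knows
            # which QLabel it belongs to
            frameQueue.put((self.cam_id, self.grab_frame()))

class MainWindow(QtWidgets.QWidget):
    def __init__(self):
        super().__init__()
        self.labelA, self.labelB = QtWidgets.QLabel(), QtWidgets.QLabel()
        layout = QtWidgets.QHBoxLayout(self)
        layout.addWidget(self.labelA)
        layout.addWidget(self.labelB)
        self.timer = QtCore.QTimer(self)
        self.timer.timeout.connect(self.drainQueue)
        self.timer.start(10)   # poll the queue every 10 ms

    def drainQueue(self):
        while not frameQueue.empty():
            cam_id, img = frameQueue.get()
            qimg = QtGui.QImage(img.data, img.shape[1], img.shape[0],
                                img.strides[0], QtGui.QImage.Format_Grayscale8)
            label = self.labelA if cam_id == 0 else self.labelB
            label.setPixmap(QtGui.QPixmap.fromImage(qimg))

Since the timer slot is the only consumer, the empty()/get() pairing is safe, and all painting stays on the GUI thread.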
-
Ok, this is a rabbit hole, haha. No, I am not currently using multiprocessing; I'm using threads via QtCore.QThread.
I understand the logic of using multiprocessing over threads, but I'm not entirely up on how pipes would be used in this context. How do I set up an event that is triggered when something is available, and keep it synchronized across the two sources?
It sounds very similar to how emit() works, except I would be using a queue-like, FIFO means of transferring data. The synchronization between the two queues is where I'm getting lost.
I guess for now the first step is to set things up as Processes rather than threads and figure out how to raise an event when a pipe contains data.
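For the "event when a pipe contains data" part, one option (a sketch, not tied to my capture code; capture_a and capture_b are placeholders) is multiprocessing.connection.wait(), which blocks until at least one connection is ready, so neither camera can starve the other:

import multiprocessing as mp
from multiprocessing.connection import wait

def capture_a():
    return b"frame-from-A"   # stand-in for the real 120fps grab + co-add

def capture_b():
    return b"frame-from-B"   # stand-in for the real 40fps grab

def camera_worker(conn, capture):
    while True:
        conn.send(capture())   # pickles each frame into the pipe

if __name__ == "__main__":
    recv_a, send_a = mp.Pipe(duplex=False)
    recv_b, send_b = mp.Pipe(duplex=False)
    mp.Process(target=camera_worker, args=(send_a, capture_a), daemon=True).start()
    mp.Process(target=camera_worker, args=(send_b, capture_b), daemon=True).start()

    while True:
        # wait() returns only the connections that have data to read
        for conn in wait([recv_a, recv_b]):
            frame = conn.recv()
            cam = "A" if conn is recv_a else "B"   # connection identity gives the source
            # hand (cam, frame) off to the GUI here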
-
To achieve the speed you want, you really should consider using something closer to the hardware for your image processing.
Writing a QtMultimedia camera backend could be an option and you would get the QCamera/QVideoWidget integration "for free".
Out of curiosity, what OS are you running on?
-
Can you get the camera video stream using GStreamer?
-
In order to capture frames from each setup, I'm using the respective vendor libraries. For the UTC camera connected to the framegrabber, I'm using the xclib library from EPIX, while the IDS camera uses their own library, or rather its Python wrapper, pyueye.
-
@nightpoison Do they provide example applications for their libraries? Especially at 120fps?
-
Yes, they do. I'm able to pull frames and update the UI at 120fps for a single stream. Once I added the second stream, that's where things went south: using emit() caused issues that resulted in reduced speeds and torn, distorted images.
Keep in mind that in reality I don't actually need the 120fps output on the UI; I probably don't even need the 40fps. So I've slowed everything down for now. I'm using a single Queue() to load frames from each camera and a QTimer() to check whether the Queue() is empty. Each frame is saved as a tuple with a camera ID, so I know which frames go to which QLabel. I'm limited to about 20fps right now; any faster and the queue builds up a backlog fast.
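One way to keep that backlog from forming (a sketch against the setup described above; frameQueue and updateLabel are illustrative names) is to drain the whole queue on each timer tick but keep only the newest frame per camera ID, since only the latest frame per QLabel ever needs to be painted:

import queue   # for queue.Empty

def drainQueue(self):
    latest = {}
    while True:
        try:
            cam_id, img = self.frameQueue.get_nowait()
        except queue.Empty:
            break
        latest[cam_id] = img   # older frames from this camera are dropped
    for cam_id, img in latest.items():
        self.updateLabel(cam_id, img)   # hypothetical per-camera display slot

This way capture can outpace the timer without the queue growing; you simply display fewer of the captured frames.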
-
Then, as already suggested, you should write a QCamera backend using these libraries. You'll get better performance.