Fast video display on embedded Linux
-
Hi,
I would like to display some video, processed in real time, on an embedded computer (ARM processor) running Ubuntu. I want to run my program in EGL mode and, if possible, under the X server too. I'm using Qt 5.3.
It seems that creating the bitmaps and displaying them in a QGraphicsWidget is the bottleneck of my program.
In short, what I want to do (a minimal sketch follows this list):
- Create an array to store the bitmap, and a QImage based on this array.
- Display this QImage in a widget, scaled to fit.
- Replace the content of the bitmap array with a new image and redisplay it.
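To make that concrete, here is a minimal sketch of the buffer-plus-QImage part (the size and the RGB888 format are just placeholders for whatever my processing actually produces):

```cpp
#include <QImage>
#include <vector>

// Placeholder frame size; in reality it comes from the processing pipeline.
static const int W = 640, H = 480;

// Pixel buffer owned by the processing code (RGB, 3 bytes per pixel).
static std::vector<uchar> frameData(W * H * 3);

// This QImage constructor does NOT copy the data, it only wraps the buffer,
// so overwriting frameData and triggering a repaint shows the new frame.
// The buffer must stay alive (and keep its address) while the QImage is used.
static QImage frame(frameData.data(), W, H, W * 3, QImage::Format_RGB888);
```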
Of course, these small computers aren't graphical workstations, so I'm not expecting miracles, but I'm still wondering: what is the fastest way to achieve this?
1. Creating a QLabel and using setPixmap(QPixmap::fromImage(image)) (this won't scale the contents)
2. Creating a class based on QWidget and using a QPainter in the paintEvent handler to display the image scaled (a rough sketch follows this list)
3. Creating a QGraphicsView, adding a scene, then a QGraphicsPixmapItem, and setting its contents with setPixmap(QPixmap::fromImage(image))
4. Subclassing QOpenGLWidget and implementing the paintEvent handler
5. Subclassing QOpenGLWidget, creating a texture from the bitmap data (glTexImage2D) and displaying the image as a textured polygon
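To illustrate #2, I mean roughly the following (#4 would be the same code with an OpenGL-backed base class; note that QOpenGLWidget only exists since Qt 5.4, so with Qt 5.3 the equivalent would be QGLWidget). The class name is made up:

```cpp
#include <QWidget>
#include <QImage>
#include <QPainter>
#include <QPaintEvent>

// Option #2: a plain QWidget that scales the current frame in paintEvent().
// For option #4 the idea is the same, only the base class changes, so that
// QPainter uses the OpenGL paint engine.
class VideoWidget : public QWidget
{
public:
    explicit VideoWidget(const QImage &frame, QWidget *parent = 0)
        : QWidget(parent), m_frame(frame) {}

protected:
    void paintEvent(QPaintEvent *)
    {
        QPainter p(this);
        // drawImage() with a target rectangle scales the frame to fit the widget.
        p.drawImage(rect(), m_frame);
    }

private:
    const QImage &m_frame;   // wraps the externally updated pixel buffer
};
```

After writing a new frame into the underlying buffer I would simply call update() on the widget.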
Currently I'm using solution #3, because the QGraphicsView class is really well suited to this kind of problem. But the scene is quite simple, so the other solutions should work too.
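For reference, a stripped-down version of my current #3 setup looks roughly like this (the frame here is just a black placeholder instead of my real buffer-backed image):

```cpp
#include <QApplication>
#include <QGraphicsScene>
#include <QGraphicsView>
#include <QGraphicsPixmapItem>
#include <QImage>
#include <QPixmap>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);

    // One pixmap item in a scene; the view scales it to the window.
    QGraphicsScene scene;
    QGraphicsPixmapItem *item = scene.addPixmap(QPixmap());
    QGraphicsView view(&scene);
    view.show();

    // Per frame: convert the QImage (in my case wrapping the pixel buffer)
    // to a QPixmap and hand it to the item. QPixmap::fromImage() copies and
    // converts every time, which is what I suspect is the bottleneck.
    QImage frame(640, 480, QImage::Format_RGB888);
    frame.fill(Qt::black);
    item->setPixmap(QPixmap::fromImage(frame));
    view.fitInView(item, Qt::KeepAspectRatio);

    return app.exec();
}
```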
My questions are:
- Does solution #3 use OpenGL? If not, is it much slower than #4 and #5?
- Will using low-level methods with the QPainter class (#2) speed up the program?
- Does #4 use OpenGL? Will it be faster than #3?
- Will #5 bring considerable benefits over #3 and #4?
Thanks!
-
Hi Kbarni,
Over a year ago we had the same problem.
We tried to find the fastest solution, and this is what we came up with. We used the old Beagleboard-xM with an OMAP3530.
The advantage of this processor is that it has a hardware-accelerated overlay unit. So under Linux it has two framebuffers, which are then overlaid by the GPU.
We used mplayer on the top framebuffer and displayed the QML-based GUI on the second framebuffer. With this setup we achieved the best performance.
None of the solutions you mentioned gave us better performance.
When you use the Multimedia component of QML, it is based on GStreamer 0.10. The performance is not bad, but not as good as the mplayer-based solution.
Good luck
Best regards
Juergen
-
Thanks for the answer, Juergen!
Still, I don't think that it's a good solution for my problem, as I display processed images, not just a video stream.
I'm using a Utilite Pro with a quad-core i.MX6 processor (with OpenGL and OpenCL support), which is also more powerful than the Beagleboard.
Since you say you tested the other solutions, can you tell me which was the fastest among them?
-
Hi Kbarni
OK, that's a slightly different problem, but the core issue is the same.
Your first goal must be to get rid of unnecessary indirection caused by the Qt abstraction layers.
To achieve that I would go for OpenGL texture streaming:
http://www.songho.ca/opengl/gl_pbo.html
With this approach you should get the best performance.
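Roughly, the upload pattern from that article looks like this (desktop GL names, as in the article, assuming a current GL context and glewInit() already done; on a pure GLES2 driver you would need GLES 3 or the PBO/mapbuffer extensions, and the size/format here are only placeholders):

```cpp
#include <GL/glew.h>
#include <cstring>

// Ping-pong between two pixel buffer objects: while frame N is being
// transferred from one PBO into the texture, frame N+1 is written into the
// other PBO. The texture therefore always shows the previous frame.
static const int W = 640, H = 480, FRAME_BYTES = W * H * 3;
static GLuint tex, pbo[2];
static int fillIndex = 0;

void initStreaming()
{
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, W, H, 0, GL_RGB, GL_UNSIGNED_BYTE, 0);

    glGenBuffers(2, pbo);
    for (int i = 0; i < 2; ++i) {
        glBindBuffer(GL_PIXEL_UNPACK_BUFFER, pbo[i]);
        glBufferData(GL_PIXEL_UNPACK_BUFFER, FRAME_BYTES, 0, GL_STREAM_DRAW);
    }
    glBindBuffer(GL_PIXEL_UNPACK_BUFFER, 0);
}

// Call once per processed frame; drawing the textured quad is a separate step.
void uploadFrame(const unsigned char *pixels)
{
    int uploadIndex = fillIndex;
    fillIndex = 1 - fillIndex;

    // Start the transfer from the PBO that was filled on the previous call.
    glBindTexture(GL_TEXTURE_2D, tex);
    glBindBuffer(GL_PIXEL_UNPACK_BUFFER, pbo[uploadIndex]);
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, W, H, GL_RGB, GL_UNSIGNED_BYTE, 0);

    // Meanwhile, map the other PBO and copy the new frame into it.
    glBindBuffer(GL_PIXEL_UNPACK_BUFFER, pbo[fillIndex]);
    glBufferData(GL_PIXEL_UNPACK_BUFFER, FRAME_BYTES, 0, GL_STREAM_DRAW); // orphan old storage
    void *dst = glMapBuffer(GL_PIXEL_UNPACK_BUFFER, GL_WRITE_ONLY);
    if (dst) {
        std::memcpy(dst, pixels, FRAME_BYTES);
        glUnmapBuffer(GL_PIXEL_UNPACK_BUFFER);
    }
    glBindBuffer(GL_PIXEL_UNPACK_BUFFER, 0);
}
```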
Cheers J
-
Dear Kbarni,
I have talked with my colleague about this topic.
And we are of the opinion that this is not the fastest of all possibilities.
It is the fastest of the ones mentioned, but there may be a faster one. In which format are the images: RGB, YUV, ...?
Try to avoid conversions.
Try to use hardware acceleration. Use the image processing unit (IPU) of the i.MX6.
I'm quite sure that you can access this unit directly.
http://hands.com/~lkcl/eoma/iMX6/VPU_API_RM_L3.0.35_1.1.0.pdf
http://cache.freescale.com/files/32bit/doc/app_note/AN4629.pdf