How do I integrate Qt with a vendor-supplied API to display a live video stream in the Qt GUI layer?
-
I'd like to use Qt to design a GUI for my embedded target device that streams decoded video. How can I integrate the vendor's third-party API into my GUI?
I've created a button in my Qt GUI; when it is clicked, I want it to run the function SAMPLE_VDEC_VdhH264, which displays my decoded video. I have two questions:
- With reference to the code sample below, is this the right way to display a live video stream?
- When I click the pushButton I created, the application jumps into the video layer and cannot get back to the main screen with the pushButton. Is there any reference I can follow to display the decoded video stream on the same screen (the same graphics layer) as my pushButton? In other words, I want to embed the vendor-supplied stream decoder's video layer into the Qt graphics layer that holds my GUI, with both sitting on top of the vendor-supplied framebuffer.
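(To illustrate the layering I'm after: one common pattern on framebuffer targets is to leave the Qt window background unpainted so a hardware video plane underneath shows through. This is only a rough sketch; whether it actually works depends on the platform plugin in use, e.g. linuxfb or eglfs, and on the vendor's overlay/framebuffer setup, none of which is confirmed here.)

```cpp
#include <QApplication>
#include <QPushButton>
#include <QWidget>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);

    QWidget window;
    // Ask Qt not to paint the window background, so the video plane
    // below the GUI layer can show through. Support for this attribute
    // varies by platform plugin and vendor framebuffer configuration.
    window.setAttribute(Qt::WA_TranslucentBackground);
    window.setStyleSheet("background: transparent;");

    // The GUI controls stay on the Qt layer, drawn over the video.
    QPushButton *button = new QPushButton("Start video", &window);
    button->move(20, 20);

    window.resize(320, 240);
    window.show();
    return app.exec();
}
```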
Snapshot of the code block that displays the pushButton and calls the vendor-supplied function to start the decoded video stream:
void MainWindow::on_pushButton_clicked()
{
    pthread_t t1;
    int ret = 0;

    ret = pthread_create(&t1, NULL, SAMPLE_VDEC_VdhH264, NULL);
    if (ret != 0) {
        printf("Error creating thread\n");
    }
    //SAMPLE_VDEC_VdhH264();
    sleep(5);
    //stop video display;
}
-
@embdev sleep(5) blocks for 5 seconds, which means your GUI will not react for 5 seconds. Why do you call sleep?
What exactly does SAMPLE_VDEC_VdhH264 do? Does it show a video? If so, how (in a window?)? -
Hi embdev,
Were you able to run a compiled Qt GUI example project on your embedded target?
I am trying to run one of the example projects, analogclock, on the HiSilicon CPU, but I get an error when running the application there. Were you able to run the example project? If so, could you please let me know what configuration options you used to compile Qt for your embedded target?
Regards,
Kraj.