iOS camera capture (intended usage of QCamera/QCameraImageCapture/QVideoProbe, etc.)
I'm building an iOS camera app and need the viewfinder/preview frames in C++ land. I also need full-resolution still images when a capture button is pressed.
I can subclass QAbstractVideoSurface and use QCamera::setViewfinder() to get frames. The problem is that the preview frames I get are the same resolution as the still images. I'm testing on an iPhone 5, so the frames are either 3264 or 1920 pixels wide — let's call those 3k and 2k.
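For reference, here's a minimal sketch of the surface subclass I'm using (class and variable names are my own, error handling omitted):

```cpp
#include <QAbstractVideoSurface>
#include <QVideoFrame>
#include <QCamera>
#include <QDebug>

// Minimal surface that receives viewfinder frames in C++.
// Names here are illustrative, not from a real app.
class FrameGrabber : public QAbstractVideoSurface
{
public:
    QList<QVideoFrame::PixelFormat> supportedPixelFormats(
        QAbstractVideoBuffer::HandleType type = QAbstractVideoBuffer::NoHandle) const override
    {
        Q_UNUSED(type);
        return { QVideoFrame::Format_ARGB32, QVideoFrame::Format_NV12 };
    }

    bool present(const QVideoFrame &frame) override
    {
        // On the iPhone 5 this reports 3264-wide frames,
        // even when the viewfinder settings ask for 1920.
        qDebug() << "frame size:" << frame.size();
        return true;
    }
};

// Usage:
//   QCamera camera;
//   FrameGrabber grabber;
//   camera.setViewfinder(&grabber);
//   camera.start();
```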
I'm using QCamera and QCameraImageCapture in C++ code.
If I call QCameraImageCapture::setEncodingSettings() with 3k and QCamera::setViewfinderSettings() with 2k, I get 3k frames for both the preview and the still images.
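Concretely, this is the setup I mean — a sketch, assuming the iPhone 5's usual sizes of 3264×2448 for stills and 1920×1080 for video:

```cpp
#include <QCamera>
#include <QCameraImageCapture>
#include <QCameraViewfinderSettings>
#include <QImageEncoderSettings>

QCamera camera;
QCameraImageCapture imageCapture(&camera);

// Ask for a 2k viewfinder...
QCameraViewfinderSettings vfSettings;
vfSettings.setResolution(1920, 1080);
camera.setViewfinderSettings(vfSettings);

// ...and 3k still captures.
QImageEncoderSettings encSettings;
encSettings.setResolution(3264, 2448);   // 2448 assumed from the 8MP sensor
imageCapture.setEncodingSettings(encSettings);

// Expected: 2k frames on the viewfinder surface, 3k stills from capture().
// Observed on iOS: 3k for both.
```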
I would have thought the viewfinder API controls the viewfinder resolution and the image capture API controls a separate still-image resolution. But that's not the behavior I'm seeing.
Each of these classes is documented in the Doxygen sense, but the intended usage and how they work together are not.
Pretty much all smartphones and digital cameras show a lower-resolution preview and capture stills at a higher resolution.
My question is: how do I achieve this on iOS? What is supposed to work? How do I get 2k preview frames in C++ and 3k still captures?
The example apps work on iOS only in the sense that they don't crash; beyond that they're not very helpful... :(
I'd recommend bringing this to the interest mailing list. You'll find the QtMultimedia developers/maintainers there. This forum is more user-oriented.
Thanks for the suggestion! I think I'll take your advice. :)