How to record audio from a camera?
I need to record audio from the camera in a cross-platform application.
How can I use QAudioRecorder to record from the camera's microphone?
I'd check the list of audio inputs and then select the one from the camera if it offers any.
By the way, no images?
The problem is that when I check the list of audio inputs, what appears are descriptive names, and they differ between platforms. So how can I determine programmatically which audio input belongs to the camera?
On Spanish Windows it shows:
"Micrófono externo (IDT High Definition Audio CODEC)"
"Mezcla estéreo (IDT High Definition Audio CODEC)"
"Batería de micrófonos integrada (IDT High Definition Audio CODEC)"
But on Android it shows:
"Default audio source"
"Microphone audio source"
"Voice call uplink (Tx) audio source"
"Voice call downlink (Rx) audio source"
"Voice call uplink + downlink audio source"
"Microphone audio source tuned for voice recognition"
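Since the entries are free-form, localized descriptions, one workaround is to match them against a small list of keywords and fall back to the first (usually default) entry. Below is a minimal sketch of that heuristic; in a Qt application the `inputs` list would come from `QAudioRecorder::audioInputs()`, and the keyword list is only a guess based on the names above, to be tuned per platform and locale:

```cpp
#include <algorithm>
#include <cctype>
#include <string>
#include <vector>

// Case-insensitive substring test. Comparison is byte-wise ASCII, so keep
// keywords accent-free ("micr" still matches "Micrófono").
static bool containsNoCase(const std::string& haystack, const std::string& needle) {
    auto it = std::search(haystack.begin(), haystack.end(),
                          needle.begin(), needle.end(),
                          [](char a, char b) {
                              return std::tolower(static_cast<unsigned char>(a)) ==
                                     std::tolower(static_cast<unsigned char>(b));
                          });
    return it != haystack.end();
}

// Pick the first input whose description matches any keyword; fall back to
// the first entry. In Qt, `inputs` would be QAudioRecorder::audioInputs()
// converted to std::string.
std::string pickMicrophone(const std::vector<std::string>& inputs) {
    // Hypothetical keyword list covering the names quoted above -- extend
    // it for other platforms/locales as needed.
    const std::vector<std::string> keywords = {"micr", "microphone"};
    for (const auto& name : inputs) {
        for (const auto& kw : keywords) {
            if (containsNoCase(name, kw))
                return name;
        }
    }
    return inputs.empty() ? std::string() : inputs.front();
}
```

The chosen name can then be passed to `QAudioRecorder::setAudioInput()`. It is a heuristic, not a guarantee, which is why letting the user confirm the selection is still a good idea.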
Video capture is a headache, because on Windows it doesn't work, and on iOS the video parameters are ignored, so it's impossible to produce video in a uniform format. So I've captured the video with another library, but now I need to capture the audio.
At some point you have to let the user decide. On a mobile phone you usually have two cameras, and on a computer it can go from one to as many as USB/FireWire/Thunderbolt allows.
Which library did you use for the capture?
On a mobile phone, when I use the Camera element in QML, it correctly chooses the microphone and records the video. How does Camera choose the correct input?
I've tried FFmpeg, OpenCV, and GStreamer, but with all of them I've had problems compiling for Android (and I can't test iOS). So, after days of frustration, I tried my own implementation: a library that creates a video by calling grabWindow(). It works well on a computer, but on a mobile it's very slow, so I have to decide whether to optimize it or try those libraries again.
With the QML Camera, both Android and iOS record a video, but my problem is that the video produced on iOS is very large: only 3 seconds takes 1.7 MB. Specifying the size in the Camera parameters doesn't work. After a lot of searching, I read on the Apple site that there is a predetermined set of capture sizes, and the minimum resolution is 640x480. I suppose that's why the parameters have no effect in Camera. I don't need this resolution, so for me it's wasted space, and I need a small file.
My first attempt was to transform the video with some library. I tried FFmpeg. I had a lot of problems compiling it, and after I got it working, I didn't understand the sample code for transcoding one video into another; at some step the code crashed. Doing tests with the ffmpeg console tool, I couldn't transform the video produced by iOS; one of the reasons was that it needs the h264 codec. After many days of frustrating tests, I decided to look for another library.
GStreamer was next, but again I had problems with the video produced by iOS. Using its console tool, it couldn't resize the video; it gave an error, and I couldn't find the reason.
Finally I tried OpenCV. It was the library that gave me the fewest problems compiling. The strategy here was to capture directly from the camera with it, because the library has this feature. The problem is that the class that saves the video to a file doesn't work on Android, so I can't use it.
I don't know what I'll do. I've lost more than two months on this problem and I still don't have a solution. Suggestions are welcome!