Getting raw H.264 and AAC data from QML Camera



  • Given this code:

    import QtQuick 2.5
    import QtMultimedia 5.5
    
    Item {
        id: root
        Camera {
            id: camera
            objectName: "camera"
            captureMode: Camera.CaptureVideo
            videoRecorder.videoCodec: "h264"
            videoRecorder.audioCodec: "aac"
        }
    }
    

    is it possible to get the raw H.264 and AAC data (for example, as unsigned char *) without writing it to the disk drive? Can I access these streams from the C++ side? Eventually this data will be sent to an nginx server using librtmp.
    P.S. Really, I love Qt, but the QML documentation is sometimes... strange ((:


  • Lifetime Qt Champion

    Hi and welcome to devnet,

    Why not use something like QtGStreamer? That way you could build your pipeline to output both to QML and to the network.
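
    For example, a description string passed to gst_parse_launch could use a tee element to split the captured video between a local sink and an RTMP output. This is only a sketch: the rtmp:// URL is a placeholder, ksvideosrc is Windows-specific, and element availability depends on the installed plugins:

    ksvideosrc device-index=0 ! tee name=t
        t. ! queue ! videoconvert ! autovideosink
        t. ! queue ! videoconvert ! x264enc tune=zerolatency
           ! flvmux streamable=true ! rtmpsink location="rtmp://example.com/live/stream"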

    Hope it helps



  • Thank you very much SGaist!
    GStreamer is an amazing library, it covers all my needs!
    I tested the camera, sending the stream over RTMP, and other functions - it all works great (on Windows). In addition, there are no problems with Qt integration. I will start writing a QML plugin for my application using this lib.
    If someone wants to try GStreamer:

    #include <QGuiApplication>
    #include <QQmlApplicationEngine>
    #include <gst/gst.h>
    
    int main(int argc, char *argv[])
    {
        QGuiApplication app(argc, argv);
    
        QQmlApplicationEngine engine;
        engine.load(QUrl(QStringLiteral("qrc:/main.qml")));
    
        GstElement *pipeline;
        GstBus *bus;
        GstMessage *msg;
    
        /* qputenv avoids putenv's non-const char* requirement in C++ */
        qputenv("GST_DEBUG", "6");
        qputenv("GST_PLUGIN_PATH_1_0", "E:\\sdk\\gstreamer\\1.0\\x86_64\\lib\\gstreamer-1.0\\");
        qputenv("GST_PLUGIN_PATH", "E:\\sdk\\gstreamer\\1.0\\x86_64\\lib\\gstreamer-1.0\\");
    
        /* Initialize GStreamer */
        gst_init (&argc, &argv);
    
        /* Build the pipeline */
        //pipeline = gst_parse_launch ("playbin uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm", 
        //NULL);
        pipeline = gst_parse_launch ("ksvideosrc device-index=0 ! autovideosink", NULL); // Windows OS specific
    
        /* Start playing */
        gst_element_set_state (pipeline, GST_STATE_PLAYING);
    
        /* Block until error or EOS (note: the Qt event loop below
           will not start until the pipeline stops) */
        bus = gst_element_get_bus (pipeline);
        msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE, (GstMessageType)(GST_MESSAGE_ERROR | GST_MESSAGE_EOS));
    
        /* Free resources */
        if (msg != NULL)
          gst_message_unref (msg);
        gst_object_unref (bus);
        gst_element_set_state (pipeline, GST_STATE_NULL);
        gst_object_unref (pipeline);
    
        return app.exec();
    }
    

    P.S. I tried to build QtGstreamer - I spent more than 6 hours with no success... According to this page: http://gstreamer.freedesktop.org/modules/ there have been no major updates to this module.



  • Sorry for the bump... I spoke too soon marking this topic as "Solved" :(((( After spending this week working with GStreamer, I can say that this lib's functionality on Android (I haven't checked iOS) is not good. In my case I work with the camera, but it's not accessible from GStreamer without writing a JNI interface for Android, which makes GStreamer absolutely unusable for me. To make GStreamer work on Android I could use its "appsrc" plugin to push raw image/sound into the pipeline, but that is double work for me and for the device's CPU/GPU!
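
    For reference, the appsrc approach mentioned above would mean feeding frames grabbed elsewhere into a pipeline description along these lines (a sketch only - the caps, sizes, and element choices are assumptions; the appsrc would then be fed from code via gst_app_src_push_buffer):

    appsrc name=camsrc is-live=true format=time
        caps=video/x-raw,format=NV21,width=640,height=480,framerate=30/1
        ! videoconvert ! x264enc tune=zerolatency
        ! flvmux streamable=true ! rtmpsink location="rtmp://example.com/live/stream"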

    The Qt QML Camera can record video from the camera using the default audio/video codecs of the current OS (sounds like magic, yeah...). Is there a way to get at these streams?

    UPDATE
    I found some code in qmediarecorder.cpp in the Qt sources:

    void QMediaRecorder::record()
    {
        Q_D(QMediaRecorder);
    
        d->actualLocation.clear();
    
        if (d->settingsChanged)
            d->_q_applySettings();
    
        // reset error
        d->error = NoError;
        d->errorString = QString();
    
        if (d->control)
            d->control->setState(RecordingState);
    }
    

    but I can't find the code that writes the raw streams into a video file.
    Please help me find the right path, dear developers :(((


  • Lifetime Qt Champion

    You have to look deeper down in the plugins: the mediacapture folder in qtmultimedia/src/plugins/android.



  • @SGaist
    Thanks for your reply. The only way to build what I need is to use the Android APIs directly, and I have to forget about cross-platform support for my app (if you want to use, for example, hardware acceleration or something).
    The code below is the main class (Java, for use with QAndroidJniObject) that draws the camera preview using the hardware:

    import org.qtproject.qt5.android.bindings.QtApplication;
    import org.qtproject.qt5.android.bindings.QtActivity;
    
    import android.hardware.Camera;
    import android.media.MediaCodec;
    import android.media.MediaCodecInfo;
    import android.media.MediaFormat;
    import android.media.MediaMuxer;
    import android.app.Notification;
    import android.app.NotificationManager;
    import android.content.Context;
    import android.graphics.SurfaceTexture;
    import android.view.TextureView.SurfaceTextureListener;
    import android.view.TextureView;
    import android.view.Gravity;
    import android.widget.FrameLayout;
    import android.os.Bundle;
    
    import java.io.File;
    import java.io.IOException;
    import java.nio.ByteBuffer;
    import java.nio.ByteOrder;
    import java.nio.FloatBuffer;
    
    public class CameraAndroid extends QtActivity implements SurfaceTextureListener {
    
        private static NotificationManager notificationManager;
        private static Notification.Builder notificationBuilder;
    
        private static CameraAndroid cameraAndroid;
        public CameraAndroid() {
            cameraAndroid = this;
        }
    
        private static MediaCodec mediaCodec;
        private static TextureView textureView;
        private static Camera camera;
        static int encWidth = 640, encHeight = 480;
    
        public static int start() {
    
            // Camera not started yet
            if(camera == null) {
    
                Camera.CameraInfo info = new Camera.CameraInfo();
    
                // Set front facing camera by default
                int numCameras = Camera.getNumberOfCameras();
                for (int i = 0; i < numCameras; i++) {
                    Camera.getCameraInfo(i, info);
                    if (info.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
                        camera = Camera.open(i);
                        break;
                    }
                }
    
                if (camera == null) {
                    camera = Camera.open();
                }
    
                if (camera == null) return 9001;
    
            Camera.Parameters params = camera.getParameters();

            // Start from the camera's preferred preview size for video, if any
            Camera.Size cameraSize = params.getPreferredPreviewSizeForVideo();
            if (cameraSize != null) {
                params.setPreviewSize(cameraSize.width, cameraSize.height);
            }

            // Use the encoder size when the camera supports it exactly
            for (Camera.Size size : params.getSupportedPreviewSizes()) {
                if (size.width == encWidth && size.height == encHeight) {
                    params.setPreviewSize(encWidth, encHeight);
                    break;
                }
            }

            camera.setParameters(params);
    
                textureView.setLayoutParams(new FrameLayout.LayoutParams(camera.getParameters().getPreviewSize().width, camera.getParameters().getPreviewSize().height, Gravity.CENTER));
            }
    
            return 0;
        }
    
        public static int stop() {
            if (camera != null) {
                camera.stopPreview();
                camera.release();
                camera = null;
            }
            return 0;
        }
    
        @Override public void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
    
            textureView = new TextureView(this);
            textureView.setSurfaceTextureListener(this);
    
        setContentView(textureView);  // <------------------------- Draws over the entire screen :((((
        }
    
    @Override public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
        try {
            if (start() == 0) {
                camera.setPreviewTexture(surface);
                camera.startPreview();
            }
        } catch (IOException ioe) {
            // This callback returns void, so an error code cannot be returned here
            ioe.printStackTrace();
        }
    }
    @Override public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) {}
        @Override public void onSurfaceTextureUpdated(SurfaceTexture surface) {}
        @Override public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
            stop();
            return true;
        }
    }
    

    But now I can't see the QML elements of my app! :(( Is it possible to draw QML elements over/above the TextureView, change the draw order, or something? What should I do?
    Many thanks!

