How do you share data between QWebEngineView and the C++ side?

  • Hello,

    I would like to implement a small web crawler in C++, using Qt. What I would like to do is have a QWebEngineView navigate to some URL and then extract some links using a CSS selector, then proceed to process those links, ideally using several QWebEngineViews concurrently.

    However, I am currently stuck on how to get any data out of the page and into the surrounding C++ process. I saw that you can use the LocalStorage API with Qt Quick, but I did not intend to switch to QML.
    So my other idea was to use QWebEnginePage::runJavaScript to assign some value to document.cookie and intercept that with a signal handler for QWebEngineCookieStore::cookieAdded.

    Unfortunately, that does not seem to work either, since I am now getting:
    "Uncaught SecurityError: Failed to set the 'cookie' property on 'Document': Access is denied for this document."

    Is my approach a fool's errand? If so, what would be some hints on how to proceed from here?
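    For one-shot extraction like this, the cookie detour is unnecessary: QWebEnginePage::runJavaScript has an overload that takes a result callback, which delivers the script's return value straight to C++. A minimal sketch (the CSS selector "a.article-link" is a placeholder for whatever you actually want to match):

    ```cpp
    #include <QWebEngineView>
    #include <QStringList>
    #include <QVariant>
    #include <QDebug>

    // Run a selector in the page and receive the matched hrefs in C++.
    void extractLinks(QWebEngineView *view)
    {
        const QString js = QStringLiteral(
            "Array.from(document.querySelectorAll('a.article-link'))"
            ".map(a => a.href);");

        // The lambda runs on the C++ side once the script completes;
        // the JS array arrives as a QVariantList convertible to QStringList.
        view->page()->runJavaScript(js, [](const QVariant &result) {
            const QStringList links = result.toStringList();
            for (const QString &url : links)
                qDebug() << url; // enqueue for crawling instead of printing
        });
    }
    ```

    Call this from a slot connected to QWebEnginePage::loadFinished so the DOM is ready before the script runs.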

  • Thanks, I started looking into this yesterday as well, but the use of websockets added some extra complexity, so this example is really helpful!

    Since this seems to be the correct (or at the very least more principled) approach to my problem, I will mark this thread as solved.
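    For readers landing here later: the channel-based route referenced above is usually built on QWebChannel, which exposes a QObject to the page so its JavaScript can push data back to C++. A rough sketch, with illustrative names (LinkSink, addLink) and assuming the page loads Qt's qwebchannel.js client library:

    ```cpp
    #include <QWebEngineView>
    #include <QWebChannel>
    #include <QObject>
    #include <QDebug>

    // A QObject whose public slots become callable from page JavaScript.
    class LinkSink : public QObject
    {
        Q_OBJECT
    public slots:
        void addLink(const QString &url) { qDebug() << "received" << url; }
    };

    void installChannel(QWebEngineView *view)
    {
        auto *channel = new QWebChannel(view->page());
        channel->registerObject(QStringLiteral("sink"), new LinkSink);
        view->page()->setWebChannel(channel);

        // On the JS side, after qwebchannel.js has loaded:
        //   new QWebChannel(qt.webChannelTransport, ch => {
        //       ch.objects.sink.addLink(location.href);
        //   });
    }
    ```

    Unlike the runJavaScript callback, this channel stays open, so the page can report links incrementally as it discovers them.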