How to use OpenVINO library in Qt5.9.1 on Ubuntu

Hank_Lee wrote (#1):

    Environment:
    * OS: Ubuntu 16.04
    * Qt: 5.9.1
    * OpenVINO: Toolkit for Linux 2018 R5
    * OpenCV: 4.0.1

    Hi,
    I want to use the Intel OpenVINO library with Qt 5.9.1.
    My Qt .pro file settings are:
    INCLUDEPATH += /usr/local/include \
        /usr/local/include/opencv4 \
        /usr/local/include/opencv4/opencv2 \
        /usr/local/include/opencv4/opencv2/videoio \
        /opt/intel/computer_vision_sdk_2018.5.445/deployment_tools/inference_engine/include \
        /opt/intel/computer_vision_sdk_2018.5.445/deployment_tools/inference_engine/include/cpp

    LIBS += /usr/local/lib/libopencv_*.so \
        /usr/local/lib/libopencv_highgui.so \
        /usr/local/lib/libopencv_core.so \
        /usr/local/lib/libopencv_imgproc.so \
        /usr/local/lib/libopencv_imgcodecs.so \
        /usr/local/lib/libopencv_video.so \
        /usr/local/lib/libopencv_videoio.so \
        /opt/intel/computer_vision_sdk_2018.5.445/deployment_tools/inference_engine/lib/ubuntu_16.04/intel64/*.so \
        /home/fic/inference_engine_samples_build/intel64/Release/lib/*.so \
        -ldl -lpthread -fopenmp

    And my code is:
    // Inference Engine umbrella header
    #include <inference_engine.hpp>
    using namespace InferenceEngine;

    // load network from IR
    CNNNetReader netReader;
    netReader.ReadNetwork(PATH_TO_IR_XML);
    netReader.ReadWeights(PATH_TO_IR_BIN);
    // set maximum batch size to be used
    netReader.getNetwork().setBatchSize(1);
    CNNNetwork network = netReader.getNetwork();
    // instantiate a plugin for a target hardware
    InferencePlugin plugin = PluginDispatcher({""}).getPluginByDevice("CPU");
    // create executable network and infer request
    ExecutableNetwork executable_network = plugin.LoadNetwork(network, {});
    InferRequest infer_request = executable_network.CreateInferRequest();

    But I got these error messages:
    undefined reference to 'omp_get_thread_num@VERSION' libinference_engine.so
    undefined reference to 'omp_get_max_threads@VERSION' libinference_engine.so
    undefined reference to 'GOMP_parallel@VERSION' libinference_engine.so
    undefined reference to 'omp_get_num_threads@VERSION' libinference_engine.so

    Has anyone experienced this?
    Or does anyone know how to write the Qt .pro file so that OpenVINO works?

    Thanks.

jsulm (Lifetime Qt Champion) wrote (#2):

      @Hank_Lee LIBS should actually look like this:

      LIBS += -LPATH_TO_LIBS_DIR -lLIB_NAME
      

      http://doc.qt.io/qt-5/qmake-variable-reference.html#libs

      LIBS += -L/usr/local/lib \
        -lopencv_highgui \
        -lopencv_core \
        -lopencv_imgproc \
        -lopencv_imgcodecs \
        -lopencv_video \
        -lopencv_videoio \
      ...
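
      For the OpenVINO side, the undefined omp_* / GOMP_* references indicate that the OpenMP runtime libinference_engine.so depends on is not being linked. Below is a minimal sketch of the relevant .pro lines; the -linference_engine name comes from the error messages above, while the external/omp/lib location of Intel's OpenMP runtime (libiomp5) is an assumption about the 2018 R5 layout and is not confirmed in this thread:

      IE_ROOT = /opt/intel/computer_vision_sdk_2018.5.445/deployment_tools/inference_engine

      INCLUDEPATH += $$IE_ROOT/include

      # link the Inference Engine and the OpenMP runtime it was built against
      LIBS += -L$$IE_ROOT/lib/ubuntu_16.04/intel64 -linference_engine \
              -L$$IE_ROOT/external/omp/lib -liomp5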
      

