Solved QT integration with Azure Pipeline
-
I'm new to QT and azure devops. Hopefully, the question has a simple answer.
I have a subdirs project created with unit tests that run fine on my computer. When I run qmake, all of the Makefiles have hard-coded paths to qmake and the QT environment based on my environment.
I have a pipeline job in azure devops that I want to compile and run the unit tests. What is the best way to script the pipeline job? I can see two paths:
-- Check the Makefiles into git and script the jobs cd somePath; make -f Makefile. The challenge is that the paths in the Makefile are specific to my environment and the build server environment is different.
-- Script the pipeline job to use qmake to build the Makefiles. I'm not sure how to get the kits set up correctly with the different paths.
Thank you in advance for your advice.
-
@dwilliams The only possible way is to use qmake to create the Makefiles on every build station.
Usually there is not much magic involved:
- make sure the compiler is correctly set up (e.g. for MSVC, call the vcvars*.bat)
- call the qmake from the Qt version you want to use with its full path. This will create Makefiles suiting the Qt version of that qmake
Check the Makefiles into git
DON'T do that! It will bite you and break your build at some point. The same applies to generated files, like the ui_xxx.h.
Note that using shadow build gives an easy way to start a completely broken build from scratch, which often helps to recover from strange problems.
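A minimal sketch of that advice — the Qt location (QT_BIN) and the .pro file name are assumptions for illustration, so adjust them to your install:

```shell
#!/bin/sh
# Pin the build to one Qt version by calling that version's qmake by full
# path, and shadow-build so a broken tree can simply be deleted.
QT_BIN="${QT_BIN:-/usr/lib/qt5/bin}"   # assumed location; adjust as needed
SHADOW_DIR=build-shadow

mkdir -p "$SHADOW_DIR"
if [ -x "$QT_BIN/qmake" ]; then
    # The subshell keeps the caller's working directory unchanged.
    ( cd "$SHADOW_DIR" && "$QT_BIN/qmake" ../project.pro && make )
else
    echo "no qmake at $QT_BIN -- set QT_BIN to your Qt's bin directory" >&2
fi
```

To recover from a strange build, `rm -rf build-shadow` and rerun — nothing generated ever lands in the source tree.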
Regards
-
Thank you for the info. That helped and I'm making progress.
I created a debian VM in azure and have installed QT (apt-get install QT5-default). As part of my build, I run qmake -makefile pathToProFileForTheMainProject -o buildPathMakefile
It creates the make file and then I run make -f Makefile. It runs and has all of the dependencies, but now I'm getting compilation errors.
error: ISO C++ forbids declaration of 'Q_ENUM' with no type [-fpermissive]
Q_ENUM(ControllerState)
error: 'qInfo' was not declared in this scope
qInfo() << "changing state from" << _currentState << "to" << newState << endl;
Where are the script files to set up the compiler? I suspect it's just a configuration issue.
-
@dwilliams said in QT integration with Azure Pipeline:
-- Check the Makefiles into git and script the jobs cd somePath; make -f Makefile. The challenge is that the paths in the Makefile are specific to my environment and the build server environment is different.
-- Script the pipeline job to use qmake to build the Makefiles. I'm not sure how to get the kits set up correctly with the different paths.
Never go the first way. Makefiles are basically meant to be kept local, not shared between machines.
error: ISO C++ forbids declaration of 'Q_ENUM' with no type [-fpermissive]
Q_ENUM(ControllerState)
Did you forget to include QObject (directly or indirectly) anywhere in your file?
Also check the version you have installed on the system. Q_ENUM is only available since Qt 5.5.
error: 'qInfo' was not declared in this scope
qInfo() << "changing state from" << _currentState << "to" << newState << endl;
Add
#include <QDebug>
different compilers can very likely behave differently on the same code.
-
It seems to be working now, but I didn't make any changes today. The only change I can think of is that the VM shut down last night per the schedule and I restarted this morning. Maybe the install of QT5 took effect better with the reboot.
I'm going to create a new VM and document every step along the way to get it working with Azure pipeline.
Thanks for your help!
-
A co-worker of mine used to say that the "lazy programmer" is the best programmer, because they automate as much of the mundane stuff as possible. With that in mind, I didn't want to have to figure all this out again when I have to set up a new pipeline. So, I re-created the entire pipeline and documented all of the steps. As a thank you to the community for the help, I'm adding that information below. There is a lot of azure stuff in here, but it also has info on how to integrate QT project files in the process. I hope this helps you...
#Setting up a Debian VM in Azure to work with the pipeline jobs
Last Updated: 20 December 2018
Since QT is not one of the tools supported by the Microsoft Azure build servers, you'll need to create a self-managed VM in the Azure portal to use. This assumes that you have the proper Azure accounts:
- Azure: azure.microsoft.com (used to create a VM.)
- Azure devops: dev.azure.com (dev.azure.com/YourDevopsAccount), used to create pipeline jobs.
You will need to do the following:
*Prep your QT project file (.pro)
*Create a Debian VM and add it to the server pool
*Set up a server pool to manage the server
*Set up the pipeline job
Preface & definitions
The pipeline job runs in the root of the repo that it downloads, so all of the paths listed below are relative to the root of the repo. This guide assumes that the code is in an Azure repo. However, the Azure pipeline can equally read from github.
In our repo, we have two directories directly below the root of the git repo:
- ./builds
- ./ProjectSource
In the builds directory we have a .gitignore file with '**/*' to ignore all of the files in the builds directory.
The script for the pipeline will run something like the following:
mkdir -p builds/release/ProjectName
cd builds/release/ProjectName
qmake -makefile -o ./Makefile ../../../ProjectSource/project.pro
make -f Makefile
Prep the QT project file (.pro)
The ".pro" project file is the central configuration for the Azure pipeline, so the script listed above needs to work on your development workstation properly if you hope to have it work on the build server.
cd RootOfYourRepo
Run the commands of the script you will use for the pipeline and ensure that the script works completely on your development workstation. Ensure that all paths are relative, as the paths will be very different on the build server.
I found that I had to update the .pro files of the sub-projects to ensure library and include paths were correct. I also had to add dependency definitions in the main .pro file. For example: devicelibrary-impl.depends = devicelibrary-abstract
These dependency definitions are important to make sure the Makefiles are set up correctly.
Fix any errors you find with "make -f Makefile" and check those changes back into the repo.
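The whole local check can be scripted. This sketch mirrors the pipeline commands but guards the qmake/make calls so it degrades gracefully on a machine without Qt; "ProjectName" and the .pro path are the placeholder names used above:

```shell
#!/bin/sh
# Run from the root of the repo: mirror the pipeline's build commands.
set -e
BUILD_DIR=builds/release/ProjectName

mkdir -p "$BUILD_DIR"
if command -v qmake >/dev/null 2>&1 && [ -f ProjectSource/project.pro ]; then
    # Subshell so the current directory is restored afterwards.
    ( cd "$BUILD_DIR" && \
      qmake -makefile -o Makefile ../../../ProjectSource/project.pro && \
      make -f Makefile )
else
    echo "qmake or ProjectSource/project.pro not found -- nothing built" >&2
fi
```

If this runs cleanly on your workstation with only relative paths, the same commands stand a good chance of working in the pipeline job.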
Create the Debian VM
We have chosen Debian as the target for the application, so the build server will need to be the same version to ensure that the builds are compatible. I will not give much detail on that, as it's all basic Azure VM creation that is well documented in the Azure documentation. Be sure to give it a public IP so you can connect to it via ssh.
You will need to create a user that will be used for the builds. It can't be root. So, pick a username that makes sense. Add your public key to that user's .ssh/authorized_keys file to make access easier.
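A sketch of creating such a user — the username "builder" and the key path are placeholders, substitute your own:

```shell
#!/bin/sh
# Sketch: create a non-root build user and install a public key for ssh.
# "builder" and PUBKEY are placeholders -- adjust to your setup.
BUILD_USER=builder
PUBKEY="$HOME/.ssh/id_rsa.pub"

if [ "$(id -u)" -eq 0 ]; then
    adduser --disabled-password --gecos "" "$BUILD_USER"
    # install(1) creates the directory/file with the right owner and mode.
    install -d -m 700 -o "$BUILD_USER" -g "$BUILD_USER" "/home/$BUILD_USER/.ssh"
    install -m 600 -o "$BUILD_USER" -g "$BUILD_USER" \
        "$PUBKEY" "/home/$BUILD_USER/.ssh/authorized_keys"
else
    echo "run as root on the VM; would create user '$BUILD_USER'" >&2
fi
```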
Install the build dependencies
You will need to install whatever third-party libraries you use (e.g., SDKs, etc.). We have a script called "aptPackages.sh" that we use to maintain what is needed for a development workstation or build server. That can be run in the VM's post-creation build script. Be sure to install the appropriate version of QT (e.g., apt-get install qt5-default). You may need to reboot the VM after installing QT; when I first started the process, the compile was failing until the VM was stopped and restarted. Then it started working.
At this point, you should be able to connect to the server via ssh. The server is nearly ready to work as a build server, but you will need to install the agent pool software. That can only be retrieved after the agent pool is set up, which is described in the next section.
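An "aptPackages.sh"-style script like the one mentioned above might look like this sketch — the package list is an assumption (on Debian of this era, qt5-default pulls in qmake and the Qt 5 development headers); add your own third-party dependencies:

```shell
#!/bin/sh
# Sketch of an aptPackages.sh-style dependency installer.
# PACKAGES is an assumed list -- extend it with your project's SDKs.
PACKAGES="build-essential qt5-default git"

if [ "$(id -u)" -eq 0 ]; then
    apt-get update
    apt-get install -y $PACKAGES
else
    echo "run as root; would install: $PACKAGES" >&2
fi
```

Keeping the list in one script means a workstation and a build server can be provisioned identically.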
Set up the Deployment Pool
The deployment pool is a high-level grouping of deployment resources created at the Organization level. This is important to understand, as it is outside of the project; the Deployment Group is inside the project. We created a Deployment Pool named "IPS Deployment Group IPS Debian Build Server." The name is long, but while I'm still working out the relationship between Deployment Pool, Deployment Group and Agent pool, a long name lets me see where a resource is defined and what it is for, no matter where I see it in the interface.
You manage Deployment Groups from the "Organization Settings" in the https://dev.azure.com/YourOrganization site. The "Deployment Pool" connects a project with the Deployment Group.
Once you have the Deployment Pool created, there is a script to run on the server which will connect the VM with the Deployment Pool. Run the script on your server.
Set up the Deployment Group
The Deployment group is defined inside of a project and associated with pipelines. It relates a Deployment Pool with a project. We named ours "Deployment Group IPS Debian Build Server" and selected "IPS Deployment Group IPS Debian Build Server" as the Deployment pool.
This is managed from the project's Pipelines > Deployment groups section. Once you have created the Deployment Group, there is a script that you will need to run on the VM.
Set up the agent pool
In order for the deployment pipeline to know which server it can use to run the build, there has to be a linkage between the VM and the pipeline. This is done with "Agent Pools" in Azure devops. In the pipeline.yml file associated with the pipeline job, there is a setting for "Agent Pool." You can create an agent pool, add the VM to the pool then specify that pool in the yml file and voila, the pipeline job will know which server(s) it can use for builds.
We used a pool name of "Debian server" which will be needed in the final step of setting up a pipeline job.
To manage the agent pools, login to the Azure devops site:
https://dev.azure.com/YourOrganization
In the bottom-left of the screen, click on "Organization Settings." Under "Pipelines" click on "Agent pools." From here you can create, edit & delete agent pools. Once you have a pool created, you can download the script necessary to install the agent software. Click on the "Download" button after you have selected your agent pool.
Installing the agent software is essential to completing the linkage of the VM with the agent pool and the pipeline. Installation involves downloading a tarball, scp'ing it to the server and running a simple script. All of the steps and software are available on the Azure devops Agent Pool screen.
You created the VM in the previous step. Download the agent software and install it on the VM. This should complete the linkage between the VM and the agent pool.
Note, the install and configure scripts run once. There is a "run.sh" command that must be running in order for the agent to pick up jobs. By default it is not set up as a service; you have to run it manually to get it working. However, it can be set up as a service so that it is running all of the time.
Once you have the agent software installed and running, look at the agent pool and see if your server is listed as a known server in the agent pool. If so, you have a server that is ready to run your builds.
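Setting the agent up as a service can be sketched as follows. AGENT_DIR is an assumption — use wherever you unpacked the agent tarball; the Azure Pipelines Linux agent ships a svc.sh helper that registers a systemd unit:

```shell
#!/bin/sh
# Sketch: run the agent as a service so it survives logouts and reboots.
# AGENT_DIR is an assumed path -- point it at your agent install directory.
AGENT_DIR="$HOME/azagent"

if [ -x "$AGENT_DIR/svc.sh" ]; then
    cd "$AGENT_DIR"
    sudo ./svc.sh install   # register the service (run once)
    sudo ./svc.sh start     # start it now
else
    echo "no svc.sh in $AGENT_DIR -- run the agent install script first" >&2
fi
```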
Set up the pipeline job
At this point, you are ready to create a build pipeline and build QT on the newly created VM. Login to the Azure devops site:
https://dev.azure.com/YourOrganization
Click on "Pipelines" then click on "+ New" to create a new pipeline. Follow the prompts and enter the information as needed. You will end up with an azure-pipelines.yml file that will be checked into the root of your repo.
Be sure you edit the pipeline in the visual designer and select the appropriate pool that you just created. Even if you put the pool in the yaml file, you have to "authorize" it in the designer.
This is a sample file:
# C/C++ with GCC
# Build your C/C++ project with GCC using make.
# Add steps that publish test results, save build artifacts, deploy, and more:
# https://docs.microsoft.com/azure/devops/pipelines/apps/c-cpp/gcc
# Trigger on changes to the develop branch
# Use servers in the "Debian server" agent pool
# 4 steps in the script to build the project.

trigger:
- develop

pool:
  name: 'Debian server'

steps:
- script: |
    pwd
    cd builds/release/ProjectName
    qmake -makefile -o ./Makefile ../../../ProjectSource/Project.pro
    make -f Makefile
  displayName: 'make'
#Troubleshooting the Azure Pipeline
Pipeline Setup
If the pipeline job never kicks off, then it may be a setup issue with the pipeline.
Assume you created a VM for the pipeline named "MyBuildServerABC." You need to check to be sure everything is set up correctly.
- Check the VM itself
- Make sure the VM is running and you can log into it
- Check to see if the ~/azagent directory exists
- Both the Deployment pool and deployment group installation scripts use that directory. If it doesn't exist, they probably aren't set up.
- Make sure that the agent software is running
- The build agent software installs wherever you ran the install script. We installed it in the home directory of the user that we created for the VM. The directory where you installed the agent software is where you'll need to go to run the run.sh command.
- cd ~; ./run.sh
- Make sure the azure pipeline is set up correctly.
- Log into https://dev.azure.com/YourOrganization
- At the root / organization level, click on "Organization Settings" in the bottom-left of the screen.
- Click on Deployment pools and check your deployment pool has at least 1 server running.
- You can click on the pool name and see the names of the targets.
- You should see the name of your VM (MyBuildServerABC)
- Navigate into your project. This is inside of the organization.
- Click on Pipelines
- Click on Deployment Groups
- You should see your deployment group with "1 Online"
- Click on your deployment group to see the name of your build server (MyBuildServerABC)
- Navigate into the project settings (bottom-left of the screen)
- Note, the menu item at the bottom-left of the screen changes from "Organization Settings" to "Project Settings" when you navigate into a project. To get back to the organization level, click on "Azure DevOps" in the upper-left of the screen.
- Click on Agent pools.
- Click on the agent pool you set up.
- You should see your VM listed (MyBuildServerABC)
If the pipeline job kicks off, but you get an error like, "Could not find a pool with name {PoolName}. The pool does not exist or has not been authorized," you need to edit the pipeline in the visual designer ("Edit in the visual designer") and be sure the pool you want to use is selected. See also: https://docs.microsoft.com/en-us/azure/devops/pipelines/process/resources?view=vsts#troubleshooting-authorization-for-a-yaml-pipeline.
Pipeline script fails
If the pipeline job runs, but the script fails, you can drill into the build history and look at the logs in the azure devops portal. Once the job has run once, the source code will exist on the hard drive of the VM. The default is in the agent user's home directory under _work (~/_work/...). You can ssh into the VM and run the script commands to simulate what is happening with the build job. With that, you can debug errors in the script.
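Locating the checked-out sources on the VM can be sketched as below. The _work location follows the agent's default layout; the numbered sub-directories inside it vary per pipeline, so the find is just a way to spot where your repo landed:

```shell
#!/bin/sh
# Sketch: find the pipeline's checkout on the VM so you can rerun the
# failing commands by hand. WORK_ROOT assumes the agent's default layout.
WORK_ROOT="$HOME/_work"

if [ -d "$WORK_ROOT" ]; then
    # Show where each project file landed, then cd in and rerun the steps.
    find "$WORK_ROOT" -maxdepth 3 -name '*.pro' -print
else
    echo "no $WORK_ROOT -- has a job run on this VM yet?" >&2
fi
```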
-
Hi and welcome to devnet,
Thank you very much for the detailed instructions! Would you mind turning it into a Wiki article? This will make it more visible, as forum threads tend to disappear over time due to new threads piling upon them.
-
@SGaist I have moved this to the Wiki along with a number of corrections and simplifications I found. The page has been submitted to moderation and should be available once they have reviewed it. Thanks for the suggestion of making it a wiki page.
https://wiki.qt.io/index.php?title=QT_Azure_Pipeline_Integration
-
Thanks !
By the way, it's Qt, QT stands for Apple QuickTime which is not what you are writing about.
-
@SGaist That's good to know. thx.