Help required in integrating Objective-C code in Qt6
-
Hello
My environment is Qt 6.4 on iPhone. I am looking at interfacing this simple barcode scanner code, written in Objective-C using AVFoundation, with a simple Qt Widgets application.
This is the link to the Objective-C code:
https://gist.github.com/Alex04/6976945
I successfully ran this code using Xcode on an iPhone... It works really well.
Somehow, I am unable to understand the right way to interface this program with Qt6.
What I want to do is this: when I press a QPushButton, this code is called and displays the camera for scanning the barcode. Once a barcode is scanned, the data is returned to the Qt code so I can display it in a QLabel.
Usually, I like to try a few things before asking for help, so I opened a new project and included all the files in it. I have previously used a simple routine to read the MAC address of the iPhone, but this is quite a large piece of Objective-C code, and from all the information I could go through, I could not work out the steps to integrate it with Qt.
Can someone please help?
Thanks in advance.
-
Hi,
Since you want to integrate that code into a C++ project, you need to go from Objective-C to Objective-C++.
The header you will use in your C++ application must be plain C++. Hence you should use something like the PIMPL idiom in order to "hide" the pure Objective-C part.
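To make the suggestion concrete, here is a minimal sketch of the PIMPL pattern being described. It is written in plain C++ so the shape is visible; the class and member names are hypothetical, and in a real project the Impl struct and everything below the header comment would live in an Objective-C++ (.mm) file where Impl can own the Objective-C objects.

```cpp
// BarcodeScanner.h -- pure C++ header, safe to #include from any Qt source file.
#include <memory>
#include <string>

class BarcodeScanner {
public:
    BarcodeScanner();
    ~BarcodeScanner();            // must be defined where Impl is complete
    std::string lastCode() const; // last decoded barcode, empty if none yet
private:
    struct Impl;                  // forward-declared; defined only in the .mm file
    std::unique_ptr<Impl> d;
};

// BarcodeScanner.mm -- Objective-C++ translation unit (sketched here as plain C++).
// In the real file, Impl would hold the Objective-C objects, e.g.:
//   struct BarcodeScanner::Impl { RBScannerViewController *controller; ... };
struct BarcodeScanner::Impl {
    std::string lastCode;         // would be filled in by the capture delegate
};

BarcodeScanner::BarcodeScanner() : d(std::make_unique<Impl>()) {}
BarcodeScanner::~BarcodeScanner() = default;

std::string BarcodeScanner::lastCode() const {
    return d->lastCode;
}
```

Because the Qt side only ever includes BarcodeScanner.h, a plain C++ slot can construct a BarcodeScanner and call lastCode() without the compiler ever having to parse Objective-C syntax.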
-
@SGaist Thanks for the suggestion.
The original code that I want to integrate is actually at this link:
https://github.com/renatosc/qrbarcodescanner-simple
I am also taking help from this post... not so successfully:
https://stackoverflow.com/questions/1061005/calling-objective-c-method-from-c-member-function
In this post, two different methods are suggested: one is based on the PIMPL idiom, the other is not.
Somehow, I have too many questions regarding the implementation of the PIMPL-based method, so I decided to try the other one.
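The non-PIMPL approach from that Stack Overflow post boils down to declaring a free function in a plain C/C++ header and implementing it in the .mm file. A minimal sketch of the calling pattern (plain C++ here, with the Objective-C parts indicated in comments; the function name is hypothetical):

```cpp
// scannerbridge.h -- plain header with no Objective-C, so Qt code can include it.
#include <string>

std::string presentScanner();   // hypothetical bridge function

// scannerbridge.mm -- in the real project this is an Objective-C++ file;
// the body below is a stand-in so the pattern is visible.
std::string presentScanner() {
    // The real implementation would do something like:
    //   RBScannerViewController *vc = [[RBScannerViewController alloc] init];
    //   [rootViewController presentViewController:vc animated:YES completion:nil];
    // and hand the decoded string back through a callback or shared state.
    return "placeholder-decoded-value";
}
```

Note that the bridge function must create a controller instance and use Objective-C message sends on it; an instance method such as viewDidLoad cannot be called as a plain C function.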
This is what I have done so far. This is the ScannerViewController.h file that I have created.
#ifndef SCANNERVIEWCONTROLLER_H
#define SCANNERVIEWCONTROLLER_H

void ScannerViewController(void);

#endif // SCANNERVIEWCONTROLLER_H
This is the RBScannerViewController.h file.
#import "ScannerViewController.h" // This is all that I have added to the original file
#import <UIKit/UIKit.h>

@interface RBScannerViewController : UIViewController {} // I have added the curly brackets here.

@end
This is the RBScannerViewController.mm file, which was originally a .m file that I renamed.
#import "RBScannerViewController.h"
#import <AVFoundation/AVFoundation.h>

// A class extension of RBScannerViewController
@interface RBScannerViewController () <AVCaptureMetadataOutputObjectsDelegate>
@property (strong, nonatomic) AVCaptureDevice* device;
@property (strong, nonatomic) AVCaptureDeviceInput* input;
@property (strong, nonatomic) AVCaptureMetadataOutput* output;
@property (strong, nonatomic) AVCaptureSession* session;
@property (strong, nonatomic) AVCaptureVideoPreviewLayer* preview;
@end

@implementation RBScannerViewController

// This is the function I have added to the original file.
void ScannerViewController(void)
{
    viewDidLoad(); // does not compile: viewDidLoad is an instance method, not a free function
}

- (void)viewDidLoad
{
    [super viewDidLoad];

    // checking if we have at least one camera device
    NSArray *allTypes = @[AVCaptureDeviceTypeBuiltInDualCamera,
                          AVCaptureDeviceTypeBuiltInWideAngleCamera,
                          AVCaptureDeviceTypeBuiltInTelephotoCamera];
    AVCaptureDeviceDiscoverySession *discoverySession =
        [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:allTypes
                                                               mediaType:AVMediaTypeVideo
                                                                position:AVCaptureDevicePositionBack];
    NSArray *devices = discoverySession.devices;
    BOOL hasCamera = [devices count] > 0;
    if (hasCamera) {
        [self setupScanner];
    } else {
        NSLog(@"No Camera available");
    }
}

- (void)didReceiveMemoryWarning
{
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}

// perform the main setup
- (void)setupScanner
{
    // creating the camera device
    self.device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    // creating the input
    NSError *error = nil;
    self.input = [AVCaptureDeviceInput deviceInputWithDevice:self.device error:&error];
    if (!self.input) {
        NSLog(@"error setting up scanner: %@", error);
        return;
    }

    // creating the output
    self.output = [[AVCaptureMetadataOutput alloc] init];

    // creating the session (which is responsible for managing the data flow between input/output)
    self.session = [[AVCaptureSession alloc] init];
    [self.session addOutput:self.output];
    [self.session addInput:self.input];

    // setting self to be the delegate
    [self.output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];

    // specifying which metadata we want to capture. In this case we are setting it
    // to only look for QR codes and EAN-13 barcodes.
    self.output.metadataObjectTypes = @[AVMetadataObjectTypeQRCode, AVMetadataObjectTypeEAN13Code];

    // Line below lists available metadata for this device.
    //NSLog(@"This device supports identifying the following metadata = %@", [self.output availableMetadataObjectTypes]);

    // creating the preview layer
    self.preview = [AVCaptureVideoPreviewLayer layerWithSession:self.session];
    self.preview.videoGravity = AVLayerVideoGravityResizeAspectFill;
    self.preview.frame = CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height);
    [self.view.layer addSublayer:self.preview];

    [self.session startRunning];
}

// AVCaptureMetadataOutputObjectsDelegate - here is where the decoding takes place!
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputMetadataObjects:(NSArray *)metadataObjects
       fromConnection:(AVCaptureConnection *)connection
{
    for (AVMetadataObject *metadata in metadataObjects) {
        if ([metadata.type isEqualToString:AVMetadataObjectTypeQRCode] ||
            [metadata.type isEqualToString:AVMetadataObjectTypeEAN13Code]) {
            NSLog(@"We found a QR or EAN13 barcode!");

            // Transforming the metadata coordinates to screen coordinates so we can
            // show a rect around it if we want to (not shown in this simple example)
            AVMetadataMachineReadableCodeObject *transformed =
                (AVMetadataMachineReadableCodeObject *)[self.preview transformedMetadataObjectForMetadataObject:metadata];

            // printing the decoded text to the console
            NSLog(@"%@", [transformed stringValue]);
        }
    }
}

@end
And obviously as expected, the code failed to build.
-
I just remembered, and to simplify your life, did you consider using QZXing ? It's a wrapper for the ZXing library that is cross-platform and does barcode scanning.
That would give you a ready made solution that is directly integrated with Qt.
See this excellent KDAB blog entry on the subject.
-
@SGaist HAHAHA :)
I am already using the QZXing library for barcode scanning on Android, Windows and iOS. However, for iOS, because of the bug in https://bugreports.qt.io/browse/QTBUG-98651, I had to upgrade to Qt 6.4, where the problem is solved. But after I switched to Qt 6.4, the QZXing library started giving an error.
The build succeeds and the application gets deployed, but it crashes immediately with this error:
Error: You are creating QApplication before calling UIApplicationMain. If you are writing a native iOS application, and only want to use Qt for parts of the application, a good place to create QApplication is from within 'applicationDidFinishLaunching' inside your UIApplication delegate.
Though the creator of the library has released a version for Qt 6.2, I am still getting this error.
Hence I started looking for alternatives, and I came across this Objective-C code, which uses no external library and which I am looking at incorporating with Qt 6.4.
I don't have much knowledge of Objective-C, and not much information is available on mixing it with Qt 6.