I'm trying to make a simple console program for macOS that should take one image with the built-in camera and save it to the photo library.
Here is the complete code for that program:
#import <Foundation/Foundation.h>
#import <AVFoundation/AVFoundation.h>
#import <Photos/Photos.h>

volatile int a = 0;

@interface PhotoCaptureProcessor : NSObject <AVCapturePhotoCaptureDelegate>
@end

@implementation PhotoCaptureProcessor
- (void)captureOutput:(AVCapturePhotoOutput *)output
didFinishProcessingPhoto:(AVCapturePhoto *)photo
                error:(NSError *)error {
    NSLog(@"Called\n");
    if (error) {
        return;
    }
    [[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
        [[PHAssetCreationRequest creationRequestForAsset] addResourceWithType:PHAssetResourceTypePhoto
                                                                         data:[photo fileDataRepresentation]
                                                                      options:nil];
    } completionHandler:^(BOOL success, NSError * _Nullable error) {
        if (error) {
            NSLog(@"Error saving photo\n");
        }
        a = 1;
    }];
}
@end

int main(int argc, const char * argv[]) {
    @autoreleasepool {
        AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];
        AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        NSError *error = nil;
        AVCaptureDeviceInput *inputVideoDevice = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];

        [captureSession beginConfiguration];
        if (inputVideoDevice) {
            if ([captureSession canAddInput:inputVideoDevice]) {
                [captureSession addInput:inputVideoDevice];
            }
        } else {
            return 1;
        }

        AVCapturePhotoOutput *photoOutput = [[AVCapturePhotoOutput alloc] init];
        if (photoOutput) {
            if ([captureSession canAddOutput:photoOutput]) {
                [captureSession addOutput:photoOutput];
            }
        }
        captureSession.sessionPreset = AVCaptureSessionPresetPhoto;
        [captureSession commitConfiguration];

        [captureSession startRunning];
        while (![captureSession isRunning]) {
            NSLog(@"Still not running\n");
        }

        AVCapturePhotoSettings *photoSettings = [[AVCapturePhotoSettings alloc] init];
        PhotoCaptureProcessor *photoProcessor = [[PhotoCaptureProcessor alloc] init];
        [photoOutput capturePhotoWithSettings:photoSettings delegate:photoProcessor];

        while (!a) {
        }
    }
    return 0;
}
After calling startRunning, the little light next to my Mac's camera turns green (which I believe means the camera is in use). Unfortunately, the delegate method captureOutput:didFinishProcessingPhoto:error: on my PhotoCaptureProcessor is never called. If I right-click on captureOutput and choose Go To Definition, it takes me to the declaration in AVCapturePhotoCaptureDelegate, so I believe the method signature is fine.
Can anyone tell me what I am doing wrong?
CodePudding user response:
The API is working, but since access was never requested or granted, the capture will not execute.
In your Info.plist, make an entry for NSCameraUsageDescription and/or NSMicrophoneUsageDescription, and write a short message explaining why you want access; it will be presented in the dialog that asks for user consent.
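For example, the entries look like this (the description strings below are placeholders; use your own wording):

```xml
<key>NSCameraUsageDescription</key>
<string>This tool takes a single photo with the built-in camera.</string>
<key>NSMicrophoneUsageDescription</key>
<string>Only needed if you also capture audio.</string>
```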
typedef void (^ConsentCompletionBlock)(void);

- (void)requestUserConsentFor:(AVMediaType)mediatype andComplete:(ConsentCompletionBlock)block {
    if (@available(macOS 10.14, *)) { // || @available(iOS 7.0, *)
        // Request permission to access the camera and/or microphone.
        switch ([AVCaptureDevice authorizationStatusForMediaType:mediatype]) {
            case AVAuthorizationStatusAuthorized: {
                NSLog(@"access is granted by user.");
                block(); // already authorized, continue right away
                break;
            }
            case AVAuthorizationStatusNotDetermined: {
                NSLog(@"oops, user's consent not known yet, asking user!");
                [AVCaptureDevice requestAccessForMediaType:mediatype completionHandler:^(BOOL granted) {
                    if (granted) {
                        block();
                        return;
                    }
                    NSLog(@"oops, no access granted");
                }];
                break;
            }
            case AVAuthorizationStatusDenied: {
                NSLog(@"rrrgh, user did not allow access");
                return;
            }
            case AVAuthorizationStatusRestricted: {
                NSLog(@"ok, access is restricted");
                return;
            }
        }
    } else {
        block(); // fallback for older systems without these permissions
    }
}
To use it like so:

[self requestUserConsentFor:AVMediaTypeVideo andComplete:^{
    // don't forget: UI work needs the main thread.
    dispatch_async(dispatch_get_main_queue(), ^{
        [self yourSessionSetupStuffMethod];
    });
}];
A short reminder: this alone will not keep your app running.
Normally, Signing & Capabilities places the needed entries in the Info.plist before it is copied into your app's Contents folder. If there is no Info.plist, you have to tell the OS the consent rule manually or find another way; this is deliberate, because there should be no open door for any kind of code to access the camera or microphone without consent. The code above actually records the granted consent in the right place once it is able to run, but it still needs the usage-description strings so the system can tell the user what the access is for.
As you pointed out, this is a "Single File Tool" that has no Info.plist, so read about placing the Info.plist data directly into your binary. As it is just info and should never be edited at runtime, this should work.
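One common way to embed the plist into the binary (a sketch; file names and the -fobjc-arc flag are assumptions about your setup) is the linker's -sectcreate option, which copies Info.plist into the __TEXT,__info_plist section where the OS looks for it in single-file tools:

```
# build the tool with Info.plist embedded in the Mach-O binary
clang main.m -o camtool \
      -fobjc-arc \
      -framework Foundation -framework AVFoundation -framework Photos \
      -sectcreate __TEXT __info_plist Info.plist
```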
and you are not alone with this trouble
PS: the block declaration above lets you wrap your actual code so that it will not execute when permission was not granted, whether earlier, in a previous run, or right before the block would have run. This should save you from crashing. Be aware that older systems, which did not need permissions for this, will just invoke the block as a fallback.