captureOutput from AVCaptureVideoDataOutputSampleBufferDelegate is not being called


This seems to be a fairly common problem, and I have linked to similar posts below. Please don't mark this question as a duplicate, as I have carefully read the linked posts and spent a great deal of time implementing every solution provided as an answer or in the comments.

I am seeking to leverage the device camera as a light sensor as described in this post. Unfortunately, the captureOutput delegate method is never called by the AVCaptureVideoDataOutputSampleBufferDelegate machinery. It may be relevant that I am attempting this inside a SwiftUI app; I have not seen this problem posted about or resolved in the context of a SwiftUI app.

class VideoStream: NSObject, ObservableObject, 
    AVCaptureVideoDataOutputSampleBufferDelegate {
    
    @Published var luminosityReading : Double = 0.0
    
    var session : AVCaptureSession!
        
    override init() {
        super.init()
        authorizeCapture()
    }

    func authorizeCapture() {
        // request camera permissions and call beginCapture()
        ...
    }

    func beginCapture() {
        print("beginCapture entered") // prints
        session = AVCaptureSession()
        session.beginConfiguration()
        let videoDevice = bestDevice() // func def omitted for readability
        print("Device: \(videoDevice)") // prints a valid device
        guard
            let videoDeviceInput = try? AVCaptureDeviceInput(device: videoDevice),
            session.canAddInput(videoDeviceInput)
        else {
            print("Camera selection failed")
            return
        }
        
        let videoOutput = AVCaptureVideoDataOutput()
        guard
            session.canAddOutput(videoOutput)
        else {
            print("Error creating video output")
            return
        }
        videoOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "VideoQueue"))
        session.addOutput(videoOutput)
        session.sessionPreset = .medium
        session.commitConfiguration()
        session.startRunning()
     }

    // From: https://stackoverflow.com/questions/41921326/how-to-get-light-value-from-avfoundation/46842115#46842115
    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, 
        from connection: AVCaptureConnection) {

        print("captureOutput entered")  // never printed
        
        // Retrieving EXIF data of camera frame buffer
        let rawMetadata = CMCopyDictionaryOfAttachments(allocator: nil, target: sampleBuffer, attachmentMode: CMAttachmentMode(kCMAttachmentMode_ShouldPropagate))
        let metadata = CFDictionaryCreateMutableCopy(nil, 0, rawMetadata) as NSMutableDictionary
        let exifData = metadata.value(forKey: "{Exif}") as? NSMutableDictionary
        
        let FNumber : Double = exifData?["FNumber"] as! Double
        let ExposureTime : Double = exifData?["ExposureTime"] as! Double
        let ISOSpeedRatingsArray = exifData!["ISOSpeedRatings"] as? NSArray
        let ISOSpeedRatings : Double = ISOSpeedRatingsArray![0] as! Double
        let CalibrationConstant : Double = 50
        
        //Calculating the luminosity
        let luminosity : Double = (CalibrationConstant * FNumber * FNumber ) / ( ExposureTime * ISOSpeedRatings )
        luminosityReading = luminosity
    }
}
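As a sanity check, the luminosity formula used in captureOutput above can be exercised on its own. The EXIF values below (f-number, exposure time, ISO) are made up for illustration; only the formula itself comes from the code above:

```swift
import Foundation

// Hypothetical EXIF values, for illustration only.
let fNumber = 1.8               // aperture f-stop
let exposureTime = 1.0 / 60.0   // shutter speed in seconds
let isoSpeedRating = 400.0
let calibrationConstant = 50.0

// Same formula as in captureOutput: luminosity = (C * N^2) / (t * ISO)
let luminosity = (calibrationConstant * fNumber * fNumber)
    / (exposureTime * isoSpeedRating)

print(luminosity)
```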

Lastly, I instantiate VideoStream as a StateObject in my ContentView and attempt to read the updated luminosityReading:

struct ContentView: View {
    @StateObject var videoStream = VideoStream()
    
    var body: some View {
        Text(String(format: "%.2f Lux", videoStream.luminosityReading))
            .padding()
    }
}

To repeat, I have carefully read and implemented the solutions described in these similar posts:

Using AVCaptureVideoDataOutputSampleBufferDelegate without a preview window

captureOutput not being called

captureOutput not being called from delegate

captureOutput not being called by AVCaptureAudioDataOutputSampleBufferDelegate

In Swift, adapted AVCaptureVideoDataOutputSampleBufferDelegate, but captureOutput never getting called

AVCaptureVideoDataOutput captureOutput not being called

Swift - captureOutput is not being executed

Why AVCaptureVideoDataOutputSampleBufferDelegate method is not called

Why captureOutput is never called?

func captureOutput is never called

captureOutput() function is never called swift4

Minimal Reproducible Example:

import SwiftUI
import AVKit

struct ContentView: View {
    @StateObject var videoStream = VideoStream()
    
    var body: some View {
        Text(String(format: "%.2f Lux", videoStream.luminosityReading))
    }
}

class VideoStream: NSObject, ObservableObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    
    @Published var luminosityReading : Double = 0.0
    
    var session : AVCaptureSession!
        
    override init() {
        super.init()
        authorizeCapture()
    }

    func authorizeCapture() {
        switch AVCaptureDevice.authorizationStatus(for: .video) {
        case .authorized: // The user has previously granted access to the camera.
            beginCapture()
        case .notDetermined: // The user has not yet been asked for camera access.
            AVCaptureDevice.requestAccess(for: .video) { granted in
                if granted {
                    self.beginCapture()
                }
            }
            
        case .denied: // The user has previously denied access.
            return
            
        case .restricted: // The user can't grant access due to restrictions.
            return
            
        @unknown default:
            return
        }
    }

    func beginCapture() {
        
        print("beginCapture entered")
        
        session = AVCaptureSession()
        session.beginConfiguration()
        
        let testDevice = AVCaptureDevice.default(for: .video)
        print("Image Capture Device: \(String(describing: testDevice))")
        guard
            let videoDeviceInput = try? AVCaptureDeviceInput(device: testDevice!),
            session.canAddInput(videoDeviceInput)
        else {
            print("Camera selection failed")
            return
        }
        
        let videoOutput = AVCaptureVideoDataOutput()
        guard
            session.canAddOutput(videoOutput)
        else {
            print("Error creating video output")
            return
        }
        videoOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "VideoQueue"))
        session.addOutput(videoOutput)
        
        session.sessionPreset = .medium
        session.commitConfiguration()
        session.startRunning()
    }
    
    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {

        print("captureOutput entered")  // never printed
        
        // light meter logic to update luminosityReading
    }
}

CodePudding user response:

You never add the input to the session. After the canAddInput check succeeds, call addInput:

if session.canAddInput(videoDeviceInput){
    session.addInput(videoDeviceInput)
}
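With that fix applied, the input/output portion of beginCapture from the question would look like the sketch below (only the input-handling lines change; everything else is unchanged from the question's listing):

```swift
session = AVCaptureSession()
session.beginConfiguration()

guard
    let videoDevice = AVCaptureDevice.default(for: .video),
    let videoDeviceInput = try? AVCaptureDeviceInput(device: videoDevice),
    session.canAddInput(videoDeviceInput)
else {
    print("Camera selection failed")
    return
}
session.addInput(videoDeviceInput)   // <-- the missing call

let videoOutput = AVCaptureVideoDataOutput()
guard session.canAddOutput(videoOutput) else {
    print("Error creating video output")
    return
}
videoOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "VideoQueue"))
session.addOutput(videoOutput)

session.sessionPreset = .medium
session.commitConfiguration()
session.startRunning()
```

One further note: captureOutput will be called on the "VideoQueue" dispatch queue, while luminosityReading is a @Published property driving SwiftUI. The assignment should therefore be wrapped in DispatchQueue.main.async { } so the UI update happens on the main thread.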