Using AVKit to detect luminosity

Time:09-05

I am working on a SwiftUI app that uses the device camera to detect luminosity, as described in the top answer of this post. The captureOutput(_:didOutput:from:) function from that answer is used to calculate the luminosity. According to the Apple docs, this function notifies its delegate that a new video frame was written, so I have placed it in a VideoDelegate class. This delegate is then set in a VideoStream class that handles requesting permissions and setting up an AVCaptureSession. My question is: how do I access the luminosity value calculated inside the delegate from my SwiftUI view?

import SwiftUI
import AVFoundation

struct ContentView: View {
    @StateObject var videoStream = VideoStream()
    
    var body: some View {
        Text(videoStream.luminosityReading > 0
             ? String(format: "%.2f", videoStream.luminosityReading)
             : "Detecting...")
            .padding()
    }
}
class VideoStream: ObservableObject {
    
    @Published var luminosityReading : Double = 0.0 // TODO get luminosity from VideoDelegate
    var session : AVCaptureSession!
    
    init() {
        authorizeCapture()
    }
    
    func authorizeCapture() {
        // permission logic and call to beginCapture()
    }
    
    func beginCapture() {        
        session = AVCaptureSession()
        session.beginConfiguration()
        let videoDevice = bestDevice() // func definition omitted for readability 
        guard
            let videoDeviceInput = try? AVCaptureDeviceInput(device: videoDevice),
            session.canAddInput(videoDeviceInput)
        else {
            print("Camera selection failed")
            return
        }
        
        let videoOutput = AVCaptureVideoDataOutput()
        guard
            session.canAddOutput(videoOutput)
        else {
            print("Error creating video output")
            return
        }
        
        session.sessionPreset = .high
        session.addOutput(videoOutput)
        
        let queue = DispatchQueue(label: "VideoFrameQueue")
        let delegate = VideoDelegate()
        videoOutput.setSampleBufferDelegate(delegate, queue: queue)
        
        session.commitConfiguration()
        session.startRunning()
    }
}
class VideoDelegate: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        // Retrieving EXIF data from the camera frame buffer
        guard
            let rawMetadata = CMCopyDictionaryOfAttachments(allocator: nil, target: sampleBuffer, attachmentMode: CMAttachmentMode(kCMAttachmentMode_ShouldPropagate)),
            let exifData = (rawMetadata as NSDictionary)["{Exif}"] as? NSDictionary,
            let fNumber = exifData["FNumber"] as? Double,
            let exposureTime = exifData["ExposureTime"] as? Double,
            let isoSpeedRatings = exifData["ISOSpeedRatings"] as? [Double],
            let isoSpeed = isoSpeedRatings.first
        else { return }
        let calibrationConstant: Double = 50
        
        // Calculating the luminosity
        let luminosity = (calibrationConstant * fNumber * fNumber) / (exposureTime * isoSpeed)
         
        // how to pass value of luminosity to `VideoStream`? 
    }
}

Answer:

As discussed in the comments, the lowest friction option would be to have VideoStream conform to AVCaptureVideoDataOutputSampleBufferDelegate and implement the delegate method there.
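A minimal sketch of that approach, reusing the names from the question. The DispatchQueue.main hop and the pure luminosity(...) helper are my additions, and this has not been run against a real device, so treat it as a starting point rather than a finished implementation:

```swift
import AVFoundation
import Combine

// VideoStream conforms to the sample-buffer delegate protocol itself, so the
// callback can write straight to the @Published property the view observes.
// Subclassing NSObject is required by AVCaptureVideoDataOutputSampleBufferDelegate.
class VideoStream: NSObject, ObservableObject, AVCaptureVideoDataOutputSampleBufferDelegate {

    @Published var luminosityReading: Double = 0.0
    var session: AVCaptureSession!

    // Pure helper so the formula is testable without a camera.
    static func luminosity(fNumber: Double, exposureTime: Double, iso: Double,
                           calibration: Double = 50) -> Double {
        (calibration * fNumber * fNumber) / (exposureTime * iso)
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Pull the EXIF attachment off the frame and unwrap defensively;
        // frames without usable metadata are simply skipped.
        guard
            let rawMetadata = CMCopyDictionaryOfAttachments(
                allocator: nil, target: sampleBuffer,
                attachmentMode: CMAttachmentMode(kCMAttachmentMode_ShouldPropagate)),
            let exif = (rawMetadata as NSDictionary)["{Exif}"] as? NSDictionary,
            let fNumber = exif["FNumber"] as? Double,
            let exposureTime = exif["ExposureTime"] as? Double,
            let isoRatings = exif["ISOSpeedRatings"] as? [Double],
            let iso = isoRatings.first
        else { return }

        let value = Self.luminosity(fNumber: fNumber, exposureTime: exposureTime, iso: iso)

        // The sample-buffer queue is a background queue; @Published properties
        // driving SwiftUI must be mutated on the main thread.
        DispatchQueue.main.async {
            self.luminosityReading = value
        }
    }
}
```

With this, the separate VideoDelegate class disappears and beginCapture() registers the stream itself: videoOutput.setSampleBufferDelegate(self, queue: queue). Since VideoStream now subclasses NSObject, its init() also needs a super.init() call before authorizeCapture().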
