The Swift Vision image similarity feature assigns a number to the distance between two images: a distance of 0 means the images are identical, and the larger the number, the more the two images differ.
What I am trying to do is turn this into a similarity percentage, so that one image is, for example, 80% similar to another. Any ideas how I could arrange the logic to accomplish this? Here is what I have so far:
import UIKit
import Vision

// Generates a feature print observation for the image at the given URL.
func featureprintObservationForImage(atURL url: URL) -> VNFeaturePrintObservation? {
    let requestHandler = VNImageRequestHandler(url: url, options: [:])
    let request = VNGenerateImageFeaturePrintRequest()
    do {
        try requestHandler.perform([request])
        return request.results?.first as? VNFeaturePrintObservation
    } catch {
        print("Vision error: \(error)")
        return nil
    }
}

let apple1 = featureprintObservationForImage(atURL: Bundle.main.url(forResource: "apple1", withExtension: "jpg")!)
let apple2 = featureprintObservationForImage(atURL: Bundle.main.url(forResource: "apple2", withExtension: "jpg")!)
let pear = featureprintObservationForImage(atURL: Bundle.main.url(forResource: "pear", withExtension: "jpg")!)

// Distance between the two apple images (smaller means more similar).
var distance = Float(0)
try apple1!.computeDistance(&distance, to: apple2!)

// Distance between an apple image and the pear image.
var distance2 = Float(0)
try apple1!.computeDistance(&distance2, to: pear!)
CodePudding user response:
It depends on how you want to scale it. If you just want a percentage, you could use Float.greatestFiniteMagnitude as the maximum value:

(1 - distance / Float.greatestFiniteMagnitude) * 100
A better solution would probably be to set a lower ceiling, so that any distance at or above that ceiling maps to 0% similarity:

(1 - min(distance, 10) / 10) * 100

Here the artificial ceiling is 10, but it can be any number you choose.
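
For illustration, here is a minimal sketch of how the ceiling approach could be wrapped in a reusable helper. The function name similarityPercentage and the default ceiling of 10 are arbitrary choices for this example, not part of the Vision API.

import Vision

// Maps a feature print distance to a similarity percentage by clamping the
// distance at an arbitrary ceiling and inverting the scale: a distance of 0
// yields 100%, and anything at or above the ceiling yields 0%.
func similarityPercentage(between first: VNFeaturePrintObservation,
                          and second: VNFeaturePrintObservation,
                          ceiling: Float = 10) throws -> Float {
    var distance = Float(0)
    try first.computeDistance(&distance, to: second)
    return (1 - min(distance, ceiling) / ceiling) * 100
}

// Example usage with the observations from the question:
if let apple1 = apple1, let apple2 = apple2, let pear = pear {
    let appleSimilarity = try similarityPercentage(between: apple1, and: apple2)
    let pearSimilarity = try similarityPercentage(between: apple1, and: pear)
    print("apple1 vs apple2: \(appleSimilarity)%")
    print("apple1 vs pear: \(pearSimilarity)%")
}

Keeping the ceiling as a parameter makes it easy to tune once you have seen the typical distances your own images produce.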