I have some doubts about the Core Image coordinate system, the way transforms are applied, and how the extent is determined. I couldn't find much in the documentation or on the internet, so I tried the following code to rotate a CIImage and display it in a UIImageView.
override func viewDidLoad() {
    super.viewDidLoad()
    // Do any additional setup after loading the view.
    imageView.contentMode = .scaleAspectFit
    let uiImage = UIImage(contentsOfFile: imagePath)
    ciImage = CIImage(cgImage: (uiImage?.cgImage)!)
    imageView.image = uiImage
}
private var currentAngle = CGFloat(0)
private var ciImage: CIImage!
private var ciContext = CIContext()
@IBAction func rotateImage() {
    let extent = ciImage.extent
    let translate = CGAffineTransform(translationX: extent.midX, y: extent.midY)
    let uiImage = UIImage(contentsOfFile: imagePath)
    currentAngle = currentAngle + CGFloat.pi/10
    let rotate = CGAffineTransform(rotationAngle: currentAngle)
    let translateBack = CGAffineTransform(translationX: -extent.midX, y: -extent.midY)
    let transform = translateBack.concatenating(rotate.concatenating(translate))
    ciImage = CIImage(cgImage: (uiImage?.cgImage)!)
    ciImage = ciImage.transformed(by: transform)
    NSLog("Extent \(ciImage.extent), Angle \(currentAngle)")
    let cgImage = ciContext.createCGImage(ciImage, from: ciImage.extent)
    imageView.image = UIImage(cgImage: cgImage!)
}
So every time I push the button, the image is rotated by a further pi/10, but I see the image shrinking in the UIImageView. The NSLogs show the extent growing for some rotations, with the origin x and y becoming negative.
2021-09-24 14:43:29.280393+0400 CoreImagePrototypes[65817:5175194] Metal API Validation Enabled
2021-09-24 14:43:31.094877+0400 CoreImagePrototypes[65817:5175194] Extent (-105.0, -105.0, 1010.0, 1010.0), Angle 0.3141592653589793
2021-09-24 14:43:41.426371+0400 CoreImagePrototypes[65817:5175194] Extent (-159.0, -159.0, 1118.0, 1118.0), Angle 0.6283185307179586
2021-09-24 14:43:42.244703+0400 CoreImagePrototypes[65817:5175194] Extent (-159.0, -159.0, 1118.0, 1118.0), Angle 0.9424777960769379
2021-09-24 14:43:42.644446+0400 CoreImagePrototypes[65817:5175194] Extent (-105.0, -105.0, 1010.0, 1010.0), Angle 1.2566370614359172
2021-09-24 14:43:43.037312+0400 CoreImagePrototypes[65817:5175194] Extent (0.0, 0.0, 800.0, 800.0), Angle 1.5707963267948966
2021-09-24 14:43:43.478774+0400 CoreImagePrototypes[65817:5175194] Extent (-105.0, -105.0, 1010.0, 1010.0), Angle 1.8849555921538759
2021-09-24 14:43:44.045811+0400 CoreImagePrototypes[65817:5175194] Extent (-159.0, -159.0, 1118.0, 1118.0), Angle 2.199114857512855
My questions:
How exactly do I determine the scale factor to rescale the image so that the extent does not cross the original image rectangle?
What exactly does a negative extent origin mean? Relative to what is it negative? I understand the coordinate system in Core Image is relative to the image itself, with the bottom-left corner of the image at (0,0), not with respect to some superview as in UIKit.
CodePudding user response:
It's unclear what the question is, but what you seem to be focused on is the meaning of the extent. This is like the frame, and, just like the frame, it loses its meaning once you have applied a transform to the CIImage. After a rotation, the extent is the bounding box of the transformed image. So if you have a horizontally wider image and you rotate it a little counterclockwise, the extent becomes taller and its origin becomes negative.
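As for your first question, one way to keep the result inside the original rectangle is to compare the rotated extent with the original extent and scale down by the smaller of the width and height ratios, then translate the result so its extent origin is back at (0,0) before rendering. Here is a minimal sketch, assuming the same center-pivot rotation you use in the question (the helper name rotatedFittingImage is made up for illustration, not part of your code):

private func rotatedFittingImage(_ image: CIImage, angle: CGFloat) -> CIImage {
    let original = image.extent
    // Rotate around the image center, as in the question's transform chain.
    let pivot = CGAffineTransform(translationX: -original.midX, y: -original.midY)
        .concatenating(CGAffineTransform(rotationAngle: angle))
        .concatenating(CGAffineTransform(translationX: original.midX, y: original.midY))
    let rotated = image.transformed(by: pivot)

    // Scale down so the rotated bounding box fits inside the original rectangle.
    let scale = min(original.width / rotated.extent.width,
                    original.height / rotated.extent.height)
    let scaled = rotated.transformed(by: CGAffineTransform(scaleX: scale, y: scale))

    // Shift the extent origin back to (0, 0) before rendering with the CIContext.
    return scaled.transformed(by: CGAffineTransform(translationX: -scaled.extent.minX,
                                                    y: -scaled.extent.minY))
}

This also touches on your second question: the negative origin is expressed in the image's own working coordinate space, relative to where the untransformed image's bottom-left corner (0,0) was, which is why translating by -extent.minX and -extent.minY puts the origin back at zero.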