I am trying to project the 3D human body joints onto the iPhone's screen using ARKit.
- I am extracting the global transforms:
let rightArmPosition = skeleton.modelTransform(for: ARSkeleton.JointName(rawValue: "right_arm_joint"))!
let rootPosition = skeleton.modelTransform(for: .root)!
- I am calculating the offsets:
let rightOffset = simd_make_float3(rightArmPosition.columns.3)
let rootOffset = simd_make_float3(rootPosition.columns.3)
- I am projecting the points:
let pMatrix = camera.projectionMatrix
let pRightOffset = camera.projectPoint(rightOffset, orientation: .portrait, viewportSize: UIScreen.main.bounds.size)
let pRootOffset = camera.projectPoint(rootOffset, orientation: .portrait, viewportSize: UIScreen.main.bounds.size)
humanJointsView.frame = UIScreen.main.bounds
humanJointsView.points = [pRightOffset, pRootOffset]
- I am trying to draw the points in the target view:
override func draw(_ rect: CGRect) {
    path.removeAllPoints()
    self.points.forEach { point in
        // Draw a filled circle at each projected joint position
        path = UIBezierPath(ovalIn: CGRect(x: point.x, y: point.y, width: 30, height: 30))
        UIColor.green.setFill()
        path.fill()
    }
}
This approach is not working, however. Where is my mistake?
Thank you!
CodePudding user response:
The right arm and root transforms are in skeleton model space (relative to the body anchor), so you need to multiply them by the ARBodyAnchor's transform to get world-space transforms before projecting (here `body` is the ARBodyAnchor):

let rightArmWorldTransform = body.transform * rightArmPosition
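A minimal sketch of the corrected update step, assuming it runs in the ARSessionDelegate's session(_:didUpdate:) callback, that humanJointsView is the overlay view from your snippet, and that the joint name and portrait orientation from your code match your setup:

func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
    guard let bodyAnchor = anchors.compactMap({ $0 as? ARBodyAnchor }).first,
          let camera = session.currentFrame?.camera else { return }

    let skeleton = bodyAnchor.skeleton

    // Joint transforms in skeleton model space (relative to the body anchor)
    guard let rightArmModel = skeleton.modelTransform(for: ARSkeleton.JointName(rawValue: "right_arm_joint")),
          let rootModel = skeleton.modelTransform(for: .root) else { return }

    // Convert to world space by pre-multiplying with the body anchor's transform
    let rightArmWorld = bodyAnchor.transform * rightArmModel
    let rootWorld = bodyAnchor.transform * rootModel

    // Take the translation column as a world-space position
    let rightArmPosition = simd_make_float3(rightArmWorld.columns.3)
    let rootPosition = simd_make_float3(rootWorld.columns.3)

    // Project the world-space positions into the viewport
    let viewportSize = UIScreen.main.bounds.size
    let pRight = camera.projectPoint(rightArmPosition, orientation: .portrait, viewportSize: viewportSize)
    let pRoot = camera.projectPoint(rootPosition, orientation: .portrait, viewportSize: viewportSize)

    humanJointsView.points = [pRight, pRoot]
    humanJointsView.setNeedsDisplay()
}

Also make sure the view is redrawn after the points change (the setNeedsDisplay() call above), otherwise draw(_:) will not be invoked again.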