ARKit support ear, nose, neck and finger detection?

Time:06-29

I am working on a native Swift project and have tried ARKit for face detection. I would like to create a virtual jewellery app.


Scenario 1: If the user selects earrings, I have to display them at the ear landmark.


Scenario 2: If the user selects a bracelet, I have to display the bracelet on the hand.


Scenario 3: If the user selects a ring, I have to show the ring on their fingers.


Scenario 4: If the user selects a chain, I have to show the chain around their neck.



Which framework supports all of the above cases?

CodePudding user response:

Use VNDetectHumanBodyPoseRequest to detect the body itself. It will give you the positions of the nose, neck, wrists, etc. It does not tell you how to scale the jewellery, though; you will need to do additional work for that.

https://developer.apple.com/documentation/vision/detecting_human_body_poses_in_images
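A minimal sketch of that approach, assuming you already have a `CGImage` to analyze (the function name and confidence threshold here are illustrative, not from the original answer):

```swift
import Vision

// Detect body landmarks with Vision's VNDetectHumanBodyPoseRequest.
// Recognized points come back in normalized image coordinates
// (0...1, origin at the bottom-left), so you must convert them to
// your view's coordinate space before placing any jewellery overlay.
func detectBodyLandmarks(in image: CGImage) {
    let request = VNDetectHumanBodyPoseRequest { request, _ in
        guard let observations = request.results as? [VNHumanBodyPoseObservation] else { return }
        for observation in observations {
            // Neck point: a candidate anchor for a chain/necklace.
            if let neck = try? observation.recognizedPoint(.neck), neck.confidence > 0.3 {
                print("Neck at \(neck.location)")
            }
            // Wrist point: a candidate anchor for a bracelet.
            if let wrist = try? observation.recognizedPoint(.leftWrist), wrist.confidence > 0.3 {
                print("Left wrist at \(wrist.location)")
            }
        }
    }
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try? handler.perform([request])
}
```

Note that the body-pose request reports joints only down to the wrists; for individual finger joints (the ring scenario), the related VNDetectHumanHandPoseRequest in the same Vision framework provides hand landmarks.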
