How to find robot TCP in image with CalibrateHandEye


I'm currently developing a program for a robot with a camera attached to the end effector. The goal is to calculate where the robot's TCP appears in the camera frame. I'm using OpenCV in C++. The robot is a UR5 from Universal Robots.

My plan:

  1. collect multiple (8) data sets:
  • robot pose (XYZ in meters, directly from the robot controller)
  • robot rotation (rx, ry, rz in radians, directly from the robot controller)
  • take a picture of the calibration pattern for each step

  2. run calibrateCamera over the set of pictures to get tvec and rvec for every step

  3. run calibrateHandEye (a sketch of this step follows the list):

  • for t_gripper2base I use the robot pose
  • for R_gripper2base I use the robot rotation
  • for t_target2cam I use tvec from calibrateCamera
  • for R_target2cam I use rvec from calibrateCamera
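
Here is a minimal sketch of how steps 2 and 3 could be wired together with OpenCV. It assumes the UR5 pose is recorded as (x, y, z, rx, ry, rz), with (rx, ry, rz) being the axis-angle rotation vector reported by the controller; the function name `estimateCamToGripper` and the pose container are illustrative, not part of the original code:

```cpp
#include <opencv2/calib3d.hpp>
#include <opencv2/core.hpp>
#include <vector>

// Sketch: assemble the inputs for cv::calibrateHandEye.
// robotPoses holds the recorded UR5 poses as (x, y, z, rx, ry, rz);
// rvecsTarget2Cam / tvecsTarget2Cam come from calibrateCamera
// (one entry per calibration image).
cv::Mat estimateCamToGripper(const std::vector<cv::Vec6d>& robotPoses,
                             const std::vector<cv::Mat>& rvecsTarget2Cam,
                             const std::vector<cv::Mat>& tvecsTarget2Cam,
                             cv::Mat& t_cam2gripper)
{
    std::vector<cv::Mat> R_gripper2base, t_gripper2base;
    std::vector<cv::Mat> R_target2cam, t_target2cam;

    for (size_t i = 0; i < robotPoses.size(); ++i) {
        const cv::Vec6d& p = robotPoses[i];

        // TCP position in the robot base frame (meters).
        cv::Mat t = (cv::Mat_<double>(3, 1) << p[0], p[1], p[2]);
        t_gripper2base.push_back(t);

        // Axis-angle (rx, ry, rz) -> 3x3 rotation matrix.
        cv::Mat rvec = (cv::Mat_<double>(3, 1) << p[3], p[4], p[5]);
        cv::Mat R;
        cv::Rodrigues(rvec, R);
        R_gripper2base.push_back(R);

        // Pattern pose relative to the camera, converted to 3x3 + 3x1.
        cv::Mat R_tc;
        cv::Rodrigues(rvecsTarget2Cam[i], R_tc);
        R_target2cam.push_back(R_tc);
        t_target2cam.push_back(tvecsTarget2Cam[i]);
    }

    cv::Mat R_cam2gripper;
    cv::calibrateHandEye(R_gripper2base, t_gripper2base,
                         R_target2cam, t_target2cam,
                         R_cam2gripper, t_cam2gripper,
                         cv::CALIB_HAND_EYE_TSAI);
    return R_cam2gripper;
}
```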

I seem to get correct values (I measured the distance from the camera to the TCP, and t_cam2gripper seems to be correct).

Translation vector target to cam: 
[-0.0001052803107026547;
 -0.0780872727019615;
 -0.1243323507069755]

Rotation matrix target to cam: 
[0.9999922523048892, 0.002655868335207422, -0.002905459271957709;
 -0.001229768871633805, 0.9119334002787367, 0.4103363999508009;
 0.003739384804660116, -0.4103296477461107, 0.9119296010009958]

The formula to transform the point from TCP coordinates to the image should look like this:

  • (u, v, w) = C * T * (X, Y, Z, 1)

But after the division by w my values are still way off (they should be around (600, 600)). A sketch of this projection follows the output below.
Actual image position vector homogenized: 
[1778.542462313536;
 -1626.72483032188;
 1]
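
For reference, here is a minimal sketch of that projection as written, assuming C is the 3x3 camera matrix from calibrateCamera and T a 4x4 transform from TCP coordinates into the camera frame (the function name `projectHomogeneous` is illustrative); the answers below explain which transform and which point actually have to go into it:

```cpp
// Sketch of the projection described above: (u, v, w) = C * T * (X, Y, Z, 1),
// followed by the division by w. C is the 3x3 camera matrix (CV_64F),
// T a 4x4 transform from TCP coordinates into the camera frame (CV_64F).
cv::Point2d projectHomogeneous(const cv::Mat& C, const cv::Mat& T,
                               const cv::Vec3d& point)
{
    cv::Mat ph = (cv::Mat_<double>(4, 1) << point[0], point[1], point[2], 1.0);
    cv::Mat uvw = C * T.rowRange(0, 3) * ph;   // 3x1 homogeneous pixel vector
    double w = uvw.at<double>(2);
    return cv::Point2d(uvw.at<double>(0) / w, uvw.at<double>(1) / w);
}
```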

CodePudding user response:

You need to use solvePnP to get rvec and tvec for each image separately, so that you end up with a list of rvecs and a list of tvecs whose length equals the number of images. To get a matching list of rvec and tvec for the Gripper_to_base transformation, derive the R and T matrices from the robot kinematics, which take the rotation angles as input. For each pose, feed the angle data into those R and T matrices to get rvec and tvec, building lists of the same length. Then pass them to the calibrateHandEye function. A minimal solvePnP loop is sketched below.
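
A minimal sketch of that per-image solvePnP loop, assuming objectPoints are the pattern corners expressed in the pattern's own frame and imagePoints[i] are the corners detected in image i (all names are illustrative):

```cpp
#include <opencv2/calib3d.hpp>
#include <vector>

// Sketch: one solvePnP call per calibration image gives the pattern pose
// (rvec, tvec) relative to the camera for that image. The resulting lists
// have the same length as the number of images and can be fed into
// cv::calibrateHandEye as R_target2cam / t_target2cam (after cv::Rodrigues).
void targetPosesPerImage(const std::vector<cv::Point3f>& objectPoints,
                         const std::vector<std::vector<cv::Point2f>>& imagePoints,
                         const cv::Mat& cameraMatrix, const cv::Mat& distCoeffs,
                         std::vector<cv::Mat>& rvecs, std::vector<cv::Mat>& tvecs)
{
    for (const auto& corners : imagePoints) {
        cv::Mat rvec, tvec;
        cv::solvePnP(objectPoints, corners, cameraMatrix, distCoeffs,
                     rvec, tvec);
        rvecs.push_back(rvec);
        tvecs.push_back(tvec);
    }
}
```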

CodePudding user response:

I was finally able to resolve the problem. The calibration was correct, but there were two mistakes in my approach:

  • To find my TCP I used the TCP coordinates from the robot controller. Instead I had to use (0, 0, 0), which is the TCP in the TCP coordinate system.
  • The second mistake was to use the transformation matrix built from the R and t returned by calibrateHandEye. Instead I had to use the inverse of that transformation matrix.
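
Putting both fixes together, a sketch of the corrected projection could look like this, assuming R_cam2gripper / t_cam2gripper are the outputs of calibrateHandEye and cameraMatrix / distCoeffs come from calibrateCamera (the function name `projectTcp` is illustrative):

```cpp
#include <opencv2/calib3d.hpp>
#include <vector>

// Sketch of the corrected projection: take the *inverse* of the cam2gripper
// transform returned by calibrateHandEye (i.e. gripper2cam) and project the
// point (0, 0, 0), which is the TCP expressed in its own coordinate system.
cv::Point2f projectTcp(const cv::Mat& R_cam2gripper,
                       const cv::Mat& t_cam2gripper,
                       const cv::Mat& cameraMatrix,
                       const cv::Mat& distCoeffs)
{
    // Invert the rigid transform: R' = R^T, t' = -R^T * t.
    cv::Mat R_gripper2cam = R_cam2gripper.t();
    cv::Mat t_gripper2cam = -(R_gripper2cam * t_cam2gripper);

    // The TCP in its own coordinate system is the origin.
    std::vector<cv::Point3f> tcp = { cv::Point3f(0.f, 0.f, 0.f) };

    // projectPoints expects the rotation as a Rodrigues vector.
    cv::Mat rvec;
    cv::Rodrigues(R_gripper2cam, rvec);

    std::vector<cv::Point2f> img;
    cv::projectPoints(tcp, rvec, t_gripper2cam, cameraMatrix, distCoeffs, img);
    return img[0];
}
```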