UR5+vision - coordinate system difficulties

I’m having some trouble with a UR5/Cognex inspection system I’m working with.

I have a camera mounted on the end of the arm that locates a part on a conveyor and then sends coordinates (in millimeters, relative to the center of the camera's field of view) to the UR.

The UR should then use those coordinates to guide the TCP of the picking tool directly onto the part. However, it is without fail several centimeters off when trying to pick from the conveyor belt. Conveyor tracking is working fine; the robot will follow a part perfectly, but in the wrong spot.

Here's the problem: when I had the program working in my office, I had a carefully calibrated feature whose origin and x-y axes matched the camera's origin and axes exactly on the platform where the part sat. Now that the robot is supposed to be mobile and set up in various places, that won't be possible anymore. I need to determine x and y offsets to account for the fact that the origin of the conveyor belt "feature" may be a couple of feet away from the camera's center.

I thought I could use get_actual_tcp_pose() and then grab x and y offsets from the result. The problem with that is that the script function get_actual_tcp_pose() returns the position of the tool center point in base coordinates, and I need conveyor belt feature coordinates. I've gotten very close results with the following code:

pose1 = get_actual_tcp_pose()
# this assigns a pose value to the current camera location, in base coordinates
pose2 = conveyor_belt
# conveyor_belt is the name of the conveyor feature; this line assigns a pose value to the location of its origin, in base coordinates
result = pose_trans(pose_inv(pose2), pose1)
X_offset = result[0]
Y_offset = result[1]
# These last three lines determine a pose vector from the tool center point to the conveyor feature origin and grab the x and y offsets from it, but in the TCP coordinate system, not the conveyor's like I wanted

So now the problem is that the TCP's axes aren't pointing in the same direction as the conveyor's, so it's still off, and the coordinate rotation I've tried applying around the z-axis isn't fixing it.

I’ve also tried just doing all my moves in the base coordinates, with the same issue that the base x-y axes are not pointing the same direction as the camera’s, so it’s still off.

I don’t want to simply put in a manual offset because that’s going to change every time. The robot and camera need to do this without user input.

So after beating my head against this problem for several days, I’m hoping someone has ideas.


What did work was a brute-force method of going over to the move screen, checking the x and y location of the TCP relative to the conveyor belt feature, and physically typing them into my program as offsets. That won’t be a good long-term solution, though.

I think if the robot is capable of showing me those offsets on the move screen, it should be capable of grabbing those values in a program using script, right? I just can’t figure out how to do it.

Ok, I’ve solved my own problem again…

The code with coordinate rotation does work. So from the first post's code sample, I redefine:

X_offset = result[0]*cos(result[5]) - result[1]*sin(result[5])
Y_offset = result[0]*sin(result[5]) + result[1]*cos(result[5])

With result[0] being the x distance from TCP to plane feature origin in TCP coordinates, result[1] being the y distance, and result[5] being the rotation between the TCP and the plane feature about the z-axis, in radians. You may have to swap x and y around or reverse signs if your z axes are pointing in opposite directions, but play with it a little and you can get the right numbers.

This gives me the exact x and y coordinates of the TCP shown in the move screen when I select the conveyor plane feature as the reference.
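Putting the pieces from these posts together, here is a minimal sketch of the whole calculation in URScript. conveyor_belt is the conveyor plane feature from the first post; cam_x, cam_y, and the part_in_* names are hypothetical stand-ins for the camera result (already converted to meters), and the rotation part of the part pose is simply left at zero:

pose1 = get_actual_tcp_pose()
# current camera/TCP pose in base coordinates
result = pose_trans(pose_inv(conveyor_belt), pose1)
# rotate the offsets into the conveyor feature's x-y directions
X_offset = result[0]*cos(result[5]) - result[1]*sin(result[5])
Y_offset = result[0]*sin(result[5]) + result[1]*cos(result[5])
# add the camera's part coordinates to get the part position in the feature frame
# (swap axes or flip signs as noted above if your z axes point opposite ways)
part_in_feature = p[X_offset + cam_x, Y_offset + cam_y, 0, 0, 0, 0]
# transform back to base coordinates for use in a move
part_in_base = pose_trans(conveyor_belt, part_in_feature)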

Now, the other reason my coordinates were off every time was that I had previously calibrated my camera's field of view to a grid of 10 mm squares, relating them to pixels on the camera, and during that calibration the camera had been much closer to the conveyor belt. So because the camera was roughly twice the distance away this time, it was sending me coordinates that were about a factor of 2 too large. Kind of silly, but if someone else runs into these problems, that is something to check. Hope this helps someone.
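For what it's worth, with this kind of fixed 2D grid calibration the millimeters-per-pixel scale changes roughly in proportion to the camera-to-belt distance, so a quick check on the standoff can flag the problem before anything moves. A minimal sketch, with all numbers hypothetical:

z_calib = 0.30  # camera-to-belt distance when the 10 mm grid was calibrated, in meters
z_now = 0.60    # camera-to-belt distance at the new snapshot position, in meters
scale_error = z_now / z_calib
# ~2 here; anything far from 1 means the camera should be recalibrated
# (or the snapshot retaken) at the current working distance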



@anna did you try using pose_trans()? We use it a lot for transposing a coordinate from one frame of reference to another in many of our programs. It returns values in the base coordinate system, so the result can be used directly in a move function. This is what we use when we have the camera plane taught as a feature, to transpose the coordinates that the camera is returning to us. It also allows the camera plane, for instance, to be changed programmatically on the fly so the robot goes to a new position while the camera still returns the "same" coordinates, without realizing that the actual snapshot position has changed. Then we just use pose_trans() to get the actual coordinates of where to send the robot. You can then do your pose rotation if you need to, as you discovered.
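A minimal sketch of that pattern, assuming cameraRefPlane is the taught plane feature and the camera result has already been converted to meters (the other names and values are hypothetical):

cam_x = 0.045   # part x from the camera, relative to the plane origin, in meters
cam_y = -0.012  # part y from the camera, in meters
part_in_plane = p[cam_x, cam_y, 0, 0, 0, 0]
# same point expressed in base coordinates, usable directly in a move
part_in_base = pose_trans(cameraRefPlane, part_in_plane)
movel(part_in_base, a=1.2, v=0.25)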


@mbush
I used pose_trans() for part of my code, but I thought it was giving me TCP coordinates? Do you have an example of how you do that? I have not created a camera plane feature before, and although I have gotten my setup working now, I’d like to see how other people do this. Thanks!

@mbush If you are talking about using the calibration grid for the camera, and setting up a plane feature with x and y axes that match the camera’s, I did that, but then made the mistake of taking pictures from a different z-distance later, which threw me off.

Here is a picture of code that is currently running on a production robot. The coordinates being returned are relative to the cameraRefPlane feature, so 0,0 of the camera is also 0,0 of that feature. We are then setting up a pick location, a prePick location, and a blastPick location (used to scatter parts around the one that we want to pick).

All of this is being done in a thread. Then in the main program we are just moving to these variable waypoints where needed.
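The screenshot itself isn't reproduced here, but a rough sketch of what such a thread might look like is below. Everything except cameraRefPlane, pose_trans(), and pose_add() is hypothetical, not the actual production code:

cam_x = 0.0  # assumed to be updated elsewhere from the camera result, in meters
cam_y = 0.0  # relative to the cameraRefPlane origin
thread camera_thread():
  while (True):
    part_in_plane = p[cam_x, cam_y, 0, 0, 0, 0]
    pickPos = pose_trans(cameraRefPlane, part_in_plane)
    prePickPos = pose_add(pickPos, p[0, 0, 0.05, 0, 0, 0])    # 50 mm up along base z
    blastPickPos = pose_add(pickPos, p[0.02, 0, 0, 0, 0, 0])  # small offset to scatter neighboring parts
    sync()
  end
end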

Did this explain what I was referring to?


@mbush
Yes, that makes sense. Overall, I think we are doing the same thing in a different order. I defined a plane that lined up with our camera, the way you are describing, for the first part we inspected. Then, since our robot is mobile, I moved it to a new location, so the camera and reference plane are no longer lined up for the second inspection program. Rather than redoing the calibration every time, I'm trying to leave the reference plane where it is and calculate x and y offsets from its origin to add to the camera's results. I now have that calculation working, and the robot is running as I type this :)

So basically you have an xCorrection and a yCorrection, and if I understand correctly, I am calculating those with my code, where the corrections are my x and y offsets from the original reference plane. I think we're on the same page. Thanks for your help!


I have a similar problem. I am doing the transpose in a similar way, but how do you create the "cameraRefPlane" mentioned above, which is the camera's plane?

Your camera should have a calibration grid that you can print off. The one I used is from Cognex; they have a chessboard grid with a feature to mark the coordinate center. Each square is exactly 1 cm, so the calibration determines how many pixels per centimeter there are, where the origin of the image is, and what direction the x and y axes point in the camera's field of view.

Then, on the robot side, leave the grid taped to the conveyor belt while you create a plane feature to define the belt. I created a tool center point for the camera lens, then used that as the default TCP while I defined the three points for the plane: the camera lens right against the origin of the grid for Point 1, then against the ends of the x and y axes for the other two points (I can't remember for sure which of Point 2 and Point 3 is x and which is y, so check on this).

What you should end up with is a UR plane feature that lines up perfectly with your camera’s field of view, and as long as you don’t move the robot or change the focal length suddenly, the coordinates of an object the camera finds should line up exactly in the plane feature coordinates.

If you do move the robot someplace else, or start looking for parts in a different location from your feature plane, then use the transpose to get your distance from the plane feature origin and add your camera’s location results to get the actual part location. The coordinate rotation and transformation I used above worked perfectly.


Thank you so much. That helps and simplifies my code quite a bit!

On the Installation tab you can define a new feature; choose Plane.

Thanks. I am able to use the point itself and call pose_trans() to figure out the coordinates on that plane. Works well.

@jbm
I have run into problems with my coordinate rotation calculations above because the resulting pose angle values Rx, Ry and Rz are not what I would expect, and I am not sure how they are calculated.

For instance, if my plane feature coordinate system is rotated π/4 radians about the z-axis from the base feature, after performing the script calculations above I would expect to see result[5] = Rz = 0.785 and Rx = Ry = 0, but this is nowhere close to what I get. The actual values are very strange.

What I think I need is RPY values so I can use the RPYz as my θ for coordinate rotation. Do you know how Rx, Ry, Rz are calculated by the UR, and how I can get RPY values instead? Thanks!


@anna

The axis-angle rotation vector that a pose is generically represented with can be rather hard to grasp just off the top of your head. A rotation about a single axis reads fine, but multiple rotations at a time will most likely not give what you expect; however, the result is most likely correct anyway. There are also multiple representations of the same rotation.

Did you try moving to that position, to see if the robot actually went to the expected place?

You can, however, easily convert from a rotation vector to RPY and vice versa using the script functions rpy = rotvec2rpy(rotvec) and rotvec = rpy2rotvec(rpy).
They are both explained in the script manual.
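As a small example, applying that to the pose from the earlier posts (result is the pose computed with pose_trans() above; only the conversion itself comes from the script manual, the rest is a sketch):

rpy = rotvec2rpy([result[3], result[4], result[5]])
theta = rpy[2]  # rotation about z (yaw), in radians
X_offset = result[0]*cos(theta) - result[1]*sin(theta)
Y_offset = result[0]*sin(theta) + result[1]*cos(theta)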


