Universal Robots Forum

MoveL relative to a plane feature

I’m having trouble with the feature coordinate system, for a problem that seemed easy to me, but after hours of trying and searching, I decided to post a topic.

I’m looking for a way to use a plane feature to move the robot TCP relative to that plane (in script), just as we can do in manual mode when we select the feature.

I have already seen the answers in a thread about a closely related problem (How to get tcp pose in feature coordinates?), but I can’t arrive at a working equation.

For now, I’m just trying to use it to “not move”: if the robot moves, my equation is wrong.
(Just as when calling “movel(get_actual_tcp_pose())” or “movel(pose_add(get_actual_tcp_pose(), p[0,0,0,0,0,0]))”.)

Can you help me find a way to solve my problem? Or is there a function I’ve missed that would do it for me, like an extra argument to the movel function? I find it strange that manual mode and PolyScope block programming support moving relative to a feature, but the scripting language doesn’t.

Thank you

For a better understanding, the context of the problem is:
I have a 2D trajectory that I want my TCP to follow, but relative to a user-defined feature plane, not the base plane.

Thank you again!

I got it working according to the post I made, but not exactly the way the post had it. I will post my functioning code here for you later today.


So, here is what mine looks like. Sorry my program is a little messy; I haven’t cleaned it up since developing it and I’m not really a programmer to begin with :smirk:

The UR5 is watching a moving conveyor for a part, and when a part moves underneath the camera, the camera sends x and y coordinates for the part in mm. The UR receives this information in a vector:

conv_find := [3 (for data points to follow, 0 if no data), 1 (for part found, 0 if no part), x (mm), y (mm)].

The x and y are given relative to the center of the photograph, which is the origin. This is important because the camera is mounted on the end of the arm and is defined as the default TCP, so my get_actual_tcp_pose() returns the location of the center of the camera’s lens, in base coordinates.

Then I run this next portion of code to convert my camera’s position from base coordinates to feature plane coordinates. The feature plane I am using is called “conveyor_belt” and is oriented with its x and y axes parallel to those in the camera’s field of view. Because the z-axis of my conveyor_belt feature points downward by default, I have to swap my x and y from the camera and flip their signs, as you can see in the code.
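That base-to-feature conversion can be sketched in URScript like this (a minimal sketch using the built-in pose_inv and pose_trans; conv_pos_inv and conveyor_belt come from my program, the rest of the names are illustrative):

```urscript
# Camera TCP pose, in base coordinates:
cam_in_base = get_actual_tcp_pose()
# pose_inv(conveyor_belt) gives the base pose expressed in the feature
# frame, so composing it with the camera pose yields the camera's
# position relative to the conveyor_belt plane:
conv_pos_inv = pose_trans(pose_inv(conveyor_belt), cam_in_base)
```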

Then I define variable positions for the robot to navigate to in order to pick the part. I do this inside conveyor tracking so that the UR is already following the part while it calculates positions. All I have to do is add the x and y location of my part from the camera, to the x and y location of my TCP in my feature plane.

Note, you can see that I have my fancy coordinate rotations commented out and just ended up using the direct x and y values from my conv_pos_inv calculation. It got too complex doing those so it is just easier to make sure your camera’s field of view and your conveyor belt feature axes are lined up so they point the same way. There is some unnecessary code as a result of my playing with this; it would have been simpler to define:

```
conv_hover := p[ Pc_x + x_conv, Pc_y + y_conv, hover_z_conv, Rc_x, Rc_y, Rc_z]
```

where x_conv and y_conv are the part coordinates from my camera, and hover_z_conv is a specified z coordinate that I define at the beginning of the program when I initialize variables. I put the rotation coordinates I calculated in there, too, because if I don’t, the UR does some very weird and unpredictable moves.
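In URScript form, that hover pose could look something like the sketch below. The conv_find indices follow the vector layout described above; the sign flips and mm-to-m scaling match my setup, but the exact names (and reading the rotation directly out of the feature-frame pose) are assumptions, not my literal program:

```urscript
# Part offsets from the camera, mm -> m, signs flipped because the
# feature z-axis points down (assumed scaling for this sketch):
x_conv = -conv_find[2] / 1000
y_conv = -conv_find[3] / 1000
Pc = conv_pos_inv  # camera TCP pose in conveyor_belt coordinates
# Hover pose above the part, in feature coordinates; keep the rotation
# components, or the UR makes very weird and unpredictable moves:
conv_hover = p[Pc[0] + x_conv, Pc[1] + y_conv, hover_z_conv, Pc[3], Pc[4], Pc[5]]
```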

When I perform my move to the variable waypoint I defined, I have the feature plane I want to navigate in selected, and I use my picking end of arm tool as the TCP for the move. This navigates my picking tool right on top of the part, where the camera lens was previously hovering.

One other thing to watch out for: Make sure your camera is perfectly vertical with respect to the conveyor belt. If it is at an angle relative to your conveyor belt, that will introduce position error.

I hope that helps.


Thank you very much for your help.
I found my mistake: I didn’t understand how pose_trans worked.
I had simply swapped my arguments… I was sure the point we want to reach came first, and the feature second.
So, to keep it simple, I can move to my plane’s origin with “movel(pose_trans(Plan_1, p[0,0,0,0,0,0]))”, and by changing x and y in the second argument I can travel across my plane.
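For example, a small square traced in the Plan_1 feature plane would look like this (a sketch; the 50 mm offsets are arbitrary):

```urscript
movel(pose_trans(Plan_1, p[0,    0,    0, 0, 0, 0]))  # plane origin
movel(pose_trans(Plan_1, p[0.05, 0,    0, 0, 0, 0]))  # 50 mm along plane x
movel(pose_trans(Plan_1, p[0.05, 0.05, 0, 0, 0, 0]))  # plus 50 mm along plane y
movel(pose_trans(Plan_1, p[0,    0.05, 0, 0, 0, 0]))
movel(pose_trans(Plan_1, p[0,    0,    0, 0, 0, 0]))  # back to origin
```

The second argument keeps a zero rotation, so the TCP keeps the plane’s orientation throughout.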

Thank you again for your help.

Exactly, the second argument is given with respect to the feature.

base_to_target = pose_trans(base_to_feature, feature_to_target)

base_to_feature would simply be the Feature coordinate, as these are saved wrt. Base.
Also check some more illustrations of pose_trans in this article.