Move robot to a user-entered position (UR5e)


We want the robot to move to a position that was previously entered via user input. How is something like this programmed?
Our application: the robot is mounted on a grid table, in the upper right corner at X=800, Y=800. The lower left corner is stored as a coordinate system with X=0, Y=0. The user should now enter an input, for example X=200, Y=200, and the robot should move to this absolute position relative to that coordinate system.

Thanks & best regards

If you just want to use it like an XY gantry, Features will be your friend here. Teach a Feature (it doesn’t even matter where it is) and then teach a Waypoint right at 0,0.

Now prompt your operator for an X offset and a Y offset. Use an assignment node to pose_add() the offsets to the Feature and then just MoveL/J/P to the one saved waypoint.


Just be sure to select the Feature from the POSE dropdown when feeding it into the pose_add() function; it will come in with _const at the end. Use the normal variable version of the Feature in the Assignment. This way, the single taught point will shift around with your Feature, offset by the user’s inputs. Also be sure to convert the user’s inputs to METERS: if they enter inches, grid units, or whatever, convert to meters first before pose_add()-ing it.
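A minimal sketch of that program in URScript, assuming a Feature named "table" (so table_const is its constant pose), a variable waypoint waypoint_1 taught relative to that Feature at the table’s 0,0 corner, and operator inputs x_mm/y_mm collected earlier via Assignment nodes (all names are illustrative):

```urscript
# Assumes x_mm and y_mm were filled in earlier by Assignment nodes
# configured for operator input, in millimetres.
x_m = x_mm / 1000.0   # convert millimetres to metres
y_m = y_mm / 1000.0
# Shift the variable Feature away from its taught constant pose:
table = pose_add(table_const, p[x_m, y_m, 0, 0, 0, 0])
# waypoint_1 is taught relative to the "table" Feature, so it
# follows the shifted Feature automatically.
movel(waypoint_1)
```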


First of all, thanks for the help 🙂
I tried it right away, but two problems remain.
I take the X and Y values entered by the user in mm and divide by 1000, which should be correct, but the robot then only moves in X (Y stays at 0), and it moves a different distance in X than the value the user entered. Example: user input X=50 mm, Y=50 mm; robot moves X=70 mm, Y=0 mm. How can that be?

What we found out is that the user input, e.g. X=50 mm, Y=50 mm, always refers to the robot’s base coordinate system. The robot moves those distances, but not along our table’s coordinate system, even though we selected our coordinate system in the move command.
Worth noting: the robot’s base coordinate system has a different orientation than our table’s coordinate system.

In addition, I cannot teach Waypoint_1 at X=0 mm, Y=0 mm, because the zero point of our table is out of the robot’s reach. We still want to use that corner as the origin, because the grid scale printed on the table refers to it. Is there a way to command positions relative to this unreachable zero point from the user input?

For the test I have now set Waypoint_1 to X=200, Y=200, because this point is reachable by the robot.


  1. Not moving the correct values per the user input:

This is almost surely due to imprecise teaching of the waypoint and/or initializing Y as an integer. Be sure to set these values to, say, 50.0 and 50.0. If you initialize them as integers, dividing by 1000 results in 0 (integer division).
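To illustrate the integer-division pitfall described above (a sketch; variable names are arbitrary):

```urscript
# Integer literal divided by an integer literal: integer division,
# per the note above the result is 0, not 0.05.
x = 50
x_m = x / 1000
# Float literals force floating-point division: result is 0.05.
y = 50.0
y_m = y / 1000.0
```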

  2. Different coordinate systems:

I should have considered that when I outlined this method. You will also have to transform the points so the offsets are applied along the X and Y of your Feature. Here’s a function to put in a script file; call it instead of the built-in pose_add():

def getOffsetRelativeToFeature(x_offset, y_offset, z_offset, feature, poseToShift):
  # Express poseToShift in the Feature's frame, add the offsets there,
  # then transform the result back into the base frame.
  return pose_trans(feature, pose_add(pose_trans(pose_inv(feature), poseToShift), p[x_offset, y_offset, z_offset, 0, 0, 0]))
end

Just pass it the X offset, the Y offset, and (in your case) 0 for the Z offset, plus your plane_const and then waypoint_1. Be aware that this Feature needs to be extremely accurate: any error in its orientation will become very obvious as the offset grows.
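A hedged usage sketch, assuming the plane Feature is called plane_1 (so plane_1_const is its constant pose) and x_mm/y_mm are the operator’s inputs in millimetres:

```urscript
# Convert the operator's millimetre inputs to metres, build the
# target pose offset along the plane Feature's X/Y, then move there.
target = getOffsetRelativeToFeature(x_mm / 1000.0, y_mm / 1000.0, 0.0, plane_1_const, waypoint_1)
movel(target)
```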

  3. Can’t reach the 0,0 point:

You can teach the waypoint wherever you want; you just need a little extra math to re-center it at 0,0 behind the scenes. If the point you taught as Waypoint_1 sits at 200, 200 on the table, take the user inputs, subtract 200 mm from each, and then divide by 1000 to convert to meters before sending them to the function.
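Putting the re-centering together (a sketch; the Feature and variable names are assumptions as above, and the 200 mm offsets match where Waypoint_1 was taught in this example):

```urscript
# Waypoint_1 was taught at table position (200 mm, 200 mm), so shift
# the operator's absolute target back by 200 mm on each axis, then
# convert to metres before handing it to the transform function.
x_off = (x_mm - 200) / 1000.0
y_off = (y_mm - 200) / 1000.0
target = getOffsetRelativeToFeature(x_off, y_off, 0.0, plane_1_const, waypoint_1)
movel(target)
```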