Relative movements to a variable starting point with UR10e

Hi there,

For context, I’ll provide a simple scenario.

  1. A worker places an object on a variable position within the workspace of the robot.
  2. The worker then moves the end-effector of the robot to the starting position of the object.
  3. The robot movements then start executing from this custom/variable starting point.

This scenario calls for moving the end-effector of the robot to a variable position; the rest of the script should then execute relative to this starting position. I am having trouble figuring out how to do this via the teach pendant.

Right now I am creating a minimal example, where the robot moves between two points, point A and B.

I use relative waypoints to program the movements between each point.

The problem is that when I move the end-effector to a variable location, the transformation to the new relative coordinate system does take place, but only for X, Y, Z.

In other words, if I rotate the tool 45 degrees, the path between A and B should also be rotated 45 degrees, but it keeps the same orientation it had before the tool was rotated.

Under MoveL my feature is set to Base, since using the TCP gives some weird behaviour, but maybe that is what I need to do? :blush:
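The behaviour I’m seeing can be reproduced with a simplified planar (x, y, yaw) sketch in Python; the poses and offsets below are made up purely for illustration, not taken from the real program:

```python
import math

# Taught path, expressed as offsets from the original start pose (x, y in metres)
path_offsets = [(0.1, 0.0), (0.1, 0.2)]  # point A, point B

def translate_only(start, offsets):
    """What a translation-only relative move does: shift X/Y, ignore the yaw of the start pose."""
    sx, sy, syaw = start
    return [(sx + dx, sy + dy) for dx, dy in offsets]

# New start pose, rotated 45 degrees
new_start = (0.5, 0.5, math.radians(45))

# The path is translated to the new start, but its shape keeps the old orientation:
# point A still lies straight along the base X axis from the start,
# instead of along the rotated 45-degree direction.
print(translate_only(new_start, path_offsets))
```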

Hey, you changed your message while I was typing! :slight_smile:

You need your points expressed relative to the starting point. And use a plane feature, defined at the current position, to make the moves.
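In planar (x, y, yaw) terms, the suggestion amounts to something like this. This is a hedged Python sketch, not actual URScript; on the robot, the plane feature and the pose functions do the same thing in full 6-DOF:

```python
import math

def to_global(frame, local):
    """Express a local (x, y) point in the global frame given by frame = (x, y, yaw)."""
    fx, fy, fyaw = frame
    lx, ly = local
    c, s = math.cos(fyaw), math.sin(fyaw)
    return (fx + c * lx - s * ly, fy + s * lx + c * ly)

# Points taught RELATIVE to the starting pose (the plane feature)
pA_local = (0.1, 0.0)
pB_local = (0.1, 0.2)

# The worker moves the tool to a new start pose, rotated 45 degrees
start = (0.5, 0.5, math.radians(45))

# Both the position AND the orientation of the path now follow the start pose
print(to_global(start, pA_local))
print(to_global(start, pB_local))
```

The key design point: because the points are stored in the moving frame, rotating the frame rotates the whole path with it, which is exactly what translation-only relative waypoints cannot do.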

1 Like

Thanks for such a quick answer! :blush:

So can you tell me how I express my points relative to the starting point? I thought I was doing that when using the Relative Position Waypoints.

In regards to the plane feature, what do you mean when saying defined at the current position? Is it the current TCP, and if so, do I need to write something custom in order to set that up?

Sorry if these questions are basic; I am new to using these robots.

Something like this:
image

Points and userFrame are defined in the installation.
image

1 Like

I see, thanks for that!

So basically I will have to do the transformation for each point before moving the robot. Then rather than using Relative Position waypoints, I will use Variable Position waypoints based on the starting pose, since Relative does not take the starting rotation into account.

I think that is manageable :smile:

I’m still confused about the userFrame plane though?
Could you explain what you’re doing here and why, rather than e.g. using the base or tool frame?

The idea is to move to pA, pB… relative to the starting point (pO here). So after doing the transformations, the starting point (saved in userFrame) is used as the local reference feature.

1 Like

If you make all the points off a plane and also make a reference point in that plane, then you can use pose_sub on your new reference point and add the result to your points in the plane; that will adjust all the points. The only thing to bear in mind is that it will not take the yaw of the point into account.
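A planar Python sketch of that offset trick, with hypothetical numbers (on the robot this would be pose_sub / pose_add on real 6-DOF poses); it also shows where the yaw caveat comes from:

```python
import math

def pose_sub(a, b):
    """Component-wise difference, mimicking URScript's pose_sub (simplified to x, y, yaw)."""
    return tuple(ai - bi for ai, bi in zip(a, b))

def pose_shift(p, offset):
    """Add a component-wise offset to a point; the offset's yaw does not rotate the point."""
    return (p[0] + offset[0], p[1] + offset[1], p[2] + offset[2])

old_ref = (0.0, 0.0, 0.0)                # reference point taught in the plane
new_ref = (0.5, 0.5, math.radians(45))   # where the worker placed the object

offset = pose_sub(new_ref, old_ref)

# Points taught in the plane, relative to old_ref
pA = (0.1, 0.0, 0.0)
pB = (0.1, 0.2, 0.0)

# All points shift by the same offset...
pA_new = pose_shift(pA, offset)
pB_new = pose_shift(pB, offset)

# ...but the path is only translated: the 45-degree yaw of new_ref does not
# rotate pA/pB around it, which is the caveat about yaw not being accounted for.
print(pA_new)
print(pB_new)
```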

1 Like

I implemented it as you said, and it works perfectly, so thanks for your help! :smile: