Using path_offset_set to offset entire program

Hi there! :grinning:

For context:
A welding program, including search routines, is created by a worker for an item on a welding table, e.g. with the starting position on a corner of the item.

Now the item is moved, e.g. 0.5 meters to the left of its original position, so we need to translate the program by 0.5 meters.

I would like a worker to be able to move the end-effector to the new starting position, i.e. the new X, Y, Z (and ideally also yaw) of the corner on the moved item.

The script would then calculate the offset between where the old item was programmed and the new item, and offset the entire program (including the search routines).

Here is some code I have tried setting up, but it does not currently work as expected; the movements always stay the same without being offset:

#Record the current TCP pose at the new starting position
relative_starting_point = get_actual_tcp_pose()

#This is the original start pose from the program.
pOriginal = p[0.5380162871794699, -0.5486028565561584, 0.41514857359159973, 2.2182829077273887, -2.2131184297607467, -0.004239291622238593]

#Calculate the offset between those two poses
pOffset = pose_sub(pOriginal, relative_starting_point)

path_offset_enable()

#I am trying to offset the item at least 50 cm, so I raise the limits here:
path_offset_set_max_offset(1, 30)
#The alpha filter value is set very low, or the program won't run.
path_offset_set_alpha_filter(0.000001)

#Define the offset thread:
thread OffsetFromCamera():
  while (True):
    #Continuously apply the computed offset (the second argument selects the offset frame)
    path_offset_set([pOffset[0], pOffset[1], pOffset[2], pOffset[3], pOffset[4], pOffset[5]], 2)
    sync()
  end
end
#Start the offset thread
relative_offset_thread = run OffsetFromCamera()

#From here on, I just run the rest of the program as-is, expecting the values to be offset in Cartesian space, e.g. as seen in the line below.
#Is this not the correct way to do it?
movej(get_inverse_kin(pOriginal), 1.3962634015954636, 1.0471975511965976, 0, 0.001)

Looking forward to your feedback!

Edit:
I guess I could do something like:
relative_starting_point = get_actual_tcp_pose()

pOriginal = p[0.5380162871794699, -0.5486028565561584, 0.41514857359159973, 2.2182829077273887, -2.2131184297607467, -0.004239291622238593]

pOffset = pose_sub(pOriginal, relative_starting_point)

and then manually offset each point in the program by adding the pOffset values to every single point. But that would be very inelegant (and tedious for complex programs).

You really just need to teach the points to a Feature, and then just programmatically change the Feature’s pose. Then all the moves that were programmed against that Feature will shift automatically. Path_offset functions are primarily used for modifying a Base path with some superposition of motion, like adding oscillation to create a weave.
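In script terms, that idea looks roughly like this. This is a sketch only: the feature name Plane_1 and all pose values are placeholder assumptions; in PolyScope you would make the feature variable and teach the waypoints against it.

```urscript
# Pose of the feature (the item corner), taught at programming time.
Plane_1 = p[0.538, -0.549, 0.415, 2.218, -2.213, -0.004]

# Waypoints are stored relative to the feature, not in base coordinates.
pCorner = p[0, 0, 0, 0, 0, 0]
pSeamEnd = p[0.2, 0, 0, 0, 0, 0]

# When the item moves, jog the TCP to the new corner and update the feature...
Plane_1 = get_actual_tcp_pose()

# ...and every move programmed against it shifts (and rotates) automatically.
movel(pose_trans(Plane_1, pCorner), a=1.2, v=0.25)
movel(pose_trans(Plane_1, pSeamEnd), a=1.2, v=0.25)
```

Because the waypoints are composed with the feature pose at move time, updating one variable retargets the whole program, search routines included.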


Perfect, thanks for the feedback!

For anyone with a similar problem: set a userFrame from the current TCP pose.
Then make the starting point of the program something easy to remember, like pO.

Now calculate pO_pX as pose_trans(pose_inv(pO), pX), i.e. each program point pX expressed relative to the original start pose.
Lastly, pose_trans(userFrame, pO_pX) gives you the corresponding pose at the new item location.
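That recipe can be sketched end-to-end like this (variable names are illustrative and the pose values are placeholders; note the argument order: each taught point is first expressed relative to the original start pose, then re-expressed in the new frame):

```urscript
# Taught at programming time: the original start pose (item corner).
pO = p[0.538, -0.549, 0.415, 2.218, -2.213, -0.004]

# At run time: jog the TCP to the corner of the moved item and record it.
userFrame = get_actual_tcp_pose()

# Re-express a taught program point pX at the new item location.
def shift_point(pX):
  pO_pX = pose_trans(pose_inv(pO), pX)  # pX relative to the original start
  return pose_trans(userFrame, pO_pX)   # same relative pose at the new start
end

# Example: a taught point 20 cm along X from the original corner.
pX1 = pose_trans(pO, p[0.2, 0, 0, 0, 0, 0])
movel(shift_point(pX1), a=1.2, v=0.25)
```

This composes the full poses, so it carries over the rotation as well, handling the yaw change and not just the 0.5 m translation.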