I’m trying to move a UR5 based on inputs from a Robotiq FT300. This software runs off the arm (on a separate box), using the various TCP interfaces to collect data and send commands.
Using the force values is straightforward, with something like:
pose_add(get_actual_tcp_pose(), p[scaled_forces, scaled_forces, scaled_forces, 0, 0, 0])
scaled_forces is calculated in my software (again, running on that separate box) by taking the FT300 signal, filtering it, applying forward kinematics so it’s in the right frame, and scaling it according to a logistic/sigmoid function.
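For concreteness, the scaling step in my software looks roughly like this (Python sketch; the function name and the gain constants are made up for illustration, not tuned values):

```python
import math

def sigmoid_scale(force, max_offset=0.05, stiffness=0.1):
    """Map a filtered, frame-corrected force reading (N) to a signed
    translation offset (m) via a logistic curve centred at zero.

    Zero force -> zero offset; large force saturates at +/- max_offset,
    so a hard push can never command an arbitrarily large step.
    """
    return max_offset * (2.0 / (1.0 + math.exp(-stiffness * force)) - 1.0)
```

The saturation is the main reason I chose a sigmoid over a plain linear gain: the offset fed into pose_add stays bounded no matter how hard the sensor is pushed.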
Where I’m confused is the orientation; admittedly I’m learning everything as I go, and this is all fairly new to me.
The documentation for pose_add says that it adds the x, y, z components and multiplies the Rx, Ry, Rz components. However, when I pass zeroes for the second rotation triplet (the first pose being the current pose), the orientation doesn’t change. Since multiplying by zero should always yield zero, it doesn’t feel like it’s actually multiplying; the behaviour looks more like addition, counter to the documentation.
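To illustrate where my confusion might come from: if the Rx, Ry, Rz triplet is an axis-angle rotation vector, then the all-zero vector is not "zero" in the multiplicative sense but the identity rotation, and "multiplies" would mean composing rotation matrices rather than multiplying components. A quick sketch of that interpretation (Python, my own reading rather than anything confirmed from the manual):

```python
import numpy as np

def rotvec_to_matrix(r):
    """Rodrigues' formula: axis-angle vector -> 3x3 rotation matrix."""
    r = np.asarray(r, dtype=float)
    theta = np.linalg.norm(r)
    if theta < 1e-12:
        return np.eye(3)  # the zero vector is the identity rotation
    k = r / theta  # unit rotation axis
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])  # skew-symmetric cross-product matrix
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

R_current = rotvec_to_matrix([0.5, -0.3, 1.1])  # hypothetical current orientation
R_zero = rotvec_to_matrix([0.0, 0.0, 0.0])      # the p[.., 0, 0, 0] rotation part
# If "multiplies" means matrix multiplication, then R_current @ I = R_current,
# i.e. the orientation is unchanged -- matching what I observe.
assert np.allclose(R_current @ R_zero, R_current)
```

Under that reading the observed behaviour would be consistent with the docs after all, but I’d appreciate confirmation.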
Additionally, it’s not completely clear to me what get_actual_tcp_pose is returning in the orientation triplet. I’ve seen other threads here saying the orientation is under-constrained, which is why the returned values may not match the Rx/Ry/Rz shown in Polyscope. But, and maybe this was a big coincidence, when I checked, the values from get_actual_tcp_pose seemed to match the RPY representation rather than the Rx/Ry/Rz representation. Is it possible that I’m getting RPY from get_actual_tcp_pose?
Thanks for any insights you have to offer!
Did you ever get any further information on what is happening with the rotation triplet in pose_add()? I’m trying to implement it in LabVIEW and cannot get the results to match what I see on the pendant. I tried element-wise multiplication as well as taking the cross product of the two triplets, but still no luck.
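My current best guess at a reference implementation (written in Python to sanity-check the LabVIEW port) is that neither element-wise multiplication nor a cross product will work: the two triplets are axis-angle rotation vectors and have to be composed as rotations. Going through quaternions avoids building the matrices:

```python
import math

def rotvec_to_quat(rv):
    """Axis-angle vector -> unit quaternion (w, x, y, z)."""
    theta = math.sqrt(sum(c * c for c in rv))
    if theta < 1e-12:
        return (1.0, 0.0, 0.0, 0.0)  # identity rotation
    s = math.sin(theta / 2.0) / theta
    return (math.cos(theta / 2.0), rv[0] * s, rv[1] * s, rv[2] * s)

def quat_mul(a, b):
    """Hamilton product: composes rotation a with rotation b."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw * bw - ax * bx - ay * by - az * bz,
            aw * bx + ax * bw + ay * bz - az * by,
            aw * by - ax * bz + ay * bw + az * bx,
            aw * bz + ax * by - ay * bx + az * bw)

def quat_to_rotvec(q):
    """Unit quaternion -> axis-angle vector."""
    w, x, y, z = q
    n = math.sqrt(x * x + y * y + z * z)
    if n < 1e-12:
        return (0.0, 0.0, 0.0)
    theta = 2.0 * math.atan2(n, w)
    return (x / n * theta, y / n * theta, z / n * theta)

def compose_rotvecs(r1, r2):
    """Equivalent (up to numerics) to multiplying the rotation matrices
    R1 @ R2 -- my reading of what pose_add does to the orientation part;
    please verify against your controller before trusting it.
    """
    return quat_to_rotvec(quat_mul(rotvec_to_quat(r1), rotvec_to_quat(r2)))
```

Note that composing with the zero vector returns the first rotation unchanged, which would match the behaviour the original poster saw when passing zeroes.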