Rotation Transformation for Robot Tracking

Hi,

I’m working on an experiment and trying to set up the necessary processes in advance.

In my experimental setup, I’m using a fixed camera located in a different position from the UR5e robot. The camera detects the target and provides the pose data in quaternion form:
[x, y, z, qx, qy, qz, qw]

As you may know, the UR5e’s TCP pose uses an axis-angle rotation vector (not Euler angles) in the following format:
[x, y, z, rx, ry, rz]
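For reference, converting the camera’s quaternion into the rotation-vector form the UR expects is straightforward with SciPy. A minimal sketch, assuming the camera reports the quaternion in scalar-last order [qx, qy, qz, qw] (SciPy’s default):

```python
import numpy as np
from scipy.spatial.transform import Rotation

def quat_to_rotvec(qx, qy, qz, qw):
    """Convert a scalar-last quaternion to a UR-style rotation vector [rx, ry, rz]."""
    # SciPy's from_quat expects [qx, qy, qz, qw] (scalar last) by default.
    return Rotation.from_quat([qx, qy, qz, qw]).as_rotvec()

# Example: a 90-degree rotation about the z-axis
rx, ry, rz = quat_to_rotvec(0.0, 0.0, np.sin(np.pi / 4), np.cos(np.pi / 4))
# [rx, ry, rz] ≈ [0, 0, pi/2]
```

Note this only converts the representation; it does not yet express the rotation in the robot’s base frame.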

My goal is to take the target data received from the camera, apply the necessary mathematical transformations, and send it to the robot so that the robot can track the target accurately.

I’ve already implemented the tracking for the position [x, y, z] using a rigid body transformation, which allows the robot to follow the target’s position correctly. However, during this process, I assigned a fixed arbitrary value for the robot’s rotation just to confirm that the position tracking was working as intended.
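The same rigid body transform you used for the position also has to be applied to the orientation: the camera-to-base rotation must be composed with the target’s rotation. A sketch of the full pose transform; the extrinsic calibration values here (R_base_cam, t_base_cam) are placeholders, not from your setup:

```python
import numpy as np
from scipy.spatial.transform import Rotation

# Assumed hand-calibrated extrinsic: camera frame expressed in the robot base frame.
# Placeholder values: camera rotated 180 deg about x, offset 1 m along base x.
R_base_cam = Rotation.from_rotvec([np.pi, 0.0, 0.0])
t_base_cam = np.array([1.0, 0.0, 0.0])

def camera_pose_to_base(pos_cam, quat_cam):
    """Map a target pose measured in the camera frame into the robot base frame."""
    R_cam_target = Rotation.from_quat(quat_cam)        # scalar-last [qx, qy, qz, qw]
    pos_base = R_base_cam.apply(pos_cam) + t_base_cam  # rigid-body position transform
    R_base_target = R_base_cam * R_cam_target          # compose rotations into base frame
    return pos_base, R_base_target.as_rotvec()         # [x, y, z], [rx, ry, rz]

pos, rotvec = camera_pose_to_base([0.0, 0.0, 0.5], [0.0, 0.0, 0.0, 1.0])
```

If the robot follows the position correctly but the orientation looks wrong, a missing rotation composition like the one above is a common culprit.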

The problem lies with the rotation part [qx, qy, qz, qw]. When I try to apply the rotation data from the camera, the robot’s movements are not natural and don’t track the target smoothly.

How can I mathematically or technically transform the rotation data so that the robot can track the target naturally? I’d appreciate any algorithm suggestions or solutions you can share.
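One common cause of jerky orientation tracking is sending raw, noisy per-frame rotations straight to the controller. A typical remedy is to low-pass filter the orientation by slerping only a fraction of the way from the robot’s current rotation toward each new target. A sketch, with the blend factor alpha as an assumed tuning parameter:

```python
import numpy as np
from scipy.spatial.transform import Rotation, Slerp

def smooth_rotation(r_current, r_target, alpha=0.2):
    """Slerp a fraction alpha from the current rotation toward the target rotation."""
    key_rots = Rotation.from_quat([r_current.as_quat(), r_target.as_quat()])
    slerp = Slerp([0.0, 1.0], key_rots)
    return slerp(alpha)

r_now = Rotation.identity()
r_goal = Rotation.from_rotvec([0.0, 0.0, np.pi / 2])  # 90 deg about z
r_next = smooth_rotation(r_now, r_goal, alpha=0.5)
# r_next.as_rotvec() ≈ [0, 0, pi/4]
```

Slerp interpolates on the rotation manifold, so the filtered motion stays smooth even when consecutive camera quaternions flip sign (q and -q represent the same rotation, which naive per-component filtering handles badly).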

Thank you!

Hi,

So you get the right rotations,
and it’s just a question of how to follow the generated poses?

If that’s true, you’ll have to show or explain your URScript code.