I’m running a UR5e cobot with a wrist cam; it’s supposed to pick up little sticks (sticks that look identical if you rotate them by 180°). Because of that, the orientation the camera recognizes sometimes differs by 180° as well.
In principle, that shouldn’t be a problem, because the last joint can be turned 720° in total. However, depending on which orientation the camera recognizes, the last joint can turn in the “wrong” direction and interrupt the program when it reaches its joint limit.
How can I prevent the cobot from doing this? As I see it, I can’t influence this behaviour through the waypoint I give the cobot, because it only contains the TCP position and orientation (which is the same if I turn the last joint by 360°).
Does anyone have a hint on how to proceed here? I’d be very glad.
With MoveL commands, the robot will rotate whichever way is shortest to get to the target position, but after a few cycles you can end up close to a joint limit as these offsets accumulate.
The best way to deal with this would be to teach a waypoint under a MoveJ command with the wrist 3 joint angle close to zero degrees before your camera capture position. This essentially “resets” the accumulated offset and should keep you far enough away from the joint limits to avoid the issue.
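For illustration, a minimal URScript sketch of that idea, to run right before the capture/pick sequence; the first five joint angles are placeholders for your own capture position, and only the wrist 3 value of 0 matters:

```
# Sketch only: drive wrist 3 back to 0 rad before the camera capture.
# The first five joint angles are placeholders for your own capture position.
capture_q = [d2r(0), d2r(-90), d2r(90), d2r(-90), d2r(-90), d2r(0)]
movej(capture_q, a=1.4, v=1.05)
# ... trigger the camera and compute the pick pose from this configuration ...
```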
Hey Jakob, as far as I understand it, this won’t help me:
I’m already running into the joint limits now (the robot is switched off before the limit is actually reached). Making the limits smaller will only make the observed behaviour appear even more often.
Hey ajp,
By camera capture position, do you mean the position where the error occurs?
Your suggested solution is that, before I pick something up, I turn the wrist into a kind of neutral position? I can do that.
Can you confirm that there is no way to force the cobot to take a certain trajectory in joint space if only a target pose is given?
I second @ajp’s suggestion that you use a MoveJ to ensure that, as a minimum, wrist 3 stays within the range -180° to 180°. In your case you can also use an approach like this: run get_inverse_kin on the target pose and on the same pose rotated by 180° about the tool axis, compare the results to the actual joint positions to evaluate which of the two solutions is closest, and afterwards run a forward kin on the selected solution.
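A rough URScript sketch of that idea (the function and variable names are mine, and target_pose is assumed to be the pick pose reported by the camera; treat this as an illustration, not a drop-in program):

```
# Sketch: choose between the two 180-degree-symmetric orientations of the
# stick and return the pose that keeps wrist 3 closest to where it is now.
def pick_closest_orientation(target_pose):
  # Alternative candidate: the same pose rotated 180 deg about the tool Z axis.
  flipped_pose = pose_trans(target_pose, p[0, 0, 0, 0, 0, d2r(180)])
  q_now = get_actual_joint_positions()
  # Inverse kinematics for both candidates, seeded near the current joints.
  q_a = get_inverse_kin(target_pose, qnear=q_now)
  q_b = get_inverse_kin(flipped_pose, qnear=q_now)
  # Compare only wrist 3 for simplicity; the other joints barely differ here.
  if fabs(q_a[5] - q_now[5]) <= fabs(q_b[5] - q_now[5]):
    q_sel = q_a
  else:
    q_sel = q_b
  end
  # Forward kinematics of the selected joint solution gives the pose to move to.
  return get_forward_kin(q_sel)
end
```

You could then feed the returned pose into your MoveL, or skip the forward kin and MoveJ directly to the selected joint vector.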
I see. get_forward_kin is a function that is not mentioned in my Script Manual; get_inverse_kin is all I have. Can you point me to a manual that includes it?
By capture position, I meant the position where the robot takes the picture of the surface. If that position is more than 180 degrees away from either end of the wrist 3 joint range, you shouldn’t need to worry about selecting a specific trajectory to get to the pick position (although it is possible, as @Ebbe explains).
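And regarding the earlier question about forcing a particular joint-space trajectory: since get_inverse_kin can be seeded with a nearby joint configuration and MoveJ accepts joint positions directly, one possible sketch (pick_pose is assumed to be the pose from the camera) is:

```
# Sketch: pin the joint-space solution yourself instead of letting MoveL
# choose the rotation direction. pick_pose is the pose from the camera.
q_target = get_inverse_kin(pick_pose, qnear=get_actual_joint_positions())
movej(q_target, a=1.4, v=1.05)
```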