Hello everyone
I have a problem when I want to approach a position with the TCP rotated 180 degrees on my UR10. I want it to turn exactly 180 degrees the other way when I move to my position, but it doesn't matter whether I enter plus or minus: it always turns in the same direction when approaching the position.
Pretty sure you need to use MoveJ. It should automatically use get_inverse_kin() for you if you pass a Pose as its first argument. If you were doing this in Polyscope I would say to just click the checkbox that says “use Joint Angles.”
Thank you for your answer. So with movel (linear) there is no way to control this? Because I have to approach the position once in one orientation and once rotated the other way, and all of this with a camera system, with everything offset in x, y, z by up to 50 cm or more in every possible direction. I have a workaround now, but it's not pretty and makes the whole script more confusing.
I would just give it a try; it should be as easy as changing the L to a J to see if it behaves how you want. Basically, a MoveL takes the TCP to the desired XYZ position but doesn't care what the actual joint positions are. This is why it doesn't distinguish between -180 and +180. It can also run you into trouble where the robot may try to exceed its ±360-degree joint rotation limit, because again, it doesn't really care about its joints; it's just doing whatever it can to get the TCP to the position.
A MoveJ, on the other hand (when passed joint positions), will actually respect the joint angles. You could program a point, rotate it 360 degrees on wrist 3 only, and save that as a second point. The robot will then sit there and spin the tool flange back and forth. A MoveL would just think it's already at the same pose each time and wouldn't do anything.
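To make the difference concrete, here is a minimal URScript sketch. It assumes a waypoint named "Start" has already been taught (so "Start_p" and "Start_q" exist) and that wrist 3 stays inside its ±360° limit after the extra turn:

movej(Start_q)                # drives to the taught joint angles
spun = Start_q
spun[5] = spun[5] + d2r(360)  # wrist 3 one full turn further; the TCP pose is unchanged
movej(spun)                   # the robot actually spins the tool flange
movel(Start_p)                # movel sees no difference in the TCP pose, so it does nothing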
Yes, I tried that today, unfortunately with the same result. movej behaves like movel if you change the position too much; at least that's how it was for me. But it's also possible that I made a mistake and that's why I got the same result. I've now managed it with a joint-angle query: I rotate out where I have space and then move to my position. Unfortunately, I can't test much more because I no longer have time for the project, but it's good to know for my next project. Thanks for your answer.
I don't know if you are aware, but when you teach a waypoint named "MyWaypoint", the program actually saves two variables:
A pose called “MyWaypoint_p”: p[x, y, z, rx, ry, rz]
A list of joint positions called “MyWaypoint_q”: [q1, q2, q3, q4, q5, q6]
By default, when you select "MyWaypoint", the program internally uses "MyWaypoint_p".
Considering this information, this is what I would do:
tempPos = MyPos
tempPos[5] = tempPos[5] - d2r(180) # or + d2r(180)
tempQ = MyPos_q # you have to type "MyPos_q" literally
# Then you edit the tempQ components as you need, depending on the movement, your TCP, etc.
tempQ[5] = tempQ[5] - d2r(90) # for example, if wrist 3 matters for the movement and you want to force it to rotate by decreasing its angle
tempQ[n] = (...) # whatever else you need
movej(get_inverse_kin(tempPos, tempQ)) # robot moves to the tempPos pose using the joint combination closest to tempQ
Maybe you could use a predefined list of joint positions from a different waypoint, e.g. "RefWaypoint", which is taught as a reference, or a manually entered list… Again, it depends on your application.
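As a hedged sketch of that variant (the name "RefWaypoint" is illustrative, and the rz adjustment follows the same pattern as the snippet above; pose_trans() could be used instead to rotate in the tool frame):

tempPos = MyPos_p
tempPos[5] = tempPos[5] + d2r(180)
tempQ = RefWaypoint_q                   # taught reference whose wrist is already on the desired side
movej(get_inverse_kin(tempPos, tempQ))  # IK picks the solution closest to the reference joints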
Yes, the application is really complex. I'll give you a brief overview.
Two stations where glasses are depalletized from pallets onto a conveyor belt.
The problem is that it's not a vacuum gripper; it clamps the glasses. And it's a matter of millimeters, which is why the camera is needed.
The pallets are of different heights and the glasses also have different heights.
And then there is the cardboard in each layer, also with different thicknesses.
And then there are two different patterns in which the glasses are arranged, which the camera needs to detect.
That's why I need the 180-degree rotation, depending on the pattern.
The pallets are so bad that I need an offset of 70 mm in each direction and two sensors, so that the gripper can adapt to the tilted position of the pallet.
The robot should then also recognize how many layers are left. And so on.
That is practical if you abort, or if the pallet is not stacked as high, so that the robot can resume from there.
Then there are light-curtain sensors, flashing buttons to cancel and start, and a UPS that has to react to a power failure.
Luckily, apart from the UPS and some small things, I'm almost done now. The Python script now has over 2000 lines of code.
But I’m really glad I managed to do it that way. And I was able to learn a lot from it.
But thanks for your answer. I'll take a closer look when I have more time.
Your case sounds like a very complex palletizing application, and I really hope you made it work by turning the gripper for the pick-up. For future reference, it's worth noting that there are URCaps available on UR+ that offer optimized gripper orientation at pick-up as a feature.
The Pally URCap will calculate the best optimized grip for each box, available on UR+: UR+ | Pally Palletizing Software - Universal Robots. The gripper optimization has 4 different variables (that can be turned on and off): InFront, InLeft, InBack and InRight. Learn more here: https://rocketfarm.atlassian.net/wiki/spaces/PB/pages/379125794/Optimize+gripper+at+pickup+-+Gripper+Orientation