Move the robot to a position sent from the PLC

Hello, community.

This is my first post here, so please, be gentle… :slight_smile:

I’ve been given the task of programming a UR10 robot, which must move to points
sent to it by a Siemens PLC over PROFINET.

We have successfully established a PROFINET connection between the PLC and the UR;
in the PLC we can see all the input registers, and we can also set the output registers.

This UR will be mounted on a machine that can produce different pieces,
and the tooling on the machine is changed for every different piece.
That means there will be different points (waypoints, poses or locations)
for every machine tool (not the tool/gripper on the UR) from which the UR will pick
the pieces, and also different place locations where the UR will put the pieces down.

So we thought we would make recipes in the PLC, where we will store all
the points for each robot move for each different machine tool.

Because we can see the current TCP position (X, Y, Z, Rx, Ry, Rz) over the PROFINET registers, we can save a
point p[X,Y,Z,Rx,Ry,Rz] at any moment we desire.
Then we can send these points back to the UR over the PROFINET registers
when they are needed and use ‘movel’ to move the robot to that point.

So far so good…

In our workshop, when we attach the UR to the table, the whole concept works.
Because the robot’s current TCP position (X, Y, Z, Rx, Ry, Rz) reported over the PROFINET registers is relative to the base,
we can send these positions back, calculate the difference between the current TCP pose and the wanted location, and use that with the ‘movel’ script command.
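
On the flat table, the script side of this is essentially just the following (a simplified sketch; the pose values are placeholders, not our real recipe data):

   # Pose received back from the PLC, expressed in the robot base frame (placeholder values)
   plc_pose = p[0.4, -0.2, 0.3, 0.0, 3.14, 0.0]
   # The registers report the TCP pose in base coordinates, so with the robot mounted flat
   # the saved pose can be used directly as a movel target
   movel(plc_pose, a=1.2, v=0.25)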

But the catch is that in the real situation (at the customer), the UR will be mounted on top of the machine at roughly a 45° or 50° angle.
And when we mount the UR at a 45° angle on the table in our workshop, the whole concept collapses.

I’ve been trying for a whole week now to make this work but I’m failing.

In the Installation tab, in the ‘Mounting’ section, I’ve set the correct mounting orientation and angle of the robot.
I’ve also created a plane in the ‘Features’ section which has X and Y in the same direction as the base
(red and green arrows pointing the same way as on the base, and the blue arrow pointing DOWN).

I’ve been reading the forum about pose_trans, base coordinates, TCP coordinates and planes,
and trying all sorts of things, but I have to admit that all of this is new to me and I don’t understand it… :frowning:

I understand that I need to do some transformation of the points that are sent from the PLC,
so that the robot moves relative to the plane (which is flat, like the table; so when I want to move it up in Z it must go straight up, not diagonally).
But for God’s sake, I don’t know how to do it or what to do.

That’s why I’m asking the community for help.

Can somebody explain (in layman’s terms or simple as possible) to me and/or show me a piece of program/code how this should/can be done, please?

Thank you in advance…

Hi @anton.smerc,

It sounds like you’re on the right track, but your robot’s base coordinate system is fixed to the orientation of the robot mounting. Adjusting the mounting configuration only helps the robot predict which way gravity is going to act; it doesn’t do anything with positions.

You will need to create a feature coordinate system aligned to the horizontal workbench surface. Then, within PolyScope, you can select this Feature under your move command instead of Base.
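
In script terms that corresponds to roughly the following (just a sketch; ‘Plane_1’ is a hypothetical feature name and the target values are examples):

   # 'Plane_1' is the feature pose expressed in base coordinates.
   # 'target_in_plane' is the desired pose expressed relative to that plane.
   target_in_plane = p[0.1, 0.1, 0.05, 0.0, 3.14, 0.0]
   # pose_trans() chains the two transforms, producing the target in base coordinates,
   # which is what movel() ultimately works with.
   movel(pose_trans(Plane_1, target_in_plane), a=1.2, v=0.25)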

If you haven’t already gone through the UR Academy e-learning, I’d say that’s a good place to start. There’s a module specifically about feature coordinates in the e-Series pro track that should get you well on your way with this:

https://academy.universal-robots.com/free-e-learning/e-series-e-learning/e-series-pro-track/

Hope this helps (and was sufficiently gentle :slight_smile:), feel free to come back with remaining questions.

Thank you @ajp for your answer.
I’ve seen all the videos in the UR Academy (OK, it was a few years ago, but I rewatched the Features videos again) and I think I understand how the planes work.
But that still doesn’t help me solve my ‘problem’… :slight_smile:
I’m using a CB3 UR10.
Over the general purpose float output registers on PROFINET, the robot is sending the current TCP position (X, Y, Z, Rx, Ry, Rz) relative to the base:


The PLC saves these positions when needed and then (again, when needed) sends a position back to the robot. But these are, again, relative to the base.
In my test program on the robot, I have a thread where I read the float input registers (where the PLC is sending the position) and create a p[X, Y, Z, Rx, Ry, Rz].

  Thread_2
     Loop GPbi_start and var_5
       var_5≔ False 
       X_in≔GPfi_0
       Y_in≔GPfi_1
       Z_in≔GPfi_2
       Rx_in≔GPfi_3
       Ry_in≔GPfi_4
       Rz_in≔GPfi_5
       tocka_cilj9≔p[X_in,Y_in,Z_in,Rx_in,Ry_in,Rz_in]
     sync()
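
For reference, I think the script equivalent of this thread is roughly the following (a sketch only; I’m assuming the PROFINET general purpose float registers GPfi_0…GPfi_5 map to read_input_float_register() indices 0…5):

   thread read_plc_pose():
     while (True):
       # Read the pose sent by the PLC from the general purpose float input registers
       X_in = read_input_float_register(0)
       Y_in = read_input_float_register(1)
       Z_in = read_input_float_register(2)
       Rx_in = read_input_float_register(3)
       Ry_in = read_input_float_register(4)
       Rz_in = read_input_float_register(5)
       # Make the pose visible to the rest of the program
       global tocka_cilj9 = p[X_in, Y_in, Z_in, Rx_in, Ry_in, Rz_in]
       sync()
     end
   end
   thrd = run read_plc_pose()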

Then I use this variable ‘tocka_cilj9’ in the main program.

   Robot Program
     MoveL
       Waypoint_4
       tocka_cilj9

The MoveL command is set to my custom plane, not to the base.
‘Waypoint_4’ is the first point, which I created in PolyScope, and ‘tocka_cilj9’ is my ‘custom’ point that I get from the PLC.
When I run this, the robot goes to the first point fine, but then it wants to go somewhere totally unexpected and I have to stop it.

I understand that this custom point ‘tocka_cilj9’ must somehow be converted into coordinates for my custom plane (although I thought a MoveL command set to the plane would do that), but where and how?

Or maybe there is another way: if the robot could send the current TCP position relative to my plane (not to the base) to the PLC, then I think I could use that directly in the MoveL command?
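
For that second idea, I was thinking of something like this (just a sketch; I’m assuming write_output_float_register() feeds the general purpose float output registers that the PLC reads, and that the indices below are free):

   # Current TCP pose in base coordinates
   tcp_base = get_actual_tcp_pose()
   # Re-express it relative to my plane feature (Plane_2 holds the plane pose in base coordinates)
   tcp_in_plane = pose_trans(pose_inv(Plane_2), tcp_base)
   # Send the plane-relative pose to the PLC over the float output registers
   write_output_float_register(0, tcp_in_plane[0])
   write_output_float_register(1, tcp_in_plane[1])
   write_output_float_register(2, tcp_in_plane[2])
   write_output_float_register(3, tcp_in_plane[3])
   write_output_float_register(4, tcp_in_plane[4])
   write_output_float_register(5, tcp_in_plane[5])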

Hi Anton,

Where is the PLC getting these positions from? Were they recorded from the robot when the base was flat, and now that the base has moved you still want to go back to where those positions were before it moved?

What you could do is create a custom feature frame that represents the original base position, and transform your points into that frame.

The section at the bottom of this article shows how to convert from base to feature frame in script:

https://www.universal-robots.com/articles/ur/programming/urscript-move-with-respect-to-a-custom-featureframe/
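
For example, something along these lines (a sketch only; ‘OriginalBase’ is a hypothetical feature you would teach to represent where the base used to be, and the recorded pose values are placeholders):

   # A pose that was recorded while the base was still in its original position,
   # so it is expressed in that old base frame (placeholder values)
   recorded_pose = p[0.4, -0.2, 0.3, 0.0, 3.14, 0.0]
   # 'OriginalBase' describes the old base frame relative to the current base frame.
   # pose_trans() maps the recorded pose into current base coordinates for movel().
   movel(pose_trans(OriginalBase, recorded_pose), a=1.2, v=0.25)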

Does that help?

The PLC is constantly reading the current TCP position over PROFINET:


And the position is recorded (saved in the recipe on the PLC) when a button is pressed on the Siemens HMI panel.
And the robot is mounted at a 45-degree angle.
So no, the position is not recorded when the base was flat…

I have seen that article and tried it, but no matter how I do the pose_trans, the robot won’t move to the position sent from the PLC…

This is my very simple test program:


The plane is created:

and this is what is happening:
robot_video
The second point (tocka_cilj9) is approximately 30 cm below Waypoint_4.
But the robot instead goes somewhere completely different…

And this is what the robot gets from the PLC:

I found the .script file of this simple program on the robot’s FTP:


All these numbers in the first movel (for Waypoint_4) are already expressed relative to plane_2.

Why are they passed through pose_trans again?

Hi @anton.smerc, it seems I misunderstood the problem in the first place. In this case you shouldn’t need to do any transformation: if you’re recording the values through the fieldbus (which are represented in the base frame), then you should just be able to put them back into a pose variable as you are doing, and then execute it under a move command with Base selected as the feature. Is this not working?
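
In script that would be as simple as this (a sketch, reusing the variable name from your thread):

   # 'tocka_cilj9' was built from the fieldbus register values, which are already in base
   # coordinates, so it can be passed straight to movel() with no feature transform applied
   movel(tocka_cilj9, a=1.2, v=0.25)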

No, it’s not working… because the robot is at a 45-degree angle and the Base feature doesn’t ‘know’ that.
That is why I created ‘plane_2’, which is flat (horizontal), and the robot must move relative to this plane…

Hi @anton.smerc,

Your previous video showed the outcome when Plane_2 was selected, right? Does it behave differently when you have the base frame selected?

If you save the values with the base frame selected, and play them back in base frame, the 45 degree angle shouldn’t make any difference. Is the problem the position of the final points? Or the robot not approaching them perpendicular to the horizontal plane?

I can see from your screenshot above that the positions of home1 and tocka_cilj9 are almost identical. Does the robot go to the same position for both?

Hello,

I have had the same issue when the robot is mounted at 45 degrees.
My solution was to use fixed Rx, Ry, Rz values… but I would really like to pass a rotation based on a vision coordinate.

When I pass a pose from the vision>PLC>robot, it seems like the move uses the base that I select, but not the rotations. Is there a fix for this?

Thanks!
Courtney

I can’t see the right values. The tool values on the robot are X=-844.40, Y=131.86, Z=510.61, RX=1.624, RY=3.946, RZ=-1.624.
The PLC input TCP values are X=-0.8440, Y=0.13186, Z=0.51061, RX=-0.61078, RY=-1.4840, RZ=-0.61075.
The RX, RY and RZ are different.
Any ideas?
Thanks