Hi everyone, I've been using the UR3E for the past couple of months and I'm working on a simple ball-balancing platform. A camera detects the ball, an image-processing pipeline extracts its XY coordinates, and those are fed to a PID controller which drives the robot through the RTDE library.
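For context, this is roughly what the control loop looks like. This is a simplified sketch, assuming the ur_rtde Python bindings for the RTDE side; `get_ball_xy()` is a stand-in for my vision pipeline, and the gains, poses, and robot IP are placeholders, not my actual values:

```python
import rtde_control
import rtde_receive

ROBOT_IP = "192.168.1.10"  # placeholder: robot or URSim address


class PID:
    """Simple single-axis PID controller."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, err):
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv


rtde_c = rtde_control.RTDEControlInterface(ROBOT_IP)
rtde_r = rtde_receive.RTDEReceiveInterface(ROBOT_IP)

dt = 0.02                                # 50 Hz loop (illustrative)
pid_x = PID(kp=0.8, ki=0.0, kd=0.3, dt=dt)
pid_y = PID(kp=0.8, ki=0.0, kd=0.3, dt=dt)
base_pose = rtde_r.getActualTCPPose()    # [x, y, z, rx, ry, rz] of the level platform

while True:
    # hypothetical helper: ball position in metres from the platform centre
    ball_x, ball_y = get_ball_xy()
    tilt_about_y = pid_x.update(0.0 - ball_x)  # tilt about Y to correct the X error
    tilt_about_x = pid_y.update(0.0 - ball_y)  # tilt about X to correct the Y error

    pose = list(base_pose)
    pose[3] = base_pose[3] + tilt_about_x
    pose[4] = base_pose[4] + tilt_about_y
    # servoL(pose, speed, accel, time, lookahead_time, gain) for smooth tracking
    rtde_c.servoL(pose, 0.25, 1.2, dt, 0.1, 300)
```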
Is it possible to do this development in URSim? Basically I need to attach a 30 x 30 cm square platform to the robot, place a ball on it, and get real-time access to the ball's XY coordinates. For that to work, the simulator needs to simulate physics, including gravity. Is this possible with any simulator? I also want to control the robot from a script in PolyScope, i.e. a simulated PolyScope like the one I already have running in Docker, but that doesn't actually simulate the physics of the robot or the ball.
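To make "physics-aware with real-time ball XY" concrete, here is the kind of thing I mean, sketched in PyBullet purely as an example of a physics engine (this is not a setup I have; the platform is a static box standing in for the tool-mounted plate, and all dimensions and masses are placeholders):

```python
import time

import pybullet as p
import pybullet_data

p.connect(p.DIRECT)  # headless; use p.GUI to visualise
p.setAdditionalSearchPath(pybullet_data.getDataPath())
p.setGravity(0, 0, -9.81)
p.loadURDF("plane.urdf")

# 30 x 30 cm platform, 1 cm thick, as a simple static box
platform_col = p.createCollisionShape(p.GEOM_BOX, halfExtents=[0.15, 0.15, 0.005])
platform = p.createMultiBody(baseMass=0,
                             baseCollisionShapeIndex=platform_col,
                             basePosition=[0, 0, 0.30])

# small ball resting slightly off-centre on the platform
ball_col = p.createCollisionShape(p.GEOM_SPHERE, radius=0.02)
ball = p.createMultiBody(baseMass=0.05,
                         baseCollisionShapeIndex=ball_col,
                         basePosition=[0.05, -0.03, 0.33])

for _ in range(240):
    p.stepSimulation()
    (x, y, z), _ = p.getBasePositionAndOrientation(ball)
    print(f"ball XY: {x:.3f}, {y:.3f}")  # the real-time feedback my controller needs
    time.sleep(1 / 240)
```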
How should I go about this? I'd appreciate any suggestions, thank you!
I’ve been tinkering with UR5 control and just wrapped up a Unity-based VR setup that lets you teleoperate the UR5 arm using a Meta Quest controller. It handles real-time pose tracking, gripper actions for picking/placing objects, and smooth joint interpolation via TCP to a Python RTDE server. Super intuitive for immersive robot ops!
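Roughly, the Python side is a small TCP bridge: Unity streams target poses, and the server forwards them to the arm. This is a stripped-down sketch, assuming the ur_rtde bindings on the robot side; the newline-delimited JSON message format, port, and IP here are placeholders rather than my exact protocol:

```python
import json
import socket

import rtde_control

ROBOT_IP = "192.168.1.10"   # placeholder: UR5 or URSim address
LISTEN_PORT = 5005          # placeholder: port the Unity client connects to

rtde_c = rtde_control.RTDEControlInterface(ROBOT_IP)

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("0.0.0.0", LISTEN_PORT))
server.listen(1)
conn, _ = server.accept()   # Unity connects here

buf = b""
while True:
    data = conn.recv(4096)
    if not data:
        break
    buf += data
    while b"\n" in buf:                        # one JSON message per line (placeholder format)
        line, buf = buf.split(b"\n", 1)
        msg = json.loads(line)
        pose = msg["pose"]                     # [x, y, z, rx, ry, rz] from the Quest controller
        # servoL keeps the motion smooth; parameters are illustrative
        rtde_c.servoL(pose, 0.25, 1.2, 0.01, 0.1, 300)
```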
If you’re hiring for roles in robotics software, VR/AR integration, or automation engineering, I’d love to chat – hit me up with leads or openings. Open to remote/full-time gigs!
I don't understand how you set up the Unity simulation environment. Is there a guide you followed? I'm talking about a fully simulated setup, i.e. the PolyScope that comes with the URSim Docker image, which you view over VNC in a browser. How do you connect that to your Unity project, and what kind of setup do you have in Unity?