Universal Robots Forum

Setup Camera/Webcam with URCaps

Hi!

I'm going to do a project that uses a webcam or camera to feed data to the robot. The camera will search for an object and transmit data, such as its position or coordinates, to the robot.

Is it possible to make a URCap to set up some parameters of the camera? And how can I send this data from the camera to the robot? With sockets, or would XML-RPC be better?

Thanks for your answer and advice



That depends on the interface to the camera.
But from the Java layer in the URCap, e.g. in the installation node, it shouldn't be an issue to create a socket or similar TCP connection to the camera.
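For example, a minimal sketch of such a TCP connection from the Java side (the host, port, and the "GET_POSITION" command are placeholders for whatever protocol your camera actually speaks):

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.Socket;

public class CameraClient {

    // Opens a TCP connection, sends one request line, and returns one reply line.
    // "GET_POSITION" is a made-up command; replace it with your camera's protocol.
    public static String queryPosition(String host, int port) throws Exception {
        try (Socket socket = new Socket(host, port)) {
            PrintWriter out = new PrintWriter(socket.getOutputStream(), true);
            BufferedReader in = new BufferedReader(
                    new InputStreamReader(socket.getInputStream()));
            out.println("GET_POSITION");
            return in.readLine(); // e.g. an "x,y,z" coordinate string
        }
    }
}
```

You would call this from your installation node contribution whenever the robot needs fresh coordinates.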

Thanks Jacob!

Regarding the image captured by the webcam, is it possible to show this image in PolyScope? Or only on a remote computer?

It would be awesome to show the real-time image from the webcam in PolyScope, just to move or calibrate the camera into the correct position.

Thanks for your time.

Indeed, this would be possible.
In your InstallationNodeContribution, you should get the actual snapshot from the camera (as a still image), and then show this image in the UI.
In openView(), or a sub-call thereof if the image should not be shown in all parts of the installation node, you can start a Timer that periodically retrieves the latest image and updates the view with it.
Then you should remember to stop the Timer again nicely when the closeView() call comes.

There is an example of this Timer in “MyDaemonInstallationNodeContribution.java” in the MyDaemon example.
You should find a reasonable update rate as a compromise between memory/CPU consumption and liveness; e.g. 2-5 Hz could be reasonable.
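A minimal sketch of that pattern (CameraPreview, the snapshot file, and the 3 Hz rate are illustrative choices, not the SDK's API; the real openView()/closeView() calls come from your InstallationNodeContribution):

```java
import java.awt.event.ActionEvent;
import java.awt.event.ActionListener;
import java.awt.image.BufferedImage;
import java.io.File;
import javax.imageio.ImageIO;
import javax.swing.ImageIcon;
import javax.swing.JLabel;
import javax.swing.Timer;

public class CameraPreview {

    private final JLabel imageLabel = new JLabel();
    private final File snapshotFile;
    private Timer uiTimer;

    public CameraPreview(File snapshotFile) {
        this.snapshotFile = snapshotFile;
    }

    // Call from openView(): poll the latest snapshot at roughly 3 Hz
    public void openView() {
        uiTimer = new Timer(333, new ActionListener() {
            public void actionPerformed(ActionEvent e) {
                refreshSnapshot();
            }
        });
        uiTimer.start();
    }

    // Call from closeView(): stop the timer so it does not keep running
    public void closeView() {
        if (uiTimer != null) {
            uiTimer.stop();
        }
    }

    // Re-read the snapshot file and push the new frame into the label
    void refreshSnapshot() {
        try {
            BufferedImage img = ImageIO.read(snapshotFile);
            if (img != null) {
                imageLabel.setIcon(new ImageIcon(img));
            }
        } catch (Exception ex) {
            // snapshot not written yet; keep showing the previous frame
        }
    }

    public JLabel getLabel() {
        return imageLabel;
    }
}
```

Note that javax.swing.Timer fires its callback on the Swing event thread, so the label can be updated directly from it.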

Thank you Jacob, I will try it :wink:

By the way, how can I pass the actual snapshot from the camera to Polyscope?

Would I have to write the actual image to the correct path in order to read it periodically?

Well, there would be quite a few ways to get an image and show it in Java Swing.

It probably involves reading a BufferedImage using ImageIO, and showing this in a JLabel or JPanel.

The JLabel has a constructor and a method that allow you to set an Icon-class object.
First hit I found on StackOverflow.
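Roughly like this (a small sketch; LabelImageDemo and the method names are just illustrative):

```java
import java.awt.image.BufferedImage;
import javax.swing.ImageIcon;
import javax.swing.JLabel;

public class LabelImageDemo {

    // JLabel's constructor accepts an Icon, so a BufferedImage can be
    // shown by wrapping it in an ImageIcon
    public static JLabel labelFor(BufferedImage image) {
        return new JLabel(new ImageIcon(image));
    }

    // setIcon() swaps in a new frame later, e.g. from a Timer callback
    public static void update(JLabel label, BufferedImage image) {
        label.setIcon(new ImageIcon(image));
    }
}
```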

I tried to implement the timer so that it reads an image from a selected path. This image is overwritten periodically when I send it via socket from the webcam.

But it doesn't work. The image shown is always the same; it doesn't refresh. I don't know if I'm doing something wrong. I first tried to read the image continuously with an HTML display.

Maybe with Java Swing it could work. The only thing I don't understand is how I can read a new image every time the timer runs.

Simple code:

  1. In openView() start the timer, and stop it in closeView().
  2. Every time the timer runs, it executes:
    BufferedImage image=ImageIO.read(getClass().getResource("/com/ur/urcap/sample/ESRegistros/impl/logo.bmp"));
    Graphics2D g = image.createGraphics();
    g.drawImage(image,0,0, width, height, null);

But the logo.bmp image is always the same, even if I overwrite it…

I think the problem you have is that the picture does not change because you don't really change the path dynamically and you don't set a new picture using the ImgComponent. So what you can do is:

Try looking at the example called MyDaemon that we have provided through the SDK.
This example makes it possible to update the UI dynamically after some time (defined by a Timer): it calls an updateUI() method every 1000 milliseconds.

Inside the run() method you can call your own method to update the UI. The method you call could be one that gets a new BufferedImage with a new image path, which could even be randomized. You can then set the return value as the new path for the BufferedImage.

I hope it helped. It's not the exact code, but you can try something like this.
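Stripped of the screenshots, the pattern described above can be sketched roughly like this (UiPoller and the Runnable parameter are illustrative names; the MyDaemon example does the equivalent inside its contribution class):

```java
import java.awt.EventQueue;
import java.util.Timer;
import java.util.TimerTask;

public class UiPoller {

    private Timer uiTimer;

    // Start a background timer that runs updateUI on the Swing thread every periodMs
    public void start(final Runnable updateUI, long periodMs) {
        uiTimer = new Timer(true); // daemon thread, as in the MyDaemon example
        uiTimer.schedule(new TimerTask() {
            @Override
            public void run() {
                // Swing is single-threaded: marshal the UI update onto the EDT
                EventQueue.invokeLater(updateUI);
            }
        }, 0, periodMs);
    }

    public void stop() {
        if (uiTimer != null) {
            uiTimer.cancel();
        }
    }
}
```

The Runnable you pass in would be the method that re-reads the image and sets it on the component.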


Thanks for your answer, indeed it would help :blush:

My idea is to send an image via socket to the path and overwrite it periodically. The new image has the same name, so it's not necessary to change the path. I thought that if I read it as a BufferedImage, it would change…

Maybe, once I generate the .urcap file, it contains all the elements and it's not possible to add new ones. Am I wrong?

Anyway, thanks for your help !

No, you are right. The .urcap file is an OSGi bundle, and once it is generated it is not easy to change. I've worked with it before, where I had to regenerate the file every time I made a change to it. I don't know yet if it is possible to make a change or an update to these during runtime. If I figure it out, I will post right away.
Thanks for being patient.


Hi there!

I've been developing my URCap. My main goal is to display the actual image of the webcam/camera through sockets. I've created a ByteArrayOutputStream to send the image captured from my cam. This byte array is read in the URCap, which then sets it on a label that contains the image.

My problem is that I've tried and tested this socket in the VM and it works perfectly. It receives the image every second, sets the label, and then I refresh it in the NodeView in order to show the actual image.

But it doesn't work on the real UR. I've tried to modify the refresh rate and the timer that processes the input image, but it doesn't work. Any idea? I receive a pixelated image.

This is part of the code to receive the image:

// The image is received via the data stream input and written to the directory

	byte[] imagenrecibida = new byte[62100];
    System.out.println("Stream recibido");
    BufferedImage imagen = ImageIO.read(new ByteArrayInputStream(imagenrecibida));
    System.out.println("Imagen guardada como BufferedImage");
 	return imagen;

Code in NodeView to refresh the label:

	private void refreshPanelLabel(final JLabel label) {
		timer = new Timer(0, new ActionListener() {
		public void actionPerformed(ActionEvent e){
		// Refresh the panel at 60 FPS


public void onStartClick() {

		uiTimer = new Timer(true);
		uiTimer.schedule(new TimerTask() {

			public void run() {
				EventQueue.invokeLater(new Runnable() {

					public void run() {
						// The connection stays valid as long as it is maintained. If the timeout fires, it stops
						enviarthreshold.sendNow(getIpAddress(), getThreshold(),getOtsuact(),getErode(),getDilate());
						BufferedImage image = obtenerimagen.readNow(getIpAddress()); // Establishes the connection and returns the image from the socket

		}, 0, 1500); // Every 1.5 seconds, receive and set the image

Hi everyone!

I've been trying hard to fix this issue, but I don't know what else it could be. I've got some questions:

When I send a byte array through a socket, is there a limit?
Why does sending the image and adding it to a JLabel work fine in the VM but not on the real robot?
Any idea how to resolve the flickering?

The code is the same as last time…

Thanks for your time

Since it receives a pixelated image, it might be the way you're formatting/encoding the message you send through the socket, or the way you decode it. It is hard to say, but you can read about byte arrays sent through sockets using this link: https://s3-eu-west-1.amazonaws.com/ur-support-site/45989/scriptManual.pdf

And maybe look at this topic: Java socket communication

Hope it helped!


I'm pretty sure that I send and receive the image correctly. Note that I've tried it in the virtual machine and it works just fine. When I turn it on and send the first image, the robot draws it without any problem.

I send a new image every 2 seconds, and at that moment the robot draws it wrong. Thus I think the message arrives fine.
Maybe it's when I refresh the image? I create a new Icon every time and then I set this new icon on the JLabel…

I found the solution.

It seems you cannot read a byte array bigger than 32 bytes in a single call. With a "for" loop sending one byte at a time, it works and draws it perfectly.
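For what it's worth, the apparent "32-byte limit" is probably InputStream.read(byte[]) returning only the bytes that have arrived so far, rather than filling the whole buffer. Looping until the expected frame length has been read avoids sending one byte at a time; a sketch of that, assuming the frame length is known up front (FrameReader and readFrame are hypothetical names):

```java
import java.io.IOException;
import java.io.InputStream;

public class FrameReader {

    // InputStream.read(byte[]) may return after only part of the data has
    // arrived; loop until the whole frame of `length` bytes is in the buffer.
    public static byte[] readFrame(InputStream in, int length) throws IOException {
        byte[] frame = new byte[length];
        int off = 0;
        while (off < length) {
            int n = in.read(frame, off, length - off);
            if (n < 0) {
                throw new IOException("stream closed before the full frame arrived");
            }
            off += n;
        }
        return frame;
    }
}
```

Wrapping the stream in a DataInputStream and calling readFully(frame) does the same thing in one line.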

Thank you for your time guys :blush:
