Unity webcam not pausing?

Hey all,

I’m currently following the Unity documentation about pausing a webcam (with some slight adjustments as I have 2 cameras in my laptop) and for some reason the webcam does not pause. I’m fairly new to Unity but I’ve fooled around with it for a couple of days and cannot seem to get this thing to work. I’ve tried implementing it with buttons too but I got the same result.

Could someone point me in the right direction?

Here is the code (almost identical to the example in the documentation):

// Starts a camera and assigns the texture to the current renderer.
// Pauses the camera when the "Pause" button is clicked and released.
using UnityEngine;
using System.Collections;

public class TestGUI : MonoBehaviour
{
    public WebCamTexture webcamTexture;

    void Start()
    {
        webcamTexture = new WebCamTexture();
        Renderer renderer = GetComponent<Renderer>();
        renderer.material.mainTexture = webcamTexture;
        WebCamDevice[] devices = WebCamTexture.devices; // gets all cameras
        webcamTexture.deviceName = devices[1].name; // Select back camera
        webcamTexture.Play();
    }

    public void OnGUI()
    {
        if (webcamTexture.isPlaying)
            if (GUILayout.Button("Pause"))
                webcamTexture.Pause();

            else if (GUILayout.Button("Play"))
                webcamTexture.Play();
    }
}

Hi!

I’ve just tried your script and it works fine on my system (OSX, most recent 2020.1 alpha); I have 2 webcams as well. I’ve added your script to a plane 3D object, and the plane correctly receives the webcam content after I start playmode. And when I click the Pause button, the currently displayed image on the plane stops updating.

Now, the example code in the doc doesn’t have the best structure because once Pause has been pressed, both buttons disappear (they are displayed only if the webcam texture is playing …), which means “Play” never re-appears.

So a slightly better implementation for OnGUI would be

if (webcamTexture.isPlaying)
{
    if (GUILayout.Button("Pause"))
        webcamTexture.Pause();
}
else if (GUILayout.Button("Play"))
    webcamTexture.Play();

With this, I can correctly and repeatedly pause and play the webcam; I tried with both the first and the second webcam. But this doesn’t address the fundamental problem you’re describing, which is that the webcam isn’t pausing.

  • First thing to check would be to see if things work out correctly with the first webcam.
  • You may want to print out all the device names returned from the API, just on the off chance that they’re not ordered in the way you expect.
  • For experimentation, also consider unplugging the 2nd webcam to just keep one, if at all possible.
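To go with the second suggestion, here is a minimal sketch that logs every camera the API reports, so you can verify that devices[1] is really the camera you expect (the class name ListWebcams is just made up for this example):

    using UnityEngine;

    public class ListWebcams : MonoBehaviour
    {
        void Start()
        {
            // Log each reported webcam with its index, so you can see
            // which index corresponds to which physical camera.
            WebCamDevice[] devices = WebCamTexture.devices;
            for (int i = 0; i < devices.Length; i++)
                Debug.Log("Webcam " + i + ": " + devices[i].name);
        }
    }

Drop this on any GameObject, enter playmode, and check the Console to see whether the ordering matches your assumption.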

From your description, it also seems that the call to Play() does work, so something unexpected is definitely happening. I’d recommend you submit a bug with your project, as it may be something specific to the platform or Unity version you’re using.

Thanks for reporting this and hope we can iron this out quickly for you,

Dominique
A/V developer at Unity.


Hey Dominique, thanks for the reply; yep, it turns out it was something unrelated to the script. I much prefer your button implementation as well, so I might borrow that :smile:.

One issue that I am having is that I can only seem to get it to work on cube objects.

When using a quad, I can display the webcam, but I cannot pause it; I get the following error: “UnassignedReferenceException: The variable webcamTexture of WebcamScript has not been assigned”, yet it works fine for cubes.

For Image, RawImage and Panel the webcam does not display at all, only showing a white screen.

Hi again!

I just tried over here with a quad and am getting the same working result as with my previous object (a plane). The message you are reporting would be consistent with webcamTexture never having been initialized. Maybe you restructured your logic in Start() and are no longer assigning the data member? Or, if an exception is thrown during Start() before execution reaches the point where webcamTexture is assigned (but after playback has been started, say on a local variable), that would also explain what you observe. In OnGUI, you’d then end up using an uninitialized variable.
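One defensive option, sketched here on the assumption that the field really can end up unassigned, is to guard OnGUI so the failure is reported explicitly instead of throwing:

    void OnGUI()
    {
        // Bail out early if Start() never assigned the texture; this
        // replaces the UnassignedReferenceException with a visible label.
        if (webcamTexture == null)
        {
            GUILayout.Label("webcamTexture was never assigned");
            return;
        }

        if (webcamTexture.isPlaying)
        {
            if (GUILayout.Button("Pause"))
                webcamTexture.Pause();
        }
        else if (GUILayout.Button("Play"))
            webcamTexture.Play();
    }

If the label shows up, you know the problem is in Start(), not in the GUI code.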

As for using your TestGUI script on a RawImage, you need to modify it, since RawImage doesn’t have the Renderer component your script expects. So in Start(), you’d instead set the RawImage’s source texture using

GetComponent<RawImage>().texture = webcamTexture;

A RawImage lives in a Canvas, and as such its rendering is not handled by the same mechanism as 3D objects like cubes, planes, and quads, so it doesn’t have a Renderer component. The CanvasRenderer you do see is, perhaps confusingly, not derived from Renderer, unlike the MeshRenderer component on 3D objects.
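Putting that together, a minimal RawImage variant of the script might look like this (a sketch only; the class name WebcamRawImage is made up, and device selection is left at the default):

    using UnityEngine;
    using UnityEngine.UI;

    // Attach to a RawImage inside a Canvas. Uses the default webcam;
    // pick a specific device name if you have more than one camera.
    public class WebcamRawImage : MonoBehaviour
    {
        WebCamTexture webcamTexture;

        void Start()
        {
            webcamTexture = new WebCamTexture();
            GetComponent<RawImage>().texture = webcamTexture;
            webcamTexture.Play();
        }
    }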

And finally, putting the webcam on a Canvas Image or Panel would probably require an indirection through a material, since these don’t expose a texture field that can be set directly. At that point, displaying a webcam on these objects is no different from displaying any other kind of texture, so I suggest you look around for examples of what you want to do and then adapt them to use a webcam instead of a regular texture.
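As a very rough sketch of that material indirection (assuming the Image’s shader samples the main texture, which is true for the default UI shader; WebcamImage is a made-up name, and this is untested):

    using UnityEngine;
    using UnityEngine.UI;

    // Attach to an Image. Routes the webcam through the Image's material,
    // since Image has no texture field of its own.
    public class WebcamImage : MonoBehaviour
    {
        WebCamTexture webcamTexture;

        void Start()
        {
            webcamTexture = new WebCamTexture();
            Image image = GetComponent<Image>();
            // Instantiate a copy so we don't modify the shared UI material.
            image.material = new Material(image.material);
            image.material.mainTexture = webcamTexture;
            webcamTexture.Play();
        }
    }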

Hope this helps!

Dominique
