Hi,
Can anyone clarify for me whether an Nvidia GPU is a prerequisite for using the Render Streaming package? The following link seems to suggest that it is:
https://docs.unity3d.com/Packages/com.unity.renderstreaming@2.0/manual/en/faq.html
but in other documentation, I’ve seen a control for ‘using encoding hardware’, which suggests to me that one isn’t required.
To date, I’ve had no luck (even using it on an Nvidia laptop).
Thanks in advance for any pointers,
Dave
Did you check the linked list of compatible NVCodec GPUs and confirm that your laptop’s GPU is supported?
If it is supported, be sure to update to the latest Nvidia drivers (WHQL-certified).
The control you mention is probably just the toggle between software and hardware encoding. For software encoding you’ll ideally need a beefy CPU with many cores. It may not work (well) with a mobile CPU; real-time video encoding in software is very demanding, especially while a game also has to run and render at the same time.
Other than that, you may get more help if you describe what “no luck” entails.
Hi. Thanks for the response and the NVCodec GPUs link. Looks like it is compatible (RTX 2070).
Basically, what I am seeing is that, using the standard Broadcast example and having run the server, any attempt to view the streamed output in a browser results in a blank/black view. Everything is hosted locally, i.e. Unity running the application, the web server app, and my browser(s). I’ve tried Chrome, IE and Edge, and all give the same result. The Unity app appears to be free from errors. I’ve tried a couple of different tutorials for the Unity end but, again, they give the same result.
Interestingly, when I run the Broadcast example, I get the following on-screen message when I click the ‘Show Stats’ button: ‘SignalingHandler is not set or Send/Receive stream has not started.’
Not sure what this means - this is using the Broadcast example straight out of the box.