Linux - Rendering on the Server

We are trying to build a Linux build that sits on a server, listens for Firebase events, and when it receives a request, starts generating a render.

Everything works perfectly, except that all renders are just black. Is there any way to render stuff on Linux when there is no Display Device (a.k.a. monitor)?


Yeah, so we don’t support rendering in headless/batch mode for the player. When you use this mode, we actually set the internal graphics state explicitly to prevent rendering.

Thanks for your response, Andrews.

A follow-up question, if you don’t mind:
Is it possible to render if there is a GPU available (and all drivers are working fine), as described in the link below, or is that no longer possible?

So the method described in the link above is essentially not running in true headless mode, because it still requires going through an X server such as Xorg. It also uses the standalone Linux player rather than the headless build, which is optimized for performance.

At this time there is no way to make the true headless Linux build render.

Maybe you can try this method:
xvfb-run --auto-servernum --server-args='-screen 0 640x480x24:32' \
  /Unity/Hub/Editor/2019.3.7f1/Editor/Unity \
  -openfile /home/Server/Assets/Res/Scene/Main.unity \
  -username "xxx" -password "xxx" \
  -logFile /home/LogFile

Just to be clear, we do not support running in this environment.

This blog post: https://blogs.unity3d.com/2020/06/17/scaling-kubernetes-jobs-for-unity-simulation/ seems to indicate that Unity has figured out a clean way to scale headless/batch operations. Can you share any details about the Docker configuration used here, and how it does headless/batch rendering without X?

Hopefully you are not strategically placing info/techniques on this behind an extra Unity Simulation paywall?

We currently don’t support batch-mode rendering. The blog post above talks about the Unity Simulation workflow, which supports running a regular Linux standalone player with the OpenGL graphics API using llvmpipe (a CPU-based software renderer) via Xvfb.
Information on how to run your Unity build in a Docker container with Xvfb can be found here (this is in the context of Unity Simulation).

More information on Unity Simulation can be found here.
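For anyone wiring this up, a quick way to confirm that the player running under Xvfb actually initialized llvmpipe (rather than the null graphics device you get with -nographics) is to log the graphics device at startup. This is only a minimal sketch; the component name is made up and where you attach it is up to you:

```csharp
using UnityEngine;

// Hypothetical startup probe: logs which graphics device the player initialized.
// Under Xvfb with the OpenGL API and llvmpipe, graphicsDeviceName should report
// the llvmpipe software renderer; a device type of GraphicsDeviceType.Null means
// the player was started with -nographics and nothing will be rendered.
public class GfxDeviceProbe : MonoBehaviour
{
    void Awake()
    {
        Debug.Log("Graphics device: " + SystemInfo.graphicsDeviceName);
        Debug.Log("Graphics API:    " + SystemInfo.graphicsDeviceType + " (" + SystemInfo.graphicsDeviceVersion + ")");
        Debug.Log("Vendor:          " + SystemInfo.graphicsDeviceVendor);
        Debug.Log("Batch mode:      " + Application.isBatchMode);
    }
}
```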

Thank you, that is very helpful. If the goal is to pump out maximum “throughput” for Unity Simulation (unique training images per second, say), isn’t CPU rendering a big limitation? Or is it actually more efficient dollar-wise than GPU support?

Unity Simulation provides a way to run your application with different app configurations (aka app-params) in parallel, each generating a unique data set. Whether it is more efficient dollar-wise than GPU support really depends on the simulation workload you are trying to run. But yes, in general, GPU instances in the cloud are a lot more expensive than spinning up a bunch of vCPUs.

Hi,
Are there any updates regarding support for rendering in headless/batch mode for the player? Or are there other ways to accomplish a Linux build that sits on a server and generates renders for a client based on the input received from that client?

If you have a machine with a GPU, use a headless HDMI or DisplayPort adapter that supports your target resolution and framerate. This removes the need for a display.

In my case, I’m working on a client-host rendered game.

Our client is lightweight so it can run on the Steam Deck; it only renders 2D UI elements and gathers player input.

Our host is any high-end PC or server. We use the standard Unity player and create a blanked-out render display that only shows a log/events UI. We then render frames out from the camera using Texture2D.EncodeToJPG(), RPC the frames over to the client as a byte array, and the client uses Texture2D.LoadImage(textureBytes) to put them on a world-space canvas in its UI, while also accepting the client’s UI inputs and returning UI state events.
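To make that flow concrete, here is a rough sketch of what the host-side capture and client-side display could look like. The transport is deliberately left as a placeholder (SendFrameToClient and OnFrameReceived are made-up hooks), and the camera, resolution, frame pacing, and JPG quality are assumptions rather than the exact setup described above:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Host side: render the camera offscreen, encode each frame to JPG, and hand
// the bytes to the networking layer. SendFrameToClient is a placeholder.
public class HostFrameStreamer : MonoBehaviour
{
    public Camera renderCamera;            // camera rendering the 3D scene
    public int width = 1280, height = 720; // stream resolution (assumed)

    RenderTexture _rt;
    Texture2D _readback;

    void Start()
    {
        _rt = new RenderTexture(width, height, 24);
        _readback = new Texture2D(width, height, TextureFormat.RGB24, false);
        renderCamera.targetTexture = _rt;
    }

    void LateUpdate()
    {
        // Render into the offscreen RenderTexture, then copy the pixels back
        // to a CPU-side Texture2D so they can be encoded.
        renderCamera.Render();
        RenderTexture.active = _rt;
        _readback.ReadPixels(new Rect(0, 0, width, height), 0, 0);
        _readback.Apply();
        RenderTexture.active = null;

        byte[] jpg = _readback.EncodeToJPG(75); // quality value is a guess
        SendFrameToClient(jpg);                 // replace with your RPC call
    }

    void SendFrameToClient(byte[] frame)
    {
        // Hook up whatever transport you use; the payload is just a byte[].
    }
}

// Client side: load the received JPG bytes into a texture shown on a UI RawImage
// that lives on the world-space canvas.
public class ClientFrameReceiver : MonoBehaviour
{
    public RawImage targetImage; // RawImage on the world canvas

    Texture2D _frame;

    void Awake()
    {
        _frame = new Texture2D(2, 2);  // LoadImage resizes this as needed
        targetImage.texture = _frame;
    }

    // Call this from your RPC handler when frame bytes arrive.
    public void OnFrameReceived(byte[] textureBytes)
    {
        _frame.LoadImage(textureBytes);
    }
}
```

EncodeToJPG keeps each frame small enough to push over an RPC as a byte array, at the cost of per-frame CPU time for the readback and encode.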

shiferawyw, I should note about my example above: you cannot do this if you run the game as a Windows service or Linux daemon, as those would place it into session 0, which has no GPU acceleration attached. So you must run the host Unity player as an actual application that can render on the GPU.