Any way to know what renderer is being used?

Hi,

Ok, this is strange. I’m sure there was a method to determine which renderer is being used at runtime (i.e. OpenGL or DirectX), but looking through the scripting reference and searching Google is coming up empty.

So am I just missing this function or does Unity actually not expose a function to tell you which renderer is being used at runtime?

Camera.actualRenderingPath

Thanks, but my mistake: I didn’t make it clear in my original post. I didn’t mean the rendering path, but which graphics API (OpenGL, OpenGL ES, DirectX) is being used.

There are various useful bits of information in the SystemInfo class. This is probably your best bet:
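For example, `SystemInfo.graphicsDeviceVersion` returns the API name as part of its version string (it starts with “Direct3D” or “OpenGL”), so a simple prefix check can distinguish the two at runtime. A minimal sketch (untested; the class name is mine):

```
using UnityEngine;

public class RendererCheck : MonoBehaviour
{
    void Start()
    {
        // graphicsDeviceVersion is a string such as "OpenGL 2.1 ..." or
        // "Direct3D 9.0c ...", so the leading token identifies the API.
        string version = SystemInfo.graphicsDeviceVersion;
        bool isOpenGL = version.StartsWith("OpenGL");
        Debug.Log(isOpenGL ? "Rendering with OpenGL" : "Rendering with: " + version);
    }
}
```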

Thanks, but I’d already looked through that. Whilst it might work for devices that only support a single API, it (and everything else I can think of) falls down on Windows, where Unity could be rendering in DirectX or OpenGL.

What I have is a shader split into two versions: a Cg one and a dedicated GLSL one. These are required because the Cg version causes graphical errors under OpenGL for no apparent reason. So at runtime I need to decide which version to use.

I have also considered using compiler defines in the shader, but I assume that’s compile-time based and thus wouldn’t solve the problem, especially if running under OpenGL on Windows.

So I’m guessing that the current rendering API isn’t actually exposed by Unity?

I don’t think shaders are precompiled to a single target. You should try the compiler defines and see if they work.
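Unity compiles the Cg separately for each target renderer and defines `SHADER_API_*` macros per platform, so the defines are resolved per-API rather than once per build. A hedged sketch (shader name and colours are purely illustrative):

```
Shader "Example/APIDefines" {
    SubShader {
        Pass {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag

            float4 vert (float4 v : POSITION) : POSITION {
                return mul(UNITY_MATRIX_MVP, v);
            }

            fixed4 frag () : COLOR {
            #ifdef SHADER_API_OPENGL
                return fixed4(0, 1, 0, 1); // green when compiled for OpenGL
            #else
                return fixed4(1, 0, 0, 1); // red on other renderers (e.g. D3D)
            #endif
            }
            ENDCG
        }
    }
}
```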

I managed to address the problem in the end by creating a single shader with subshaders, with the different subshaders defined explicitly for different renderers using Unity’s `only_renderers` or `exclude_renderers` pragmas. Bit obvious when you think about it :wink:
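For anyone finding this later, the structure looks roughly like this (shader name hypothetical). A GLSLPROGRAM block only compiles under OpenGL anyway, so Unity falls through to the Cg subshader on Direct3D:

```
Shader "Example/PerRendererSplit" {
    // First subshader: the dedicated GLSL version, only usable under OpenGL.
    SubShader {
        Pass {
            GLSLPROGRAM
            #ifdef VERTEX
            void main() { gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex; }
            #endif
            #ifdef FRAGMENT
            void main() { gl_FragColor = vec4(1.0); }
            #endif
            ENDGLSL
        }
    }
    // Fallback subshader: the Cg version, explicitly kept off OpenGL.
    SubShader {
        Pass {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #pragma exclude_renderers opengl gles

            float4 vert (float4 v : POSITION) : POSITION {
                return mul(UNITY_MATRIX_MVP, v);
            }
            fixed4 frag () : COLOR { return fixed4(1, 1, 1, 1); }
            ENDCG
        }
    }
}
```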

Still surprised there is no method of checking which 3D API Unity is currently using.