Unity and OpenGL ES 3.0 vs 3.1

Hi

This is a question about how Unity works with OpenGL ES 3.0 vs 3.1 in Android builds.

I know the difference between the APIs. My current game works fine on 3.0 and doesn't need 3.1. The question is whether Unity improves anything internally if I set it to build for 3.1.

For instance, does Unity internally adjust the terrain textures to use texture arrays when it detects it's compiling for 3.1? That kind of thing. Does it do any tricks when I set it to 3.1? Or does it really not matter, and should I leave it at 3.0 for broader compatibility?

The GLES/EGL context that Unity creates at runtime will be the latest (minor) version that is supported, so ES 3.2 on new devices.
The Player Settings "Require ES 3.1" checkbox adds that requirement to the Android manifest. It also affects the shader define SHADER_API_GLES30, which is only used by URP to configure its light data array.
I can't think of any other build-time effects.
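For reference, the manifest entry that checkbox produces looks roughly like this (a sketch; the exact output can vary between Unity versions):

```xml
<!-- Added to AndroidManifest.xml when "Require ES 3.1" is enabled.
     Tells the OS and the Play Store that the app needs an OpenGL ES 3.1
     context; 0x00030001 encodes major version 3, minor version 1. -->
<uses-feature android:glEsVersion="0x00030001" android:required="true" />
```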

So basically there is no difference between using 3.0 and 3.1 on Unity's side?
Because the texture array thing was new in 3.1 and it has a huge impact on things like terrain. I was expecting Unity to be smart enough to take advantage of that.
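To illustrate what I mean, the kind of terrain splatting I'd hoped Unity would generate, sampling one texture array instead of several separate splat textures, would look roughly like this in GLSL ES (a hypothetical sketch, not Unity's actual terrain shader):

```glsl
#version 310 es
precision mediump float;

// Hypothetical terrain fragment shader using a texture array:
// all splat layers live in one sampler, so a single binding
// replaces four separate 2D splat textures.
uniform mediump sampler2DArray _SplatLayers; // layers 0..3
uniform sampler2D _Control;                  // per-pixel layer weights

in vec2 uv;
out vec4 fragColor;

void main() {
    vec4 w = texture(_Control, uv); // one weight per layer
    vec4 c = vec4(0.0);
    for (int i = 0; i < 4; ++i) {
        // the third texcoord component selects the array layer
        c += w[i] * texture(_SplatLayers, vec3(uv, float(i)));
    }
    fragColor = c;
}
```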

Unity doesn't have separate sets of shaders for ES 3.0 and ES 3.1 (only ES 2.0 and ES 3.x). However, some shaders have separate subshaders for different target shader levels or specific APIs; in those cases Unity picks the better version at runtime. Other features, such as GPU skinning that uses compute shaders, also have runtime fallbacks.
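As a simplified, hypothetical example of that subshader selection: a shader can declare two subshaders with different target levels, and Unity picks the first one the device's graphics capabilities can run:

```shaderlab
Shader "Example/TieredSubshaders" {
    SubShader {
        // Preferred variant: only usable on shader model 3.5+ class
        // hardware (roughly ES 3.x devices).
        Pass {
            CGPROGRAM
            #pragma target 3.5
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"
            struct v2f { float4 pos : SV_POSITION; };
            v2f vert(appdata_base v) {
                v2f o; o.pos = UnityObjectToClipPos(v.vertex); return o;
            }
            fixed4 frag(v2f i) : SV_Target { return fixed4(1, 1, 1, 1); }
            ENDCG
        }
    }
    SubShader {
        // Fallback variant: default target (2.5), runs on ES 2.0 devices.
        Pass {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"
            struct v2f { float4 pos : SV_POSITION; };
            v2f vert(appdata_base v) {
                v2f o; o.pos = UnityObjectToClipPos(v.vertex); return o;
            }
            fixed4 frag(v2f i) : SV_Target { return fixed4(0.5, 0.5, 0.5, 1); }
            ENDCG
        }
    }
}
```

For the compute-based features, the runtime check involved is along the lines of Unity's `SystemInfo.supportsComputeShaders` property: if it reports false, the non-compute path is used instead.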