Do GLSL shaders work on iOS without hassle?
I mean, I don't have the means to test it on an actual phone or anything. But if I write a GLSL shader, is it going to work straight away?
Apparently there are some differences. If I remember correctly, Unity on the Mac doesn't let you cast a mat4 matrix to a mat3 matrix, even though this is correct GLSL code and works in Unity on PCs. There might be other differences as well.
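To illustrate what that cast looks like, here is a minimal GLSL sketch of my own (not the poster's shader): constructing a mat3 from the upper-left 3x3 of a mat4. Desktop GLSL officially allows matrix-from-matrix constructors from version 1.20 on (some drivers accept them even without a #version directive), while GLSL ES 1.00, the version used on iOS, forbids them entirely.

uniform mat4 modelView;

void main() {
    // Build a 3x3 matrix from the upper-left corner of a 4x4 matrix,
    // e.g. to get a rotation matrix out of a modelview matrix.
    // Allowed in desktop GLSL 1.20+, rejected by GLSL ES 1.00.
    mat3 rotation = mat3(modelView);
    gl_Position = vec4(rotation * vec3(1.0), 1.0);
}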
Well, I just made a simple sample and it says that no subshader is capable of running on my graphics card.
So, GLSL seems like a bad idea.
According to the docs, I shouldn't use GLSL anyway.
GLSL can be useful if you're writing iOS-optimized shaders, but apart from that it's better to stick with Cg (in my opinion).
Did you start Unity with the -force-opengl command line option? ( http://unity3d.com/support/documentation/Manual/Command%20Line%20Arguments.html )
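On Windows that looks roughly like this (the install path is just the usual default, adjust it for your machine); the flag makes the editor render with OpenGL instead of Direct3D, which is what GLSLPROGRAM shaders need in order to run in the editor:

"C:\Program Files\Unity\Editor\Unity.exe" -force-opengl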
Fully agree.
Since the last round of GLSL optimizer work in Unity and the Cg → GLSL generation for mobile, I'm actually willing to say: unless you are an expert in writing mobile GLSL (which is not the same as GLSL on the desktop; different things are missing and some of the deltas have a real impact), Cg is always better. The deltas that remain require you to know the hardware, its limitations and its behaviour inside out before you normally end up more performant.
If a shader is written in Cg, is it definitely going to work on iOS and Android?
Yes, Unity 3 has extremely solid capabilities for generating optimized GLSL shaders for iOS and Android.
Great.
This works if you enable a later GLSL version with: #version 120
…or something like that. Okay, I didn't test this with Unity, but it fixed the problem for me in a custom engine, and since the GLSL compiler is the same, it should work with Unity too.
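Outside of Unity, in a plain desktop GLSL vertex shader, the suggestion amounts to something like this (a minimal sketch under that assumption); the version directive has to come before anything else in the source:

#version 120

void main() {
    // Matrix-from-matrix constructors are legal from GLSL 1.20 on,
    // so this line compiles once the #version directive is in place.
    mat4 m = mat4(mat3(1.0));
    gl_Position = m * vec4(0.0, 0.0, 0.0, 1.0);
}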
Obviously not; for example, hardly any GPU in mobile devices supports texture fetches in vertex programs, but (as I was told here) Unity does support texture fetches in vertex programs.
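As a hypothetical illustration of that kind of feature (the uniform, attributes and displacement idea are made up for the example): a vertex shader that samples a texture, e.g. for height-map displacement. GLSL ES 2.0 allows this on paper, but a device may report MAX_VERTEX_TEXTURE_IMAGE_UNITS as 0, in which case the shader compiles in the editor yet cannot run on that GPU.

uniform sampler2D heightMap;
attribute vec4 vertex;
attribute vec2 uv;

void main() {
    // Vertex texture fetch: sample a height map to displace the vertex.
    // Valid GLSL ES 1.00, but unsupported by most mobile GPUs of this generation.
    float height = texture2DLod(heightMap, uv, 0.0).r;
    gl_Position = vec4(vertex.x, vertex.y + height, vertex.z, vertex.w);
}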
I tried with #pragma target 3.0. I still get the same error.
Did I do it wrong?
Shader "" {
SubShader {Pass {
GLSLPROGRAM
#pragma target 3.0
#ifdef VERTEX
void main() {
mat4 m = mat4(mat3(0));
}
#endif
#ifdef FRAGMENT
void main() {}
#endif
ENDGLSL
}}
}
I think Slin was talking about using “#version 120” in the GLSL program. As far as I know this won’t work with standard OpenGL ES 2.0 on iOS, because OpenGL ES 2.0 only requires support for version 1.00 of GLSL. It might work with desktop OpenGL on Mac OS X, but I don’t know whether (or how) Unity lets you insert the line “#version 120” into a GLSL program.
If so, I can’t find a place to put it without an error. That’s why I tried #pragma target 3.0; I figured it would give equivalent results, but I honestly have no idea. Let me know, guys, if you need me to test anything.
I guess one idea to test would be to compile a GLSL shader without the mat4-to-mat3 cast and then insert the cast into the compiled GLSL code. (I assume this would avoid the problem, which is probably in Unity’s processing of the shader code rather than in the actual GLSL compilation by the OpenGL driver.)
(If that doesn’t work, one could also try to additionally insert “#version 120” into the compiled shader.)
Of course, that doesn’t solve the problem; I’m just trying to make sense out of what Slin said.
Correct, but that’s not a matter of Cg; it’s a matter of using something that isn’t supported. Knowing that is part of what it takes to write shaders: shader devs are meant to know about their target when they start their work, not when they get hit with errors left and right.
GLSL would throw a similar error for the same reason.
The easy solution in this case is to use Unity’s capability to include and exclude renderers. You write one pass that has the feature in and targets everything except OpenGL ES (and the Wii, for example), and another pass that leaves the feature out and is only included for OpenGL ES, as sketched below.
That’s exactly what this feature is designed for: it lets you write platform-focused passes that omit missing hardware features or that are lighter (or less precise), for example.
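A minimal sketch of that include/exclude setup (the shader name, pass bodies and colours are placeholders of mine, not from the thread), using Unity’s #pragma exclude_renderers / #pragma only_renderers in Cg:

Shader "Custom/PlatformSplitExample" {
    SubShader {
        // Full-featured pass, compiled for every renderer except OpenGL ES
        Pass {
            CGPROGRAM
            #pragma exclude_renderers gles
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            float4 vert (float4 vertex : POSITION) : POSITION {
                return mul(UNITY_MATRIX_MVP, vertex);
            }
            fixed4 frag () : COLOR {
                return fixed4(1, 0, 0, 1); // stand-in for the heavier variant
            }
            ENDCG
        }
        // Lighter pass, compiled only for OpenGL ES
        Pass {
            CGPROGRAM
            #pragma only_renderers gles
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            float4 vert (float4 vertex : POSITION) : POSITION {
                return mul(UNITY_MATRIX_MVP, vertex);
            }
            fixed4 frag () : COLOR {
                return fixed4(0, 1, 0, 1); // stand-in for the lighter variant
            }
            ENDCG
        }
    }
}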
#version 120 has to be on the very first line of the GLSL shader, and I guess Unity puts some defines or something in front of it, which causes it not to work…
Also, this only works with desktop GLSL, which is slightly different from the OpenGL ES variant. The differences are a few directives like #version and especially the precision specifiers. The “problem” with the precision specifiers can easily be worked around with empty defines guarded by the GL_ES define, which is automatically set by the GLSL ES compiler. The #version would probably have to be set by the engine in combination with the VERTEX or FRAGMENT define; for that it could for example parse #pragma target 3.0, which might be a useful feature request.
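The precision-specifier workaround mentioned above would look roughly like this (a sketch of the technique, not code from the thread): GL_ES is predefined by the GLSL ES compiler and absent on the desktop, so the ES-only qualifiers can be defined away there.

#ifdef GL_ES
// GLSL ES: fragment shaders need a default float precision.
precision mediump float;
#else
// Desktop GLSL: make the ES precision qualifiers expand to nothing.
#define lowp
#define mediump
#define highp
#endif

void main() {
    lowp vec4 tint = vec4(1.0, 0.5, 0.25, 1.0);
    gl_FragColor = tint;
}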
Last time I checked, excluding renderers did not work correctly…