Textures blended differently - OpenGL ES vs. Unity

I am porting a 2D game written for Android (Java, OpenGL ES) to Unity. It’s working great, except for a difference in how the textures that contain alpha are blended together. Comparison in the following image:

The object in the picture is a series of triangles with their bases in the center of the object. You can see that in OpenGL ES the triangles blend with a sort of shadow, while in Unity they seem flat. I use the same texture and colors in both versions, and no lighting in either of them.

In Unity I render it as a Mesh with the following shader:

Shader "ApplicationRenderer"
{
    Properties{
        _MainTex ("Texture to blend", 2D) = "black" {}
    }

    SubShader
    {
        Tags{
            "Queue" = "Transparent"
        }

        Lighting Off
        ZWrite On
        ZTest Always
        Cull Back
        ColorMaterial AmbientAndDiffuse

        Pass{
            Blend SrcAlpha OneMinusSrcAlpha
            SetTexture [_MainTex] { combine texture * primary }
        }
    }
}

Could anyone help me with a suggestion on how to get the same result in Unity as in OpenGL ES? Much appreciated.

Thanks.

It looks broken to me in both cases; it looks like your triangles don’t match the shape of your texture. Let’s see the texture, and a screenshot where the meshes match better. What OpenGL code are you using?

It is slightly difficult to see on those tiny pictures, but the object appears to be a complex concave geometry with a transparent shader. A hardware limitation of using transparent shaders with such objects is that the order in which the triangles are rendered determines what the object looks like. This rendering order may differ between rendering-engine implementations and will certainly differ between viewing angles. This sorting issue is also almost certainly the reason why both images contain sections of the model that appear to be cut off.

The black pixel issue may be related to mip mapping. Does your texture contain any black in areas where alpha is 0?
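For reference, the black bleed is easy to reproduce with plain averaging, which is essentially what mip generation does: with straight (non-premultiplied) alpha, the black RGB hidden under alpha = 0 gets averaged into visible texels. A minimal Python sketch with made-up texel values, not your actual texture:

```python
# Simulate one step of mip generation: average two neighbouring RGBA texels.
# Straight-alpha averaging lets the black RGB under alpha=0 darken the result;
# premultiplied-alpha averaging does not.

def average_straight(a, b):
    """Average two straight-alpha RGBA texels component-wise."""
    return tuple((x + y) / 2 for x, y in zip(a, b))

def average_premultiplied(a, b):
    """Premultiply RGB by alpha, average, then un-premultiply."""
    pa = (a[0] * a[3], a[1] * a[3], a[2] * a[3], a[3])
    pb = (b[0] * b[3], b[1] * b[3], b[2] * b[3], b[3])
    avg = tuple((x + y) / 2 for x, y in zip(pa, pb))
    alpha = avg[3]
    if alpha == 0:
        return (0.0, 0.0, 0.0, 0.0)
    return (avg[0] / alpha, avg[1] / alpha, avg[2] / alpha, alpha)

white_opaque = (1.0, 1.0, 1.0, 1.0)       # visible white texel
black_transparent = (0.0, 0.0, 0.0, 0.0)  # "empty" texel with black RGB

print(average_straight(white_opaque, black_transparent))       # grey fringe appears
print(average_premultiplied(white_opaque, black_transparent))  # stays white
```

So if the source texture does keep black under the transparent checkerboard, the smaller mip levels will pick up grey fringes around the visible pattern.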

I would agree with you if it weren’t for ZTest Always.

Thank you Jessy and Tomvds. The geometry is very simple, it is a 2D object made out of several triangles that overlap in the same plane. Please see the attached picture:

  • The small triangles in the wireframe picture are OK – ignore them.
  • The big triangles (I outlined one of the big triangles in white) are arranged in a star pattern, with the bases overlapping and one of the corners pointing outwards, creating the “spikes” you see in the rendered image.
  • The texture is a simple pattern with fading alpha; the checkers pattern in the texture is from Photoshop, meaning 0 alpha.

@Jessy
The texture is mapped on the big triangles in the wireframe, with the middle-top mapped on the vertex that points outwards (0,0 – 0.5,1 – 1,0). The colors are given in a color array; the texture is black and white. The OpenGL ES settings I use in Java are the following:

// Enable depth testing (GL_DEPTH_BUFFER_BIT is a glClear mask, not an enable cap),
// write depth, and pass fragments at or in front of the stored depth.
gl_.glEnable(GL10.GL_DEPTH_TEST);
gl_.glDepthMask(true);
gl_.glDepthFunc(GL10.GL_LEQUAL);
gl_.glClearDepthf(1.0f);
// Standard alpha blending.
gl_.glEnable(GL10.GL_BLEND);
gl_.glBlendFunc(GL10.GL_SRC_ALPHA, GL10.GL_ONE_MINUS_SRC_ALPHA);

This produces the image above “OpenGL ES”.

@Tomvds
Sorry, I wasn’t clear that the object is 2D. Regarding the rendering order: while I do not draw the triangles in any particular order, I don’t understand how I can get such different results in Unity. It most certainly seems to be a shader problem.

I must add that I am not very familiar with shaders, that’s why I am relying on someone with experience to point out what is wrong in this case.

Thanks!

I don’t know OpenGL, but that looks like ShaderLab’s default, ZTest LEqual, not ZTest Always, which you’re using in Unity. Try commenting the latter out. Not that I think it will look better, but it should match better.
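To see why the ZTest setting matters here, a tiny single-pixel simulation (plain Python, purely for illustration, with made-up fragment values) of ZWrite On plus SrcAlpha OneMinusSrcAlpha blending: under ZTest LEqual a fragment behind an already-written depth is discarded, while ZTest Always blends it in regardless, so the two settings diverge as soon as triangles arrive out of depth order.

```python
# One-pixel simulation of depth test + alpha blending, with ZWrite On.
# Fragments are (r, g, b, a, z); smaller z is nearer the camera.

def draw(pixel, fragments, ztest):
    """Render fragments in order into pixel = (rgb_color, depth)."""
    color, depth = pixel
    for r, g, b, a, z in fragments:
        passes = z <= depth if ztest == "LEqual" else True  # "Always"
        if not passes:
            continue
        # Blend SrcAlpha OneMinusSrcAlpha.
        color = tuple(a * s + (1 - a) * d for s, d in zip((r, g, b), color))
        depth = z  # ZWrite On
    return color, depth

near_translucent = (1.0, 0.0, 0.0, 0.5, 0.3)  # half-transparent red, in front
far_opaque       = (0.0, 1.0, 0.0, 1.0, 0.7)  # opaque green, behind it

start = ((0.0, 0.0, 0.0), 1.0)  # cleared to black, depth 1.0
# Drawn front-to-back (out of the ideal back-to-front order):
print(draw(start, [near_translucent, far_opaque], "LEqual"))  # far fragment rejected
print(draw(start, [near_translucent, far_opaque], "Always"))  # far fragment blended over
```

With LEqual the far opaque fragment is discarded and the translucent red survives; with Always the green is composited on top anyway, which is one way the two renders can end up looking different.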

I would recommend that you try to do that in more than one draw call.
Neither the OGL ES version nor the Unity version (OGL on Mac, D3D on PC) is rendered correctly.

The difference definitely comes from depth writing and testing being set differently, as Jessy pointed out.

I’m not sure what IJM means by “correctly” given the abstract nature of the effect.

It is evident in both renders that the depth sorting is not correct.
Fully transparent fragments from the foreground polygons are hiding non-transparent fragments of the background polygons if they are drawn into the same pixel. This is the well-known problem of order-dependent transparency :slight_smile:
You can fix it by doing multiple draw calls.
Unity has no renderer written for D3D11/OGL 4.0, so order-independent transparency is not an option.

https://graphics.stanford.edu/wikis/cs448s-10/FrontPage?action=AttachFile&do=get&target=CS448s-10-11-oit.pdf
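The order dependence is visible in the blend equation itself: with SrcAlpha OneMinusSrcAlpha, compositing the same two half-transparent fragments in different orders gives different pixels, even with no depth buffer involved at all. A small Python illustration with made-up colors:

```python
def over(src, dst):
    """SrcAlpha OneMinusSrcAlpha blend of one RGBA fragment onto an RGB pixel."""
    r, g, b, a = src
    return tuple(a * s + (1 - a) * d for s, d in zip((r, g, b), dst))

background = (0.0, 0.0, 0.0)
red  = (1.0, 0.0, 0.0, 0.5)   # half-transparent red fragment
blue = (0.0, 0.0, 1.0, 0.5)   # half-transparent blue fragment

red_then_blue = over(blue, over(red, background))
blue_then_red = over(red, over(blue, background))
print(red_then_blue)   # blue dominates
print(blue_then_red)   # red dominates
```

The two results differ, which is exactly why overlapping transparent triangles drawn in an engine-dependent order look different between the two renderers.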

No, that can’t happen with the Unity shader, only with the other one, as I responded to tomvds.