[SOLVED] Shader works differently on OpenGL and DX

I have this shader, used inside an update loop like this:

        // Throttle: only run the effect EffectRate times per second.
        if ((DateTime.Now - mLastEffectTime).TotalSeconds < 1.0f / EffectRate)
            return;

        mLastEffectTime = DateTime.Now;
        mDisperseMat.SetFloat("_AlphaTransmition", AlphaTransferRate);
        mDisperseMat.SetFloat("_ScanOffset", AlphaScanInterval);

        // Run the dispersal shader over AuxRT in place (source == destination).
        Graphics.Blit(AuxRT, AuxRT, mDisperseMat);

The fragment shader:

    float4 frag(pData pixelData) : COLOR
    {
        float4 texCol = tex2D(_MainTex, pixelData.uv);

        // Only process pixels that still have some alpha left.
        if (texCol.w > 0)
        {
            // Sample the 3x3 neighborhood around this pixel (the center
            // sample is included) and bleed alpha toward transparent neighbors.
            for (int i = -1; i < 2; i++)
                for (int j = -1; j < 2; j++)
                {
                    float2 shiftedUV = pixelData.uv + float2(i, j) * _ScanOffset;
                    float4 nearCol = tex2D(_MainTex, shiftedUV);
                    texCol.w -= (1.0 - nearCol.a) * _AlphaTransmition;
                }
        }

        // Clamp so the alpha never goes negative.
        texCol.w = max(0, texCol.w);

        return texCol;
    }

It works perfectly on DX, but on OpenGL it looks as if only one iteration of the Update script ever runs.
What could be the problem?

(The shader disperses the alpha on a texture).

Perhaps using the same texture as the source and destination doesn’t work properly in OpenGL? Have you tried using two different textures?

This is correct. You will need to use two textures and ping pong between them as target and source.
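A minimal sketch of what that could look like, assuming a second RenderTexture (mAuxRT2 is just an illustrative name here) created with the same size and format as AuxRT:

        // Ping-pong sketch: read from one texture, write into the other,
        // then swap them for the next update. (mAuxRT2 is a second
        // RenderTexture you create yourself; the name is only an example.)
        mDisperseMat.SetFloat("_AlphaTransmition", AlphaTransferRate);
        mDisperseMat.SetFloat("_ScanOffset", AlphaScanInterval);

        // Read from AuxRT, write into the second texture.
        Graphics.Blit(AuxRT, mAuxRT2, mDisperseMat);

        // Swap so the next pass reads the freshly written texture.
        RenderTexture temp = AuxRT;
        AuxRT = mAuxRT2;
        mAuxRT2 = temp;

This way you never read from a texture while it is also the render target, which is where the source == destination blit gets into trouble.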

I was thinking it could be something like that.

I’ll try.

edit: same result. On DX it works perfectly, but on OpenGL it doesn't work at all.
Sometimes it produces a black texture, and other times it does nothing (within the same run).

PROBLEM SOLVED!

I had defined my sampler as a sampler2D, but the texture was actually a RECT (a RenderTexture), so Unity never bound the texture.

I redefined my sampler as a samplerRECT and used texRECT instead of tex2D. (Be careful: the UV coordinates used by texRECT are [0-screenWidth, 0-screenHeight] instead of [0-1, 0-1].)
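For reference, here is a hedged sketch of what the RECT version of the fragment shader might look like. _TexSize is a hypothetical float4 property (texture width/height in pixels, set from the script with mDisperseMat.SetVector) used to convert the [0-1] UVs into the pixel coordinates texRECT expects; it is not part of the original shader.

    samplerRECT _MainTex;
    float _AlphaTransmition;
    float _ScanOffset;
    float4 _TexSize;  // hypothetical: xy = texture width/height in pixels

    float4 frag(pData pixelData) : COLOR
    {
        // texRECT expects pixel coordinates, so scale the [0-1] uv up.
        float2 pixelUV = pixelData.uv * _TexSize.xy;
        float4 texCol = texRECT(_MainTex, pixelUV);

        if (texCol.w > 0)
        {
            for (int i = -1; i < 2; i++)
                for (int j = -1; j < 2; j++)
                {
                    // The neighbor offsets are scaled to pixels as well.
                    float2 shiftedUV = pixelUV + float2(i, j) * _ScanOffset * _TexSize.xy;
                    float4 nearCol = texRECT(_MainTex, shiftedUV);
                    texCol.w -= (1.0 - nearCol.a) * _AlphaTransmition;
                }
        }

        texCol.w = max(0, texCol.w);
        return texCol;
    }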