Input.Touch on Two GUITextures for Android.

Hello, I am making a simple 2.5D platformer and I have two GUITextures on my screen.
One is for movement to the left and the other is for movement to the right.

I have got one of my GUITextures to respond to the touch and activate the walking motion.
But when I try to touch the other one, nothing happens.

Here is my current code:

public float runSpeed = 10f;                  // run speed
public GUITexture androidRight, androidLeft;  // the two on-screen buttons

void Update() {

    // Left button: move left while the first touch is held on it.
    if (androidLeft.HitTest(Input.GetTouch(0).position)) {

        if (Input.GetTouch(0).phase == TouchPhase.Stationary) {
            transform.Translate(-runSpeed * Time.deltaTime, 0, 0);
        }
    }

    // Right button: move right while the first touch is held on it.
    if (androidRight.HitTest(Input.GetTouch(0).position)) {

        if (Input.GetTouch(0).phase == TouchPhase.Stationary) {
            transform.Translate(runSpeed * Time.deltaTime, 0, 0);
        }
    }
}

I assign the two textures in the Inspector on the player object that has this script.

When I run it on my Android device it only works one way (to the left). If I swap the two GUITextures in the Inspector, then only the one to the right works.

Any input on how to get Android to register both buttons instead of just one?

PS: I suspect it has something to do with the TouchPhase.
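
For reference, here is a minimal sketch (assuming the same androidLeft/androidRight GUITextures and runSpeed as above) that checks every active touch instead of only touch 0, so each button can respond to its own finger:

using UnityEngine;

public class TouchMovement : MonoBehaviour {

    public float runSpeed = 10f;
    public GUITexture androidRight, androidLeft;

    void Update() {
        // Loop over every active touch, not just touch 0, so a finger on
        // each button is tracked independently.
        for (int i = 0; i < Input.touchCount; i++) {
            Touch touch = Input.GetTouch(i);

            // Accept Stationary and Moved so the button keeps working even
            // if the finger wiggles slightly while held down.
            if (touch.phase == TouchPhase.Stationary || touch.phase == TouchPhase.Moved) {
                if (androidLeft.HitTest(touch.position)) {
                    transform.Translate(-runSpeed * Time.deltaTime, 0, 0);
                }
                else if (androidRight.HitTest(touch.position)) {
                    transform.Translate(runSpeed * Time.deltaTime, 0, 0);
                }
            }
        }
    }
}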

thanks in advance

I managed to figure out what was wrong. The script itself was fine; nothing wrong with that one. I thought I only needed one button and could just flip the other one around using the X and Y axes of the pixel inset on the GUITexture. That turned out to be wrong, however, because the GUI only looks flipped; it is in fact turned the other way and cannot register a hit in the hit test. It was just my own fault for not thinking that Unity is 3D software and not 2D.
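
In case it helps anyone else: instead of flipping one button with a negative pixel inset, it should be enough to use two separate GUITextures and just mirror their viewport positions while keeping the pixel inset positive, so HitTest still matches where each texture is drawn. A rough sketch (the position and size values are just placeholders):

using UnityEngine;

public class PlaceTouchButtons : MonoBehaviour {

    public GUITexture androidLeft, androidRight;

    void Start() {
        // GUITexture positions are in viewport space (0..1), so 0.1 and 0.9
        // put the buttons near the left and right edges of the screen.
        androidLeft.transform.position = new Vector3(0.1f, 0.1f, 0f);
        androidRight.transform.position = new Vector3(0.9f, 0.1f, 0f);

        // Keep the pixel inset width/height positive; the negative x/y
        // offsets just center a 100x100 button on each viewport position,
        // so HitTest lines up with what is actually drawn.
        androidLeft.pixelInset = new Rect(-50f, -50f, 100f, 100f);
        androidRight.pixelInset = new Rect(-50f, -50f, 100f, 100f);
    }
}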