Why is tapping Android Screen detected via Mouse Button 0 calls

Hey guys, quick Android question.

I have the target platform switched to Android and the following code (and nothing else) running in the scene:

	void Update () {
		if (Input.GetMouseButtonDown (0)) {
			Debug.Log ("Button Pressed");
		}
	}

When I tap the Android screen (while using Unity Remote), the message is logged. The same thing happens if 'mouse 0' is assigned as a button within an Input Manager axis and I touch the screen while checking whether that axis has been pressed.
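For reference, a sketch of the axis route I mean (assuming an Input Manager axis such as the default 'Fire1', which lists 'mouse 0' as an alternative positive button):

	void Update () {
		// Fires on tap too, because the first touch maps to mouse 0
		if (Input.GetButtonDown ("Fire1")) {
			Debug.Log ("Axis pressed");
		}
	}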

Is there a reason that tapping the screen registers as pressing mouse button 0? At first I thought it might be the EventSystem, but I've removed that from the scene, so it contains only this script, a camera, and a light.

Thanks for any help.
Cheers

It's been a long-standing (though, as far as I know, not officially documented) behaviour that the first finger touching the screen on a touch device maps to an LMB (mouse button 0) click. This makes it easy to quickly design interfaces that work on both touch and mouse by listening only for GetMouseButton(0). That said, I wouldn't rely on it, and I'd recommend using the "proper" Input.touches input instead.
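As a minimal sketch, the touch-based equivalent of the script above might look like this (the class name TouchLogger is just an illustrative choice):

	using UnityEngine;

	public class TouchLogger : MonoBehaviour {

		void Update () {
			// Check every active touch rather than relying on the
			// first-finger-to-mouse-button-0 mapping.
			foreach (Touch touch in Input.touches) {
				if (touch.phase == TouchPhase.Began) {
					Debug.Log ("Touch began at " + touch.position);
				}
			}
		}
	}

This also handles multi-touch, which the mouse-button mapping only covers for the first finger.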