How to account for various screen DPI?

In Unity, Touch.deltaPosition is measured in pixels. The iPhone 4 has twice the DPI of earlier iPhones, which means the same physical gesture (sliding a finger) covers twice as many pixels. And, of course, the iPad has a different DPI altogether. How can I account for this in my game?

Edit: Also, consider the player using an iPad vs iPhone. A finger swipe on iPhone can cover most of the screen, whereas the gesture of the same length on iPad will cover only a quarter or so of the screen. I want to treat those gestures the same way.

Edit: So far the best solution seems to be: check iPhoneSettings.generation and have your own hardcoded values for all listed devices.

You can always normalize pixel values by dividing by Screen.width or Screen.height where appropriate, e.g.:

Vector2 touchDelta = Input.GetTouch(0).deltaPosition;
if (touchDelta.x / Screen.width >= 0.25f)
{
    // Swiped right more than a quarter of the screen width
    Debug.Log("Horizontal Swipe");
}
if (touchDelta.y / Screen.height >= 0.5f)
{
    // Swiped up more than half the screen height
    Debug.Log("Vertical Swipe");
}

If device-by-device resolution is important and you're developing for iOS only, the most specific device type information is available by requesting iPhoneSettings.generation. Be aware that if you take this approach, you'll need to keep updating your program as new devices/generations are released. You can also divide Screen.width/height by the device's physical screen size (which you'd have to hardcode as a constant for each of iPhone and iPad), which historically changes much less frequently.
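As a rough sketch of that approach, using the legacy iPhoneSettings.generation API the question mentions (the DPI numbers below are approximate hardcoded assumptions you would have to maintain yourself as new hardware ships):

```csharp
// Per-device DPI lookup; values are approximations and the switch
// must be extended for each new device generation.
float GetScreenDPI()
{
    switch (iPhoneSettings.generation)
    {
        case iPhoneGeneration.iPhone4:  return 326f; // Retina iPhone
        case iPhoneGeneration.iPad1Gen: return 132f; // original iPad
        default:                        return 163f; // pre-Retina iPhones
    }
}

// Convert a pixel delta into physical inches, so a one-inch swipe
// reads the same on every device regardless of pixel density.
Vector2 DeltaInInches(Vector2 pixelDelta)
{
    return pixelDelta / GetScreenDPI();
}
```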

As an example, let's say you made a game for a resolution of (400x200)

First, convert pixels to normalized screen units. This way, tapping the exact center of the screen would give you coordinates (0.5,0.5). To do this, just divide each component of deltaPosition by the resolution of the screen in that direction. Try this:

Touch touch = Input.GetTouch(0);
Vector2 newDeltaPosition = new Vector2(touch.deltaPosition.x / Screen.width, touch.deltaPosition.y / Screen.height);

Second, multiply it by your intended screen resolution:

newDeltaPosition = new Vector2(newDeltaPosition.x * 400f, newDeltaPosition.y * 200f);
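Putting both steps together as one small helper (the 400×200 target resolution is just this example's assumption):

```csharp
// Scale a raw pixel delta into this game's 400x200 design-resolution
// units: normalize by the actual screen size, then scale up.
Vector2 ToDesignUnits(Vector2 pixelDelta)
{
    return new Vector2(
        pixelDelta.x / Screen.width  * 400f,
        pixelDelta.y / Screen.height * 200f);
}
```

This way the same physical swipe produces the same delta in design units whether it happens on an iPhone 3GS, iPhone 4, or iPad.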

Hope it helps a bit...

As already stated herein, touch.deltaPosition is measured in pixels, which denser screens have more of. So you'll need to scale deltaPosition by the screen dimensions on X and Y. But it generally doesn't have to be precise, just close. As such, I've had a lot of luck using futzy stuff like this:

function LateUpdate() {
    // deltaX and deltaY hold the raw touch deltaPosition, captured elsewhere
    var adjustedDeltaX = (deltaX / Screen.width) * Time.deltaTime;
    var adjustedDeltaY = (deltaY / Screen.height) * Time.deltaTime;
}

Here, adjustedDeltaX and adjustedDeltaY come out roughly proportional to the physical distance swiped, regardless of screen size or pixel density.

Thought I would add my 2 cents on this, since this question came up in Google for me, but I came up with a different solution that seems to work.

I was looking for a conversion from screen space to world space. In my case, I'm building a 2D game, so I'm using an orthographic camera.

As per the Unity docs, the orthographic size is defined as half the vertical viewing volume (so twice it gives the full vertical extent), which means one screen pixel corresponds to this many world units vertically:


float screenToWorldY = 2f * camera.orthographicSize / Screen.height;

And for the horizontal direction, you must incorporate the camera's aspect ratio:

float screenToWorldX = 2f * camera.orthographicSize * camera.aspect / Screen.width;
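A sketch of putting those two factors to use, e.g. converting a touch's pixel delta into world units so drags feel identical at any resolution (the camera field is an assumption, assigned in the inspector):

```csharp
// Orthographic camera used for the conversion, assigned in the inspector.
public Camera cam;

// Converts a pixel-space delta into world-space units under an
// orthographic projection.
Vector2 DeltaToWorld(Vector2 pixelDelta)
{
    float screenToWorldY = 2f * cam.orthographicSize / Screen.height;
    float screenToWorldX = 2f * cam.orthographicSize * cam.aspect / Screen.width;
    return new Vector2(pixelDelta.x * screenToWorldX,
                       pixelDelta.y * screenToWorldY);
}
```

Note that 2 * orthographicSize * aspect / Screen.width and 2 * orthographicSize / Screen.height are actually the same number (aspect is width/height), which is a handy sanity check.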