# Targeting screen differences between iPhone 4 and 5

I know these types of questions have been asked a lot, and believe me I have spent all evening searching and trying to find a practical example, but so far my search has turned up empty (my best lead being this question on Stack Overflow: android - How to Work With Different Screen Resolutions - Stack Overflow).

But can anyone show, in a practical example, how to work out the correct scale for a game?

For instance, let’s assume I have a single sprite that is 256x256 px (with pixels-to-units set at 256). Regardless of the device (iPhone 4 or 5), with the device in portrait mode, how would I calculate my scale so that my game fits 6 of these sprites across the screen, with a background that fills up the remaining space if need be?

So the portrait resolution of the iPhone 5 is 640x1136. The scale is always based on height. In my example that works, I currently have the scale set to 8.875. I arrived at this number by the following maths:

1136 / 256 (height over pixels-to-units)
= 4.4375

4.4375 * 2 = 8.875

Now, I just guessed the * 2, basing it on the fact that the scale is half the height, but I’m still not sure if this is correct. I feel like the 640 should appear somewhere in my calculation. Can anyone explain, with numbers, how I would work this out so that my scale always fits the 640 width on both the iPhone 4 and 5? I am aware that I would need either extra vertical or horizontal background on one or both of the devices to accommodate this, but I can’t figure out how to work out the minimum. I tried setting the player settings to use 640 as the width, and this looked fine on an iPhone 4, but it was stretched vertically on the iPhone 5.
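To sanity-check the “half the height” idea numerically, here’s the calculation written out as a quick script (just an illustration of the arithmetic, assuming the camera’s size really is half the visible height in world units):

```python
# Quick numeric check, assuming the orthographic size is HALF the
# visible height in world units (an assumption, not verified here).
def pixel_perfect_size(screen_height_px, pixels_per_unit):
    # Full visible height in world units, then halved for the Size property.
    return screen_height_px / pixels_per_unit / 2

# iPhone 5 portrait (640x1136), sprite imported at 256 pixels-to-units:
print(pixel_perfect_size(1136, 256))  # 2.21875
```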

Thanks

What do you mean by ‘scale’ here?

So, you have a 2x3 grid of tiles. That means your game is effectively 2:3 aspect ratio, meaning the game runs in portrait mode. So, you can get the screen dimensions of the device and compute the aspect ratio. If you have an iPhone 5 with a 640x1136 display, then you’ll end up with black bars at the top and bottom of the screen, because a 2:3 aspect ratio is wider (less tall) than a 640:1136 (40:71) aspect ratio. Or put another way: if your tiles were 320x320, then they’d fit perfectly into the iPhone 4S screen, since 3x320=960. Since the iPhone 5 screen is also 640 wide, your 3x320 tiles will still be 960 tall, leaving 176 pixels spare vertically, so you’d shift the tiles down (or up) to make an 88-pixel-high bar at the top and bottom.
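That letterboxing arithmetic as a quick sketch (assuming 320x320 tiles stacked 3 tall, as in the example above):

```python
# Sketch of the letterbox arithmetic (assumed 320x320 tiles, 3 tall).
def bar_height(screen_height_px, tile_px, tiles_tall):
    content_h = tiles_tall * tile_px      # 3 * 320 = 960 px of tiles
    spare = screen_height_px - content_h  # vertical pixels left over
    return spare / 2                      # split into a bar top and bottom

print(bar_height(1136, 320, 3))  # 88.0 on iPhone 5
print(bar_height(960, 320, 3))   # 0.0 on iPhone 4S -- a perfect fit
```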

(And now I’ve written that I’m not sure you have a 3x2 grid of tiles…)

No idea where the *2 or 4.515625 come from.

If you want 6 tiles across the width of the screen in portrait, then each one is 640/6 ≈ 107 pixels wide (the same on both devices, since the iPhone 4S and iPhone 5 are both 640 pixels wide in portrait).

I mean: what scale should I set the camera to so that, regardless of whether the user is on an iPhone 4S or an iPhone 5, they will always see the full width of the game, even if there is extra background?

Graham, I’m not sure where you got the 2x3 grid of tiles from. If I wanted all 6 to sit next to each other horizontally across the screen, you’ve basically calculated what size the tiles should be based on the dimensions of the screen. But 6 is just an arbitrary number here. If I were making a game for multiple platforms, I might already have the assets at a certain size; what I’m trying to work out is how you calculate what the scale of the camera should be, based on those sizes and the fact that you always want a minimum playable area.

For instance, if the tiles were 64x64, then a camera scale of 5 for the iPhone 5 would show all the boxes, and this would also work for the iPhone 4 (though with more space around the sides).

Just by trial and error, the best fit for the iPhone 5 would be roughly a camera scale of 4.5 rather than 5, because the scale is always based on the height rather than the 640 width. My point is that there must be some kind of calculation that would get me to the 4.5 rather than trial and error, bearing in mind that for the iPhone game I always want to show a minimum of 5 blocks across the screen regardless of the device, and I know I need to calculate this number against the iPhone 5 dimensions as it is taller.

For what it’s worth, the perfect fit on the iPhone 5 (based on having 5 64x64 px squares horizontally) is actually a camera scale of 4.4375.

I arrived at this number by doing

screenHeight / pixelsToUnits = a

a / 2 = b

b / 2 = c

e.g. 1136 / 64 = 17.75 (the screen height in world units; however, the orthographic camera’s Size property measures half the height of the screen, so it should be halved)
17.75 / 2 = 8.875
8.875 / 2 = 4.4375

Now, although I’ve got this number, I’m not sure why it needs to be divided by 2 a second time. If the answer were 8.875 it would make more sense to me, but it’s almost like the calculation should be

( screenHeight / (pixelsToUnits * 2) ) / 2
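Writing the whole thing out as a function makes the numbers reproducible. This is only my sketch of the general rule, derived from the aspect ratio rather than anything Unity-specific: my guess is the second division by 2 is really the zoom factor 640 / (5 * 64) = 2, i.e. going from the pixel-perfect view (10 tiles across) to showing only 5:

```python
# Sketch of a "fit N tiles across the width" camera-size calculation.
# Assumption: the camera size is half the visible HEIGHT in world units,
# so we compute the desired half-width and scale it by the aspect ratio.
def camera_size(screen_w, screen_h, tile_px, pixels_per_unit, tiles_across):
    half_width_units = tiles_across * tile_px / pixels_per_unit / 2
    return half_width_units * screen_h / screen_w

print(camera_size(640, 1136, 64, 64, 5))  # 4.4375 (iPhone 5)
print(camera_size(640, 960, 64, 64, 5))   # 3.75   (iPhone 4/4S)
```

Taking the larger of the two values (the iPhone 5 one) as a fixed scale would then guarantee at least 5 tiles are visible across the width on both devices.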

For what it’s worth I’ve been looking at this Ray Wenderlich tutorial http://www.raywenderlich.com/61532/unity-2d-tutorial-getting-started (search for “Fix Your Camera’s Projection” on the page to find the info about scale)

Thanks