# Scaling my background Sprite to fill screen? (2d)

So I’m trying to scale my background sprites to fill the screen. There are two: a regular background image, and a sprite that lies over it with alpha transparency to fake shadows (since lighting doesn’t apply to 2D sprites). The issue is I can’t seem to find that magical float to multiply my sprite width/height by to get the desired size (maybe I just suck at math). My first attempt was:

``````
percentW = (float)scrW / (float)imgW;
percentH = (float)scrH / (float)imgH;
``````

which gives me 0.3125/0.3125

but playing with Inspector variables, I found that for this particular resolution I need 0.8/0.8. So I looked online and found this code:

``````
float height = 2.0f * Mathf.Tan(0.5f * Camera.main.fieldOfView * Mathf.Deg2Rad);
float width = height * scrW / scrH;
``````

which gives me 0.649 for height. Closer, but still not perfect. I can hard-code a close approximation by multiplying by 1.3 to get 0.84, but I want 0.8 almost exactly, and it’s driving me nuts.

This is going to be an Android title, meaning tons and tons of varying resolutions, so of course resolution independence is of the utmost importance. Has anyone figured out how to do this yet?

I’ve not spent a lot of time with the 2D stuff yet, but it seems to me that you have to use the world size of the sprite to make this calculation. In particular, since you can specify `Pixels to Units` when you import a sprite texture, the world mapping needs to use `sprite.bounds`.
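To make the relationship concrete, here’s a small C# sketch (the values in the comments are just an illustrative example, not from the question):

``````
// A sprite's world-space size is its texture size divided by Pixels to Units.
// E.g. a 1024x512 texture imported at 100 Pixels to Units gives world bounds
// of 10.24 x 5.12 units, regardless of screen resolution.
SpriteRenderer sr = GetComponent<SpriteRenderer>();
float worldWidth  = sr.sprite.bounds.size.x;  // ~ texture width  / pixels-to-units
float worldHeight = sr.sprite.bounds.size.y;  // ~ texture height / pixels-to-units
``````

That’s why a raw screen-pixels / image-pixels ratio doesn’t give the scale you expect: the scale has to be computed in world units, not pixels.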

Here’s some sample code. There may be a simpler solution. It assumes an `Orthographic` camera centered on the image parallel to the camera plane:

``````
function ResizeSpriteToScreen()
{
    var sr = GetComponent(SpriteRenderer);
    if (sr == null) return;

    transform.localScale = Vector3(1, 1, 1);

    // World-space size of the sprite (accounts for Pixels to Units)
    var width = sr.sprite.bounds.size.x;
    var height = sr.sprite.bounds.size.y;

    // World-space size of the screen for an orthographic camera
    var worldScreenHeight = Camera.main.orthographicSize * 2.0;
    var worldScreenWidth = worldScreenHeight / Screen.height * Screen.width;

    transform.localScale.x = worldScreenWidth / width;
    transform.localScale.y = worldScreenHeight / height;
}
``````

P.S. I see you are trying to use `fieldOfView`. If you are using a perspective camera, this code will need to be modified.
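For anyone who does need the perspective case: a rough sketch of the modification, assuming the sprite sits on a plane facing the camera. The frustum-height formula you found is missing the distance factor, which is likely where the constant offset came from:

``````
// Frustum height at the sprite's distance from a perspective camera:
// worldScreenHeight = 2 * distance * tan(fov / 2)
float distance = Mathf.Abs(transform.position.z - Camera.main.transform.position.z);
float worldScreenHeight = 2.0f * distance * Mathf.Tan(0.5f * Camera.main.fieldOfView * Mathf.Deg2Rad);
float worldScreenWidth = worldScreenHeight / Screen.height * Screen.width;
``````

The snippet from the question effectively assumed a distance of 1 unit; the rest of the scaling code stays the same.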

Many, many thanks to robert. It helped me a lot.

This is the C# implementation of robert’s code:

``````
void Resize()
{
    SpriteRenderer sr = GetComponent<SpriteRenderer>();
    if (sr == null) return;

    transform.localScale = new Vector3(1, 1, 1);

    float width = sr.sprite.bounds.size.x;
    float height = sr.sprite.bounds.size.y;

    float worldScreenHeight = Camera.main.orthographicSize * 2f;
    float worldScreenWidth = worldScreenHeight / Screen.height * Screen.width;

    // In C# you can't assign to transform.localScale.x directly,
    // so copy the vector, modify it, and assign it back
    Vector3 scale = transform.localScale;
    scale.x = worldScreenWidth / width;
    scale.y = worldScreenHeight / height;
    transform.localScale = scale;
}
``````

I realise this is almost a decade later, but for future reference: the C# implementation is a tad off, and neither script bothers to line the sprite up with the camera.

``````
void Resize()
{
    SpriteRenderer sr = GetComponent<SpriteRenderer>();
    if (sr == null) return;

    var camera = Camera.main;

    // Line up sprite with camera
    transform.position = new Vector3(camera.transform.position.x, camera.transform.position.y, transform.position.z);

    // Get viewport sizes
    float worldScreenHeight = camera.orthographicSize * 2f;
    float worldScreenWidth = worldScreenHeight / Screen.height * Screen.width;

    // Scale sprite
    var spriteSize = sr.sprite.bounds.size;
    Vector3 scale = Vector3.one;
    scale.x = worldScreenWidth / spriteSize.x;
    scale.y = worldScreenHeight / spriteSize.y;

    transform.localScale = scale;
}
``````
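To actually use it, a minimal setup would be something like the following (my own sketch, assuming the script lives on the background sprite's GameObject and the resolution doesn't change at runtime):

``````
public class BackgroundScaler : MonoBehaviour
{
    void Start()
    {
        Resize(); // scale the background once on load
    }

    // Resize() as defined above
}
``````

If the resolution or orientation can change (common on Android), you'd need to call `Resize()` again when that happens rather than only in `Start()`.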