Hi,

I have two ints, and when I divide them I need to get a float. But if I use normal division I get an int with no decimals. What can I do?

Thanks a lot

Just cast to float before you do the division.

Something like this:

```
int i1 = 10;
int i2 = 3;
float f = ((float) i1) / ((float) i2);
```

Then f will be 3.33333 (rather than 3, which is what plain integer division of 10 by 3 would give).
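A pitfall worth noting here: the cast has to apply to an operand, not to the finished division. A quick sketch in Java (the variable names are just illustrative; the same rule applies in C#):

```java
public class CastOrderDemo {
    public static void main(String[] args) {
        int i1 = 10;
        int i2 = 3;

        // Casting AFTER the division is too late: i1 / i2 is already
        // computed with integer division (giving 3) before the cast runs.
        float tooLate = (float) (i1 / i2);   // 3.0

        // Casting an operand BEFORE the division promotes the whole
        // expression to float arithmetic.
        float correct = ((float) i1) / i2;   // 3.3333333

        System.out.println(tooLate);
        System.out.println(correct);
    }
}
```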

Hey Infinity

What’s happening is that because you’re dividing an int by another int, the result is assumed to be an int as well.

To get your result as a float, just cast your first int to a float like so:

```
float answer = ((float)intOne) / intTwo;
```

The division is then treated as a float operation, so the other operand is implicitly converted to float as well.
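The point that casting a single operand is enough can be sketched like this in Java (the thread's snippets look like C#, but the promotion rule is the same):

```java
public class DivisionDemo {
    public static void main(String[] args) {
        int intOne = 10;
        int intTwo = 3;

        // Both operands are ints: integer division truncates the fraction.
        int truncated = intOne / intTwo;            // 3

        // Casting just ONE operand promotes the whole expression to float.
        float promoted = ((float) intOne) / intTwo; // 3.3333333

        System.out.println(truncated);
        System.out.println(promoted);
    }
}
```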

This is an old question, but since it was the first result when I googled, I’d like to add the UnityScript (JS) solution:

```
var c : float = parseFloat(a) / b;
```

or

```
var c : float = (a+0.0) / b;
```

Additional Info:

Casting in UnityScript is usually done through “variableName” as “ClassName”,

but since int and float are primitives, this WON’T work:

```
a as float
```

Source (Answer by Eric, additional Info by Dreamora)