Why don't decimals work?

I've recently started using Unity, and I've run into one of the biggest problems I've faced so far: decimals. Whenever I try to write 0.5 instead of 1, it instantly stops working. Why don't decimals work? Here is how my code looks; it's really simple. The problem is at spawnObj(grass, x, height + .5); - with 1 it spawns too high, and I need it in the middle. Any tips?

using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class ProceduralGeneration : MonoBehaviour
{
    [SerializeField] int width, height;
    [SerializeField] GameObject dirtblock, grassblock, grass;
    void Start()
    {
        Generation();   
    }

    void Generation()
    {
        for (int x = -100; x < width; x++)
        {
            int minHeight = height - 1;
            int maxHeight = height + 2;

            height = Random.Range(minHeight, maxHeight);

            for (int y = -10; y < height; y++)
            {
                spawnObj(dirtblock, x, y);
            }
            spawnObj(grassblock, x, height);
            spawnObj(grass, x, height + .5); // this is the line that breaks
        }
    }
    void spawnObj(GameObject obj, int width, int height)
    {
        obj = Instantiate(obj, new Vector2(width, height), Quaternion.identity);
        obj.transform.parent = this.transform;
    }
}

@WolfTraits C# is a statically typed language, meaning you have to declare what data type a variable holds when you declare the variable. You've already gotten the hang of this with int, for example when you declared the minHeight and maxHeight ints.
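For example, these are all explicit type declarations, so the compiler knows exactly what kind of value each variable can hold (the names here are just for illustration):

int blocksWide = 10;      // whole numbers only
float offset = 0.5f;      // fractional numbers
GameObject grassPrefab;   // a reference to a Unity object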


The main benefit of this is safety - your code won't break quite as easily when it balloons into something much more complex. However, it also means you need to be careful about what kind of data you try to store in each variable. There are two issues with the decimal you are trying to use. The first is that you are adding a decimal to a variable of type int, and ints can only hold whole numbers. Integers are an important concept in math as well as programming, and I'd suggest reading up on their basic properties.
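To make that concrete, here is a minimal sketch of what the compiler sees (the variable name just mirrors the one in your script):

int height = 3;

// .5 is a double literal, so height + .5 produces a double.
// A double can't be implicitly narrowed back down to an int, so this line
// fails to compile with "cannot implicitly convert type 'double' to 'int'":
int grassY = height + .5;

// The same thing happens when you pass height + .5 into spawnObj,
// because its third parameter is also declared as an int.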


The second is that there are different kinds of "decimal" numbers in C#. The most common (and what you should probably be using here) is called a float. To declare a float instead of an int, you just write float height; instead of int height;. And when you assign a literal value to a float, you put an f after the number, like so: float height = 1.5f;. Floats are the right choice most of the time, but there are other fractional types too, such as double (the default for a literal like 1.5) and decimal (for which you put an m after the number).
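Here is a quick side-by-side of those literal suffixes:

float a = 1.5f;     // float: needs the f suffix
double b = 1.5;     // double: the default for a literal like 1.5
decimal c = 1.5m;   // decimal: needs the m suffix, mostly used for money/exact values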

Most of Unity works with floats and integers. You would have to use 0.5f instead of 0.5 - that marks the literal as a float (what Unity's math types use) instead of a double (the default in C#). There are reasons why floats are the better fit in this environment, as explained in the other answers.
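If it helps, here is one way the code could look once the types line up. This is just a sketch against the script you posted (I've renamed the parameters to x and y for clarity): the y coordinate becomes a float so a 0.5f offset is allowed, and the call site uses 0.5f.

// spawnObj now takes a float for the y position,
// so fractional offsets like 0.5f compile fine.
void spawnObj(GameObject obj, int x, float y)
{
    obj = Instantiate(obj, new Vector2(x, y), Quaternion.identity);
    obj.transform.parent = this.transform;
}

// ...and the grass call inside Generation():
spawnObj(grass, x, height + 0.5f);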