Why is my weighted loot table only returning the last index on the list?

I’m trying to build a very basic weighted loot drop system, where each item gets a positive weight and the weights are compared against each other, so that three items with weights (50, 50, 100) drop in approximately the same ratios as three items with weights (1, 1, 2):

public int GetSpriteIndex(List<SpriteSpawn> listOfSprites)
{
    int index = 0;
    int total = 0;
    for (int i = 0; i < listOfSprites.Count; i++)
    {
        total += listOfSprites[i].chanceToSpawn;
    }

    int value = (int)Random.value * total;
    for (int i = 0; i < listOfSprites.Count; i++)
    {
        value -= listOfSprites[i].chanceToSpawn;
        if (value <= 0)
        {
            index = i;
        }
    }
    return index;
}
That looks right to me, and I’ve seen various versions of this used for years, but for some reason it only ever returns the last index of the given list. Did I mess up something obvious, or is this entire approach flawed?

Replace this:

int value = (int)Random.value * total;

with:

int value = (int)(Random.value * total);

You’re casting the float to int before the multiplication, so (int)Random.value truncates to 0 (Random.value is a float between 0 and 1), which means value is always 0.
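One more thing worth flagging: the second loop never stops once value has dropped to or below zero, so every later item keeps overwriting index, and that by itself is enough to make the method return the last index every time. Here’s a minimal sketch of the method with both changes applied, keeping your SpriteSpawn / chanceToSpawn names and assuming the weights are non-negative ints:

    public int GetSpriteIndex(List<SpriteSpawn> listOfSprites)
    {
        // Sum all the weights first.
        int total = 0;
        for (int i = 0; i < listOfSprites.Count; i++)
        {
            total += listOfSprites[i].chanceToSpawn;
        }

        // Multiply first, then cast: (int)Random.value would truncate to 0.
        int value = (int)(Random.value * total);

        // Subtract weights until we cross zero, then stop immediately so
        // later items can't overwrite the result.
        for (int i = 0; i < listOfSprites.Count; i++)
        {
            value -= listOfSprites[i].chanceToSpawn;
            if (value <= 0)
            {
                return i;
            }
        }

        // Fallback for an empty list; with a non-empty list the loop always returns.
        return 0;
    }

With this subtract-and-return pattern each item is picked with probability roughly chanceToSpawn / total, which gives you the (50, 50, 100) vs (1, 1, 2) equivalence you’re after.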

On a side note, nothing personal:

It always puzzles me that people don’t reach for even the simplest debugging techniques. Setting a breakpoint and stepping through the code, or in this case a simple Debug.Log, would have been enough to figure out what was going wrong.
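For example, dropping a couple of temporary log lines into your original method (just a quick sketch of your existing loop with logging added) would have shown value printing as 0 and the if branch firing on every iteration:

    // Inside the original GetSpriteIndex, right after computing value:
    int value = (int)Random.value * total;
    Debug.Log("total = " + total + ", value = " + value);   // prints "value = 0" on every call

    for (int i = 0; i < listOfSprites.Count; i++)
    {
        value -= listOfSprites[i].chanceToSpawn;
        if (value <= 0)
        {
            Debug.Log("index set to " + i);                  // fires on every iteration, not just the first hit
            index = i;
        }
    }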