Hi devs!
To try out Hash128 in my code, I tested something simple, but the result was a little weird.
I assumed Hash128.Parse() returns a UNIQUE hash (Hash128) for each input string, but it didn't.
The test code is below.
Hash128 a = Hash128.Parse("TestA");
Hash128 b = Hash128.Parse("TestB");
Debug.Log( "a: " + a.ToString() );
Debug.Log( "b: " + b.ToString() );
Debug.Log( "equal? " + a.Equals( b ) );
And here is the resulting log.
a: 0e000000000000000000000000000000
b: 0e000000000000000000000000000000
equal? True
Am I using Hash128 wrong?
If so, please tell me how to use Hash128 correctly, or which hash function Hash128 uses.
Thx!
Here’s an update. It looks like you can use Hash128’s Compute method with one of the various overloads. @Bunny83 It appears that Unity is using the SpookyV2 hashing algorithm, also mentioned in the link I shared.
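To illustrate the difference: a minimal sketch assuming the Unity runtime, using the Hash128.Compute(string) overload (Compute actually hashes its input, unlike Parse, which only reads a hex string):

```csharp
using UnityEngine;

public class Hash128ComputeExample : MonoBehaviour
{
    void Start()
    {
        // Compute hashes the string contents, so different
        // inputs yield different Hash128 values.
        Hash128 a = Hash128.Compute("TestA");
        Hash128 b = Hash128.Compute("TestB");

        Debug.Log("a: " + a);
        Debug.Log("b: " + b);
        Debug.Log("equal? " + (a == b)); // False: distinct inputs, distinct hashes
    }
}
```

This gives the "unique hash per string" behavior the original question expected from Parse.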
No, Hash128 is just a struct that stores an AssetBundle hash. It can't calculate a new hash. Parse is simply the inverse of ToString: all "Parse" does is convert the string representation of a hash back into the numeric struct representation. Again, it does not calculate a hash.
Just to make it clearer, here's an example. I created an arbitrary hash manually and converted it to a string:
var h = new Hash128(0x12345678, 0xABCDABCD, 0xDEADBEAF, 0x55AA55AA);
string str = h.ToString();
Debug.Log("str: " + str);
var h2 = Hash128.Parse(str);
Debug.Log("equal? " + (h == h2));
In the console you will see:
str: 78563412cdabcdabafbeaddeaa55aa55
and
equal? True
As you can see, ToString simply converts the four uint values to hex, and Parse does the inverse, converting the hexadecimal representation back into a hash.
In your case, "TestA" is not a valid hex string. However, the parser did pick up the "e" in your string as a hex digit, which is the only character it could read.