Yes, exactly. Unity is not converting the integer value to the corresponding float value (like 3.0); instead, it is reinterpreting the integer's underlying bit pattern as a float.
In IEEE 754, the smallest positive subnormal float is approximately 1.401298E-45 (that is, 2^-149). When you "cast" the integer value to a float at the bit level, you get:
- 0 → 0.0f
- 1 → 1.401298E-45 (bit pattern 0x00000001)
- 2 → 2.802597E-45 (bit pattern 0x00000002)
- 3 → 4.203895E-45 (bit pattern 0x00000003)
- 4 → 5.605194E-45 (bit pattern 0x00000004)
So, the value 4.203895E-45 corresponds to the integer 3, which is your enum value Threshold.
This "weird" conversion happens because Unity is storing the enum's integer value in a float field: the value isn't converted numerically (3 does not become 3.0f); instead, the bits are interpreted directly as the float whose bit pattern equals the integer value.
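You can reproduce this reinterpretation outside Unity. Here is a minimal sketch in Python (standing in for what happens at the C# bit level, e.g. via `BitConverter`), where `struct` packs the integer's 32 bits and unpacks them as an IEEE 754 single-precision float; `int_bits_to_float` and its inverse are illustrative names, not Unity API:

```python
import struct

def int_bits_to_float(i: int) -> float:
    # Pack the 32-bit integer, then unpack those same bytes as a float32
    return struct.unpack('<f', struct.pack('<i', i))[0]

def float_to_int_bits(f: float) -> int:
    # The reverse: recover the integer (i.e., the enum value) from the float's bits
    return struct.unpack('<i', struct.pack('<f', f))[0]

for i in range(5):
    print(i, int_bits_to_float(i))
# 0 → 0.0, 1 → ~1.401298E-45, 2 → ~2.802597E-45,
# 3 → ~4.203895E-45, 4 → ~5.605194E-45
```

So if you see 4.203895E-45 in a serialized float field, `float_to_int_bits` gives you back 3, i.e. the enum value Threshold.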