Reading & Coding against MaskFields...

I have a masked field in my inspector and it appears to be generating the appropriate integer value based on the selections (none, one, or more). The question is, how does one compare the MaskField integer value to other sources?

That is, given a MaskField list of terrain texture layers (0, 1, 2, 3 and only 1 and 2 are checked/selected), how would I compare the terrain texture layer mix at a given terrain coordinate (e.g. an array holding the mix percentages for each texture layer… 0, .25, .75, 0) to the list’s MaskField selections (value)?

The intended use is randomly & procedurally placing objects only on certain terrain textures above a certain strength. So in the example given, if texture layers 1 & 2 were both set with a .5 minimum mix, it would fail for texture layers 0, 1, & 3 (as their mixes are less than .5) but pass for layer 2 (as it is above .5).

Note, I already know how to get the texture mix for any given position. I may not be populating the MaskField correctly and/or may not be performing the comparison correctly.

A masked enum uses bitmasking in order to hold multiple values. I might need a bit more information on how you’re doing this, but you wouldn’t be able to match a bitmasked enum to a float value, because the bitmask essentially requires each option to be a power of two. So you could use the bitmasked enum to check whether a layer is being used or not, but not its actual mix value, if I understand your situation correctly. This is good reading on usage: Enum, Flags and bitwise operators - Alan Zucconi
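To make that concrete for your terrain case, here’s a minimal sketch. The names selectedLayers, mix, and minStrength are mine, not from your project — selectedLayers is the int the MaskField gives you, mix is the per-layer strength array at the coordinate (e.g. {0f, .25f, .75f, 0f}):

```csharp
// Sketch only: tests each texture layer's bit in the MaskField value,
// then checks that layer's mix strength against the minimum.
bool PassesAnyLayer(int selectedLayers, float[] mix, float minStrength)
{
    for (int layer = 0; layer < mix.Length; layer++)
    {
        bool layerSelected = (selectedLayers & (1 << layer)) != 0; // bit test
        if (layerSelected && mix[layer] >= minStrength)
            return true;
    }
    return false;
}
```

With layers 1 and 2 ticked, selectedLayers is (1 << 1) | (1 << 2) == 6; with the example mix and a .5 minimum, only layer 2 passes, so this returns true.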

To give my own example, I have this in my current project:

[Flags]
public enum GameMode
{
      NONE = 0
    , FP = 1
    , MENU = 2
    , CUTSCENE = 4
    , QUITTING = 8
}

Note the [Flags] attribute – you must put using System; at the top of your file in order to use it. Also note how each enum member has a power of two assigned as its value. This is necessary for bitmasking, since each option occupies its own bit. (This is also why Unity only allows 32 layers – one bit per layer in a 32-bit int!)
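A quick illustration of why the values must be powers of two — each flag lives in its own bit, so combinations never collide (using the GameMode enum above):

```csharp
GameMode combined = GameMode.MENU | GameMode.CUTSCENE; // 2 | 4 == 6, binary 110
bool hasMenu = combined.HasFlag(GameMode.MENU);        // true  (bit for 2 is set)
bool hasFp   = combined.HasFlag(GameMode.FP);          // false (bit for 1 is not set)
```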

I have a parent UI window class from which all my other UI window classes inherit. In this class is this:

protected virtual GameMode compatible => GameMode.FP | GameMode.MENU;

(note that it being virtual allows it to be overridden in child classes)

To open a window inheriting from this class, I call

public virtual void Open(float speed = 0.5f)
{
    if (IsOpen || Opening) return; // don't re-open while the open routine is running, because it calls Open at the end of it
    if (compatible.HasFlag(GameState.GameMode) || debugMode)
    {
        OpenSetup();
        ShowCanvasGroup(canvasGroup, speed, onComplete: () => {
            OnDoneOpening();
        });
        foreach (RSAWindow window in openOnOpen) window.Open(speed: speed);
    }
}

The important part is the compatible.HasFlag(GameState.GameMode) check.

So that covers how to check whether a single flag is included in a masked enum. As for actually creating the masked enum, you’ll note that a single pipe | more or less combines the options on either side of it. Thus the default compatible will return true for both compatible.HasFlag(GameMode.FP) and compatible.HasFlag(GameMode.MENU).
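If you ever need to test whether any of several flags overlap (rather than one specific flag), a bitwise AND does the job — a small sketch with the same enum:

```csharp
GameMode wanted  = GameMode.FP | GameMode.MENU;
GameMode current = GameMode.MENU;
bool anyMatch = (wanted & current) != 0; // true: MENU appears in both masks
```

HasFlag reads more nicely for a single flag; the & form generalizes to whole masks.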

If you’re using actual integers rather than an enum, as with a LayerMask, the format is slightly more complicated. Here’s another example:

public LayerMask layerMask = 1 << MOUSEOVER_LAYER | 1 << PLATFORMS_LAYER;

This will put a LayerMask field in the inspector that defaults to having MOUSEOVER_LAYER and PLATFORMS_LAYER selected. (I’ve defined these as int constants elsewhere in the class.)
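For completeness, a sketch of testing an object against such a mask — the constant values here are hypothetical, not from my actual project:

```csharp
const int MOUSEOVER_LAYER = 8; // hypothetical layer indices
const int PLATFORMS_LAYER = 9;
LayerMask layerMask = 1 << MOUSEOVER_LAYER | 1 << PLATFORMS_LAYER;

// True if the GameObject's layer index has its bit set in the mask.
bool InMask(GameObject go) => (layerMask.value & (1 << go.layer)) != 0;
```

This is the same bit-test pattern as with the enum, just done with raw ints and explicit shifts.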

I’m not sure if that’s addressing your question, though. If not, can you give some more details about how things are set up on your end?