Please help me make sense of this. I've tried reading Microsoft's documentation, but I still don't see how the following works.
Why does the first Debug.Log line return both fields, while each flag used on its own returns nothing?
using System;
using System.Reflection;
using UnityEngine;

public class A
{
    protected int fu;
    protected int bar;
}

public class B : MonoBehaviour
{
    void Awake()
    {
        Type typeA = typeof(A);

        // Both flags combined: finds the two protected instance fields.
        Debug.Log(typeA.GetFields(BindingFlags.NonPublic | BindingFlags.Instance).Length);

        // Each flag on its own: finds nothing.
        Debug.Log(typeA.GetFields(BindingFlags.NonPublic).Length);
        Debug.Log(typeA.GetFields(BindingFlags.Instance).Length);

        // Outputs respectively:
        // 2
        // 0
        // 0
    }
}
As it says in the BindingFlags page you linked, in a note box: "You must specify Instance or Static along with Public or NonPublic or no members will be returned."
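In other words, GetFields needs one flag from each group: a visibility flag (Public or NonPublic) and a scope flag (Instance or Static). Here is a minimal, self-contained sketch (plain C# rather than Unity, with a made-up class C) showing which combinations actually return fields:

using System;
using System.Reflection;

class C
{
    protected int x;     // a non-public instance field
    public static int y; // a public static field
}

class Demo
{
    static void Main()
    {
        Type t = typeof(C);

        // One flag from each group: members are found.
        Console.WriteLine(t.GetFields(BindingFlags.NonPublic | BindingFlags.Instance).Length); // 1 (x)
        Console.WriteLine(t.GetFields(BindingFlags.Public | BindingFlags.Static).Length);      // 1 (y)

        // A single flag from only one group: nothing is found.
        Console.WriteLine(t.GetFields(BindingFlags.NonPublic).Length); // 0
        Console.WriteLine(t.GetFields(BindingFlags.Instance).Length);  // 0
    }
}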
As for the | operator on BindingFlags: because BindingFlags is a [Flags] enum, each named value occupies its own bit, so the | operator can weave several flags together into a single value. A bitwise operation compares the two operands digit by digit to produce the result. For example, take the two binary values 1001 and 0011:

1001 | 0011 = 1011 (| yields 1 where either digit is 1, like OR)
1001 & 0011 = 0001 (& yields 1 only where both digits are 1, like AND)
~1001 = 0110 (~ flips each digit to its complement, like NOT)
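You can see this with the actual BindingFlags values (Instance is 4, binary 000100; NonPublic is 32, binary 100000). A quick sketch:

using System;
using System.Reflection;

class BitwiseDemo
{
    static void Main()
    {
        // Instance = 4 (000100), NonPublic = 32 (100000).
        BindingFlags combined = BindingFlags.Instance | BindingFlags.NonPublic;

        // 000100 | 100000 = 100100, i.e. 36: both bits survive in one value.
        Console.WriteLine((int)combined);                       // 36
        Console.WriteLine(Convert.ToString((int)combined, 2));  // 100100

        // The [Flags] attribute makes ToString list each set bit by name.
        Console.WriteLine(combined);                            // Instance, NonPublic

        // & picks a bit back out, which is how code typically tests a flag.
        Console.WriteLine((combined & BindingFlags.Instance) != 0); // True
    }
}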
Ahhhh, that explains it. I completely missed that, thank you.
Amazing, thank you! I couldn't make sense of this for the life of me. I did some further reading with the info you provided to figure out why the attribute is used and how bitwise OR on enums works, and at last I finally understand.
In case anyone with a similar issue ever ends up on this thread, here’s a little addition to Joshua’s answer, explaining how the Flags attribute is used.
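For example, here is a minimal sketch (the Permissions enum is just a made-up illustration, not from the code above):

using System;

[Flags]
enum Permissions
{
    None    = 0,
    Read    = 1 << 0, // 0001
    Write   = 1 << 1, // 0010
    Execute = 1 << 2  // 0100
}

class FlagsDemo
{
    static void Main()
    {
        // | combines two bits into one value.
        Permissions p = Permissions.Read | Permissions.Execute;

        // [Flags] tells ToString to print each set bit by name.
        Console.WriteLine(p); // Read, Execute

        // & (or HasFlag) checks whether a given bit is set.
        Console.WriteLine((p & Permissions.Write) != 0); // False
        Console.WriteLine(p.HasFlag(Permissions.Read));  // True
    }
}

Without [Flags], combining the values still works (it is plain integer math), but ToString would print the raw number 5 instead of the flag names; the attribute documents that the enum is meant to be combined and makes the output readable.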