Elvis operator / Null-Coalescing operator

Null-conditional / null-coalescing operators (‘?.’ and ‘??’) cannot currently be used reliably on serialized fields.

When a MonoBehaviour is deserialized, unassigned fields are not set to null. Instead, they reference a placeholder object whose == operator is overloaded so that comparing it to null returns true (even though it is not actually a null reference).

This leads to some unexpected behaviors like:

using UnityEngine;

public class TestBehaviour : MonoBehaviour
{
    UnityEngine.GameObject reference;

    void Start() {
        if (reference?.transform == null) {
            Debug.Log("It works!"); // This line is printed as expected
        }
    }
}

But changing the reference to public changes the execution path:

using UnityEngine;

public class TestBehaviour : MonoBehaviour
{
    public UnityEngine.GameObject reference;

    void Start() {
        if (reference?.transform == null) { // throws an exception
            Debug.Log("Not executed"); // This line is not reached
        }
    }
}

Exception:
UnassignedReferenceException: The variable reference of TestBehaviour has not been assigned.
You probably need to assign the reference variable of the TestBehaviour script in the inspector.

This null-reference behaviour was probably created to help developers who try to access unassigned references, since the exception message includes the field name. But this ‘helper’ now causes an UnassignedReferenceException in code where no exception should occur.

The ?? operator also does not work as expected in that scenario.

var validReference = optionalReference ?? referencedAttribute;

The code above always stores ‘optionalReference’ in ‘validReference’, regardless of whether ‘optionalReference’ is assigned or not.
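For comparison, a minimal sketch of the difference (assuming ‘optionalReference’ is a serialized field left unassigned in the inspector, and ‘referencedAttribute’ is some valid fallback):

// '??' performs a plain reference-null test, so it sees the placeholder object and keeps it:
var broken = optionalReference ?? referencedAttribute;

// The explicit check goes through UnityEngine.Object's overloaded ==, which treats the placeholder as null:
var working = optionalReference != null ? optionalReference : referencedAttribute;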

My opinion:
this friendly UnassignedReferenceException only appears when you access an unassigned serialized Object field, so the user is still not safe from NullReferenceExceptions elsewhere.

  • It tries to solve a ‘problem’ that is not Unity’s fault (outside its domain)
  • It only helps in a few specific scenarios
  • It creates other problems along the way

Basically not worth it.

What should companies do to avoid such weird bugs? Recommend that their developers not use ‘?.’ and ‘??’ at all? (Yes, I’ve heard that.)

At least give us the option to read fields as null when they are not assigned in the Unity editor.

3 Likes

I think many would agree. This will become quite a lot worse when C# 6 is rolled out and ?. becomes available as well.

Could you do a quick check of what the second version does in builds? I believe Unity does not replace the null with a bullshit “let me help you with your nullref” object in builds.

If that’s the case, maybe the difference between editor time and runtime is enough to get rid of it! It really only helps the people who are struggling through Roll-A-Ball, and is an ugly gotcha for everyone else.

The ?? error has probably always been there; I just haven’t run into it because ?? doesn’t really work with the whole == override thing.

1 Like

I’m wondering what Unity is to do here. I agree that all the bogus null objects should be removed (for serialization and also for GetComponent returns) - we just want to write standard C# (and get rid of useless allocations while we’re at it).

However, you’d still have this problem for Destroy() and your own variables. The null-check overload there is, well… it is convenient, and at this point really impossible to remove.

Maybe now that they’re in the .NET steering group they could ask Microsoft for some mechanism to have ? and ?? compile to the actual Equals or == operator instead of a hardcoded compare. I guess that might impact performance, so it would need to only compile to that for certain types.

This was discussed at some length on the blog a couple of years ago.

Basically, the custom operator is there to avoid the need to double up every single null check with an isAlive check or similar, since the object could have been destroyed on the C++ side of things. In my opinion, the custom operator should be removed and the implicit bool cast that is already in place should be used for that purpose instead.
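For anyone not familiar with it, here is a minimal sketch of that implicit bool conversion in use (the class and field names are just illustrative):

using UnityEngine;

public class BoolCastExample : MonoBehaviour
{
    Transform target; // may be destroyed by something else at any time

    void Update() {
        // UnityEngine.Object's implicit bool conversion returns false both for
        // real null references and for objects destroyed on the C++ side.
        if (target) {
            target.Translate(Vector3.forward * Time.deltaTime);
        }
    }
}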

1 Like

While I agree regarding if (blah == null) - I think with C# 6 the situation is different now. It means that we still couldn’t write m_comp?.Var, defeating the now-common ?. operator.

I really don’t like the operator either, but at the same time Unity has always set the goal that you can just write standard C#. At the moment I agree that this is broken, but maybe with their influence in .NET now they can do something about the underlying problem.

Thanks for bringing this up. We’re having some internal discussions about how best to handle this now, but we don’t have a consensus yet. These new C# 6 operators certainly bring the underlying issue to the forefront. We’re open to more discussion/suggestions about how best to deal with this.

3 Likes

Are you talking about the special handling of null fields in the editor, or the == null override? Or both?

Really just the == null override, but I think everything is open to discussion at the moment.

So the special handling of null fields is the easier one to deal with - just remove it. It’s causing inconsistencies between the editor and the runtime, and it’s causing the profiler to show memory allocations that aren’t there at runtime. The help it gives to new users who don’t know what a nullref is doesn’t seem to amount to much, from what I’m seeing in the scripting forums.

The == null thing is harder. The problem is that it’s really convenient that == null checks for destroyed objects. It has a downside in that it’s surprising for experienced programmers who are new to Unity, and it breaks around interfaces (see discussion here), so I’m a bit back-and-forth on what the good option is.
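To illustrate the interface problem with a quick sketch (IDamageable and Enemy are made-up names, not anything from the linked discussion):

using UnityEngine;

public interface IDamageable { void TakeDamage(int amount); }

public class Enemy : MonoBehaviour, IDamageable
{
    public void TakeDamage(int amount) { }
}

public class InterfaceNullCheck : MonoBehaviour
{
    void Start() {
        Enemy asComponent = new GameObject("enemy").AddComponent<Enemy>();
        IDamageable asInterface = asComponent;

        // DestroyImmediate so the object is already gone when we compare below
        // (Destroy only takes effect at the end of the frame).
        DestroyImmediate(asComponent.gameObject);

        Debug.Log(asComponent == null); // true  - UnityEngine.Object's overloaded ==
        Debug.Log(asInterface == null); // false - plain reference comparison on the interface type
    }
}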

If the goal is that these statements should be equivalent:

var foo = bar != null ? bar : new Foo();
//and
var foo = bar ?? new Foo();

and that these statements should be equivalent:

if(foo != null) {
    foo.DoSomething();
}
//and
foo?.DoSomething();

There are some options:

1: Remove the == override, and have all of the statements above break on references to destroyed objects. This is consistent with how C# works, but it requires retraining all Unity developers (and all of Unity’s own developers), and it breaks every single free and paid plugin that includes scripts.
This also doesn’t make either ? or ?? useful in Unity. It just makes things consistent across the board. It’s not a very good solution.
2: Don’t change anything. This doesn’t break anything, but since the Elvis operator is a lot more useful than ??, I’m assuming that a lot more people will do this and despair:

transformThatHasBeenDestroyed?.Translate(direction); //MissingReferenceException

Which means that no problem has been solved, and ? just can’t be used in Unity (kinda like ?? really can’t be used unless you’re completely certain that the object will never be destroyed).

3: Somehow convince Microsoft to have ? and ?? use the class’ == operator instead of the default one to check for null. This is potential breakage in ALL .NET CODE OUTSIDE UNITY, so the chances are slim.

4: Somehow convince Microsoft to make ? and ?? overridable on a per-class basis, and override them in UnityEngine.Object.
This has the least impact on things, and solves all problems. On the other hand, even if it happens, it probably won’t happen before C# 8, which is probably years off. Unless they’re still taking suggestions for C# 7?

5: Make Unity actually use UnityC#, a language that’s equivalent to C# in every single way except that ? and ?? use the class’ == check. As an added bonus, the == check for object could be changed to do what UnityEngine.Object’s == does.
This option sounds really good, but is also completely off-the-wall insane, so idk.

Is there any other option available?

1 Like

This is the only option I like. At first I thought it was kind of crazy, but then I realized that non-nullable reference types are expected to be one of the main themes of C# 8. Since the null-whatever operators will make absolutely no sense for non-nullable types, the override might become quite handy even outside of Unity.

public class Foo
{
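    // Hypothetical syntax: overloading '?' is not possible in current C#.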
    public static bool operator ?(Foo foo) => foo.IsValid;
}

PS
Oops! It will never happen, because otherwise all the C# code around the world would have to be recompiled. Not good.

There could be another solution: Unity is quite keen on patching assemblies in general, so it could potentially be possible to remove the custom null check on the language side and patch it back in directly at the IL level.

That should allow all the operators to function as expected, at the expense of having even more potentially unexpected behaviour.

Still wouldn’t help when accessing Unity objects through interfaces and the like though.

Does anyone know how the compiler generates the code for the new ?. operator?

For example, why doesn’t this code work out of the box:

transformThatHasBeenDestroyed?.Translate(direction); //MissingReferenceException

Since transformThatHasBeenDestroyed isn’t actually null, and ? probably uses a plain ReferenceEquals-style comparison against null instead of the overloaded ==.

The latest version of the C# spec I can find (here; not official, but the latest official ones are from 2012 and don’t contain the operator) says this about the operator:

Note that the mention is specifically “is non-null”, not “== null resolves to false”. So I’d guess the generated result is equivalent to:

if(!object.ReferenceEquals(transformThatHasBeenDestroyed, null)) {
    transformThatHasBeenDestroyed.Translate(direction);
}

If this “worked” in Unity, i.e. the call was skipped when the object had been destroyed, then Unity would somehow be breaking the C# spec.

There is no solution to this that doesn’t have major drawbacks. I think the UnityC# superset is the most appealing, but I’ve got no idea how much work that would be.

I just ran this test and peeked at the generated code (using ildasm):

.method private hidebysig instance void  Start() cil managed
{
  // Code size       31 (0x1f)
  .maxstack  8
  IL_0000:  nop
  IL_0001:  ldarg.0
  IL_0002:  ldfld      class [UnityEngine]UnityEngine.Transform TestScript::transformThatHasBeenDestroyed
  IL_0007:  brtrue.s   IL_000e
  IL_0009:  br         IL_001e
  IL_000e:  ldarg.0
  IL_000f:  ldfld      class [UnityEngine]UnityEngine.Transform TestScript::transformThatHasBeenDestroyed
  IL_0014:  call       valuetype [UnityEngine]UnityEngine.Vector3 [UnityEngine]UnityEngine.Vector3::get_zero()
  IL_0019:  callvirt   instance void [UnityEngine]UnityEngine.Transform::Translate(valuetype [UnityEngine]UnityEngine.Vector3)
  IL_001e:  ret
} // end of method TestScript::Start

What happens (if I understand correctly) is that an IL instruction (brtrue.s) is used to determine whether the object is non-null, and if so, it jumps to the correct location to continue evaluation.
This completely bypasses Unity’s (or in fact C#’s) operator evaluation and overloading mechanism, so Unity’s overloaded operator is not used in this case.

BTW - this new operator will only behave incorrectly in cases where it is used against a destroyed object. How often do you think developers run into that? For most cases it should be fine, though I still think something should be done about it.

1 Like

There was some chatter about this on Slack in the #code channel on Sep 27.

My question was why the heck the ‘Missing’ state even exists, because it should just be null. It was pointed out that the C++ side of the data can be destroyed but the C# side can’t, because in C# there is no way to find all references to an object and null them; hence the C# wrapper has to stay around (missing/kinda dead) until you drop all of the references to it and the GC collects it.
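A small sketch of that wrapper-outliving-the-native-object situation (using DestroyImmediate only so the effect is visible within the same frame):

using UnityEngine;

public class DestroyedWrapperDemo : MonoBehaviour
{
    void Start() {
        var go = new GameObject("temp");
        var t = go.transform;

        // Destroys the native (C++) object right away; the managed wrapper survives
        // until nothing references it and the GC collects it.
        DestroyImmediate(go);

        Debug.Log(ReferenceEquals(t, null)); // false - the C# wrapper still exists
        Debug.Log(t == null);                // true  - the overloaded == reports it as destroyed
    }
}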

Dunno if they’re looking into a workaround for it or not.

The vast majority of null-checks I write are actually destroyed-checks. Most of them are of the kind “handle the case where the object you were following/checking/managing got destroyed”.

I’ve got a hunch they are.

IMHO, all the proposed solutions are either not very good (e.g. “change the universe to make this work”) or other hacks.

It was said that it’s not technically possible (or desired? due to performance reasons) to scan for all references to a given object and invalidate them (make them null). If every native (C++) object holds a pointer to its managed (C#) counterpart, is it possible to nullify these (e.g. when the object is destroyed)? Or is it the reference that should be nullified as well?

Another (ugly) suggestion - a UnityEngine.Object would get another reference (e.g. this.self or something similar). This would be monitored by the native side and nullified when the object is destroyed.

You would then check by something like:

transformThatHasBeenDestroyed?.self?.Translate(Direction);

Ugly as f** i know :slight_smile:

[quote=“liortal, post:19, topic: 641382, username:liortal”]
It was said that it’s not technically possible (or desired? due to performance reasons) to scan for all references to a given object and invalidate them (make them null). If every native (C++) object holds a pointer to its managed (C#) counterpart, is it possible to nullify these (e.g. when the object is destroyed)? Or is it the reference that should be nullified as well?
[/quote]
The problem is not the reference that the C++ object holds to the C# object, or even the C# object itself - it is the references that other C# objects hold to that C# object. Scanning every other C# object to look for potential references to the object being deleted… well, you’re already familiar with what that looks like: it looks like the garbage collector. So how do you feel about running the GC every single time an object is deleted? :slight_smile:

[quote]
Another (ugly) suggestion - a UnityEngine.Object will have another reference (e.g: this.self or something similar). This will be monitored by the native side and nullified in case the object is destroyed.
[/quote]
That’s an interesting idea - I’d probably want to call it something like “IfAlive()”, so you’d do “transformThatHasBeenDestroyed?.IfAlive()?.Translate(Direction)”. I think this is something you could already do with an extension method though.
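A rough sketch of what such an extension method could look like (just a sketch, not a verified drop-in; the explicit UnityEngine.Object-typed local is there to make it obvious that the overloaded == is the one doing the work):

using UnityEngine;

public static class UnityObjectExtensions
{
    // Maps Unity's "fake null" (a destroyed or unassigned placeholder) to a real C# null,
    // so that ?. and ?? behave as expected afterwards.
    public static T IfAlive<T>(this T obj) where T : Object
    {
        Object asObject = obj;
        return asObject == null ? null : obj;
    }
}

// Usage:
// transformThatHasBeenDestroyed.IfAlive()?.Translate(direction);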