Often I avoid NullReferenceExceptions by testing my objects before I use them, like so:
if (obj) {
obj.property = 5;
}
Generally, if the object is null it won't pass the if (obj) test, and I avoid a NullReferenceException. Sometimes, though, the code above causes a compile error saying that obj cannot be converted to bool, in which case I test it like this:
if (obj != null) {
obj.property = 5;
}
and this works fine. But barring that circumstance, is there a reason I should avoid the first way of doing it? Also, why is it sometimes fine to do it the first way, while other times it causes an error?
A class can define how variables of its type are converted to and from other types. If you can write if (obj), it just means that the class of obj defines an implicit conversion operator to bool.
is there a reason that I should avoid the first way of doing it
If you know which type you are converting and how it converts to bool, no. But bear in mind that someone could write a class that evaluates to false when its numeric value is 0, for example.
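To make that concrete, here is a minimal sketch of such a class. The name Temperature and its Degrees field are made up for illustration; the point is that the implicit operator bool can return false even when the reference itself is not null, so if (obj) and if (obj != null) can disagree.

```csharp
using System;

// Hypothetical class whose implicit bool conversion returns
// false when its numeric value is 0, not just when it is null.
class Temperature
{
    public int Degrees;

    public Temperature(int degrees) { Degrees = degrees; }

    // User-defined implicit conversion to bool.
    public static implicit operator bool(Temperature t)
    {
        return t != null && t.Degrees != 0;
    }
}

class Program
{
    static void Main()
    {
        Temperature t = new Temperature(0);

        if (t)                 // false: Degrees == 0, even though t is not null
            Console.WriteLine("truthy");

        if (t != null)         // true: the reference itself is not null
            Console.WriteLine("not null");
    }
}
```

Running this prints only "not null": the truthiness test and the null test give different answers for the same object.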
In my experience, it is always better to be specific.
As NoseKills said, obj could evaluate to true or false in any arbitrary way.
If you want to check if obj is not null, then that’s the check you should perform.