Wasn’t sure where to put this, but it isn’t a scripting question or a shader issue, at least not yet. And I don’t really know much about rendering, so forgive me if this is naive. I’m getting a weird effect when moving a spot light around a dark area and aiming it at surfaces beyond its maximum range. For example:
Notice how it’s pitch black in the middle of the light? Granted, that’s because that corner is farther from the light, but the attenuation is so severe that when aiming at a particular corner of the wall, you can see more by pointing the edge of the light at it than the center. It might not look like a huge deal in the screenshots, but it’s rather distracting in motion.
I’m guessing this is because Unity applies the same range across the whole cone, when realistically (and from an aesthetic standpoint, I’d argue) the range would be greatest in the middle of the beam and fall off toward the edges. Is there any way I can simulate that without layering an absurd number of lights? (I’m already using two, one for the main beam and one for the diffusion, and I don’t want it getting much more expensive.)
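The only thing I can think of short of more lights is faking it with a light cookie that’s bright in the middle and fades toward the rim. Here’s a rough sketch of what I mean, generating the cookie at runtime (assuming the built-in render pipeline; the resolution and falloff exponent are just guesses):

```csharp
using UnityEngine;

// Rough sketch: generate a radial-falloff cookie and assign it to the attached
// spot light, so the beam stays brightest in the centre and fades toward the
// edge of the cone. Values here are arbitrary placeholders.
[RequireComponent(typeof(Light))]
public class RadialSpotCookie : MonoBehaviour
{
    public int resolution = 128;   // cookie texture size in pixels
    public float exponent = 2f;    // higher = faster fade toward the cone edge

    void Start()
    {
        var tex = new Texture2D(resolution, resolution, TextureFormat.ARGB32, false);
        tex.wrapMode = TextureWrapMode.Clamp;

        for (int y = 0; y < resolution; y++)
        {
            for (int x = 0; x < resolution; x++)
            {
                // Distance from the centre of the texture, normalised to 0..1
                float dx = (x + 0.5f) / resolution - 0.5f;
                float dy = (y + 0.5f) / resolution - 0.5f;
                float r = Mathf.Sqrt(dx * dx + dy * dy) * 2f;

                // Bright in the middle, falling off toward the rim
                float a = Mathf.Pow(Mathf.Clamp01(1f - r), exponent);
                tex.SetPixel(x, y, new Color(1f, 1f, 1f, a));
            }
        }
        tex.Apply();

        GetComponent<Light>().cookie = tex;  // spot cookies use the alpha channel
    }
}
```

But as far as I understand it, a cookie only reshapes the intensity across the cone; it doesn’t actually extend the range in the middle, so I’m not sure it really solves the distance problem. Is there a better approach?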