Issue with Ray Length?

I am using raycasting to do some very pixely fog of war for a top-down game.

For each ray I am firing, I have a block like this:
    ray1 = new Ray(transform.position, new Vector3(-1, 0, -1));
    // .normalized so the drawn line is exactly visionDistance long
    // (the raw (-1, 0, -1) has length √2, so the debug ray would otherwise draw ~41% too long)
    Debug.DrawRay(transform.position, new Vector3(-1, 0, -1).normalized * visionDistance, Color.yellow);
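(For reference, ray1, raycastHit1, and visionDistance are fields on the same script; roughly like this, with the value just a placeholder:)

    public float visionDistance = 10f;  // how far each vision ray reaches (placeholder value)
    private Ray ray1;
    private RaycastHit raycastHit1;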

Simple enough. And then, for each of those rays, I have a bit of code that checks for a hit and disables rendering on the fog blocks:

    if (Physics.Raycast(ray1, out raycastHit1, visionDistance))
    {
        // CompareTag avoids the string allocation of .tag == "..."
        if (raycastHit1.collider.CompareTag("fogblock"))
        {
            GameObject hitObject = raycastHit1.collider.gameObject;
            // renderer is the Unity 4-era shorthand for GetComponent<Renderer>()
            hitObject.renderer.enabled = false;
        }
    }
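(Side note: I really have one of these blocks per ray. Something like the sketch below would fold them all into one loop in Update(); the direction list is just illustrative, not my exact code:)

    // Sketch: run the same check for several directions each frame.
    Vector3[] visionDirections = {
        new Vector3(-1, 0, -1), new Vector3(1, 0, -1),
        new Vector3(-1, 0, 1),  new Vector3(1, 0, 1)
    };

    void Update()
    {
        foreach (Vector3 dir in visionDirections)
        {
            Ray ray = new Ray(transform.position, dir);
            RaycastHit hit;
            if (Physics.Raycast(ray, out hit, visionDistance)
                && hit.collider.CompareTag("fogblock"))
            {
                hit.collider.gameObject.renderer.enabled = false;
            }
        }
    }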

Once again, not too complex. The ISSUE is that visionDistance (the variable I'm using for the ray length) doesn't seem to do anything. When I run my game, the debug rays match the vision distance like you'd expect, so I can see what it should look like, but the fog blocks stop being disabled as if the ray were only a unit or so long, despite visionDistance being much higher.

I have looked all around the internet trying to figure out how to get the ray to cast to the length of the float “visionDistance”, but nothing I've found has worked. Any help is greatly appreciated; this is the last hurdle in a long couple of weeks of similar issues. :smiley:

Thanks!

SOLUTION: After looking through my code, I found a sorta "well, duh" moment. The rays were hitting COLLIDERS and stopping, exactly as they should. But when they hit a fog block's collider, I only disabled the object's renderer, not its collider. So it looked like the ray was only casting a few feet in front of the player, when really it was casting much further and hitting a block that was merely invisible. Herp derp!
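For anyone else who runs into this, the fix is one extra line in the hit-handling block above; a minimal sketch, using the same Unity 4-era component shorthands as the rest of my code:

    GameObject hitObject = raycastHit1.collider.gameObject;
    hitObject.renderer.enabled = false;   // hide the fog block
    // The actual fix: disable the collider too, so the now-invisible
    // block no longer stops the next frame's rays.
    hitObject.collider.enabled = false;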