Big Normal Question

If I cast a ray downward, using something like this:

var hit : RaycastHit;
Physics.Raycast (transform.position, transform.TransformDirection (-Vector3.up), hit);

How can I then calculate the average normal of the triangle hit and its immediate neighbors?

Plus, how can I make it so that closer neighboring triangles are weighted more heavily in the average, proportionate to how close they are?

I can’t figure out a way to do exactly what you want. Would it suffice to cast multiple rays and average the normals of their hits?
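
If that would do, here's roughly what I had in mind (untested sketch; the function name, the ring of four offsets, and the ring radius are just placeholders I made up, and the sideways offsets assume the surface is more or less horizontal):

function SampledNormal (origin : Vector3, down : Vector3, ringRadius : float) : Vector3 {
	var sum = Vector3.zero;
	var hit : RaycastHit;

	// the main ray straight down
	if (Physics.Raycast (origin, down, hit))
		sum += hit.normal;

	// four extra rays offset sideways in world space
	for (var i = 0; i < 4; i++) {
		var angle = i * 90.0 * Mathf.Deg2Rad;
		var offset = Vector3 (Mathf.Cos (angle), 0.0, Mathf.Sin (angle)) * ringRadius;
		if (Physics.Raycast (origin + offset, down, hit))
			sum += hit.normal;
	}

	// average direction of whatever normals came back
	return sum.normalized;
}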

You can use the triangle index from the raycast hit and then interpolate between the normals of the triangle's three vertices, using the barycentric coordinates the raycast hit provides.
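
Roughly like this (untested sketch; it assumes the collider is a MeshCollider and that your Unity version exposes RaycastHit.barycentricCoordinate alongside triangleIndex):

var hit : RaycastHit;
if (Physics.Raycast (transform.position, transform.TransformDirection (-Vector3.up), hit)) {
	var meshCollider = hit.collider as MeshCollider;
	if (meshCollider != null && meshCollider.sharedMesh != null) {
		var mesh : Mesh = meshCollider.sharedMesh;
		var normals = mesh.normals;
		var triangles = mesh.triangles;
		var b = hit.barycentricCoordinate;

		// blend the three vertex normals of the hit triangle by the barycentric weights
		var n = normals[triangles[hit.triangleIndex * 3 + 0]] * b.x
		      + normals[triangles[hit.triangleIndex * 3 + 1]] * b.y
		      + normals[triangles[hit.triangleIndex * 3 + 2]] * b.z;

		// bring the interpolated normal into world space
		n = hit.collider.transform.TransformDirection (n).normalized;
		Debug.DrawRay (hit.point, n, Color.green);
	}
}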

That’s not even English :wink:

I tried to strip down the portion of the “Painted Surface” Procedural demo code that does this, and got the attached. It doesn’t seem to work though.

Plus, even if that code worked, it would only give me one averaged normal for a given face. What I would like is for the averaged normal to keep changing as I move across the face the ray is hitting, since the neighboring triangles are weighted by their distance to the ray's hit point.

The net result I’m after is being able to very smoothly track a transform across a “grainy” (low-poly) surface, via this averaging of the normal data.

var hit : RaycastHit;
if (Physics.Raycast (transform.position, transform.TransformDirection (-Vector3.up), hit)) {

	var filter : MeshFilter = hit.collider.GetComponent (MeshFilter);
	var mesh = filter.mesh;
	var position = filter.transform.InverseTransformPoint (hit.point);

	var inRadius = 5.0;
	var vertices = mesh.vertices;
	var normals = mesh.normals;
	var sqrRadius = inRadius * inRadius;

	//  calculate a distance-weighted average normal of all surrounding vertices
	var averageNormal = Vector3.zero;

	for (var i = 0; i < vertices.Length; i++) {
		var sqrMagnitude = (vertices[i] - position).sqrMagnitude;

		//  early out if too far away
		if (sqrMagnitude > sqrRadius) continue;

		//  closer vertices contribute more
		var distance = Mathf.Sqrt (sqrMagnitude);
		var falloff = Mathf.Clamp01 (1.0 - distance / inRadius);
		averageNormal += falloff * normals[i];
	}

	//  bring the averaged normal from mesh local space into world space
	averageNormal = filter.transform.TransformDirection (averageNormal.normalized);

	transform.rotation = Quaternion.LookRotation (Vector3.Cross (averageNormal, transform.TransformDirection (Vector3.left)), averageNormal);

	//  surfaceOffset is declared elsewhere in the script
	transform.position = hit.point + transform.TransformDirection (Vector3.up * surfaceOffset);
}

That’s exactly what my suggestion would do.
The vertex normals stored in the mesh are already smoothed. That means, if you have a sphere, each vertex normal is the average of the face normals of all the bordering triangles. So when you interpolate like I mentioned above, you get a smoothly changing interpolated normal. Exactly what you want.
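
Just to illustrate what “smoothed already” means, here is a rough sketch (not Unity's actual implementation, and it ignores hard edges where vertices are duplicated) of how per-vertex normals come from averaging the face normals of every triangle that shares the vertex; Mesh.RecalculateNormals does something along these lines:

function SmoothedNormals (mesh : Mesh) : Vector3[] {
	var vertices = mesh.vertices;
	var triangles = mesh.triangles;
	var normals = new Vector3[vertices.Length];

	// accumulate each triangle's face normal onto its three vertices
	for (var i = 0; i < triangles.Length; i += 3) {
		var i0 = triangles[i];
		var i1 = triangles[i + 1];
		var i2 = triangles[i + 2];
		var faceNormal = Vector3.Cross (vertices[i1] - vertices[i0],
		                                vertices[i2] - vertices[i0]).normalized;
		normals[i0] += faceNormal;
		normals[i1] += faceNormal;
		normals[i2] += faceNormal;
	}

	// normalize the sums to get the smoothed per-vertex normals
	for (var v = 0; v < normals.Length; v++)
		normals[v] = normals[v].normalized;

	return normals;
}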

But I would only get one average normal per face, right?

I want to be able to move across a face, casting the ray down, and get a new average normal with each bit of translation across the “surface” of that face.

Would your suggestion accomplish that?

And, if so, any chance you could pen a bit of the 3D Code Of The Gods you are referring to? Please, please, please? :slight_smile:

I’ve been helped a lot in these forums, but I haven’t contributed a lot, so I’ll take this opportunity.

The following code will smoothly interpolate the normals. The top part of the Update function was taken directly from the triangleIndex documentation of RaycastHit, where the mouse is tracked to provide the raycast point. You can modify that as you need.

The next part performs interpolation of the intersected triangle normals. At the bottom, the variable N is assigned the interpolated normal. I had (a form of) this code lying around in C#, so I had to translate it to JavaScript.

var P  : Vector3;
var P1 : Vector3;
var P2 : Vector3;
var P3 : Vector3;

var N  : Vector3;
var N1 : Vector3;
var N2 : Vector3;
var N3 : Vector3;


function Update ()
{
	// Only if we hit something, do we continue
	var hit : RaycastHit;
	if (!Physics.Raycast (camera.ScreenPointToRay(Input.mousePosition), hit))
		return;

	// Just in case, also make sure the collider is a MeshCollider
	// and actually has a mesh assigned
	var meshCollider = hit.collider as MeshCollider;
	if (meshCollider == null || meshCollider.sharedMesh == null)
		return;

	var mesh : Mesh = meshCollider.sharedMesh;
	var vertices = mesh.vertices;
	var normals = mesh.normals;
	var triangles = mesh.triangles;

	// Extract local space vertices that were hit
	P1 = vertices[triangles[hit.triangleIndex * 3 + 0]];
	P2 = vertices[triangles[hit.triangleIndex * 3 + 1]];
	P3 = vertices[triangles[hit.triangleIndex * 3 + 2]];

	// Extract normals of the same three vertices
	N1 = normals[triangles[hit.triangleIndex * 3 + 0]];
	N2 = normals[triangles[hit.triangleIndex * 3 + 1]];
	N3 = normals[triangles[hit.triangleIndex * 3 + 2]];

	// Transform local space vertices to world space
	var hitTransform : Transform = hit.collider.transform;
	P1 = hitTransform.TransformPoint(P1);
	P2 = hitTransform.TransformPoint(P2);
	P3 = hitTransform.TransformPoint(P3);
	N1 = hitTransform.TransformDirection(N1);
	N2 = hitTransform.TransformDirection(N2);
	N3 = hitTransform.TransformDirection(N3);
	P = hit.point;

	// finally, interpolate vertex normals
	var Nt = Vector3.Cross(P2-P1, P3-P1);
	var Na = Vector3.Cross(P3-P2,  P-P2);
	var Nb = Vector3.Cross(P1-P3,  P-P3);
	var Nc = Vector3.Cross(P2-P1,  P-P1);
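	// Nt is (twice) the area vector of the whole triangle; Na, Nb and Nc are
	// the area vectors of the sub-triangles opposite P1, P2 and P3. Projecting
	// each onto Nt and dividing by Nt's squared length below gives the
	// barycentric weights of the hit point P.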

	// bail out on a degenerate triangle so we don't divide by a tiny number
	var N_norm_squared = Nt.sqrMagnitude;
	if (N_norm_squared < 0.001)
	{
		Debug.Log("triangle vertices are too close");
		return;
	}

	// compute barycentric coords
	// taken from Pete Shirley's tiger book (Fundamentals of Computer Graphics, 1st Ed.), p. 46
	var one_over = 1.0 / N_norm_squared;
	var alpha = Vector3.Dot(Nt, Na) * one_over;
	var beta = Vector3.Dot(Nt, Nb) * one_over;
	var gamma = Vector3.Dot(Nt, Nc) * one_over;

	// Check if barycentric coords are within triangle
	// We shouldn't have to perform this check because we assume
	//  the triangle index returned by RayCast hit is correct
	// --------
	//if (alpha < 0 || alpha > 1 ||
	//	beta < 0 ||  beta > 1 ||
	//	gamma < 0 || gamma > 1)
	//	Debug.Log("hit point is outside of triangle");

	// interpolate normal
	N = N1*alpha + N2*beta + N3*gamma;
	Debug.DrawLine(P, P + N, Color.green);
}

Wow!

That’s amazing. And it works like a charm.

You appear to be from the same planet of Supergeniusrobot Coders that Joe hails from, RockHound - great to have you here with us on Earth!

:wink:

BTW, RH, what did you develop this code for originally?

I used it to do the same type of smoothing for colors that you need for your normals. At the time, we were darkening our character based on the baked vertex lighting of the scene. I yanked the code when 1.6 (I think) introduced the textureCoord field in the RaycastHit struct, and we could simply use a light map.

Which raises the question: maybe the ‘normal’ field of the RaycastHit struct is already doing smooth (barycentric) interpolation, in which case my contributed code is moot. If not, it could easily do so, since I assume UVs are interpolated similarly.
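
One quick way to check would be to tack a couple of throwaway lines onto the end of the Update function above (hit, P and N are all still in scope there) and watch what happens as the mouse crosses a low-poly mesh:

	// draw the normal reported by the raycast next to the interpolated one
	Debug.DrawLine (P, P + hit.normal, Color.red);
	// if this angle stays near zero on a faceted mesh, hit.normal is already smoothed
	Debug.Log ("angle between raycast normal and interpolated normal: " + Vector3.Angle (hit.normal, N));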

You reading this, Joe? :wink:

Hi Marty,

I just remembered that the above code does not necessarily preserve the length of the interpolated normals. If you need the smoothed normal to have a similar length as the original three normals, you can replace this line (at the bottom of the Update function)

N = N1*alpha + N2*beta + N3*gamma;

with this

N = N1*alpha + N2*beta + N3*gamma;
N.Normalize();
N *= N1.magnitude*alpha + N2.magnitude*beta + N3.magnitude*gamma;

It adds some processing, but you probably won’t see any performance hit.

Thanks, RH.

A non-normalized normal is okay for orienting a transform, right?

Such as in this code snippet:

transform.rotation = Quaternion.LookRotation (Vector3.Cross (averagedNormal, transform.TransformDirection (Vector3.left)), averagedNormal);

Yeah, that works just fine without normalizing; LookRotation only cares about the direction of its inputs, not their length.