Is there a bit of maths I can do to worldProj to rotate it slightly, say 45 degrees, so it’s projecting diagonally through the scene rather than parallel to one of the world axes?
Search for “rotate uv in shader” and you’ll find a ton of examples.
The main thing to understand is that the tex2D() function is only using the .xy values of the i.worldProj input (so only the xz in world space, since you’re swizzling in the vertex shader). You don’t need to pass the float3 swizzled world position, just the float2 of the final UVs.
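For reference, a UV rotation typically looks something like this (a quick sketch, not the exact code from the earlier example; _Rotation is a placeholder angle property in radians, and uv is whatever float2 you end up passing):

// somewhere in the shader; rotates the projected UVs around their center
float s = sin(_Rotation);
float c = cos(_Rotation);
float2x2 rotationMatrix = float2x2(c, -s, s, c);
float2 rotatedUV = mul(rotationMatrix, uv - 0.5) + 0.5; // rotate around the UV center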
Hey bgolus,
So I don’t think I explained my question very well.
I’m not trying to rotate the texture so much as I’m trying to rotate the projection. And, because I’m using worldpos to project the texture, effectively I’m trying to rotate the world, and then use that to project the texture.
(Still don’t think I’m explaining this well.)
Basically I’m trying to create an effect for a mobile game that uses a projected texture to fake a directional light on a ball. (I don’t want to use actual lighting because vertex lighting doesn’t look good enough, and pixel lights will be (I assume) significantly less efficient than using textures, if I can get them to work.)
At the moment the texture projects straight through x. This looks fine but it always looks like the light is pointing straight down. That’s acceptable for this project, but it would be nice if I could rotate the texture to give the lighting some directionality as the ball moves in the scene.
The method you suggested above will work, but only while the ball moves in one axis. For this game the ball can move in both x and z.
(Even now I don’t think I’m explaining this… I’ll attach a video of me manually simulating the effect in Maya)
Ah, so you want to project a gradient texture on the side of a ball to fake lighting.
So, for that you’ll need a 3D rotation matrix, or to do two 2D rotations. Understand that the sine & cosine and the resulting float2x2 in the original example are that rotation matrix. So you’d need to do that to the world space xz, then to the rotated world space xy, to get the final local space xy. Really at that point you only need the y, so you can ignore the x entirely and only pass the y. It’d be much easier to pass a matrix in from script rather than trying to generate it in the shader, especially if you’re aiming for mobile.
// c#
Matrix4x4 lightProjection = Matrix4x4.TRS(
    Vector3.zero, // unused
    Quaternion.Euler(pitch, yaw, roll), // the light rotation
    Vector3.one * ballDiameter // pre-scale matrix
);
material.SetMatrix("_LightProjection", lightProjection);

// shader
// outside of function
float3x3 _LightProjection; // note only 3x3, this does rotation and scale only

// in vertex shader
o.worldProj = mul(_LightProjection, worldPos.xyz - objOrigin.xyz).y; // only use y
o.worldProj = o.worldProj + 0.5; // adjust so 0.5 is at the center of the ball
// prescaled matrix should put 0.0 and 1.0 at the extents of the ball

// in frag shader
fixed4 lighting = tex2D(_DirectionalLight, i.worldProj.xx);
With a little more work you could construct your matrix to work on the ball’s local space vertex positions so you don’t have to calculate the world position and object’s pivot. To do that, get the ball’s rotation, construct a rotation matrix with that (Matrix4x4.Rotate(ball.transform.rotation)), and multiply that and the light projection matrix together (I always forget which order) in the script before setting it on the material. Then in the shader it’d just end up being:
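Something along these lines (an untested sketch, assuming the combined matrix is what gets set as _LightProjection):

// in vertex shader, using the ball's object space position directly
o.worldProj = mul(_LightProjection, v.vertex.xyz).y + 0.5; // only use y, 0.5 recenters on the ball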
Technically you could do that “0.5” offset in c# too by calculating an offset position from the ball for the matrix, but that might end up being slower overall since that would require a 4x4 matrix multiply instead of a 3x3 matrix multiply.
Hey bgolus, sorry to be an unbearable pain in the rear, but… I must be missing something because I’m getting this…
Which should look like a smiley face test texture…
I don’t think my c# script is talking to the _LightProjection matrix in the shader, because when I change the values in the script, nothing is happening to the texture at all…
The script and shader are talking, I tested with something else, so I’m a bit baffled.
What have I done wrong? (I bet it’s something stupid.)
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

[ExecuteInEditMode]
public class DirectBallLight : MonoBehaviour
{
    public Material _Material;
    public float pitch;
    public float yaw;
    public float roll;
    public float IT;
    public float ballDiameter = 5.8f;

    void Update()
    {
        Matrix4x4 lightProjection = Matrix4x4.TRS(
            Vector3.zero, // unused
            Quaternion.Euler(pitch, yaw, roll), // the light rotation
            Vector3.one * ballDiameter // pre-scale matrix
        );

        _Material.SetMatrix("_LightProjection", lightProjection);
        _Material.SetFloat("_IsTalking", IT);
    }
}
Another bug I had in the original code is that the scale needs to be divided, not multiplied. Here are some tweaks I made. The script uses a dummy object (in this case, an actual directional light) to get the orientation, and uses the ball’s world rotation so the transform can be applied directly to v.vertex instead of extracting the object space location. It’s also doing the offset in the matrix, since that ended up being easier than I expected and doesn’t seem to be significantly slower.
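In rough outline it ends up something like this (a sketch of what’s described above rather than the exact file; the class and field names are placeholders, and you may need to drop the Inverse or swap the multiply order depending on which way you want the gradient to face):

// c#
using UnityEngine;

[ExecuteInEditMode]
public class DirectBallLightProjection : MonoBehaviour
{
    public Material _Material;
    public Transform lightTransform; // dummy object (here an actual directional light) used only for its orientation
    public Transform ball;
    public float ballDiameter = 5.8f;

    void Update()
    {
        // rotation comes from the dummy light, the scale is divided (not multiplied) by the diameter,
        // and the 0.5 offset is baked into the translation so the shader doesn't have to add it
        Matrix4x4 lightProjection =
            Matrix4x4.TRS(
                Vector3.one * 0.5f, // offset so 0.5 lands at the center of the ball
                Quaternion.Inverse(lightTransform.rotation), // into the light's space
                Vector3.one / ballDiameter // puts 0.0 and 1.0 at the extents of the ball
            ) *
            Matrix4x4.Rotate(ball.rotation); // ball's world rotation, so the shader can use v.vertex directly

        _Material.SetMatrix("_LightProjection", lightProjection);
    }
}

// shader
// outside of function
float4x4 _LightProjection; // 4x4 now, since the offset lives in the matrix

// in vertex shader
o.worldProj = mul(_LightProjection, float4(v.vertex.xyz, 1.0)).y; // only use y, w = 1 picks up the offset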
One of these balls is using the included shader, one of them is using the Legacy/Diffuse shader.
The light texture is set up so that it’s a gradient with the left edge being the light side and the right edge being dark. I’ve included the gradient I used for testing above.
Thinking a bit more about this: I suppose, now that we know the orientation of the fake light, I wonder how it would look to project a “shadow map” along the y axis. I might give that a go in the future.
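Probably just sampling a texture with the world xz, something like this (completely untested; _ShadowTex, _ShadowScale, and _ShadowOffset are made-up names):

// in frag shader, project a top-down "shadow" texture along the y axis
float2 shadowUV = i.worldPos.xz * _ShadowScale + _ShadowOffset.xy;
fixed shadow = tex2D(_ShadowTex, shadowUV).r;
col.rgb *= shadow;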