Particle World Space Coords?

Wondering if anyone’s played with this.

I’m trying to get the world space of particles in a shader, however their coordinates seem to be based on screen/camera space.

Does anyone know how I could either:

  • Convert screen/camera space to world space.

OR

  • Get the world position of the vertices of the particles.

I've run into this question as well, and I'm wondering if anyone has an answer. Thanks!

Camera space to world space: multiply the coordinates by the inverse projection matrix and divide the resulting float4 by its w component to get back to camera space, then multiply by the camera-to-world matrix. You'll need to create the shader constant for the projection matrix yourself, but it's very easy to do.
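A minimal Cg sketch of that reconstruction, assuming two matrices set from script: _InverseProjection is a made-up name here, and _Camera2World matches the matrix used later in this thread.

```hlsl
// Sketch: clip-space position back to world space. Assumes two matrices
// set from script: _InverseProjection (camera.projectionMatrix.inverse,
// hypothetical name) and _Camera2World (camera.cameraToWorldMatrix).
float4x4 _InverseProjection;
float4x4 _Camera2World;

float3 ClipToWorld(float4 clipPos)
{
    float4 viewPos = mul(_InverseProjection, clipPos); // clip -> camera space
    viewPos /= viewPos.w;                              // perspective divide
    return mul(_Camera2World, viewPos).xyz;           // camera -> world space
}
```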

I replied to beck via PM, but I’ll post up here, too…

Apply this to your camera(s):

#pragma strict

function OnPreCull () {
	Shader.SetGlobalMatrix ("_Camera2World", this.camera.cameraToWorldMatrix );
}

Define this somewhere before your vertex shaders:

float4x4 _Camera2World;

Use this in your vertex shaders to get the world position:

float4 worldPosition = mul(_Camera2World, v.vertex);

Bear in mind this will not give the correct result in the Scene viewport.
It will only give the correct result in the Game viewport when the game is playing.

If you want it to work in the scene view as well, you can add a component to the scene view camera.

// SceneView is editor-only (UnityEditor namespace), so this must live in editor code.
Camera sceneCamera = null;
if (SceneView.currentDrawingSceneView != null)
	sceneCamera = SceneView.currentDrawingSceneView.camera;

if (sceneCamera != null && sceneCamera.GetComponent<CameraSettings>() == null)
	sceneCamera.gameObject.AddComponent<CameraSettings>();

The CameraSettings component does what Farfarer did, and it has [ExecuteInEditMode] so it works when you aren't playing. You have to re-add this component after every recompile and every time you enter Play mode, though. I've done it with a hotkey, but maybe it's possible to hook it up to run automatically on every recompile. An added bonus is that you can set depthTextureMode on the scene view camera as well, making depth texture effects (e.g. soft particles) work in the scene view.
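A sketch of what such a CameraSettings component could look like (assumed implementation, mirroring the OnPreCull call from the earlier post):

```csharp
using UnityEngine;

// Sketch of the CameraSettings component described above: repeats the
// earlier OnPreCull setup and runs in edit mode so it also works when
// attached to the scene view camera.
[ExecuteInEditMode]
public class CameraSettings : MonoBehaviour
{
    void OnPreCull()
    {
        Camera cam = GetComponent<Camera>();
        Shader.SetGlobalMatrix("_Camera2World", cam.cameraToWorldMatrix);

        // Optional bonus mentioned above: enable the depth texture so
        // depth-based effects (e.g. soft particles) work in the scene view.
        cam.depthTextureMode |= DepthTextureMode.Depth;
    }
}
```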

Can you run this on Android?

Has anyone actually gotten this working? Here's an updated script with the new API. It's attached to my camera, but I'm still not getting correct world coordinates in my shader…

#pragma strict

function OnPreCull()
{
    // Note: this name must match the matrix declared in the shader; the
    // earlier posts use "_Camera2World", not "_Cam2World".
    Shader.SetGlobalMatrix("_Cam2World", this.GetComponent.<Camera>().cameraToWorldMatrix );
}

As of Unity 5.2 particles are generated in world space. There’s no need to pass the camera to world matrix anymore; v.vertex is the world space position.
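In shader terms, a minimal sketch of what that means for Unity 5.2+ billboard particles (o.pos and v are assumed to be the usual vertex-shader output/input structs):

```hlsl
// Unity 5.2+: billboard particle vertices already arrive in world space,
// so v.vertex can be used directly as the world position.
float3 worldPos = v.vertex.xyz;
o.pos = mul(UNITY_MATRIX_VP, float4(worldPos, 1.0));
```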

Is there any way to get the object/particle local vertex coordinates from v.vertex? Multiplying it by _World2Object doesn't seem to do anything at all.

Short answer: No.

The long answer is that _World2Object contains nothing of value, because particle vertices are already in world space and a particle system is a single mesh containing all the particles in the system.
v.vertex == mul(_World2Object, v.vertex) == mul(_Object2World, v.vertex)

If you want the position of the particle system, you'd have to pass that as a parameter to the particle system's material.
If you want the position of an individual particle, you can use a geometry shader to get a good approximation of a quad's center, or abuse the particle normals set to 0 if the particle is a fixed size, but it's impossible with mesh particles.
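For the first option, a sketch of passing the system's position into the material from script; the component and the "_SystemPos" property name are made up here, and your shader would declare a matching float4.

```csharp
using UnityEngine;

// Sketch: push the particle system's transform position into its material
// every frame, so the shader can read it as a float4 "_SystemPos"
// (hypothetical property name).
[ExecuteInEditMode]
public class PassSystemPosition : MonoBehaviour
{
    void Update()
    {
        GetComponent<ParticleSystemRenderer>().sharedMaterial
            .SetVector("_SystemPos", transform.position);
    }
}
```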


Thanks for the quick reply. That was a silly question, sorry; I later realized I could just use the UVs as a kind of local 0-1 position. I didn't need them to be 1:1 with world scale anyway.

Just to add to this: it looks like the normals and tangents are still in screen/view space, so you'll still need the inverse view hack if you want world space normals.
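A sketch of that hack, reusing the _Camera2World matrix set up earlier in the thread; only the rotation part (upper-left 3x3) is applied, since normals are directions.

```hlsl
// Rotate the view-space particle normal into world space using the
// rotation part of the camera-to-world matrix (assumes _Camera2World is
// set from script as in the earlier posts).
float4x4 _Camera2World;

float3 worldNormal = normalize(mul((float3x3)_Camera2World, v.normal));
```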
