To go from camera to world space, multiply the coordinates by the inverse projection matrix and divide the resulting float4 by its w component. You’ll need to create the shader constant for the projection matrix yourself, but that’s very easy to do.
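As a rough sketch of that reconstruction step in shader code (the `_InverseProjection` matrix name is an assumption here; you would set it yourself from script, e.g. via `Shader.SetGlobalMatrix` with `camera.projectionMatrix.inverse`):

```hlsl
// Sketch: reconstruct a view-space position from clip-space coordinates.
// clipPos is assumed to be available (e.g. built from screen UVs and depth).
float4 viewPos = mul(_InverseProjection, clipPos);
viewPos /= viewPos.w; // perspective divide by the w component
```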
Bear in mind this will not give the correct result in the Scene viewport.
It will only give the correct result in the Game viewport when the game is playing.
If you want it to work in the Scene view as well, you can add a component to the scene view camera.
using UnityEditor;
using UnityEngine;

// Grab the scene view camera (if a scene view is currently drawing)
// and make sure it has the CameraSettings component attached.
Camera sceneCamera = null;
if (SceneView.currentDrawingSceneView != null)
    sceneCamera = SceneView.currentDrawingSceneView.camera;

if (sceneCamera != null)
{
    if (sceneCamera.GetComponent<CameraSettings>() == null)
        sceneCamera.gameObject.AddComponent<CameraSettings>();
}
The CameraSettings component does what Farfarer did and it has [ExecuteInEditMode] to make it work when you aren’t playing. You have to add this component after every recompile and every time you play the game though. I’ve done it with a hotkey but maybe it’s possible to hook it up to do it automatically every recompile. An added bonus to this is that you can set depthTextureMode on the scene view camera as well to make depth texture effects work in the scene view e.g. soft particles.
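One way to hook it up automatically (a sketch, not tested against every editor version; `CameraSettings` is the component from the post above) is an editor script with [InitializeOnLoad], whose static constructor runs on editor startup and after every script recompile:

```csharp
using UnityEditor;
using UnityEngine;

// Sketch: re-attach CameraSettings to the scene view camera automatically.
// [InitializeOnLoad] runs the static constructor after each script reload.
[InitializeOnLoad]
static class SceneViewCameraSetup
{
    static SceneViewCameraSetup()
    {
        EditorApplication.update += EnsureComponent;
    }

    static void EnsureComponent()
    {
        if (SceneView.lastActiveSceneView == null)
            return;

        Camera cam = SceneView.lastActiveSceneView.camera;
        if (cam != null && cam.GetComponent<CameraSettings>() == null)
            cam.gameObject.AddComponent<CameraSettings>();
    }
}
```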
Anyone actually get this working? Here’s an updated script with the new api, it’s attached to my camera but I’m still not getting correct world coordinates in my shader…
#pragma strict

function OnPreCull()
{
    Shader.SetGlobalMatrix("_Cam2World", GetComponent.<Camera>().cameraToWorldMatrix);
}
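For reference, a C# equivalent of the same idea (a sketch; attach it to the camera):

```csharp
using UnityEngine;

// Sketch: publish the camera-to-world matrix as a global shader matrix
// just before the camera culls, so any shader can read _Cam2World.
[RequireComponent(typeof(Camera))]
public class Cam2World : MonoBehaviour
{
    void OnPreCull()
    {
        Shader.SetGlobalMatrix("_Cam2World", GetComponent<Camera>().cameraToWorldMatrix);
    }
}
```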
As of Unity 5.2 particles are generated in world space. There’s no need to pass the camera to world matrix anymore; v.vertex is the world space position.
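In a particle vertex shader that means something like the following sketch (since the object-to-world matrix is effectively identity for world-space particles, the usual MVP transform still works):

```hlsl
// Sketch: Unity 5.2+ billboard particles. v.vertex already holds the
// world-space position, so it can be used directly for world-space effects.
v2f vert (appdata_full v)
{
    v2f o;
    o.pos = mul(UNITY_MATRIX_MVP, v.vertex); // model matrix is identity here
    o.worldPos = v.vertex.xyz;               // no _Object2World needed
    return o;
}
```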
Is there any way to get object/particle-local vertex coordinates from v.vertex? Multiplying it by _World2Object doesn’t seem to do anything.
The long answer is that _World2Object has nothing of value in it, because particle vertices are already in world space and a particle system is a single mesh that contains all of the particles in the system.
v.vertex == mul(_World2Object, v.vertex) == mul(_Object2World, v.vertex)
If you want the position of the particle system you’d have to pass that as a parameter to the particle system’s material.
If you want the position of the particle, you can use a geometry shader to get a good approximation of a quad’s center, or abuse particle normals set to 0 if the particle is a fixed size, but it’s impossible with mesh particles.
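Passing the system’s position from script might look like this (a sketch; `_SystemPos` is an assumed shader property name, not a built-in):

```csharp
using UnityEngine;

// Sketch: feed the particle system's world position to its material each
// frame so the shader can compute offsets relative to the system origin.
[RequireComponent(typeof(ParticleSystemRenderer))]
public class PassSystemPosition : MonoBehaviour
{
    void Update()
    {
        var psRenderer = GetComponent<ParticleSystemRenderer>();
        psRenderer.material.SetVector("_SystemPos", transform.position);
    }
}
```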
Thanks for the quick reply. That was a stupid question, sorry; I later realized I could just use UVs as a kind of local 0–1 position. I didn’t need them to be 1:1 world scale anyway.
Just to add to this: it looks like the normals and tangents are still in screen space / view space, so you will still need to use the inverse view matrix hack if you want world-space normals.