# I need help with logic for a game jam feature.

Hey y’all, I’m doing the Nature’s Beauty game jam that ends in 3 days.
My game is about putting out forest fires and regrowing trees with rainclouds.
I’m using the Curved World asset, which lets you make a flat terrain or scene look like a sphere. I love the way that looks and feels to use, so I don’t want to change it.

They give you a tool that converts shaders to use their vertex-shader alteration, but I don’t know how to get it working for my fire VFX. I tried making my VFX in Shader Graph as a vertex/fragment output graph and adding the Curved World node; that didn’t work.

In my image I’m showing the problem with some regular trees instead of the VFX, but it’s the exact same issue: the game elements don’t know that the shader has moved the world out from under them, so when I adjust the camera, the unaffected elements begin to look like they’re floating.

I don’t want to part with Curved World or the fire VFX, so I was thinking I could write a script that lowers the fire VFX the farther I am from them. The problem only appears when I move far away on the X or Z axis; from some angles, my Y position can leave the shader in such a position that the VFX don’t look like they’re floating.

My question is: can someone who understands programming better than I do explain the logic to me? What I’m thinking is to get a Vector3.Distance between the camera and the VFX, but I’m not sure how to only check X and Z while not accounting for Y. I also know I want to lower the Y position of the VFX the farther the camera is from them, but I’m not sure how to make it smooth, especially if there’s some kind of offset where the camera’s distance isn’t the same as the amount the VFX needs to move on Y.
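The two pieces asked about here can be sketched in plain C#, with no Unity types so it runs anywhere (all names below are mine, not from any API): an XZ-only distance, and a tunable mapping from that distance to a Y drop.

```csharp
using System;

public static class CurveCompensation
{
    // Distance between two points measured only in the XZ plane:
    // Y is ignored entirely, then it's the usual Euclidean distance.
    public static double HorizontalDistance(
        double ax, double az, double bx, double bz)
    {
        double dx = ax - bx;
        double dz = az - bz;
        return Math.Sqrt(dx * dx + dz * dz);
    }

    // Map that distance to a Y offset with a tunable slope, so the
    // camera distance doesn't have to translate 1:1 into the drop.
    public static double YOffset(double distance, double dropPerUnit)
    {
        return -distance * dropPerUnit;
    }
}
```

In Unity the same idea is shorter: subtract the two positions, zero the `y` component of the difference, and take `.magnitude`.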

I know this is a weird ask but I’m hoping someone has an answer for me, Thank you for reading.

The asset creator of Curved World (Curved World | VFX Shaders | Unity Asset Store) specifies that “Shader bending is a special and very effective vertex displace technique for creating bending illusion of a mesh and entire scene without modifying actual and original meshes.”

This “entire scene” bit is interesting, as it hints at something in your configuration that prevents the effect from being applied to the entire scene as seen in the video demos and screenshots on the asset store, which showcase the effect applied to entire levels and scenes. That leads me to the inescapable conclusion that you did not read their documentation (https://drive.google.com/file/d/19l7s6xp0CEgSYA3uh_BKT6pERlTAw0dN/view).

So just code as if your scene were not bent, but be sure to do what the doc says concerning your scene objects’ materials.

I tried this script I’ve written, but it’s not working the way I would like:

```csharp
using UnityEngine;

public class ObjectToGroundCurveSimulation : MonoBehaviour
{
    public GameObject thisVFXPosition;
    public GameObject cameraPosition;
    public float startingY;

    void Awake()
    {
        // Remember the VFX's resting height before any compensation.
        startingY = thisVFXPosition.transform.position.y;
    }

    void Update()
    {
        // Distance in the XZ plane only: zero out Y before measuring.
        Vector3 delta = thisVFXPosition.transform.position
                      - cameraPosition.transform.position;
        delta.y = 0f;
        float distance = delta.magnitude;

        // Lower the VFX by a third of the horizontal distance,
        // relative to its starting height.
        float yOffset = distance / 3f;

        Vector3 p = thisVFXPosition.transform.position;
        thisVFXPosition.transform.position =
            new Vector3(p.x, startingY - yOffset, p.z);
    }
}
```
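For the “make it smooth” part of the original question, one option (a sketch of a standard technique, not anything from the asset) is to ease the Y offset toward its target each frame instead of snapping to it. A framerate-independent exponential ease, in plain C#:

```csharp
using System;

public static class Smoothing
{
    // Move 'current' toward 'target'; 'halfLife' is how long it takes
    // to close half of the remaining gap, regardless of framerate.
    // (In-engine, Mathf.Lerp or Vector3.SmoothDamp do a similar job.)
    public static double Ease(double current, double target,
                              double halfLife, double deltaTime)
    {
        double t = 1.0 - Math.Pow(0.5, deltaTime / halfLife);
        return current + (target - current) * t;
    }
}
```

In an `Update` loop you would keep a running offset and call something like `offset = Smoothing.Ease(offset, targetOffset, 0.1, Time.deltaTime);` each frame.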

Yeah, I’ve read that page, generated those nodes for Shader Graph, and plugged them into the vertex component of my shader, and it didn’t work. I’ll try again, but I think it’s still not going to work.

The particles shader in Unity is pretty simple: additive, with a half set to 1.0 for alpha.

Try finding another particle-style shader online and see if you can hack the Curved World chunk into it.

The product page also bragged about Shader Graph support and whatnot, so perhaps just use that?

She should integrate the Curved World chunk into a shader that works, then modify it bit by bit until she gets an effect acceptably similar to the particle shader. But if her fire FX shader itself displaces vertices, she will have to combine the fire VFX transformations with the Curved World ones.

I custom-made the shaders on the terrain object and the water, and in both of them the Curved World shader node worked fine. But with my fire VFX shader (in fact, with every particle shader I’ve tried it on) it has no effect. I’ve tried every combination of inputs and outputs on that group of nodes in the vertex shader.

Honestly, I think it IS working; it’s just not changing the position that the particles are being spawned from. Each individual quad is probably being affected as advertised, but I can’t figure out how to make the generation point of the VFX match the curve of the planet I made, and at this point I’m ready to just stop trying to use Curved World for this project.

I was using Shader Graph. See my response above.

It’s kind of strange that you need to go from world-space coordinates to local-space coordinates before kicking in V_CW_TransformPointAndNormal(v.vertex, v.normal, v.tangent); through the CW node. CW should always act on world-space coordinates for the whole scene; otherwise your elements will be transformed relative to their respective local origins. With CW you should never have to adjust an object’s position to compensate for the camera position.
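To make the space mix-up concrete: if a point is already in world space, running it through a world-to-local transform shifts it by the object’s own placement, producing exactly the kind of spurious offset being described. A plain C# illustration using System.Numerics (this is my own demo, not the asset’s code):

```csharp
using System.Numerics;

public static class SpaceDemo
{
    // Local-to-world matrix for an object sitting at (10, 0, 0).
    public static readonly Matrix4x4 LocalToWorld =
        Matrix4x4.CreateTranslation(10f, 0f, 0f);

    // Applying world-to-local (the inverse matrix) to a point that is
    // ALREADY in world space shifts it by -10 on X: a double transform.
    public static Vector3 DoubleTransformed(Vector3 worldPoint)
    {
        Matrix4x4.Invert(LocalToWorld, out Matrix4x4 worldToLocal);
        return Vector3.Transform(worldPoint, worldToLocal);
    }
}
```

So a world-space vertex at (10, 0, 0) comes out at the origin, i.e. displaced by the object’s whole position, which is why the extra world-to-local node in the graph is suspect.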

I’m very new to shader authoring. Do you know if there’s a node that gathers the object position that doesn’t itself require the world node attached to it to work? I know this is working on my other mesh, so I think it’s working properly even if I have a redundant node in the network. I’ll look at a list of the nodes and see if there’s a better fit.

Just make sure you get your fire fx vertices in the correct space coordinates before applying the CW transform. Make a first test where your fx is not a child object. When it works, attach it back to its intended parent and work the transform chain from there.

I’ve tried it with world space, the object position, and a Vector3 with all of its floats set to 0. No change.

Two new things have occurred to me.
1. Someone might suggest that I parent the VFX to a game object whose mesh is properly being distorted by the shader. I’ve done that, and it doesn’t work. None of the local-space transforms are being changed, only the way the image buffer interprets the shape of the mesh.
2. I might be able to get away with making a fake particle effect, going around the need for an emitter. Maybe I can do this and have it look realistic in Maya; if not, just using Unity’s animation system might make it look good and be more performant than a standard particle system.

Make a first test where your fx is not a child object. Use only world space coordinates. When it works, attach it back to its intended parent and work the transform chain from there if needed.

I’ve tried it as a child and not as a child; no change. The problem here is the emission point of the particles. Curved World is a shader that does not change any position data; it only changes the way you view the position data of meshes that the Curved World material is applied to. The emission position doesn’t have a material applied to it; only the particles it emits do. So this shader cannot change the emission position of particles. That’s the theory I’m operating under, anyway, which is why I’m looking for a workaround.
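If the emitter transform really is the one thing the shader can’t touch, one possible workaround is to move the emitter by script using the same kind of curve a bend shader applies. Vertex-bend effects typically drop Y roughly with the square of horizontal distance from the pivot; that quadratic shape is my assumption here, so the curvature constant would need to be matched against the asset’s actual formula:

```csharp
public static class BendCompensation
{
    // Assumed bend shape: y drops by curvature * d^2 as horizontal
    // distance d from the camera/pivot grows. Applying the same drop
    // to the emitter's transform would keep the spawn point on the
    // visual curve. Tune 'curvature' by eye against the shader.
    public static double EmitterYDrop(double horizontalDistance, double curvature)
    {
        return -curvature * horizontalDistance * horizontalDistance;
    }
}
```

In-engine this would run per frame, setting the emitter’s Y to its base height plus the computed drop.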

Particle emitters have a world-space simulation option you should test. But that’s not the problem here, since the particle emitter doesn’t move.

I’ve tried removing the world-space-to-local-space node and:
- plugging nothing into it (which defaults to a Vector3 with all axes set to 0),
- plugging the World Space node directly into the Curved World node,
- messing with all of the connections in Shader Graph.
None of that worked.

So either I abandon Curved World, or I animate a fake VFX in Maya and bring it into Unity, which is what I’ll do unless it looks like crap.

I checked, and whether you select the emitter simulation space as local or world, the particle vertices received by a shader vertex program are in world space. So just remove the world-to-local coordinates node in your shader graph, remove the camera-compensation code, and you are good to go.

I got it, y’all! I’m not 100% sure what it was, but I had built the particle emitter through VFX Graph. Even though my settings were right, I guess something was lost in communication between Shader Graph and VFX Graph. I rebuilt the particle emitters through just the legacy particle system, and now it’s working as expected. Thank you for your help!

I couldn’t do more because I don’t have the Curved World asset, nor your project, so we have to guess everything users don’t tell us in these situations. But the fact that there was no need to tinker with the camera and local coordinates was an easy guess.