Get nearest position

Hi,

I am planning on making an effect that takes multiple object positions and leads particles to them via “Conform to Sphere”. Right now I have the following solution for that: I am using a Property Binder with “point cache → multiple position Binder” to store the positions in a Texture2D, and I have the following graph:

So far this works fine, but I would rather have the particles pick the closest position instead of just picking one at random. I could also imagine having one “Conform to Sphere” per position (but I don't know whether it is even possible to dynamically add or remove blocks depending on the position count).

Any ideas on how to solve this?

At a quick glance at your question, an SDF could be a good solution, as sampling a signed distance field would give you the closest position. The advantage is that you won't have to do multiple samples to know which position is the closest. But for this you'll need to generate an SDF at runtime, which can also be costly…

So I guess that it might depend a lot on the number of particles and the number of “Lead Positions”.
An “HLSL” block would allow doing this more easily, but sampling the same texture multiple times per particle still isn't ideal…

Here is a “naive” and “ugly” solution that might be fine if your number of particles and “Lead Positions” isn't too high.
The idea is to compute the distance between the particle position and each Lead Position.
We then find the “Min” distance, meaning the closest position, and use it in a “Conform to Sphere”:
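In code form, the per-particle logic of that graph boils down to a brute-force nearest-neighbour search. Here is an illustrative Python sketch (names are mine, not VFX Graph nodes):

```python
import math

def closest_lead_position(particle_pos, lead_positions):
    """Brute-force nearest-neighbour search: compare the particle's
    distance to every lead position and keep the minimum."""
    best_pos, best_dist = None, math.inf
    for lead in lead_positions:
        d = math.dist(particle_pos, lead)  # Euclidean distance
        if d < best_dist:
            best_pos, best_dist = lead, d
    return best_pos

# Example: a particle at the origin picks the nearer of two lead positions
print(closest_lead_position((0.0, 0.0, 0.0),
                            [(1.0, 0.0, 0.0), (5.0, 5.0, 0.0)]))  # → (1.0, 0.0, 0.0)
```

This is O(particles × lead positions) per frame, which is why it only stays cheap while both counts are small.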

8959227--1230708--upload_2023-4-19_12-0-3.jpg

8959227--1230711--Unity_lvtKYuXUW6.gif

8959227--1230726--Unity_oFoLJhZrTh.gif

8959227--1230729--upload_2023-4-19_12-45-34.png
I'm sorry, this solution is far from the best…
Have a great day.


Hi Orson, thank you for your detailed reply.
Your suggestion is a good starting point and I had a similar idea; sadly, it doesn't work in my case, because the number of “Lead Positions” can vary from frame to frame.

But looking at your solution, I got another idea that I might try out.
I could have a counter that counts up every frame (cycling through IDs) and then use this counter as an index into the texture sampler, comparing that position against the one at the currently targeted index. This way it won't react instantly every frame, which could lead to weird behaviour, but hopefully it will not be noticeable.
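A rough Python sketch of that amortised idea (illustrative only; in the graph this would be a per-particle attribute holding the current target index plus a frame counter):

```python
def update_target(frame, particle_pos, target_index, lead_positions):
    """Each frame, test only ONE candidate (frame % count) against the
    particle's current target; switch if the candidate is closer.
    Over N frames, every lead position gets considered once."""
    candidate = frame % len(lead_positions)

    def sq_dist(a, b):
        # squared distance is enough for comparisons
        return sum((x - y) ** 2 for x, y in zip(a, b))

    if sq_dist(particle_pos, lead_positions[candidate]) < \
       sq_dist(particle_pos, lead_positions[target_index]):
        return candidate
    return target_index

# A particle at the origin converges to index 0 after cycling through all IDs
leads = [(1.0, 0.0, 0.0), (5.0, 0.0, 0.0), (9.0, 0.0, 0.0)]
target = 2
for frame in range(len(leads)):
    target = update_target(frame, (0.0, 0.0, 0.0), target, leads)
print(target)  # → 0
```

The trade-off is exactly the one described above: only one comparison per particle per frame, at the cost of a target that can lag by up to N frames.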

The other idea that I have is to use multiple “Conform to Sphere” blocks (around 20) and assign the different positions to them. If there are fewer lead positions, I accordingly set the attraction force of the unused ones to 0. I have no idea how costly this is performance-wise, though.

But maybe I should check this out first, since it sounds promising. I don't know how to create SDFs, though; I will try to find out.

VFX Graph has an SDF baker but it’s more for “offline” SDF baking.
8960103--1230948--upload_2023-4-19_18-1-12.png

I didn't have the time to try it out yet, but the Demo Team showed some real-time SDF generation that could be useful to you.
Here is the link to the Github Repo:
https://github.com/Unity-Technologies/com.unity.demoteam.mesh-to-sdf

Don't hesitate to share your results :)

I want to note that the default baker provided with the VFX Graph package works at runtime too. I have never tested the performance when baking every frame, but it didn't look that slow (it runs on the GPU).
However, the description in the link above claims it's faster than the default one anyway, so…

If the baker is somehow too slow and you only need to bake spheres, the other option is to write a custom compute shader to generate the SDF texture.
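Spheres are the easy case because their SDF is analytic: the distance field of a union of spheres is just the minimum of `length(p - center) - radius` over all spheres. Here is a CPU-side Python sketch of the per-voxel work that each compute shader thread would do (illustrative, not Unity code):

```python
import math

def sphere_sdf(p, spheres):
    """Signed distance from point p to a union of spheres.
    Each sphere is (center, radius); negative means inside."""
    return min(math.dist(p, c) - r for c, r in spheres)

def bake_sdf(resolution, bounds_min, bounds_max, spheres):
    """Fill a resolution^3 grid with signed distances sampled at voxel
    centers. In a compute shader, each thread evaluates one voxel."""
    grid = {}
    for i in range(resolution):
        for j in range(resolution):
            for k in range(resolution):
                # voxel center in world space
                p = tuple(lo + (hi - lo) * (idx + 0.5) / resolution
                          for lo, hi, idx in zip(bounds_min, bounds_max, (i, j, k)))
                grid[(i, j, k)] = sphere_sdf(p, spheres)
    return grid

# A unit sphere at the origin inside a 4x4x4 grid spanning [-2, 2]^3
sdf = bake_sdf(4, (-2.0, -2.0, -2.0), (2.0, 2.0, 2.0),
               [((0.0, 0.0, 0.0), 1.0)])
```

On the GPU the triple loop disappears: you dispatch one thread per voxel and write the result into a 3D texture, so the cost per frame is essentially one min-over-spheres evaluation per voxel.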

Yeah, I can imagine that a custom compute shader is much more performant. Unfortunately, I don't know anything about compute shaders or how to create them. All the shader experience that I have is with Shader Graph, so I never learned a shading language. Do you have a hint for me on where to start?

To be honest, this might be a deep dive if you have only used Shader Graph. There are some tutorials introducing compute shaders, but creating a correct SDF requires additional knowledge, although it should be simpler with spheres.

In any case, I wouldn't jump into that unless you have tried other solutions like the repo above and they were still not enough.
Also, I think I have seen a repo somewhere with a VFX Graph custom node created by Thomas Iché; assuming it works, you could try looping over a buffer of points to calculate forces.

So here is what I came up with.
It is the solution I mentioned here:

I made a subgraph for the “count and compare” part so I was able to chain it multiple times in a row; this way the particles react faster.

Subgraph:
8961834--1231218--graph2.png

Main Graph:
8961834--1231224--graph1.png

Result:
8961834--1231227--attract.png

It's not perfect, but in my case it works very well.

Thank you guys for your time and help @OrsonFavrel @Qriva you are awesome!

Okay, I ran into another problem; maybe you have an idea for that too.
After testing the whole thing with these dummy cubes, I now want to assign the targets at runtime. Problem is: I have no idea how to do this. In the Inspector I can add the binder through Add Component > MultiplePositionBinder, but when I try to write GetComponent() Visual Studio marks it red, and it doesn't suggest a using namespace either. Do I have to write my own binding class if I want to change targets at runtime?