Spawn particles on dissolving edges

Hi, I want to create a dissolving effect with Shader Graph and Particles. For the dissolving effect seen in the pictures, I've used an asset, but now I want to use particles to make it look like the object is dissolving into sand. For this, I want to spawn the particles just on the edges (yellow) which are used for the dissolving effect. But I can't figure out how to do it. Maybe it's just a simple trick, but I didn't find a solution yet.

I would like to use the VFX Graph to do this, but because I want to develop it for the Quest 3, a normal Particle System is better for a standalone version. (But I would also accept tips for this :slight_smile: ) I'm using Unity URP 2022.3.14 for this.

Thanks!

Morning.
Dissolving a mesh with particles is a use case that is often needed. There are multiple ways of achieving this, and your road will depend on your specific project and needs. First, I will say that having a 1:1 match between the particles' spawn positions and the dissolving edges isn't always mandatory. In gameplay scenarios, you often need to dissolve/destroy the mesh pretty fast, and in that case it can be hard for the player to notice that the particles and the dissolving edge are not perfectly in sync.

With that being said, what is needed to achieve this?

  • We need to be able to match the dissolution pattern between the Mesh and the particles.

  • We have to synchronize the dissolution animation between the Mesh and the Particles.

  • We have to emit particles from the dissolving edge.

These three requirements can be achieved in different ways depending on your specific use case.

Let’s start with matching the dissolution pattern. The idea is to reproduce the dissolve function that you did in your Shader inside VFXGraph.
So, if you're sampling a Noise texture in your shader, you have to sample the same texture with the same settings (UVs, scale, etc.).
Now, regarding the UVs used to sample the texture in your dissolve function: this will be specific to your use case and project, but usually we either use the Mesh's UVs or use Triplanar or Biplanar mapping. Biplanar or Triplanar mapping is more expensive but offers advantages such as predictable, consistent results between meshes and typically no UV seams. Indeed, this technique relies on the world position, whereas your meshes' UVs may not share the same layout.
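To make the idea concrete, here is a minimal sketch of triplanar mapping in Python, with a hypothetical `sample2d` function standing in for the texture sample. It is only an illustration of the math, not Unity code: the whole point of the post is that whatever weighting and scale you use, it must be reproduced identically in Shader Graph and in VFX Graph.

```python
import math

def sample2d(u, v):
    # Stand-in for a tileable noise texture sample; any periodic
    # function works for the demonstration.
    return 0.5 + 0.5 * math.sin(6.2831 * u) * math.cos(6.2831 * v)

def triplanar_sample(position, normal, scale=1.0, sharpness=4.0):
    """Blend three planar projections of a texture by the surface normal.

    Because the UVs derive from world position, two different meshes
    using this function get consistent results and no UV seams.
    """
    x, y, z = (p * scale for p in position)
    nx, ny, nz = (abs(n) ** sharpness for n in normal)
    total = nx + ny + nz
    wx, wy, wz = nx / total, ny / total, nz / total
    # Project onto the YZ, XZ and XY planes and blend by the weights.
    return (sample2d(y, z) * wx +
            sample2d(x, z) * wy +
            sample2d(x, y) * wz)
```

The scale, sharpness and sampled texture are exactly the "same operation and settings" that have to match on both the shader side and the particle side, or the patterns will diverge.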

9693812--1383173--upload_2024-3-11_13-56-7.jpg

In the example above, you can see the difference between sampling the Mesh UVs and using the world position as UVs to sample a texture. In both cases, what's important and sometimes tricky is to use the same operations and settings on both the shader side and the particle side.

9693812--1383191--upload_2024-3-11_14-7-8.png

With this done, keep in mind that if you change some settings on your shader (e.g. the sampled texture), you need to propagate the changes on the VFX side. This can be done manually, or you can expose the critical properties in VFX Graph and in Shader Graph so that you can link them through scripting.
Using generic Subgraphs and global variables could also be a way to keep the settings in sync.

The second part is to sync the dissolution animation between Shader Graph and VFX Graph. We often try to control everything with one float property. From there, you need to update that float in sync between Shader Graph and VFX Graph.
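The "one master float" idea can be sketched like this, with plain dictionaries standing in for the material and the VFX component. In Unity, the two assignments would be `Material.SetFloat` and `VisualEffect.SetFloat`, and the property name `_Dissolve` is a hypothetical example:

```python
def update_dissolve(material, vfx, elapsed, duration):
    """Drive both the shader and the particles from one normalized float."""
    t = min(max(elapsed / duration, 0.0), 1.0)  # clamp progress to [0, 1]
    # In Unity: material.SetFloat("_Dissolve", t)
    material["_Dissolve"] = t
    # In Unity: vfx.SetFloat("Dissolve", t)
    vfx["Dissolve"] = t
    return t

material, vfx = {}, {}
update_dissolve(material, vfx, elapsed=1.5, duration=3.0)
# Both sides now read the exact same value, so the animations stay in sync.
```

Whether this single value comes from a C# script or from two Timeline tracks animating the same exposed property, the key is that there is only one source of truth for the dissolve progress.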

For this, you can either use C# scripting or Timeline. If possible, I prefer to control this with Timeline, as it's more art-directable, but it's a matter of preference and project/use case. As demonstrated by the example below, I've got two tracks that are in sync to animate the exposed dissolve property on the Shader and in VFX Graph.
9693812--1383200--Unity_ZVUTs5SltX.gif

Now the last part is: How to spawn particles on the dissolving edges. The idea would be to use a black and white value to control where the particles are being spawned.
We're basically going to do "Rejection Sampling", meaning that we spawn particles on the entirety of the Mesh and then kill particles with a threshold value. This can be done in several ways.
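A sketch of the rejection-sampling idea in Python, with a hypothetical `noise` function standing in for the dissolve texture: spawn candidates everywhere on the mesh, then keep only those whose noise value sits within a thin band around the current dissolve threshold, which is exactly the visible edge.

```python
import math
import random

def noise(u, v):
    # Stand-in for the same noise texture sampled by the dissolve shader.
    return 0.5 + 0.5 * math.sin(12.9898 * u + 78.233 * v)

def spawn_on_edge(candidates, threshold, band=0.05):
    """Rejection sampling: emit everywhere, kill every particle whose
    noise value is not within `band` of the dissolve threshold."""
    survivors = []
    for (u, v) in candidates:
        if abs(noise(u, v) - threshold) < band:
            survivors.append((u, v))  # this particle stays alive
    return survivors

random.seed(0)
uvs = [(random.random(), random.random()) for _ in range(10000)]
edge_particles = spawn_on_edge(uvs, threshold=0.5)
```

In VFX Graph terms, the `if` is the Set Alive block comparing the sampled texture value against the exposed dissolve threshold, and `band` controls how thick the emitting edge is.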

As you can see on my screen, I'm spawning many particles on my Mesh. By sampling it, I'm able to get its UVs, and with those I can sample the same texture used in Shader Graph for the dissolving function.
As stated before, the idea is to reproduce in VFX Graph the same function as the one in Shader Graph.

9693812--1383215--upload_2024-3-11_15-26-27.png

You can also take a look at this great video from Matt Ostertag, who is doing what you're trying to achieve with VFX Graph and Amplify Shader. The main ideas are the same as the ones exposed here, but the result is more polished.
Furthermore, his solution to spawn particles on the dissolving edges is interesting and different, as it relies on a "Master" system and GPU Events. It is quite nice, but it currently has the drawback of preventing instancing from working with VFX Graph. So if you need your effect to be instanced a lot, I would try not to use GPU Events for this.

Now, my knowledge of Shuriken is limited, and I'm not sure whether this is easily doable with it. I will ask around to see if it can be achieved.

I hope this will still be helpful for you or other VFX Graph users. Wish you a sunny day.



Thank you so much for the info! I have one question: can I apply this effect, but reversed, on a 2D texture? I'm stuck on the VFX side; I don't know how to link the edge from the shader with the VFX spawn. Here is my shader.
Thank you for the help

9707777--1386713--Ảnh chụp Màn hình 2024-03-18 lúc 17.34.54.jpg

First of all: thank you so much for this detailed information! This is pure gold, and I'm a little bit ashamed that my response is so late. I just forgot to check on my post because I had work to do for my Bachelor Degree Show. It was my first post and I didn't think I would get such a great response! As I said, thank you so much and have a great day!

I understand the steps needed to achieve this effect, and I recreated it with your guide. But I am a little bit stuck now when it comes to Triplanar Mapping in the VFX Graph. In Shader Graph it's no problem, but I can't recreate the same thing in the VFX Graph. What would I have to do in the VFX Graph to achieve this?

Hello, I have a question about this tutorial.
I recreated the process, but I ran into a problem. If you sample a mesh for positions, there will be uneven particle spawning areas (in the picture: 1, the "Set Alive" attribute is disabled; 2, enabled, and all the particles are only on the dissolve edges).
I modified the set-position part to get the spawn areas from a point cache map, and they spawn evenly (3), but now, with "Set Alive" enabled, there are some areas they are clearly not supposed to spawn from (4).
What am I missing? Is it possible to modify this graph to work with point cache maps?


Morning. Your issue is that you are randomly sampling the Mesh to extract the UVs.
Those UVs are used later to sample a texture. But your particles' positions aren't coming from this same random seed.
Instead, they're coming from the point cache, so there's no correlation between the spawn positions and the stored UVs.

What you need are the UVs that correspond to those spawn locations. If you used the Point Cache tool from VFX Graph, you should check the "Export UVs" option. This allows you to also get the UVs from your point cache.

9886074--1426596--upload_2024-6-12_9-37-13.png

You can then store the UVs in an Attribute if needed to be used elsewhere.

As you can see in my examples here, I'm using a constant Random, which determines which index to sample in the Point Cache. From there, I'm able to set the position and store the UVs and normals (if needed).
With this setup, I'm sure that the same index, and therefore the same point, has been used to get the position, UVs, and normals.
9886074--1426599--upload_2024-6-12_9-47-23.jpg
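The "one index for everything" setup can be sketched like this, with small parallel lists standing in for the baked point-cache attribute maps (the data below is made up for the demonstration):

```python
import random

# Hypothetical baked point cache: parallel attribute arrays,
# one entry per cached point.
positions = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 0)]
uvs       = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
normals   = [(0, 0, 1)] * 4

def sample_point_cache(rng):
    """Draw ONE random index and read every attribute with it, so that
    position, UV and normal always describe the same cached point."""
    i = rng.randrange(len(positions))
    return positions[i], uvs[i], normals[i]

pos, uv, n = sample_point_cache(random.Random(42))
```

The bug described above is the opposite of this: drawing one random index for the position and a different one for the UVs, which decorrelates the two attributes.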

From what I understand, you're using a point cache to get a uniform distribution. It's a perfectly valid approach, but I would like to suggest another one.
To get a uniform distribution, you could also use a proxy mesh that is used only for VFX purposes.
I've shared many different approaches on this thread, so don't hesitate to take a look. The good thing about this is that you can spawn on the surface of any triangle of the mesh, so you get position and UV interpolation, which should result in better accuracy without needing too many triangles.

Whereas with a point cache, particles can only spawn on the cached points and get information from them; you cannot interpolate between them. So, depending on the model, this means that you might need to increase the point count of your cache.
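The mesh-sampling side of that comparison can be sketched as follows: pick a random point inside a triangle with barycentric weights and interpolate every attribute with the same weights, so any point on the surface is reachable. (The triangle data is made up for the demonstration.)

```python
import random

def random_barycentric(rng):
    """Uniformly distributed barycentric coordinates on a triangle."""
    r1, r2 = rng.random(), rng.random()
    s = r1 ** 0.5
    return 1.0 - s, s * (1.0 - r2), s * r2

def sample_triangle(verts, uvs, rng):
    """Interpolate position and UV anywhere on the triangle surface."""
    a, b, c = random_barycentric(rng)
    pos = tuple(a * verts[0][i] + b * verts[1][i] + c * verts[2][i]
                for i in range(3))
    uv = tuple(a * uvs[0][i] + b * uvs[1][i] + c * uvs[2][i]
               for i in range(2))
    return pos, uv

verts = [(0, 0, 0), (1, 0, 0), (0, 1, 0)]
uvs = [(0, 0), (1, 0), (0, 1)]
pos, uv = sample_triangle(verts, uvs, random.Random(1))
# A point cache, by contrast, could only ever return its precomputed points.
```

This is why a low-poly proxy mesh can still give smooth results, while a point cache's accuracy is bounded by its point count.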

The illustration below shows the difference between sampling a uniform mesh (left), where you can spawn from the surface of all triangles and interpolate attributes, and getting the attributes from the point cache (right). With the point cache, your accuracy is directly bound to the number of points that have been cached.

9886074--1426614--upload_2024-6-12_10-15-14.jpg

Hope this will be helpful. Wish you a sunny day.:sunglasses:


Thank you very much for the response and the links! It definitely looks better using a proxy mesh.

Buongiorno da Roma!

I'm sorry for such a basic question, but I'm pretty new to the VFX Graph logic. I have been searching the web for hours about the "Set Mesh Uvs" block, or anything associated or similar, like "unity vfx graph set uv", "unity set mesh uv graph" and so on.

I have not found literally anything except code, Shader graphs, etc.

I've already enabled the "Experimental Operator/Blocks" checkbox in Preferences > Visual Effect.

Last but not least, in Unity Shader Graph I made a Dissolve effect like yours and set up a "second outline" to create a black burning effect, but I can't use black, of course, since black is used for opacity. How can I correctly invert colors? That way I can use white to get black on the second Add line before setting the emission.

To be clear: I have a GameObject with a Mesh Renderer (with the custom Shader Graph material) that is actually a piece of paper containing a World Canvas. I'm trying to make it burn through dissolve (Shader Graph) and fire particles (VFX Graph).

It would be awesome if you could explain (shortly, if you want) how to get the deformation effect on the mesh too, so it seems the paper is actually burning and dissolving in the air (like the video you mentioned).

@OrsonFavrel

Hope to hear you back, thanks for your support and knowledge sharing anyway :heart:

Best Regards,
Carlo

Hello, don't worry, all questions are relevant.

So, regarding the "Set Mesh Uvs" block that you are referring to: it's just a "Set Attribute" block for a Custom Attribute. In VFX Graph you can create custom attributes, which are there to store data.

In this case, we want to store the UVs of the mesh that correspond to our random sampling.


Storing this UV data in a custom attribute allows us to use it later at different stages.
Now, creating Custom Attributes depends on your Unity Version. Before Unity 6, you would need to:

  • Search for "Custom Attribute" in the node search.
  • Create a Set Custom Attribute block.
  • Select the created block.
  • In the Inspector, set the Custom Attribute Name and Type (in our case, a Vector2).

From here, you should be able to wire your UVs from the sampled Mesh.

Now, to "Get" the custom attribute before Unity 6, you'll have to:

  • In the graph, open the node search.
  • Search for "Get Custom Attribute".
  • Create a Get Custom Attribute Operator.
  • Select the created Operator.
  • In the Inspector, set the Name and Type, making sure they match the ones you set earlier.

If you’re already using Unity 6 Preview:

Set a custom attribute:

  • Open The Blackboard.
  • Click the + Icon to create a new Attribute of type Vector2.
  • Give it a name.
  • Drag & Drop it in the desired Context.

Get a Custom Attribute:

  • Drag and Drop the Custom attribute from Blackboard to the Graph.

Now, regarding the deformation effect on the mesh: I could explain it, but I'm not sure that I would do a better job than Matt Ostertag. So I would suggest that you take a look at the video at this timecode, where he clearly explains how to deform the mesh.
While he's using Amplify Shader Editor, you can do the same in Shader Graph.

I hope this information will help you with what you're trying to achieve. Don't hesitate to come back later if you're still blocked.

Have a fantastic day.


Thanks man, so happy to learn something new.

I've completed the VFX Graph like the one in the screenshot, and I have a Shader Graph with a few differences, but I can't get a result like the one in the gif (Hoodie Dissolve).

My particles just stay at the pivot of the mesh and don't move along the surface. In this small video you can see the dissolve and burn effect I got with the Shader Graph, the particle problem, and the graphs. (Sorry for the all-in-one video, but as a new user I cannot upload multiple elements, just a screenshot at a time.)

SupportRequestGIF

I can't use nodes like "Exposure" in Shader Graph because I'm working in URP. The version of the project is 2022.3.30f1, so thanks for the step-by-step explanation for older versions too.

For the question about mesh deformation, I will try to implement it in my Shader Graph thanks to the excellent video you found. So grateful for your help.

For this question I didn't get a reply: is it possible to get a black outline in the emission? Or must I use red, like in the gif I posted, because black is used for opacity?

Have a great day
Carlo

Well, a black color in the Emission input won't do much. If you want some kind of black/burnt effect, it should be set in the Base Color input.

Now, to help you with the effect, I found this unity package that contains the “Hoodie Dissolve Effect”.
It was made for HDRP, but I added a URP target to the Hoodie Shader Graph, so it should also work in URP; the Volume/Fog settings of the dissolve scene might be wrong, though. I would recommend that you create an HDRP project to study the scene and how everything is set up.

Just open the scene named "dissolve" and select the Hoodie in the Hierarchy view. Open Timeline and you should be good to go.

VFXG_DissolveMesh_Edges.unitypackage (3.2 MB)


Thanks a lot for clarifying that.

About the unitypackage: I tried to import it into projects on both pipelines (URP and HDRP 3D Sample) on Unity 2022.3.30f1, but this error occurs when I try to open the .vfx:

Exception thrown while invoking [OnOpenAssetAttribute] method 'VisualEffectAssetEditor:OnOpenVFX (int,int)' : ArgumentException: Unable to find attribute expression : meshUvs
UnityEditor.VFX.VFXAttribute.Find (System.String attributeName) (at ./Library/PackageCache/com.unity.visualeffectgraph@14.0.11/Editor/Expressions/VFXAttributeExpression.cs:176)
UnityEditor.VFX.Block.SetAttribute.get_currentAttribute () (at ./Library/PackageCache/com.unity.visualeffectgraph@14.0.11/Editor/Models/Blocks/Implementations/SetAttribute.cs:400)
UnityEditor.VFX.Block.SetAttribute.ComputeName (System.Boolean libraryName) (at ./Library/PackageCache/com.unity.visualeffectgraph@14.0.11/Editor/Models/Blocks/Implementations/SetAttribute.cs:122)
UnityEditor.VFX.Block.SetAttribute.get_name () (at ./Library/PackageCache/com.unity.visualeffectgraph@14.0.11/Editor/Models/Blocks/Implementations/SetAttribute.cs:96)
UnityEditor.VFX.VFXGraph.SanitizeGraph () (at ./Library/PackageCache/com.unity.visualeffectgraph@14.0.11/Editor/Models/VFXGraph.cs:602)
UnityEditor.VFX.UI.VFXViewController.ModelChanged (UnityEngine.Object obj) (at ./Library/PackageCache/com.unity.visualeffectgraph@14.0.11/Editor/GraphView/Views/Controller/VFXViewController.cs:883)
UnityEditor.VFX.UI.VFXViewController..ctor (UnityEditor.VFX.VisualEffectResource vfx) (at ./Library/PackageCache/com.unity.visualeffectgraph@14.0.11/Editor/GraphView/Views/Controller/VFXViewController.cs:1391)
UnityEditor.VFX.UI.VFXViewController.GetController (UnityEditor.VFX.VisualEffectResource resource, System.Boolean forceUpdate) (at ./Library/PackageCache/com.unity.visualeffectgraph@14.0.11/Editor/GraphView/Views/Controller/VFXViewController.cs:1366)
UnityEditor.VFX.UI.VFXViewWindow.InternalLoadResource (UnityEditor.VFX.VisualEffectResource resource) (at ./Library/PackageCache/com.unity.visualeffectgraph@14.0.11/Editor/GraphView/VFXViewWindow.cs:240)
UnityEditor.VFX.UI.VFXViewWindow.LoadResource (UnityEditor.VFX.VisualEffectResource resource, UnityEngine.VFX.VisualEffect effectToAttach) (at ./Library/PackageCache/com.unity.visualeffectgraph@14.0.11/Editor/GraphView/VFXViewWindow.cs:186)
UnityEditor.VFX.UI.VFXViewWindow.LoadAsset (UnityEngine.VFX.VisualEffectAsset asset, UnityEngine.VFX.VisualEffect effectToAttach) (at ./Library/PackageCache/com.unity.visualeffectgraph@14.0.11/Editor/GraphView/VFXViewWindow.cs:179)
VisualEffectAssetEditor.OnOpenVFX (System.Int32 instanceID, System.Int32 line) (at ./Library/PackageCache/com.unity.visualeffectgraph@14.0.11/Editor/Inspector/VFXAssetEditor.cs:128)
UnityEngine.GUIUtility:ProcessEvent(Int32, IntPtr, Boolean&)

Of course, I searched online for a way to insert the custom attribute without the editor interface, and also for the project you found, using keywords like "Hoodie", "vfx", "vfxg" and so on, but found nothing.

I don't want to take up too much of your time, but if you can re-upload a unitypackage that definitely works, I'll figure out how to do this effect on my own thanks to your explanations, further research and study, and this project.

This is the result I got (no spawning based on edges, just random; it's a good result, but I really want to understand how spawning on the dissolving edges works with Shader Graph):

ParticlesBurnResult

Reply at your convenience; thanks for the support.
Carlo

There have been some major changes between 2022 LTS and Unity 6. The package that I sent should work if you open it in a Unity 6 preview release.

Now, I took the time to re-create the VFX Graph in URP 2022.3 LTS. I've also added Tri-Planar UVs and an SDF collision. I cannot spend more time on this topic for now, but I'm sure that you'll be able to make it work. You're pretty close now. :wink:

Have a great day.

VFXG_DissolveMesh_Edges_URP2022LTS.unitypackage (4.2 MB)


WoooooooOOOOH :exploding_head:

Thanks a lot, man! You're the best; I can learn a lot from this, and I think others who land on this post will too, at least until Unity 6 is the standard.

For sure, now I can make it work on every model in different ways and with different effects, like building effects too (by reversing it).

I'm grateful for the time you donated to me and to the community; you are very active, I see.

I'll edit this post with the result when it's done. Thanks again, really.

Have an awesome day, like Skeletor: Until we meet agaaaaain! :rofl:
Carlo

EDIT:

In the end, we can say it's working as intended. Unfortunately, I can't use the same emission map texture created by Shader Graph; I searched a lot for how to export it and tried to make it static with "a screenshot" and use it as a mask, but none of this solved the problem. I also tried to set the Visual Effect's exposed property through code, but a material with a custom shader doesn't have "EmissionMap" among its exposed textures.

So the result you see is made with a small addition: the scale property multiplied by the x value of the sampled noise texture. With this small change you can upscale or downscale the noise texture at runtime; I just set it to a fixed 1.5 because in my case the paper never changes dimension. And of course I changed the particles to suit my case better, as you can see.

Thanks again, and lots of good things to all of you!
Cheers

DissolveAndBurnResult
