A generic Visual Scripting tool is currently in development. Check the roadmap for more info.
All our samples will be available soon.
For this you can pack all your frames into a single Texture3D and use a few math nodes to sample the right frame, or you can expose the Texture3D and use a small script/timeline track to set the frame. We may add built-in 3D flipbook functionality in the future.
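As a rough illustration of the script approach, something like this would drive an exposed frame index (a minimal sketch; the exposed names "Frames" and "Frame" are just examples, and the VisualEffect namespace may differ depending on your Unity version):

```csharp
using UnityEngine;
using UnityEngine.Experimental.VFX; // VisualEffect lives here in the 2018.3 previews

// Assumes the graph exposes a Texture3D named "Frames" holding all flipbook
// frames as depth slices, plus a float "Frame" that the math nodes use to
// pick the slice. All names here are illustrative.
[RequireComponent(typeof(VisualEffect))]
public class FlipbookFrameDriver : MonoBehaviour
{
    public Texture3D frames;          // packed flipbook, one frame per slice
    public float framesPerSecond = 24f;

    VisualEffect vfx;

    void Start()
    {
        vfx = GetComponent<VisualEffect>();
        vfx.SetTexture("Frames", frames);
    }

    void Update()
    {
        // Loop over the slices at the requested rate.
        float frame = (Time.time * framesPerSecond) % frames.depth;
        vfx.SetFloat("Frame", frame);
    }
}
```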
Regarding texture formats supported by the VFX Graph:
Texture 3D (used for Vector Fields and Signed Distance Fields, or can be sampled directly for generic use)
Texture 2D
Texture 2D Array
Texture Cube
Point Caches (an internal type which is basically a set of Texture 2Ds storing point attributes)
We plan to work a lot on the VFX Toolbox to allow easy generation of Vector Fields, Signed Distance Fields from meshes, and Point Caches directly within Unity.
We also plan to add exporters for popular DCC tools to export Point Caches and 3D Textures directly usable within Unity.
Motion Vectors blending in outputs will come soon, and Motion Vectors generation will be worked on later, to be included directly in the Image Sequencer tool or our VFX Toolbox.
.fga files are used by UE4 to store Vector Fields; we have our own format called .vf. Depending on what your needs are, we can add support for some popular data formats.
I spent too long researching a few things and writing this reply, so Julien from Unity posted a proper response before I finished this message, but I may as well post it anyway!
The tools to easily import/convert some things on this front seem a bit weak at the moment. The Visual Effect Graph expects Signed Distance Fields to be stored in 3D textures of types like RHalf. It expects vector fields to be stored in 3D textures too, and supports Unsigned Normalized and Signed types. Point caches can also be used, and this is one thing they already provide a utility for: there is a menu option to bring up a window that will convert a mesh to a point cache file the Visual Effect Graph can use, but I haven't tried this side of the system at all yet.
There is no utility to import .fga files and turn them into suitable 3D textures yet, so I used one that I bought years ago off the Asset Store (Mega Flow) and it works; the format of 3D textures it can create from .fga files seems fine with the Visual Effect Graph. I also bought "JangaFX VectorayGen - Official Vector Field Plugin" just to see if that worked too, but quickly discovered that the bit of code they've written that supports GPU 3D textures is hidden in the UI and doesn't let you actually create texture assets within your Unity project, so this is an incomplete solution that I cannot test with the Visual Effect Graph unless I take a little bit of time to add some code to it.
There is some code that Unity have used to import signed distance fields from .vf format files so that they turn into RHalf 3D textures inside Unity, which the Visual Effect Graph can then use. I know nothing about this .vf file format, so I've been exploring other SDF solutions that bake to a suitable 3D texture asset in Unity, but I haven't got far yet and spoke about this in earlier posts.
It is good to know that Unity are working on various solutions, and I am not that surprised to hear that they favour making exporters for their own format and/or Unity textures rather than just doing an .fga importer. I still think an .fga importer -> 3D texture baker would be popular, since 3rd party Asset Store tools like Mega Flow that can do that come with lots of other unrelated functionality, and the associated cost and UI complexity.
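For what it's worth, here is roughly what I imagine such a baker would look like, based on my (possibly imperfect) understanding of the .fga layout: comma-separated floats, three for resolution, six for the bounding box, then x,y,z triplets per cell. I haven't tested this against real exporter output, so treat it as a starting point rather than a working importer:

```csharp
using System;
using System.Globalization;
using System.IO;
using System.Linq;
using UnityEditor;
using UnityEngine;

// Rough sketch of an .fga -> Texture3D baker. The output path and menu name
// are arbitrary, and there is no error checking on the file contents.
public static class FgaToTexture3D
{
    [MenuItem("Tools/Import FGA as Texture3D (sketch)")]
    static void Import()
    {
        string path = EditorUtility.OpenFilePanel("Select .fga file", "", "fga");
        if (string.IsNullOrEmpty(path)) return;

        // .fga is plain text: values separated by commas and/or newlines.
        float[] values = File.ReadAllText(path)
            .Split(new[] { ',', '\n', '\r' }, StringSplitOptions.RemoveEmptyEntries)
            .Select(s => float.Parse(s, CultureInfo.InvariantCulture))
            .ToArray();

        int sx = (int)values[0], sy = (int)values[1], sz = (int)values[2];
        // values[3..8] hold the bounding box; the graph takes bounds separately.

        var colors = new Color[sx * sy * sz];
        for (int i = 0; i < colors.Length; i++)
        {
            int o = 9 + i * 3; // skip the 9 header values, then x,y,z per cell
            colors[i] = new Color(values[o], values[o + 1], values[o + 2], 1f);
        }

        // RGBAHalf keeps the sign of the vectors, which a normalized format would lose.
        var tex = new Texture3D(sx, sy, sz, TextureFormat.RGBAHalf, false);
        tex.SetPixels(colors);
        tex.Apply();
        AssetDatabase.CreateAsset(tex, "Assets/ImportedVectorField.asset");
    }
}
```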
You need to change the render pipeline path in Settings/VFX to point to legacy. We're not sure this is still working, as it is not maintained anymore, but we can fix it if it's not.
I think the only "missing piece" here is on the import side, as I was just discussing, and there is a 3rd party solution you can use if you can provide each Houdini frame in a format Mega Flow supports. Once the data is in the right 3D texture format in Unity, as a 3D texture asset for each frame, you could just expose the vector field input in the effect graph as a parameter and use a simple script to cycle through the 3D textures and assign them to that parameter (see the sketch below). If there are performance implications to this basic approach, and the resolution of your simulation data is low enough, it might be better to create an atlas of all the frames in one texture and write some custom effect graph code to handle it, but that's a fair bit of additional effort and I cannot currently evaluate whether it would be worth it.
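To make the "simple script" part concrete, this is the sort of thing I have in mind (a sketch; the exposed parameter name "VectorField" is just an example, and the namespace may vary by Unity version):

```csharp
using UnityEngine;
using UnityEngine.Experimental.VFX; // namespace may differ by Unity version

// Cycles through pre-baked Texture3D assets, one per simulation frame, and
// assigns the current one to an exposed Texture3D parameter on the graph.
[RequireComponent(typeof(VisualEffect))]
public class VectorFieldSequence : MonoBehaviour
{
    public Texture3D[] frames;        // one baked Texture3D asset per sim frame
    public float framesPerSecond = 30f;

    VisualEffect vfx;
    int lastFrame = -1;

    void Start() { vfx = GetComponent<VisualEffect>(); }

    void Update()
    {
        int frame = (int)(Time.time * framesPerSecond) % frames.Length;
        if (frame != lastFrame)       // only touch the parameter when it changes
        {
            vfx.SetTexture("VectorField", frames[frame]);
            lastFrame = frame;
        }
    }
}
```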
If anyone has a bunch of frames of such vector field data in a suitable format (Mega Flow supports .fxd, .fga and a Maya plugin that exports .flw, so those formats are what I can use right now) and they don't mind sharing them, then I would be happy to have a go at getting it working in Unity with the Visual Effect Graph. It's something I shall probably try myself eventually anyway, but right now I don't use external tools, so I lack the necessary vector field data until such time as I bother to add a baking function to the fluid sim that I use directly inside Unity. And I never seem to quite get round to that because I have so much fun using the data from that fluid sim live in Unity, even though it eats most of my GPU, so it isn't really that practical for games unless I lower the simulation resolution to levels that don't give me the results I crave. I really should investigate using it as a baking tool now that the Visual Effect Graph is here, but I'm not exactly a pro coder, more of a botch merchant.
Thanks, I did check the roadmap two years ago and it's still the same today; it's in research.
"Initial prototype of Visual Scripting, for internal demonstration and feedback gathering purposes."
To me it seems like this will only come out when the ECS preview is final and stable, because it will be built on top of it, right? Let me just say I'm really looking forward to it, and it seems the base to build on is there in the interfaces for Shader Graph and the VFX Graph.
I appreciate the answer. I'm trying to do what you suggested, but I'm not familiar enough with the graph to make it work. Is there a simple example of this available? Or is it better to wait until you guys straighten out the workflow?
My primary use case is having all effects seem like they are in world space, but are actually in local space in relation to a floating origin parent transform. When the parent transform shifts position when the camera is far from the origin, all effects shift along with it without leaving any particles or trails behind at the previous positions, because they are local to that parent.
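For context, the floating-origin shift itself is just a transform move, roughly like this (a minimal sketch; all the names and the threshold are mine):

```csharp
using UnityEngine;

// When the camera drifts too far from the origin, move the parent that holds
// all effects (and the camera) back so everything stays near (0,0,0).
// Because the effects simulate in local space under this parent, no particles
// or trails are left behind at the old positions.
public class FloatingOrigin : MonoBehaviour
{
    public Transform world;           // parent of all effects and scenery
    public Transform cam;
    public float threshold = 5000f;   // arbitrary distance before we re-center

    void LateUpdate()
    {
        if (cam.position.magnitude > threshold)
        {
            Vector3 offset = cam.position;
            world.position -= offset; // shift the world back under the camera
            cam.position = Vector3.zero;
        }
    }
}
```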
I don't find the installation clear... I downloaded the zip directly and opened the project (the HDRP test one, right?) with the latest beta, but I have no Visual Effect Graph... can somebody help me? I've spent an hour trying everything I can, but I just don't get it.
Actually, the HDRP Test project does not reference the Visual Effect Graph at the moment.
Right now, if you get the project directly from the ScriptableRenderPipeline GitHub repo, you will have to configure by hand a new project that references all of: SRP Core, HD Render Pipeline, Shader Graph and Visual Effect Graph.
It comes with two batch files, one to fetch the SRP repository and one to update it, and it is already configured.
Basically you need to download this repository (or clone it), make sure you have Git for Windows installed, then run the FirstTimeSetup.bat file: it will fetch the SRP repository. You can then open the project using 2018.3.0b6 or newer versions.
This project is already configured, and contains a simple scene with three really simple effects.
That sounds very wrong to me. There are all sorts of different things that can be stored in 3D textures, but the nature of the data is really quite different, and it's not good to confuse one for the other. Signed distance fields store information about how close a point is to the surface of an object, and whether it is inside or outside the object. This information has various applications in game engines, and in this case it's used to provide particle attractor features that are relatively cheap to compute in realtime. Searching for historical uses of SDFs in Unity will cause further confusion since, in the past, apart from a couple of specific SDF render-related 3rd party plugins, the most commonly discussed use of SDFs in Unity was 2D SDFs for text font rendering.
This "SDF stored in a 3D texture" data is quite different from 3D textures used for things like fog density, or 3D textures that store vector fields which influence the velocity of particles. I already posted about some of the options for importing these kinds of data into Unity; the short answer is that the appropriate tools are not really provided yet, and I only got results by using paid 3rd party assets (e.g. Mega Flow for vector field .fga import -> 3D texture) and butchering existing SDF code by Keijiro which was not written with the Visual Effect Graph system in mind. Unity plan more tools to help with this stuff and with exporting from external tools, but these tools aren't available yet (with the exception of the mesh to point cache utility).
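To make the distinction concrete, here is what the SDF data itself looks like if you bake the simplest possible case, a sphere, into an RHalf 3D texture (purely illustrative; a real baker would work from a mesh):

```csharp
using UnityEngine;

// Each voxel stores the signed distance from that point to the nearest
// surface: negative inside the object, positive outside. A sphere has a
// closed-form distance, which makes it a handy test case.
public static class SphereSdfBaker
{
    public static Texture3D Bake(int res = 32, float radius = 0.35f)
    {
        var tex = new Texture3D(res, res, res, TextureFormat.RHalf, false);
        var colors = new Color[res * res * res];
        for (int z = 0; z < res; z++)
        for (int y = 0; y < res; y++)
        for (int x = 0; x < res; x++)
        {
            // Voxel centre mapped into a [-0.5, 0.5] cube.
            var p = new Vector3((x + 0.5f) / res - 0.5f,
                                (y + 0.5f) / res - 0.5f,
                                (z + 0.5f) / res - 0.5f);
            float d = p.magnitude - radius;   // sphere SDF: distance to surface
            colors[x + res * (y + res * z)] = new Color(d, 0f, 0f, 1f);
        }
        tex.SetPixels(colors);
        tex.Apply();
        return tex;
    }
}
```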
Yeah, I should have been clearer about that. What I recommended wasn't a solution but an alternative. The textures created with the tool work (from my tests), but they are definitely not Signed Distance Field textures, so I shouldn't have included that part. (They won't get you the exact result you're looking for, but it's alright for testing.)
The video I included in the post above explains that there are tools coming, so we just have to wait for them.
Any idea how to set up a parameter binder? I tried adding the VFX Parameter Binder script with a position attribute, but I'm having trouble understanding what the workflow is here.
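From poking at the binders that ship with the package, my understanding so far is that each binder is a small class that pushes a scene value into an exposed graph property every frame, something like this (method and property names are taken from what I can see in the utility scripts, so they may differ by package version):

```csharp
using UnityEngine;
using UnityEngine.Experimental.VFX;
using UnityEngine.Experimental.VFX.Utility; // namespace varies by package version

// A custom binder: copies a Transform's position into an exposed Vector3
// property on the graph. Add it through the VFX Parameter Binder component.
public class TransformPositionBinder : VFXBinderBase
{
    public string property = "Position"; // name of the exposed Vector3 in the graph
    public Transform target;

    // The binder list shows this as valid only when the property actually exists.
    public override bool IsValid(VisualEffect component)
    {
        return target != null && component.HasVector3(property);
    }

    // Called every frame to push the scene value into the graph.
    public override void UpdateBinding(VisualEffect component)
    {
        component.SetVector3(property, target.position);
    }
}
```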