From what I've seen, it's pretty clean!
Request: outline.
Great start!
Wishlist:
- Custom Shaders (vert, frag)
- Custom code node
- Custom Lighting
- Mobile Optimization
- Multiple Passes
- Support for particle and UI shaders
Yes, we have a simple public roadmap on the plugin site, which we will expand as we take all feedback into account and bring in more information from our private, more detailed roadmap.
You can only create Surface Shaders for now. Our main focus with this first beta was more on the user experience and overall workflow, but in the future we will support standard vertex/fragment shaders and other cool new features.
We can definitely add this functionality!
We will replicate this issue on our end and fix it asap.
Rest assured that we will fix all these nasty issues.
We use it internally to catch when you hit Reset on the material and set its properties back to the shader's default values.
Surface shaders are only the beginning. Custom shaders have a really high priority on our roadmap.
Can you elaborate on "putting everything into emission"? We only force everything into the emission channel when the Debug port is connected; otherwise the instructions are fed into the corresponding connected ports (albedo/normal/etc.).
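To illustrate what that routing means in the generated surface function (a simplified, hand-written fragment, not the exact code ASE writes; _MainTex is just a stand-in for whatever node is connected):

```
// Debug port connected: whatever feeds it is forced into the emission channel,
// so it shows up unlit and untouched by the lighting.
float3 debugValue = tex2D( _MainTex, IN.uv_MainTex ).rgb;
o.Emission = debugValue;

// Debug port not connected: each value is written to its own connected port
// instead, e.g. o.Albedo, o.Normal, etc.
```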
Yes we completely agree with you. We will definitely register your suggestion.
We aim to generate clean shader code so users can clearly see what is being done. This also makes it easier to catch possible bugs in the shader code generation.
You can manually edit the shader in a text editor, but those changes won't be read by our plugin. We only parse our custom data, which is always written at the end of the shader file.
We also create a checksum of the file so we can detect manual changes and warn users, when opening the shader in ASE, that those changes will be lost.
Optimization-wise, during shader code generation we check for multiple usages of the same output, save its value into a temporary variable, and reuse that temporary. This also helps prevent multiple fetches of the same texture node.
It's far from finished, but this is an area we will continuously work on in order to keep improving the generated shaders.
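For example (a hand-written sketch with made-up names, not actual ASE output), if the same Texture Sample node fed both Albedo and Emission, the generated surface shader would end up looking roughly like this, with a single fetch stored in a temporary:

```
Shader "Hidden/TempVarSketch"
{
    Properties { _MainTex ("Texture", 2D) = "white" {} }
    SubShader
    {
        Tags { "RenderType" = "Opaque" }
        CGPROGRAM
        #pragma surface surf Standard
        #pragma target 3.0

        sampler2D _MainTex;
        struct Input { float2 uv_MainTex; };

        void surf (Input IN, inout SurfaceOutputStandard o)
        {
            // the same texture node feeds two ports, but it is sampled only once
            float4 temp_MainTex = tex2D (_MainTex, IN.uv_MainTex);
            o.Albedo = temp_MainTex.rgb;
            o.Emission = temp_MainTex.rgb * 0.25;
        }
        ENDCG
    }
    Fallback "Diffuse"
}
```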
Can you elaborate a bit more on that?
Thank you so much for all this initial feedback! We are already working at full force on improving this first beta based on it!
All great suggestions!
Another quick bug: the Append node for Vector2 is generating a Vector3 instead.
Nicely caught, we already replicated and fixed it!
Also purchased; it's really promising. An interesting request: mobile optimization/warnings ("this function is too expensive for mobile!"), for example.
Ah yes, it makes perfect sense!
Also, kebrus was kind enough to share his shader with us so we could replicate his issue. There was an internal casting issue which, fortunately, was easy to spot and fix.
I bought it a few minutes ago and will give it a try, because your other plugins are among the best in the Asset Store.
How many devs are working on this plugin? Do you have any estimates for your update cycle timing?
Ok, two more.
So I managed to create my own node to get the screen color (still very buggy but it seems to work). But I found two other problems in the process.
The first is that, since we don't have custom shaders yet, putting a refractive shader into emission ruins it, so in the meantime I'm outputting to albedo as a test. The result was not what I expected, and with some tinkering I found it was the render queue: I want to render it after everything else. The problem is that you have the render queue tied to the render type. In my case I want the render queue set to "Transparent" and the render type set to "Opaque", but if I select Opaque it always changes the queue to Geometry. Not only that, but something looks funny: I might be wrong here, but shouldn't it be something like "Geometry+0" instead of "Geometry0"? It might be a typo. Anyway, if I then change the render type to Transparent I get the correct render queue, but the effect gets lost because blending kicks in now that it's a transparent shader.
The other thing I'm not sure about is the matrices. I temporarily got around the first problem by opening the file and fixing it manually. I was trying to make my refraction shader distort with the object geometry, and I needed one of the matrices for a dot product, but the output is always 0 (zero). Looking at the files they seem fine; I tried a bunch of them ("UNITY_MATRIX_MVP", for example) and they all output zero.
If I get these two fixed, I should be able to share some fairly nice refraction shaders.
We currently have one developer working full-time on the beta. Our initial estimate is to release major beta updates on a monthly basis on the Unity Asset Store. We are also working on a private forum where we will try to release new builds multiple times per week. We really want the community to always have the latest version and to maintain a good feedback loop.
Yes, the ‘+’ was missing. This was indeed a typo.
Concerning the render type and queue, we decided to have presets (the Blend Mode dropdown) to keep it simple for this first beta, but yes, we will have separate configurations for each one.
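In the meantime, the manual workaround kebrus mentions boils down to setting the two tags independently in the generated file, along these lines:

```
SubShader
{
    // render after all opaque geometry, but keep the Opaque render type
    Tags { "Queue" = "Transparent" "RenderType" = "Opaque" }
    // ... rest of the generated subshader stays as it was
}
```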
I'm not quite sure I understand what the issue might be, since the dot product is a vector operation and shouldn't be used on matrices. Are you trying to access a specific row/column of a matrix, use that in the dot product, and getting incorrect results?
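In case it helps, here is a quick hand-written CG fragment of what we mean (vertexPos stands in for whatever object-space position you feed it):

```
// UNITY_MATRIX_MVP is a float4x4; indexing it returns one row as a float4.
float4 mvpRow0 = UNITY_MATRIX_MVP[0];

// dot() works on vectors, so dotting a row against your vector is fine...
float projectedX = dot( mvpRow0, float4( vertexPos, 1.0 ) );

// ...whereas feeding the whole matrix into dot() is not a valid operation.
```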
We're really looking forward to seeing your shader fully working!
Hi all! It looks great!
I have a question about current or planned functionality.
- Does ASE provide any method for multiple subshaders (with different target platforms, for example) with keyword-dependent nodes?
- Does ASE provide custom lighting or custom cginc includes?
- Can ASE access compute buffers on high-end systems?
Thank you so much for your interest!
In Beta 1 you can only create single-pass, single-subshader surface shaders.
A more detailed roadmap will be provided on the official site as soon as possible, so you know what our near-, mid-, and long-term plans are.
We also really want to stress that it will not be set in stone. This plugin will be community-driven and will change over time according to feedback and requests.
Nice! Picked up a license – excited to see where it goes.
Hm. OK, I'll wait for the official roadmap, because a node-based shader editor is interesting… but in a more general way than one pass and one subshader. In my project I write into compute buffers (in a DX11 subshader) with per-light computation instead of outputting directly from the shader to the screen… if ASE can handle this in the future (subshaders and custom nodes), it will be great.
P.S. Sorry for my bad English… it's my read-only language.
Incredible! You just ripped through the asset rankings with this release in DAYS. How long will this initial price offering last? I own your other products, which I really like, and I'm thinking of buying this too.
Side note: I’m glad you have trial versions of your other plugins. I tried out bloom and motion that way, then purchased them after putting them in several scenes to see what was possible. Because of that, I also purchased occlusion when it went on sale some time ago because I already had an idea of what to expect.
Hi, Shader Forge has lots of users now; what's the competitive advantage of this product?
Personally, I think mathematical lighting models are black magic for artists; would ASE make them easier?
Request: UV-Mixing
Will there be a way of easily integrating these shaders with Amplify Texture?