Block Shaders: Surface Shaders for SRPs and more (Public Demo Now Available!)

Not at all. Adding support for BiRP would allow us to finally move to URP and HDRP.

The need to support BiRP for full compatibility is what forces us to stay on BiRP. If we had cross-RP compatibility, we’d just move to HDRP and write HDRP shaders and materials with the confidence that they will still work in BiRP and URP. That’s what many of us desperately need to leave BiRP behind.

Then Unity could sunset BiRP at any later point, once anyone can use any SRP and URP has feature parity with BiRP.


BiRP compatibility is needed, though I get why Unity keeps wanting to kill it. But it should be easy enough for them to write a template for it; if it isn’t, then something is very wrong with this new system.

That said, I think sticking with the old surface shader format would not only have been a mistake, but would have caused all kinds of headaches for users as well as developers, as some features of that system simply would not work. Further, that syntax was rife with issues - such as the magic variables (viewDir, what space is it in?) and implied vertex interpolators from texture names, etc. It also would have prevented composition of shader blocks, which Better Shaders proved is a very desirable feature.

However, it does mean people like me will have to do yet another translation of all of our shader code to yet another new format, which will take a few months given how much shader code I have. So we’ll just throw it on the 2 years of unpaid labor Unity has already forced me to do or lose my business.

As for the actual system, I’ll need to spend some time with it and figure out if I’ll be able to port my work to it. Immediate things that come to mind: how do I modify things like texcoords to have centroid sampling without modifying the template, since raw v2f structures don’t seem to be exposed? Are tessellation stages supported? Etc.


I think Unity’s coexistence effort, if done properly, will mostly nullify this problem, even if BiRP isn’t supported. I understand there’s some risk, but at some point the feature set will have to make BiRP obsolete.

Like I said, it makes sense for the transition.

At some point things will cross the Rubicon, where people are annoyed when anything new supports BiRP. :stuck_out_tongue:

So unzipping the mac editor from the doc file just produces a large file with a .out extension. What am I supposed to do with this?


The template will likely not have to declare the texcoord for, say, main texture sampling - this will be in the block shader. So you’ll have full control there.
Tessellation support will be there at some point, but it’s not available in the prototype.
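Roughly, extrapolating from the demo’s block syntax (the interface grammar and names below are a sketch, not confirmed prototype syntax), the idea is something like:

Block SampleMainTexture
{
    Interface
    {
        // The block itself declares the texture it samples, so the
        // texcoord it needs travels with the block rather than being
        // hardcoded into the template.
        [Property] in UnityTexture2D MainTex;
        in float2 uv0;
    }
    ...
}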


@jbooth_1 The internal file is a “.tar” so you can just rename and unzip it. Not sure why the internals of the file have no extension.

Since I can’t run the editor, I’m reading the docs:

  • float2 is not listed as a supported type? Nor int2, int3, int4?
  • having a color type instead of a float4 seems like a mistake
  • the property syntax has changed - is there a reason for this? Does this invalidate all material property attributes, etc?
  • only surface and fragment shader support?
  • no way to add a custom pass without modifying the template, for things like an outline pass
  • How are texture2Ds passed to functions? What about samplers, _ST, and _TexelSize? The examples seem to use tex2D semantics.
  • How are template inputs handled? I see the examples use normalTS, but what if my normal is in world space? Do I have to pay the cost of putting it into tangent space and then have the template convert it back? Or can I set either and have it do the right thing by declaring which space my output is in?
Block DirectionalLightingToon
{
    Interface
    {
        [Property] in UnityTexture2D DissuseMap;
        [Property] in float RimAmount;

What’s a DissuseMap? Does that mean it’s not used? I’ve had enough of that second property over the last few years, thanks :wink:

Suggestions:

  • Consider something like the SurfaceData structure I use in Better Shaders to make shader authoring easier for new users. This structure contains a bunch of precomputed variables for the user, like “WorldSpaceViewDir”, “TangentSpaceViewDir”, “WorldSpaceCameraPosition”, etc. Learning to compute all that stuff requires a lot of Unity-specific shader knowledge and also differs between pipelines (camera relative vs. not, etc.), so having a simple set of variables or functions all in one place that does it in a cross-platform way will make everyone’s lives much easier than crawling through source files.
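For reference, a rough sketch of the kind of structure I mean (the field names here are illustrative, loosely modeled on what Better Shaders provides):

// Precomputed per-pixel data handed to the user's surface function,
// so users never have to derive these values themselves per pipeline.
struct SurfaceData
{
    float3 WorldSpaceViewDir;        // already normalized; camera-relative rendering handled
    float3 TangentSpaceViewDir;
    float3 WorldSpaceCameraPosition;
    float3 WorldSpacePosition;
    float3 WorldSpaceNormal;
    float2 UV0;
};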

Also, the folder contents are a little odd (it’s the result of our build system). Internally you’d want to add x64/Release/Unity.app via the Hub, I believe. You may get a warning about it being produced by an unverified source because the exe is not signed. I’m told that on Mac you can allow this either via System Preferences -> Security & Privacy, which has a tab for “Allow apps downloaded from:”, or when you try to open the binary it may pop up asking for explicit approval.

Ok, unzipped the tar, but it says the editor is not a valid application - I also tried adding it to the Hub, which rejected it too.

When added through the Hub, it warns me and I click “Got it”, but then when trying to open the project it can’t find the editor version…

I’m not sure which docs you’re reading, but:

  • float2, int2, etc… are supported
  • there is no color type at the moment, although we may want to have one. Currently you tag a floatN property to tell the system to add the same attribute to the material property.
  • textures are passed using UnityTexture2D (and other UnityTexture types), which bundle the texture, its sampler, and the ST data together.
  • In the prototype you cannot add a custom pass in the shader without creating a new template, but we plan to allow that long term. There’s also a planned system to build one template as an extension of another.
  • templates provide data to the customization points for you to use. It’s up to the template to provide values in multiple spaces if that’s desired.
  • DissuseMap: I assume this is a typo of DiffuseMap

Did you get through the hoop jumping in the System Preferences -> Security & Privacy bypass? Once you’ve got it runnable, I would maybe run the editor directly with the -projectPath command line option just to confirm it all works. I’m not sure how the Hub deals with alpha builds in the wild.

Amplify is a node-based shader editor, just like ShaderGraph. Amplify shaders themselves are just regular shaders that have been compiled out of the Amplify editor for each of the pipelines. ShaderGraph supports all three as well, and you could just as well include ShaderGraph shaders in your store asset and add a shadergraph package dependency.


Yeah, I had to do it via the command line, because the “Anywhere” option isn’t available by default…

  • Adding via the Hub complains that it’s not a signed unity application
  • Double clicking it says “the application Unity cannot be opened”
  • “open Unity.app” via terminal “Application cannot be opened for an unexpected reason”
  • going into Unity.app/Contents/MacOS and trying “open Unity” just opens a text editor with a bunch of goop in it.

Thank you for your feedback!
As @aleksandrk mentioned above, we definitely want to provide a way to convert BiRP surface shaders into SRP compatible Block Shaders and will investigate adding this to our roadmap.

Thanks for trying this out Jason! We were not able to reproduce the issue you are experiencing on macOS so far, but will try to repro and keep you posted.

Joshua already addressed the above points, but just to add:

  • The templates provided with the demo offer Vertex and Surface customization points (to override parts of the vertex/fragment shader stages, respectively), and these would be provided by RP templates along with any other relevant public blocks.

As mentioned in the original post, we are also planning to support additional shader stages. The linked survey has some questions regarding feature prioritization, including shader stage support (e.g. Compute, Geometry, Tessellation, Raytracing) - so it would be very useful if you could provide your input!

  • The provided templates are for demonstration purposes only, but RP provided templates will definitely take such considerations into account and provide a sensible interface to avoid redundant transformations (for your example, normalWS and normalTS could be provided)

  • The example shaders “Assets\BlockShaders\ExtraExamples\Properties Blocks\PropertiesShader.blockShader” and “Assets\Tests\LegacyShaders\PropertyTypes\UnitySamplerStateProperty.blockShader” both provide some reference on using sampler states, so you can check these out.

  • Regarding the shader property name typo, I quickly fixed that and reuploaded the package.

[Property(uniformName = "SamplerState")][SamplerState(filterMode = Trilinear, anisotropicLevel = 8, wrapMode = MirrorOnceV, depthCompare = true)] UnitySamplerState MySamplerState;

So does this mean we can expose sampler states to materials in a non-texture-bound way? That would be a big help in fighting the sampler stripping issues that arise when sharing samplers, or when passes run that don’t use all the outputs. For instance:

TEXTURE2D(_Foo);
TEXTURE2D(_FooNormal);
SAMPLER(sampler_Foo); // one sampler shared by both textures

void SurfaceFunc(Input i, inout Output o)
{
    // This will break on URP 2021's depth normals pass: Albedo is unused there,
    // so sampler_Foo gets stripped even though the normal sample below still needs it.
    o.Albedo = SAMPLE_TEXTURE2D(_Foo, sampler_Foo, i.uv);
    o.Normal = SAMPLE_TEXTURE2D(_FooNormal, sampler_Foo, i.uv);
}

You can either have texture + sampler bundles, or you can declare an inline sampler state. Using “UnityTexture2D” gives you a bundled texture + sampler. Using [SamplerState] creates a deduplicated inline sampler state. This follows the same rules as described here: Unity - Manual: Using sampler states.
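To make the two options concrete, a sketch using the declaration style from the demo examples (the property names and the wrapMode value here are illustrative):

// Option 1: bundled - the texture carries its own sampler and ST data
[Property] in UnityTexture2D AlbedoMap;

// Option 2: an inline sampler state, deduplicated by its settings,
// usable when sampling any texture declared in the block
[Property][SamplerState(filterMode = Trilinear, wrapMode = Repeat)] UnitySamplerState SharedSampler;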

Ok, so no way to have that exposed to the user, then. I’m also assuming I can create a texture without a sampler state, right? I have hundreds of textures in some of my shaders, so I can’t have a sampler for every texture, but I still need the user to specify things like wrap mode (those textures just use the albedo sampler). And that also means I’ll still have to hack around samplers getting stripped while they are still being used by other textures, which is a drag, but no worse than it is now.


Would it be possible to add the ability to expose sampler properties to the editor somehow? Like selecting filtering from the shader, or setting it from C#. That would be really nice.

I believe there is currently no way in the prototype to declare a texture without a sampler. We can definitely add that if it’s something Unity already supports.

Block Shaders are currently only a layer on top of ShaderLab, so if something isn’t supported in ShaderLab then it isn’t supported here. Since there is no current way to expose sampler states to materials, this isn’t something we could support easily in the MVP.
