Transparency in QNX Unity

Hi Unity,
We are developing an HMI application for a QNX target using Unity (version 2022.3.12f1).
Alongside our HMI application, another HMI application is running in parallel.
We have a requirement to make a specific part of our UI fully transparent for a specific use case.
In this transparent area, the other HMI application will render its UI.
When the second application is stopped, we will disable the transparency and show our own UI content.
The transparent area can be any shape (e.g. rectangle, square, circle, rectangle with rounded corners).

How can we achieve this requirement in Unity (2022.3.12f1)?

Note: We have to achieve this on a single physical display, and the transparency should be configurable (enable/disable).
Since this is an embedded application, the transparency-related configuration should not reduce the performance of our target.

Hi! Sounds like you want to render a “hole” on an object into whatever you already have in your framebuffer 🙂 This works like this:

  • Ensure that Unity renders to a transparent framebuffer (set the force RGBA framebuffer option in Player Settings). EDIT: this should be the default on QNX, since it’s common to do this there.

  • Create a custom shader that overwrites the framebuffer alpha (a minimal sketch follows after this list):

      • Use a blend operation (One Zero) that overwrites the alpha in the framebuffer with whatever is in the alpha channel of your texture (this way you can use an arbitrary mask for this).

      • ZTest Always (so it renders “above” whatever comes before and the “hole” isn’t overwritten again).

      • Set the RenderQueue (in the shader inspector) to something higher (e.g. 2500) to ensure it’s “rendered” after other things.
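
To make this more concrete, here’s a minimal sketch of what such a cutout shader could look like. This is not the exact shader from the attached package; the shader name, property name, and queue value are only illustrative, and it assumes the Built-in Render Pipeline:

Shader "Custom/CutoutHole"
{
    Properties
    {
        _MainTex ("Mask Texture (alpha = opacity)", 2D) = "white" {}
    }
    SubShader
    {
        // Queue 2500 (Geometry+500) so it is rendered after the regular opaque content.
        Tags { "Queue" = "Geometry+500" "RenderType" = "Transparent" }

        Pass
        {
            ZTest Always   // render "above" whatever came before so the hole is not overwritten
            ZWrite Off
            Blend One Zero // replace the framebuffer contents (including alpha) with this fragment

            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _MainTex;
            float4 _MainTex_ST;

            struct v2f
            {
                float4 pos : SV_POSITION;
                float2 uv  : TEXCOORD0;
            };

            v2f vert (appdata_base v)
            {
                v2f o;
                o.pos = UnityObjectToClipPos(v.vertex);
                o.uv = TRANSFORM_TEX(v.texcoord, _MainTex);
                return o;
            }

            fixed4 frag (v2f i) : SV_Target
            {
                // Whatever alpha the mask texture has is written straight into the
                // framebuffer, so alpha = 0 in the mask becomes a fully transparent hole.
                return tex2D(_MainTex, i.uv);
            }
            ENDCG
        }
    }
}

You’d put a material with a shader like this on your “hole” object; since any mesh or mask texture works, the rectangle/circle/rounded-rectangle shapes you mentioned are just a matter of the mask you use.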

Here’s an example setup that I made.

This might look confusing, so let me explain what’s happening here 🙂

Since we can’t preview the transparency of Unity’s framebuffer in the editor the way we would see it on the final system (which would be configured to overlay Unity over other content), my scene has a preview part and an offscreen part. I use layers in this setup to specify which camera should render what, but if your goal is to render everything into one final transparent framebuffer, you don’t need to do this. This is only for this test setup.

  • “preview”

      • contains a “PreviewPlane” plane that just shows the test framebuffer transparently so we can see if our transparency works.

      • “Background Stuff” contains some test objects that simulate the things that would be “behind” Unity on your system.

      • The camera renders the preview things to screen.

  • “offscreen”

      • That’s the actual Unity part.

      • The RenderTextureCamera renders things into the TargetTexture buffer.

      • “Cube” is just a test object that’s behind our “hole” object so we can test if we can render a hole into things.

      • “cutout_test” is an object that uses the custom “Cutout” shader. This behaves like a normal lookup (I just put some color gradients for testing) but it does something special with the alpha channel (that’s the blending mentioned in the previous post). Instead of rendering itself transparently over whatever exists already in the buffer (TargetTexture in the example case), it overrides the contents of the buffer with the alpha from the texture. This ensures that this area is guaranteed transparent, no matter what’s rendered before.

As a result, in the “hole” we can see the cylinders in the back (which are part of “Background Stuff”) but not the cube (in the area covered by the cutout object).


9602180–1361309–test.unitypackage (101 KB)

One more note: you might want to render only a “hole”, leaving the color intact and only overwriting the alpha, using the minimum of whatever alpha is already there (so it can stay transparent) and your defined mask.
For this, you can use this blend mode:

BlendOp Add, Min
Blend Zero One, One Zero
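
For reference, here’s a minimal sketch of a full pass using that blend mode. Again, this is an assumption of how it could look in the Built-in Render Pipeline rather than the exact shader from the packages, and the “_MaskTex” property name is a placeholder:

Shader "Custom/HoleOnly"
{
    Properties
    {
        _MaskTex ("Mask (alpha = opacity)", 2D) = "white" {}
    }
    SubShader
    {
        Tags { "Queue" = "Geometry+500" "RenderType" = "Transparent" }

        Pass
        {
            ZTest Always
            ZWrite Off
            // Color: Blend Zero One keeps the framebuffer color untouched.
            // Alpha: with BlendOp Min the result is min(fragment alpha, existing alpha),
            //        so anything that is already transparent stays transparent.
            BlendOp Add, Min
            Blend Zero One, One Zero

            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _MaskTex;
            float4 _MaskTex_ST;

            struct v2f
            {
                float4 pos : SV_POSITION;
                float2 uv  : TEXCOORD0;
            };

            v2f vert (appdata_base v)
            {
                v2f o;
                o.pos = UnityObjectToClipPos(v.vertex);
                o.uv = TRANSFORM_TEX(v.texcoord, _MaskTex);
                return o;
            }

            fixed4 frag (v2f i) : SV_Target
            {
                // Only the alpha channel ends up in the framebuffer (the color blend is Zero One).
                return tex2D(_MaskTex, i.uv);
            }
            ENDCG
        }
    }
}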

9602612--1361378--hole_small.gif

Hi!

Thank you for the quick and detailed response.
Actually, the above example enables transparency within the same application.
In our case we have two different applications with no relation between them;
each runs with its own process name and process ID.

Our real use case:
For example, with a navigation application in the back layer and a UI application in the front layer, the UI app needs to show the navigation in the transparent area.

We have tried setting the camera’s ‘Background Type’ property to ‘Uninitialized’ and kept the transparent area in the UI app to show the navigation, but the result was a solid black color in the transparent area instead of the navigation showing through.

9603278--1361534--Camera_Background_Setting.png

Kindly share a solution for how to achieve this use case.

Thank you…!

The reason I did the transparency in the same application is that you can’t show it in Unity otherwise (we wouldn’t see your real background, so I had to simulate it).

The same technique should work for your target system where Unity itself overlays non-Unity content. You just don’t have to do the render texture part.

For a URP camera, could you set the “background type” to “Solid Color” with the alpha at 0?

It’ll also require you to turn post-processing off for URP, which conflicts with this other issue. We’re working on a setting to preserve alpha with post-processing in URP.


How can we do this for a 2D sprite? Consider a sprite whose shape needs to act as a mask; through that masked area we need to see the other application in the layer behind.

You can assign a shader to a 2D sprite as well, and then you can use the blend mode I described above.
I attached an example of what the final setup would look like.

The 2D cutout layer cuts out the background of everything else “behind” it. But this is not visible in the Editor, since the Editor currently doesn’t support previewing a transparent framebuffer (that’s why I built the workaround with the render texture in the other examples).
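
For illustration, here’s a rough sketch of what a sprite-compatible cutout shader could look like. It is modeled on the structure of Unity’s default sprite shader combined with the blend mode above, assumes the Built-in Render Pipeline, and is not the exact shader from the attached package:

Shader "Custom/SpriteCutout"
{
    Properties
    {
        [PerRendererData] _MainTex ("Sprite Texture", 2D) = "white" {}
        _Color ("Tint", Color) = (1,1,1,1)
    }
    SubShader
    {
        Tags { "Queue"="Transparent" "RenderType"="Transparent" "PreviewType"="Plane" "CanUseSpriteAtlas"="True" }

        Cull Off
        Lighting Off
        ZWrite Off
        ZTest Always

        Pass
        {
            // Keep the framebuffer color; write min(sprite alpha, existing alpha) into alpha,
            // so the sprite's shape becomes a transparent hole.
            BlendOp Add, Min
            Blend Zero One, One Zero

            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _MainTex;
            fixed4 _Color;

            struct appdata_t
            {
                float4 vertex   : POSITION;
                float4 color    : COLOR;
                float2 texcoord : TEXCOORD0;
            };

            struct v2f
            {
                float4 vertex   : SV_POSITION;
                fixed4 color    : COLOR;
                float2 texcoord : TEXCOORD0;
            };

            v2f vert (appdata_t v)
            {
                v2f o;
                o.vertex = UnityObjectToClipPos(v.vertex);
                o.texcoord = v.texcoord;
                o.color = v.color * _Color;
                return o;
            }

            fixed4 frag (v2f i) : SV_Target
            {
                // Only the alpha of this result matters; the color blend (Zero One) discards RGB.
                return tex2D(_MainTex, i.texcoord) * i.color;
            }
            ENDCG
        }
    }
}

The sprite’s own alpha (its shape) then drives the hole, so you could enable or disable the effect by swapping this material on the SpriteRenderer back and forth with a normal sprite material.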

9627530–1367516–cutout2.unitypackage (1.39 MB)