Hey there!
So I was testing different technical approaches with a sci-fi scene where various emissive materials contribute to the atmosphere. Since the game is made for VR, MSAA is pretty much vital for the visuals, which enforces using the forward renderer.
Now here’s my problem: I want these lights (and only these lights) to be affected by the bloom filter, but I cannot use HDR and MSAA at the same time in Unity. Without HDR, colors are clamped to 1, so I cannot reasonably set the bloom threshold to 1, and any lower value will bloom every somewhat brighter surface, which is then amplified to an uncomfortable degree by the lenses in VR - something I would like to avoid.
Anyone with more shader knowledge than I have with an idea how to approach this issue?
I’m not going to answer the question you asked, but instead a different, and probably more interesting, one.
HDR + MSAA in Unity - it can be done! Just not with the default rendering settings. The reason is how the backbuffer gets created, plus a bunch of legacy stuff. But if you know the hardware you are targeting supports HDR and MSAA, it’s pretty simple.
In your quality settings, disable MSAA and set the camera to HDR mode. This will let you work in the scene quite well. With this technique, MSAA will only be enabled in play mode.
using UnityEngine;

// Renders the camera into a temporary HDR render texture with MSAA,
// then blits the result to the backbuffer.
[RequireComponent(typeof(Camera))]
public class HDRAA : MonoBehaviour
{
    public RenderTexture temp = null;
    public bool AA = false;

    void OnPreRender()
    {
        var camera = GetComponent<Camera>();
        // HDR color format with 8x MSAA (or no MSAA when AA is off).
        temp = RenderTexture.GetTemporary(camera.pixelWidth, camera.pixelHeight, 24,
            RenderTextureFormat.DefaultHDR, RenderTextureReadWrite.Default, AA ? 8 : 1);
        camera.targetTexture = temp;
    }

    void OnPostRender()
    {
        // Detach the render target, resolve it to the backbuffer and release it.
        GetComponent<Camera>().targetTexture = null;
        Graphics.Blit(temp, (RenderTexture)null);
        RenderTexture.ReleaseTemporary(temp);
    }
}
Just note: this will use more memory for the texture (8x MSAA in this example). I tested it with bloom and it seemed to work pretty well.
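To get a feel for that cost, here is a rough back-of-the-envelope estimate. The resolution is just an illustrative combined-eye size; actual eye buffers depend on the headset and render scale, and drivers may add further overhead:

```csharp
// Rough memory estimate for an 8x MSAA HDR render target.
// All numbers are illustrative assumptions, not measured values.
int width = 2160, height = 1200;   // e.g. a combined-eye target
int bytesPerPixel = 8;             // RenderTextureFormat.ARGBHalf = 4 x 16 bit
int samples = 8;                   // 8x MSAA keeps one color sample each
long colorBytes = (long)width * height * bytesPerPixel * samples;
long depthBytes = (long)width * height * 4 * samples; // 24-bit depth + stencil, padded to 32 bits
long totalMB = (colorBytes + depthBytes) / (1024 * 1024);
// roughly 158 MB color + 79 MB depth, so around 237 MB for this example
```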
For whatever reason, the notifications for replies in this thread only just appeared, so I totally missed this. Yes, a solution like this for VR would be highly appreciated. On other platforms I could always go deferred instead, but having MSAA + HDR in OpenVR would be an amazing step forward.
I’m curious: is there anything beyond the additional memory one should be aware of when using this technique? I assume there’s a reason Unity doesn’t allow MSAA + HDR by default? In our case we are targeting stationary machines with a GTX 1080 configuration, so we have 8 GB of memory available - that should be enough.
I think the main reason is that this part of our render pipe is a little old. Modern hardware supports HDR MSAA (and so do we, if you do it manually), but that part of our render pipe hasn’t had the engineering love it needs. We have a pretty cool plan to address this that extends the concept of command buffers.
Are you using Unity 5.4? Stereo rendering should work with render targets and image effects, though you may have to make some shader changes for Single-Pass Stereo Rendering. We would appreciate a bug report about cases that don’t work.
The repro case: create a new project, enable VR support (single pass disabled), add that script to a camera, and run. The result is a black game view. If you remove the script and add it back while in play mode, the view “freezes”.
The logical flip of that is that any camera with a render texture does not get rendered “to your device”, which I assume means the front buffer.
This doesn’t cause issues for image effects or for separate cameras rendering to a render texture, as those are composited into the main camera, which at initial render time does not have a render target.
Same problem here: Unity 5.4f3 with single pass disabled results in a black screen when using this script, specifically when assigning camera.targetTexture.
Looking further into it, the script seems to work with VR mode disabled, but putting the bloom effect (newest version from the Bitbucket repo) onto the camera results in a frozen screen as well (with VR enabled, it freezes the editor completely).
Yep, as much as I was hoping I could just copy Tim-C’s script, drop it into our VR project and have MSAA and HDR… I already knew it wouldn’t be that easy. The difference MSAA makes for the project we are working on is huge. How huge? Huge enough that we will drop back from the deferred renderer just to use it… if I have to add glow cards to everything that previously bloomed nicely, I will do it. Out of the box, no other AA type can touch MSAA under certain circumstances. Unity engineers, I beseech you to sort this out ASAP. Other devs and I need HDR support with MSAA… and it needs to work in VR with SPS. So if there is an altar somewhere where I could sacrifice a goat, or an offshore account I could deposit funds into that would speed this up, let me know.
@Tim-C Rather than fixing this, I’d prefer to just have HDR and MSAA work in Unity natively, with some way of defining an inverse tonemapper to be used, along with the Cinematic Tonemapping & Color Grading component supporting it.
Haha, agreed! We were exhibiting at gamescom last week, and being one of the smaller VR exhibitors at the show, people actually came to us and told us our graphics looked nicer / crisper than most of what they had seen at the larger companies, who must’ve looked rather blurry most of the time. So it’s really worth choosing MSAA/forward for VR. I’m really looking forward to that HDR fix, as it will ramp up the graphical fidelity even more, especially with emissive surfaces.
For those who can’t wait… I managed to make it work in VR. It’s semi-efficient, but together with single-pass rendering, the performance impact should be rather minor.
using UnityEngine;

// Renders the scene with a hidden helper camera into an HDR + MSAA
// render texture, then blits the result into the post-effect chain.
[RequireComponent(typeof(Camera))]
public class HDR_MSAA_VR : MonoBehaviour
{
    private Camera msaaCam;
    private GameObject msaaCamGO;

    public LayerMask layerMask;
    public bool MSAA = true;
    public bool didRender = false;

    void OnRenderImage(RenderTexture source, RenderTexture destination)
    {
        var srcCam = GetComponent<Camera>();

        // Lazily create the helper camera the first time we render.
        if (msaaCamGO == null)
        {
            msaaCamGO = new GameObject("tmpCam");
            msaaCamGO.hideFlags = HideFlags.DontSave;
            msaaCamGO.transform.parent = transform;
            msaaCam = msaaCamGO.AddComponent<Camera>();
        }

        msaaCamGO.SetActive(true);
        msaaCam.CopyFrom(srcCam);
        msaaCam.cullingMask = layerMask.value;
        msaaCam.depth = -100f;

        // HDR render target with 8x MSAA (or none when MSAA is off).
        RenderTexture rtMSAA_HDR = RenderTexture.GetTemporary(source.width, source.height, 24,
            source.format, RenderTextureReadWrite.Default, MSAA ? 8 : 1);
        msaaCam.targetTexture = rtMSAA_HDR;
        msaaCam.Render();
        msaaCam.targetTexture = null;
        msaaCamGO.SetActive(false);

        // Resolve the MSAA texture into the post-effect chain.
        Graphics.Blit(rtMSAA_HDR, destination);
        didRender = true;
        RenderTexture.ReleaseTemporary(rtMSAA_HDR);
    }
}
Put the script on your camera, before all the post-effect stuff. Also set your camera’s culling mask to “Nothing”, since this script basically renders the scene with a separate camera and blits it back into the VR camera’s post-effect chain. Together with Valve’s Lab Renderer, this should be epic!
It’s done by in-place tonemapping, so you have to apply “NeutralTonemapping()” (in InverseTonemapping.cginc) to every output in every pixel shader you use. For surface shaders, it can be done with a final color modification, but I haven’t tested that. Once everything is drawn, the script does the inverse tonemapping and everything is ready for further processing. The downside is that it breaks blending, so one may have to do all particle etc. stuff on a separate buffer. But since I need this for a project, I may post a solution here soon.
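For reference, a minimal sketch of what the shader-side part could look like. NeutralTonemapping() lives in the poster’s InverseTonemapping.cginc, which wasn’t shared, so the Reinhard-style body below is only a stand-in assumption; ComputeLighting is likewise a hypothetical placeholder for whatever the shader normally outputs:

```hlsl
// Stand-in for the poster's NeutralTonemapping() from InverseTonemapping.cginc -
// the real implementation wasn't shared, so a simple Reinhard-style
// operator is assumed here.
float3 NeutralTonemapping(float3 c)
{
    return c / (1.0 + c); // maps [0, inf) into [0, 1), so it survives an LDR buffer
}

// Applied at the end of every pixel shader:
fixed4 frag(v2f i) : SV_Target
{
    float3 hdrColor = ComputeLighting(i); // hypothetical: the shader's normal HDR output
    return fixed4(NeutralTonemapping(hdrColor), 1.0);
}

// For surface shaders, the same thing via a final color modifier:
// #pragma surface surf Standard finalcolor:ApplyTonemap
void ApplyTonemap(Input IN, SurfaceOutputStandard o, inout fixed4 color)
{
    color.rgb = NeutralTonemapping(color.rgb);
}
```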
Interesting: with tonemapping, ARGB2101010 is indistinguishable from ARGBHalf (which even glitches with high values on my R390), even for values beyond 2000.0!