Keeping the rendering in line with active shutter 3D

Hey folks!

I’ve been using Unity 2.6 (haven’t upgraded yet) with a 120 Hz projector to create some pretty good-looking 3D effects. The projector uses an active shutter 3D technology, so I’m essentially running a build out of Unity with two separate cameras that alternate every 1/120th of a second. I figured I’d share my current setup (and a strange problem) and see if anyone has any suggestions for a better method.

I’ve never done much work getting Unity to run at a constant framerate (especially not 120 fps), so instead I put the camera switch function in FixedUpdate and set the physics timestep to 0.00833334 (which is as close to 1/120th of a second as rounded decimals get me). The cameras don’t actually enable and disable: the left camera is always on, and the right camera renders on top with a normalized viewport rect that switches between the entire screen and none of the screen.
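For reference, here’s a minimal sketch of that setup. The script and field names are my own, and it assumes the physics timestep has been set to 0.00833334 in the project’s Time settings and that both cameras are assigned in the Inspector:

```csharp
using UnityEngine;

// Left camera renders every frame; the right camera sits on top
// (higher depth) and its normalized viewport rect is toggled
// between the full screen and a zero-size rect each physics step.
public class ShutterSwitch : MonoBehaviour
{
    public Camera leftCam;   // always enabled, lower depth
    public Camera rightCam;  // renders on top of leftCam
    private bool showRight;

    void Start()
    {
        // Start with the right eye hidden.
        rightCam.rect = new Rect(0f, 0f, 0f, 0f);
    }

    void FixedUpdate()
    {
        // With fixedDeltaTime at ~1/120 s, this flips each eye
        // at (nominally) 120 Hz.
        showRight = !showRight;
        rightCam.rect = showRight ? new Rect(0f, 0f, 1f, 1f)
                                  : new Rect(0f, 0f, 0f, 0f);
    }
}
```

Attach it to any object in the scene and drag the two cameras onto the public fields. Note that FixedUpdate runs on the physics clock, not the display clock, which is exactly the synchronization gap being discussed below.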

On a small scale, the setup seems to work pretty well. Without 3D glasses, the image on the projector looks like a double image, but with the shutter glasses the illusion is pretty nice. However, it only stays smooth when I’m running at really low resolutions (640 x 480 is pretty consistent). The larger the resolution, the quicker the image starts tearing, and even at lower resolutions similar tearing eventually starts to happen. My best guess is that the camera doesn’t fully finish rendering the scene before switching back to the previous camera, so half of the screen shows the left camera’s view while the other half shows the right camera’s view. The effect starts to show up after a small amount of time running the build and slowly gets worse. I initially thought it was a performance issue, so I added a 15k-polygon object with a 2048 texture to the otherwise very simple scene, with a hotkey to show and hide the object. When there isn’t any tearing, showing the object sometimes creates tearing, while hiding it removes the tearing again. From this test, it seemed like I was adding too much to the scene for the camera to keep up.

However, I noticed something that confused me. When the build sits for a bit with nothing major in the scene and tearing starts to occur, showing the complex model actually resolves it and removes any weird graphical oddities. Hiding the object then introduces the tearing again. It almost seems like the camera rendering is falling out of sync with the camera switching (as if FixedUpdate is switching cameras in the middle of a camera render), and changing objects in the scene changes the timing of the renderer.

I’m not very knowledgeable on the inner workings of the renderer in Unity, but maybe someone has a few ideas on how to improve what I have or a method for resolving the graphical tearing. Ideally, I would like to take the camera change out of FixedUpdate so I can put this solution into a program with physics without overtaxing it, but for now I’m not sure how consistent another method would be.

Hi, you could maybe try turning vsync on to force it to wait for a vblank after every render?
If that works and the refresh rate is 120Hz, you might be able to do the camera switching in the Update function instead of the FixedUpdate.

Go to Edit/Project Settings/Quality and enable “Sync To VBL” for all the different modes.
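If Sync To VBL does hold the build to the 120 Hz refresh, the switch could move into Update, along these lines (a sketch with my own names, assuming vsync is already enabled in the Quality settings):

```csharp
using UnityEngine;

// Update-based variant: with Sync To VBL on, Update should run
// once per display refresh, so flipping the right camera's
// viewport here alternates eyes once per 120 Hz vblank.
public class VBLSwitch : MonoBehaviour
{
    public Camera leftCam;   // always on
    public Camera rightCam;  // toggled on top
    private bool showRight;

    void Update()
    {
        showRight = !showRight;
        rightCam.rect = showRight ? new Rect(0f, 0f, 1f, 1f)
                                  : new Rect(0f, 0f, 0f, 0f);
    }
}
```

The advantage over FixedUpdate is that the eye flip is tied to the render loop itself, so it can’t land in the middle of a frame; the catch, as noted below, is that Update only runs at the rate vsync actually delivers.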

I thought that would work too, but it didn’t come up with the results I wanted. I’m guessing it’s because the projector was set up as a second monitor and the primary monitor was a regular 60 Hz monitor. I don’t know if vsync would behave any better if the primary monitor were set to the projector, but I can try testing that later.

Hi Bongo,

I’m trying to achieve the same thing as you. I have two cameras in my scene, vsync is enabled, and I switch between them on each frame in the Update method. The problem is that my application always receives the vsync at 60 Hz (I’ve implemented a frame rate display), even when the projector says the signal is 120 Hz. Thus, the synchronized Update method is called at 60 Hz and the 3D effect is not achieved.

I’m also using active shutter 3D glasses and the nVidia 3D system, which otherwise works great with normal Unity applications, but I need to control each eye’s camera.

Did you have any success getting 120 FPS with vsync in your application? I can provide you with the code for the FPS display if you need it.
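For anyone else following along, a frame rate display along these lines (my own minimal sketch, not the poster’s actual code) is enough to check what rate Update is really running at:

```csharp
using UnityEngine;

// Counts Update calls over a short interval and shows the average.
// With vsync on, this should read ~120 on a true 120 Hz signal;
// the problem described above is that it stays pinned at ~60.
public class FPSDisplay : MonoBehaviour
{
    private int frames;
    private float elapsed;
    private float fps;

    void Update()
    {
        frames++;
        elapsed += Time.deltaTime;
        if (elapsed >= 0.5f)  // refresh the reading twice a second
        {
            fps = frames / elapsed;
            frames = 0;
            elapsed = 0f;
        }
    }

    void OnGUI()
    {
        GUI.Label(new Rect(10, 10, 120, 25), fps.ToString("F1") + " FPS");
    }
}
```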

Hi Bongo!

Step 1: Switch your cameras in code.

Step 2: In Quality Settings, turn Sync To VBL on.

Step 3: Make an EXE build.

Step 4: It works.