Unity3D w/ skybox vs. Oculus 360 Photos

Hello, I’ve been recently developing VR-based Unity applications, and I’m currently working on using the Gear VR for a 360 experience, comparable to the Oculus 360 photos application.

However, I noticed that I can never quite get the quality of Unity’s skyboxes to match that of the native Oculus 360 Photos application. I’ve tried ramping the import settings up to max (uncompressed RGBA 32-bit, no downscaling, 2048 x 2048 images), turned up the quality settings (both with and without anisotropic filtering), and played around with the OVR camera controller object, to no avail.

Is it at all possible to achieve a really high quality, photorealistic skybox in Unity similar to what can be done with Oculus 360 photos, or am I stuck having to use Oculus’ solution if I want that kind of fidelity?

2 Likes

I think you need to be more specific. Quality/Fidelity are pretty general terms that can apply to many different aspects of your output.

Also… you should clarify exactly how you are implementing the skybox in Unity. The Oculus 360 Photos app uses single equirectangular projection images, unlike the usual six-image skyboxes you often see used in Unity.

1 Like

I think the issue lies with the sharpness of the image. Comparing a 360 image in Oculus 360 Photos and the same one in Unity, the 360 Photos version is sharper and clearer, and I can see the finer details. In Unity, unless I disable mipmaps and/or use point filtering (which results in artifacts when moving and blockiness in the image), the image is blurrier and less sharp than what can be achieved in Oculus 360 Photos.
An example: there is text within the image that can be clearly recognised as “36” in 360 Photos, but in Unity it is always so blurred that I would not be able to make it out if I did not have prior knowledge of it.

In terms of implementation, I’m using the usual method of generating a skybox in Unity: create a skybox material and assign six images to its corresponding slots. Would it be better (and possible) to use a single equirectangular projection as the 360 image? From what I’ve seen, Unity caps the max image size at 4096 on import.

1 Like

I’ve done this by mapping an equirectangular projection on the inside of a sphere and putting the camera rig at the center of that sphere. If you use a 4k image, the quality should be quite good.

1 Like

I’ve tried something similar following the “Setting up the scene” section of Full 360 stereoscopic video playback in Unity – Bernie Roehl, but I’m still unable to get it as nice as I would have liked, unfortunately.

1 Like

If you really want help, you need to provide more information… What exactly do you think is wrong with what you have? Are you using the exact same image that you were looking at on the Oculus? If not, you are not really making a fair comparison. What exactly are you comparing?

Sorry if I’m a bit vague; I’m just not sure what else to mention or include. I’m comparing the exact same image in both applications; in 360 Photos the dimensions of the equirectangular projection are around 10000 x 6000. The quality of the image in Unity is not bad, that’s not the problem. The problem is that the same image in Unity is not as sharp as in 360 Photos, and the requirement is that it has to match the level of detail displayed in 360 Photos. I’m not sure whether the slight loss of quality/sharpness is due to how Unity renders it, a setting I’m missing somewhere, the OVR SDK, or something else.

1 Like

I’m not 100% sure, but doesn’t the Oculus Rift apply anti-aliasing to Unity applications by default? Wasn’t this 2x or 4x?
I had the same issue and went through some experiments. I couldn’t get the image any sharper with a skybox using 1024 or 2048 textures per side (while the source was big enough). However, I went back to 1024 textures and sharpened them by hand in Photoshop with a 1.5-pixel sharpen. The result was a much sharper image, and I noticed more of the fine details.

I’m not sure, but it could also be that the developer of the native application applies a sharpen to the images displayed in the 360 photo gallery. This wouldn’t be the first time I’ve seen this; some TVs do the same when viewing photos in the TV’s image gallery.

If you find another solution, let me know =)

1 Like

Earlier you said … “unless I disable mipmaps”. You should definitely disable mipmaps in this case. Mipmaps would serve no purpose here.
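If it helps, the mipmap and compression settings can be applied from an editor script instead of clicking through the inspector for each face. The following is a minimal sketch, assuming Unity’s `TextureImporter` API; the asset path is a placeholder, not a real path from this project:

```csharp
// Editor-script sketch: force max-quality import settings for a skybox face.
// Place this file in an "Editor" folder inside the project.
using UnityEditor;
using UnityEngine;

public static class SkyboxImportSettings
{
    [MenuItem("Tools/Apply Skybox Import Settings")]
    static void Apply()
    {
        // Hypothetical path; point this at your own skybox texture.
        const string path = "Assets/Textures/skybox_front.png";

        var importer = (TextureImporter)AssetImporter.GetAtPath(path);
        importer.mipmapEnabled = false;               // mipmaps only blur a static skybox
        importer.textureFormat = TextureImporterFormat.RGBA32; // uncompressed 32-bit
        importer.maxTextureSize = 4096;               // Unity's import cap at the time
        importer.filterMode = FilterMode.Bilinear;    // avoids point-filter blockiness

        AssetDatabase.ImportAsset(path, ImportAssetOptions.ForceUpdate);
    }
}
```

This only changes import settings; whether it closes the gap with 360 Photos on device is exactly what this thread is debating.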

“10000 x 6000” … well you obviously had to scale that down to use in Unity, so that might account for the difference you are seeing. How did you go about scaling the image down? Sharpening after scaling down might help some. You might try breaking it into two images and breaking the sphere in half so you would end up with something closer to the original size.

It’s also still not 100% clear what you are comparing. If you are comparing what you see on the actual devices, there are bound to be some differences there too… the optics are a little different, the resolutions are a little different, etc.

I’m comparing the same image on the same device, the Gear VR, in two different applications: “Oculus 360 Photos”, the official Oculus application for mobile, and my Unity application, which is intended, at least for now, to replicate the 360-image viewing experience and quality of the Oculus 360 Photos app, but built in Unity.
Hopefully that should clear it up.


These aren’t the exact screenshots, but they illustrate what I mean. The one on the left is the original photo; the image in Oculus 360 Photos is not far from what’s shown here. On the right is roughly how ‘blurry’ it is in comparison when viewed in the Unity app.

I would like to disable mipmaps; although that improves sharpness, it also generates artifacts, which detracts from the experience. And even with mipmaps disabled, it still does not reach the level of quality that 360 Photos achieves.

As for the scaling, Unity scales the image down automatically and caps the size at 4096. Although that makes sense when using a sphere with an equirectangular projection (since it’s a single whole image), it doesn’t when I’m using a Unity skybox, as the skybox is created from six images stitched together within the skybox material (each image being 2048 x 2048). It could be that the material is also capped at 4096, but in the PC preview of the scene there is a distinct difference in quality between the skybox (better) and the sphere (worse).

Just for good measure, I’ve tried the method described in the link Unity3D: Using textures larger than 4096x4096 | 41 Post, with the same result.

I gave sharpening a try, but I don’t think that’s it either. On the photo itself it looks fine, but once it’s rendered on the Gear VR it still looks as blurry as before.

I’m beginning to think it’s either the way it’s being rendered in Unity or something to do with how it gets rendered on the Gear VR device. Worst case, the Oculus 360 Photos app’s source code is provided in the Oculus Mobile SDK, but I’d prefer to use Unity, as it streamlines the development process and makes adding UI elements and features to the scene easier.

1 Like

After a couple of tests, I can’t get images as sharp as in other non-Unity VR applications.

This week I rendered two different stereo equirectangular images: one at 8192 px (4 x 2048) and one at 16384 px (4 x 4096). I imported both into Unity to look for a difference in image quality. The 8K one used 2048 textures for its skybox and the other 4096 textures. Both appeared to be exactly the same quality, but still a bit blurred and definitely not crisp. Using no compression didn’t make any difference, and turning off mipmaps had no effect either.

It seems there is some factor inside Unity that prevents the camera’s view from having crisp, sharp textures. All the other non-Unity applications with the same setup that I’ve seen are really sharp. The creators of the Octane renderer put their own Samsung Gear VR application in the store, and those skybox images are incredibly sharp. So far I have not succeeded in doing the same. The only difference I’ve found so far is that I render an equirectangular image; maybe a cubemap render would reduce some distortion.

Could someone shed some light on this and help me get these images crisp and sharp? Am I missing something?

1 Like

I’m facing the same problem. Any news?

Unity has (had?) resolution limits. I think they have made some of the parameters available via a mediaSurface or similar class. Check the Oculus forums, since these limitations have been discussed there, at least regarding moving images.

I haven’t tested further yet, but I’ll be able to soon.
First off, we’re dealing not with photos but with stereo renders exported as equirectangular images. Using cubemaps straight from the renderer made the image much sharper. Disabling mipmaps also made it sharper. I haven’t tested the rest, but those two helped a little. My next test will be trying different resolutions, up to 4K per cube side.

@sleekdigital what shader are you using? I have 360 photosphere shaders that work perfectly on Rift DK2 in Unity, but when I port them to my GearVR / Android projects, they simply show up as white. Can you point me to, or post here, shader code that does interior sphere texture mapping compatible with GearVR / Note4?

I didn’t use shader code to get the texture on the interior. Instead, I flip the normals of the geometry in Blender. I haven’t tested that with Gear VR, but it worked fine on the few Android phones I tested.

Try changing your material shader to unlit.

I think gregroberts was asking about getting the texture onto the interior of the sphere. One way is to use special shader code; another is to configure the geometry. But yes, the normal unlit shader is what you want if you already have the geometry configured to display the texture on the interior surface (flipped normals).
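For anyone who doesn’t want to round-trip through Blender, the geometry route can also be sketched in a small runtime script. This is a sketch, not something tested on Gear VR; it assumes a mesh with readable vertex data (e.g. Unity’s default sphere) and an unlit material:

```csharp
// Sketch: flip a sphere's normals and triangle winding so a texture
// renders on the interior surface. Attach to the sphere, put the
// camera rig at its center.
using UnityEngine;

[RequireComponent(typeof(MeshFilter))]
public class InvertSphere : MonoBehaviour
{
    void Start()
    {
        Mesh mesh = GetComponent<MeshFilter>().mesh;

        // Point every normal inward.
        Vector3[] normals = mesh.normals;
        for (int i = 0; i < normals.Length; i++)
            normals[i] = -normals[i];
        mesh.normals = normals;

        // Reverse triangle winding so interior faces aren't back-face culled.
        int[] tris = mesh.triangles;
        for (int i = 0; i < tris.Length; i += 3)
        {
            int tmp = tris[i];
            tris[i] = tris[i + 2];
            tris[i + 2] = tmp;
        }
        mesh.triangles = tris;
    }
}
```

The equivalent shader-side trick is Cull Front in the shader, but flipping the geometry keeps the standard unlit shader usable as-is.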

I’ve been working to fix this all day and just had a eureka moment: the cure is setting UnityEngine.VR.VRSettings.renderScale to a value higher than 1.

For example
UnityEngine.VR.VRSettings.renderScale = 3;

This causes supersampling, which makes the textures look very nice :)
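As a minimal sketch of applying this at startup (the scale value here is illustrative; higher values cost significant GPU fill rate, especially on mobile):

```csharp
// Sketch: supersample the VR eye buffers at startup. renderScale > 1
// renders at a higher resolution and downsamples, sharpening textures
// at a GPU cost. (In later Unity versions this API moved to
// UnityEngine.XR.XRSettings.eyeTextureResolutionScale.)
using UnityEngine;
using UnityEngine.VR;

public class SuperSample : MonoBehaviour
{
    void Start()
    {
        VRSettings.renderScale = 1.5f; // illustrative; tune for your GPU budget
    }
}
```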

4 Likes

Thank you very much for sharing, this worked perfectly.