I’m testing out the volume camera resizing functions using Play to Device on a Vision Pro device. I found that none of the properties under “Output Dimensions Options” (the “Window Resizing Limits” as well as the min and max window size) are taking effect.
I’m using basically the same scene as in the post here. The only difference is that I removed the unbounded volume camera.
If I set the “Window Resizing Limits” to “Fixed Size”, I can still resize the volume on device.
If I set the “Window Resizing Limits” to “Limit Minimum And Maximum Size” and change the min and max size limits, those limits don’t seem to be enforced.
A couple of quick points here, which may or may not apply to your scene:
Modifying WindowResizingLimits at runtime is not supported.
WindowResizingLimits will not work during P2D connections - P2D has a fixed set of window output configurations and tries its best to match your scene’s requested volume with that list.
Ensure that there’s only one unique volume window output configuration per OutputDimension in your scene (this only applies to Bounded volumes); otherwise, at runtime it may be a toss-up as to which output configuration actually gets used.
That being said, once you build an Xcode project, there should be a file called UnityVisionOSSettings.swift in the project. Can you post a snippet or copy+paste the contents of that file, especially the part in mainScenePart0? That should show how the output configurations are actually defined in RealityKit.
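For context, here is a rough sketch of what a bounded volume configuration can look like inside that generated file. This is not the literal generated code — the WindowGroup id, the sizes, and the content view are all illustrative assumptions — but the SwiftUI modifiers shown (`windowStyle`, `defaultSize(width:height:depth:in:)`, `windowResizability`) are where the Unity-side settings should surface:

```swift
import SwiftUI

// Illustrative sketch of a mainScenePart0-style scene definition;
// the id, sizes, and content below are assumptions, not generated output.
@SceneBuilder
var mainScenePart0: some Scene {
    // Each bounded volume camera configuration becomes a volumetric WindowGroup.
    WindowGroup(id: "Bounded-1.000x1.000x1.000") {
        EmptyView() // Unity content is hosted here by PolySpatial.
    }
    .windowStyle(.volumetric)
    // "Output Dimensions" from the window configuration asset:
    .defaultSize(width: 1.0, height: 1.0, depth: 1.0, in: .meters)
    // "Window Resizing Limits" should map to windowResizability;
    // .contentSize corresponds to a fixed-size volume.
    .windowResizability(.contentSize)
}
```

If the values in your generated file don’t match your configuration asset, that’s a strong hint the wrong configuration was picked up at build time.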
Hello, I just tried in PolySpatial 2.1.2. It seems that the configuration saved in UnityVisionOSSettings.swift is always the sample bounded configuration shipped with the package samples (whose min output size is 0.4 and max output size is 2). The volume camera in my scene uses my own configuration file, but that does not seem to be honored in builds. The build does not use the config in the scene unless I change the “Default Volume Camera Window Config” to my own config file in Project Settings.
For the configuration you are using in your scene - has it been placed in a folder named Resources in your project? All window configurations must be placed in a folder named Resources for them to be transferred during a build.
Otherwise, please submit a bug report with a minimal repro project - that’ll allow us to dig further into this issue.
Yes, my custom config is right under Assets/Resources.
I did some experiments and found that if I import the samples from the PolySpatial package and set the “Default Volume Camera Window Config” in Project Settings to the bounded one included in the samples, that config is finally transferred to the built Xcode project. In other words, the “Default Volume Camera Window Config” overrides the config actually specified on the volume camera GameObject in the scene.
I have filed a bug report (IN-93708); you can check it out. Basically, there’s nothing in the scene but a cube and a volume camera.