I am trying to port an existing iOS app. This app uses code to access the device camera. When I tried to compile it, compilation failed because the required APIs are not available on visionOS.
Tracking the error down, I saw that Unity creates a Preprocessor.h file that includes flags controlling which APIs are available or used. For my app it still contained UNITY_USES_WEBCAM 1, which caused the error. Wouldn’t it make sense to set this flag to 0 for all visionOS apps, even if camera capture calls can be found in the code, or to exclude the respective code in CameraCapture and AVCapture like for PLATFORM_TVOS?
(Of course, I’d prefer that camera recording were available.)
I’m wondering what version of PolySpatial and what version of Unity you are using.
I’m using 2022.3.13f1, and looking at the project that was generated when building for visionOS, I see that define set to 0, not 1. In fact, in that block of defines the only one that is not 0 is UNITY_USES_DYNAMIC_PLAYER_LIB.
My preferred solution, however, would be for visionOS to allow photo and video capture. In my opinion, Apple is preventing a lot of interesting use cases because of this.
Do you mean you are just updating an existing iOS Xcode project? Because that most likely will not work. This Preprocessor.h file looks like what you’d get from the iOS Trampoline project that we build out for iOS builds only, not the project that is built out for visionOS builds. I’m also pretty sure that you cannot just generate a visionOS build into an existing iOS Xcode project location, which is the only other way I can think of that you would see something like this happening.
No, I am porting the Unity iOS project to visionOS. Meaning: I opened it in Unity, switched the platform to visionOS, made some modifications to fix compilation errors in Unity caused by the new platform setting, and then re-exported the project for visionOS.
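For illustration, this is roughly the kind of guard I added around the camera code on the Unity side. It’s only a minimal sketch: DeviceCameraFeed is a placeholder name, and I’m assuming the UNITY_VISIONOS scripting define is set for visionOS builds in this Unity version.

using UnityEngine;

// Minimal sketch (placeholder name): the webcam code is compiled out on visionOS,
// which should keep the exported project from pulling in the webcam module,
// provided nothing else still references WebCamTexture.
public class DeviceCameraFeed : MonoBehaviour
{
#if !UNITY_VISIONOS
    WebCamTexture webcam;

    void Start()
    {
        webcam = new WebCamTexture();
        webcam.Play();
    }
#else
    void Start()
    {
        Debug.LogWarning("Device camera capture is not available on visionOS.");
    }
#endif
}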
If you can strip your project down to a small repro case I’d love to see it so I can figure out what is going on. I have never seen a build generate what is obviously an iOS project for visionOS, and I’m interested in what the issue might be.
I also got an error relating to AVCapture.
I didn’t take a screenshot, but I remember that it happened when I put a link.xml in my project.
I am testing link.xml to avoid code stripping in the visionOS build.
This link.xml includes all classes in the UnityEngine namespace. When you put this file in your project, you will get an error in DeviceSettings.mm.
Another link.xml includes all classes in all namespaces. This also causes the DeviceSettings.mm error, and some of the lines in the xml caused the AVCapture error as well. (Sorry, I don’t remember which one caused it.)
My guess is that stopping Unity from stripping makes Unity think that you have Ads enabled when you don’t, so it’s trying to use the Ads native header, which doesn’t exist.
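If the all-inclusive link.xml is what is pulling unused modules back in, one narrower approach might be to preserve only the specific types you actually need, for example with the Preserve attribute instead of a blanket rule. Just a sketch; MySerializedSettings is a made-up example type:

using UnityEngine.Scripting;

// Sketch of preserving only what the app really needs, so unrelated modules
// (Ads, webcam, etc.) can still be stripped from the visionOS build.
[Preserve]
public class MySerializedSettings
{
    [Preserve] public string apiEndpoint;
    [Preserve] public int retryCount;
}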
Camera capture is disabled on all the XR headsets for privacy reasons: Vision Pro, Meta Quest, and Vive XR Elite all block it. They don’t want an app developer to spy on people in their homes, so this won’t be an available feature any time soon. Apple may open it up to large developers to port specific technologies, but for now I’m assuming we’ll never have that access. The only option is ARKit, so hopefully that supports what you need.
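For example, environment data such as plane detection is still reachable through AR Foundation, which sits on top of ARKit on this platform. A rough sketch, assuming an AR Foundation 5.x-style API and that your PolySpatial setup exposes ARPlaneManager like other ARKit platforms do (PlaneLogger is just an illustrative name):

using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Logs planes as ARKit detects them; an illustration of the environment data
// that remains available even without raw camera frames.
[RequireComponent(typeof(ARPlaneManager))]
public class PlaneLogger : MonoBehaviour
{
    ARPlaneManager planes;

    void OnEnable()
    {
        planes = GetComponent<ARPlaneManager>();
        planes.planesChanged += OnPlanesChanged;
    }

    void OnDisable()
    {
        planes.planesChanged -= OnPlanesChanged;
    }

    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        foreach (var plane in args.added)
            Debug.Log($"Detected plane {plane.trackableId} with size {plane.size}");
    }
}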
Does anyone know how to track down what code is triggering CameraCapture.h to be included in my build? My project is nearly a hundred gigs and has a lot of code created by many different people. Any suggestions for code to search for in Unity?
I had compilation errors with CameraCapture.h & CameraCapture.mm in Xcode, and they are now excluded from compilation after I removed all usage of WebCamTexture.
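If it helps, the quickest thing is simply to search the whole project for WebCamTexture. For code spread across many packages, something like this editor script can also list the types whose fields, properties, or method signatures reference it. Only a rough sketch: WebCamUsageFinder is a made-up name, it has to live in an Editor folder, and it won’t catch uses that appear only inside method bodies.

using System;
using System.Linq;
using System.Reflection;
using UnityEditor;
using UnityEngine;

// Sketch of an editor utility that reports which types expose WebCamTexture in
// their fields, properties, or method signatures.
public static class WebCamUsageFinder
{
    const BindingFlags Flags = BindingFlags.Public | BindingFlags.NonPublic |
                               BindingFlags.Instance | BindingFlags.Static |
                               BindingFlags.DeclaredOnly;

    [MenuItem("Tools/Find WebCamTexture Usage")]
    public static void Find()
    {
        var target = typeof(WebCamTexture);
        foreach (var assembly in AppDomain.CurrentDomain.GetAssemblies())
        {
            Type[] types;
            try { types = assembly.GetTypes(); }
            catch (ReflectionTypeLoadException e) { types = e.Types.Where(t => t != null).ToArray(); }

            foreach (var type in types)
            {
                bool uses =
                    type.GetFields(Flags).Any(f => f.FieldType == target) ||
                    type.GetProperties(Flags).Any(p => p.PropertyType == target) ||
                    type.GetMethods(Flags).Any(m =>
                        m.ReturnType == target ||
                        m.GetParameters().Any(p => p.ParameterType == target));

                if (uses)
                    Debug.Log($"WebCamTexture referenced by {type.FullName} ({assembly.GetName().Name})");
            }
        }
    }
}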
I’m not sure, as I don’t know what is bringing those symbols in for your project. I don’t think those are Unity symbols, so I’m not sure where they are coming from. Are you including some third-party libraries in your assets that may be defining these?