Porting iOS App: Unavailable APIs still included in visionOS build

I am trying to port an existing iOS app that uses the device camera. When I tried to compile it, the build failed because the required APIs are not available on visionOS.

Tracking the error down, I saw that Unity creates a Preprocessor.h file that includes flags controlling which APIs are available or being used. For my app, it still included UNITY_USES_WEBCAM 1, which caused the error. Wouldn’t it make sense to set this flag to 0 for all visionOS apps, even if camera capture calls can be found in the code, or to exclude the respective code in CameraCapture and AVCapture like for PLATFORM_TVOS?

(Of course, I’d prefer that cam recording would be available.)


Please file a bug and post the bug ID here.

I’m wondering what version of PolySpatial and what version of Unity you are using?

Using 2022.3.13f1, and looking at the project that was generated when building for visionOS, I see that define as set to 0, not 1. In fact, in that block of defines the only one that is not 0 is UNITY_USES_DYNAMIC_PLAYER_LIB.

I tried both 2023.3.13f1 and 2023.3.14f1 using PolySpatial 0.6.2.

Please note that I am porting an existing iOS app that uses camera features. I assume this is not yet caught in the build process.

The relevant section in Preprocessor.h looks like this:
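(The original snippet was not preserved in this post. Based on the flags named in this thread, the block presumably resembles the following reconstruction; defines other than UNITY_USES_WEBCAM and UNITY_USES_DYNAMIC_PLAYER_LIB are illustrative, not verbatim from Unity's output.)

```c
/* Reconstructed excerpt of the generated Preprocessor.h -- only the two
   flags actually mentioned in this thread are known; the rest is a sketch. */
#define UNITY_USES_DYNAMIC_PLAYER_LIB 1
#define UNITY_USES_WEBCAM 1   /* set to 1 because the project references
                                 the webcam; on visionOS this pulls in
                                 AVCapture code that does not compile */
```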

It might also make sense to adapt the platform-dependent compilation directives in files like AVCapture.m or CameraCapture.mm:

My preferred solution, however, would be that visionOS allowed photo and video capture. In my opinion, Apple is preventing a lot of interesting use-cases because of this.

Do you mean you are just updating an existing iOS Xcode project? Because that most likely will not work. This Preprocessor.h file looks like what you’d get from the iOS Trampoline project that we build out for iOS builds only, not from the project generated for visionOS builds. I’m also pretty sure that you cannot generate a visionOS build into an existing iOS Xcode project location, which is the only other way I can think of that you would see something like this happening.

No, I am porting the Unity iOS project to visionOS. Meaning: I opened it in Unity, switched the platform to visionOS, did some modifications to fix compilation errors in Unity caused by the new platform setting, and then re-exported the project for visionOS.

If you can strip your project down to a small repro case, I’d love to see it so I can figure out what is going on. I have never seen the build generate what is obviously an iOS project for visionOS, and I’m interested in what it is we might be having an issue with.

I wish it were that easy. It’s actually a pretty complex project with hundreds of dependencies.

I also got an error relating to AVCapture
I didn’t take a screenshot, but I remember that it happened when I put link.xml in my project.

I am testing link.xml to avoid code stripping in the visionOS build.
This link.xml includes all classes in the UnityEngine namespace. When you put this file in your project, you will get an error in DeviceSettings.mm.

Another link.xml includes all classes in all namespaces. This also causes the DeviceSettings.mm error, and some of the lines of the XML caused the AVCapture error. (Sorry, I don’t remember which one caused it.)
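(For reference, a link.xml that preserves the whole UnityEngine assembly, as described above, looks roughly like this in standard Unity linker syntax. Preserving the entire assembly keeps the webcam module from being stripped even when nothing references WebCamTexture, which would re-enable the CameraCapture code in the exported Xcode project.)

```xml
<!-- Keeps the linker from stripping any UnityEngine code; this also
     preserves the webcam module, pulling CameraCapture back into the
     exported Xcode project. -->
<linker>
  <assembly fullname="UnityEngine" preserve="all" />
</linker>
```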

My guess is that stopping Unity from stripping is making Unity think that you have Ads enabled when you don’t and so it’s trying to use the ads native header, which doesn’t exist.

Unity should be able to use the visionOS webcam functionality to access the Persona, like videoconferencing apps do.

I’m trying to do this in my build but it shows errors in CameraCapture.h like this:

/visionos/Classes/Unity/CameraCapture.h:14:26 'AVCaptureDepthDataOutput' is unavailable: not available on visionOS

Looks like we need a new CameraCapture.h for visionOS?

Well, at a minimum it should at least not create a build error. Please file a bug and post the id (IN-XXXXX) here.

Camera capture is disabled on all XR headsets for safety reasons. Vision, Meta Quest, and Vive XR Elite all block camera capture. They don’t want an app developer to spy on people in their homes. This won’t be an available feature any time soon. Apple may open it up to large developers to port specific technologies, but for now I’m assuming we’ll never have that access. The only option is ARKit, so hopefully that supports what you need.

Does anyone know how to track down what code is triggering CameraCapture.h to be included in my build? My project is nearly a hundred gigabytes and has a lot of code created by many different people. Any suggestions for code to search for in Unity?

I think this is a mistake. There are so many important use cases that would require camera access. Apple is blocking itself.

If you are using WebCamTexture anywhere in your code, that will automatically enable camera capture.
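To answer the search question above: a plain text search over the C# sources is usually enough to find what is pulling the capture code in. This sketch assumes the standard Assets/ and Packages/ layout of a Unity project and that the reference is a literal WebCamTexture token; adjust the paths if your scripts live elsewhere:

```shell
# Search every C# script in the project for WebCamTexture references.
# Packages/ is included in case a dependency, not your own code, uses it.
grep -rn --include='*.cs' 'WebCamTexture' Assets/ Packages/
```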

I’ll also note that this compilation issue (just the compilation issue) should be resolved in the next release.

Interesting. So Camera Capture is actually possible? That would be amazing!

No, sorry. That was why I clarified that it was just the compilation issue.

Ok, cool. So at least the compilation issues are fixed. That’s great.

Hello all.

I had compilation errors with CameraCapture.h & CameraCapture.mm in Xcode; they are now excluded from compilation after I removed all usage of WebCamTexture.

However, there are still related linking issues.

@joejo Before next release, are there any temporary solution for this? Thank you.

I’m not sure, as I don’t know what is bringing those symbols into your project. I don’t think they are Unity symbols, so I’m not sure where they are coming from. Are you including third-party libraries in your assets that may be defining them?