I followed the example in the GitHub samples project, but when I implement it the same way in my own project, it switches to the face camera as soon as I start recording with NatCorder.
The only difference in my scene is the presence of an ARFaceManager. If I disable it, the recording doesn't switch to the face camera.
Any clues? I wish I had better control over which camera the ARCameraBackground shows.
Yes, you can. With ARKit 3.0 on iOS 13 and an iPhone XR, XS, or equivalent iPad, you can get face tracking data from the front camera while running AR with the rear camera at the same time. It's a newly announced feature, and Unity supports it. It's mentioned in the ARFoundation docs.
Yup, see the "Rear Camera (ARKit)" sample: https://github.com/Unity-Technologies/arfoundation-samples/tree/5.0
iOS 13 adds support for face tracking while the world-facing (i.e., rear) camera is active. This means the user-facing (i.e., front) camera is used for face tracking, while the pass-through video comes from the world-facing camera. To enable this mode in ARFoundation, you must enable an ARFaceManager, set the ARSession tracking mode to "Position and Rotation" or "Don't Care", and set the ARCameraManager's facing direction to "World". In the sample, tapping the screen toggles between the user-facing and world-facing cameras.
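The steps above can be sketched as a small component. This is a hypothetical helper, not the sample's actual script: it assumes an `ARCameraManager` and `ARSession` are assigned in the Inspector, and that an enabled ARFaceManager is already present in the scene.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Hypothetical helper: configures world-facing pass-through video while
// face tracking runs on the front camera (requires iOS 13 / ARKit 3,
// an ARFaceManager enabled in the scene, and a supported device).
public class WorldFacingFaceTracking : MonoBehaviour
{
    [SerializeField] ARCameraManager cameraManager; // assign in Inspector
    [SerializeField] ARSession session;             // assign in Inspector

    void Start()
    {
        // Request the rear camera for the video feed; ARKit still uses
        // the front camera for face tracking in this configuration.
        cameraManager.requestedFacingDirection = CameraFacingDirection.World;

        // Tracking mode must be "Position and Rotation" (or "Don't Care").
        session.requestedTrackingMode = TrackingMode.PositionAndRotation;
    }

    void Update()
    {
        // Tap the screen to toggle between user- and world-facing cameras,
        // mirroring the behavior described in the sample.
        if (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Began)
        {
            cameraManager.requestedFacingDirection =
                cameraManager.requestedFacingDirection == CameraFacingDirection.World
                    ? CameraFacingDirection.User
                    : CameraFacingDirection.World;
        }
    }
}
```

Note that `requestedFacingDirection` and `requestedTrackingMode` are requests, not guarantees; if the device or ARKit version can't satisfy the combination, the session falls back to what it can support, which may be why an ARFaceManager alone forces the front camera.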