AR Foundation Camera output to Render Texture?

Hi all! I’m using AR Foundation on an Android device, and I want to put the output of the camera onto a render texture live. Whenever I try this, I just get a black image. My setup is as follows:
On a canvas I have a small Raw Image with a render texture applied to it. Then, in Update(), I’m calling this line from the documentation, but nothing works the way it should. Am I doing something wrong?

Graphics.Blit(null, myRenderTexture, myARCameraBackground.material);

I want this render texture to update live with the motion of the camera. The plan is to use FMStream (from the Asset Store) to live-stream it to a PC for further processing.

I appreciate any and all help! :smile:

It’s possible, but there are some caveats. While Android uses one camera texture, iOS uses two. So for iOS, you’ll have to Blit twice and combine the two textures with a custom shader.

  • Subscribe to the ARCameraManager.frameReceived event.
  • For each texture in ARCameraFrameEventArgs.textures, do a Graphics.Blit():
  • For Android: Graphics.Blit(texture, targetRenderTexture, arCameraBackground.material). Passing arCameraBackground.material is required on Android (see the sketch after this list).
  • For iOS: Graphics.Blit(texture, targetRenderTexture). Do not pass arCameraBackground.material here; passing it will produce a green texture.
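
To make that concrete, here’s a minimal sketch of the Android path. The class and field names are mine, not from AR Foundation; assign the references in the Inspector:

using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class CameraBlitToRenderTexture : MonoBehaviour {
    [SerializeField] ARCameraManager cameraManager = null;
    [SerializeField] ARCameraBackground cameraBackground = null;
    [SerializeField] RenderTexture targetRenderTexture = null;

    void OnEnable() {
        cameraManager.frameReceived += OnFrameReceived;
    }

    void OnDisable() {
        cameraManager.frameReceived -= OnFrameReceived;
    }

    void OnFrameReceived(ARCameraFrameEventArgs args) {
        // Android delivers a single camera texture; the ARCameraBackground
        // material is required to sample it correctly.
        Graphics.Blit(args.textures[0], targetRenderTexture, cameraBackground.material);
        // On iOS, args.textures holds two planes (Y and CbCr); blit each one
        // to its own RenderTexture without passing the material.
    }
}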

The other possible solution is to access the camera image on the CPU.
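
For completeness, here’s a rough sketch of the CPU route, assuming AR Foundation 4+ (TryAcquireLatestCpuImage and XRCpuImage; older versions use TryGetLatestImage/XRCameraImage instead, and may only offer an IntPtr overload of Convert):

using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class CpuCameraImage : MonoBehaviour {
    [SerializeField] ARCameraManager cameraManager = null;
    Texture2D texture;

    void Update() {
        if (!cameraManager.TryAcquireLatestCpuImage(out XRCpuImage image))
            return;

        using (image) {  // XRCpuImage wraps a native resource and must be disposed
            // Convert the native YUV image to RGBA32 on the CPU.
            var conversionParams = new XRCpuImage.ConversionParams(image, TextureFormat.RGBA32);
            if (texture == null)
                texture = new Texture2D(image.width, image.height, TextureFormat.RGBA32, false);
            image.Convert(conversionParams, texture.GetRawTextureData<byte>());
            texture.Apply();
        }
    }
}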

Works perfectly, thank you! I’m on Android, though. Will that same code work for iOS (without passing in the CameraBackground), or is there anything else I need to do?

Thanks so much!

Yes, this will work on iOS too, but you’ll have to write a custom shader that displays the two render textures (Y and CbCr) correctly.
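
The C# side of that could look roughly like this; the _textureY/_textureCbCr property names match the ones discussed below, but the YCbCr-to-RGB conversion itself has to live in your custom shader:

using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class IOSCameraPlanes : MonoBehaviour {
    [SerializeField] ARCameraManager cameraManager = null;
    [SerializeField] RenderTexture yTexture = null;     // luma plane
    [SerializeField] RenderTexture cbCrTexture = null;  // chroma plane
    [SerializeField] Material combineMaterial = null;   // your custom YCbCr-to-RGB shader

    void OnEnable() {
        cameraManager.frameReceived += OnFrameReceived;
    }

    void OnDisable() {
        cameraManager.frameReceived -= OnFrameReceived;
    }

    void OnFrameReceived(ARCameraFrameEventArgs args) {
        // Indexing [0]/[1] here is a simplification; see the property-id
        // lookup later in this thread for a robust way to identify the planes.
        Graphics.Blit(args.textures[0], yTexture);
        Graphics.Blit(args.textures[1], cbCrTexture);
        combineMaterial.SetTexture("_textureY", yTexture);
        combineMaterial.SetTexture("_textureCbCr", cbCrTexture);
    }
}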

Got it, thank you. I will look into that in the future.

@KyryloKuzyk : Thank you so much for the insight, it really helps point me in the right direction with my project. I wonder, though: do you have a link to documentation about the two-texture output on iOS? I’m not sure how it is output. Is it by index (i.e., [0] and [1] for the two textures)?

If you subscribe to the ARCameraManager.frameReceived event, you can access these textures via ARCameraFrameEventArgs.textures.
To be sure which texture is TextureY and which is TextureCbCr, I would not recommend accessing them by index; instead, find the index of the property id and match it with the ARCameraFrameEventArgs.textures index. I think a code sample is better than a thousand words :slight_smile:
Not optimized for performance, just to show the idea:

using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class ARKitCameraTextures : MonoBehaviour {
    [SerializeField] ARCameraManager cameraManager = null;

    void Awake() {
        cameraManager.frameReceived += args => {
            // Look up the luma (Y) plane by matching its shader property id
            // against the ids reported for this frame.
            var textureYPropId = Shader.PropertyToID("_textureY");
            var textureYPropIdIndex = args.propertyNameIds.IndexOf(textureYPropId);
            var textureY = args.textures[textureYPropIdIndex];

            // Same lookup for the chroma (CbCr) plane.
            var textureCbCrPropId = Shader.PropertyToID("_textureCbCr");
            var textureCbCrPropIdIndex = args.propertyNameIds.IndexOf(textureCbCrPropId);
            var textureCbCr = args.textures[textureCbCrPropIdIndex];
        };
    }
}

Thank you so much for the code samples, it is very helpful indeed @KyryloKuzyk !

Hello there, I would like to know if someone knows how to get the projected face texture in ARFoundation.

Hello again @kirill!
Mhmm, I don’t think that works anymore. If I run

var textureYPropId = Shader.PropertyToID("TextureY");
var textureCbCrPropId = Shader.PropertyToID("TextureCbCr");
Debug.Log($"textureYPropId: {textureYPropId}, textureCbCrPropId: {textureCbCrPropId}");
// textureYPropId: 1788, textureCbCrPropId: 1789
// but
foreach (var item in args.propertyNameIds)
   Debug.Log($"propertyNameId: {item}");
// prints 703, 707

Could this be related to me using URP and not built-in?

PS: Is there a way to find out which property names the ids 703 and 707 stand for?

Turns out there is no need for a blit or for inspecting the raw textures. To get the AR camera image on a custom object, all I had to do was:

using UnityEngine;
using UnityEngine.UI;
using UnityEngine.XR.ARFoundation;

[RequireComponent(typeof(RawImage))]
[RequireComponent(typeof(ARCameraBackground))]
public class ARTextureResponder : MonoBehaviour
{
    void Start()
    {
        // Reuse the ARCameraBackground material directly on the RawImage,
        // so the camera feed renders without any manual blitting.
        var rawImage = GetComponent<RawImage>();
        var camBack = GetComponent<ARCameraBackground>().material;
        rawImage.material = camBack;
    }
}

In this case I used a RawImage; I guess any other object that has a material would also work.


Hmm, this is strange. I can confirm that using the texture display names (TextureY, TextureCbCr) doesn’t work, but using the actual property names (_textureY, _textureCbCr) does. I updated the original response. Huge thanks for pointing that out!
