Hi,
is there an example of how to get the camera texture in ARFoundation (not a screenshot)?
Thanks
Add the ARCameraBackground component to your GameCamera as explained here: About AR Foundation | Package Manager UI website
Curious for future API planning: what are you trying to do with the texture exactly?
I just want to save a camera image to a file (like taking a photo with the camera app), without any Unity objects, just the plain camera image.
I'm still having trouble doing that. Is there any example code showing how?
Thank you.
Try this:
// Requires: using System.IO; using UnityEngine; using UnityEngine.XR.ARFoundation;
// Assumes these fields are set up elsewhere:
//   ARCameraBackground m_ARCameraBackground;  // the component on your AR camera
//   RenderTexture renderTexture;              // a color-only RenderTexture
//   Texture2D m_LastCameraTexture;

// Copy the camera background to a RenderTexture
Graphics.Blit(null, renderTexture, m_ARCameraBackground.material);

// Copy the RenderTexture from GPU to CPU
var activeRenderTexture = RenderTexture.active;
RenderTexture.active = renderTexture;
if (m_LastCameraTexture == null)
    m_LastCameraTexture = new Texture2D(renderTexture.width, renderTexture.height, TextureFormat.RGB24, true);
m_LastCameraTexture.ReadPixels(new Rect(0, 0, renderTexture.width, renderTexture.height), 0, 0);
m_LastCameraTexture.Apply();
RenderTexture.active = activeRenderTexture;

// Write the image to a PNG file
var bytes = m_LastCameraTexture.EncodeToPNG();
var path = Path.Combine(Application.persistentDataPath, "camera_texture.png");
File.WriteAllBytes(path, bytes);
I'm trying to do the same and used your method, tdmowrer, but it looks like the image captured with the ARCameraBackground's material has Unity objects in it.
Have you overridden the ARCameraBackground’s material with a custom material? If not, can you explain what you mean by “Unity objects”? Could you post a screenshot?
I haven’t overridden the ARCameraBackground’s material with a custom material. By “Unity Objects in it”, I mean 3D virtual objects from my scene are shown in the image when I only want an image of what the device’s camera sees.
In this example image which I got from using the ARCameraBackground’s material, you can see a cube from my scene.
It sounds like you haven't set RenderTexture.active to your render texture, though it's hard to diagnose without seeing more of your code.
I’ve got a branch of the samples repo that has a working example: https://github.com/Unity-Technologies/arfoundation-samples/tree/graphics_blit?files=1
Disclaimer: I've only tested it on Android.
Your example definitely works…
I'll try to figure out the differences and post what I was doing wrong here.
Turns out this was my problem:
If I blit the ARCameraBackground.material to the RenderTexture in the same method where I call ReadPixels() and Apply() on my Texture2D, the texture comes out completely black. So I turned the method into a coroutine and added a yield return new WaitForSeconds so that the texture wouldn't be black. If I put the yield after RenderTexture.active = renderTexture, the image contains Unity GameObjects; if I put the yield before that line, the image shows only what the camera sees.
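In sketch form (simplified; this assumes the fields from tdmowrer's snippet above, and the wait duration is arbitrary):

// Requires using System.Collections; for IEnumerator
IEnumerator CaptureCameraImage()
{
    // Blit the camera background first...
    Graphics.Blit(null, renderTexture, m_ARCameraBackground.material);

    // ...then wait BEFORE making the render texture active. Yielding after
    // the next line put scene GameObjects into the image; yielding here
    // captures only what the camera sees.
    yield return new WaitForSeconds(0.1f);

    var activeRenderTexture = RenderTexture.active;
    RenderTexture.active = renderTexture;
    m_LastCameraTexture.ReadPixels(new Rect(0, 0, renderTexture.width, renderTexture.height), 0, 0);
    m_LastCameraTexture.Apply();
    RenderTexture.active = activeRenderTexture;
}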
Also, I tested without setting RenderTexture.active = renderTexture, and just as you said, the image had Unity GameObjects in it, and I’m not sure why that happens. Why would the renderTexture not contain Unity GameObjects when it is set as the active render texture, but contain them when it is not?
Because ReadPixels reads pixels from the RenderTexture.active. If you do not set this (or yield a frame after setting it), then the active RenderTexture is probably the last frame that was rendered.
The black image is probably due to a setting on your RenderTexture. Make sure it doesn't have a depth buffer.
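For example, a color-only RenderTexture can be created by passing 0 for the depth parameter (the size here is only an illustration):

// 0 depth bits = no depth buffer attached to this RenderTexture
var renderTexture = new RenderTexture(Screen.width, Screen.height, 0, RenderTextureFormat.ARGB32);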
Makes sense, thanks for all the help!
Is it possible to blit the camera view to a render texture even if the camera itself renders to a render texture?
I saw that in the meantime ARFoundation provides the TryGetLatestImage method:
https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@1.0/manual/cpu-camera-image.html
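In case it helps others, the usage pattern looks roughly like this. Note: this sketch uses the newer names from ARFoundation 4.x (TryAcquireLatestCpuImage / XRCpuImage); the 1.0 docs linked above use TryGetLatestImage and CameraImage instead, so treat this as illustrative only:

using System;
using Unity.Collections;
using Unity.Collections.LowLevel.Unsafe;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class CpuImageReader : MonoBehaviour
{
    [SerializeField] ARCameraManager m_CameraManager;

    // Requires "Allow 'unsafe' Code" in the Player settings.
    unsafe void Update()
    {
        if (!m_CameraManager.TryAcquireLatestCpuImage(out XRCpuImage image))
            return;

        var conversionParams = new XRCpuImage.ConversionParams
        {
            inputRect = new RectInt(0, 0, image.width, image.height),
            outputDimensions = new Vector2Int(image.width, image.height),
            outputFormat = TextureFormat.RGBA32,
            transformation = XRCpuImage.Transformation.MirrorY
        };

        var buffer = new NativeArray<byte>(image.GetConvertedDataSize(conversionParams), Allocator.Temp);
        image.Convert(conversionParams, new IntPtr(buffer.GetUnsafePtr()), buffer.Length);
        image.Dispose(); // release the native image back to the provider promptly

        // buffer now holds raw RGBA32 pixels, e.g. for Texture2D.LoadRawTextureData.
        buffer.Dispose();
    }
}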
However, on my Android device the received image is much smaller (480x640) than the screen (1440x2960). Is it somehow possible to get the image at full resolution?
You want to enumerate and select a CameraConfiguration. There’s also an example here.
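Roughly, the selection looks like this (a sketch against the ARCameraManager API from ARFoundation 2.x and later; the class and field names are placeholders, not the linked sample's code):

using Unity.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class HighResConfigSelector : MonoBehaviour
{
    [SerializeField] ARCameraManager m_CameraManager;

    void Start()
    {
        // Enumerate the configurations the camera supports and pick the largest.
        // (You may need to defer this until the camera subsystem is running.)
        using (var configurations = m_CameraManager.GetConfigurations(Allocator.Temp))
        {
            if (configurations.Length == 0)
                return;

            var best = configurations[0];
            foreach (var config in configurations)
            {
                if (config.width * config.height > best.width * best.height)
                    best = config;
            }
            m_CameraManager.currentConfiguration = best;
        }
    }
}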
Thanks for pointing me to the example. Works pretty well.
UPDATE:
I finally got reflections working in AR Foundation. The problem was that Real Time Reflections was disabled in the default settings for Android under Project Settings > Quality.
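For anyone trying the same, the skybox hookup itself is roughly this (a sketch assuming the Skybox/Panoramic shader, whose panoramic texture property is _MainTex; your setup may differ):

// Feed the captured camera RenderTexture into the skybox material so
// reflective objects pick it up from the environment lighting.
RenderSettings.skybox.SetTexture("_MainTex", renderTexture);
DynamicGI.UpdateEnvironment(); // refresh the environment lighting/reflections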
Hello, I want to get the camera image into a RenderTexture to use for reflections.
Thanks to tdmowrer I got the RenderTexture part working, but I cannot make the reflection work.
What I want to do is get the same behaviour as this repo (https://github.com/johnsietsma/ARCameraLighting), but with AR Foundation instead of ARCore.
I tried both the built-in Skybox/Panoramic shader and the Skybox/ARSkybox shader from that repo, but I get a black reflection on the mobile device.
In the Editor I tried another image with both shaders, and both the skybox and my reflective sphere picked up the texture correctly. But I cannot make it work on the device with the RenderTexture.
I use a Canvas to show the RenderTexture and it works perfectly, so I am capturing the camera into the RenderTexture correctly.
And with an ordinary skybox my reflective sphere works as expected.
Any idea?
Thank you.
@MarcoElz, care to share how you were able to convert ARCameraLighting to work with ARFoundation? I've gotten it working on Android, but I'm having a hard time with iOS.
Same here.