Hey guys
I’ve been following this tutorial to get the picture from the camera on the CPU. The problem I encounter is that whenever I call the function that grabs the image, one of two things happens:
- If multithreaded rendering is enabled, the frame rate drops all the way down to ~10 FPS
- If multithreaded rendering is disabled, I get a completely black screen
Is there any other parameter I should change?
Unity 2019.1.14f1
ARFoundation 2.1.4
ARCore XR Plugin 2.1.2
Tested on a Xiaomi Mi Mix 2
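For context, I call the function once per frame. A simplified sketch of the call site (the actual script just polls from Update on the same MonoBehaviour that holds the camera manager reference):

// Simplified call site (assumption: GetImage() is polled every frame from
// Update() on the MonoBehaviour that also holds m_CameraManager).
void Update()
{
    GetImage();
}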
The code to get the camera image on the CPU is the following:
// Usings needed at the top of the file:
// using System;
// using Unity.Collections;
// using Unity.Collections.LowLevel.Unsafe;
// using UnityEngine;
// using UnityEngine.XR.ARFoundation;

private unsafe void GetImage()
{
    XRCameraImage image;
    if (m_CameraManager.TryGetLatestImage(out image))
    {
        var conversionParams = new XRCameraImageConversionParams
        {
            // Get the entire image
            inputRect = new RectInt(0, 0, image.width, image.height),

            // Downsample by 2
            outputDimensions = new Vector2Int(image.width / 2, image.height / 2),

            // Choose RGBA format
            outputFormat = TextureFormat.RGBA32,

            // Flip across the vertical axis (mirror image)
            transformation = CameraImageTransformation.MirrorY
        };

        // See how many bytes we need to store the final image.
        int size = image.GetConvertedDataSize(conversionParams);

        // Allocate a buffer to store the image.
        var buffer = new NativeArray<byte>(size, Allocator.Temp);

        // Extract the image data.
        image.Convert(conversionParams, new IntPtr(buffer.GetUnsafePtr()), buffer.Length);

        // The image was converted to RGBA32 format and written into the provided buffer,
        // so we can dispose of the XRCameraImage. We must do this or it will leak resources.
        image.Dispose();

        // At this point, we could process the image, pass it to a computer vision
        // algorithm, etc. Here we just put it into a texture to visualize it.
        m_Texture = new Texture2D(
            conversionParams.outputDimensions.x,
            conversionParams.outputDimensions.y,
            conversionParams.outputFormat,
            false);
        m_Texture.LoadRawTextureData(buffer);
        m_Texture.Apply();

        // Done with the temporary buffer.
        buffer.Dispose();
    }
}
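In case it matters, here is roughly how I display the result afterwards (a minimal sketch; m_RawImage is an assumed field, a UnityEngine.UI.RawImage assigned in the Inspector, not part of the tutorial code):

// Assumed display helper: pushes the converted texture into a UI RawImage.
[SerializeField] RawImage m_RawImage; // requires: using UnityEngine.UI;

void ShowTexture()
{
    m_RawImage.texture = m_Texture;
}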
Thanks!