Trying to render a stack of cameras

I am developing a photo camera mechanic for my escape room game. Basically, I render the camera to a RenderTexture and then save it as a Texture2D for the player to view later.

Unity Version: 6000.0.23f1

This is the code.

    IEnumerator getIcon() {

        // Wait until rendering for this frame has finished before capturing.
        yield return new WaitForEndOfFrame();

        int width = 1024;
        int height = 1024;

        // Temporary render target for the capture.
        RenderTexture tempRender = new RenderTexture(width, height, 24);
        tempRender.mipMapBias = 0;
        tempRender.useMipMap = false;

        Texture2D tex = new Texture2D(width, height, TextureFormat.RGBA32, false);

        // Render the camera into the temporary render texture.
        RenderTexture previousActive = RenderTexture.active;
        cam.targetTexture = tempRender;
        RenderTexture.active = tempRender;
        cam.Render();

        // Copy the active render texture into the Texture2D.
        tex.ReadPixels(new Rect(0, 0, width, height), 0, 0, false);
        tex.Apply();

        // Restore state and release the temporary render texture so it does not leak.
        cam.targetTexture = null;
        RenderTexture.active = previousActive;
        tempRender.Release();
        Destroy(tempRender);

        photos.Insert(0, tex);
    }
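
For context, this is roughly how the coroutine gets used; the input binding and the `photoDisplay` RawImage are simplified placeholders rather than my exact code, and this sits in the same script as `getIcon()` and `photos`:

    // Simplified illustration (same script as getIcon/photos): take a photo on a
    // key press and show the most recent one in a RawImage.
    // Requires "using UnityEngine.UI;" for RawImage.
    public RawImage photoDisplay; // placeholder UI element for viewing photos

    void Update() {
        if (Input.GetKeyDown(KeyCode.P)) { // placeholder input binding
            StartCoroutine(getIcon());
        }

        if (photos.Count > 0) {
            photoDisplay.texture = photos[0]; // newest photo is at index 0
        }
    }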

There is the Main Camera, which is the base of a stack of other cameras, and there is the Item Inspection Camera, which is only enabled when the player wants to inspect an item.
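
To illustrate the setup, here is a minimal sketch of how the Item Inspection Camera could be toggled on the camera stack (assuming URP camera stacking; the class and field names are just for illustration):

    using UnityEngine;
    using UnityEngine.Rendering.Universal;

    // Illustrative sketch, assuming URP camera stacking: the Item Inspection
    // Camera is an Overlay camera that only sits on the Base camera's stack
    // while the player is inspecting an item.
    public class InspectionCameraToggle : MonoBehaviour {
        public Camera mainCam;       // Render Type: Base
        public Camera inspectionCam; // Render Type: Overlay

        public void BeginInspection() {
            inspectionCam.gameObject.SetActive(true);
            var stack = mainCam.GetUniversalAdditionalCameraData().cameraStack;
            if (!stack.Contains(inspectionCam)) {
                stack.Add(inspectionCam);
            }
        }

        public void EndInspection() {
            mainCam.GetUniversalAdditionalCameraData().cameraStack.Remove(inspectionCam);
            inspectionCam.gameObject.SetActive(false);
        }
    }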

The mechanic works when the player takes a photo without activating the Item Inspection Camera.

However, when the player activates the Item Inspection Camera to inspect an item and then takes a photo, the image generated from the main camera is rendered at 25% of the resolution, while the image from the inspection camera is full size.
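
In case the numbers help, here is a small debug check (only standard Camera and Screen properties, nothing project-specific) that can be called with the inspection camera on and off to compare the sizes involved:

    // Debug helper: log the screen size, the camera's pixel size and viewport
    // rect, and the size of its target texture, if any.
    void LogCaptureSizes(Camera captureCam) {
        Debug.Log($"Screen: {Screen.width}x{Screen.height}");
        Debug.Log($"{captureCam.name}: {captureCam.pixelWidth}x{captureCam.pixelHeight}, rect {captureCam.rect}");
        if (captureCam.targetTexture != null) {
            Debug.Log($"{captureCam.name} targetTexture: {captureCam.targetTexture.width}x{captureCam.targetTexture.height}");
        }
    }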

I render the items inside a Canvas whose Render Camera is the Item Inspection Camera.
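
For reference, the Canvas configuration is equivalent to this (it is set up in the Inspector; `InspectionCanvasSetup` and the field names are just illustrative):

    using UnityEngine;

    // Illustrative sketch of the Canvas setup (normally configured in the
    // Inspector): Screen Space - Camera, rendered by the Item Inspection Camera.
    public class InspectionCanvasSetup : MonoBehaviour {
        public Canvas inspectionCanvas;
        public Camera inspectionCam;

        void Awake() {
            inspectionCanvas.renderMode = RenderMode.ScreenSpaceCamera;
            inspectionCanvas.worldCamera = inspectionCam; // "Render Camera" in the Inspector
        }
    }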

I tried changing every option I could find in Unity to see if anything would fix the problem, but nothing did.

Is there anything that can be done to solve this problem?