VK_KHR_sampler_ycbcr_conversion for Texture2D.CreateExternalTexture

In the Native Vulkan Rendering Plugin we use video as a source of textures. Currently we have to convert decoded video frames from a YCbCr format (yuv420p/nv12, common for video) to RGBA32 (common for Unity). AMediaCodec on Android, or ffmpeg with VAAPI/NVDEC on other platforms, delivers decoded video frames as VkImages in VK_FORMAT_G8_B8R8_2PLANE_420_UNORM or VK_FORMAT_G8_B8_R8_3PLANE_420_UNORM format, which must then be sampled through an immutable sampler with a VkSamplerYcbcrConversion attached. Unity's Texture2D.CreateExternalTexture, however, expects a VkImage in VK_FORMAT_R8G8B8A8_UNORM or VK_FORMAT_R8G8B8A8_SRGB format.

Because of this, we have to convert each VkImage one to one by rendering the decoded video frame to an RGBA32 framebuffer of the same size, using the VkSamplerYcbcrConversion in a simple full-screen-triangle shader. This step is redundant: all of it could be done in a zero-copy way. The only problem is that Texture2D.CreateExternalTexture has no support for importing YCbCr textures. It accepts only a pointer to a VkImage in its IntPtr argument, but a second handle would be needed as well: a VkSampler with VkSamplerYcbcrConversion enabled. It is not possible to fuse both into a single combined image-sampler handle.
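For context, this is roughly what the plugin-side sampler setup looks like today. This is a hedged sketch, not the actual plugin code: the function name is made up, error handling is omitted, and the color model and chroma locations are assumptions that depend on the video stream. It assumes a VkDevice created with the samplerYcbcrConversion feature enabled (core in Vulkan 1.1, or via VK_KHR_sampler_ycbcr_conversion).

```c
#include <vulkan/vulkan.h>
#include <stddef.h>

/* Sketch: create a VkSampler with YCbCr conversion for an NV12-style
 * two-plane image. The resulting sampler must be baked into the
 * descriptor set layout as an immutable sampler; this is exactly the
 * second handle that Texture2D.CreateExternalTexture cannot accept. */
VkSampler create_ycbcr_sampler(VkDevice device,
                               VkSamplerYcbcrConversion *out_conversion)
{
    VkSamplerYcbcrConversionCreateInfo conv_info = {
        .sType = VK_STRUCTURE_TYPE_SAMPLER_YCBCR_CONVERSION_CREATE_INFO,
        .format = VK_FORMAT_G8_B8R8_2PLANE_420_UNORM, /* nv12 layout */
        .ycbcrModel = VK_SAMPLER_YCBCR_MODEL_CONVERSION_YCBCR_709, /* assumption */
        .ycbcrRange = VK_SAMPLER_YCBCR_RANGE_ITU_NARROW,
        .components = { VK_COMPONENT_SWIZZLE_IDENTITY,
                        VK_COMPONENT_SWIZZLE_IDENTITY,
                        VK_COMPONENT_SWIZZLE_IDENTITY,
                        VK_COMPONENT_SWIZZLE_IDENTITY },
        .xChromaOffset = VK_CHROMA_LOCATION_MIDPOINT, /* stream-dependent */
        .yChromaOffset = VK_CHROMA_LOCATION_MIDPOINT,
        .chromaFilter = VK_FILTER_LINEAR,
        .forceExplicitReconstruction = VK_FALSE,
    };
    vkCreateSamplerYcbcrConversion(device, &conv_info, NULL, out_conversion);

    /* The conversion object is chained into the sampler via pNext. */
    VkSamplerYcbcrConversionInfo conv_attach = {
        .sType = VK_STRUCTURE_TYPE_SAMPLER_YCBCR_CONVERSION_INFO,
        .conversion = *out_conversion,
    };
    VkSamplerCreateInfo sampler_info = {
        .sType = VK_STRUCTURE_TYPE_SAMPLER_CREATE_INFO,
        .pNext = &conv_attach,
        .magFilter = VK_FILTER_LINEAR,
        .minFilter = VK_FILTER_LINEAR,
        .addressModeU = VK_SAMPLER_ADDRESS_MODE_CLAMP_TO_EDGE,
        .addressModeV = VK_SAMPLER_ADDRESS_MODE_CLAMP_TO_EDGE,
        .addressModeW = VK_SAMPLER_ADDRESS_MODE_CLAMP_TO_EDGE,
    };
    VkSampler sampler = VK_NULL_HANDLE;
    vkCreateSampler(device, &sampler_info, NULL, &sampler);
    return sampler;
}
```

If CreateExternalTexture accepted a second IntPtr carrying such a sampler (or the VkSamplerYcbcrConversion handle), the intermediate RGBA32 render pass would no longer be needed.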
Another possible improvement would be the ability to export both the VkImage and the VkSamplerYcbcrConversion (or VkSampler) handles of a Unity-side created Texture via Texture.GetNativeTexturePtr.