Camera RenderWithShader culling performance

Currently, when rendering a specific section of our game, we use Camera::RenderWithShader. The camera is set to render only a single layer (via its CullingMask), with a replacement tag that we know matches only one specific mesh.

Below is a capture. What I would like to know is whether there is a way to avoid the culling call, since we know nothing will be culled; that would save us 1.6% of our frame time on iOS…
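For reference, a minimal sketch of the setup described above (the layer name, component name, and where the shader is assigned are hypothetical; only the Camera API calls are as we use them):

```csharp
using UnityEngine;

public class ReplacementCapture : MonoBehaviour
{
    public Camera captureCamera;       // camera used only for this pass
    public Shader replacementShader;   // replacement shader for the pass

    void Start()
    {
        // Restrict the camera to a single layer; only one mesh lives on it.
        captureCamera.cullingMask = 1 << LayerMask.NameToLayer("CaptureOnly");
        captureCamera.enabled = false; // rendered manually, not every frame
    }

    public void Capture()
    {
        // This is the call where Camera::StandaloneCull shows up in the
        // trace below, even though the mask guarantees a single renderer.
        captureCamera.RenderWithShader(replacementShader, "RenderType");
    }
}
```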

Running Time Self Symbol Name
359.0ms 3.7% 0.0 m_wrapper_managed_to_native_UnityEngine_Camera_RenderWithShader_UnityEngine_Shader_string
359.0ms 3.7% 2.0 Camera_CUSTOM_RenderWithShader(ReadOnlyScriptingObjectOfType, ReadOnlyScriptingObjectOfType, ICallString)
352.0ms 3.6% 0.0 Camera::StandaloneRender(unsigned int, Shader*, std::string const&)
157.0ms 1.6% 0.0 Camera::StandaloneCull(Shader*, std::string const&, CullResults&)
155.0ms 1.6% 0.0 Camera::CustomCull(CameraCullingParameters const&, CullResults&)
117.0ms 1.2% 0.0 Unity::Scene::RecalculateDirtyBounds()
23.0ms 0.2% 0.0 CullScene(SceneCullingParameters&, CullResults&)
5.0ms 0.0% 0.0 Camera::PrepareSceneCullingParameters(CameraCullingParameters const&, RenderingPath, CullResults&)
4.0ms 0.0% 4.0 BaseRenderer::GetWorldAABB(AABB&)
4.0ms 0.0% 1.0 SetupShadowCullData(Camera&, Vector3f const&, ShaderReplaceData const&, SceneCullingParameters const*, ShadowCullData&)
1.0ms 0.0% 0.0 operator new(unsigned long, MemLabelId, bool, int, char const*, int)
1.0ms 0.0% 0.0 0xbf391e93
1.0ms 0.0% 1.0 Unity::Component::GetGameObject() const
1.0ms 0.0% 0.0 InitShaderReplaceData(Shader*, std::string const&, ShaderReplaceData&)

I would like to know this too - same situation, with a scene where nothing needs to be culled at all.