Edge detection mobile issue

I am working on a puzzle game and I want to highlight the puzzle piece the user has selected.

What I want: two cameras.

One running an edge detection script and rendering only the user-selected piece. Call this the Selection Camera:
Clear Flags: Depth
Culling Mask: Layer 1
Forward Rendering
Edge detection script

The other renders everything else. Call this the Scene Camera:
Clear Flags: Skybox
Culling Mask: Layer 0
Forward Rendering
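
For reference, this is roughly what that setup looks like when configured from a script instead of the Inspector (just a sketch; the component name, camera fields, and layer indices are placeholders for my setup):

```csharp
using UnityEngine;

// Sketch of the two-camera setup described above. Assumes both cameras are
// assigned in the Inspector and that the selected piece is on layer 1.
public class TwoCameraSetup : MonoBehaviour
{
    public Camera sceneCamera;      // renders everything except the selected piece
    public Camera selectionCamera;  // renders only the selected piece, runs edge detection

    void Start()
    {
        sceneCamera.clearFlags = CameraClearFlags.Skybox;
        sceneCamera.cullingMask = 1 << 0;                     // layer 0: rest of the scene
        sceneCamera.renderingPath = RenderingPath.Forward;

        selectionCamera.clearFlags = CameraClearFlags.Depth;  // keep the color buffer, clear only depth
        selectionCamera.cullingMask = 1 << 1;                 // layer 1: selected piece only
        selectionCamera.renderingPath = RenderingPath.Forward;
    }
}
```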

How it looks in the Editor: (screenshot)

How it looks on mobile: (screenshot)

If I switch the layers, everything renders, but of course the selected piece is now drawn below the others.

The problem does go away if I put the Scene Camera on deferred rendering, but that causes a huge performance hit (from 60+ fps down to 15 fps). Putting the Selection Camera on deferred does not fix it, which is odd to me, since that is the camera that seems to be rendering all that black.

I really don't know enough about the rendering pipeline to tell whether this is an issue with the edge detection script, the camera configuration, or Unity itself.

It seems to me the Selection Camera is not clearing to depth only, so there is nothing transparent left for the Scene Camera's output to show through underneath it. Is there a way to use forward rendering for edge detection with two cameras in a scene? I have tried a single camera with edge detection and it works fine with forward rendering; it only looks bad with two cameras.
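
As far as I understand it, a camera with Clear Flags set to Depth only composites over another camera's image if it renders after that camera, i.e. has a higher Camera.depth, so that is one thing worth verifying. A minimal sanity check (the camera references are placeholders):

```csharp
using UnityEngine;

// Warns if the Selection Camera would render before the Scene Camera; with
// Clear Flags = Depth it relies on the scene already being drawn underneath it.
public class CameraOrderCheck : MonoBehaviour
{
    public Camera sceneCamera;
    public Camera selectionCamera;

    void Start()
    {
        if (selectionCamera.depth <= sceneCamera.depth)
        {
            Debug.LogWarning("Selection Camera should have a higher depth than the Scene Camera so it renders last, on top of the already-drawn scene.");
        }
    }
}
```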

It would probably be easier to just draw a selection box instead of using image effects…

You could use a plugin, or just GL.Lines, a LineRenderer, or other approaches…
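
For example, a simple selection rectangle around the selected piece's bounds could look something like this (just a sketch, assuming a reasonably recent Unity version, a LineRenderer with a material already on the GameObject, and a puzzle laid out in the XY plane):

```csharp
using UnityEngine;

// Sketch: draws a rectangle around the selected piece's bounds with a LineRenderer.
[RequireComponent(typeof(LineRenderer))]
public class SelectionBox : MonoBehaviour
{
    public Renderer selectedPiece;   // renderer of the currently selected piece
    LineRenderer line;

    void Awake()
    {
        line = GetComponent<LineRenderer>();
        line.useWorldSpace = true;
        line.positionCount = 5;      // 4 corners + repeat the first point to close the loop
    }

    void LateUpdate()
    {
        if (selectedPiece == null) return;

        Bounds b = selectedPiece.bounds;
        float z = b.center.z;        // puzzle assumed to lie in the XY plane

        line.SetPosition(0, new Vector3(b.min.x, b.min.y, z));
        line.SetPosition(1, new Vector3(b.max.x, b.min.y, z));
        line.SetPosition(2, new Vector3(b.max.x, b.max.y, z));
        line.SetPosition(3, new Vector3(b.min.x, b.max.y, z));
        line.SetPosition(4, new Vector3(b.min.x, b.min.y, z));
    }
}
```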

If the puzzle shapes get complicated, you could clone the selected piece, set its color to white, scale it up a bit, and draw it behind the original piece…
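
A rough sketch of that clone idea (assuming the piece's material can be tinted through Renderer.material.color, e.g. an unlit shader; for sprites you would tint SpriteRenderer.color and use a lower sortingOrder instead of a position offset):

```csharp
using UnityEngine;

// Sketch: duplicate the selected piece, tint it white, scale it up a little,
// and nudge it away from the camera so the original draws in front and only
// the white border around it remains visible.
public static class OutlineClone
{
    public static GameObject Create(GameObject piece, Camera cam, float scale = 1.1f, float pushBack = 0.01f)
    {
        GameObject clone = Object.Instantiate(piece, piece.transform.position, piece.transform.rotation);
        clone.transform.localScale = piece.transform.localScale * scale;

        // Push the clone slightly behind the original, relative to the camera.
        clone.transform.position += cam.transform.forward * pushBack;

        foreach (Renderer r in clone.GetComponentsInChildren<Renderer>())
            r.material.color = Color.white;

        return clone;
    }
}
```

Create the clone when a piece gets selected and destroy it when the piece is deselected; in practice you would probably also strip colliders and scripts from the clone.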

There is also this outline effect for sprites: