Animating signal flow along paths?

I am making a first-person puzzle game for PC and mobile. The goal is to activate signal sources and make signals (“electricity”) flow to destinations along the surface of the walls.

You can appreciate my art skills in this beautiful example.

The world consists of cubical tiles which can move, so I cannot merge the environment into a single mesh. For ease of prototyping, I even compose these cubes from 6 quads (faces) to be able to change them independently without adjusting the UVs. Each cube face is basically a junction of signal paths, or just a straight path.

My problem is animating the signal flow. When the user activates the source, I want the red signal to slowly fill the connected paths (example above).

My first attempt was a shader which renders 0 to 4 “arms” of the cross-shaped signal junction. These parameters are per-instance, so each cube face in the world has its own set. Rendering the path itself is easy: I just need to pass 4 floats to the shader (whether to render top, right, bottom and left arms). But animation makes it much more difficult.

I need to specify 1 of the 4 directions to animate the “incoming” signal from one of the sides to the center of the junction. And then I need to specify 0 to 3 sides to which the “outgoing” signal should flow from the center.

So my per-face data looks roughly like this:

directions: left | right | top | bottom (any combination)
ingoing side: none | left | right | top | bottom (only one)
outgoing sides: none | left | right | top | bottom (any combination)
last animation start time: a single float

How do I even pass this info to a shader? In C# I would use bit flags, but shaders seem to lack the bitwise operators required to implement them. The only way I know is passing about 20 separate flags with a value of 0 or 1 (“should signal flow out to the top?”, “should signal flow in from the left?”, etc). And passing 20 floats for every face rendered seems excessive to me, especially on mobile.
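For scale, the whole per-face state above fits in about a dozen bits. Here is a sketch of the bit-flag encoding I have in mind, in Python purely to illustrate the arithmetic (the bit layout and field names are my own assumption, not anything Unity-specific; the start time would stay a separate float):

```python
# Hypothetical bit layout for one cube face:
#   bits 0-3:  which arms exist (left/right/top/bottom)
#   bits 4-6:  incoming side (0 = none, 1-4 = left/right/top/bottom)
#   bits 7-10: outgoing sides (same flags as the arms)
LEFT, RIGHT, TOP, BOTTOM = 1, 2, 4, 8  # arm bit flags

def pack_face(arms, incoming, outgoing):
    """Pack the per-face signal state into one small integer (11 bits)."""
    assert 0 <= arms <= 0b1111 and 0 <= incoming <= 4 and 0 <= outgoing <= 0b1111
    return arms | (incoming << 4) | (outgoing << 7)

def unpack_face(packed):
    """Recover (arms, incoming, outgoing) from the packed value."""
    return packed & 0b1111, (packed >> 4) & 0b111, (packed >> 7) & 0b1111

# A junction with all four arms, signal coming in from the left (code 1)
# and flowing out to the top and bottom:
packed = pack_face(LEFT | RIGHT | TOP | BOTTOM, 1, TOP | BOTTOM)
print(packed, unpack_face(packed))  # 11 bits instead of ~12 separate floats
```

The same shifts and masks translate directly to C# on the CPU side; the open question for me is the shader side.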

Is there a better way to pass animation & directions information to the shader? Is a custom shader even the correct approach, or can I animate things in a simpler way? Should I just not worry about passing 20-30 float parameters per cube face and breaking batching/instancing? Maybe I should split the meshes even further, and instead of a cube face render a triangle per junction “arm”?

I am looking for any advice, whether for shader writing or for changing my approach entirely.

Shaders do have all the usual bit operations, so that’s actually a perfectly valid way of doing it.

What’s a bit problematic is getting the data into the shader, since Unity doesn’t actually allow you to set integers on a material (SetInt casts the value to a float internally). What should work is stuffing the bits into a float using System.BitConverter and then using asuint(float) in the shader to convert it back to an unsigned integer.

Alternatively you could make sure that you use “small enough” ints that convert perfectly to floats and back to ints. That should work for ints up to 2^24.
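Both transport tricks can be sanity-checked outside Unity. A Python sketch, where struct stands in for System.BitConverter on the C# side and for asuint on the HLSL side:

```python
import struct

def bits_to_float(bits):
    """Reinterpret a 32-bit pattern as a float32 (C#: BitConverter;
    in the shader, asuint() undoes this without any rounding)."""
    return struct.unpack('<f', struct.pack('<I', bits))[0]

def float_to_bits(f):
    """The inverse reinterpretation: float32 back to its raw 32 bits."""
    return struct.unpack('<I', struct.pack('<f', f))[0]

# Trick 1: raw bit transport survives the float round trip exactly.
flags = 0b10110110101  # 11 bits of per-face state
assert float_to_bits(bits_to_float(flags)) == flags

# Trick 2: "small enough" ints survive a plain int -> float32 -> int
# conversion, exactly up to 2^24.
as_f32 = lambda x: struct.unpack('<f', struct.pack('<f', x))[0]
assert as_f32(2 ** 24) == 2 ** 24          # still exact
assert as_f32(2 ** 24 + 1) == 2 ** 24      # first integer float32 loses
```

One caution I’d add: a bit pattern with only the low bits set reinterprets as a denormal float, and some mobile GPUs flush denormals to zero, so on mobile the small-int route may be the more robust of the two.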