So I’ve tried to add another texture into my texture-mixing machine (cough, the shader). However, being a noob in this area, I can’t spot the logic error in my code. I seem to be getting very bad alpha artifacts. My script works with two textures; I simply can’t transfer that logic to three.
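For what it’s worth, alpha artifacts in a three-way mix often come from trying to combine all three alphas at once instead of chaining the blends. A minimal sketch of the chained approach, assuming straight (non-premultiplied) alpha and hypothetical sampler names:

```hlsl
// Hypothetical samplers - names are placeholders, not from the original shader.
float4 texA = tex2D(_MainTex,   uv);
float4 texB = tex2D(_SecondTex, uv);
float4 texC = tex2D(_ThirdTex,  uv);

// Chain the blends: B over A first, then C over that result.
float3 col = lerp(texA.rgb, texB.rgb, texB.a);
col        = lerp(col,      texC.rgb, texC.a);
```

The key point is that the two-texture `lerp` generalises by feeding its result into another `lerp`, not by adding a third term.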
If they’re tangents, they’ll get modified as the batching comes into play (they get rotated along with the normals - as if they were vectors, regardless of what you’re storing in there) and so might not show what you’re after in-game…
UV: Location of diffuse texture in texture atlas.
UV2: Location of border texture in its own texture atlas.
Tangent: a standard 1:1 mapping that covers the full texture.
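That channel layout could be packed into a vertex input struct along these lines (a sketch; the member names are illustrative, and the vertex-colour/tangent roles are as described later in the thread):

```hlsl
// Sketch of the per-vertex layout described above.
struct appdata
{
    float4 vertex  : POSITION;
    float3 normal  : NORMAL;    // needed by the surface shader's lighting
    float2 uv      : TEXCOORD0; // diffuse tile location in the diffuse atlas
    float2 uv2     : TEXCOORD1; // border tile location in the border atlas
    float4 tangent : TANGENT;   // hijacked: .xy carries the 1:1 full-area mapping
    float4 color   : COLOR;     // border tint
};
```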
Would it possibly be better to, say, attempt to convert the first one into the second in the shader? I would think it’d be better to cache it on the mono side and send it in.
You might be able to use vertex colour for the border texture location?
If it’s an atlas and there’s not many tiles in the atlas, you might be alright (vertex colours are clamped 0-1 and only use 8bit precision rather than floating point).
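If you went that route, the 8-bit precision issue can be sidestepped by snapping the decoded value to the tile grid rather than trusting the raw colour. A sketch, assuming a hypothetical 4x4 atlas:

```hlsl
// Decoding an atlas tile offset stored in vertex colour.
// Vertex colours arrive in steps of 1/255, so round to the nearest
// tile boundary instead of using the raw value directly.
const float tiles = 4.0;                             // assumed atlas dimension
float2 tileIndex   = floor(v.color.rg * tiles + 0.5); // nearest tile index
float2 atlasOffset = tileIndex / tiles;               // clean 0, 0.25, 0.5, 0.75
```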
Oops, forgot to mention that vertex colors are used to color the borders. Sorry about that!
Tangents and normals are the only standard channels left. I think normals are needed for the surface shader? So I was trying to use the tangent’s four values to pass some extra data in. Is there a better way?
I don’t think so, sounds like you’ve used everything up!
So yeah, disabling batching sounds like your best bet there - or I think that ensuring that your tiles remain unrotated should mean that the batching doesn’t touch the normals or tangents (moving them around should be fine… scaling should be ok, I think?).
Hmm, well, this is disappointing. This work is for a framework, so I can’t expect customers to disable batching. I saw that Unity 5.0 has 4 UV channels, but the framework is nearing completion in a few weeks, not a few months.
On topic, where do you disable batching? I couldn’t find it; just trying to determine if that’s indeed the problem.
I think it’s in the player settings for the project.
By default, the non-Pro version only has dynamic batching (the game spends time at run-time combining meshes together).
Pro version has static batching (objects marked as static are combined into a big mesh at build-time, so you don’t waste time later combining them during run-time).
Is there no way to do this with two mapping channels?
I can imagine each tile simply has a flat mapping in channel 1. With just a diffuse texture applied all tiles will look the same to start out with.
Then you replace the simple diffuse texture with a texture atlas that contains all the different tiles. This can go for a normal map or others in the same way.
A secondary texture contains offsets in the atlas. Each tile is mapped to a single pixel in this secondary texture in channel 2.
// No filtering or mipmaps on the offset texture!
float2 offset = tex2D(offset_sampler, input.uv1).rg;
// 0.25 assumes a 4x4 atlas: each tile covers a quarter of the texture.
float3 diffuse = tex2D(diffuse_sampler, offset + 0.25 * input.uv0).rgb;
It would even be possible to place the offsets in channel 2 directly.
I don’t think we are on the same page here. I use one channel for the diffuse mapping: the first UV channel contains the UV mapping for the diffuse texture in the texture atlas. The second channel has the mapping for the border texture atlas, which is mapped completely differently. Vertex colors are used to tint the border texture. Now I’m trying to use tangents as a third channel, to hold the 1:1 mapping where the UV fills the entire area.
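Put together, the fragment side of that setup might look something like this (a sketch only; the sampler names, `Input` field names, and the blend order are assumptions, not the original shader):

```hlsl
// IN.uv_MainTex    : diffuse tile in the diffuse atlas  (TEXCOORD0)
// IN.uv2_BorderTex : border tile in the border atlas    (TEXCOORD1)
// IN.tangentUV     : 1:1 full-area mapping smuggled in via tangent.xy
float4 diffuse = tex2D(_MainTex,   IN.uv_MainTex);
float4 border  = tex2D(_BorderTex, IN.uv2_BorderTex);
border.rgb    *= IN.color.rgb;                       // vertex colour tints the border
float4 overlay = tex2D(_OverlayTex, IN.tangentUV);   // whatever full-area texture you need

float3 col = lerp(diffuse.rgb, border.rgb, border.a);
```

The caveat from earlier in the thread still applies: batching can rotate the tangents, which would corrupt `IN.tangentUV` unless the tiles stay unrotated.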
Doing that required rewriting all of the vertex buffer code for someone (since it was, uhm, not terribly clever before), and was several months of work. It’s not something that can be easily brought back to 4.x.