Hi there experts,
I'll try to describe things as clearly as I can…
I’m trying to build a custom text box dynamically from a string, using alpha-mapped letters (with shadows) taken from a large 32-bit texture that contains all the letters in a grid. I’m after the most memory- and speed-efficient way to implement this on iOS.
I first tried using a single texture on the text panel and calling GetPixels() / SetPixels() / Apply() to copy individual letters from the atlas onto the panel texture. This failed on two counts: Apply() is cripplingly slow on iOS, and GetPixels() doesn’t work with alpha-channel texture formats on iOS (a bug? It’s mentioned elsewhere on this iOS forum).
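For reference, this is roughly the pattern I was using; a minimal sketch, with the cell size and atlas layout made up for illustration:

```csharp
using UnityEngine;

// Rough sketch of the failed approach: copy one letter's pixel block
// out of the font atlas into the panel texture, then Apply().
// The Apply() call is the part that kills the frame rate on device.
public class SetPixelsTextBox : MonoBehaviour
{
    public Texture2D fontAtlas;  // 32-bit atlas with the letter grid
    public Texture2D panel;      // texture on the text panel
    public int cellSize = 32;    // pixel size of one letter cell (assumed)

    void CopyLetter(int gridX, int gridY, int destX, int destY)
    {
        Color[] pixels = fontAtlas.GetPixels(
            gridX * cellSize, gridY * cellSize, cellSize, cellSize);
        panel.SetPixels(destX, destY, cellSize, cellSize, pixels);
        panel.Apply();  // cripplingly slow on iOS
    }
}
```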
Next I thought I’d instantiate a series of prefabs of a single-letter quad (a Maya mesh) and adjust the material’s UV offset so each instance shows the correct letter from the texture grid; a rough sketch of what I mean is below. My worry is that this creates a separate material for every letter, which seems very inefficient with hundreds of characters per text box. Does a cloned material (cloned because I changed its UV offset) mean the texture memory is cloned as well (very bad!), or do cloned materials still share the same texture memory?
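Something like this; a minimal sketch, assuming a 16×16 atlas starting at ASCII space, and that the quad’s UVs already cover exactly one cell:

```csharp
using UnityEngine;

// Sketch of the prefab idea: one quad per character, with the material's
// texture offset shifted to the right cell of the atlas. Accessing
// renderer.material clones the material per instance, which is exactly
// the part I'm worried about.
public class LetterQuad : MonoBehaviour
{
    public int gridCols = 16;  // letters per row in the atlas (assumed)
    public int gridRows = 16;

    public void SetLetter(char c)
    {
        int index = c - 32;  // assuming the grid starts at ASCII space
        float u = (index % gridCols) / (float)gridCols;
        float v = 1f - ((index / gridCols) + 1) / (float)gridRows;
        GetComponent<Renderer>().material.mainTextureOffset = new Vector2(u, v);
    }
}
```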
Another option is to create a grid of mesh quads in Maya, export the grid as a single mesh, and then update the UVs directly through the mesh’s UV array (would this bypass the material-clone problem of using a material offset?), roughly like the sketch below. The downside is that the text box size (chars wide / high) is hard-built into the mesh.
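A minimal sketch of that, assuming the exporter keeps four unshared vertices per quad in a known order (which I’d have to verify):

```csharp
using UnityEngine;

// Sketch for the exported-grid idea: rewrite the UVs of the imported
// mesh directly, four verts per character quad. One mesh, one shared
// material, so no material cloning should be involved.
public class GridTextBox : MonoBehaviour
{
    public int gridCols = 16, gridRows = 16;  // atlas layout (assumed)

    public void SetText(string text)
    {
        Mesh mesh = GetComponent<MeshFilter>().mesh;
        Vector2[] uvs = mesh.uv;
        float cw = 1f / gridCols, ch = 1f / gridRows;

        for (int i = 0; i < text.Length && i * 4 + 3 < uvs.Length; i++)
        {
            int index = text[i] - 32;  // atlas assumed to start at ASCII space
            float u = (index % gridCols) * cw;
            float v = 1f - ((index / gridCols) + 1) * ch;
            // assumed vertex order per quad: BL, BR, TL, TR
            uvs[i * 4 + 0] = new Vector2(u, v);
            uvs[i * 4 + 1] = new Vector2(u + cw, v);
            uvs[i * 4 + 2] = new Vector2(u, v + ch);
            uvs[i * 4 + 3] = new Vector2(u + cw, v + ch);
        }
        mesh.uv = uvs;
    }
}
```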
I think the most versatile (but most complicated) approach would be to build my own quad grid mesh from scratch and update the vertex UVs for each quad directly; I can probably work that out if it’s really necessary. A rough sketch of the mesh building is below.
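A minimal sketch, under the same assumptions as above; the UVs would then be filled per character exactly as in the previous sketch:

```csharp
using UnityEngine;

// Sketch of building the quad grid in code, so the box size isn't
// baked into a Maya asset. One mesh, one material.
[RequireComponent(typeof(MeshFilter))]
public class ProceduralTextMesh : MonoBehaviour
{
    public int charsWide = 16, charsHigh = 4;
    public float charSize = 1f;

    public void BuildGrid()
    {
        int quads = charsWide * charsHigh;
        Vector3[] verts = new Vector3[quads * 4];
        int[] tris = new int[quads * 6];

        for (int q = 0; q < quads; q++)
        {
            int x = q % charsWide, y = q / charsWide;
            int v = q * 4, t = q * 6;
            // vertex order matches the previous sketch: BL, BR, TL, TR;
            // rows of characters run downward from the origin
            verts[v + 0] = new Vector3(x * charSize, -(y + 1) * charSize, 0);
            verts[v + 1] = new Vector3((x + 1) * charSize, -(y + 1) * charSize, 0);
            verts[v + 2] = new Vector3(x * charSize, -y * charSize, 0);
            verts[v + 3] = new Vector3((x + 1) * charSize, -y * charSize, 0);
            // two triangles per quad, wound clockwise to face the camera
            tris[t + 0] = v;     tris[t + 1] = v + 2; tris[t + 2] = v + 1;
            tris[t + 3] = v + 2; tris[t + 4] = v + 3; tris[t + 5] = v + 1;
        }

        Mesh mesh = new Mesh();
        mesh.vertices = verts;
        mesh.uv = new Vector2[quads * 4];  // filled per character, as above
        mesh.triangles = tris;
        GetComponent<MeshFilter>().mesh = mesh;
    }
}
```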
Have I misunderstood something silly?
Any advice super appreciated!
Richard