So, I’m working on a game where you create your own beast.
When you train this beast, the texture is supposed to change to reflect the change in stats.
There are four stats, so there are four different looks to mix between.
My first approach was to offer the artists only three different textures, with the mix being binary. That made 8 texture combinations for each beast, and I’d just cycle between them.
The artists then went mad. They don’t want that; they want more!
So, since a stat change doesn’t happen often, I figured it might be possible to keep the separate stat textures on the iPhone, have the game read them in, do the blending on the CPU, and save the result out as a texture (“currentBlend.alpha”).
My worry is texture bandwidth, but since this is alpha-only it would be just 8 bits per pixel (double the 4 bits of the compressed textures).
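To make the CPU-side blending step concrete, here is a minimal sketch of what I have in mind (plain C; the function name, weight scheme, and buffer layout are my own assumptions — in the real game the sources would come from the loaded texture files and the result would be uploaded as the currentBlend.alpha texture):

```c
#include <stdint.h>
#include <stddef.h>

/* Blend four 8-bit alpha/greyscale stat textures into one output
 * buffer, weighted by the beast's current stats. Assumes all four
 * sources and the destination have the same dimensions. */
void blend_stat_textures(const uint8_t *src[4],  /* four stat textures */
                         const float weights[4], /* stat weights, ideally summing to 1 */
                         uint8_t *dst,           /* blended result */
                         size_t num_pixels)
{
    for (size_t i = 0; i < num_pixels; ++i) {
        float v = 0.0f;
        for (int t = 0; t < 4; ++t)
            v += weights[t] * (float)src[t][i];
        if (v > 255.0f) v = 255.0f;   /* clamp in case weights sum above 1 */
        dst[i] = (uint8_t)(v + 0.5f); /* round to nearest */
    }
}
```

Since the blend only runs when a stat actually changes, it doesn’t have to be fast — it just has to finish before the next fight.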
The iPhone supports two-texture multitexturing, and the other channel is reserved for tattoos.
So my question to you is this: has anyone of you tried something similar? Would it be hopeless to expect this to even work?
What should I be looking at in order to actually code this system? I guess some integration of Objective-C/C++ is needed to produce the currentBlend.alpha image.
Would a combiner of alpha + color and decal, on a skinned mesh, times two (beasts fight each other), be too much strain on the system?
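For reference, this is the per-pixel arithmetic the two fixed-function combiner stages would be doing in the setup I’m imagining — interpolating between two looks by the blended alpha map, then decalling the tattoo on top. This is just a CPU sketch of that math (the struct and function names are mine, not GL calls); the formulas match what GL_INTERPOLATE and decal blending compute:

```c
#include <stdint.h>

typedef struct { float r, g, b, a; } Color;  /* components in 0..1 */

/* GL_INTERPOLATE-style combine: Arg0 * Arg2 + Arg1 * (1 - Arg2).
 * Here arg2 would come from the currentBlend.alpha texture. */
static Color interpolate(Color arg0, Color arg1, float arg2)
{
    Color c;
    c.r = arg0.r * arg2 + arg1.r * (1.0f - arg2);
    c.g = arg0.g * arg2 + arg1.g * (1.0f - arg2);
    c.b = arg0.b * arg2 + arg1.b * (1.0f - arg2);
    c.a = arg0.a * arg2 + arg1.a * (1.0f - arg2);
    return c;
}

/* Decal-style blend for the tattoo stage:
 * RGB = base * (1 - tattoo.a) + tattoo * tattoo.a, alpha kept from base. */
static Color decal(Color base, Color tattoo)
{
    Color c;
    c.r = base.r * (1.0f - tattoo.a) + tattoo.r * tattoo.a;
    c.g = base.g * (1.0f - tattoo.a) + tattoo.g * tattoo.a;
    c.b = base.b * (1.0f - tattoo.a) + tattoo.b * tattoo.a;
    c.a = base.a;
    return c;
}
```

On the GPU this is just two texture units doing one combine each per fragment, so my gut says the per-pixel cost is fine and fill rate is the thing to actually measure.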
Aaah, damn artists and game designers, they never opt for the easy way of doing things
Hope someone has some tips or opinions.