Hey.
Currently my project only needs single-finger support on iOS devices, and I am using the mouse Input API for that. Unity is nice enough that, as long as you only use one finger, the mouse Input API is supported alongside the touch Input API.
My solution would be to write well-structured object-oriented code with all the input (mouse/touch) related logic in one class. I would implement all the actions as mouse clicks and all the zooming/panning with the mouse scroll wheel (plus additional keyboard buttons if needed). Then I would either have event listeners attached to that class, or cast a ray from that class and trigger the right method on the colliding object. When you port the game to iOS, you add the touch input implementation.
At runtime you can detect whether your game is running on iOS or in the Editor and choose the correct class to capture your input events.
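At its simplest that check can look like this (MouseInputHandler and TouchInputHandler are made-up names standing in for your own two input classes):

using UnityEngine;

public class InputBootstrap : MonoBehaviour {
    void Awake() {
        // Pick the input implementation for the platform we are running on.
        // MouseInputHandler / TouchInputHandler are placeholders for your own classes.
        if (Application.platform == RuntimePlatform.IPhonePlayer) {
            gameObject.AddComponent<TouchInputHandler>();
        } else {
            // Editor and desktop builds fall back to mouse input.
            gameObject.AddComponent<MouseInputHandler>();
        }
    }
}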
In my experience the Touch API is more complex but easier to implement. More complex because it reports input from up to five different fingers (the mouse has only one cursor). Easier to implement because it gives you additional information like:
deltaPosition (how far the finger moved since the last frame)
phase: Began, Moved, Stationary, Ended
tapCount (how many times the user has tapped)
Some of those things you need to track manually when dealing with mouse input.
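For example, to get something like deltaPosition and phase with the mouse, you have to reconstruct them yourself. A rough sketch of the bookkeeping involved:

using UnityEngine;

public class MouseDeltaTracker : MonoBehaviour {
    private Vector3 lastMousePosition;

    void Update() {
        // The Touch API gives you deltaPosition for free; with the mouse
        // you have to diff against the previous frame's position yourself.
        Vector3 delta = Input.mousePosition - lastMousePosition;
        lastMousePosition = Input.mousePosition;

        // Phases also have to be reconstructed from the button state:
        if (Input.GetMouseButtonDown(0)) { /* Began */ }
        else if (Input.GetMouseButtonUp(0)) { /* Ended */ }
        else if (Input.GetMouseButton(0)) { /* Moved or Stationary, depending on delta */ }
    }
}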
UPDATE:
I will show you with some simplified code how I solved this. I hope this example helps you.
All the objects in my game that are touch/mouse sensitive have a component that implements an ITouch interface (everything you can trigger, move, grab… in the game). For the needs of my game that interface is defined like this:
public interface ITouch {
    // When touching an object
    void OnTouchDown();
    // When releasing an object
    void OnTouchUp();
    // When moving over an object without grabbing it
    void OnTouchMoveOver(InputController.TouchAction ta);
    // When moving over an object while grabbing it
    void OnTouchMoveGrab(InputController.TouchAction ta);
}
So that is all the info my interactive game objects need, where my TouchAction is basically the same as Unity's Touch struct. For example, in Fruit Ninja all fruit objects would have a component that implements ITouch with the method

public void OnTouchMoveOver(InputController.TouchAction ta) { CutInHalf(); }
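A fuller sketch of such a fruit component (Fruit and CutInHalf are made-up names, and the empty methods are only there to satisfy the interface):

using UnityEngine;

public class Fruit : MonoBehaviour, ITouch {
    public void OnTouchDown() { }
    public void OnTouchUp() { }
    public void OnTouchMoveGrab(InputController.TouchAction ta) { }

    // A finger sweeping over the fruit cuts it in half.
    public void OnTouchMoveOver(InputController.TouchAction ta) {
        CutInHalf();
    }

    private void CutInHalf() {
        // Spawn the two halves, play effects, etc., then remove the whole fruit.
        Destroy(gameObject);
    }
}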
Then all you need is a class that captures all the input information (mouse or touch) every frame and calls the ITouch methods on the correct objects. A short explanation of how that class works:
MOUSE
if (Input.GetMouseButtonDown(0)) {
    Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
    if (Physics.Raycast(ray, out RaycastHit hit)) {
        ITouch touchable = hit.collider.GetComponent<ITouch>();
        if (touchable != null) touchable.OnTouchDown();
    }
}
If you were to support the TOUCH API it would be something like:
if (Input.touchCount > 0) {
    foreach (Touch touch in Input.touches) {
        if (touch.phase == TouchPhase.Began) {
            Ray ray = Camera.main.ScreenPointToRay(touch.position);
            if (Physics.Raycast(ray, out RaycastHit hit)) {
                ITouch touchable = hit.collider.GetComponent<ITouch>();
                if (touchable != null) touchable.OnTouchDown();
            }
        }
    }
}
So when you transition from mouse input to touch input, all you need to do is update this one class that handles all your mouse/touch input. If it would help, I can paste the whole class, since it is a bit more complex.