iOS control conversion: mouse/keyboard --> touch

Hello users,

I have completed about 20% of my first indie project, and I haven't purchased the platform license (iOS, in my case) yet, so I haven't done any testing on an iPhone. However, I intend the game to take advantage of iOS's panning/zooming capabilities using finger-based controls.

For now, the only way I can test what I've done so far is with the keyboard and mouse on my MacBook. From what I understand, scripts for capturing finger-based controls are quite different from scripts for keyboard/mouse controls, so I assume I would have to make quite a few adjustments to my scripts before I could test the game on an iPhone.

So, I am debating whether I should continue down the current path (using keyboard/mouse input for testing) until the end of the development phase, OR start building and testing on the target device immediately, which means I need to purchase the iOS license and Apple's developer license NOW.

In general, how hard is it to change the "control" part of the scripts after development is done? Are there any best practices for Unity developers on this or a similar topic? If so, I wasn't able to find them. I am trying to avoid a significant effort to rework various parts of my custom scripts at the point where I thought I was ready to publish the game.

Any insights/advice will be much appreciated!

Hey.

Currently my project only needs single-finger support on the iOS device, and I am using the mouse input API for that. Unity is nice here: as long as you are only using one finger, the mouse input API keeps working alongside the touch input API.
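For example, a single handler like this reacts to a left click in the Editor and to a one-finger tap on the device (a minimal sketch; the class name is just for illustration):

using UnityEngine;

public class TapHandler : MonoBehaviour {

	void Update() {
		// On iOS a single-finger touch is also reported through the
		// mouse API, so this works in the Editor and on the device.
		if (Input.GetMouseButtonDown(0)) {
			Debug.Log("Tap/click at " + Input.mousePosition);
		}
	}
}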

My solution would be to write well-structured object-oriented code, with all the input-related (mouse/touch) logic in one class. I would implement all the actions as mouse clicks and all the zooming/panning with the mouse scroll wheel (plus additional keyboard buttons if needed). Then I would either have event listeners attached to that class, or I would raycast from that class and trigger the right method on the colliding object. When you port the game to iOS, you add the touch input implementation.

At runtime you can detect whether your game is running on iOS or in the Editor and choose the correct class to capture your input events.
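A minimal sketch of that check (the class name is hypothetical):

using UnityEngine;

public class InputSelector : MonoBehaviour {

	void Update() {
		if (Application.platform == RuntimePlatform.IPhonePlayer) {
			// Running on an iOS device: read Input.GetTouch(...) here.
		} else {
			// Editor or desktop build: read mouse/keyboard input here.
		}
	}
}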

In my experience the touch API is more complex but easier to implement. More complex because it reports input from up to 5 different fingers (the mouse has only 1 cursor). Easier to implement because it gives you additional information, such as:
- deltaPosition (how far the finger moved since the last frame)
- phase (Began, Moved, Stationary, Ended, Canceled)
- tapCount (how many times the user has tapped)
Some of those things you need to track manually when dealing with mouse input (see the snippet below).
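For illustration, a small sketch that reads those fields every frame:

using UnityEngine;

public class TouchInfoLogger : MonoBehaviour {

	void Update() {
		for (int i = 0; i < Input.touchCount; i++) {
			Touch touch = Input.GetTouch(i);
			// How far the finger moved since the last frame.
			Vector2 delta = touch.deltaPosition;
			// Began, Moved, Stationary, Ended or Canceled.
			TouchPhase phase = touch.phase;
			// How many times the user has tapped in quick succession.
			int taps = touch.tapCount;
			Debug.Log("finger " + touch.fingerId + ": " + phase + " " + delta + " " + taps);
		}
	}
}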

UPDATE:

Let me show you with some simplified code how I have solved this. I hope this example helps you.

All the objects in my game that are touch/mouse sensitive (everything you can trigger, move, grab… in the game) have a component that implements an ITouch interface. For the needs of my game that interface is defined like this:

public interface ITouch {

	// Called when a touch/click starts on the object
	void OnTouchDown();
	// Called when the touch/click is released
	void OnTouchUp();
	// Called when moving over the object without grabbing it
	void OnTouchMoveOver(InputController.TouchAction ta);
	// Called when moving over the object while grabbing it
	void OnTouchMoveGrab(InputController.TouchAction ta);
}

So that is all the information the game objects I need to interact with require, where my TouchAction is basically the same as Unity's Touch. For example, in Fruit Ninja all fruit objects would have a component that implements ITouch and implements the method:

public void OnTouchMoveOver(InputController.TouchAction ta) { CutInHalf(); }
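To make that concrete, here is a hypothetical Fruit component written against the ITouch interface above, with a minimal stand-in for InputController.TouchAction (as mentioned, in my project it is basically the same as Unity's Touch):

using UnityEngine;

// Minimal stand-in for the nested type referenced by ITouch.
public class InputController : MonoBehaviour {

	public struct TouchAction {
		public Vector2 position;       // screen position of the finger/cursor
		public Vector2 deltaPosition;  // movement since the last frame
		public TouchPhase phase;       // Began, Moved, Stationary, ...
	}
}

// Hypothetical example: a sliceable fruit.
public class Fruit : MonoBehaviour, ITouch {

	public void OnTouchDown() { }
	public void OnTouchUp() { }

	// Slice the fruit when a finger sweeps across it.
	public void OnTouchMoveOver(InputController.TouchAction ta) {
		CutInHalf();
	}

	public void OnTouchMoveGrab(InputController.TouchAction ta) { }

	void CutInHalf() {
		// Swap in a sliced model, play effects, then remove the fruit.
		Destroy(gameObject);
	}
}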

Then all you need is a class that captures all the input information (mouse or touch) every frame and calls the ITouch methods on the correct objects. A short explanation of how that class works:

MOUSE

if (Input.GetMouseButtonDown(0)) {

	// Raycast from the mouse coordinates into the scene.
	Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
	RaycastHit hit;
	if (Physics.Raycast(ray, out hit)) {
		// Get the touchable component on the colliding object.
		ITouch touchable = hit.collider.GetComponent<ITouch>();
		if (touchable != null) {
			touchable.OnTouchDown();
		}
	}
}

If you were to support the touch API, it would be something like:

if (Input.touchCount > 0) {

	// Check every finger that is currently touching the screen.
	for (int i = 0; i < Input.touchCount; i++) {
		Touch touch = Input.GetTouch(i);
		if (touch.phase != TouchPhase.Began) {
			continue;
		}
		// Raycast from the touch coordinates into the scene.
		Ray ray = Camera.main.ScreenPointToRay(touch.position);
		RaycastHit hit;
		if (Physics.Raycast(ray, out hit)) {
			// Get the touchable component on the colliding object.
			ITouch touchable = hit.collider.GetComponent<ITouch>();
			if (touchable != null) {
				touchable.OnTouchDown();
			}
		}
	}
}

So when you transition from mouse input to touch input, all you need to do is update this one class that handles all your mouse/touch input. If it would help, I can paste you the whole class, since it is a bit more complex.
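In the meantime, here is a rough skeleton of what that class can look like (a simplified sketch assuming the ITouch interface above, not my full class):

using UnityEngine;

public class InputController : MonoBehaviour {

	void Update() {
#if UNITY_IOS && !UNITY_EDITOR
		// Touch path: check every finger that started touching this frame.
		for (int i = 0; i < Input.touchCount; i++) {
			Touch touch = Input.GetTouch(i);
			if (touch.phase == TouchPhase.Began) {
				DispatchTouchDown(touch.position);
			}
		}
#else
		// Mouse path: the cursor acts as a single finger.
		if (Input.GetMouseButtonDown(0)) {
			DispatchTouchDown(Input.mousePosition);
		}
#endif
	}

	// Raycast from a screen position and notify the ITouch component
	// on whatever object the ray hits.
	void DispatchTouchDown(Vector2 screenPosition) {
		Ray ray = Camera.main.ScreenPointToRay(screenPosition);
		RaycastHit hit;
		if (Physics.Raycast(ray, out hit)) {
			ITouch touchable = hit.collider.GetComponent<ITouch>();
			if (touchable != null) {
				touchable.OnTouchDown();
			}
		}
	}
}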