I’m still pretty new to Unity as a whole, but I’m an absolute neophyte when it comes to Android. Looking on the Unity site I’ve only found a couple of very rudimentary tutorials about building for Android and handling multi-touch events.
But what about all the rest? I am, for example, about to dive into the docs to try and figure out how to detect the screen resolution, and that should be easy enough, but it seems like just the tip of the iceberg as far as mobile specific issues go.
Are there any good tutorials out there talking about using Unity with Android?
Hi, have you tried this site? I think it will help.
https://unity3d.com/learn/tutorials
Have fun
Of course I checked there. I’ve done a whole lot of the tutorials there, but there’s only a handful that are Android or mobile specific and those only cover a few small and very highly specific things. I’m looking for a more generalized Android+Unity tutorial. Something geared towards people who are new to Android as well as Unity.
As examples of the kind of information I’m looking for:
- Dealing with device screen resolutions. Right now I’m not even sure whether various Android devices share the same screen aspect ratio. I assume they do not. I can use the Screen class to get the resolutions available, which on my Galaxy S3 at least is just a single res. Is that the case with all Android devices? I don’t know, but once I have the current resolution, are there any special techniques or tools in Unity to make it easy to properly set up the play area? For instance, I’m currently porting a breakout game I made as my first Unity project, and I need to change the number of bricks in a row to accommodate the screen. I can write custom code to do this, but I’m wondering if there is anything in Unity itself that might help me with it.
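For what it’s worth, here’s the kind of thing I’ve been sketching for the brick rows (just a sketch; the orthographic camera and the brickWidth field are assumptions about my own setup, not anything Unity prescribes). The idea is to work in world units and derive the row count from the camera’s visible width, which already accounts for the device’s aspect ratio:

```csharp
using UnityEngine;

// Sketch only: derive how many bricks fit in a row from the camera's
// visible width instead of hard-coding a count. Assumes an orthographic
// camera and a hypothetical per-brick world-space width.
public class BrickLayout : MonoBehaviour
{
    public float brickWidth = 1f; // world-space width of one brick prefab

    void Start()
    {
        // For an orthographic camera, orthographicSize is the visible
        // half-height in world units; the visible width follows from the
        // aspect ratio, so this adapts to any device resolution.
        float worldWidth = Camera.main.orthographicSize * 2f * Camera.main.aspect;
        int bricksPerRow = Mathf.FloorToInt(worldWidth / brickWidth);
        Debug.Log("Bricks per row: " + bricksPerRow);
        // ...instantiate bricksPerRow bricks here...
    }
}
```

I don’t know if this is the idiomatic Unity way or if there’s a built-in layout facility I’m missing, which is exactly why I’d like a tutorial covering it.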
- A video tutorial I found on Youtube showed an example of using Unity Remote, and although I’d seen that topic in the table of contents of the Unity manual, I hadn’t read it yet because I was so busy looking up info on the main utility classes instead, which seemed the immediate need. Now that I have Unity Remote running I am much happier! Something to point out useful tools like this would be great.
Another thing I just noticed that would be nice to have explained: In a little test project I have this code:
float total = 0f;

void Update()
{
    //theCube.transform.Rotate( rotDir * Time.deltaTime * rotSpeed );
    if (Input.touchCount > 0)
    {
        Touch touch = Input.GetTouch(0);
        total += touch.deltaPosition.x;
        Debug.Log(total);
        // non-debugging stuff here
    }
}
This just prints out the running sum of all the delta positions as I swipe my finger across the screen from left to right. I expected that to add up to the screen width (720). Instead I’m getting a sum of 2341.96. Huh? Why is that? The Touch class documents the position and deltaPosition attributes as being measured in “pixel coordinates”, so I really don’t see how I can traverse more than three times as many pixels as there are on the screen, even if I assume that the starting and ending touch positions are not quite accurate and the various delta intervals in my swipe overlap each other.
Is deltaPosition not measured in pixels?
This is the kind of beginner information I’m looking for.
[EDIT] I just noticed that if I swipe my finger slowly I get a sum of more than 2800! That’s even more confusing.
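In case it helps anyone reproduce this, here’s the diagnostic I’m planning to try next (a sketch; TouchDeltaCheck is just my name for it). It resets the sum on each new touch and, when the touch ends, compares the accumulated deltaPosition.x values against the plain start-to-end distance in pixels, so the two numbers can be checked against each other directly:

```csharp
using UnityEngine;

// Debugging sketch: compare the accumulated deltaPosition.x values for a
// single swipe with the straight-line pixel distance between the touch's
// start and end positions.
public class TouchDeltaCheck : MonoBehaviour
{
    float total;
    Vector2 start;

    void Update()
    {
        if (Input.touchCount > 0)
        {
            Touch touch = Input.GetTouch(0);
            switch (touch.phase)
            {
                case TouchPhase.Began:
                    total = 0f;              // new swipe: reset the sum
                    start = touch.position;
                    break;
                case TouchPhase.Moved:
                    total += touch.deltaPosition.x;
                    break;
                case TouchPhase.Ended:
                    float direct = touch.position.x - start.x;
                    Debug.Log("delta sum: " + total + "  direct: " + direct);
                    break;
            }
        }
    }
}
```

If the two numbers disagree, that would at least confirm the discrepancy is in how the deltas are reported rather than in my summing.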