I recently switched from a 2D isometric tilemap to a 2D top-down tilemap for my game. We’re using Unity 2019.4.6f1. With the isometric tilemap, I would use CellToWorld to get the position for spawning objects. The problem was that if I wanted an object to spawn on cell (7, 4) according to the Coordinate Brush, the object would always, without exception, spawn at cell (6, 3). WorldToCell always seemed correct, but CellToWorld always had this bizarre off-by-one error. I found it perplexing but dealt with it in a simple way: internally keeping track of the cell number and adding +1 to the x and y coordinates whenever necessary.
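For illustration, the +1 workaround looked something like this (a sketch, not my exact code; the `tilemap` field and method name are assumptions):

```csharp
using UnityEngine;
using UnityEngine.Tilemaps;

public class IsoSpawner : MonoBehaviour
{
    public Tilemap tilemap;

    // Workaround for the off-by-one on the isometric map: offset the
    // internally tracked cell by (+1, +1) before converting to world space.
    public Vector3 CellToWorldCorrected(Vector3Int cell)
    {
        Vector3Int corrected = new Vector3Int(cell.x + 1, cell.y + 1, cell.z);
        return tilemap.CellToWorld(corrected);
    }
}
```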
For production reasons I’ve switched to a 2D top-down tilemap, and to my dismay CellToWorld seems to be giving me incorrect information again. This time, I don’t really know how to fix it. Here are a few pictures:
This is the Coordinate Brush telling us that the tile to the right of the bed, which is where I want to spawn the player, is at cell (-8, -2). In the code, I can verify that the Vector3Int passed into the CellToWorld function is (-8, -2). However, when I play the game, the character spawns here:
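The spawn code is essentially this (a hedged reconstruction; `tilemap` and `player` are assumed references). Note that CellToWorld returns the cell's origin rather than its center, so if the goal is the middle of the tile, GetCellCenterWorld is the variant that targets that directly:

```csharp
using UnityEngine;
using UnityEngine.Tilemaps;

public class PlayerSpawner : MonoBehaviour
{
    public Tilemap tilemap;
    public Transform player;

    void Start()
    {
        Vector3Int spawnCell = new Vector3Int(-8, -2, 0);

        // CellToWorld converts the cell position (its origin corner),
        // not the visual center of the tile.
        player.position = tilemap.CellToWorld(spawnCell);

        // Alternative that lands in the middle of the cell:
        // player.position = tilemap.GetCellCenterWorld(spawnCell);
    }
}
```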
Somewhere in the middle of the tile below and to the left of where we actually want them to spawn.
Does anyone have any experience with CellToWorld giving strange world coordinates? The only special thing I’m doing with the tilemap is setting its scale to x = 0.5, y = 0.5 so the tiles are smaller. Even with the tilemap scale at 1 it’s not working. Thanks for any help.
I use that method extensively, both CellToWorld and its interpolated variant, and I’ve never had this problem except when there’s float inaccuracy. I’ve only encountered that when converting with WorldToCell and then back with CellToWorld while very close to a corner. This smells like the same thing. Perhaps you could write an extension method that uses the interpolated version of the method and then RoundToInt the result yourself? That way it won’t floor a float like 6.999, which (IIRC) is what the method does. At the very least, using the interpolated version should tell you whether it’s a floating-point error issue.
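A minimal sketch of such an extension method, assuming the grid is reachable via `Tilemap.layoutGrid` (the method name is made up):

```csharp
using UnityEngine;
using UnityEngine.Tilemaps;

public static class TilemapExtensions
{
    // Converts a world position to a cell index by rounding the interpolated
    // cell coordinates, instead of the flooring that WorldToCell performs.
    // With this, a float inaccuracy like (6.999, 3.999) rounds to (7, 4)
    // rather than flooring to (6, 3).
    public static Vector3Int WorldToCellRounded(this Tilemap tilemap, Vector3 worldPosition)
    {
        Vector3 local = tilemap.layoutGrid.WorldToLocal(worldPosition);
        Vector3 cell = tilemap.layoutGrid.LocalToCellInterpolated(local);
        return new Vector3Int(
            Mathf.RoundToInt(cell.x),
            Mathf.RoundToInt(cell.y),
            Mathf.RoundToInt(cell.z));
    }
}
```

Logging the interpolated `cell` value before rounding should make it obvious whether floating-point drift is the culprit.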
Another thing to try, though it would surprise me if it made a difference, would be completely deleting the tilemap/grid and recreating it. Perhaps fiddling with its settings left some internal value slightly off, so your (7, 4) becomes (6.999, 3.999) and thus comes out as (6, 3).
This sounds a bit strange to me. Have you tried adjusting the PPU (pixels per unit) on your textures to achieve the same scaling instead?
No luck with deleting the tilemap, but I’m interested to learn it has always worked for you. I checked the numbers and they’re quite round: the world position I’m getting is (-1, 4), and the player transform is being set to that. I consistently struggled with the off-by-one problem for every tile on the isometric map, and that was a completely different tilemap, so I’m guessing there’s something about my setup that’s causing this. I think I’m going to invest in Super Tilemap Editor, which allows for tile parameters, something the Unity implementation doesn’t seem to support. I’ll also look at changing the PPU instead of messing with the tilemap’s scale in the future. Thanks for your reply!
My suspicion would be that you have inconsistent anchoring of the tiles compared to your GameObjects. All placements are based on their anchors, so if they are inconsistent they won’t match up when placed.
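One quick way to investigate, assuming a Tilemap reference and a player prefab (a sketch, with assumed names): log the tile anchor and compare it against your prefab's sprite pivot, or sidestep anchoring entirely by spawning at the cell center:

```csharp
using UnityEngine;
using UnityEngine.Tilemaps;

public class AnchorCheck : MonoBehaviour
{
    public Tilemap tilemap;
    public GameObject playerPrefab;

    void Start()
    {
        // The default tile anchor is (0.5, 0.5, 0), i.e. tiles render from
        // the cell center. If your prefab's pivot is elsewhere (e.g. bottom),
        // CellToWorld-based placement will look visually offset.
        Debug.Log("Tile anchor: " + tilemap.tileAnchor);

        // GetCellCenterWorld avoids anchor mismatches by targeting the
        // center of the cell directly.
        Vector3Int cell = new Vector3Int(-8, -2, 0);
        Instantiate(playerPrefab, tilemap.GetCellCenterWorld(cell), Quaternion.identity);
    }
}
```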