In my game, I’m using a pointy-top hex tilemap, and for several features I need to know the position, rotation and/or size of the edges of the hexagons. Specifically, I want characters to turn so that the direction they are facing is perpendicular to one of their tile’s edges, and I want certain NPCs to be able to build a wall on the edge between two hexagons. This would be easy to compute if the hexagons were regular (equal edges, corners spaced 60 degrees apart around the center), but the hexagonal tilemaps in Unity use a dimetric projection by default, so the angles and edge lengths are not all the same, which makes the math much harder to do by hand. Is there a way to ask the hex tile where its edges are, or at least to compute them efficiently?
With a pointy-top hexagon cell, starting from the top corner and going clockwise, the corners are at (0, 0.5), (0.5, 0.25), (0.5, -0.25), (0, -0.5), (-0.5, -0.25) and (-0.5, 0.25) from the center of the cell, in cell-local units (multiply by the Grid’s cell size). Each edge runs between two consecutive corners, so its midpoint is the average of those two corners, and its direction is the vector between them. You can convert these cell-local positions to world-space positions using the Tilemap/Grid coordinate conversions (e.g. Tilemap.CellToWorld gives you the cell center to offset from).
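To make the geometry concrete, here is a small sketch of that math in plain Python (the function name `edge_info` and the clockwise corner ordering are my own; the corner coordinates are the ones listed above). It computes each edge’s midpoint, length, and the outward-facing angle perpendicular to the edge, scaled by an assumed `cell_size` matching the Grid’s cellSize:

```python
import math

# Cell-local corners of Unity's pointy-top hex cell, clockwise from the top.
CORNERS = [(0.0, 0.5), (0.5, 0.25), (0.5, -0.25),
           (0.0, -0.5), (-0.5, -0.25), (-0.5, 0.25)]

def edge_info(cell_size=(1.0, 1.0)):
    """Per edge: midpoint, length, and outward-normal angle in degrees.

    cell_size mirrors Grid.cellSize; with Unity's default (1, 1) the
    hexagon is squashed, so edge lengths and facing angles are NOT uniform.
    """
    sx, sy = cell_size
    pts = [(x * sx, y * sy) for x, y in CORNERS]
    edges = []
    for i in range(6):
        x0, y0 = pts[i]
        x1, y1 = pts[(i + 1) % 6]
        dx, dy = x1 - x0, y1 - y0
        # Corners are clockwise, so rotating the edge vector by +90
        # degrees, i.e. (-dy, dx), points outward from the cell center.
        nx, ny = -dy, dx
        edges.append({
            "midpoint": ((x0 + x1) / 2, (y0 + y1) / 2),
            "length": math.hypot(dx, dy),
            "facing_deg": math.degrees(math.atan2(ny, nx)),
        })
    return edges
```

With the default cell size the upper-right edge faces about 63.4 degrees rather than 60, which is exactly the distortion described in the question; add the cell’s world-space center (from CellToWorld) to a midpoint to place a wall on that edge.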
If you want a Grid whose hexagonal cells have sides of equal length (a regular hexagon), you can adjust the Grid’s cell size so the width-to-height ratio is √3/2 (e.g. x ≈ 0.8660254 with y = 1).
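That ratio follows from basic hexagon geometry: a regular pointy-top hexagon of height 1 has a circumradius (and therefore side length) of 0.5, and its width between the two flat sides is √3 times the side length. A one-line check:

```python
import math

side = 0.5                   # side length = circumradius for a regular hexagon of height 1
width = math.sqrt(3) * side  # width between the two vertical sides
print(width)                 # → 0.8660254037844386
```

Using this value for the Grid’s cell size x makes every edge the same length and every facing direction a multiple of 60 degrees, so the per-edge math in the question becomes uniform.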