How accurately can Unity render terrain data?

I am using real-world data to create a height map. I am working from contour lines and creating a grey-scale height map by colouring in each consecutive ring with a different shade of grey. My question is how accurate I should make my grey-scale image: a contour every 0.5 of a meter, or a contour every 0.2 of a meter?

Unity's terrain engine uses a fixed number of discrete height steps, but you can choose the vertical distance over which those steps are spread. So, the taller your terrain, the coarser each height step becomes. If you set your terrain height to 1000m, the steps in the available resolution are spread over that distance and will be much larger than if your terrain height were set to 100m.

So ideally, you should set your terrain height to the height of your highest hill and normalize your heightmap data to match, so that the full available resolution is used.
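For example, here's a minimal sketch of that approach through the scripting API, assuming you already have your elevations in meters in a float[,] array (how you build that array from your contour data is up to you, so that part is just assumed here):

```csharp
using UnityEngine;

public class ApplyHeightmap : MonoBehaviour
{
    // Elevations in meters, sized [resolution, resolution].
    // Filling this from your contour data is assumed, not shown.
    public float[,] elevations;

    void Start()
    {
        TerrainData data = GetComponent<Terrain>().terrainData;

        // Find the highest point so the terrain height can match it exactly.
        float maxElevation = 0f;
        foreach (float h in elevations)
            if (h > maxElevation) maxElevation = h;
        if (maxElevation <= 0f) return;

        // Set the terrain's vertical size to the highest hill...
        Vector3 size = data.size;
        size.y = maxElevation;
        data.size = size;

        // ...and normalize the samples to the 0..1 range SetHeights expects,
        // so the full available resolution is used.
        int res = data.heightmapResolution;
        float[,] normalized = new float[res, res];
        for (int y = 0; y < res; y++)
            for (int x = 0; x < res; x++)
                normalized[y, x] = elevations[y, x] / maxElevation;

        data.SetHeights(0, 0, normalized);
    }
}
```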

For information about importing greyscale heightmaps into Unity, read here (at the bottom):

http://unity3d.com/support/documentation/Components/terrain-Height.html

EDIT:

It might be worth noting that a 16-bit greyscale image gives you 2^16 = 65,536 unique values. So, if your terrain height is set to 1000 and you're using the convention of 1 unit = 1 meter, this gives you a vertical resolution of roughly 1.5cm (1000m / 65,536 ≈ 0.015m).
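To make that arithmetic concrete, a trivial sketch (the 1000m height and the 16-bit source are just the assumptions from above):

```csharp
using UnityEngine;

public class HeightStepInfo : MonoBehaviour
{
    void Start()
    {
        float terrainHeight = 1000f;            // meters, 1 unit = 1 meter
        int steps = 1 << 16;                    // 65,536 distinct 16-bit values
        float stepSize = terrainHeight / steps; // 1000 / 65536 ≈ 0.015m
        Debug.Log("Vertical step: " + stepSize + " m (~1.5 cm)");
    }
}
```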

However, the scripting API exposes terrain heights as normalized 32-bit floats, so perhaps you could achieve greater resolution (if required) by manually reading and applying height data from some format other than 16-bit imagery.
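As a rough illustration of that idea, assuming a hypothetical raw file of 32-bit floats already normalized to 0..1 (the file name and layout are made up for this example):

```csharp
using System.IO;
using UnityEngine;

public class LoadRawFloatHeights : MonoBehaviour
{
    // Hypothetical file: resolution * resolution little-endian 32-bit floats in 0..1.
    public string path = "heights.f32";

    void Start()
    {
        TerrainData data = GetComponent<Terrain>().terrainData;
        int res = data.heightmapResolution;

        byte[] bytes = File.ReadAllBytes(path);
        float[,] heights = new float[res, res];

        for (int y = 0; y < res; y++)
            for (int x = 0; x < res; x++)
                heights[y, x] = System.BitConverter.ToSingle(bytes, (y * res + x) * 4);

        // SetHeights takes normalized floats, so precision here isn't limited
        // by a 16-bit source image.
        data.SetHeights(0, 0, heights);
    }
}
```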