Why does my .raw heightmap create ugly errors when loaded into a terrain?

Hey folks. I’m baffled. After watching this tutorial, I’ve been experimenting, trying to turn this image…

…into a Photoshop .raw (see attached)… then load that file onto a Terrain in Unity, only to end up with a spiky disaster:

It’s not totally unlike the one that appears in the linked tutorial video (above) at around the 16:30 mark. Crushing the range with Curves removes some of the spikiness, but I’m still left with what appears to be only a portion of the map, rendered as cards at different elevations (cool in a retro sort of way, but not what I’m looking for)…

I’m importing as 16-bit, 1025 width × 1025 height, with Windows byte order (having exported from Photoshop as IBM PC), onto a terrain sized x: 512, y: 128, z: 512.
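For anyone else poking at this: the byte-order setting just controls whether each 16-bit height’s two bytes are stored low-byte-first (Photoshop’s “IBM PC”, which matches Unity’s “Windows”) or high-byte-first (“Macintosh”). If the two settings disagree, every height gets scrambled. A tiny Python sketch with a made-up sample value, just to show the effect (the function name is mine):

```python
from array import array

def swap_raw_byte_order(data: bytes) -> bytes:
    """Byte-swap every 16-bit sample of a headerless .raw heightmap."""
    samples = array("H")     # unsigned 16-bit integers
    samples.frombytes(data)
    samples.byteswap()       # low byte <-> high byte for every sample
    return samples.tobytes()

# A gentle height of 1 stored as 00 01 reads back as 256 under the
# opposite byte order -- multiply that across a whole map and you get
# random spikes rather than rolling hills.
print(swap_raw_byte_order(bytes([0x00, 0x01])))  # b'\x01\x00'
```

Running a mismatched file through something like this (or just flipping the byte-order dropdown on import) is usually the cheapest first thing to try.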

Can anybody put me right? Please explain like I’m five.


Actually, I think I’ve solved the problem! The issue is… I’m not entirely sure what I did to solve it. So. Uh:

I was under the impression that Unity required 16-bit .raw files, but I think at some point in my experimentation I saved the file as an 8-bit .raw, which loaded into Unity mostly without complaint (spiky, but roughly the right shape of the country, and with that weird card effect seen above).
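Incidentally, I suspect that card effect is just 8-bit quantisation: an 8-bit .raw only has 256 possible height levels, so nearby 16-bit heights collapse onto the same step and read as flat terraces. A rough sketch of that collapse (the sample numbers are invented):

```python
def to_8bit_level(h16: int) -> int:
    """Collapse a 16-bit height (0-65535) onto its 8-bit level (0-255)."""
    return h16 >> 8  # keep only the top byte: 256 discrete steps remain

# Two noticeably different 16-bit heights land on the same 8-bit step,
# which renders as one flat "card" in the terrain:
print(to_8bit_level(30000), to_8bit_level(30200))  # 117 117
```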

I figured I shouldn’t look a gift horse in the mouth, so I exported the .raw from Unity. It came out as an 8-bit .raw that opened easily (and later saved with no complaints) in Photoshop, letting me use Curves to smooth out the hills and mountains a bit.

I’ll be honest, I kinda lucked out on this one.

Looking at this, what I think I’ll do in future is: build a terrain at my required size in Unity, add a few arbitrary elevation strokes, export the heightmap from Unity, edit it in Photoshop, then save and reimport it into Unity. That seems like the simplest way to ensure compatibility (and decent performance).
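One more trick, since a headerless .raw has no metadata: you can sanity-check what a file actually contains from its size alone — width × height bytes for 8-bit, double that for 16-bit. A small sketch (the 1025 default matches Unity’s usual 1025×1025 heightmap resolution; the function name is mine):

```python
import os

def guess_raw_depth(path: str, side: int = 1025) -> str:
    """Guess the bit depth of a square headerless .raw from its file size."""
    size = os.path.getsize(path)
    if size == side * side:
        return "8-bit"
    if size == side * side * 2:
        return "16-bit"
    return "unknown (maybe not %d x %d?)" % (side, side)
```

Checking this before importing would have told me straight away whether Photoshop had handed Unity an 8-bit or a 16-bit file.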

…Though I’d be happy to hear if anybody has a different suggestion/workflow!