Problem with creating a procedurally generated object

I’m trying to create an application that reads information from a database and creates buildings from it.
It’s in a very early stage of development (I guess it can’t be called development yet; it’s more like a research phase).

In the attached image, you can see a cube with some holes. What you are looking at are several different objects that look like one: the front face of the cube (the grey area) is one object, the red walls around the cube are another object, and the walls around each hole are 2 more objects.

What I’m having problems with are the red areas. Something is not quite right with them, and you can tell just by looking. I can’t tell if it is the normals, the UVs or something else that I’m forgetting.
Any suggestions on what I should look at?

Link to download my project if you want:
https://dl.dropboxusercontent.com/u/20794300/Sebastião%20Teste%202.rar
You will notice I’m using some big numbers for coordinates; that is because this will be used to create buildings in the future.
To control the camera, hold RMB to rotate and MMB to pan/drag the camera.
The script that creates the red walls around the cube is in Assets/Tools/Tools.CreateWalls

From your screenshot, that looks like an issue with the normals to me.

If you use a single-color material, you can rule out UVs. That is, with the color set in the shader, not by using a texture.
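If you’d rather set it up from code, something like this sketch works (it uses the built-in Unlit/Color shader; the class name is just made up):

```csharp
using UnityEngine;

public class SolidColor : MonoBehaviour
{
    void Start()
    {
        // A flat, texture-free material: any shading variation you still
        // see cannot be coming from the UVs.
        var mat = new Material(Shader.Find("Unlit/Color"));
        mat.color = Color.red;
        GetComponent<Renderer>().material = mat;
    }
}
```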

I’m not using any textures. I just create a material with a right-click and choose a color there. Not sure if this is setting the color on the shader.

You could print out those vectors and check whether they are what you are expecting.
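Something like this sketch would do it (the class name is made up; attach it to the object you want to inspect):

```csharp
using UnityEngine;

public class PrintNormals : MonoBehaviour
{
    void Start()
    {
        // Dump the mesh's normals to the console for inspection.
        Mesh mesh = GetComponent<MeshFilter>().mesh;
        Vector3[] normals = mesh.normals;
        for (int i = 0; i < normals.Length; i++)
            Debug.Log("normal[" + i + "] = " + normals[i]);
    }
}
```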

Yeah, that would be a single-color material, so it’s nothing to do with UVs. Pretty sure it’s the normals then.

You guys are right. I’m getting stupid values on my normals.
I’m getting { (0,1), (1,0), (0,-1), (-1,0) }
When i should be getting { (0,-1), (1,0), (0,1), (-1,0) }

Must find out what is going on

These values cannot be the normals. Normals are Vector3s; those are Vector2s. Maybe those are UVs.

Z is always 0, so I didn’t type it.

:roll_eyes: alright…
What’s wrong with these normals then?

Maybe this might help: Mesh.RecalculateNormals.

Done that already. It gives me worse normals. That’s why I tried to write my own normal-calculating script, which isn’t working.

Check your triangle winding. Unity defines a triangle’s front face by clockwise winding.
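The connection, in case it helps: a face normal is usually derived from the cross product of two triangle edges, and reversing the winding flips its sign. A tiny sketch (names made up):

```csharp
using UnityEngine;

public static class WindingDemo
{
    // Face normal of triangle (a, b, c); its direction depends on the winding order.
    static Vector3 FaceNormal(Vector3 a, Vector3 b, Vector3 c)
    {
        return Vector3.Cross(b - a, c - a).normalized;
    }

    public static void Demo()
    {
        Vector3 a = Vector3.zero, b = Vector3.right, c = Vector3.up;
        Debug.Log(FaceNormal(a, b, c)); // (0, 0, 1)  -- one winding
        Debug.Log(FaceNormal(a, c, b)); // (0, 0, -1) -- reversed winding flips the normal
    }
}
```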

You are right. I had my triangles CCW.
And it turns out my normals calculation algorithm is a mess. Jesus, I’m screwed.
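For reference, the standard approach is to sum each triangle’s face normal (edge cross product) into its three vertex slots and then normalize. A sketch of that technique, not the project’s actual script:

```csharp
using UnityEngine;

public static class NormalUtil
{
    // Standard smooth-normal recalculation: accumulate each triangle's
    // face normal into its three vertices, then normalize the sums.
    public static Vector3[] RecalculateNormals(Vector3[] vertices, int[] triangles)
    {
        var normals = new Vector3[vertices.Length];
        for (int i = 0; i < triangles.Length; i += 3)
        {
            int ia = triangles[i], ib = triangles[i + 1], ic = triangles[i + 2];
            Vector3 faceNormal = Vector3.Cross(
                vertices[ib] - vertices[ia],
                vertices[ic] - vertices[ia]);
            normals[ia] += faceNormal;
            normals[ib] += faceNormal;
            normals[ic] += faceNormal;
        }
        for (int i = 0; i < normals.Length; i++)
            normals[i] = normals[i].normalized;
        return normals;
    }
}
```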

I got a handy little script to show the normals from this website: http://catlikecoding.com/unity/tutorials/noise-derivatives/
After applying it, I seem to be missing some normals :S
In attached image A, you can see a screenshot I took. First, the face is turned to the opposite of the normal, and second, I seem to be missing some normals; the two red lines in image B are an example.
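(For anyone following along, a normals display like that can be as simple as the sketch below; this is a stand-in, not the actual catlikecoding script.)

```csharp
using UnityEngine;

// Draws each vertex normal as a yellow line in the Scene view.
public class ShowNormals : MonoBehaviour
{
    public float length = 0.5f;

    void OnDrawGizmos()
    {
        var filter = GetComponent<MeshFilter>();
        if (filter == null || filter.sharedMesh == null) return;

        Mesh mesh = filter.sharedMesh;
        Vector3[] vertices = mesh.vertices;
        Vector3[] normals = mesh.normals;

        Gizmos.color = Color.yellow;
        for (int i = 0; i < vertices.Length; i++)
        {
            // Transform from object space to world space before drawing.
            Vector3 p = transform.TransformPoint(vertices[i]);
            Vector3 n = transform.TransformDirection(normals[i]);
            Gizmos.DrawLine(p, p + n * length);
        }
    }
}
```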

What do you think?


I’m not sure what this “noise derivatives” script does…

Usually, I like to use shaders to check the normals of a model. If something is wrong, it will jump out at you.
The Unity shader tutorial contains an example that displays the normals:
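It looks roughly like this (a sketch of the idea, not the tutorial verbatim; each world-space normal component is remapped from [-1, 1] into an RGB channel):

```
Shader "Unlit/WorldSpaceNormals"
{
    SubShader
    {
        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            struct v2f
            {
                float4 pos : SV_POSITION;
                half3 worldNormal : TEXCOORD0;
            };

            v2f vert (appdata_base v)
            {
                v2f o;
                o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
                o.worldNormal = UnityObjectToWorldNormal(v.normal);
                return o;
            }

            fixed4 frag (v2f i) : SV_Target
            {
                // Remap each component from [-1, 1] to a displayable [0, 1].
                return fixed4(i.worldNormal * 0.5 + 0.5, 1);
            }
            ENDCG
        }
    }
}
```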

That’s the result:


That link creates a random terrain, but that is not what I wanted to show; it was only for the script that shows the normals.
Can you post two similar images, one with correct normals and one with incorrect normals, using the shader to identify them, so I know what to look for?


Couldn’t upload the images on my previous post (in edit mode). Had to create a new one.
I tried to use the same shader on my object and the attached images are the result. I have no idea what is wrong or right.


The colors encode the direction of the normal. The colors might seem random when you are not used to it, but after some practice you will “see” where the normals are pointing.

I have modified the Unity example to display the normal in view space (instead of world space), which I find more convenient when inspecting a model, since it makes the normals relative to the screen.

2639614--185749--reading-normal.png
So the color encodes the direction:

  • red means the normal is pointing to the right.
  • green means it’s pointing up.
  • blue means it’s pointing toward the screen.

Then you get a mix of colors for a mix of these basic directions.

Use the shader in the attachment on your model, then upload a screenshot and we will see whether something is wrong with the normals.

2639614–185748–Unlit-ViewSpaceNormal.shader (1.09 KB)
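(For anyone reading along without the attachment, the shader is along these lines; a minimal sketch, so the attached file may differ in details:)

```
Shader "Unlit/ViewSpaceNormal"
{
    SubShader
    {
        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            struct v2f
            {
                float4 pos : SV_POSITION;
                fixed3 color : COLOR0;
            };

            v2f vert (appdata_base v)
            {
                v2f o;
                o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
                // Transform the object-space normal into view space, then
                // remap each component from [-1, 1] to [0, 1] for display.
                float3 viewNormal = COMPUTE_VIEW_NORMAL;
                o.color = viewNormal * 0.5 + 0.5;
                return o;
            }

            fixed4 frag (v2f i) : SV_Target
            {
                return fixed4(i.color, 1);
            }
            ENDCG
        }
    }
}
```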

Thanks.

These ‘colors’ are all over the place. :S

2639634--185754--18-05-2016 10-36-45.png
2639634--185755--18-05-2016 10-36-15.png
2639634--185756--18-05-2016 10-36-31.png
2639634--185757--18-05-2016 10-37-05.png
2639634--185758--18-05-2016 10-35-47.png

There is definitely something wrong with the normals. The front face (the one with the holes) is OK.
But the others are really screwed up. You can see different colors on the same face, which means that the normals of the same face are pointing in different directions. For instance, in the first image the top face is facing down, therefore it should be colored plain purple (with the n * 0.5 + 0.5 remap, a downward view-space normal of (0, -1, 0) becomes RGB (0.5, 0, 0.5), i.e. purple). However, it is green and blue, which means it is pointing both up and toward the camera…

Well, at least now I know for sure what is wrong :slight_smile: