1.) What I want to realize is a simple inner glow effect in GLSL, but I can’t find any resources/links. I use Unity Pro — does anyone have any ‘easy to follow’ links, or could explain how to do this?
2.) Still no clue… What I’m searching for is shader code that colors the edge (line) between two vertices — and that across the whole model…
Like one tri has three vertices, with three edges (lines) connecting those vertices, the way you see it in 3D modelling software.
Actually, I think what I’m searching for is called a wireframe effect (not faked with lines drawn on a texture).
So there is no way to do it with a shader?
That’s a pity :-/ I have calculated some alpha values in the shader that should fade the wireframe as well.
You might be able to do it by clever texturing, where each(!) triangle has one vertex with texture coordinates (0,0), one vertex with texture coordinates (1,0) and one vertex with texture coordinates (0,1). (I guess it depends on the mesh whether that’s possible at all.) These texture coordinates could then look up into a texture which is transparent for all texels except those close to u=0, v=0, and u+v=1.
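A minimal GLSL fragment-shader sketch of that idea — assuming the per-triangle UV assignment described above has been done in the mesh data, and with `wireTex` as a hypothetical texture that is transparent everywhere except near u=0, v=0, and u+v=1:

```glsl
// Sketch only: relies on every triangle carrying UVs (0,0), (1,0), (0,1).
varying vec2 uv;           // interpolated per-triangle texture coordinates
uniform sampler2D wireTex; // hypothetical texture: opaque only near the
                           // lines u = 0, v = 0, and u + v = 1

void main()
{
    // The lookup paints the three triangle edges; everywhere else the
    // texel is transparent, so with alpha blending enabled only the
    // wireframe is visible.
    gl_FragColor = texture2D(wireTex, uv);
}
```

Whether the UV assignment is possible depends on the mesh, as noted — triangles sharing vertices may need those vertices duplicated so each triangle gets its own UVs.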
As of Unity 4, you can use the topology parameter of Mesh.SetIndices() to create a mesh that will render as lines or points. This allows you to use the graphics card to very efficiently draw a mesh, and is supported by Unity Free.
I actually don’t completely understand this.
If I did it with a texture, I would create a procedural texture, compute all lines/edges (each defined by two vertices), and write them onto the map.
My problem with this is that if the texture gets stretched, the lines stretch as well.
Does GLSL provide information about the offset of the current pixel to each vertex? That would totally solve my problem.
GLSL doesn’t provide offsets of the current pixel to each vertex. However, if you know that one vertex of each(!) triangle has texture coordinates (u,v)=(0,0), one vertex has (1,0) and one vertex has (0,1), then the interpolated texture coordinates give you those offsets.
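Under that same per-triangle UV assumption, the interpolated UV behaves like barycentric coordinates, so the edge test can even be done procedurally in the fragment shader with no texture at all. A hedged sketch (`lineWidth` is an invented uniform, and the UV-space distance is only an approximation of true screen-space edge distance):

```glsl
// Sketch only: assumes every triangle has UVs (0,0), (1,0), (0,1).
varying vec2 uv;          // interpolated per-triangle UVs
uniform float lineWidth;  // edge thickness in UV units (assumed, e.g. 0.02)

void main()
{
    // UV-space distance to the triangle's three edges:
    // u = 0, v = 0, and u + v = 1.
    float d = min(min(uv.x, uv.y), 1.0 - uv.x - uv.y);

    // Opaque on the edges, fading to transparent over lineWidth,
    // which also gives the fading alpha asked about above.
    float alpha = 1.0 - smoothstep(0.0, lineWidth, d);
    gl_FragColor = vec4(vec3(0.0), alpha);
}
```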
I think he means that if someone is using an unusual screen resolution, the texture might stretch when it’s rendered. I’ve seen it happen on lower-end graphics cards, where the rendered texture changes resolution (once in a while even on a high-end card). If this were done with a shader instead, it would render consistently, and even on a lower-end card it wouldn’t show this artifact — assuming the card supports shaders like that, which most should.