# Compute shaders not behaving as expected

Hi everyone, I’m new to Unity.

I have generated an icosphere. It has a Vector3 array (`vertecies`) that stores all the vertices. Now I want to make it round:

```csharp
void setHeight(){
    Planet planet = transform.root.gameObject.GetComponent<Planet>();
    for (int i = 0; i < vertecies.Length; i++){
        vertecies[i] = vertecies[i].normalized * planet.radius;
    }
}
```
![sphere.png](173180-sphere.png)
It works great. I now want to do the same by using a compute shader instead:
C#:

```csharp
Planet planet = transform.root.gameObject.GetComponent<Planet>();
ComputeBuffer buffer = new ComputeBuffer(vertecies.Length, sizeof(float) * 3);
buffer.SetData(vertecies);
buffer.GetData(vertecies);
buffer.Dispose();
```
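For context, between `SetData` and `GetData` the kernel also has to be bound and dispatched; a minimal sketch of that glue, assuming the ComputeShader asset is referenced as `shader`, the kernel is index 0, and the kernel runs 64 threads per group (all names and sizes assumed, not from the post):

```csharp
// Hypothetical dispatch glue (names assumed): bind the buffer and radius,
// then launch enough thread groups to cover one thread per vertex.
shader.SetBuffer(0, "vertecies", buffer);
shader.SetFloat("radius", planet.radius);
shader.Dispatch(0, Mathf.CeilToInt(vertecies.Length / 64f), 1, 1);
```

`Dispatch` counts thread *groups*, not threads, so the vertex count is divided by the group size declared in the shader's `[numthreads]` attribute.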
HLSL:

```hlsl
#pragma kernel CSMain

RWStructuredBuffer<float3> vertecies;
float radius;

[numthreads(64, 1, 1)]
void CSMain (uint id : SV_DispatchThreadID)
{
    float3 vertex = vertecies[id];
    float test = -abs(sqrt((vertex.x * vertex.x) + (vertex.y * vertex.y) + (vertex.z * vertex.z)));
    vertecies[id] = vertecies[id] * test * radius;
}
```
and the result looks like this:

As I turn the camera, it morphs and wraps around; the picture above looks different depending on the camera’s angle.

I thought I understood the concept of compute shaders, but apparently I don’t. Can anyone please explain why I get different results when using a compute shader instead of pure C#?

Turns out my formula was wrong. To convert the vector to a unit vector, I had to divide by the magnitude in the shader, not multiply.

So this:

`vertecies[id] = vertecies[id] * test * radius;`

becomes this:

`vertecies[id] = vertecies[id] / test * radius;`
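As an aside, HLSL has built-in `length()` and `normalize()` intrinsics (`normalize(v)` is `v / length(v)`), so the manual square root isn't needed at all. This sketch also avoids the sign issue in the original `test`: since `test = -abs(...)` is negative, dividing by it mirrors each vertex through the origin, which `normalize()` does not do:

```hlsl
// Equivalent kernel body using the normalize() intrinsic:
// projects the vertex onto the sphere of the given radius.
vertecies[id] = normalize(vertecies[id]) * radius;
```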