GameObject disappears when velocity is set to (near) zero.

I’m using Unity 5.1.0f3. I have a simple GameObject called “Ball” with a non-kinematic rigidbody and a simple script attached.
I know that I shouldn’t modify the rigidbody’s velocity directly, but this is a very simple scenario (a Pong-like game). At some point (let’s say after a 3-second delay), an external event occurs that triggers the ball to start moving by calling the following function:

void Run()
{
    // Launch the ball with a random x/y velocity and a fixed negative z speed.
    rb.velocity = new Vector3(Random.Range(-BallSpeed, BallSpeed), Random.Range(-BallSpeed, BallSpeed), -BallSpeed);
    running = true;
}

My FixedUpdate looks like this:

void FixedUpdate()
{
    if (!running)
    {
        return;
    }
    
    // Keep the z speed at ±BallSpeed, preserving its current sign.
    rb.velocity = new Vector3(rb.velocity.x, rb.velocity.y, rb.velocity.z < 0 ? -BallSpeed : BallSpeed);
}

So far it works OK. However, when I RESET the ball’s velocity to zero (by assigning rb.velocity = Vector3.zero in an event handler), after around 0.5 seconds the whole GameObject is destroyed. It also disappears from the Hierarchy. I inserted a Debug.Log in OnDestroy() and it is actually called, yet I don’t call Destroy anywhere in my scripts. Moreover, if I reset the velocity not to zero but, for example, to (0.4f, 0.4f, 0.4f), it works fine (although of course the ball keeps moving a little)! 0.3f seems to be the limit value.
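
To be concrete, the reset boils down to something like this (names simplified, and the running flag reset is only my best guess at the relevant parts):

void ResetBall() // simplified stand-in for the actual event handler
{
    rb.velocity = Vector3.zero; // resetting to (0.4f, 0.4f, 0.4f) instead avoids the problem
    running = false;            // so FixedUpdate stops forcing the z speed
}

void OnDestroy()
{
    // Fires roughly 0.5 seconds after the reset, even though no script calls Destroy.
    Debug.Log("Ball OnDestroy called");
}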

Why is my whole GameObject destroyed after I set the rigidbody’s velocity to zero or almost zero? How can I debug this kind of behavior?

This is a little late, but it might still be of use to someone. I’ve just encountered the same problem whilst making a Breakout clone (as a way to test out the engine) and had the ball disappearing whenever I reset its speed and position. I eventually tracked it down to the Trail Renderer component I had added to my ball. Specifically, it has a property called Autodestruct: with it set to true, the ball would disappear; changing it to false fixed the problem. As far as I can tell, Autodestruct destroys the GameObject once it has been stationary for the trail’s lifetime, which would also explain the short delay before it vanishes and why a small non-zero velocity keeps it alive.
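
If you would rather switch it off from code than in the Inspector, the same flag is exposed as TrailRenderer.autodestruct, so something like this on the ball should work:

void Awake()
{
    // Disable the trail's Autodestruct so the ball isn't destroyed once it stops moving.
    TrailRenderer trail = GetComponent<TrailRenderer>();
    if (trail != null)
    {
        trail.autodestruct = false;
    }
}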