So, I’m really new to scripting and I’m lost. I’ve got a slider that counts up from -500 to 1000, and I can control how fast it fills using mainSlider.value += 5.0f;, which is great, but I need its speed to depend on how quickly its parent is scaling down in size.
I have the scaling-down object working how I like, with a minSize, a scaleRate, and a current scale (scale). Here’s the code for that.
public float minSize;
public float scaleRate;
public float scale;

private void Update()
{
    // Apply the current scale, then shrink it a little every frame.
    transform.localScale = Vector3.one * scale;
    scale -= scaleRate * Time.deltaTime;

    // Once it has shrunk below the minimum size, the object is destroyed.
    if (scale < minSize) Destroy(gameObject);
}
And here’s the code for the slider going from min to max over time, using 5.0f as a placeholder for now.
public Slider mainSlider;
public int minValue;
public int maxValue;

private void Start()
{
    // Start the slider at its minimum value.
    mainSlider.value = minValue;
}

private void Update()
{
    // Placeholder: bump the value by a fixed amount every frame.
    if (mainSlider.value <= maxValue)
    {
        mainSlider.value += 5.0f;
    }
}
All I need is for the += 5.0f not to be hard-coded like that: the slider should sit at minValue when the parent object is at its start size, and reach maxValue when the parent object reaches its minSize.
I’ve been playing around with Mathf.Lerp where the 5.0f is, but I can’t figure out how to get the t parameter to scale the slider value properly; if I pass in the scale from the parent object it just forces the value to 0.
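In case it makes the question clearer, this is roughly the direction I think it needs to go: use Mathf.InverseLerp to turn the parent’s scale into a 0-to-1 t, then Mathf.Lerp that onto the slider range. The names here (Shrinker for the parent’s scaling-down script, startScale for the size it begins at) are just placeholders I’m using for this post, not anything that exists yet, and I’m not at all sure this is right:

    using UnityEngine;
    using UnityEngine.UI;

    // Rough sketch only: drive the slider from the parent's shrinking scale.
    // "Shrinker" stands in for the scaling-down script above.
    public class SliderFromScale : MonoBehaviour
    {
        public Slider mainSlider;
        public Shrinker shrinker;   // the parent's scaling-down script
        public int minValue;
        public int maxValue;

        private float startScale;

        private void Start()
        {
            startScale = shrinker.scale;   // remember the parent's starting size
            mainSlider.value = minValue;
        }

        private void Update()
        {
            // The parent destroys itself at minSize, so stop once it's gone.
            if (shrinker == null) return;

            // t goes from 0 at the start size to 1 when the parent hits minSize.
            float t = Mathf.InverseLerp(startScale, shrinker.minSize, shrinker.scale);
            mainSlider.value = Mathf.Lerp(minValue, maxValue, t);
        }
    }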
Any thoughts? This is the first time I’ve gotten properly stuck.