How does Time.timeScale affect time?

Does Unity keep track of system time and then multiply it by timeScale to define Time.time and, by extension, Time.deltaTime? If a frame took, say, 0.016 actual seconds and timeScale were set to 0.5, would Time.deltaTime give 0.008? I know that different timeScale values are used for slow motion and speed-ups, but I need to be sure exactly how it is implemented.
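
For example, here is a minimal sketch of how one could check this, comparing Time.deltaTime against a real frame time measured with Time.realtimeSinceStartup (which timeScale does not affect):

```
// Minimal sketch: compare scaled Time.deltaTime with the real frame time,
// measured through Time.realtimeSinceStartup (which timeScale does not touch).
private var lastReal : float = 0.0;

function Update () {
    var realDelta : float = Time.realtimeSinceStartup - lastReal;
    lastReal = Time.realtimeSinceStartup;
    // If deltaTime = real frame time * timeScale, then at timeScale 0.5
    // the first number should be roughly half of the second one.
    print ("scaled: " + Time.deltaTime + "  real: " + realDelta + "  timeScale: " + Time.timeScale);
}
```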

Time.deltaTime and Time.time are functions of timeScale: Unity multiplies the real elapsed time by timeScale. As you said, if one frame takes 0.1 seconds of real time and timeScale is 0.5, you'll get 0.05 as Time.deltaTime. This is how Unity makes all frame-rate-independent code run faster or slower. Keep in mind that it affects Time.time too. FixedUpdate is also driven by scaled time: it is scheduled every fixedDeltaTime seconds of game time, so at timeScale 0.5 it runs half as often per real second, and if you set Time.timeScale to 0 no FixedUpdate calls are executed at all. If you want to know which variables depend on timeScale, the documentation says
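
A rough way to see the FixedUpdate side of this (just a sketch; the expected counts assume the default fixed timestep of 0.02, i.e. about 50 FixedUpdate calls per second at timeScale 1):

```
// Sketch: count FixedUpdate calls per real second at the current timeScale.
private var fixedCalls : int = 0;
private var lastReport : float = 0.0;

function FixedUpdate () {
    fixedCalls++;   // stops increasing completely while timeScale is 0
}

function Update () {
    if (Time.realtimeSinceStartup - lastReport >= 1.0) {
        // With the default fixedDeltaTime of 0.02, expect ~50 calls at
        // timeScale 1 and ~25 at timeScale 0.5.
        print (fixedCalls + " FixedUpdate calls in the last real second");
        fixedCalls = 0;
        lastReport = Time.realtimeSinceStartup;
    }
}
```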

Except for realtimeSinceStartup, timeScale affects all the time and delta time measuring variables of the Time class.

Use this script:

```
function Update () {
    print (Time.deltaTime);
}
```

Run that, observe the numbers, change timeScale as desired, run again, compare results.
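
If you'd rather not restart the scene for every value, here is a sketch that cycles timeScale through a few values in a single run (the key and the list of values are just an example):

```
// Sketch: press T to cycle timeScale through a few values while playing.
private var scales : float[] = [1.0, 0.5, 2.0, 0.0];
private var index : int = 0;

function Update () {
    if (Input.GetKeyDown (KeyCode.T)) {
        index = (index + 1) % scales.Length;
        Time.timeScale = scales[index];   // Update keeps running even at 0,
                                          // so you can cycle back out of the pause
    }
    print ("timeScale: " + Time.timeScale + "  deltaTime: " + Time.deltaTime);
}
```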

I am having a weird issue where Time.timeScale is 0 but Time.deltaTime is not 0?