The idea is, I want something to happen in 3 seconds (3000 ms). If I just subtract a fixed amount (say 10) every time I update, the counter would look like
3000,2990,2980,2970,2960 etc.
But some frames take longer to draw than others (say 12 ms) and some take less, because of faster or slower computers, or because the complexity of the scene changes. So a fixed time step is wrong. Instead of always subtracting 10, we should subtract how long we think it will take to draw the next frame. A good guess for how long the next frame will take is how long the last frame took.
So if our frames took 10ms, 12ms, 15ms, 8ms, 9ms our counter would look like
3000, 2990, 2978, 2963, 2955
The steps become uneven because the frame rate is uneven, which is exactly what we want.
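As a minimal sketch of the countdown above (in Python, with made-up names; a real game loop would measure each frame's duration with a clock instead of taking a list):

```python
def run_countdown(total_ms, frame_times_ms):
    """Count down from total_ms, subtracting each frame's
    measured duration rather than a fixed step."""
    remaining = total_ms
    history = [remaining]
    for frame_ms in frame_times_ms:
        # Subtract how long the last frame actually took.
        remaining -= frame_ms
        history.append(remaining)
    return history

# Frames that took 10, 12, 15 and 8 ms:
print(run_countdown(3000.0, [10.0, 12.0, 15.0, 8.0]))
# → [3000.0, 2990.0, 2978.0, 2963.0, 2955.0]
```

In a real loop you would call something like `time.perf_counter()` at the top of each frame and subtract the difference from the previous reading.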
If you use the frame time in all your calculations (rotation, speed, acceleration, timers, etc.), then you will always stay in time with the real clock, even if the machine only manages 10 fps.
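The same trick applies to movement: scale speed by the frame time, and an object covers the same distance per real second whatever the frame rate. A small illustration (hypothetical names, speed given per millisecond):

```python
def advance(position, speed_per_ms, frame_ms):
    # Scale movement by the last frame's duration so the speed
    # is consistent regardless of frame rate.
    return position + speed_per_ms * frame_ms

# Two 10 ms frames move the object exactly as far as one 20 ms frame:
fast = advance(advance(0.0, 0.5, 10.0), 0.5, 10.0)
slow = advance(0.0, 0.5, 20.0)
print(fast, slow)
# → 10.0 10.0
```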
Jim