Hello, I’m new to Godot and I recently learned about signals and when to use signals vs _process.
I’m working on a small project (tower defence) where a tower shoots every X seconds via the Timer’s timeout signal (connected to _on_timer_timeout), but when I measured the actual time between shots (with Time.get_ticks_msec()) I realized that, depending on the value of X, I get inaccuracies of about 3-16 milliseconds.
Initially I thought that maybe the print() function was causing the problem, but even if I don’t print the values I get the same inaccuracies (see code below).
Minimal reproduction project: Single Timer node with the following script:
extends Timer

var duration : float = 1.5
var time_between_shots = []  # timestamps of each timeout, in msec
var counter = 0

func _ready():
    timeout.connect(_on_timeout)
    wait_time = duration
    start()  # (or enable Autostart in the Inspector)

func _on_timeout():
    counter += 1
    time_between_shots.append(Time.get_ticks_msec())
    if counter > 10:
        stop()
        # print the differences between consecutive timestamps
        for index in range(10):
            print(time_between_shots[-index-1] - time_between_shots[-index-2])
For example, for duration = 1.5 I get
1497
1503
1497
1503
1509
1491
1497
1503
1496
1509
Why is this happening? I guess it has to do with either floating point arithmetic or maybe fps. And is there a more reliable way of running a script every X seconds?
Thanks.
Timers are not 100% accurate; even the documentation tells you to avoid them for precision tasks. They’re fine for casual things like disabling a button for 3 seconds, but not for precision-critical logic at the core of your game.
I still try to avoid timers: it’s one less node in the scene, one less callback, and less code. In general I do this.
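A minimal sketch of that pattern, accumulating delta in _process and firing once a threshold is reached (shoot_interval and _shoot are just placeholder names):

extends Node2D

var shoot_interval : float = 1.5  # seconds between shots
var _time_left : float = 0.0

func _ready():
    _time_left = shoot_interval

func _process(delta):
    _time_left -= delta
    if _time_left <= 0.0:
        # carry over the overshoot so the average interval stays correct
        _time_left += shoot_interval
        _shoot()

func _shoot():
    pass  # actual shooting logic goes here

Individual intervals are still quantized to frame boundaries, but because the overshoot is carried over, the intervals average out to shoot_interval over time.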
Hmm, ok, so it seems like those errors are inevitable, but at least the average time between timeouts is roughly correct, which is good enough for what I want.
However, in my actual project (the script above was just to illustrate the precision of the Timer node by itself) the time between timeouts is always 1507ms or more.
Solved: I was using a one-shot Timer that I restarted manually on each timeout, and that was adding 8-16 ms of delay. I switched to one_shot = false and now it works flawlessly.
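Roughly what the change looks like (a minimal sketch based on the description above, with placeholder names, not the full project script):

extends Timer

func _ready():
    wait_time = 1.5
    one_shot = false  # the Timer re-arms itself, no manual restart needed
    timeout.connect(_on_timeout)
    start()

func _on_timeout():
    # no start() call here anymore; the Timer keeps running on its own
    shoot()

func shoot():
    pass  # actual shooting logic goes here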
Great! Could you share your final script? I’ve always had trouble trying to make timers precise, which is why I prefer to code them myself.
Also, whoever bumps into this thread looking to solve a timer precision issue could then see how you solved it.
Even with your code (removing the stop() call and setting one_shot = false), it still doesn’t work for me.
Also, a word of caution.
Maybe you have a very powerful computer, but it seems to depend on the performance of your game (which can differ from computer to computer). I have a fairly decent PC (bought about 3 years ago) and I still have issues. If I don’t do anything, it stabilizes around 1500. However, as soon as I drag the window around, differences start appearing. And this is an empty scene, with no images or objects; imagine a full game scene being rendered.
Again, be careful with timers.
Look at how after 10 seconds of standing still everything stabilizes to 1500, but as soon as I start moving the window around it begins fluctuating again (only slightly, but it’s not the precision you were looking for).
My plan to avoid that is to use notifications, so that if the player drags the window, the window loses focus, etc., the game is automatically paused.
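A minimal sketch of that idea, assuming Godot 4’s application focus notifications (which constants you need may vary by platform and version):

extends Node

func _ready():
    # keep this node processing even while the tree is paused
    process_mode = Node.PROCESS_MODE_ALWAYS

func _notification(what):
    match what:
        NOTIFICATION_APPLICATION_FOCUS_OUT:
            get_tree().paused = true   # pause when the window loses focus
        NOTIFICATION_APPLICATION_FOCUS_IN:
            get_tree().paused = false  # resume when focus comes back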
Also, it seems like the weird behaviour caused by dragging the window around or simply interacting with the title bar is a confirmed bug (see here).
Hopefully it gets fixed soon.