I’m running into issues regarding the use of Time.get_ticks_usec() in _physics_process(). It makes me wonder if I have a fundamental misunderstanding of the process functions.
My understanding was that _process and _physics_process operate independently and that Time.get_ticks_usec() returns time much like Python’s perf_counter() function. If that’s the case, then calling get_ticks_usec() within _physics_process() should return times spaced in step with the _physics_process() ticks, but it doesn’t:
```gdscript
# Member declarations (not shown in the original post):
var start_time := Time.get_ticks_usec()
var delta_time := 0.0
var index := 0
var times := PackedInt64Array()
var delta_times := PackedFloat64Array()

func _ready() -> void:
	times.resize(720)   # 720 samples = 2 s at 360 ticks/sec
	delta_times.resize(720)

func _physics_process(delta: float) -> void:
	if index < 720:
		times[index] = Time.get_ticks_usec() - start_time
		delta_time += delta
		delta_times[index] = delta_time
		index += 1
	else:
		completed_timer()
		queue_free()

func completed_timer() -> void:
	for time in times:
		print("Time: %d" % time)
	for delta_time in delta_times:
		print("Delta Time: %f" % delta_time)
```
The physics tick rate is set to 360. You can see in the results image that the delta time steps are consistent (1/360), but that Time.get_ticks_usec() shows large 0.013-second jumps, which would correspond to a tick rate of 75, with much smaller intervals between those jumps.
Why?
I would greatly appreciate any pointers or insight you have into the function or the way the engine works (or my small snippet of code) that’s led to this behavior. Thanks in advance.
My guess is that the Time values are not updated at the same tick rate as the physics tick, so there’s no ensured constant delta for microseconds. I don’t know when exactly Time updates, but it might be the same as rendered frames or _process. This doc article has more information.
Okay, I think I see what you mean. Thanks for that link.
I expected it to return a precise, raw value every time it was called regardless of where but if it’s tied to _process in some way then it makes sense to me that the values returned by calling it from _physics_process are interpolated.
Though I guess I still don’t quite understand why the values are clustered rather than being consistently spaced?
I tried to look into how Time is implemented in the source code but with absolutely zero understanding of C++ I did not make it far lol.
It does return a precise value when called.
Interpolation is not really relevant here.
What you are seeing is expected, and is because your physics ticks are being processed in ‘groups’ (one immediately after the other) because the tick rate is higher than the frame rate.
An example, to hopefully provide more clarity:
60 FPS, 360 TPS
Would produce a ‘trace’ that looks something like:
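Something like this (timestamps illustrative; the key point is that the 6 ticks per frame run back-to-back, then nothing happens until the next frame):

```
t ≈  0.0 ms : ticks 1–6 run back-to-back (a few µs apart)
              (render / wait for vsync)
t ≈ 16.7 ms : ticks 7–12 run back-to-back
              (render / wait for vsync)
t ≈ 33.3 ms : ticks 13–18 run back-to-back
...
```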
Time.get_ticks_usec() is the most accurate time you can get. The engine internally uses that call for all its timing. It’s the “real” time.
The results you’re getting are likely because you’re trying to run physics ticks faster than frames are, or can be, rendered. So you get multiple _physics_process() calls during a single step of the engine’s main loop.
For comparison, try doing the same test from _process()
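A minimal version of that comparison test might look like this (assuming the same start_time setup as the original snippet):

```gdscript
# Sketch: the same measurement, but from _process(); here successive
# readings should be spaced by roughly one frame time instead of clustering.
var start_time := Time.get_ticks_usec()

func _process(_delta: float) -> void:
	print("Time: %d" % (Time.get_ticks_usec() - start_time))
```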
It’s the real time obtained from the operating system. The engine doesn’t “update” it; rather, it depends on it. You can’t get more accurate than that time.
Thank you and @normalized for the additional replies but now I’m more confused lol.
Just to be clear, are you guys saying that physics will execute its ticks, wait for the corresponding rendered frame (in this case, frame 5), and then execute again once _process() is completed?
Because my confusion comes from the fact that the _physics_process() delta is consistently 0.00277 seconds, suggesting consistent execution intervals, but that Time.get_ticks_usec() is returning irregularly spaced values. The function placed in _physics_process() returns:
```
Time: 1_956659   # intervals:
Time: 1_969888   # ~0.013s gap
Time: 1_969911   # ~0.000023s gap
Time: 1_969925   # ~0.000014s gap
Time: 1_969938   # ~0.000013s gap
Time: 1_969951   # ~0.000013s gap
Time: 1_983196   # ~0.013s gap
```
The .013s gap always exists even when physics rate = FPS (which makes sense, 75 FPS), but the smaller intervals are introduced when physics rate is increased. I would have expected those intervals to be regular (0.00277 just like delta) when physics rate is increased, if that makes sense.
You are assuming that the physics ticks are evenly divided over time. They are not. They are tied to the framerate that can actually be rendered by your computer. If you have a really good computer and video card, you can get up to 240 fps, but based on your results, you are running at 60 fps.
This means that in 1 second, running at 360 ticks/sec with 60 frames/sec, you are running 6 ticks/frame. Which is why you are seeing what you are seeing.
Don’t do that. Use delta.
Your entire exercise is essentially an attempt to create a timer, for which there are already three built-in ways that sync up with both _process() and _physics_process().
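For reference, the three built-in options are presumably these (all real Godot 4 APIs; the 2.0-second duration is just an example):

```gdscript
# 1. A Timer node (here assumed to be a child named "Timer"):
func use_timer_node() -> void:
	$Timer.wait_time = 2.0
	$Timer.start()
	await $Timer.timeout

# 2. A one-shot SceneTreeTimer:
func use_scene_tree_timer() -> void:
	await get_tree().create_timer(2.0).timeout

# 3. Accumulating delta yourself:
var elapsed := 0.0
func _physics_process(delta: float) -> void:
	elapsed += delta
	if elapsed >= 2.0:
		elapsed = 0.0
		print("two seconds of simulated time have passed")
```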
If you were doing this as an academic project, I’d refer you to the C++ code, but you’ve already been there.
If you’re doing this because you want to implement something, I would suggest you tell us what you’re actually trying to accomplish and what led to this experiment.
Know that Time.get_ticks_usec() is the ground truth. Whatever it reports is true. The time you get from it is the real time, and the delta in _physics_process() is lying to you, to make you feel better and so that you can do your simulation calculations. I could go into details…
As @dragonforge-dev said, the engine is not spacing the calls out the way you expect. Yes, it runs the physics tick(s) between frames, as many times as needed to ‘catch up’ (each one worth 1.0 / tick_rate, which is the delta you get and which never changes). Any time left over is spent either blocking on VSync, or, I presume, with the engine explicitly sleeping if VSync is disabled (according to max_fps, and assuming there is time left over).
In a nutshell: The engine is under no obligation to space your physics processing out to match the delta, and it’s not.
So, follow dragonforge’s advice: use the delta. Time.get_ticks_usec() might be useful to measure execution time in certain cases where running the profiler is undesirable, but don’t use it for anything game related, that is exactly what the delta is for.
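For instance, a quick manual-timing sketch of that profiling use case (expensive_section() is a hypothetical placeholder for whatever you want to measure):

```gdscript
func measure() -> void:
	var t0 := Time.get_ticks_usec()
	expensive_section()  # hypothetical function being timed
	var t1 := Time.get_ticks_usec()
	print("expensive_section took %d usec" % (t1 - t0))
```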
I know to use delta so no problem there. I just stumbled across Time (while building a timer) and was curious about how it worked.
Then the uneven intervals made me ask, because I figured knowing why Time was behaving that way would offer some insight into how the engine worked… and it did! (I think)
The technical purpose of a game (engine) is to incessantly render frames and display them for your viewing pleasure. It runs in an infinite while loop and just pumps out frames as fast as possible (or desired)
Suppose we’re vsynced at 60 fps, and we have a blank screen. The engine will display a blank frame in 16 millisecond intervals. Let’s call this interval the “frame time”:
Now if we have something to render, the computer will need to do some work inside the frame time to produce the pixels that will be displayed when the next vsync interrupt arrives. Let’s suppose the rendering work takes up 6 milliseconds of the frame time:
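Schematically, using the numbers from the example above:

```
|-- render 6 ms --|--------- idle 10 ms ---------|
|<--------------- one 16 ms frame -------------->|
```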
After the rendering work has been finished, the computer will wait for another 10 ms until it displays the result of its work, i.e - it’ll be idle until the next vsync interrupt.
Let’s now suppose that we want to render the results of a physics simulation that runs at 240 ticks per second and that we want to run the simulation precisely synchronized with real time. And let’s suppose that the engine internal time to run the simulation is 1 ms and your callback function takes another 1 ms, resulting in 2 ms of work to run one tick of the simulation:
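If we tried to run each tick exactly at its real-time deadline (every ~4 ms at 240 TPS; numbers illustrative, with each h being the ~2 ms of simulation work):

```
| render.. h render.. h render.. h render.. h |
0 ms       4 ms       8 ms      12 ms     16 ms
```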
Doing it like this, we have two problems. First, we’ll need to interrupt whatever the rendering code is doing at the time, just to run our simulation tick, and second the rendering code wouldn’t know what to render because the simulation state would change in the midst of the rendering job.
So the solution is to first run the simulation, and then render it, pretending that simulation step time (delta) advances in regular intervals as if it was executed in real time with desired ticks per second. In other words, the execution of the simulation code is not synchronized with the time the simulation simulates:
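With each h marking the ~2 ms of work for one simulation tick, one 16 ms frame then looks roughly like:

```
| h h h h |----- render 6 ms -----| idle 2 ms |
0         8 ms                   14 ms      16 ms
```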
As you can see, in the first 8 ms of the frame time we execute the whole 16 ms worth of simulation, with uniform deltas, as if we were running a tick every 4 ms. Then we render the results of that simulation, and then we wait out the leftover 2 ms for the vsync interrupt to arrive and display our results on the screen.
Note that your _physics_process() gets called at each h in the diagram, but with a delta value corresponding to the evenly spaced h’s of the previous diagram. So if you print the real time at each tick, it will appear to move in small increments for 4 ticks and then one big increment as the frame is completed. Rinse and repeat. Just like you’ve seen in your tests.
To verify this, you can alter your test code a bit: in each physics tick, measure the difference between the real time obtained from Time.get_ticks_usec() and the time accumulated via delta:
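A sketch of that measurement (variable names are mine, not from the original post):

```gdscript
var start_usec := Time.get_ticks_usec()
var sim_time := 0.0   # time accumulated from delta
var tick := 0

func _physics_process(delta: float) -> void:
	sim_time += delta
	# Real elapsed time in seconds, from the OS clock.
	var real := (Time.get_ticks_usec() - start_usec) / 1_000_000.0
	print("tick %d: delta-real = %.0f ms" % [tick, (sim_time - real) * 1000.0])
	tick += 1
```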
```
-> display frame 23
tick 94: delta-real = -5 ms
tick 95: delta-real = -1 ms
tick 96: delta-real = 2 ms
tick 97: delta-real = 6 ms
-> display frame 24
tick 98: delta-real = -4 ms
tick 99: delta-real = 0 ms
tick 100: delta-real = 3 ms
tick 101: delta-real = 7 ms
-> display frame 25
tick 102: delta-real = -4 ms
tick 103: delta-real = 0 ms
tick 104: delta-real = 3 ms
tick 105: delta-real = 7 ms
-> display frame 26
```
We can see that the time accumulated through delta consistently dances around the real time within the frame span.
If we falsely assume that delta represents the real time interval, we’ll equally falsely conclude that there’s something wrong with Time.get_ticks_usec(). But in fact the opposite is true: delta is not tied to real time. It’s just the in-simulation time.
If you wish we can visit the key parts in Godot’s source code to verify that this is indeed what’s happening.
It’s ok to build your own as long as it accumulates using delta. If you wish to use Time.get_ticks_usec() then it should accumulate only in _process() as its delta is more or less in sync with real time.
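Concretely, either of these accumulation patterns should stay well-behaved (a sketch):

```gdscript
# Option 1: accumulate in-simulation time via delta
# (safe in either process callback).
var sim_elapsed := 0.0
func _physics_process(delta: float) -> void:
	sim_elapsed += delta

# Option 2: real elapsed time, read only from _process().
var start_usec := Time.get_ticks_usec()
func _process(_delta: float) -> void:
	var real_elapsed := (Time.get_ticks_usec() - start_usec) / 1_000_000.0
```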
It’s also important to read the room and understand when it’s appropriate to “smile”. I put an hour into writing that post and you’ve just dismissed it with some stupid meme.