Querying texture memory

Godot Version

Godot v4.3.stable

Is there a way within Godot to determine how much texture memory is available? My game currently uses relatively high-res textures, and while I’d like to keep those textures for machines that can handle them, I’d also like to be able to detect low-memory GPUs so I can load lower-res textures instead.

My game is using about 2 GB of texture space right now, and I have a laptop (a 2015 MacBook running Linux with Intel Iris integrated graphics…) that runs out of texture memory during asset load. The game still runs, but badly, and largely untextured.

I can put in a settings switch to let the user choose their detail level, but I’d also like to make sure that if a machine simply can’t handle a detail level, I don’t try to load at that level. I could detect load failures, bail, and reload everything at lower res, but I’d prefer to call some sort of get_total_gpu_texture_memory() and compare it to a test value ahead of time, rather than throwing away and redoing a bunch of loading when it fails partway through.

I see a 2021 GitHub issue for this (#55787), and though the associated PR seems to have been merged, RenderingServer.get_video_adapter_total_memory() isn’t listed in the docs.

Is there a way to do this short of modding Godot?

Textures aren’t the only resources that use VRAM. Meshes, buffers, shaders, pipelines, etc. will also use VRAM.

There’s no way to get the total memory (apart from the PR you found).

You can get the total memory currently used with Performance.get_monitor() like: Performance.get_monitor(Performance.RENDER_VIDEO_MEM_USED)
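For example, polled from any node (a minimal sketch; the monitor constants are from the Godot 4 Performance class, and values are reported in bytes):

```gdscript
extends Node

func _process(_delta: float) -> void:
	# Total VRAM currently used by the renderer, in bytes.
	var video_mem := Performance.get_monitor(Performance.RENDER_VIDEO_MEM_USED)
	# The slice of that used by textures specifically.
	var texture_mem := Performance.get_monitor(Performance.RENDER_TEXTURE_MEM_USED)
	print("VRAM used: %.1f MiB (textures: %.1f MiB)" % [
		video_mem / 1048576.0, texture_mem / 1048576.0])
```

Note this only tells you what’s already used, not what’s available.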

In my particular case, I’ve got relatively low-poly models, I’m using the built-in shaders for the most part, and I’m not doing anything particularly complex with pipelines. Those costs will be there, but they’re a single-digit percentage of GPU memory use. What I’m hoping for is a simple rule of thumb I can use:

```gdscript
use_low_res_textures = GPU.vram() < GiB(2.0)
```

The texture size isn’t a perfect measure, but it would be a close-enough heuristic if I could get access to it. At worst, someone with a borderline card winds up with low-res textures when the card could have handled high res had I done all the math precisely.

The VRAM monitor in the debugger says the game is using about 1.75 GB of texture memory, so if I halve or quarter the resolution of all the UV textures and the various normal/roughness/metallic maps (which account for the vast majority of that), I’d cut texture memory usage to a quarter or a sixteenth, which ought to be manageable for even embedded GPUs.
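Spelling out that arithmetic for a single uncompressed 4K RGBA texture (illustrative numbers, not measured from the actual project):

```gdscript
# Uncompressed RGBA8: width * height * 4 bytes, plus roughly 1/3 extra for mipmaps.
const BYTES_4K := 4096 * 4096 * 4 * 4 / 3   # ~85 MiB
const BYTES_2K := 2048 * 2048 * 4 * 4 / 3   # ~21 MiB, a quarter of the 4K cost
const BYTES_1K := 1024 * 1024 * 4 * 4 / 3   # ~5.3 MiB, a sixteenth of the 4K cost
```

Since memory scales with the square of the resolution, each halving step divides the texture cost by four.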

The alternative is to try loading the high-res textures and, if I get any allocation failures, evict everything and reload it all with low-res textures. I’d prefer not to do it this way if I can avoid it.
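If I do end up going the fallback route, the shape of it would be something like this (a sketch only: the 2 GiB budget, the texture paths, and the evict/reload helpers are all hypothetical; Performance.RENDER_VIDEO_MEM_USED is the real Godot 4 monitor):

```gdscript
const VRAM_BUDGET := 2 * 1024 * 1024 * 1024  # hypothetical 2 GiB budget, in bytes

func load_high_res_textures(paths: Array[String]) -> void:
	for path in paths:
		var tex: Texture2D = load(path)
		apply_texture(tex)  # hypothetical helper: assign to materials
		# After each load, check whether we've blown the budget.
		if Performance.get_monitor(Performance.RENDER_VIDEO_MEM_USED) > VRAM_BUDGET:
			# Over budget: throw everything away and start over at low res.
			evict_all_textures()        # hypothetical helper
			load_low_res_textures(paths)  # hypothetical helper
			return
```

The ugly part is that by the time the monitor shows trouble, the work of loading the high-res set has already been partly done and has to be discarded.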

I’m not an expert, but I think that on PC it gets even worse: other apps might be using VRAM too, so getting “total VRAM” or even “free VRAM” isn’t going to help you much, because the situation can change at any moment… Better to think of some alternatives.

I’m not looking for a perfect “will always work” answer, though if one appeared it would be nice to have. What I’m looking for is something I can check to see if this computer could ever load the high res textures, and if not, not bother to try. I still have to deal with the failure case, obviously, but I’d prefer reserving “load high, run out of space, discard, reload low” for corner cases.

If I knew the GPU had less than 2 GB of VRAM, I’d know not to even bother trying to load the high-res textures. If it has more than that, I can try, and then fail over to low-res textures if necessary.

I don’t know anything about GPU programming, but maybe it’s possible to try and reserve the memory before loading the textures into it? That should be much faster and should give you the answers you seek.

Does Godot have a mechanism for that?

No idea. :smile:

Someone has to have figured this out before. You have to list system requirements to post a game on Steam.

When playing Hogwarts Legacy, the game spends a while loading shaders, and I’ve noticed other games do that too. I wonder if they’re brute-forcing it: trying to load high res and, if that fails, dropping down.