I’m gradually working towards implementing graphics and display settings in my game, and implementing a simple resolution option turned out to be surprisingly difficult. There wasn’t really any ‘simple’ way to do it without manually building an array of different resolutions, which is how I found out that in Godot the recommended approach is to use a resolution scaling multiplier, which seems rather odd to me. I looked at some discussions, and it was mentioned that it’s the more “modern” approach and that games nowadays don’t include a resolution option, which seemed even odder, because pretty much every game I’ve played, including ones released in 2024 and 2025, lets you change the resolution manually, even the ones that also implement a separate resolution scaling factor. Even games made with Godot often include a ‘traditional’ resolution option. Is there really any major difference or downside to doing it that way? Is there any specific reason I should be using the scaling multiplier, or are both approaches theoretically correct?
Using a resolution scale fits all monitors while being easier to understand. If someone totally new to gaming is given a huge list of numbers like “1920x1080, 1280x720, 800x600”, it can be daunting, and it certainly makes less sense than “100%, 75%, 50%”.
If 32K monitors or super-ultrawides become the norm in a year, is your game ready for that with hard-coded monitor sizes? I’ve played a lot of old games that don’t list anything up to 720p but can absolutely run at it with the right registry edit or a change to a hidden .ini file.
It’s also easier to implement resolution scaling (in Godot), since it only involves a zero-to-one slider. Getting every supported resolution of a monitor, on the other hand, depends on its connection and the operating system, which again could change in a year, so developers usually just hard-code an array of what they find to be acceptable resolutions.
I find it best to implement resolution scaling, a fullscreen/window mode toggle, and allow the user to resize the window however they prefer.
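In Godot 4 that combination only takes a few lines. A minimal sketch, assuming Godot 4 and a 3D game; the handler names are hypothetical and would be wired to your own checkbox and slider:

```gdscript
# Hypothetical settings-menu handlers; connect these to your own UI signals.

func _on_fullscreen_toggled(enabled: bool) -> void:
	DisplayServer.window_set_mode(
		DisplayServer.WINDOW_MODE_FULLSCREEN if enabled
		else DisplayServer.WINDOW_MODE_WINDOWED)

func _on_scale_changed(value: float) -> void:
	# Render the 3D scene at a fraction of the window size;
	# the 2D UI stays at full resolution.
	get_viewport().scaling_3d_scale = clampf(value, 0.1, 1.0)
```

For a 2D game, the window’s content-scale settings play a similar role instead of `scaling_3d_scale`.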
Here’s a GitHub issue on changing the fullscreen resolution of monitors, something most modern games do not do, but it’s where this behavior comes from.
Interesting thoughts, and maybe I’m just trying to do things the old school way here, but I do think there are ways to get around some of the issues you mentioned here. I think there are plenty of angles to be considered.
If 32K monitors or super-ultrawides become the norm in a year, is your game ready for that with hard-coded monitor sizes?
I worked with Unreal previously, and there was a method to get all the “correct” resolutions for the current screen, so I believe it’s possible to do without a manual array. Granted, I’m not exactly sure how it works, but I do have something set up in Godot that gets the current screen resolution and sets it as the default when the game launches. I think it simply depends on the mindset: since Godot expects you to use the scaling, you have to use workarounds to do it another way, which I don’t necessarily approve of, but to each their own.
Another thing is that while you are correct that new gamers might be confused by screen resolutions, I think there will be just as many people who are used to that option and will expect to have it. And while for some, 100%, 75%, 50% will be easier, another person with a 4K screen who wants to play at exactly 1080p or 1440p will have trouble setting that up with a scalar, since I doubt people know off the top of their head what percentage of 4K 1080p is.
So perhaps the best way would be to simply implement both options; at least to me that seems reasonable.
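If you do offer both, they can share one code path, since an exact target resolution is just a scale factor in disguise. A sketch, assuming Godot 4 (the function name is made up):

```gdscript
# Convert a requested output height (e.g. 1080 on a 4K display)
# into the equivalent per-axis render-scale factor.
func scale_for_height(target_height: int) -> float:
	var screen_height := DisplayServer.screen_get_size().y
	return float(target_height) / float(screen_height)
```

On a 2160p screen this gives 0.5 for a target of 1080, i.e. 1080p is “50%” of 4K per axis, which is exactly the conversion players shouldn’t have to do in their heads.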
Lastly,
Here’s a GitHub issue on changing the fullscreen resolution of monitors, something most modern games do not do, but it’s where this behavior comes from.
As I said already, pretty much every game I’ve played includes that option, including ones released this year, so I’m not exactly sure where the claim that modern games don’t do it comes from. Mind naming any examples? Are there really that many games that don’t let you change the resolution?
I created a plugin to deal with this for me. Well, technically two: Display handles all the code behind the scenes, and Game Template the UI. However, I just broke my Screen and SplashScreen classes out into a User Interface plugin, so I may move the specific settings screens to their own plugins.
Regardless, my default settings for windowed are:
And for fullscreen:
So I combine the two. If you’re windowed, I assume you know what your resolution is and let you choose one. If you’re fullscreen, I lock the resolution and let you scale.
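That split could look something like this; a rough sketch assuming Godot 4, with made-up parameter names rather than my actual plugin code:

```gdscript
func apply_display_settings(fullscreen: bool, resolution: Vector2i, render_scale: float) -> void:
	if fullscreen:
		# Fullscreen: lock to the desktop resolution, expose only the scale slider.
		DisplayServer.window_set_mode(DisplayServer.WINDOW_MODE_FULLSCREEN)
		get_viewport().scaling_3d_scale = render_scale
	else:
		# Windowed: let the player pick an exact window size.
		DisplayServer.window_set_mode(DisplayServer.WINDOW_MODE_WINDOWED)
		DisplayServer.window_set_size(resolution)
```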
I would love to see that code.
This thread also reminds me that I’ve been thinking about trying to edit the project properties directly and save them, so that the game always starts up in whatever you last set it to. Right now my games start with whatever default I set in the editor (which honestly I usually leave windowed), and then once the game loads the settings it applies whatever the user set before. So often my game will start on the wrong monitor (I have four monitors) in windowed mode, then pop open on the correct monitor at the correct resolution.
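Godot’s settings-override file might cover this: if the project setting `application/config/project_settings_override` points at a file in `user://`, values saved there are applied at startup, before the first window appears. A sketch, assuming Godot 4; the function name and the choice of `user://override.cfg` are mine:

```gdscript
# Persist display choices so the next launch applies them before the
# window is created. Assumes "application/config/project_settings_override"
# is set to "user://override.cfg" in the editor.
func save_display_overrides(screen_index: int, fullscreen: bool) -> void:
	# 3 = fullscreen, 0 = windowed (values of the window-mode project setting).
	ProjectSettings.set_setting("display/window/size/mode", 3 if fullscreen else 0)
	ProjectSettings.set_setting("display/window/size/initial_screen", screen_index)
	ProjectSettings.save_custom("user://override.cfg")
```

I haven’t verified these exact setting names against every 4.x release, so check them against your version’s project settings.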
BTW, I strongly recommend letting players select a monitor. It drives me nuts when games don’t allow that and I have to switch from fullscreen to windowed, lower the resolution, drag the window to the correct monitor, then fix the resolution and go back to fullscreen. (Which is the case for about 70% of modern games.)
It is possible to query the monitor for its supported resolutions, but this depends on the operating system and the monitor’s connection, and I don’t think Godot has implemented it. As you may have found, you can query the current resolution the display is set to.
True, but I think this is a developer mindset; wanting an exact resolution that matches a different yet typical resolution is actually a fairly strange thing to do. What does 1080p get you over 75%? If you are reducing your resolution you probably want better performance, so anything less than 100% that still looks good enough will do. Exact numbers don’t really help, they just feel nice to us.
While most games select resolutions from a preset list, they are changing the viewport’s resolution, not the actual monitor’s resolution, which is what older games would do. If you open up Unreal Tournament 2004, it will flash the screen black and shrink down your Windows UI; if you tab out or the game crashes, everything stays small, because it changed the entire monitor’s display mode. This sucked! And it only went up to 4:3, with no HD options. I haven’t seen a game alter the real monitor resolution setting since maybe 2008.
I only mention this because it was normal for older games: they would query, or fall back to, a preset list of resolutions and set your monitor for you. That specific feature is where lists of resolutions come from, and that specific feature is what won’t be implemented in Godot, per the issue I linked to.
Thanks for the insight, I’m sure these will come in handy.
I would love to see that code
It’s very simple, although it needs further testing, hopefully it actually works correctly.
# Get the monitor's current resolution (Godot 4 DisplayServer).
var user_screen_res = DisplayServer.screen_get_size()
# Use it as the root window's content scale size by default...
get_tree().root.content_scale_size = user_screen_res
# ...and offer it as an option in the settings menu.
resolutions.append(user_screen_res)
edit: to clarify, resolutions is just an array of Vector2i that I’m using for my settings.
Ah I see what you mean now. I get that this is undesirable. I am changing the viewport resolution in my game, not the monitor resolution to be clear.
I dug deep, and Unreal is using SDL2’s SDL_GetDisplayMode for this. I’d say Godot could lift this code, as it has graciously adapted other SDL snippets before, but the function was removed in SDL3, so it may not have been functioning properly today, enough to warrant dropping it instead of fixing it.
Thanks. I took a look at my code, and it turns out I’m already using that function. I use it to determine which resolutions to offer based on the max resolution of the monitor. Though I do like the idea of the default resolution.
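For anyone wanting to do the same filtering, it amounts to something like this; a sketch assuming Godot 4, where the preset list and function name are just examples:

```gdscript
# Offer only the common preset resolutions that fit on the current monitor.
func available_resolutions() -> Array[Vector2i]:
	var presets: Array[Vector2i] = [
		Vector2i(1280, 720), Vector2i(1920, 1080),
		Vector2i(2560, 1440), Vector2i(3840, 2160),
	]
	var max_size := DisplayServer.screen_get_size()
	var out: Array[Vector2i] = []
	for res in presets:
		if res.x <= max_size.x and res.y <= max_size.y:
			out.append(res)
	return out
```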

