Mapping a Planet Texture to a Sphere

Godot Version

4.4.1.stable.official

Question

I feel like I am losing my mind here, but I am trying to display a planet texture on a sphere. When I use a StandardMaterial3D with the planet texture as the albedo texture it looks okay, except for pinching near the poles. I feel like I have looked everywhere and tried every combination of coordinate conversion formulas with no luck getting the results I am looking for. I believe I must be misunderstanding something about UVs on sphere meshes in Godot.

This is the style of image that I am trying to display

Does anyone happen to know what calculation I need to do to the UVs to properly get this image to display on the sphere?

shader_type spatial;

uniform sampler2D map;

void fragment() {
	ALBEDO = texture(map, UV).xyz;
}

Any help would be greatly appreciated. Thanks so much!

Hi,

I believe that the mapping is correct. I think that we tend to see Earth textures mapped on spheres as distorted because we’re used to seeing them as flat textures, but when you look at pictures of the planet, continent sizes on your sphere seem pretty much on point. And actually, the version that looks distorted/stretched is definitely the flat one, not the “sphered” one (dunno if that’s an English word).

And yes, Africa is reaaaaally huge. It just feels smaller than it really is on flat textures because it’s in the center, so it gets shrunk a lot.
Just for fun: there are a lot of illustrations online like this one showing actual Africa size comparisons:


Anyway, I’m getting distracted.
All this leads me to my question: what are you trying to do exactly? If you’re looking for a physically accurate planet mapping, I think you have it already. But, you may be looking for a shader that looks good more than realistic?


Thanks for the reply! I appreciate the help. I think ideally I am looking for something that is both good looking and realistic. I am still not totally convinced that my shader is accurately mapping the texture to the sphere though. I think Mars is a better example of what I am talking about.

Here is an online example from Solar System Scope:


We can see the south pole is small but relatively round. If we look at an official model from NASA it looks very similar.

So that tells me that Solar System Scope is accurately displaying their textures. Luckily they make their textures free to download so I can just take their Mars texture and throw it in my shader.


While the shape is very similar, I feel like there is pretty clearly some pinching happening right at the poles that is not present on Solar System Scope’s or NASA’s model. If you look closely at the screenshot you can see the banding near the poles where the image is getting pinched.

This docs page talks about pinching at the poles of a sphere, but the texture they are mapping there wasn’t taken from a sphere in the first place, so it makes sense that it would look pinched due to the UVs.

editing this comment to add: That docs page states that “It uses a projection technique called equirectangular projection, which translates a spherical map onto a 2D plane,” referring to the StandardMaterial3D when used on a sphere. And I am pretty sure that Mars texture is in an equirectangular format. So I don’t quite know what’s up.
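For what it’s worth, the equirectangular convention that docs page describes boils down to: U is longitude scaled to [0, 1], V is latitude scaled to [0, 1]. Here is a minimal Python sketch of that direction-to-UV mapping (assuming the common convention; Godot’s exact axis orientation may differ):

```python
import math

def dir_to_equirect_uv(d):
    """Map a unit direction vector to equirectangular UV.

    Assumes the usual convention: U wraps with longitude, and V runs
    linearly with latitude from pole to pole. That linearity is why an
    entire texture row collapses to a single point at each pole.
    """
    x, y, z = d
    u = 0.5 + math.atan2(x, -z) / (2.0 * math.pi)  # longitude -> U
    v = 0.5 - math.asin(y) / math.pi               # latitude  -> V
    return u, v
```

Note that V only reaches 0 or 1 exactly at the poles, so any pinching visible away from the poles is coming from how the mesh UVs interpolate, not from the projection itself.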

I think it’s your geometry, and how the UVs are mapped on it. More to the point, I think it’s the relationship between the latitude on the sphere and the V coordinate.

It looks to me like the NASA model has less V range towards the poles. In your image, look at the little grey spot about 2/3 of the way from the pole to the right side of the sphere. The NASA one above is rotated 90 degrees counterclockwise, but you can see the same spot vertically, and yours is way closer to the equator. Your polar ice cap looks larger as well.

If you can look at their models, maybe look at how they map the V coordinate on the mesh. I’m pretty sure that’s the difference.

I was able to catch the NASA model in a screenshot before it fully loaded in, and they seem to be using a cubemap.


Unfortunately I don’t know if the original texture they are using is in the same format as my own. I found this tool for SpaceEngine+ that converts textures which look like they are formatted like the ones I found, so maybe I should investigate the math behind it, make my shader convert my textures to a cubemap, and then map that cubemap onto the sphere.
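If you do end up doing the conversion yourself, the core of the math is just picking which cube face a direction hits and where on that face it lands. A Python sketch using the standard OpenGL cubemap face-selection convention (Godot’s face orientations may differ, so the `sc`/`tc` signs might need adjusting):

```python
def dir_to_cubemap_face(d):
    """Pick the cubemap face and face-local UV for a direction.

    Uses the OpenGL convention: the axis with the largest absolute
    component selects the face, and the other two components become
    face coordinates after dividing by the major axis.
    """
    x, y, z = d
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:
        face = "px" if x > 0 else "nx"
        sc, tc, ma = (-z, -y, ax) if x > 0 else (z, -y, ax)
    elif ay >= az:
        face = "py" if y > 0 else "ny"
        sc, tc, ma = (x, z, ay) if y > 0 else (x, -z, ay)
    else:
        face = "pz" if z > 0 else "nz"
        sc, tc, ma = (x, -y, az) if z > 0 else (-x, -y, az)
    u = 0.5 * (sc / ma + 1.0)
    v = 0.5 * (tc / ma + 1.0)
    return face, u, v
```

Combined with the equirectangular formula, that’s all an offline converter needs: for each output pixel on each face, reconstruct the direction and sample the source texture.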

If anyone else comes across this post and has any advice, it would be greatly appreciated!

I’m a Godot beginner, but I’ve been working on a similar problem. You’re on the right track, I think. The projection you’re doing onto the sphere is being applied correctly; it’s just that equirectangular projections inherently pinch and distort at the poles when mapped to a sphere. A cubemap will decrease that polar distortion, since it “pinches” at the cube’s corners much less dramatically, and it lends itself to a nice distant view. (One drawback is that your pixels are distorted across the whole sphere: pinched at the corners and stretched in the middle of the faces.)

I would recommend creating your cubemap texture offline rather than trying to map the equirectangular texture to a cubemap in realtime. This website will turn a panorama into six individual cubemap faces to easily test it out: Panorama to Cubemap

To display them in Godot, you’ll need to arrange them correctly in a single texture. The default 2×3 layout is a good balance for the overall texture size: Cubemap — Godot Engine (4.4) documentation in English

The filenames from the above tool correspond to where in the godot template you’ll need to stitch them: px is X+, ny is Y-, etc. I ended up writing a python script to do the projection (using numpy) and stitch them together into a texture. I haven’t used that spaceengine tool but it seems like a great way to automate the math behind it!
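The stitching itself is mostly bookkeeping: deciding where each face lands in the combined image. A sketch assuming the 2×3 layout reads X+, X-, Y+, Y-, Z+, Z- left-to-right, top-to-bottom (double-check the order against the template image in the Cubemap docs):

```python
def face_offset(face, face_size):
    """Top-left pixel offset of each face in a 2x3 stitched cubemap.

    Assumes the faces are ordered X+, X-, Y+, Y-, Z+, Z- reading
    left-to-right, top-to-bottom; verify against Godot's template.
    """
    order = ["px", "nx", "py", "ny", "pz", "nz"]
    i = order.index(face)
    col, row = i % 2, i // 2
    return col * face_size, row * face_size
```

Paste each face image at its offset into a blank image of size (2 × face_size) by (3 × face_size), then import the result with the 2×3 cubemap layout.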

Lastly, to show them on your sphere you’ll need a ShaderMaterial that samples the cubemap, since you can’t apply a cubemap directly to a StandardMaterial3D. Here’s a basic one that applies either a cubemap or a flat color (for planets or moons I don’t have a cubemap for).

shader_type spatial;

uniform samplerCube albedo_texture;
uniform bool has_albedo_cubemap = false;
uniform vec3 albedo_color = vec3(1.0);

varying vec3 direction_from_center;

void vertex() {
    direction_from_center = normalize(VERTEX);
}

void fragment() {
    vec3 calculated_albedo_color;
    if (has_albedo_cubemap) {
        calculated_albedo_color = texture(albedo_texture, direction_from_center).rgb;
    } else {
        calculated_albedo_color = albedo_color;
    }

    // Apply the final color.
    METALLIC = 0.0;
    ROUGHNESS = 1.0; // tweak roughness how you would like for your particular planet
    ALBEDO = calculated_albedo_color;
}

You’ll need to pass the cubemap into the shader with set_shader_parameter, matching the uniform name declared above, e.g.:

var cubemap = load(planet_data.mat_albedo_path)
planet_material.set_shader_parameter("albedo_texture", cubemap)
planet_material.set_shader_parameter("has_albedo_cubemap", true)

And here’s the cubemap I used, converted from NASA’s blue marble equirectangular projections (note the rotation on the Y faces)

You could also use individual face textures, and load higher detail ones as the camera gets closer to a face, to maximize the amount of detail per loaded texture. If you used a “cube sphere” model rather than the standard Godot sphere, you could use quadtrees to subdivide the mesh as the camera gets closer, and even apply DEM/elevation data to the vertices. (A cube sphere is essentially a cube mesh where the vertices are normalized to the sphere’s radius, which adds some distortion.) Good luck! Hope this helps.
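The cube-sphere mapping mentioned above is essentially a one-liner per vertex: take the point on the cube face and normalize it. A sketch of the math for the +Z face (a hypothetical helper, just to illustrate):

```python
import math

def cube_sphere_vertex(face_u, face_v):
    """Project a point on the +Z face of a unit cube onto the unit sphere.

    face_u and face_v are in [-1, 1]. Normalizing the cube-face point is
    the "cube sphere" construction described above; the distortion it adds
    is largest toward the face corners.
    """
    x, y, z = face_u, face_v, 1.0
    length = math.sqrt(x * x + y * y + z * z)
    return (x / length, y / length, z / length)
```

The face center maps straight out to the sphere unchanged, while corner points get pulled in the most, which is exactly the residual distortion the parenthetical above refers to.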
