I cannot seem to get mipmaps to work. I am creating a MeshInstance3D with an ArrayMesh. The Texture2D’s import settings have Mipmaps turned on, and when viewing the texture in the Inspector it says “8 Mipmaps” under the size. The MeshInstance3D uses a MaterialOverride to apply the Material, and the Material uses a custom shader I wrote.
I have replaced that shader with the absolute simplest case I can think of, which is this:
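A minimal test shader along the lines described below would look roughly like this (a sketch reconstructed from the description; the exact uniform declarations and hints are assumptions):

```glsl
shader_type spatial;

// Sampler hint per the description: nearest filtering with mipmaps enabled.
uniform sampler2D stexture : source_color, filter_nearest_mipmap;
// "factor" is the LOD level entered by hand to test mip selection.
uniform float factor = 0.0;

void fragment() {
    // Sample an explicit mip level so a change in factor is obvious.
    ALBEDO = textureLod(stexture, UV, factor).rgb;
}
```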
As you can see, I have set the stexture sampler to filter_nearest_mipmap, and I am using textureLod() to sample the texture.
I can toggle the effect and enter any value I want for the factor, and while the preview sphere changes in appearance, nothing changes in my game view.
What am I missing here? I feel like I’m going crazy checking Project Settings and every other point in the process, but I don’t see anything that looks wrong.
Yes, this also works on my end. That is, I am also able to get mipmaps working by dropping a MeshInstance3D with a shape mesh into the editor, which does seem to confirm that it’s not an issue with the rendering engine or any project settings.
However, the MeshInstance3D that I am creating via code for my game world (using the same Material) does not show mipmaps. I can put them side by side in the scene and swap the Material: the test object created in the editor using a PlaneMesh has no issues, but the ArrayMesh I am using for my game maps does not show any mipmaps.
arrayMesh = new ArrayMesh();
this.Mesh = arrayMesh;
this.MaterialOverride = data.material;
this.CastShadow = data.shadowSetting;

surfaceArray = new Godot.Collections.Array();
surfaceArray.Resize((int)Mesh.ArrayType.Max);

vertices = new List<Vector3>();
uvs = new List<Vector2>();
uv2s = new List<Vector2>();
indexesFull = new List<int>();
normals = new List<Vector3>();

// ... geometry generation omitted for brevity: the lists above
// are populated here before being converted to arrays ...

surfaceArray[(int)Mesh.ArrayType.Vertex] = vertices.ToArray();
surfaceArray[(int)Mesh.ArrayType.TexUV] = uvs.ToArray();
if (data.hasPositionAsUv2)
    surfaceArray[(int)Mesh.ArrayType.TexUV2] = uv2s.ToArray();
surfaceArray[(int)Mesh.ArrayType.Index] = indexesFull.ToArray();
surfaceArray[(int)Mesh.ArrayType.Normal] = normals.ToArray();

arrayMesh.ClearSurfaces();

Mesh.ArrayFormat format = Mesh.ArrayFormat.FormatVertex |
    Mesh.ArrayFormat.FormatNormal |
    Mesh.ArrayFormat.FormatIndex |
    Mesh.ArrayFormat.FormatTexUV;
if (data.hasPositionAsUv2)
    format |= Mesh.ArrayFormat.FormatTexUV2;

arrayMesh.AddSurfaceFromArrays(Mesh.PrimitiveType.Triangles, surfaceArray, null, null, format);
arrayMesh.RegenNormalMaps();
The code for putting together these ArrayMeshes is quite lengthy, so I’m trying to be brief here.
Can you show your data.material? The mesh doesn’t really affect how mipmaps are processed; only the material should. What does the end result look like for you, and what do you expect it to look like?
As for what I’m expecting: at this point I can’t evaluate it, but I definitely expect that if I crank up the mipmap bias I would see my textures change into blobs of color. Instead, there is absolutely no change to the displayed textures at any bias value. That is, the meshes that I add in the editor do change into blobs of color, but the meshes from my code remain exactly the same.
I was doing preprocessing on my texture, after which I would save it and reload it, and I neglected to manually force it to generate mipmaps after doing that. The editor objects worked because their albedo texture wasn’t being messed with in this way.
So instead of doing this:
Texture2D texture = ResourceLoader.Load(file) as Texture2D;
material.SetShaderParameter("texture_albedo", texture);
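I now regenerate the mipmaps on the image after preprocessing and before handing it to the material. A sketch of that (the preprocessing itself is omitted, and the GetImage/CreateFromImage round-trip is one way to do it; the `texture_albedo` parameter name comes from the snippet above):

```csharp
// Load the source texture and pull out its Image for preprocessing.
Texture2D texture = ResourceLoader.Load(file) as Texture2D;
Image image = texture.GetImage();

// ... preprocessing on the image happens here ...

// The missing step: rebuild the mipmap chain after editing the pixels.
image.GenerateMipmaps();

// Wrap the processed image back up as a texture and hand it to the shader.
ImageTexture processed = ImageTexture.CreateFromImage(image);
material.SetShaderParameter("texture_albedo", processed);
```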