I am creating meshes via RenderingServer, but since it seems to take the same input data this discussion also applies to ArrayMesh.
Currently, I am creating my mesh data by adding sets of 3 vertices (aka a triangle) to the ARRAY_VERTEX PackedVector3Array. At the same time, I am calculating the normal for each vertex of the triangle by taking the normalized cross product of two edges, resulting in “faceted normals” or “flat shading”.
How do I move away from this and instead create smooth normals, where identical vertices of different faces share the same normal? Would I need to use ARRAY_INDEX? Also, would it be possible to control the angle at which smoothing is applied? That would prevent weird blotches from appearing where the angle between adjacent faces is too steep.
I think I should also mention that my vertices are first created on the GPU before being sent to the CPU for usage via RenderingServer. Thus, I cannot know in which order the sets of 3 vertices will be added. This could add some complexity to what I’m trying to do, but I am 100% confident that there must be some method available to me here.
I don’t like this solution because in my case, using SurfaceTool is quite a downgrade in terms of performance. Additionally, the result that generate_normals() gives suffers from the same blotching that I mentioned before.
It says so in the documentation and I can confirm as I have used it before. Yes, I am using a compute shader to create the vertex data. After converting the resulting bytes via to_vector3_array(), I follow the standard procedure one would do for ArrayMesh, but instead create the mesh via RenderingServer.
Then calculate the normals in a compute shader as well. You’ll need to output additional connectivity data so that each vertex knows which face normals it needs to average, and then do the averaging in a second pass. Or bundle it all into a single pass if your generation method and topology allow for that.
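For reference, the averaging itself can be sketched on the CPU like this (a minimal sketch, not your setup: welding identical positions by quantizing coordinates with an epsilon is my assumption, and in a real compute shader the "weld" would instead come from your connectivity data):

```python
import math
from collections import defaultdict

def smooth_normals(verts, eps=1e-6):
    """verts: triangle soup, a flat list of (x, y, z) tuples, 3 per face.
    Returns one averaged ("smooth") normal per input vertex."""
    # Weld vertices that share a position by quantizing their coordinates.
    def key(v):
        return tuple(round(c / eps) for c in v)

    acc = defaultdict(lambda: [0.0, 0.0, 0.0])
    for i in range(0, len(verts), 3):
        a, b, c = verts[i], verts[i + 1], verts[i + 2]
        e1 = [b[j] - a[j] for j in range(3)]
        e2 = [c[j] - a[j] for j in range(3)]
        # Unnormalized cross product = face normal weighted by face area.
        n = [e1[1] * e2[2] - e1[2] * e2[1],
             e1[2] * e2[0] - e1[0] * e2[2],
             e1[0] * e2[1] - e1[1] * e2[0]]
        # Accumulate this face's normal onto all three welded corners.
        for v in (a, b, c):
            k = key(v)
            for j in range(3):
                acc[k][j] += n[j]

    out = []
    for v in verts:
        n = acc[key(v)]
        l = math.sqrt(sum(c * c for c in n)) or 1.0
        out.append(tuple(c / l for c in n))
    return out
```

Leaving the cross product unnormalized before accumulating gives a cheap area weighting, so big faces influence the shared normal more than slivers.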
Btw, if you’re using to_vector3_array(), that means you’re downloading the data to cpu address space anyway and later uploading it again to the gpu. Since the data is already on the cpu, you might as well use SurfaceTool here. I’d profile a SurfaceTool version and go with it if its performance is acceptable.
I am already calculating faceted normals in the compute shader, but I just don’t see how it’s possible to calculate smooth normals within a single pass.
Right. There isn’t any way around that though, is there?
Not that I’m aware of. There was some discussion about this in the past, I think on discord, but iirc it ended up without an answer. It should be possible to keep the data entirely on the gpu, but the needed functionality may not be exposed to script.
I’d start with SurfaceTool just to have a working version, then profile it. If it underperforms, you can delegate its work to a worker thread. If that underperforms, then start thinking about doing it in a second compute shader pass. The possibility of doing it in one pass depends on what and how you’re actually calculating there.
I’ve done a bit of digging and I found a potential solution. I haven’t implemented this yet, but I’ll mention it briefly for any future readers. From what I understand this is the best method available.
The key insight here is that I cannot explicitly generate normals from the mesh data, as that is practically impossible to parallelize. Even if you are not using the GPU, you should opt for a solution that is inherent to your current calculations, as that will be the fastest-performing solution. What I failed to mention is that my mesh data is based on a noise function. This means the vertices of my mesh form a surface that represents the underlying noise, and we can find the normals by using the analytical derivative of the noise function.
A vertex normal is simply a vector pointing perpendicular to the mesh surface. Thinking in 2D, where the mesh surface can be represented by an equation (the noise function), the normal of a point can be thought of as a vector perpendicular to the tangent line at that point. In 3D, the normal is perpendicular to the tangent plane at a given vertex. The orientation of this tangent plane is given to us by the analytical derivative of the noise function. Thus, the normal is calculated by building a vector from the analytical derivative at the vertex and normalizing it.
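For a heightfield-style surface y = f(x, z) (my assumption about the mesh layout; your topology may differ), the unnormalized normal works out to (-∂f/∂x, 1, -∂f/∂z). A minimal sketch with a made-up differentiable stand-in for the noise function:

```python
import math

# Hypothetical stand-in for the noise function; any f(x, z) with a
# known analytical gradient works the same way.
def height(x, z):
    return math.sin(x) * math.cos(z)

def height_gradient(x, z):
    # Analytical partial derivatives of height() above.
    return (math.cos(x) * math.cos(z),   # df/dx
            -math.sin(x) * math.sin(z))  # df/dz

def vertex_normal(x, z):
    # For a heightfield y = f(x, z), the unnormalized normal is
    # (-df/dx, 1, -df/dz); normalize it to get the vertex normal.
    dfdx, dfdz = height_gradient(x, z)
    n = (-dfdx, 1.0, -dfdz)
    l = math.sqrt(n[0] ** 2 + n[1] ** 2 + n[2] ** 2)
    return (n[0] / l, n[1] / l, n[2] / l)
```

Since this depends only on the vertex position, every normal can be computed independently, which is exactly what makes it parallelize trivially on the GPU.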
When you have a noise function without analytical derivatives, things get a little more interesting. You can use finite differences, but that has its own problems, and the extra noise samples will decrease performance. If you’re using a voxel approach, even if your voxels are not necessarily cubes, you can construct your own gradient from the noise samples you’re already making; however, the result won’t be mathematically “true”. If you have a choice, then realistically you should opt for a noise function that is differentiable, such as simplex, because it is significantly quicker to sample from.
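The finite-difference fallback can be sketched like this (the central-difference scheme and the step size h are my assumptions; h has to be tuned against your noise frequency, and note the four extra samples per vertex):

```python
import math

def numeric_gradient(f, x, z, h=1e-3):
    # Central differences: four extra noise samples per vertex,
    # and the result depends on the step size h.
    dfdx = (f(x + h, z) - f(x - h, z)) / (2.0 * h)
    dfdz = (f(x, z + h) - f(x, z - h)) / (2.0 * h)
    return dfdx, dfdz

def normal_from_numeric_gradient(f, x, z, h=1e-3):
    # Same heightfield normal construction as before,
    # just fed by the approximate gradient.
    dfdx, dfdz = numeric_gradient(f, x, z, h)
    n = (-dfdx, 1.0, -dfdz)
    l = math.sqrt(n[0] ** 2 + n[1] ** 2 + n[2] ** 2)
    return (n[0] / l, n[1] / l, n[2] / l)
```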
Well, if you’re building the mesh from a noise function then your normals are more or less implicit, so you can do it in a single pass. I already said above that the feasibility of doing it in a single pass depends on how and what you generate. You should have initially mentioned that you’re sampling noise.
Although having selective hard edges won’t be that straightforward, because you’ll either need the second analytical derivative or more samples, and you’ll also have to maintain a vertex data format that allows for hard edges, i.e. repeating vertex data for each triangle a vertex is part of.