Okay, so I think what’s happening is this:
I have a height texture that is 10x10 pixels.
I have a map made up of a 5x5 grid of vertices.
I map UV to the X/Y position across the entire map, so the vertex at grid location (x, y) gets the UV value (x / MAP_SIZE_IN_X, y / MAP_SIZE_IN_Y). In this example MAP_SIZE_IN_X == 5, and the same for the Y value.
This gives me UV values in the 0 to 1 range, with, let’s say, the vertex at position (1,1) having a UV of (0.2, 0.2).
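In code terms, this is roughly what I mean (just a toy Python snippet; MAP_SIZE_IN_X/Y are the constants from above, and vertex_uv is a name I made up for illustration):

```python
MAP_SIZE_IN_X = 5
MAP_SIZE_IN_Y = 5

def vertex_uv(x, y):
    # UV for the vertex at grid position (x, y): its position divided by
    # the map size on each axis.
    return (x / MAP_SIZE_IN_X, y / MAP_SIZE_IN_Y)

print(vertex_uv(1, 1))  # (0.2, 0.2), as in the example above
print(vertex_uv(2, 2))  # (0.4, 0.4), the next vertex over
```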
In the vertex shader:
I look up the height in the height map using that UV and get a value, let’s say h0.
Moving to the next vertex over, at UV (0.4, 0.4), I look up the height again and get a different value, h2.
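Written out as a little Python stand-in (not my actual shader code; the texture values and the names height_tex and sample_height are made up purely for illustration), the vertex stage is effectively doing this:

```python
import math

TEX_SIZE = 10  # the height texture is 10x10 pixels

# Arbitrary stand-in values for the height texture, just for illustration.
height_tex = [[0.5 + 0.5 * math.sin(x * 1.7 + y * 2.3) for x in range(TEX_SIZE)]
              for y in range(TEX_SIZE)]

def sample_height(u, v):
    # Nearest-neighbour lookup at UV (u, v), which is effectively what the
    # vertex shader's texture fetch does with this height map.
    tx = min(int(u * TEX_SIZE), TEX_SIZE - 1)
    ty = min(int(v * TEX_SIZE), TEX_SIZE - 1)
    return height_tex[ty][tx]

h0 = sample_height(0.2, 0.2)  # height used to displace the vertex at UV (0.2, 0.2)
h2 = sample_height(0.4, 0.4)  # height used to displace the next vertex over
```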
In the fragment shader:
Since the UV gets interpolated between vertices, and I’m using this interpolated UV to look up normals in the “as normal map” version of my height map, I end up with this:
At UV (0.2, 0.2): I look up the normal and get n0, which matches h0 - great.
At UV (0.4, 0.4): I look up the normal and get n2, which matches h2 - great.
At UV (0.3, 0.3), which is interpolated and has no corresponding vertex: I look up the normal and get n1, which doesn’t match h0, h2, or the interpolated height h_interpolated between them, and it looks wrong.
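To put numbers on that mismatch, here is a self-contained toy version of the whole thing in Python (same made-up texture and sampler as the snippet above, repeated so it runs on its own; I’m using the raw height value as a stand-in for the normal that would be derived from it):

```python
import math

TEX_SIZE = 10
height_tex = [[0.5 + 0.5 * math.sin(x * 1.7 + y * 2.3) for x in range(TEX_SIZE)]
              for y in range(TEX_SIZE)]

def sample(u, v):
    # Nearest-neighbour lookup; stands in for both the height fetch in the
    # vertex shader and the normal-map fetch in the fragment shader.
    tx = min(int(u * TEX_SIZE), TEX_SIZE - 1)
    ty = min(int(v * TEX_SIZE), TEX_SIZE - 1)
    return height_tex[ty][tx]

h0 = sample(0.2, 0.2)  # height the vertex at UV (0.2, 0.2) was displaced to
h2 = sample(0.4, 0.4)  # height of the next vertex over

# Between those two vertices the mesh is a straight edge, so halfway along
# it the geometry sits at the plain average of the two vertex heights:
h_interpolated = 0.5 * (h0 + h2)

# The fragment shader, though, samples at the interpolated UV (0.3, 0.3),
# which hits texel (3, 3) - a texel neither vertex ever read. The normal
# stored there describes the texture's slope around that texel, not the
# slope of the straight edge the mesh actually has between h0 and h2.
print(h_interpolated)    # ~0.68 with these stand-in values
print(sample(0.3, 0.3))  # ~0.23 - quite different, hence the wrong look
```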
At least I think this is what is happening. I apologise if I’m not getting my point across clearly; I’m very new to gamedev/shader work in general, and my terminology probably isn’t accurate.
When I was researching all of this, I bumped into a similar problem in an old tutorial I read:
A few paragraphs down:
This still does not look how we want it to. The issue here is that the noise changes faster than the vertices do. So when we calculate the normal at the point of the VERTEX it does not align with what we see in the final Mesh. In order to fix this we add more vertices.
I think this is also what is happening to me!