Is UnprojectPosition not working on XRCamera3D? And why is my alternative not working?

Godot Version

Godot Version 4.2.2

Question

I'm pretty new to Godot and I am currently implementing a small optical illusion in VR. For the effect to work I have to project vertices of my mesh into screen space and send this data to my fragment shader, where I use it to calculate a mask in which the effect should be applied.

I have now moved my project over to VR and learned very quickly that the UnprojectPosition function of the Camera3D object does not work properly in VR. While it does produce valid-looking output, it does not seem to be projected to either eye/view.

I found a function to get a projection matrix from the XRInterface, and one for what I think is the inverse of the view matrix of the camera:

    XRInterface xRInterface = XRServer.PrimaryInterface;
    Rect2 viewport = new Rect2(Vector2.Zero, new Vector2(1920, 1832));
    // View index 0 is the left eye (or the only view in mono), 1 is the right eye.
    leftProjection = xRInterface.GetProjectionForView(0, viewport.Size.X / viewport.Size.Y, cam.Near, cam.Far);
    leftTransform = xRInterface.GetTransformForView(0, xrOrigin.GlobalTransform);

Now, my matrix math and knowledge of projection are limited at best, but I tried to do my own projection. The data I get from it is never quite right, though.

    leftTransform = leftTransform.Inverse();              // camera transform -> view matrix
    Projection cameraProjection = new Projection(leftTransform);
    Projection viewProjection = leftProjection * cameraProjection;
    Vector4 output = new Vector4(vertex.X, vertex.Y, vertex.Z, 1);

    output = viewProjection * output;
    output = output * (1 / output.W);                     // perspective divide

    return new Vector2(output.X, output.Y);
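
For completeness: after the perspective divide the result is in normalized device coordinates (roughly -1 to 1), while UnprojectPosition returns viewport pixel coordinates. So before comparing against FRAGCOORD.xy in the shader, the values still have to be remapped, roughly like this (a sketch; whether Y also needs flipping depends on the coordinate convention compared against in the shader):

    // Hypothetical helper: map NDC (-1..1) to pixel coordinates (0..size).
    // Depending on what the shader compares against, Y may need flipping.
    Vector2 NdcToPixels(Vector2 ndc, Vector2 renderTargetSize)
    {
        Vector2 normalized = ndc * 0.5f + new Vector2(0.5f, 0.5f); // 0..1
        return normalized * renderTargetSize;                      // 0..size
    }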

Does anyone here know what I am doing wrong? Any help would be greatly appreciated.

Not sure what is going on here; at face value nothing immediately stands out, though your code is projecting, not unprojecting.

UnprojectPosition indeed doesn’t work, because Godot has no idea how to deal with unprojecting stereo: are you supplying coordinates for the left eye? The right eye? The combined frustum?

If you could shed some more light on what you’re actually trying to do, we might be able to provide a suitable solution.

So I am trying to create the impossible cube optical illusion in VR.
I did manage to get it working in standard flatscreen. By reversing the z-buffer value on specific fragments in a fragment shader I got this:

I have segmented this cube into corner and edge pieces to selectively enable or disable the effect, and each piece is instantiated at runtime.
The instances are placed along the edges and vertices of a default cube, and their relationship is stored.
I then calculate the convex hull of this default cube in 2D screen space to determine which instances should use my effect and which should not (see the sketch below).
This is the first time I need my projection code.
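
The hull computation itself is nothing special; a standard monotone-chain pass does the job. A generic sketch (not my exact code; helper names are made up):

    // Monotone-chain convex hull of 2D points, in counter-clockwise order.
    static List<Vector2> ConvexHull(List<Vector2> pts)
    {
        var points = new List<Vector2>(pts);
        points.Sort((p, q) => p.X != q.X ? p.X.CompareTo(q.X) : p.Y.CompareTo(q.Y));

        var lower = new List<Vector2>();
        foreach (Vector2 p in points)
        {
            // Pop points that would make a clockwise turn.
            while (lower.Count >= 2 && Cross(lower[^2], lower[^1], p) <= 0)
                lower.RemoveAt(lower.Count - 1);
            lower.Add(p);
        }

        var upper = new List<Vector2>();
        for (int i = points.Count - 1; i >= 0; i--)
        {
            while (upper.Count >= 2 && Cross(upper[^2], upper[^1], points[i]) <= 0)
                upper.RemoveAt(upper.Count - 1);
            upper.Add(points[i]);
        }

        // The last point of each chain duplicates the start of the other chain.
        lower.RemoveAt(lower.Count - 1);
        upper.RemoveAt(upper.Count - 1);
        lower.AddRange(upper);
        return lower;
    }

    static float Cross(Vector2 o, Vector2 a, Vector2 b) =>
        (a.X - o.X) * (b.Y - o.Y) - (a.Y - o.Y) * (b.X - o.X);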
Unfortunately this isn't enough to completely nail the effect, so I am also calculating a mask in my shader to check if a fragment should have a reversed z-buffer value.
For this I project the vertices of the instances that should be excluded from my effect into screen space and send that list to my shader.
This is the second and final time I have to project something into screen space.
In the shader, I check whether a given fragment is inside any of the triangles my vertex list spans, using a same-side test based on cross-product signs (a barycentric-style check).

The intersection code:

    bool triIntersect(vec2 point, vec2 a, vec2 b, vec2 c)
    {
        float as_x = point.x - a.x;
        float as_y = point.y - a.y;

        // Sign of the 2D cross product tells which side of edge AB the point is on.
        bool s_ab = (b.x - a.x) * as_y - (b.y - a.y) * as_x > 0.0f;

        // The point is inside iff the side tests for the other edges are consistent.
        if ((c.x - a.x) * as_y - (c.y - a.y) * as_x > 0.0f == s_ab)
            return false;
        if ((c.x - b.x) * (point.y - b.y) - (c.y - b.y) * (point.x - b.x) > 0.0f != s_ab)
            return false;
        return true;
    }

The check for all triangles:

    for (int i = 0; i < polyVerts.length(); i += 3)
    {
        // "border" is a sentinel value marking the end of the valid entries.
        if (polyVerts[i] == border) { break; }

        if (triIntersect(FRAGCOORD.xy, polyVerts[i], polyVerts[i + 1], polyVerts[i + 2]))
        {
            excluded = true;
            break;
        }
    }

The array is ordered so that every 3 consecutive vertices form one face (triangle) of a mesh instance.
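
Sending that list to the shader looks roughly like this (simplified sketch, actual variable names differ; ProjectVertex is my projection function from above and border is the sentinel the shader tests):

    // Every 3 consecutive entries form one triangle.
    var projected = new List<Vector2>();
    foreach (Vector3 worldVertex in excludedVertices)  // assumed: world-space vertices to exclude
        projected.Add(ProjectVertex(worldVertex));
    projected.Add(border);                             // sentinel marking the end of valid data
    shaderMaterial.SetShaderParameter("polyVerts", projected.ToArray());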

Right now I am just trying to get this to work with one eye of my VR headset.
So my ProjectVertex function gets the position of a vertex in world space, the projection matrix I get from this function:

    xRInterface.GetProjectionForView(1, viewport.Size.X / viewport.Size.Y, cam.Near, cam.Far);

And the camera transform from this function:

    xRInterface.GetTransformForView(1, xrOrigin.GlobalTransform)

The vertex is transformed into world space using:

    MeshInstance3D.ToGlobal()
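
i.e. something along these lines (variable names made up for illustration):

    // Transform a vertex from the mesh's local space into world space.
    Vector3 worldVertex = meshInstance.ToGlobal(localVertex);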

I hope this clears up any confusion, and I hope you can help me further.
Thank you very much!

Very hard to say; this is definitely an effect that would suffer from being applied in stereo, but as far as I can follow what you are doing, it should work.

What are you using for the viewport size? Remember that in XR, the XR interface handles the viewport size because we render at a different resolution than the final output. The image we render is the one before lens distortion is applied for final presentation on screen.

So you need to obtain the correct viewport size by calling XRInterface.get_render_target_size.
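
In C# that would look something like this (a sketch based on your snippet above):

    // Use the per-eye render resolution reported by the XR interface
    // instead of a hardcoded viewport size.
    Vector2 renderSize = xRInterface.GetRenderTargetSize();
    leftProjection = xRInterface.GetProjectionForView(0, renderSize.X / renderSize.Y, cam.Near, cam.Far);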

Also, when calling get_transform_for_view the current tracking information is used; however, to combat latency, the rendering engine will poll an updated headset location right before rendering starts. There can thus be a small difference between the location you calculate here and the location used when rendering, as we obtain more up-to-date tracking data.

But both are just wild stabs that may explain why you're getting something different from what you expect.