Improving hand physics in VR

Godot Version

Any, but currently using Godot 4.3 stable and Godot 4.4 dev 6

Question

For years we’ve been working on improving hand physics in VR. These past few weeks I’ve been pushing the limits of what our current solution in XR Tools can do, and exploring new options for the future based on experimentation by other XR users.

While I’ve made great strides, there are still a number of holes in the solution, so I figured I’d post a topic here to explain a few things and then talk about some of the issues, in the hope of getting ideas on how to tackle them.

The main problem we’re trying to solve is that the positioning of the user’s hands in a VR game is primarily driven by real-world tracking, but other causes can result in movement as well.

Physics doesn’t like this very much; it’s designed to have control over the object being moved so it can interact with the world correctly, not to have a physics body jump from one location to another.

In the current version of XR Tools we’ve solved this by making the hands a CharacterBody3D and using the various move functions to move the hand to the tracked location. That works decently well for the hand moving and preventing the hand from going through walls, etc.
But we’ve had to create our own (flawed) logic to push rigid bodies away, and properly handling rotation of the hand has been near impossible.
Things get even more complex when the player picks items up.
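To make the current approach concrete, here is a minimal sketch of what driving a CharacterBody3D hand toward the tracked controller can look like. This is an illustration, not the actual XR Tools code; the `controller` export is an assumption.

```gdscript
# Hypothetical sketch of the CharacterBody3D approach: move the hand body
# toward the tracked controller transform each physics frame.
extends CharacterBody3D

@export var controller : XRController3D

func _physics_process(delta : float) -> void:
    # Velocity needed to reach the tracked position within this frame.
    var target := controller.global_transform.origin
    velocity = (target - global_transform.origin) / delta

    # move_and_slide() slides the hand along walls instead of
    # letting it tunnel through static geometry.
    move_and_slide()

    # Rotation is simply snapped to the tracked orientation, which is
    # exactly why rotating against obstacles is so hard with this approach.
    global_transform.basis = controller.global_transform.basis
```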

We’ve got a decent system working, but it has many shortcomings. So we’ve started looking into a new approach that has been spearheaded by a few other community members as well.

For the new version of XR Tools we’ve switched over to the hand being a RigidBody3D, to which we apply forces and torque to nudge it toward the tracked location and orientation of the hand. With careful tuning of the strength of this force we can do so with very little introduced lag.
Furthermore, picking items up means that we can use joints (though this has had some challenges) and we get proper simulation of weight of the picked up object, interaction with the world, and two handed functionality as a bonus.
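The force-and-torque idea can be sketched as a spring-damper (PD) drive toward the tracked pose. The gain values below are made-up placeholders, and this is a simplified illustration rather than the actual XR Tools implementation:

```gdscript
# Minimal sketch of the rigid-body hand: a damped spring toward the
# tracked controller pose. All gain values are assumptions to tune.
extends RigidBody3D

@export var controller : XRController3D
@export var linear_strength := 1500.0
@export var linear_damping := 50.0
@export var angular_strength := 100.0
@export var angular_damping := 10.0

func _physics_process(_delta : float) -> void:
    var target := controller.global_transform

    # Spring toward the tracked position, damped by current velocity.
    var offset := target.origin - global_transform.origin
    apply_central_force(offset * linear_strength - linear_velocity * linear_damping)

    # Torque toward the tracked orientation: rotation difference as axis-angle.
    var rot := (target.basis * global_transform.basis.inverse()).get_rotation_quaternion()
    apply_torque(rot.get_axis() * rot.get_angle() * angular_strength
            - angular_velocity * angular_damping)
```

Because the hand is now a proper rigid body, the physics engine resolves collisions and joints for us instead of our hand-rolled pushing logic.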

Here is a video showing some of the progress we’ve made (sorry for the crappy encoding that Youtube applies to shorts):

This is still early progress but it shows a lot of promise. It also shows where things are less than perfect.

When we move the player, the hands now need to catch up with that movement. The hardcoded strength of the force doesn’t account for this.
It gets even worse if the player is moving because they are on a moving platform. What’s really required here is a far more exact calculation of the force required to move the hand to its new location, keeping in mind current velocity, what’s attached to the hand, etc.
So I’m wondering if there are APIs or approaches that can give us the entire picture that would allow us to do this.
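One direction for such an "exact" calculation could be to compute the force that would bring the hand (and whatever mass hangs off it) to the target in one physics step, then clamp it so collisions can still win. This is a hedged sketch, not a solution; the function and parameter names are all made up:

```gdscript
# Hypothetical helper: force needed to reach target_pos in one step,
# accounting for current velocity and the mass of any held object.
func compute_hand_force(hand : RigidBody3D, target_pos : Vector3,
        held_mass : float, delta : float, max_force : float) -> Vector3:
    # Velocity required to cover the remaining offset in one step.
    var required_velocity := (target_pos - hand.global_transform.origin) / delta

    # F = m * a; include the held object's mass so heavy items genuinely lag.
    var total_mass := hand.mass + held_mass
    var force := total_mass * (required_velocity - hand.linear_velocity) / delta

    # Clamping keeps the hand from blasting through rigid bodies it touches.
    return force.limit_length(max_force)
```

The open question remains how to obtain the "entire picture" (platform velocity, joint loads, etc.) from the physics server rather than estimating it ourselves.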

The other issue that I’ve not yet solved is that in VR it’s a common mechanism to teleport the player to a new location. That’s going to require repositioning the hand, and anything attached to the hand to a new location. I’m guessing this one is likely easier to solve but I’d need to find a way to query everything attached to the hand through joints.
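Lacking a built-in query, one workable approach is to have the grab code keep a list of the Joint3D nodes it creates and walk that list on teleport. A sketch of that idea, where `active_joints` is a hypothetical list maintained elsewhere:

```gdscript
# Apply the same transform delta to the hand and every body jointed to it,
# and zero velocities so physics doesn't "catch up" across the teleport.
func teleport_hand(hand : RigidBody3D, active_joints : Array[Joint3D],
        new_transform : Transform3D) -> void:
    var delta_xform := new_transform * hand.global_transform.inverse()
    var bodies : Array[PhysicsBody3D] = [hand]

    # Collect the bodies on the far side of each joint.
    for joint in active_joints:
        var body := joint.get_node_or_null(joint.node_b) as PhysicsBody3D
        if body and not bodies.has(body):
            bodies.append(body)

    for body in bodies:
        body.global_transform = delta_xform * body.global_transform
        if body is RigidBody3D:
            body.linear_velocity = Vector3.ZERO
            body.angular_velocity = Vector3.ZERO
```

This only follows joints one hop from the hand; chained objects would need a recursive walk.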

So yeah, any ideas, feedback, alternative suggestions, would be very welcome.

How about an approach where normally the hands are kinda like the VR playerbody. They can push rigidbodies around but not more.

Then when you pick something up using the functionpickup, the mode of the hand switches to something that no longer collides with the rigidbody.

I’ve been trying to come up with something like this for the pointers. One issue I have is that whenever I want to use the pointer and trigger_click on viewport_2d_in_3d_body inherited or instanced scenes, the trigger_click is also bound to the main function I want to use for that button, such as teleporting, sprinting or jumping. I want to make some kind of switching mechanism (maybe with a match statement?) so that in normal mode those buttons (trigger_click, grip click) are bound to the normal functionality such as sprint, teleport, jump or grab. But as soon as the pointer collides with a viewport_2d_in_3d, the pointer should become active and use trigger_click instead of those other functions.
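The switching mechanism could look something like the sketch below. The enum, mode variable, and handler names are all invented for illustration; the signal handlers would be wired up to whatever detects the pointer hitting a viewport:

```gdscript
# Hypothetical mode switch: trigger_click either fires the normal action
# or is forwarded to the pointer, depending on the current mode.
enum Mode { NORMAL, POINTER }

var current_mode := Mode.NORMAL

func _on_controller_button_pressed(button : String) -> void:
    if button != "trigger_click":
        return
    match current_mode:
        Mode.NORMAL:
            _do_teleport()       # or sprint / jump, the button's normal action
        Mode.POINTER:
            _do_pointer_click()  # forward the click to the viewport UI

# Flip the mode when the pointer starts / stops hitting a viewport_2d_in_3d.
func _on_pointer_entered_viewport() -> void:
    current_mode = Mode.POINTER

func _on_pointer_exited_viewport() -> void:
    current_mode = Mode.NORMAL
```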

I haven’t been successful yet (I’m still a coding noob) and was going to ask for help on these forums, but perhaps a similar methodology could be used for making the hands collide with rigid bodies vs. having the hands pick stuff up? Or make a CollisionShape a child of the hand/hand controller that only collides when there is no pickup function active?
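The CollisionShape3D toggle idea could be as simple as the sketch below; the node path and signal handlers are assumptions, connected to whatever pickup signals exist:

```gdscript
# Hypothetical: disable the hand's collision shape while something is held.
@onready var hand_shape : CollisionShape3D = $HandCollisionShape

func _on_picked_up(_what : Node3D) -> void:
    # Disabled shapes are ignored by the physics engine entirely.
    # set_deferred avoids changing state mid-physics-step.
    hand_shape.set_deferred("disabled", true)

func _on_dropped(_what : Node3D) -> void:
    hand_shape.set_deferred("disabled", false)
```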

Another issue I noticed with the playerbody is that as soon as you let it collide with rigidbodies, a small rigidbody may send you flying through the air when you collide with it. It would be useful if we could set the weight of the player (for example 70 to 110 kg) to make the collisions work more realistically.
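Since a CharacterBody3D has no mass of its own, one way to fake this is to scale the push impulse by an exported player mass after move_and_slide(). A rough sketch, with made-up values:

```gdscript
# Hypothetical mass-aware pushing for a CharacterBody3D player body.
@export var player_mass := 80.0   # kg, made-up default
@export var push_strength := 5.0

func _push_rigid_bodies() -> void:
    # Called after move_and_slide(); walk this frame's collisions.
    for i in get_slide_collision_count():
        var collision := get_slide_collision(i)
        var body := collision.get_collider() as RigidBody3D
        if body:
            # Heavier players push harder; light debris barely moves them.
            var impulse := -collision.get_normal() * push_strength \
                    * (player_mass / (player_mass + body.mass))
            body.apply_central_impulse(impulse)
```

The mass ratio also means a heavy rigid body absorbs most of the push, rather than the player being launched by a pebble.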

@tieleman.m have a look at the latest changes in master of XR Tools and how CollisionHands now work. It works in a similar line to what you’re suggesting.

I’ve found it very limiting, however; you can see, especially when picking up the big hammers, that they feel very unnatural.

Meanwhile I’m making decent progress with the rigidbody-based hands. My main issue remains that they trail behind, and especially when the player moves the hands don’t catch up. But for just normal hand movement, I’m able to do some really neat things, and it all feels very natural.