Which is the best way to handle touch screen controls and clicking objects?

Godot Version



I’m trying to make some sort of interactive birthday card. The idea is to deliver this as an Android app, where you can tap 2D or 3D elements that pop up a dialogue or play some sort of animation, and later I would like to be able to pan the background a bit with the gyroscope.

At the same time I’m trying to learn, so I would like to know what the best ways to solve these problems are.

I know that one of the best ways to handle input is to go to Project Settings > Input Map, register your actions and then query them with Input.get… , but there is no way to get touchscreen input like taps this way.
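For reference, this is the Input Map pattern I mean (a minimal sketch, assuming an action named "interact" has been registered in Project Settings; the action name is just an example):

```gdscript
func _process(_delta: float) -> void:
	# Works for keys, mouse buttons and gamepad buttons bound in the Input Map,
	# but an InputEventScreenTouch cannot be bound to an action like this.
	if Input.is_action_just_pressed("interact"):
		print("interact pressed")
```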

Another way I know of handling input like the mouse is using the mouse_entered(), mouse_exited() and input_event() signals of a rigid body, so we only check for input when the mouse is inside the object we want to select, but at first glance there is no equivalent signal for touch controls.
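The signal-based approach looks roughly like this (a sketch for a 3D body; note that input_ray_pickable must be enabled for a CollisionObject3D to receive these signals):

```gdscript
extends RigidBody3D

func _ready() -> void:
	# Without this, the body never receives mouse/touch picking events.
	input_ray_pickable = true
	mouse_entered.connect(_on_mouse_entered)
	input_event.connect(_on_input_event)

func _on_mouse_entered() -> void:
	print("hovering")

func _on_input_event(_camera, event, _position, _normal, _shape_idx) -> void:
	if event is InputEventMouseButton and event.pressed:
		print("clicked")
```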

I could try to have a script with _unhandled_input() that, on an InputEventScreenTouch with event.pressed (plus other conditions to distinguish it from a swipe), casts a ray to activate a method on the interactable. But I don’t know if that respects the UI’s priority when an overlay sits in between, or whether it’s the best way to handle this type of input.

What do you people think?

There is pickable functionality on collision bodies such as Area2D, which is similar to casting rays. Although I don’t know much about touch input.

I also feel like you could hack a UI element to do something similar.

I would lean on notifications through _input and _unhandled_input instead of polling from _process and the Input singleton, especially since you are targeting mobile.


This is very similar to mouse motion input, which can’t be assigned to an action. And since these finger events are relative and highly contextual, you can’t simply map a particular action to any single finger event. You would probably want a touch manager that judges from the relevant context which “action” occurred and injects it into the event queue for another node to consume.
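A minimal sketch of such a touch manager, using Input.parse_input_event() to re-inject a synthesized action (the "tap" action name is an assumption and would need to exist in the Input Map):

```gdscript
extends Node

# Hypothetical touch manager: interprets raw finger events and re-injects
# them as named actions for the rest of the game to consume.
func _unhandled_input(event: InputEvent) -> void:
	if event is InputEventScreenTouch and event.pressed:
		var action := InputEventAction.new()
		action.action = "tap"  # assumed action registered in the Input Map
		action.pressed = true
		Input.parse_input_event(action)
```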

There are two classes related to finger events: InputEventScreenDrag and InputEventScreenTouch.

It seems like a touch is a finger placed; you would probably just need to time it in order to decide if it’s a tap. Otherwise I think these are pretty straightforward.
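Timing a touch to decide whether it was a tap could look like this (a sketch; the time and distance thresholds are assumptions to tune by feel):

```gdscript
extends Node

const TAP_MAX_TIME := 0.3       # seconds, assumed threshold
const TAP_MAX_DISTANCE := 20.0  # pixels, assumed threshold

var _press_time := 0.0
var _press_position := Vector2.ZERO

func _unhandled_input(event: InputEvent) -> void:
	if event is InputEventScreenTouch:
		if event.pressed:
			_press_time = Time.get_ticks_msec() / 1000.0
			_press_position = event.position
		else:
			# On release, a short, nearly stationary touch counts as a tap.
			var held := Time.get_ticks_msec() / 1000.0 - _press_time
			var moved := event.position.distance_to(_press_position)
			if held < TAP_MAX_TIME and moved < TAP_MAX_DISTANCE:
				print("tap at ", event.position)
```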


Well, it is good to know that there is another class for dragging 🙂

And it seems to work with the input_event() signal of CollisionObject3D, as if it were a mouse click:

func _on_rigid_body_3d_input_event(camera, event, position, normal, shape_idx):
	# pressed is a property, not a method, so no parentheses here.
	if event is InputEventScreenTouch and event.pressed:
		DialogueManager.show_example_dialogue_balloon(dialogue_resource, dialogue_start)

For now, this seems like a nice solution. Next I would turn this logic into a node or a class and have all interactive items inherit from it.
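That inheritance idea could be sketched like this (the class name and the interact() hook are hypothetical; DialogueManager is the plugin already used above):

```gdscript
# interactable_3d.gd -- hypothetical base class; interactive items extend it.
class_name Interactable3D
extends RigidBody3D

func _ready() -> void:
	input_ray_pickable = true
	input_event.connect(_on_input_event)

func _on_input_event(_camera, event, _position, _normal, _shape_idx) -> void:
	if event is InputEventScreenTouch and event.pressed:
		interact()

func interact() -> void:
	pass  # override in subclasses, e.g. to show a dialogue balloon
```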

Thanks @pennyloafers !

  • Had to debug with the dialogue manager because I can’t see the console output on the
