I’m trying to set up an Area2D in my game that the player can tap on and “focus” on. The focusing is already taken care of, but no matter what I do, I cannot detect taps on said Area2D.
Listening for InputEventMouseButton works perfectly fine, which tells me that nothing is blocking the input AND that the collisions are set up correctly, but InputEventScreenTouch doesn’t get recognized at all, whether or not I include “Pressed: true” in the pattern. Any idea why?
using Godot;
using Godot.Collections;
using GameNameRedacted.Scripts.Commands;

namespace GameNameRedacted.Scripts;

[GlobalClass]
public partial class Interactable : Area2D
{
    [Export] public Array<TriggerAction> Actions;

    public override void _Ready()
    {
        base._Ready();
        InputEvent += OnInputEvent;
    }

    public override void _ExitTree()
    {
        InputEvent -= OnInputEvent;
        base._ExitTree();
    }

    private void OnInputEvent(Node viewport, InputEvent @event, long shapeIdx)
    {
        if (@event is InputEventMouseButton { Pressed: true })
        {
            // This works.
            GD.Print("Click");
        }

        if (@event is InputEventScreenTouch { Pressed: true })
        {
            // This does not.
            GD.Print("Tap happened!");
        }
    }

    private void ExecuteAllActions()
    {
        if (Actions == null)
        {
            return;
        }

        foreach (TriggerAction action in Actions)
        {
            action?.Execute(this);
        }
    }
}
I did my best to explain the situation, and I added comments in my code. The InputEvent signal works perfectly fine when I use a mouse to click inside the area, but it refuses to accept a touch input on a touchscreen device.
Also, this doesn’t really apply in my case, since I’m trying to get this to work on an actual touchscreen device first and foremost, as this will be a mobile game. But the touch input doesn’t work whether this setting is on or off, unfortunately.
Yes, I am a bit newly awake and drowsy. Missed that.
What happens if you set emulate_mouse_from_touch to true? Just as a test that could maybe give a hint about what is going on.
When I used touch in a small project, I used a TouchScreenButton. If you just drag it out to your desired size and don’t add any textures or such, it works more or less as an area that only handles touch input.
The issue is that this is going to be a point-and-click game, and the artists need to be able to place their art / interactable objects around the map freely. These objects also need to be able to interact with each other, which is why I wanted to use an Area2D. Since I can easily check for mouse clicks in said area, I assumed it would also be able to detect touchscreen taps, but apparently that’s not the case.
If I have Emulate touch from mouse disabled, and I use a mouse to click on the Area2D, it successfully executes the click code and prints click.
If I have Emulate touch from mouse enabled, and I use a mouse to click on the Area2D, both the mouse click AND the tap code will execute, and both will be printed.
However, in either case, when this same project is exported to an Android device and run in debug mode connected to my PC, the tap code never registers, no matter whether that setting is turned on or off.
I’ll look into touch screen buttons as well though, thank you. But I still hope that I can figure out why this Area2D doesn’t work.
Ok, a final attempt - is the area covered by a Control node whose mouse filter is set to Stop? In that case the touch will not pass through (a filter of Ignore would let it through), but could perhaps pass through in some weird way when playing around with the settings, as when you described both click and tap registering above.
So I was just finishing up on touch screen camera controls yesterday for my Camera2D Plugin and I went and took a look at what I did.
As far as I know (and IMO touch screen controls are one of the least-documented parts of the engine) the only way to access the touch screen events is through the _input() function. You can’t do it through _physics_process() or an Area2D’s input_event signal.
I’ve also been working on mouse drag-and-drop code this week (which has been a nightmare) and I had to do a bunch of collision detection with Area2D objects. I think that combining the two might be your answer.
Relevant Drag-and-Drop Code
class_name Item extends Area2D

const GRID_SIZE: int = 64

# Returns whether another item is already in the space. Relies on [Item] nodes
# being on physics layer and mask 5.
func _is_position_occupied() -> bool:
    var overlapping: Array[Area2D] = get_overlapping_areas()
    for area: Area2D in overlapping:
        if area is Item and area != self and _get_grid_center(self) == _get_grid_center(area):
            return true
    return false

# Checks to see if the [Item] is currently overlapping anything when picked up.
func _check_overlaps() -> void:
    var overlapping: Array[Area2D] = get_overlapping_areas()
    if overlapping.size() > 0:
        _on_area_entered(overlapping[0])

# Using the passed Node2D, get the center point for the closest grid slot,
# based on the GRID_SIZE constant (which is 64).
func _get_grid_center(node: Node2D) -> Vector2:
    # Calculate which grid cell we're in.
    var grid_x: int = int(floor(node.global_position.x / GRID_SIZE))
    var grid_y: int = int(floor(node.global_position.y / GRID_SIZE))
    # Calculate the center of that grid cell.
    var center_x: float = (grid_x * GRID_SIZE) + (GRID_SIZE / 2.0)
    var center_y: float = (grid_y * GRID_SIZE) + (GRID_SIZE / 2.0)
    return Vector2(center_x, center_y)
My thought is to just look for an InputEventScreenTouch and then cycle through the available Area2D nodes and look to see if that point collides with them.
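A rough sketch of that approach, assuming Godot 4 — catch the touch in `_input()` and run a point query against the 2D physics space. The `Interactable` class check and the `on_tapped()` handler here are illustrative, not engine API:

```
extends Node2D

# InputEventScreenTouch never reaches an Area2D's input_event signal,
# so catch it here and query the physics space manually.
func _input(event: InputEvent) -> void:
	if event is InputEventScreenTouch and event.pressed:
		var params := PhysicsPointQueryParameters2D.new()
		# Touch positions are in viewport coordinates; convert to world space.
		params.position = get_canvas_transform().affine_inverse() * event.position
		params.collide_with_areas = true    # point queries skip Area2D by default
		params.collide_with_bodies = false
		var space := get_world_2d().direct_space_state
		for hit in space.intersect_point(params):
			if hit.collider is Interactable:
				hit.collider.on_tapped()    # hypothetical handler on your Area2D
```

The nice part is that `intersect_point()` uses the same collision shapes the Area2D already has, so the artists can keep placing shapes freely; just remember `collide_with_areas` must be enabled, since point queries only hit bodies by default.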
It could potentially be slightly more performant to build a Dictionary[Rect2, Area2D], where each key is a Rect2 holding the global position and size of an Area2D; you could then iterate through the keys, check whether the point is inside a Rect2, and if so retrieve the corresponding Area2D.
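That lookup could look something like this, assuming Godot 4, a single axis-aligned RectangleShape2D per Area2D, and areas that don’t move or rotate after registration (all names here are illustrative):

```
# Illustrative sketch: pre-build a Rect2 -> Area2D lookup once, then do
# cheap point-in-rect checks instead of physics queries on every tap.
var tap_targets: Dictionary = {}

func register_area(area: Area2D) -> void:
	var shape := area.get_node("CollisionShape2D").shape as RectangleShape2D
	var rect := Rect2(area.global_position - shape.size / 2.0, shape.size)
	tap_targets[rect] = area

func area_at_point(point: Vector2) -> Area2D:
	for rect in tap_targets:
		if rect.has_point(point):
			return tap_targets[rect]
	return null
```

The trade-off is that the Rect2 ignores rotation and any non-rectangular shapes, so it’s only a win if your interactables really are static axis-aligned boxes.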
I finally had time to try this, and it looks like it’s true, although that is quite an odd… limitation? And you’re right, the documentation never mentions this anywhere at all. But thank you! I’ll go with this solution for the time being!