
I want to detect a click/touch event on my 2D GameObject.

And this is my code:

void Update()
{
   if (Input.touchCount > 0)
   {
     Debug.Log("Touch");
   }
}

Debug.Log("Touch"); never prints when I click on the screen or on my GameObject.

  • You're on a PC or on a mobile device? On a PC touchCount will always be 0 unless (iirc) you have a touchscreen. – Bart Mar 17 '14 at 16:03
  • I'm on a PC, and when I clicked, touchCount was always 0. That's why I asked whether a click can be treated as a touch in Unity3D. If not, how can I test touch events on a PC? – Lnitus Mar 17 '14 at 16:11
  • It depends on whether you need multitouch or not. If not, then OnMouseDown could serve as a test. – Bart Mar 17 '14 at 17:06
  • @naXa This may be new functionality, but I believe using `OnMouseDown()` in a Unity project (Unity 4.6 and Unity 5) has worked for me on my iPhone – Samie Bencherif Jun 21 '15 at 20:27
  • OnMouseDown is being triggered by a tap/touch on my iPhone 6+ – Valerie Nov 22 '16 at 17:36

4 Answers


Short answer: yes, a touch can be handled with Input.GetMouseButtonDown().

  • Input.GetMouseButtonDown(), Input.mousePosition, and associated functions work as tap on the touch screen (which is kind of odd, but welcome). If you don't have a multi-touch game, this is a good way to keep the in-editor game functioning well while still keeping touch input for devices. (source: Unity Community)

  • Mouse simulation with touches can be enabled/disabled with Input.simulateMouseWithTouches option. By default, this option is enabled.

  • Though it is good for testing, I believe Input.GetTouch() should be used in production code (because it is able to handle simultaneous touches).

  • An interesting approach is to forward touches to the OnMouseUp()/OnMouseDown() events:

      //  OnTouchDown.cs
      //  Allows "OnMouseDown()" events to work on the iPhone.
      //  Attach to the main camera.
    
      using UnityEngine;
      using System.Collections;
      using System.Collections.Generic;
    
      public class OnTouchDown : MonoBehaviour {
          void Update () {
            // Simulates OnMouseDown on touch devices by raycasting from each new touch.
              RaycastHit hit = new RaycastHit();
              for (int i = 0; i < Input.touchCount; ++i)
                  if (Input.GetTouch(i).phase.Equals(TouchPhase.Began)) {
                      // Construct a ray from the current touch coordinates
                      Ray ray = Camera.main.ScreenPointToRay(Input.GetTouch(i).position);
                      if (Physics.Raycast(ray, out hit))
                          hit.transform.gameObject.SendMessage("OnMouseDown");
                  }
          }
      }
    

    (source: Unity Answers)

Update: there is a Unity Remote mobile app for simulating touch input in editor mode (works with Unity Editor 4 and Unity Editor 5).
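For quick in-editor testing, the bullets above can be combined into a minimal sketch (the class name TapTest is mine): while Input.simulateMouseWithTouches is enabled (the default), Input.GetMouseButtonDown(0) fires both for a left click in the editor and for a tap on a device.

```csharp
// TapTest.cs — a minimal sketch; attach to any GameObject.
using UnityEngine;

public class TapTest : MonoBehaviour
{
    void Awake ()
    {
        // Enabled by default; set to false if taps should
        // no longer be reported as mouse clicks.
        Input.simulateMouseWithTouches = true;
    }

    void Update ()
    {
        // Fires for a left click in the editor and for a tap on a device.
        if (Input.GetMouseButtonDown (0))
            Debug.Log ("Click or tap at " + Input.mousePosition);
    }
}
```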

naXa stands with Ukraine
  • Thanks so much. This is Bart's answer: "You're on a PC or on a mobile device? On a PC touchCount will always be 0 unless (iirc) you have a touchscreen". Because touchCount is always 0 on a PC, can I only test your code on a device? – Lnitus Mar 17 '14 at 17:27
  • @Lnitus, yes. You can use the OnMouse events and `Input.GetMouseButtonDown()` to handle both mouse and touch events. But you can use `Input.touchCount` (and the other touch-related functions) **only** for touch events. – naXa stands with Ukraine Mar 17 '14 at 17:34
  • "Though it is good for testing, I believe Input.GetTouch() should be used in production code" ... why do you believe this? I'm using Unity 2019 and it seems to work fine for click (touch) and right click (two finger touch), but if there's a real reason, I'd like to save time and use touch directly. – Gio Asencio May 14 '19 at 17:31
  • @GioAsencio it was sooo long ago... I don't remember the exact reason for writing this. now I just checked out [documentation](https://docs.unity3d.com/ScriptReference/Input-simulateMouseWithTouches.html) and I see that `Input.GetMouseButtonDown` is restricted to 3 touches (corresponding to 2 mouse buttons). and with `Input.GetTouch` we're able to handle more touches (usually consumer devices support up to 10 simultaneous touches). – naXa stands with Ukraine May 14 '19 at 17:57
  • @GioAsencio back then, I probably used this old answer as a reference -> http://answers.unity.com/answers/178557/view.html (it contains the same advice about production) – naXa stands with Ukraine May 14 '19 at 18:02
  • Thanks, though I still don't see any problem with it except confusion. If you use a device with a touch screen it won't cause a mouse click any other way, and even if it did, the mouse would simply have fewer options (so you could implement both, and there wouldn't be an issue). As for the GUI, it uses single rays regardless. touchCount and GetTouch still work. The only thing I could see is if you needed 4 or more simultaneous touches, which isn't often. – Gio Asencio May 15 '19 at 22:52

From what I understand, the Unity player does not let you trigger touch events, only mouse events.

But you can simulate fake touch events based on the mouse events, as explained in this blog post: http://2sa-studio.blogspot.com/2015/01/simulating-touch-events-from-mouse.html

void Update () {
    // Handle native touch events
    foreach (Touch touch in Input.touches) {
        HandleTouch(touch.fingerId, Camera.main.ScreenToWorldPoint(touch.position), touch.phase);
    }

    // Simulate touch events from mouse events (10 is an arbitrary fake finger id)
    if (Input.touchCount == 0) {
        if (Input.GetMouseButtonDown(0) ) {
            HandleTouch(10, Camera.main.ScreenToWorldPoint(Input.mousePosition), TouchPhase.Began);
        }
        if (Input.GetMouseButton(0) ) {
            HandleTouch(10, Camera.main.ScreenToWorldPoint(Input.mousePosition), TouchPhase.Moved);
        }
        if (Input.GetMouseButtonUp(0) ) {
            HandleTouch(10, Camera.main.ScreenToWorldPoint(Input.mousePosition), TouchPhase.Ended);
        }
    }
}

private void HandleTouch(int touchFingerId, Vector3 touchPosition, TouchPhase touchPhase) {
    switch (touchPhase) {
    case TouchPhase.Began:
        // TODO
        break;
    case TouchPhase.Moved:
        // TODO
        break;
    case TouchPhase.Ended:
        // TODO
        break;
    }
}
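The TODO branches above are left open; as one hypothetical way to fill in the Began case, the touched point can be raycast against 2D colliders (the other branches are only sketched as comments):

```csharp
private void HandleTouch (int touchFingerId, Vector3 touchPosition, TouchPhase touchPhase) {
    switch (touchPhase) {
    case TouchPhase.Began:
        // touchPosition is already in world space (ScreenToWorldPoint above),
        // so a zero-direction 2D raycast tests that single point.
        RaycastHit2D hit = Physics2D.Raycast (touchPosition, Vector2.zero);
        if (hit.collider != null) {
            Debug.Log ("Touched " + hit.collider.name);
        }
        break;
    case TouchPhase.Moved:
        // e.g. update a drag
        break;
    case TouchPhase.Ended:
        // e.g. finish the gesture
        break;
    }
}
```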
sdabet

The short answer is no. There is a Unity Remote Android app (remote app) for simulating touch input in editor mode; it may be helpful.

The old Android app has been deprecated; see the related documentation for further information. Unity Remote now supports iOS and tvOS as well.

tim

Late, but this may help somebody :)

To detect touch and swipe (mouse input is not detected in this example, and only a single touch is handled):

Tested with Unity 5.6 on Windows 8.1 (with a touch screen) and on Android.

I think you should handle mouse clicks separately. I also have no idea about iOS.

    bool isTouchMoved = false;
    float touchMoveThreshold = .5f;
    Vector3 posA, posB;

    void HandleTouch ()
    {
        if (Input.touchCount > 0) {
            if (Input.GetTouch (0).phase == TouchPhase.Began) {
                posA = Camera.main.ScreenToWorldPoint (Input.GetTouch (0).position);            
            } else if (Input.GetTouch (0).phase == TouchPhase.Moved) {
                isTouchMoved = true;
            } else if (Input.GetTouch (0).phase == TouchPhase.Ended) {
                posB = Camera.main.ScreenToWorldPoint (Input.GetTouch (0).position);
                float dx = Mathf.Abs (posA.x - posB.x);
                float dy = Mathf.Abs (posA.y - posB.y);
                if (isTouchMoved && (dx > touchMoveThreshold || dy > touchMoveThreshold)) {
                    Debug.Log ("Swiped");
                    if (dx > dy) {
                        if (posA.x < posB.x) {
                            Debug.Log ("Left to Right");
                        } else {
                            Debug.Log ("Right to Left");
                        }
                    } else {
                        if (posA.y > posB.y) {
                            Debug.Log ("Top to Bottom");
                        } else {
                            Debug.Log ("Bottom to Top");
                        }
                    }
                } else {
                    Debug.Log ("Touched");

                    Vector2 pos2D = new Vector2 (posB.x, posB.y);
                    RaycastHit2D hit = Physics2D.Raycast (pos2D, Vector2.zero);
                    if (hit.collider != null) {
                        Debug.Log ("Touched " + hit.collider.name);
                    }
                }
                isTouchMoved = false;
            }
        }
    }

Note (1): to detect a touch on a GameObject, you should add a 2D collider to that GameObject.

Note (2): call HandleTouch() from the Update() function.
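Note (2) can be sketched as follows (the method name follows the note):

```csharp
void Update ()
{
    // Run the touch/swipe handler every frame.
    HandleTouch ();
}
```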


Update: to handle the mouse, use a similar raycast on Input.GetMouseButtonDown(0):

if (Input.GetMouseButtonDown(0))
{
   Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
   RaycastHit2D hit = Physics2D.Raycast(ray.origin, ray.direction);
   if (hit.collider != null) {
      Debug.Log ("CLICKED " + hit.collider.name);
   }
}
Shamshirsaz.Navid