Briefly
touch events occur when the user touches an HTML element on a touch screen. This includes both finger touches and stylus interactions. Depending on the action the user performed (touched the element, started moving a finger, and so on), a specific touch event occurs:
touchstart — triggers on the first touch;
touchmove — triggers during finger movement over the element;
touchend — triggers after the touch is released;
touchcancel — triggers when the touch is interrupted.
How to write
Handler for the beginning of a touch on an element (analogous to mousedown):
element.addEventListener('touchstart', (event) => {
  console.log('You have touched the element')
})
Subscribe to the event when the user drags their finger over the element (analogous to mousemove):
element.addEventListener('touchmove', (event) => {
  console.log('I am being dragged by a finger')
})
Subscribe to the event when the user ends the touch (analogous to mouseup):
element.addEventListener('touchend', (event) => {
  console.log('Touch has ended')
})
How to understand
When a user works on a computer, they usually interact with elements on the screen through the cursor, and the built-in click event is enough to handle clicks. The click event also works when the user interacts with the interface on a smartphone or tablet and taps the screen.
However, mobile devices not only have taps but also gestures and multitouch. To enable developers to handle such complex user actions, browsers started providing low-level APIs for processing touch events. This allows for building interfaces that handle multitouch and gestures.
Although touch events are very similar to click, their main difference is support for multiple simultaneous touches at different points on the screen (multitouch). In total, touch has four events:
touchstart — occurs at the moment the user touches the element;
touchmove — occurs when the user drags their finger over the element;
touchend — occurs when the user lifts their finger off the element (ends the touch);
touchcancel — occurs if the touch is interrupted, for example, if there are too many simultaneous touch points or a finger goes off the element or screen (a minimal handler for it is sketched right after this list).
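Since touchcancel has no direct mouse analogue and is not shown in the snippets above, here is a minimal sketch of handling it; the message text and the idea of resetting state are only an illustration, not a prescribed pattern:

element.addEventListener('touchcancel', (event) => {
  // The browser interrupted the touch, for example because of a system gesture
  // Reset any state accumulated in touchstart / touchmove handlers here
  console.log('Touch was interrupted')
})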
The touch event TouchEvent, which is passed to the handler, contains several useful fields:
touches — an array-like list of objects for all touch points on the screen (useful for multitouch handling);
targetTouches — an array-like list of objects for all touch points on the target element.
The example uses these event fields and the different types of touch events. Open it on a smartphone, as touches do not work with a mouse.
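As a rough illustration of these fields (separate from the interactive example), a touchmove handler could report how many fingers are on the screen and how many of them started on the element; element here is the same placeholder variable used in the snippets above:

element.addEventListener('touchmove', (event) => {
  // All current touch points on the screen
  console.log('Fingers on the screen:', event.touches.length)
  // Only the touch points that started on this element
  console.log('Fingers on the element:', event.targetTouches.length)
  // Coordinates of the first touch point
  const { clientX, clientY } = event.touches[0]
  console.log('First finger at', clientX, clientY)
})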
In practice
Advice 1
🛠 It is worth keeping in mind that browsers fire both touch events and mouse events in response to the same user actions. For example, when the user taps an element (say, a button), the sequence of events will be: touchstart → touchend → mousedown → mouseup → click.
It is important to remember this behavior if you handle these events on the same element. If you need to prevent mouse events from firing on the element, call preventDefault() in the touch event handler:
element.addEventListener('touchstart', (event) => {
  event.preventDefault() // Mouse events will now not be triggered
})
🛠 Using touch events, you can handle gestures such as swipes. To do this, save the coordinates where the user touched the screen (the touchstart event) and compare them with how the coordinates change as the finger moves (the touchmove event), as sketched below. You can also look at the example for more details.
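A minimal sketch of this approach; the 50-pixel threshold and the element variable are assumptions for illustration, not values taken from the example:

let startX = 0
let startY = 0

element.addEventListener('touchstart', (event) => {
  // Remember where the touch started
  startX = event.touches[0].clientX
  startY = event.touches[0].clientY
})

element.addEventListener('touchmove', (event) => {
  // Compare the current position with the starting one
  const deltaX = event.touches[0].clientX - startX
  const deltaY = event.touches[0].clientY - startY

  // Treat a mostly horizontal movement longer than 50 pixels as a swipe
  if (Math.abs(deltaX) > 50 && Math.abs(deltaX) > Math.abs(deltaY)) {
    console.log(deltaX > 0 ? 'Swipe right' : 'Swipe left')
  }
})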