This article walks through JavaScript touch and gesture events for your reference. The details are as follows.
1. Touch events
When the iPhone 3G, which shipped with iOS 2.0, was released, it included a new version of the Safari browser. This new mobile Safari introduced several events related to touch interaction. Later, browsers on Android implemented the same events. Touch events are triggered when the user's finger is placed on the screen, slides across the screen, or is lifted from the screen. Specifically, there are the following touch events.
touchstart: Triggered when a finger touches the screen; it fires even if another finger is already on the screen.
touchmove: Triggered continuously while a finger slides across the screen. Calling preventDefault() during this event prevents scrolling.
touchend: Triggered when a finger is lifted from the screen.
touchcancel: Triggered when the system stops tracking the touch. Exactly when this event fires is not clearly stated in the documentation.
All of the above events bubble and can be cancelled. Although touch events are not defined in the DOM specification, they are implemented in a DOM-compatible way. Therefore, the event object of each touch event provides the properties common to mouse events: bubbles, cancelable, view, clientX, clientY, screenX, screenY, detail, altKey, shiftKey, ctrlKey, and metaKey.
In addition to the common DOM properties, touch events also have the following three properties for tracking touches.
touches: An array of Touch objects representing the touches currently being tracked.
targetTouches: An array of Touch objects specific to the event target.
changedTouches: An array of Touch objects representing what has changed since the last touch.
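Which of these arrays to read depends on the event: during touchend, the touches array is already empty, so the finger that just lifted appears only in changedTouches. The sketch below illustrates this with a hypothetical helper, relevantTouches, applied to plain objects that stand in for real touch event objects.

```javascript
// Hypothetical helper: picks the touch list that actually describes the
// finger involved in the event. For touchend, the lifted finger is only
// available in changedTouches; for other events, touches is the active list.
function relevantTouches(event) {
  return event.type === "touchend" ? event.changedTouches : event.touches;
}

// Plain objects mimicking touch event objects, for illustration only:
var moveEvent = {
  type: "touchmove",
  touches: [{ clientX: 10, clientY: 20 }],
  changedTouches: [{ clientX: 10, clientY: 20 }]
};
var endEvent = {
  type: "touchend",
  touches: [], // empty once the finger has lifted
  changedTouches: [{ clientX: 15, clientY: 25 }]
};

console.log(relevantTouches(moveEvent).length);     // 1
console.log(relevantTouches(endEvent)[0].clientX);  // 15
```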
Each Touch object contains the following properties.
clientX: The x-coordinate of the touch target in the viewport.
clientY: The y-coordinate of the touch target in the viewport.
identifier: A unique ID identifying the touch.
pageX: The x-coordinate of the touch target in the page.
pageY: The y-coordinate of the touch target in the page.
screenX: The x-coordinate of the touch target on the screen.
screenY: The y-coordinate of the touch target on the screen.
target: The DOM node that is the target of the touch.
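The difference between the client and page coordinates is the scroll offset: page coordinates account for how far the document has scrolled. As a minimal sketch (toPageCoords is a hypothetical helper, and the scroll offsets are passed in explicitly so the arithmetic is visible):

```javascript
// Hypothetical helper: derives page coordinates from viewport (client)
// coordinates plus the current scroll offsets, illustrating the
// relationship between clientX/clientY and pageX/pageY on a Touch object.
function toPageCoords(touch, scrollLeft, scrollTop) {
  return {
    pageX: touch.clientX + scrollLeft,
    pageY: touch.clientY + scrollTop
  };
}

// A touch at viewport position (100, 50) on a page scrolled down 300px:
var coords = toPageCoords({ clientX: 100, clientY: 50 }, 0, 300);
console.log(coords.pageX, coords.pageY); // 100 350
```

In a browser, the scroll offsets would come from window.pageXOffset and window.pageYOffset.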
These properties can be used to track the user's touches on the screen. Consider the following example.
function handleTouchEvent(event){
    // Track only single-finger touches
    if (event.touches.length == 1){
        var output = document.getElementById("output");
        switch(event.type){
            case "touchstart":
                output.innerHTML = "Touch started (" + event.touches[0].clientX + "," + event.touches[0].clientY + ")";
                break;
            case "touchend":
                output.innerHTML += "<br>Touch ended (" + event.changedTouches[0].clientX + "," + event.changedTouches[0].clientY + ")";
                break;
            case "touchmove":
                event.preventDefault(); // Stop scrolling
                output.innerHTML += "<br>Touch moved (" + event.changedTouches[0].clientX + "," + event.changedTouches[0].clientY + ")";
                break;
        }
    }
}
document.addEventListener("touchstart", handleTouchEvent, false);
document.addEventListener("touchend", handleTouchEvent, false);
document.addEventListener("touchmove", handleTouchEvent, false);
The above code tracks a single touch on the screen. For simplicity, information is output only while there is one active touch. When the touchstart event fires, the position of the touch is written to a <div> element. When the touchmove event fires, its default behavior is cancelled to block scrolling (the default behavior of a touch move is to scroll the page), and the changed touch position is output. The touchend event outputs the final information about the touch. Note that when touchend fires, there are no Touch objects in the touches collection because there is no longer an active touch; you must use the changedTouches collection instead.
These events fire on all elements of the document, so different parts of the page can be manipulated separately. When you tap an element on the screen, the events (including mouse events) occur in the following order:
(1) touchstart
(2) mouseover
(3) mousemove (once)
(4) mousedown
(5) mouseup
(6) click
(7) touchend
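Because browsers that fire touch events may differ from those that fire only mouse events, code often needs to check whether touch events exist at all. One common check is whether an ontouchstart property exists on the global object; the sketch below wraps that check in a function that takes the global object as a parameter (a hypothetical helper, written this way so it can be exercised with plain objects):

```javascript
// Hypothetical helper: reports whether the given global object exposes
// touch events, using the common "ontouchstart" in window style check.
function supportsTouchEvents(globalObject) {
  return "ontouchstart" in globalObject;
}

// Plain objects standing in for a touch-capable and a mouse-only browser:
console.log(supportsTouchEvents({ ontouchstart: null })); // true
console.log(supportsTouchEvents({}));                     // false
```

In a real page this would be called as supportsTouchEvents(window).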
Browsers that support touch events include Safari on iOS, WebKit on Android, Dolfin for bada, BlackBerry WebKit in OS 6+, Opera Mobile 10.1+, and the Phantom browser on LG's proprietary OS.
2. Gesture Events
Safari in iOS 2.0 also introduced a set of gesture events. A gesture occurs when two fingers touch the screen, typically to resize or rotate the item being displayed. There are three gesture events, described below.
gesturestart: Triggered when one finger is already on the screen and another finger touches it.
gesturechange: Triggered when the position of either finger on the screen changes.
gestureend: Triggered when either finger is lifted from the screen.
These events fire only when both fingers touch the event's receiving element. Setting an event handler on an element means both fingers must be within that element's bounds at the same time to trigger a gesture event (that element becomes the target). Since these events bubble, placing the handler on the document also handles all gesture events; in that case the event's target is the element within whose bounds both fingers lie.
There is a relationship between touch events and gesture events. When a finger is placed on the screen, the touchstart event fires. If another finger is then placed on the screen, the gesturestart event fires first, followed by a touchstart event for that finger. If one or both fingers slide across the screen, a gesturechange event fires. As soon as one finger is lifted, the gestureend event fires, followed by a touchend event for that finger.
Like touch events, the event object of each gesture event contains the standard mouse event properties: bubbles, cancelable, view, clientX, clientY, screenX, screenY, detail, altKey, shiftKey, ctrlKey, and metaKey. In addition, there are two extra properties: rotation and scale. The rotation property represents the rotation angle caused by the change in the fingers; a negative value means counterclockwise rotation and a positive value means clockwise rotation (this value starts at 0). The scale property indicates the change in the distance between the two fingers (for example, pinching inward shortens the distance); this value starts at 1 and increases as the distance grows, decreasing as the distance shrinks.
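A typical use of rotation and scale is to build a CSS transform string so the element follows the user's fingers. The sketch below shows just the string construction (gestureTransform is a hypothetical helper name):

```javascript
// Hypothetical helper: turns a gesture event's rotation (degrees) and
// scale factor into a CSS transform string, e.g. for live pinch/rotate
// feedback in a gesturechange handler.
function gestureTransform(rotation, scale) {
  return "rotate(" + rotation + "deg) scale(" + scale + ")";
}

console.log(gestureTransform(45, 1.5)); // "rotate(45deg) scale(1.5)"
```

In a gesturechange handler one might then write (browser-only): element.style.webkitTransform = gestureTransform(event.rotation, event.scale);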
Here is an example of using gesture events.
function handleGestureEvent(event){
    var output = document.getElementById("output");
    switch(event.type){
        case "gesturestart":
            output.innerHTML = "Gesture started (rotation=" + event.rotation + ",scale=" + event.scale + ")";
            break;
        case "gestureend":
            output.innerHTML += "<br>Gesture ended (rotation=" + event.rotation + ",scale=" + event.scale + ")";
            break;
        case "gesturechange":
            output.innerHTML += "<br>Gesture changed (rotation=" + event.rotation + ",scale=" + event.scale + ")";
            break;
    }
}
document.addEventListener("gesturestart", handleGestureEvent, false);
document.addEventListener("gestureend", handleGestureEvent, false);
document.addEventListener("gesturechange", handleGestureEvent, false);
As with the previous touch event example, this code simply routes each event to the same function, which then outputs the relevant information for each event.
That is all the content of this article. I hope it is helpful to your study, and I hope you will continue to support me.